Historical Maps into Minecraft: My Workflow

The folks at the New York Public Library have a workflow and python script for translating historical maps into Minecraft. It’s a three-step (quite big steps) process. First, they generate a DEM (digital elevation model) from the historical map, using QGIS. This is saved as ‘elevation.tiff’. Then, using Inkscape, they trace over the features from the historical map that they want to translate into Minecraft. Different colours equal different kinds of blocks. This is saved as ‘features.tiff’. Then, using a custom python script, the two layers are combined to create a Minecraft map, which can be in either ‘creative’ mode or ‘survival’ mode.
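
In rough outline, that combination step works like this (a toy sketch, not the NYPL script itself: the two ‘images’ are stood in by small 2D lists, and the colour-to-block table is invented for illustration):

```python
elevation = [[0, 10],
             [10, 20]]   # grayscale pixel values from elevation.tiff
features  = [[0, 0],
             [200, 0]]   # R-channel values from features.tiff

# hypothetical colour-to-block table, standing in for the script's real one
block_for_colour = {0: "Grass", 200: "Water"}

world = {}
for z, row in enumerate(elevation):
    for x, height in enumerate(row):
        # each pixel becomes a column of blocks: height from the DEM,
        # surface block from the traced features layer
        world[(x, z)] = (height, block_for_colour[features[z][x]])

print(world[(0, 1)])  # the water pixel: (10, 'Water')
```

The real script does the same walk over the two tiffs, but uses pymclevel to actually place the blocks.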

There are a number of unspoken steps in that workflow, including a number of dependencies for the python script that have to be installed first. Similarly, QGIS and its plugins have a steep (sometimes hidden) learning curve. As does Inkscape. And ImageMagick. This isn’t a criticism; it’s just the way this kind of thing works. The problem, from my perspective, is that if I want to use this in the classroom, I have to guide 40 students with widely varying degrees of digital fluency.* I’ve found in the past that many of my students “didn’t study history to have to work with computers” and that the payoff sometimes (to them) doesn’t seem to have (immediate) value. The pros and cons of that kind of work will be a post for another day.

Right now, my immediate problem is: how can I smooth the gradient of the learning curve? I will do this by providing three separate paths for creating the digital elevation model.

Path 1, for when real world geography is not the most important aspect.

It may be that the shape of the world described by the historical map is what is of interest, rather than the current topography of the world. For example, I could imagine a student wanting to explore the historical geography of the Chats Falls before they were flooded by the building of a hydro dam. Current topographic maps and DEMs are not useful. For this path, the student will need to use the process described by the NYPL folks:

Requirements

QGIS 2.2.0 ( http://qgis.org )

  • Activate Contour plugin
  • Activate GRASS plugin if not already activated

A map image to work from

  • We used a geo-rectified TIFF exported from this map but any high rez scan of a map with elevation data and features will suffice.

Process:

Layer > Add Raster Layer > [select rectified tiff]

  • Repeat for each tiff to be analyzed

Layer > New > New Shapefile Layer

  • Type: Point
  • New Attribute: add ‘elevation’ type whole number
  • remove id

Contour (plugin)

  • Vector Layer: choose points layer just created
  • Data field: elevation
  • Number: at least 20 (maybe.. number of distinct elevations + 2)
  • Layer name: default is fine

Export and import contours as vector layer:

  • right click save (e.g. port-washington-contours.shp)
  • May report error like “Only 19 of 20 features written.” Doesn’t seem to matter much

Layer > Add Vector Layer > [add .shp layer just exported]

Edit Current Grass Region (to reduce rendering time)

  • clip to minimal lat longs

Open Grass Tools

  • Modules List: Select “v.in.ogr.qgis”
  • Select recently added contours layer
  • Run, View output, and close

Open Grass Tools

  • Modules List: Select “v.to.rast.attr”
  • Name of input vector map: (layer just generated)
  • Attribute field: elevation
  • Run, View output, and close

Open Grass Tools

  • Modules List: Select “r.surf.contour”
  • Name of existing raster map containing colors: (layer just generated)
  • Run (will take a while), View output, and close

Hide points and contours (and anything else above the b/w elevation image), then Project > Save as Image

You may want to create a cropped version of the result to remove un-analyzed/messy edges

The hidden, tacit bits here involve installing the Contour plugin, and working with GRASS tools (especially the bit about ‘editing the current grass region’, which is always fiddly, I find). Students pursuing this path will need a lot of one-on-one.

Path 2, for when you already have a shapefile from a GIS:

This was cooked up for me by Joel Rivard, one of our GIS & Map specialists in the Library. He writes,

1) In the menu, go to Layer > Add Vector Layer. Find the point shapefile that has the elevation information.
Ensure that you select point in the file type.
2) In the menu, go to Raster > Interpolation. Select “Field 3” (this corresponds to the z or elevation field) for Interpolation attribute and click on “Add”.
Feel free to keep the rest as default and save the output file as an image (.asc, bmp, jpg or any other raster – probably best to use .asc, since that’s what MicroDEM likes).

We’ll talk about MicroDEM in a moment. I haven’t tested this path yet myself, but it should work.

Path 3, for when modern topography is fine for your purposes:

In this situation, modern topography is just what you need.

1. Grab Shuttle Radar Topography Mission data for the area you are interested in (it downloads as a tiff).

2. Install MicroDEM and all of its bits and pieces (the installer wants a whole bunch of other supporting bits; just say yes. MicroDEM is PC software, but I’ve run it on a Mac within WineBottler).

3. This video tutorial covers working with MicroDEM and Worldpainter:

https://www.youtube.com/watch?v=Wha2m4_CPoo

But here’s some screenshots – basically, you open up your .tiff or your .asc image file within MicroDEM, crop to the area you are interested in, and then convert the image to grayscale:

MicroDEM: open image, crop image.

Convert to grayscale

Remove legends, marginalia

Save your grayscaled image as a .tiff.

Regardless of the path you took (and think about the historical implications of those paths), you now have a grayscale DEM image that you can use to generate your Minecraft world.
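
If you’d rather script this step than click through MicroDEM, the crop and grayscale conversion can also be done with Pillow (the maintained fork of PIL). A minimal sketch; the Image.new() call here stands in for opening your downloaded tiff, and the crop box is a placeholder:

```python
from PIL import Image

# stand-in for Image.open("srtm-tile.tif"), your downloaded SRTM data
dem = Image.new("RGB", (64, 64), (120, 90, 60))
dem = dem.crop((0, 0, 32, 32))   # (left, upper, right, lower): area of interest
gray = dem.convert("L")          # "L" = single-channel 8-bit grayscale
gray.save("elevation.tif")
print(gray.mode, gray.size)
```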

Converting your grayscale DEM to a Minecraft World

At this point, the easiest thing to do is to use WorldPainter. It’s free, but you can donate to its developers to help them maintain and update it. Now, the video shown above shows how to load your DEM image into WorldPainter. It parses the black-to-white pixel values and turns them into elevations. You have the option of setting where ‘sea level’ is on your map (so elevations below that point are covered with water). There are many, many options here; play with it! Adam Clarke, who made the video, suggests scaling up your image to 900%, but I’ve found that that makes absolutely monstrous worlds. You’ll have to play around to see what makes most sense for you, but with real-world data of any area larger than a few kilometres on a side, I think 100 to 200% is fine.

Now, the crucial bit for us: you can import an image into WorldPainter to use as an overlay to guide the placement of blocks, terrain, buildings, whatever. So, rather than me simply regurgitating what Adam narrates, go watch the video. Save as a .world file for editing; export to Minecraft when you’re ready (be warned: big maps can take *a very long time* to render. That’s another reason why I don’t scale up the way Adam suggests).

Go play.

To get you started: here are a number of DEMs and WorldPainter world files that I’ve been playing with. Try ‘em out for yourself.

 

* another problem I’ve encountered is that my feature colours don’t map onto the index values for blocks in the script. I’ve tried modifying the script to allow for a bit of fuzziness (a kind of ‘if the pixel value is between x and y, treat as z’). I end up with worlds filled with water. If I run the script on the Fort Washington maps provided by NYPL, it works perfectly. The script is supposed to only be looking at the R of the RGB values when it assigns blocks, but I wonder if there isn’t something else going on. I had it work once, correctly, for me – but I used MS Paint to recolour my image with the exact colours from the Fort Washington map. Tried it again, exact same workflow on a different map, nada. Nyet. Zip. Zilch. Just a whole lot of tears and heartache.

Historical Maps into Minecraft


Dow’s Lake area, settlement by 1847 Map Source: Bruce Elliott, Nepean, The City Beyond, page 23, posted on http://www.bytown.net/dowslake.htm

The folks over at the New York Public Library published an excellent & comprehensive tutorial for digitizing historical maps, and then importing them into Minecraft.

First: thank you!

Unfortunately, for me, it’s not working. I document here what I’ve been doing and ideally someone far more clever than me will figure out what needs to happen…

The first parts of the tutorial – working with QGIS & Inkscape – go very well (although there might be a problem with colours, but more on that anon). Let’s look at the python script for combining the elevation map (generated from QGIS) with the blocks map (generated from Inkscape). Oh, you also need to install ImageMagick, which you then run from the command line, to convert SVG to TIF.

“The script for generating the worlds uses PIL to load the TIFF bitmaps into memory, and pymclevel to generate a Minecraft world, one block at a time. It’s run successfully on both Mac OS X and Linux.”

After digitizing, looks like this.

I’ve tried both Mac and Linux, with python installed, and PIL, and pymclevel. No joy (for the same reasons as for Windows, detailed below). Like most things computational, there are dependencies that we only uncover quite by accident…

Anyway, when you’ve got python installed on Windows, you can just type the python file name at the command prompt and you’re off. So I download pymclevel, unzip it, open a command prompt in that folder (shift + right click, ‘open command prompt here’), and type ‘setup.py’. Error message. Turns out, I need setuptools. Which I obtain from:

https://pypi.python.org/pypi/setuptools#windows-7-or-graphical-install

Download, install. Works. Ok, back to the pymclevel folder, setup.py, and new error message. Looks like I need something called ‘cython’.

http://cython.org/#download

I download, unzip, go to that folder, setup.py. Problem. Some file called ‘vcvarsall.bat’ is needed. Solution? Turns out I need to download Microsoft Visual Studio 10. Then, I needed to create an environment variable called ‘vs90comntools’, which I did by typing this at the command prompt:

set VS90COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\

Wunderbar. I go back to the pymclevel folder, I run setup.py again, and hooray! It installs. I had PIL installed from a previous foray into things pythonesque, so at least I didn’t have to fight with that again.

I copy the generate_map.py script into notepad++, change the file names within it (so that it finds my own elevation.tif and features.tif files, which are called hogs-elevation.tif and hogs-features.tif; the area I’m looking at is the Hogsback Falls section of the Rideau. In the script, just change ‘fort-washington’ to ‘hogs’ or whatever your files are called). In my folder, at the command prompt, I type generate_map.py and get a whole bunch of error messages: various ‘yaml’ files can’t be found.

Did I mention PyYaml has to be installed? Fortunately, it has a Windows installer. Oh, and by the way – PyWin is also needed; I got that error message at one point (something obscure about win32api), and downloading/installing from here solved it: http://sourceforge.net/projects/pywin32/files/pywin32/

Ok, so where were we? Right, missing yaml files, like ‘minecraft.yaml’ and ‘classic.yaml’, and ‘indev.yaml’ and ‘pocket.yaml’. These files were there in the original repository, but for whatever reason, they didn’t install into the pymclevel that now lives in the Python directory. So I went to the pymclevel repo on github, copied-and-pasted the code into new documents in notepad++, and saved them thus:

c:\Python27\Lib\site-packages\pymclevel-0.1-py2.7-win32.egg\pymclevel\minecraft.yaml

Phew. Back to where I was working on my maps, with my generate_map.py, which I duly run and… error: can’t find ‘tree import Tree, treeObjs’. Googling around to solve this is a fool’s errand: ‘tree’ is such a common word and concept in programming that I just can’t figure out what’s going on here. So I turned that line off with a # in the code. Run it again… and it seems to work (but is this the key glitch that kills all that follows?).

(update: as Jonathan Goodwin points out, ‘tree.py’ is there, in the NYPL repo

…so I uncommented the line in generate_map.py, saved tree.py in the same directory, and ran the script again. Everything that follows still happens. So perhaps there’s something screwed up with my map itself.)

The script tells me I need to tell it whether I’m creating a creative mode map or a survival mode:

so for creative mode: c:>generate_map.py map

for survival: c:>generate_map.py game

And it chugs along. All is good with the world. Then: error message. KeyError: 255 in line 241, block_id, block_data, depth = block_id_lookup[block_id]. This is the bit of code that tells the script how to map Minecraft blocks to the colour scheme I used in Inkscape to paint the information from the map into my features.tif. Thing is, I never used RGB R value of 255. Where’s it getting this from? I go back over my drawing, inspecting each element, trying to figure it out. All seems good with the drawing. So I just add this line to the code in the table:

block_id_lookup = {
    [..existing code...]
    255 : (m.Water.ID, 0, 1),
}

And run it again. Now it’s 254. And then 253. Then 249. 246. 244. 241. Now 238.

At which point, I say piss on this, and I provide you with my features tif and elevation tif, and if you can please tell me what I’m doing wrong, I’d be ever so appreciative (and here’s the svg with the drawing layers, for good measure).

….when I first saw the tutorial from the NYPL, I figured, hey! I could use this with my students! I think not, at least, not yet.

(update 2: have downloaded the original map tifs that the NYPL folks used, and am running the script on them. So far, so good: which shows that, once all this stuff is installed, that it’s my maps that are the problem. This is good to know!)

Part Two:

(updated about 30 minutes after initial post) So after some to-and-fro on Twitter, we’ve got the tree.py problem sorted out. Thinking that it’s the maps where the problem is, I’ve opened the original Fort Washington features.tif in MS Paint (which is really an underappreciated piece of software). I’ve zoomed in on some of the features, and compared the edges with my own map (similarly opened and zoomed upon). In my map, there are extremely faint colour differentiations/gradations where blocks of colour meet. This, I think, is what has gone wrong. So, back to Inkscape I go…
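
Rather than eyeballing edges in Paint, one could list every distinct R value actually present in the features image and compare that against the lookup table. A sketch with Pillow; a tiny synthetic image stands in here for features.tif (swap in Image.open("features.tif") for the real thing):

```python
from PIL import Image

img = Image.new("RGB", (4, 4), (200, 0, 0))   # a block of "water" (R=200)
img.putpixel((3, 3), (253, 0, 0))             # one anti-aliased edge pixel

# the script keys on the R channel, so list every distinct R value present
reds = sorted({px[0] for px in img.getdata()})
print(reds)  # any value here that's not a block_id_lookup key will KeyError
```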

Update the Third: looks like I made (another) silly error – big strip of white on the left hand side of my features.tif. So I’ve stripped that out. But I can’t seem to suss the pixel antialiasing issue. Grrrrr! Am now adding all of the pixels into the dictionary, thus:

block_id_lookup = {
    0 : (m.Grass.ID, None, 2),
    10 : (m.Dirt.ID, 1, 1), # blockData 1 == grass can’t spread
    11 : (m.Dirt.ID, 1, 1),
    12 : (m.Dirt.ID, 1, 1),
    14 : (m.Dirt.ID, 1, 1),
    16 : (m.Grass.ID, None, 2),
    20 : (m.Grass.ID, None, 2),
    30 : (m.Cobblestone.ID, None, 1),
    40 : (m.StoneBricks.ID, None, 3),
    43 : (m.StoneBricks.ID, None, 3),
    49 : (m.StoneBricks.ID, None, 3),
    200 : (m.Water.ID, 0, 2), # blockData 0 == normal state of water
    210 : (m.WaterActive.ID, 0, 1),
    220 : (m.Water.ID, 0, 1),
}

…there’s probably a far more elegant way of dealing with this. Rounding? Range lookup? I’m not v. python-able…
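
One possible ‘more elegant way’, sketched in plain Python: instead of listing every stray value by hand, snap each pixel value to the nearest key that actually is in the table. The abbreviated block_id_lookup here stands in for the script’s real one:

```python
# abbreviated stand-in for the script's real table
block_id_lookup = {
    0: ("Grass", None, 2),
    10: ("Dirt", 1, 1),
    200: ("Water", 0, 2),
}

def lookup(pixel_value):
    # rank the table's keys by distance from the pixel value, take the closest
    nearest = min(block_id_lookup, key=lambda k: abs(k - pixel_value))
    return block_id_lookup[nearest]

print(lookup(253))  # anti-aliased 253 snaps to 200: ('Water', 0, 2)
```

Dropped into generate_map.py, the line that currently does block_id_lookup[block_id] would call something like this instead.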

Update, 2.20pm: Ok. I can run the script on the Fort Washington maps and end up with a playable map (yay!). But my own maps continue to contain pixels of colours the script doesn’t want to play with. I suppose I could just add 255 lines worth, as above, but that seems silly. The ImageMagick command, I’m told, works fine on a mac, but doesn’t seem to achieve anything on my PC. So something to look into (and perhaps try this http://www.graphicsmagick.org/ instead). In the meantime, I’ve opened the Fort Washington map in good ol’ Paint, grabbing snippets of the colours to paste into my own map (also open in Paint). Then, I use Paint’s tools to clean up the colour gradients at the edges on my map. In essence, I trace the outlines.
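
That Paint-tracing step could, in principle, be automated: remap every pixel’s R value to the nearest colour the script knows about, before running the script at all. A sketch with Pillow; the palette list and the synthetic image are stand-ins for your real table keys and features.tif:

```python
from PIL import Image

palette = [0, 10, 200]   # the R values your block_id_lookup actually handles

img = Image.new("RGB", (4, 4), (200, 0, 0))
img.putpixel((0, 0), (253, 0, 0))    # an anti-aliased edge pixel

# snap each pixel's R channel to the nearest palette value
cleaned = [(min(palette, key=lambda p: abs(p - r)), g, b)
           for (r, g, b) in img.getdata()]
img.putdata(cleaned)
img.save("features-cleaned.tif")
print(img.getpixel((0, 0)))  # 253 snapped to (200, 0, 0)
```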

Then, I save, run the script and…… success!

I have a folder with everything I need (and you can have it, too). I move it to C:\Users\[me]\AppData\Roaming\.minecraft\saves and fire up the game:

Rideau River in Minecraft!

Does it actually look like the Hogs’ Back to Dow’s Lake section of the Rideau Canal and the Rideau River? Well, not quite. Some issues with my basic elevation points. But – BUT! – the workflow works! So now to find some better maps and to start again…

Interview by Ben Meredith, for his article on procedurally generated archaeology sims

I was interviewed by Ben Meredith on procedurally generated game worlds and their affinities with archaeology, for Kill Screen Magazine. The piece was published this morning. It’s a good read, and an interesting take on one of the more interesting recent developments in gaming. I asked Ben if I could post the unedited communication we had, from which he drew on for his article. He said ‘yes!’, so here it is.

Hi Ben,

It seems to me that archaeology and video games share a number of affinities, not least of which because they are both procedurally generated. There is a method for field archaeology; follow the method, and you will have correctly excavated the site/surveyed the landscape/recorded the standing remains/etc. These procedures contain within them various ways of looking at the world, and emphasize certain kinds of values over others, which is why it is possible to have a marxist archaeology, or a gendered archaeology, or so on. Thus, it also seems obvious to me that you can have an archaeology within video games (not to be confused with media archaeology, or an archaeology of video games). A great example of this kind of work is Andrew Reinhard’s exploration of the beta of Elder Scrolls Online – you should touch base with him, too: http://archaeogaming.wordpress.com/2014/01/22/beta-testing-archaeology-in-elder-scrolls-online-taken-down/

On to your questions!

What motivated you to become an archaeologist?

Romance, mystery, allure, the ‘other’, the desire to travel… my initial impetus for getting into archaeology comes from the fact that I’m ‘from the bush’ in rural Canada and as a teenager I wanted so much more from the world. I now recognize that there’s some amazing archaeology in my own backyard (as it were) but I was too young and immature to recognize it then. The Greek Bronze Age, the Mycenaean heroes, the Minoans, Thera… all these captured my imagination. And there was no snow!

Personally, what single facet of archaeology captures the spirit of the field most effectively?

Check out the work of Colleen Morgan http://middlesavagery.wordpress.com/2014/03/05/stop-saying-archaeology-is-actually-boring/ and Sophie Hay http://pompei79.wordpress.com/2014/03/05/scratching-the-surface/ and Lorna Richardson http://digipubarch.org/2014/03/14/all-the-swears-for-this/ If there is a ‘spirit of the field’, I think these three scholars capture it admirably. They are curious, reflective, aware of the impact that the doing of archaeology has in the wider world. Archaeology produces powerful narratives, powerful ways of framing our current situation regarding the past and the present. I aspire to be more like these three remarkable women.

Which game do you think, so far, best achieves this?

A hard question to answer. But I think I’d go with Minecraft, for its community and especially its ability to be adopted in educational circles, for the way it requires the player to build and engage with the environments created. The world is what you make it, in Minecraft. So too in archaeology.
If a game attempted to procedurally generate ancient civilizations, what do you think would be the three most important elements that had to be generated?

I’ve done a lot of agent-based simulation: http://www.graeworks.net/category/simulations/. Such a game would have to be built on an agent-based framework, for the NPCs. Each NPC would have to be unique. Those rules of behaviours that describe how the NPCs interact with each other, the environment, and the player would have to accurately capture the target ancient civilization. You can’t just have an ‘ancient civilization’; you’ll have to consider one very particular culture in one very particular time and place. That’s what a procedural rhetoric is all about: an argument in code about how this aspect of the world worked/is/existed.

Would investigation play an integral part in a video game interpretation?

I’m not sure I follow. Procedural generation on its own still is meaningless; it would have to be interpreted. The act of playing the game (and see the work of Roger Travis on http://playthepast.org on practicomimetics) sings it into existence.

Conversely, for you, would stumbling blindly upon a ruin diminish the effect?

If the world is procedurally generated, then there would be clues in the landscape that would attune the attentive player to the presence of the past in that location. If there is no rhyme or reason – we stumble blindly – then the procedures do not describe an ancient (or any) civilization.

Do you think an archaeology simulator would be best implemented in first person (e.g. Minecraft) or third person (e.g. Terraria)? Would it be more important to convey an intimate atmosphere or impressive scale?

I like first person, but on a screen, first person can just induce nausea in the player. Maybe with an Oculus Rift that’s not a concern, in which case I’d say go first person! On a screen, I think third is better. Why not go AR and put your procedurally generated civilization into the local landscape?

Why I Play Games

(originally posted at #HIST3812, my course blog for this term’s History3812: Gaming and Simulations for Historians, at Carleton University).

I play because I enjoy video games, obviously, but I also get something else out of it.  Games are a ‘lively art'; they are an expressive art, and the artistry lies in encoding rules (descriptions) about how the world works at some microlevel: and then watching how this artistry is further expressed in the unintended consequences of those rules, their intersections, their cancellations, causing new phenomena to emerge.

This strikes me as the most profound use of humanities computation out there. Physicists tell us that the world is made of itty bitty things that interact in particular ways. In which case, everything else is emergent: including history. I’m not saying that there are ‘laws’ of human action; but we do live in this universe. So, if I can understand some small part of the way life was lived in the past, I can model that understanding, and explore the unintended outcomes of that understanding… and go back to the beginning and model those.

I grew up with the video game industry. Adventure? I played that. We had a VIC-20. If you wanted to play a game, you had to type it in yourself. There used to be a magazine (Compute!) that would have all of the code printed within, along with screenshots. Snake, Tank Wars – yep. My older brother would type, and I would read the individual letters (and spaces, and characters) out. After about a week, we’d have a game.

And there would be bugs. O lord, there were bugs.

When we could afford games, we’d buy text adventures from Infocom. In high school, my older brother programmed a quiz game as his history project for the year. Gosh, we were cool. But it was! Here we were, making the machine do things.

As the years went on, I stopped programming my own games. Graphics & technology had moved too fast. In college, we used to play Doom (in a darkened room, with the computer wired to the stereo. Beer often figured). We played SimCity. We played the original Civilization.

These are the games that framed my interactions with computers. Then, after I finished my PhD, I returned to programming when I realized that I could use the incredible artificial intelligences, the simulation engines, of modern games, to do research. To enhance my teaching.

I got into Agent Based Modeling, using the NetLogo platform. This turned my career around: I ceased to be a run-of-the-mill materials specialist (Roman archaeology), and became this new thing, a ‘digital humanist’. Turns out, I’m now an expert on simulation and history.

Cool, eh?

And it’s all down to the fact that I’m a crappy player of games. I get more out of opening the hood, looking at how the thing works. Civilization IV and V are incredible simulation engines. So: what kinds of history are appropriate to simulate? What kinds of questions can we ask? That’s what I’m looking forward to exploring with you (and of course, seeing what you come up with in your final projects).

But maybe a more fruitful question to start with, in the context of the final project of this course, is, ‘what is the strangest game you’ve ever played?’

What made it strange? Was it the content, the mechanics, the interface?

I played one once where you had to draw the platform with crayons, and then the physics engine would take over. The point was to try to get a ball to roll up to a star. Draw a teeter-totter under the star, and perhaps the ball would fall on it, shooting the star up to fall down on the ball, for instance. A neat way of interacting with the underlying physics of game engines.

I’d encourage everyone to think differently about what the games might be. For instance, I could imagine a game that shows real-time documents (grabbed from a database), and you have to dive into it, following the connected discourses (procedurally generated using topic models and network graphing software to find these – and if this makes no sense to you, take a quick peek at the Programming Historian) within it to free the voices trapped within…

This is why I play. Because it makes me think differently about the materials I encounter.

HIST3812, Gaming and Simulation for Historians

Finally, with a bit of space to breathe, I am turning to getting my HIST3812 Gaming and Simulation for Historians course put together. In response to student queries about what this course will explore, I’ve put together a wee comic book (to capture the aesthetic of playfulness about history that games & simulations naturally contain). I’m not a particularly good maker of comic books, but it does the trick, more or less.

See it on Issuu here

Stranger in These Parts – An Interactive Fiction for Teaching

One of the things I want my students to engage with in my ‘cities and countryside in antiquity’ class is the idea that in antiquity, one navigates space not with a two-dimensional top-down mental map, but rather as a series of what-comes-next. That kind of navigating required socializing, asking directions, paying attention to landmarks. I’m in part inspired by R. Ling’s 1990 article, Stranger in Town, and in part by Elijah Meeks’s and Walter Scheidel’s ORBIS project. Elijah and I have in fact been talking about marrying a text-based interface to Orbis for this very reason.

But I’m also interested in gaming, simulation and storytelling for their own merits, so I’m trying my hand at an interactive fiction written using Inform 7  along the same lines. Instead of interfacing directly with the model represented in Orbis, I’ve queried Orbis for travel data, and have begun to write a bit of a narrative around it. (One could’ve composed this in Latin, in which case you’d get not just the spatial ideas, but also the language learning too!).

Anyway, I present to you version 0.1, a beta (perhaps ‘alpha’ is more appropriate) for ‘Stranger in These Parts‘, by Shawn Graham. I’m using Playfic to host it. I’d be happy to hear your thoughts. (And a hint to get going: check to see what you’ve got on you, and ‘ask Eros’ about things…)

Obviously, some things are lacking at the moment. I’ll want the player to be able to select different modes of transport sometimes (and thus to skip settings). There’s a point system, but it’s meant more to signal to the students that there is more to find. Depending on which NPCs a student talks with, different kinds of routes should become available. Time passes within the IF, and so night time matters – no travel then.  As far as I know, there’s no such thing as multi-player IF or head-to-head IF, but that’d be fun if it were possible: can you get to Pompeii before your classmates?

In terms of the learning exercise, the students will play through this, and then explore the same territory in Orbis. In the light of their readings and experiences, I’ll be asking them to reflect on the Roman experience of space. Once we’ve done that, now being suitably disabused of 21st century views of how to navigate space, we’ll start looking at the landscape archaeology of other ancient cultures.

That’s the plan, at any rate.

 

Roman Prosperity & Caesar IV

nb. I found this post lurking in a dark nether region of my wordpress dashboard, and it appears I never published it. So here it is!

Having spent a great deal of time in my thesis pondering the mysteries of Roman economics, it is curious to see how a city-builder game like Caesar IV demands many of the same skills – working with cost ratios, determining how much of a particular resource certain kinds of activities consume, distance & profit calculations – see for instance the discussion here and the tables here. Then go and study something like The Baths of Caracalla by Janet DeLaine. It is all strangely similar. I would have done better to have spent a few months playing the game and then looking at my copy of Finley or Hopkins. I’m not saying that the assumptions that underlie the game mechanics are analogous to the actual workings of the Roman economy; I’m saying that the game foregrounds the interconnectedness of production, consumption, taxes and society. I am constantly running out of money & resources as I play the game, which brings a whole new appreciation to the problems of monetary flow in the Roman world.

Augmenting Archaeology

[Originally posted on Play the Past, October 13]

In Matthew Johnson’s excellent ‘Ideas of Landscape‘ (Blackwell, 2006), he talks about the way archaeologists and historians look at landscape. (See Bill Caraher’s blog review piece). Landscape-as-palimpsest has been one of the most powerful metaphors for understanding landscapes and how they are formed. (A palimpsest being a manuscript that has been overwritten numerous times, with the earlier layer being erased more-or-less completely so that the parchment can be reused.) Johnson argues that the metaphor is too strong, in that while it helps us to untangle what we are looking at, it deflects attention away from what this ‘text’, with its grammar and sentences, all actually means (58).

In which case, I wonder if a playful approach to landscape might be useful? Instead of teaching students to ‘read’ landscapes, might ‘playing’ landscapes be better at generating meanings that go deeper? Katie Salen and Eric Zimmerman, in ‘Rules of Play: Game Design Fundamentals’ (MIT, 2004) situate meaningful play as a relationship between “player action and system outcome”, where both are “discernable and integrated into the larger context of the game” (34). Discernability in a game relates to being able to actually perceive whether an action had some sort of outcome; integration means that the action has consequences for later stages of play (35).  I’m suggesting then that the game of reading landscapes would not stop at just reading some aspect of past human activity in the landscape. Rather, it would be framed as part of some game whereby it competes with other readings of the landscape, where the story of the landscape emerges from game play, through some process of physically annotating & crafting competing visions of the palimpsest.

In short, an augmented reality game.

Panoramio augmented reality layer in the Layar iPhone app. Image from CuriousLee’s photostream on Flickr, CC Attribution 2.0.

There are many competing definitions of augmented reality out there, but I think it is simplest to think of it as the imposition of layers of information onto the day-to-day experience of life. A GIS is not an augmented reality; but if you could project a GIS map of census data onto the field of view of a person standing in that street, it would be AR. Similarly, a device projecting Twitter streams about a place, or made in that locale, onto a screen in that place is also a kind of AR. A Terminator-style helmet with a heads-up display is AR; a two-dimensional bar code that, when scanned, loads a wiki page about the scanned object onto your cell phone is also AR. There are, then, many potential ways to realize AR, and most of them no longer require lots of money or specialized equipment.

My putative reading-the-landscape game would be smartphone based. It would involve annotating the world with what you are seeing, with some sort of ratings mechanism to ‘lock in’ readings of the landscape. The higher the ratings for your reading, the more points you get; points can be cashed in to overturn someone else’s readings. I think that would qualify as both discernable and integrated, per Salen and Zimmerman? Reading the landscape then becomes a contested activity, which reflects how landscapes are made in the first place… This is just a first stab at the idea, but I think there’s potential here?
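The rate-lock-overturn loop I have in mind could be sketched like this. This is purely hypothetical Python, not a real implementation; every name and threshold here is invented for illustration of the mechanic:

```python
# Hypothetical sketch of the 'contested readings' mechanic described above.
# All class names, function names, and point values are invented.

class Reading:
    """One player's annotation ('reading') of a spot in the landscape."""
    def __init__(self, player, text):
        self.player = player
        self.text = text
        self.rating = 0        # endorsements from other players
        self.locked = False    # a highly rated reading gets 'locked in'

LOCK_THRESHOLD = 5   # rating needed to lock a reading in
OVERTURN_COST = 10   # points a rival must cash in to overturn it

def rate(reading, points):
    """Another player endorses a reading; its author earns a point."""
    reading.rating += 1
    points[reading.player] = points.get(reading.player, 0) + 1
    if reading.rating >= LOCK_THRESHOLD:
        reading.locked = True

def overturn(reading, challenger, points):
    """Cash in accumulated points to overturn a rival's locked reading."""
    if points.get(challenger, 0) >= OVERTURN_COST:
        points[challenger] -= OVERTURN_COST
        reading.locked = False
        reading.rating = 0
        return True
    return False
```

The interesting design question is the ratio between the lock threshold and the overturn cost: set the cost high and early readings ossify; set it low and no reading of the landscape ever stabilizes.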

Some examples of existing historical/archaeological AR approaches or projects (all of these use the geolocating features of smartphones to ‘know’ roughly what you are looking at, and they display the data overlays on the phone’s camera as a heads-up display):

What AR applications/platforms are you using for your playful approach to the past?

 

Treasure Hunting & Alternate Reality Games for History

As part of my Digital History class, I introduced the students to the concept of alternate reality games. I don’t know of any that exist with the explicit purpose of teaching history, so we looked at some of the standards - I Love Bees, The Beast, Majestic. We looked at the work of Jane McGonigal. All in all, it was a fun couple of sessions. At the end of the last session, I mentioned that while I was in the library, a piece of paper fell out of a book I was reading, and this is what it said:

What could this mean? Points to the group who solved it first!

****

So that was my attempt at using some of the basic conventions of an alternate reality game – the puzzle, the riddles, the treasure-hunting aspect – to teach history. “Is this worldwide?” one student asked. “Safe to say, limited to this city,” I replied.

I figured it would take them a couple of days, if they really tried hard. The first group returned two hours later with the whole thing solved! So what was I trying to do with this? By calling it ‘people are places are people’, I was pointing to the way we name buildings on the campus here. The first clue too was pretty easy, and I figured that when they solved it, it would alert them to the fact that we were looking at the buildings on campus here. To solve the puzzles, they had to perform one of the authentic tasks of the historian, and read closely.  But some groups didn’t read the clues very closely, and were stumped almost from the word go.

Interestingly, one group took my off-hand comment about ‘limited to the city’ to imply that the game would be played all over the city; and the line, ‘people are places are people’ to mean the founder of Ottawa, Lt Col. By. Amazingly, they found locations, statues, historic plaques all over Ottawa’s downtown that *could* be thought of as the answers to my clues… so in interacting with my text, they found items completely unrelated to my intentions – the sort of thing that happens when working with historic documents all the time, after a fashion.

As part of my continuing exploration of 7scenes, I’ve also tried translating it into a smartphone application. If you’re around Carleton, give it a try and let me know what you think. Consider this still as *draft*.

Some of my student feedback:

“[...] It is hard to figure out exactly what the clue means and once you find and solve and reveal the clue, you don’t want to stop.
Personally, I thought this game was really challenging because you didn’t really know where to start with a clue that could deal with generally anything. Once you figured out the clue or were on the right trail, I thought [it] was exciting because it felt more like a race to be the first one to crack the clue. [...]”

“[...] What worked for me about this game was the mystery behind it. It really captured my attention because it had kind of like a secretive aspect to it; it made me want to decode the mystery. The “rabbit hole” of the game is to figure out the first clue first in order to solve the rest of the clues and therefore solve the riddle [...]”

“The procedural rhetoric is I suppose to inspire active as opposed to passive learning about a subject that, while right under our noses, goes overlooked even though it has a lot of history. The rabbit hole was the sheet of paper falling out of a book in the library.

I found that this game touched on every aspect of what makes a good augmented or alternate reality game, but what I found to be most frustrating was the apparent lack of an overarching objective, that is to say something that tied them all together explicitly and not just generally. I feel that the game could have been better if there was a longer back story with the rabbit hole, like a hint dropped about what we should look for based on what you were researching in the library. It was a fun experience over all though, I enjoyed learning what I did and it was nice to be able to do so in a group.”