Importing GIS Data into Unity

Per Stu’s workflow, I wanted to load a DEM of the local area into Unity.

1. Stu’s workflow:

I obtained a DEM from the university library. QGIS would not open the damned thing; folks on Twitter suggested that the header file might be corrupt.

However, I was able to view the DEM using MicroDEM. I exported a grayscale GeoTIFF from MicroDEM. The next step was to import it into Unity. Stu’s workflow is pretty complicated, but in the comment thread, he notes this:

2. Alastair’s workflow:

Alrighty then, gdal. I’d already installed gdal when I updated QGIS. But, I couldn’t seem to get it to work from the command line. Hmmm. Turns out, you’ve got to put things into the path (lord how I loathe environment variables, paths, etc.)

export PATH=/Library/Frameworks/GDAL.framework/Programs:$PATH

Now I can use the command line. Hooray! However, his suggested command for converting from GeoTIFF to the raw heightmap format expected by Unity:

gdal_translate -ot UInt16 -scale -of ENVI -outsize 1025 1025 srtm_36_02_warped_cropped.tif heightmap.raw

(using my own file names, of course) kept giving me ‘too many options’ errors. I examined the help for gdal_translate, and by rearranging the flags in my command to match the order in which they’re listed there,

gdal_translate -ot UInt16 -of ENVI -outsize 1025 1025 -scale localdem.tif heightmap.raw

the issue went away. Poof!

Into Unity3d I went, creating a new project, with a new terrain object, importing the raw heightmap. Nothing. Nada. Rien.

Knowing that with computers, sometimes when you just keep doing things over and over expecting a different result, you actually get a different result:

So, viewed from above, it looked something like my original DEM, though flipped a bit. I’m not too sure how to tie scripts into things, so we’ll let that pass for now. But as I examined the object closely, there were all sorts of jitters and jags and … well, it only looks like terrain from directly above.

A bit more googling, and I found this video:

which seems to imply that interleaving in the RAW format might be to blame (? I dunno). Anyway, I don’t have Photoshop or anything handy on this machine for dealing with raster images. I might just go back to QGIS with the GeoTIFF I made with MicroDEM.
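For what it’s worth, a .raw heightmap is just a headerless stream of grayscale samples, so the bit depth and byte order have to match whatever the importer assumes – and if they don’t, you get exactly this kind of jagged noise that only vaguely resembles terrain. Here’s a minimal Python sketch (entirely synthetic values, nothing to do with my actual DEM) of how a byte-order mismatch scrambles 16-bit heights:

```python
import struct

# A tiny 2x2 "heightmap" of 16-bit samples (0-65535 range).
heights = [0, 16384, 32768, 65535]

# Pack as little-endian unsigned 16-bit -- one common layout for a .raw file.
little = struct.pack("<4H", *heights)

# Reading the same bytes back big-endian silently byte-swaps every sample,
# which is one way a raw import produces jagged garbage instead of terrain.
right = struct.unpack("<4H", little)  # (0, 16384, 32768, 65535)
wrong = struct.unpack(">4H", little)  # (0, 64, 128, 65535)

print(right)
print(wrong)
```

The data isn’t lost in that case – it’s just being interpreted with the wrong byte order, which is why flipping the byte-order option on import (or re-exporting from an image editor) can fix it without touching the DEM itself.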

(I went to install GIMP, saw that you could do it with MacPorts, and I’ve been waiting for the better part of an hour. I should not have done that, I suppose.)

Anyway, the reason for all this – I’m trying to replicate Stu’s embodied GIS concept. The first part of that is to get the target landscape into the Unity engine. Then it gets pushed through Vuforia… (I think. Need to go back and read his book, if I could remember who I let borrow it.)

Update June 9 – Success!

  1. I opened the grayscale image exported from MicroDEM in GIMP.
  2. I resized the image to a power of 2 plus 1 (*shrug* everything indicates this is what you do with Unity); in this case, 1025.
  3. Saved as RAW.
  4. Changed the file extension from .data to .raw.
  5. Created a new 3d terrain object in Unity.
  6. Imported my .raw image.
  7. In the import dialogue, changed the depth to 8-bit rather than 16-bit.
  8. Changed the width, height, x and z to all be 1025. Changed the y to 75 (the original terrain sits somewhere around 60 m above sea level, with the highest point at 135 m; otherwise, I was getting monstrous mountains with the default ‘600’).
  9. This post provided the solution:
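Those GIMP steps could, in principle, be scripted. Here’s a hedged sketch in plain Python (no GIMP or GDAL required) that writes a headerless 8-bit heightmap.raw at the power-of-two-plus-one size – the pixel values are a synthetic gradient, a stand-in for the real MicroDEM export:

```python
# Hypothetical sketch of steps 1-4 above: produce a headerless 8-bit
# grayscale .raw file at a power-of-two-plus-one size (1025 x 1025).
# The values below are synthetic; a real workflow would resample the
# MicroDEM export instead of generating a gradient.

SIZE = 1025  # 2**10 + 1, the "power of 2 plus 1" size Unity expects

# Build one row at a time: each sample is a single byte (8-bit depth).
rows = []
for y in range(SIZE):
    rows.append(bytes((x + y) % 256 for x in range(SIZE)))

data = b"".join(rows)
assert len(data) == SIZE * SIZE  # 8-bit means exactly one byte per sample

# Write with the .raw extension directly, skipping step 4's rename.
with open("heightmap.raw", "wb") as f:
    f.write(data)
```

That file would then be imported as in steps 5–8: a new terrain, 8-bit depth, width/height of 1025.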

I still need to rotate things, add water, add controls, etc. But now I could add my 3d models of the cemetery (which is on the banks of this river), perhaps also Oculus support etc. Talking with Stu a bit more, I see that his embodied GIS is still a bit beyond what I can do (lots of custom scripting), but imagine publishing an excavation this way, especially if ‘Excavation is destruction digitization‘…

Success! That’s the mighty Rideau you’re looking at there.

Fumbling towards Virtuality

With apologies to Sarah.

So the Oculus Rift arrived some time ago. What with conferences and illness, I didn’t really get to play with it until today. I followed all directions, and eventually got the damned thing wired to my 5 yr old Windows 7 machine.

I know, I know.

I have dual monitors. Dual VGA monitors. My box does not have VGA ports. Or rather, it does but they don’t hook to anything (curse you, pimply-faced Best Buy salesman). So, five years ago, I had to hunt high and low for DVI and DisplayPort adaptors. At the time, DVI and DP monitors were more than I had coin for. I tell you this to explain part of this morning’s shenanigans; hooking the Rift up to the HDMI port upset the delicate balance that keeps my monitors working (seriously – there’s a wire loose somewhere, which happened after I had to replace the power supply).

I know, I know.

Anyway, once everything was hooked up, the cool blue light of the Rift’s eyepieces beckoned me to put the thing on. Did I mention I have astigmatism in both eyes?

I know, I know.

Behold – my desktop upside down, and my two monitors no longer in position left and right. They inverted, so all alerts, buttons, windows etc. were in the Rift view. I should mention that when I went to download the SDK & the runtime, my antivirus freaked right out about trojans (thank you, 360 Total Security). False alarm. But the auto-quarantine thing had the effect of buggering up the download, so I had to figure out what was going on there before I could get it all downloaded. Anyway, after much futzing, I got the desktop to display correctly in the Rift, even though it would no longer mirror to my monitors. ‘I can work with this’, I thought.

I know, I know.

When I tried the demo, I started getting all sorts of error messages. After more futzing and googling, I arrived at that point that all of us eventually get to:

…and I reinstalled the bloody SDK, and the runtime. And lo! the demo ran, appearing on my screen. The headtracker appeared to work as well, for as I moved the Rift around, the ceiling of the Tuscan villa would appear on the screen, then the walls, then the floor… except, not within the bloody Rift itself. No, the Rift was showing an orange ‘trouble’ light.

And then the viewer crashed, and the graphics all buggered up, and… and… I blame my graphics card & its software (whether rightly or wrongly, something’s gonna take the blame). It’s an AMD Radeon HD5570, but yeah, something’s up. And I’ve lost the better part of this morning futzing with this.

Things are getting dire.

Why I’m doing this: I want to do something like what these folks are doing, immersive network viz & navigation.

Anyway, it’s probably time to replace my box and when I do, surely most (all?) of my issues will automagically disappear.

(Well, this issue here is probably the culprit and I need to run in extended desktop mode, but still).

Update: I switched it to extended desktop mode; nothing. Back to normal mode. Then hot damn, the thing works! So I am now fully oculus rift’d.

Let’s do something cool.

Of Hockey, Sympathetic Magic, and Digital Dirt

We won tickets to see the Ottawa – Tampa Bay game on Saturday night. 100 level. Row B. This is a big deal for a hockey fan, since those are the kind of tickets that are normally not within your average budget. More to the point of this post, it put us right down at ice level, against the glass.

Against the glass!!!

Normally we watch a hockey game on TV, or from up in the nose-bleeds. From way up there, you can see the play develop, the guy out in the open (“pass! pass! pleeeease pass the puck!” we all shout, from our aerie abode), same way as you see it on the tv.

But down at the glass…. ah. It’s a different scene entirely. There is a tangle of legs, bodies, sticks. It is hectic, confusing. It’s fast! From above, everything unfolds slowly… but at the ice surface you really begin to appreciate how fast these guys move. Two men, skating as fast as they can, each one weighing around 200 pounds, slamming into the boards in the race to get the puck. For the entire first period, I’d duck every time they came close. I’d jump in my chair, sympathetic magic at work as I willed the hit, made the pass, launched the puck.

For three wonderful periods, I was on the ice. I was in the game. I was there.

So…. what does this have to do with Play the Past? It has to do with immersion, and the various kinds that may exist or that games might permit. Like sitting at the glass at the hockey game, an immersive world (whether Azeroth or somewhere else) doesn’t have to put me in the game itself; it’s enough to put me in close proximity, and let that sympathetic magic take over. Cloud my senses; remove the omniscient point of view, and let me feel it viscerally. Make me care, and I’ll be quite happy that I don’t actually have my skates on.

Good enough virtuality is what Ed Castronova called it a few years back, when Second Life was at the top of its hype cycle. But we never even began to approach what that might mean. I think perhaps it is time to revisit those worlds, as the ‘productivity plateau’ may be in sight.

In an earlier post, Ethan asked, where are the serious games in archaeology? My response is, ‘working on it, boss’.  A few years ago, I was very much enamored of the possibilities that Second Life (and other similar worlds/platforms) could offer for public archaeology. I began working on a virtual excavation, where the metaphors of archaeology could be made real, where the participant could remove contexts, measure features, record the data for him or herself (I drew data in from Open Context; I was using Nabonidus for an in-world recording system).  But I switched institutions, the plug was pulled, and it all vanished into the aether (digital curation of digital artefacts is a very real and pressing concern, though not as discussed as it ought to be). I’m now working on reviving those experiments and implementing them in the Web.Alive environment. It’s part of our Virtual Carleton campus, a platform for distance education and other training situations.

My ur-dig for the digital doppelganger comes from a field experience program at a local high school that I helped direct.  I’m taking the context sheets, the plans, the photographs, and working on the problems of digital representation in the 3d environment. We’ve created contexts and layers that can be removed, measured, and planned. Ideally, we hope to learn from this experience the ways in which we can make immersion work. Can we re-excavate? Can we represent how archaeological knowledge is created? What will participants take away from the experience? If all those questions are answered positively, then what kinds of standards would we need to develop, if we turned this into a platform where we could take *any* excavation and procedurally represent it?  I’m releasing students into it towards the start of next month. We’ve only got a prototype up at the moment, so things are still quite rough.

The other part of immersion that sometimes gets forgotten is the part about, what do people do when they’re there? That’s the sympathetic magic, and maybe it’s the missing ingredient from the earlier hype about Second Life. There was nothing to do. In a world where ‘anything is possible‘, you need rules, boundaries, purpose. We sometimes call it gamification, meaningfication, crowdscaffolding, and roleplaying.  Mix it all together, and I don’t think there’s any reason for a virtual world to not be as exciting, as meaningful, as being there with your nose at the glass when Spezza scores.

Or when you uncover something wonderful in the digital dirt. But that’s a post for the future, when my students return from their virtual field season.

(cross-posted at Play the Past)

Blue Mars

The topic of virtual worlds for archaeology and history seems to have hit a bit of a lull in recent months; on the other hand, that could simply be because I haven’t been looking. This morning, in preparing for my talk to the WAGenesis developer community, I came across Blue Mars, an online virtual world whose tools would appear to be more useful than those in Second Life, in that you can import your meshes, grids, etc from common 3d modeling programs. For archaeologists with a lot of 3d CAD reconstructions, this could be quite a boon. You can even import topographic maps into Blue Mars, and so recreate not just the buildings, but the landscapes. Virtual Landscape Archaeology, anyone?

In the film below, the builder has imported the topographic map of Mars…

“It’s the part of Mars that has the four large volcanoes,” he tells me, “roughly what’s in this photo.” Daniel imported NASA satellite imagery of the Tharsis Montes area of Mars where Olympus Mons resides, which is about 2400 x 2400 kilometers, and scaled that down to fit into the Blue Mars terrain map, which is 8 x 8 km. As he explains, “You can import terrain maps (both height and color texture) into a Blue Mars city. They need to be greyscale and color bitmap (.bmp) files respectively, and the correct size… But you can import any real world map as a base…”

Meanwhile, at Ball State, they’re recreating parts of the Panama 1915 World’s Fair in Blue Mars, for explicitly historical immersive education. Stay tuned!

Frischer on Rome Reborn

Podcast featuring Bernard Frischer on the Rome Reborn project

The audio is here.

Frischer mentions some problems he’s had getting materials into Second Life, so he’s been using something called Open Simulator instead. I recall a Ruby-powered tool for pushing AutoCAD or SketchUp models into Second Life (it’s on this blog somewhere 😉) and of course there’s all sorts of work done using commercial game engines to ‘virtualize’ models. With the obvious resources he has, I wonder why those avenues weren’t explored. Anyway, I think it was Troels who once mentioned it – but whatever tool we use for these simulations, we need to be including the ‘shit’ – the horse droppings, the garbage, the people too. Right now, all of these always feel like Pompeii after the tourists go home… In fairness, Frischer notes that that is something they are working on for their virtual Hadrian’s Villa.

Kinda ironic, in a way – Hadrian’s villa being a virtual world when it was built in the first place. A virtual virtual world? We’re getting all recursive… As I’ve argued before, virtual worlds are nothing new in human experience. It’s just the delivery method & fidelity that keeps changing.

Virtual Worlds: and the most powerful graphics engine there is

Virtual worlds are not all about stunning immersive 3d graphics. No, to riff on the old Infocom advertisement, it’s your brain that matters most.  That’s right folks, the text adventure. Long time readers of this blog will know that I have experimented with this kind of immersive virtual world building for archaeological and historical purposes. But, with one thing and another, that all got put on a back shelf.

Today, I discover via Jeremiah McCall’s Historical Simulations / Serious Games in the Classroom site Interactive Fiction (text adventure) games about Viking Sagas – part of Christopher Fee’s English 401 course at Gettysburg College.

Yes, complete interactive fictions about various parts of the Viking world! (see the list below). I’m downloading these to my netbook to play on my next plane journey.

Now, interactive fiction can be quite complex, with interactions and artificial intelligence as compelling as anything generated in 3d – see the work of Emily Short. And while creating immersive 3d can be quite complex and costly in hardware/software, Inform 7 lets you generate an immersive world quite easily (AND as a bonus teaches a lot about effective world building!)

Explore the Sites and Sagas of the Ancient and Medieval North Atlantic through one of the Settings of The Secret of Otter’s Ransom IF Adventure Game: The earliest version of the Otter’s Ransom game was designed to be extremely simple, and to illustrate the pedagogical aims of the project as well as the ease of composing with Inform 7 software: In this iteration the game contains no graphics or links, utilizes very little in the way of software functions, tricks, or “bells and whistles,” and contains a number of rooms in each of sixteen different game settings; as the project progresses, more rooms, objects and situations will be added by the students and instructor of English 401, as well as appropriate “bells and whistles” and relevant links to pertinent multimedia objects from the Medieval North Atlantic project.

Using simple, plain English commands such as “go east,” “take spear-head,” “look at sign” and “open door” to navigate, the player may move through each game setting; moreover, as a by-product of playing the game successfully, a player concurrently may learn a great deal about a number of specific historical sites, as well as about such overarching themes as the history of Viking raids on monasteries, the character of several of the main Norse gods, and the volatile mix of paganism and Christianity in Viking Britain. The earliest form of the game is open-ended in each of the sixteen settings, but eventually the complete “meta-game” of The Secret of Otter’s Ransom will end when the player gathers the necessary magical knowledge to break an ancient curse, which concurrently will require that player to piece together enough historical and cultural information to pass an exit quiz.

Play all-text versions of the site games from The Secret of Otter’s Ransom using the Frotz game-playing software.

Play versions of the site games which include relevant images using the Windows Glulxe game-playing software.

In order to view images the player must “take” them, as in “take inscription;” very large images may come up as “[MORE]” which indicates that text will scroll off the screen when the image is displayed. Simply hit the return key once or twice and the image will be displayed.

We hope that you will enjoy engaging in adventure-style exploration of Viking sites and objects from the Ancient and Medieval North Atlantic!

Start by saving one of the following modules onto your desktop; next click the above game-playing software. When you try to open the Frotz software (you may have to click “Run” twice) your computer will ask you to select which game you’d like to play; simply select the module on your desktop to begin your adventure; you may have to search for “All Files.” Each game setting includes a short paragraph describing tips, traps, and techniques of playing:

Andreas Ragnarok Cross

Balladoole Ship Burial

Braaid Farmstead

Broch of Gurness

Brough of Birsay Settlement

Brussels Cross

Chesters Roman Fort

Cronk ny Merriu Fortlet

Cunningsburgh Quarry

Helgafell Settlement

Hvamm Settlement

Hadrian’s Wall

Jarlshof Settlement

Knock y Doonee Ship Burial

Laugar Hot Spring

Lindisfarne Priory

Maes Howe Chambered Cairn

Maughold – Go for a Wild Ride

Maughold- Look for the Sign of the Boar’s Head

Maughold – The Secret of the Otter Stone

Mousa Broch

Ring of Brodgar

Rushen Abbey Christian Lady

Ruthwell Cross

Shetland Magical Adventure

Skara Brae

Stones of Stenness

Sullom Voe Portage

Tap O’Noth Hillfort

Temple of Mithras at Carrawburgh

Ting Wall Holm Assembly Place

Tynwald Assembly Place

Yell Boat Burial

Second Life as an Archaeological Tool: Ruth Tringham

A podcast with Ruth Tringham on her work on Okapi Island: listen here; a transcript excerpt follows.

Kevin Ammons: Welcome to the Preservation Technology podcast. I am Kevin Ammons. Today I am visiting with Ruth Tringham, one of the founders of the Multimedia Authoring Center for Teaching in Anthropology (MACTiA) at the University of California, Berkeley. As a professor of anthropology at Berkeley, Ruth uses an online virtual environment called Second Life in her teaching.

Kevin Ammons: Welcome, Ruth! How did you find yourself at Berkeley exploring the notion of Second Life as an archeological tool?

Çatalhöyük

Ruth Tringham: Well, it sort of developed out of my work with digital forms of visualization – things like multimedia and 3D modeling – of neolithic archaeological sites in southeast Europe and in Anatolia, more recently with Çatalhöyük. I actually didn’t know anything about Second Life. It must have been in the early 2000s, because I had been doing this visualization and multimedia stuff all through the 90’s – at least the last part of the 90’s. But then I was working with this digital technologist – I suppose that’s not really what he is; he is somebody who worked with museums and digital technology – called Noah Whitman. He started working with us on a project called Remixing Çatalhöyük, and I can tell you about that a little later, but while we were working on that, which was really a method of sharing our Çatalhöyük media database with the public, he introduced me to Second Life. He said, “Have you seen this? You might be interested in this.”


Rebuilding Catalhoyuk

On my reading list:

Colleen Morgan, Rebuilding Catalhoyuk (full text)

Building virtual models of archaeological sites has been seen as a legitimate mode of representing the past, yet these models are too often the end product of a process in which archaeologists have relatively limited engagement. Instead of building static, isolated, uncanny, and authorless reconstructions, I argue for a more active role for archaeologists in virtual reconstruction and address issues of representational accuracy, personal expression in avatars and peopling the virtual past. Interactive virtual worlds such as Second Life provide tools and an environment that archaeologists can use to challenge static modes of representation and increase access to non-expert participants and audiences. The virtual model of Catalhoyuk in Second Life is discussed as an ongoing, multivocal experiment in building, re-building, and representing the past and present realities of the physical site.

Digital Media and Learning Competition, HASTAC, archaeological entries

Some archaeological entries in this year’s competition:

The heritage sites of the Mississippi Delta are important cultural monuments. This project brings three key Arkansas heritage sites into Second Life, allowing direct access to those sites for students and the general public. This virtual learning platform will be designed to allow a direct engagement with historic material.

The Children’s Museum of Indianapolis is planning a new exhibit called Treasures of the Earth. The goal is to create an adventure in archaeology featuring three major archeological discoveries and a lab where families can use technology to learn about science and uncover clues to the past.

Dive a hundred feet below sea level and take a voyage back hundreds of years in a virtual simulation game to learn how scientific archaeological methods are used to survey, explore, excavate and interpret submerged cultural resources.

Stone Mirror introduces archaeology via participation in a 3-D “virtual dig” of Çatalhöyük, Central Anatolia (southern Turkey). Based on Swigart’s Stone Mirror: A Novel of the Neolithic, students experience both past and present to create a “path of inference” from discovering objects to creating narratives describing their historical meaning.

The goal is to create a system of virtual collaborative environments able to teach how to virtually reconstruct ancient worlds in 3D, involving a community of young users. The system is based on the following archaeological case studies: Roman imperial Villas, ancient Chinese tombs and Mayan sites.