Hearing the Past

What follows is our draft chapter for 'Seeing the Past', a colloquium hosted by Kevin Kee at Brock University. The chapter will eventually be published in 'Seeing the Past: Augmented Reality and Computer Vision in History': http://kevinkee.ca/seeing-the-past/book-abstract/

Comments welcome.

Hearing the Past – S Graham, S Eve, C Morgan, A Pantos

This volume is about seeing the past. But 'to see' does not necessarily imply vision. To see something can also mean to understand it: in this sense, we frequently 'see' things that are not in front of our eyes. 'I see your point,' we say, or 'I see what you're saying.' 'I hear you,' we sometimes say, meaning the same thing: I understand.

In which case, how should we "see" the past? You cannot see the past; you can only see the present. You might believe that something you are looking at is 'from' the past, but it still lives in the here-and-now. There is thus always a cognitive load, a 'break in presence' [Turner, 2007], in which awkward details interrupt what we are seeing. This is why we talk of the historical imagination, or the archaeological eye. To understand the past through augmented reality might not require vision at all. Yet the majority of augmented reality apps currently available privilege the visual, overlaying reconstructions or text on an image of the present through a keyhole: the viewport offered by our small screens. The clumsiness of these interfaces, with their clunky, low-resolution 2D overlays, creates further breaks in presence.

In short, they do not help us see the past – to understand it – in any meaningful way.

In this chapter, we suggest that 'hearing' the past is a more effective and affective way of providing immersive augmented reality. We argue on cognitive and perceptual grounds that audio – spoken word, soundscapes, acoustic horizons and spaces, and spatialized audio – should be a serious area of inquiry for historians exploring the possibilities of new media to create effective immersive augmented reality. We explore some of the phenomenology of archaeological landscapes and the idea of an 'embodied GIS' [Eve, 2014] as a platform for delivering an acoustic augmented reality. Drawing on Phil Turner's work on 'presence' in an artificial environment [Turner, 2007], we explore 'breaks' in presence that occur in augmented, mixed, and virtual environments. The key idea is that presence is created via a series of relationships between humans and objects, and that these relationships form affordances. When these relationships are broken, presence and immersion are lost. We argue that because the sense of hearing depends on attention, audio AR is particularly effective at maintaining what Turner calls 'affective' and 'cognitive/perceptual' intentionality. In short, the past can be 'heard' more easily than it can be 'seen'. We explore three case studies that offer possible routes forward for an augmented historical audio reality.

‘Eh? Can you speak up?’ The sense of hearing

The sense of hearing, and the active cognition that hearing requires, has not been studied to the same degree or in the same depth as the visual [Baldwin, 2012: 3]. Hearing – and understanding – is also a tactile, haptic experience. Sound waves actually touch us. They move the eardrum and the tiny bones of the middle ear, and the hair cells of the inner ear turn these vibrations into the electro-chemical pulses that light up the various parts of our brain. Sound is a kind of tele-haptic:

“…the initial stage of hearing operates as a mechanical process. Later the mechanical energy of sound converts to hydraulic energy as the fluids play a larger vibratory role. Thus at its source, touch operates with and causes sound, and it is only through touch-at-a-distance that we have sound at all. The famous story of Edison’s ears bleeding from his aural experiments makes visceral this tele-touch, which is not always a gentle stroke, no matter how pleasant the sounds, voice or music we might encounter.” [Bishop 2011, 25-6]

But intentional hearing – listening – requires attention. Consider: in the crowded foyer of a cinema, it can be quite difficult to make out what the person opposite is saying. You have to pay attention; the act is tiring. One can try to read lips, matching visual cues with auditory cues. In the quiet of a classroom, with the teacher's back turned, the teacher can hear the surreptitious whisper that, while much quieter, speaks volumes. When the original audio signal is poor, hearing, unlike sight, requires attention that divides our ability to make semantic or emotional sense of what is being said, or even to remember quite what was said [Baldwin, 2012: 6]. What's more, our brain is processing the spatial organization of the sound (near sounds, far sounds, sounds that move from left to right) – how it is being said, not just what is being said [Baldwin, 2012: 6].

Bishop goes on to argue that touch and vision are senses that can only know the surface; sound waves transcend surfaces, causing them to vibrate, to amplify (but also to muffle). And so,

“Sound provides the means to access invisible, unseeable, untouchable interiors. If we consider the import of vision to the general sensorium and metaphorization of knowledge, then the general figurative language of “insight” runs counter to surface vs. deep understanding of the world. Sound, it would seem, not vision or touch, would lead us to the more desired deep understanding of an object or text.” [Bishop, 2011, 26]

Sound permeates and transgresses surfaces; sound gives access to the unseen. Bishop is discussing Karlheinz Stockhausen's 'Helicopter String Quartet'. He goes on to argue that the piece exposes the ways that sound and touch blur (and "slur") into a kind of synaesthesia, which defies the 'assumed neatness of the sensorium' [Bishop, 2011, 28]. With our western rationality, we assume that the senses neatly cleave. With our western focus on the visual, we prioritize one way of knowing over the others. Chris Gosden, in an introductory piece to an issue of World Archaeology on the senses in the past, argues that our western 'sensorium' (what we think of as the five senses) influences and conditions how we understand material culture. He advocates unlearning and unpacking the privileged position of sight [Gosden, 166], what others have called 'ocularcentrism' [Thomas 2008].

The effect of structured sound (let's call it 'music') on movement is another area where the haptic qualities of sound may be perceived. There are aspects of music that seem to translate into movement 'primitives'. A recent study explored the relationship of musical structure (dynamics, harmony, timbre) to guided walking (mobile tours) [Hazzard et al, 2014]. The authors note that a focus on structure in music sits between the thematic (where the emotional content of the music is manipulated) and the sonic (which is associated with spatial cues). They wondered which aspects of structure would be perceived by their non-musically-trained study subjects (western undergraduates at an Anglophone university) and how the subjects would translate these into movement. The subjects listened to four distinct compositions, each designed to emphasize one aspect of musical structure, as they moved around an open field. The subjects were observed and interviewed afterwards. Why did they stop in certain places? Why did they back-track, or move in a spiral?

The authors found that silence in the music was often interpreted as signalling a time to stop, while crescendi (a rising intensity in the music) impelled movement forward; a diminuendo (a lessening) did not imply movement away, but rather signalled the impending end of movement altogether. Musical punctuation caused listeners to try to understand the significance of the particular spot they were standing on. Timbre 'coloured' different areas. 'Harmonic resolution' signalled 'arrival' [Hazzard et al, 2014: 609-613]. As will be seen in our case studies, this interplay of silence and crescendo can also be a powerful affective tool to convey the density or paucity of historical information in an area.

Sound requires cognition to make sense; there is nothing ‘natural’ about understanding the sounds that reach our ears. This act of attentiveness can elide other breaks in presence. Sound is tactile. It engages pathways in the brain similar to those involved with processing visual imagery.

Culture & Soundscape

'As a little red-headed Métis kid, it never occurred to me that the city could sound different to anyone else.' [Todd, 2014] Zoe Todd recently wrote a moving piece in Spacing on 'Creating citizen spaces through Indigenous soundscapes', in which she describes, amongst other things, the profound effect of a flash mob occupying the West Edmonton Mall's replica of the Santa Maria, Columbus' flagship. "The sounds of Indigenous music, language and drumming soaring high up into the mall's glass ceiling was a revelation: decolonization of our cities is not merely a physical endeavor, but also an aural one." [Todd, 2014]

Work on the cognitive basis of memory has shown that, rather than being like a filing cabinet from which we retrieve a memory, the act of recollection actively re-writes the memory in the present: our memories are as much about our present selves as they are about the past. Thus, cognitive scientists working in the field of post-traumatic stress disorder are finding that they can reprogram the emotional content of traumatic memories by altering the contexts within which those memories are recalled. Sound plays a central role in all of this [see Hall's 2013 review article on the state of this research: http://www.technologyreview.com/featuredstory/515981/repairing-bad-memories/].

Soundscapes affect us profoundly, and as Todd demonstrates, can be used to radically reprogram, repatriate, decolonize, and contest spaces. Work on the cognitive foundations of memory suggests that sound can literally re-wire our brains and our understanding of memory. Tim Ingold talks about the ‘meshworks’ of interrelationships that create spaces and bind them in time [ref]. Can soundscapes help us ‘visualize’ the past, or at least, surface different patterns in the meshwork? Can we reprogram collective memories of place with sound?

The soundscape has been explored in a historical context by a number of scholars, and in particular amongst archaeologists as the study of archaeoacoustics. Most work on archaeoacoustics has explored the properties of enclosed spaces [see Blesser & Salter 2007] such as caves [Reznikoff 2008], theatres [Lisa et al. 2004] and churches [Fausti et al. 2003]. For an excellent review of the increasingly extensive literature, see Mills [2010]. Mlekuz, for example, has investigated the soundscape of church bells in an area of Slovenia. He takes Schafer's [1977] definition of the soundscape, which sets it in direct opposition to an acoustic space: where an acoustic space is the profile of a sound over a landscape, the soundscape is a sonic environment, with the emphasis placed on the way it is perceived and understood by the listener [Mlekuz 2004, para.2.2.1]. This clear distinction between the mechanics and properties of the sound (its acoustic nature) and the affect it has on the listener (the soundscape) fits perfectly with Turner's idea of the Arc of Intentionality. While we may be able to recreate the sounds of the historical past, we may not be able to recreate how those sounds came together to create the soundscape of a person existing in that past. The soundscape is a combination of the acoustic properties of sound, space, and the individual. However, the acoustic nature of historical sounds will still affect us as human beings and will evoke some kind of emotional/affective response – even if it could be argued that this response is not 'historically authentic'.

The next question to ask, then, is this: if sounds, music and voices from the past can affect us in certain ways, how do we deliver those sounds using Augmented Reality, to enable an in-situ experience?

Aural Augmented Reality

Audio tours – handheld devices rented or borrowed from a museum to guide a visitor through an exhibition – are a staple of many museums and heritage sites. The audio tour has been in use since the 1950s [see http://www.acoustiguide.com/ and http://www.duvaws.com/company/profile]. Once bulky devices that had to be curated and maintained by the museum or heritage site, audio tours are quickly taking advantage of the smartphone age, with tours released as downloadable apps or podcasts. This is democratizing the audio tour, allowing new and alternative tours of museums and cities to be released and followed, and potentially undermining the 'truth' of the official tour. While we certainly do not deny that the humble audio tour is a form of Aural Augmented Reality, experienced in-situ and influencing the way the user experiences a space, such tours serve as a narrative-led experience of a space (much as a tour guide in book form would) and do not often explore the haptic or more immersive properties of AAR.

Some applications have taken the idea of the audio guide further, such as the SoundWalk project [http://soundwalk.com/], which offers alternative tours of the Louvre with a Da Vinci Code theme, or walking tours of the Hassidic areas of Williamsburg narrated by famous actors and members of the community. What makes the SoundWalk tours different is that they are GPS-powered, and so specific to the place (for instance, you are told to open specific doors when they are in front of you, or to look left or right to see individual features). They are also produced with a very high quality of narration, sound-recording and music/sound effects. In addition, they play with the notion of the listener melding with the narrator: "…ok, for today you are Joseph, that's my Hebrew name, that's my Jewish name and that's your name, for today we are one." [extract from the Williamsburg Men Hassidic tour http://www.soundwalk.com/#/TOURS/williamsburgmen/]. The SoundWalk tours attempt to create a feeling of immersion by giving an effectively 'high-resolution' aural experience; the acting, sound effects, music and beguiling narrative all come together to allow you to lose yourself in the experience, following the voice in your head.

An application that also uses the immersive aspect of storytelling to good effect is the fitness app 'Zombies, Run!' [https://www.zombiesrungame.com/]. The app is designed to aid a fitness regime by making running training more interesting. When you log into the app, you take on the role of 'Runner 5', a soon-to-be-hero who is going to save the world from the Zombie Apocalypse. The app uses your GPS location and compass to direct you on a run around your local neighbourhood, but all the time you are being pursued by virtual zombies. Go too slowly and the sounds of the zombies will catch up with you, their ragged breath chasing you around the park. As part of the run you can also collect virtual medical supplies or water bottles – indicated to you by the in-game voice – that all help to stave off the Apocalypse. By using the very visceral sounds of a pursuer getting closer, combined with the affective power of physically being out of breath, tired and aching, the run becomes an immersive experience: you are not just trying to better your time, you are escaping zombies and trying to save the world. This app works so well mainly because you don't have to look at the screen; the suspense of the situation is created mainly through sound [see Perron 2004].

Three Archaeological/Historical Aural Augmented Reality Case Studies

The examples of AAR applications provided so far were not specifically created with an ear to exploring and experimenting with historical sounds or soundscapes. Instead, they provide an immersive narrative (audio tours) or gamify a journey through an alternate present (Zombies, Run!). Historians and archaeologists are currently experimenting with the technology not just as a means to tell a story, but as a way to let the user 'feel' the sounds and be affected by what they are hearing. Each of the applications below eschews any kind of visual interface, concentrating instead on the power of sound to direct, affect and allow alternate interpretations. The case studies are prototype applications, proofs-of-concept rather than fully-fledged applications with many users. However, even these experimental models demonstrate the potential benefits of hearing the past.

Using Aural Augmented Reality to explore Bronze Age Roundhouses

As part of his research using the embodied GIS to explore a Bronze Age settlement on Bodmin Moor, Cornwall, United Kingdom, Stuart Eve used a form of Aural Augmented Reality to aid navigation and immersion in the landscape [Eve 2014]. Using the Unity3D gaming engine (which can spatialize sound), Eve created a number of 3D audio sources that corresponded to the locations of the settlement's houses. As the resulting app was geo-located, the user could walk around the settlement in-situ and hear the augmented sounds of the houses (indistinct voices, laughing, babies crying, etc.) getting louder or quieter as they moved towards or away from each sound source. The houses in the modern landscape are barely visible on the ground as circles of stones and rocks, making it hard to discern where each house is. Eve then introduced virtual models of the houses to act as audio occlusion layers, simulating the effect of the house walls and roofs in dampening the sounds coming from within – and only allowing unoccluded sound to emit from the doorways:

“At first, the occlusion of the sounds by the mass of the houses was a little disconcerting, as [visually] the buildings themselves do not exist. However, the sounds came to act as auditory markers as to where the doorways of the houses are. This then became a new and unexpected way of exploring the site. Rather than just looking at the remains of the houses and attempting to discern the doorway locations from looking at the in situ stones, I was able to walk around the houses and hear when the sounds got louder – which indicated the location of the doorway” [Eve 2014:114]

Eve then goes on to suggest that by modelling sound sources and relating them to the archaeological evidence, questions can be asked about the usage of the site and explored in situ. For instance, if some of the houses were used for rituals (as the archaeological evidence indicates), what sort of sounds might those rituals make, and how would the sound permeate across the settlement? More prosaically, if animals were kept in a certain area within the settlement, how would their sounds affect the inhabitants? How far could people communicate across the settlement area using calls or shouts?
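
Eve's app did this inside the Unity3D engine; purely to illustrate the underlying logic (this is not Eve's code, and every name and number here is invented), a minimal Python sketch of distance-based attenuation from a GPS fix, with a crude occlusion factor, might look like this:

import math

def haversine_m(lat1, lon1, lat2, lon2):
    # great-circle distance in metres between two GPS fixes
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def volume(listener, source, max_range=50.0, occluded=False):
    # linear falloff to silence at max_range; damp sources heard through a wall
    d = haversine_m(listener[0], listener[1], source[0], source[1])
    v = max(0.0, 1.0 - d / max_range)
    return v * 0.2 if occluded else v  # assume a wall passes ~20% of the sound

# a listener a couple of dozen metres from a roundhouse, heard through its wall
print(volume((50.5470, -4.6120), (50.5468, -4.6121), occluded=True))

Unity3D handles attenuation curves and occlusion for you; the point of the sketch is only that an audio AR app reduces to geometry plus a volume function.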

Eve's use of AAR to ask archaeological questions of a landscape highlights the exploratory power of an Augmented Reality approach. A different application, Historical Friction, explores the power of AAR to inform us about our surroundings and to make us question what is beneath our feet.

Historical Friction

'Historical Friction' was directly inspired by the work of Ed Summers (of the Maryland Institute for Technology in the Humanities), filtered through the example of 'Zombies, Run!'. Summers programmed a web app called 'Ici', French for 'here'. Ici uses the browser's native ability to 'know' where it is in space to search for and return all of the Wikipedia articles geotagged within a radius of that location [http://inkdroid.org/ici/]. In its current iteration, it returns the articles as points on a map, with the status of each article (stub, 'needs citations', 'needs image', etc.) indicated. In its original form, it returned a list with a brief synopsis of each article. Summers' intent was that the app could work as a call-to-action, encouraging users to expand Wikipedia's coverage of the area.
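
The query at the heart of Ici can be approximated directly against the MediaWiki API. A minimal sketch (not Summers' code; the coordinates and radius are arbitrary):

import requests

params = {
    "action": "query",
    "list": "geosearch",
    "gscoord": "45.4215|-75.6972",  # lat|lon - arbitrary: downtown Ottawa
    "gsradius": 1000,               # metres
    "gslimit": 50,
    "format": "json",
}
r = requests.get("https://en.wikipedia.org/w/api.php", params=params)
for page in r.json()["query"]["geosearch"]:
    print(page["title"], round(page["dist"]), "m away")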

Visually, it can be impressive to see the dots-on-the-map as an indication of the 'thickness' of the digital annotations of our physical world. Initially, we wanted to make that 'thickness' literal – to make it actually physically difficult to move through places dense with historical information – by exploiting the haptic nature of sound.

We tried to make it painful: to increase the noise and discord so that the user would be forced to stop in their tracks, take the headphones off, and look at the area with new eyes. Initially, we took the output from 'Ici' and fed it through a musical generator called 'Musical Algorithms'. The idea was that the resulting 'music' would be an acoustic soundscape of quiet/loud, pleasant/harsh as one moved through space – a kind of cost surface, a slope. Would it push the user from noisy areas to quiet areas? Would the user discover places they hadn't known about? Would the quiet places begin to fill up as people discovered them? As we iterated, we switched to a text-to-speech algorithm. As 'Ici' loads the pages, the text-to-speech algorithm whispers the abstracts of the wiki articles, all at once, at slightly different speeds and tones. 'Historical Friction' may be found at https://github.com/shawngraham/historicalfriction.
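
Historical Friction itself runs in the browser, but the overlapping-whispers idea can be sketched in Python. Here pyttsx3 stands in for the browser's speech synthesis, and the abstracts are placeholders:

import multiprocessing
import pyttsx3

def whisper(text, rate):
    engine = pyttsx3.init()
    engine.setProperty("rate", rate)   # vary the speed of each voice
    engine.setProperty("volume", 0.5)  # keep each voice at a murmur
    engine.say(text)
    engine.runAndWait()

if __name__ == "__main__":
    abstracts = ["First wiki abstract...", "Second...", "Third..."]
    for i, text in enumerate(abstracts):
        # one process per voice, each slightly faster than the last
        multiprocessing.Process(target=whisper, args=(text, 120 + 15 * i)).start()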

Historical Friction deliberately plays with the idea of creating a break in presence – a cacophony of voices that haptically forces the user to stop in her tracks – as a way of focussing attention on those areas that are thick and thin with digital annotations about the history of a place.

Voices Recognition

During the inaugural York University 'Heritage Jam', an annual cultural heritage 'hack-fest', a group of archaeologists/artists/coders took the Historical Friction application as inspiration and created an AAR app called Voices Recognition.

“Voices Recognition is an app designed to augment one’s interaction with York Cemetery, its spaces and visible features, by giving a voice to the invisible features that represent the primary reason for the cemetery’s existence: accommodation of the bodies buried underground” [Eve, Hoffman, et al., 2014].

This is achieved with a smartphone-based app that again uses GPS and compass to geo-locate the user within the cemetery. Each of the graves in the cemetery is also geo-located and attached to a database of online census data, burial records and available biographies of the persons buried within the cemetery. The app then plays the contents of this database for every grave within 10m of the user. In the example application the data are voiced by actors; in the full application these will likely be computer-generated voices (due to the sheer amount of data attached and the number of graves in the cemetery). (A video of the app in action may be viewed at http://www.youtube.com/watch?v=wAdbynt4gyw.) The net result is in places a deafening cacophony of voices (especially in the areas of the mass graves) and in other places single stories being told. The unmarked mass burials literally shout and clamour to be heard, whereas the grandiose individual monuments whisper their single stories. The usual experience of a cemetery is completely inverted [Eve, Hoffman, et al., 2014].
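
The 10m trigger radius amounts to a simple spatial filter. Reusing the haversine_m helper from the Bodmin Moor sketch above (the grave records here are invented):

graves = [
    {"voice": "single-story.mp3", "lat": 53.97205, "lon": -1.09296},
    {"voice": "mass-grave-chorus.mp3", "lat": 53.97223, "lon": -1.09284},
]
user = (53.97210, -1.09292)  # where the GPS says the visitor is standing
for grave in graves:
    if haversine_m(user[0], user[1], grave["lat"], grave["lon"]) <= 10.0:
        print("play", grave["voice"])  # only the nearby dead speak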

The Voices Recognition app uses augmented audio to represent abstract data in a visceral and tactile way. The subject matter of the app – the deceased – is perhaps an extreme example of information that could have a strong emotional impact on visitors. If such an app were made live, careful thought would be required about the appropriate presentation and distribution of the material, suited to its cultural context, to avoid unnecessary upset. However, the concept highlights the opportunity to relate to a cultural location at a much closer and more personal level through audio than can be achieved through the more 'removed' visual overlay. The Voices Recognition project, like the SoundWalk project described earlier, highlights the power of sound not just as a way of exploring dense historical data, but also of presenting it in an engaging and unusual way. As its creators state, the app is part pedagogical tool and part artistic soundscape. It uses overlapping voices as a representation akin to a 'heat-map' of the clustered data, because "it's eminently possible to render delicate distinctions between layers/concentrations, and [for] the human ear to identify them more distinctly than they can colour, light or smell" [Eve, Hoffman, et al., 2014].

Building an Aural, Haptic, Augmented Reality to Hear the Past

In a guest lecture to a digital history class at Carleton University in the Fall 2014 semester, Colleen Morgan recounted her experience with the ‘Voices Recognition’ app when it was being tested: ‘Voices, in the cemetery, was certainly the most powerful augmented reality I’ve experienced’.

Building a convincing visual AR experience that does not cause any breaks in presence is the holy grail of Augmented Reality studies, and something that is virtually impossible to achieve. A break in presence will occur due to the mediation of the experience through a device (head-mounted display, tablet computer, smartphone, etc.); the quality of the rendering of the virtual objects; the level of latency in the software that delivers the experience to the eyes. The list is endless and scale-less: once you 'solve' one break in presence, another occurs. The goal, then, can never be to eliminate breaks in presence completely, but instead to recognise them and treat them with an historian's caution. Indeed, we can use their inevitability deliberately, to underline the broader historical points we wish to make. For example, the use of artificial crescendo and diminuendo (as in the Historical Friction and Voices Recognition applications) arrests users, making them stop and consider why the sounds are getting louder or quieter. By inserting prehistoric sounds into the modern landscape, Eve creates an anachronistic environment. This is a clear break in presence, as that sound would never otherwise be heard in the present. However, the alien nature of that particular sound in that landscape jars our cognitive intentional state and prompts us to examine what that sound might be and why it might have been placed in that particular location.

In this way, the case studies presented show that AAR does not always have to be a 'recreation' or a fully immersive experience. Instead, much as we treat the written word as the result of a process of bias and production, we should treat any augmented reality experience as the result of a process of bias (what is represented), production (the quality of the experience) and delivery (the way in which it is delivered). Hearing the past requires that we pay attention not just to effect but also to affect, and in so doing, it prompts the kind of historical thinking that we should wish to see in the world.

Text Analysis of the Grand Jury Documents

a topic in the grand jury documents, #ferguson

I watched Twitter and the CBC while the prosecutor was reading his statement. I watched the live feeds from Ferguson, and other cities around the US. Back in August, when this all first began, I was glued to my computer, several feeds going at once.

A spectator.

Yesterday, Mitch Fraas put the grand jury documents (transcripts of the statements, the proceedings) into Voyant Tools.

These ultimately came from here: http://apps.stlpublicradio.org/ferguson-project/evidence.html

So today, I began, in a small way, to try to make sense of it all, the only way that I can. Text analysis.

Here’s the Voyant Tools corpus

Not having read the full corpus closely (this is, of course, a *distant* tool), it certainly looks as if the focus was on working out what Brown was doing, rather than Wilson…

I started topic modeling, using R & MALLET.

and I put everything up on github

but then I felt that I could improve the analysis; I created one concatenated file, then broke it into 1000-line chunks. The latest inputs, outputs, and scripts are all on my github page.
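
The scripts on github do the heavy lifting in R; the chunking step itself is trivial in any language. A Python equivalent, for illustration (the filename is hypothetical):

from pathlib import Path

lines = Path("grand-jury-concatenated.txt").read_text(encoding="utf-8").splitlines()
out = Path("chunks")
out.mkdir(exist_ok=True)
for i in range(0, len(lines), 1000):
    # one file per 1000-line chunk, zero-padded so MALLET reads them in order
    chunk = "\n".join(lines[i:i + 1000])
    (out / f"chunk-{i // 1000:03}.txt").write_text(chunk, encoding="utf-8")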

The most haunting…

And all 100 topics…

None of this counts as analysis. But by putting it all together, my hope is that more people will grab the text files, grab the R script, explore the Voyant corpus, and really put this all under the microscope. I was tremendously affected by Bethany's latest blog post, 'All at once', which discusses her own reaction to recent news in both Ferguson and UVa, and elsewhere. It was this bit at the end that really resonated:

[…] we need analytical and interpretive platforms, too, that help us embrace our own subjective positioning in the systems in which we labor–which means, inevitably, to embrace our own complicity and culpability in them. And we need these, at the same time, to help us see beyond: to see patterns and trends, to read close and distantly all at once, to know how to act and what to do next. We need platforms that help us understand the workings of the cogs, of which we are one.

So here’s my small contribution. Maybe this can be a platform for someone to do a deeper analysis, to get started with text analysis, to read distantly and closely, to see beyond, and to understand what happened during the Grand Jury.

I’m no MacGyver

I’m no MacGyver. Tim the Tool Man? Bill Nye, Science Guy? Hell, I’m nowhere near Heinz Doofenshmirtz. Or Phineas. I’d kill to be Ferb.

Wile E. Coyote? Brain? Possibly Pinky.

I’m not handy. But I thought I could do Google Cardboard. Print out the template. Glue it to a sheet of cardboard. Cut. Fold. VR!

Tab A certainly doesn't fit into Slot B. And how does the eyepiece-crossbrace thingy work? A Pampers box is admittedly probably too thick for this. Sheesh. Google, go look at Ikea instructions; they are masters of the art.

As for me, I’m going back to the warm embrace of acoustic augmented reality.

Visual – meh.

On Academic Blogging – a Conversation with Matt Burton

Papyrus, Wikimedia Commons, http://bit.ly/1DkaNWG

Matt Burton, who is working on new web genres and informal scholarly communication, asked me some questions recently as part of his research. We thought it would be interesting to share our conversation.

MB: When did you start your blog (career-wise: as a grad student, undergrad, etc)?

I recently pulled my entire blog archive into github, as part of my open-notebook workflow. (http://shawngraham.github.io/open-notebook/ll_CC/#!pages/uploads/blogarchive/posts/contents.md)

I see there that I posted my first post on Dec 18, 2006. I was, at the time, working in what would now be recognized as alt-ac: doing contract research for Kevin Kee at Brock U, as well as freelance heritage consulting, some online teaching, and substitute teaching at the local high school. This was after my post-doc, nearly four years to the day after I won my PhD.

MB: Why did you decide to start blogging?

Earlier in the year I had won a spot at the first digital humanities workshop in Lincoln, Nebraska. John Bonnett of Brock, whom I'd met at CAA 2006 in Fargo, saw the advertisement and forwarded it to me. (John was an early champion of my work in Canada, and I'm eternally grateful for that!) There I met folks like Alan Liu, Katharine Walter, William Thomas, and Stephen Ramsay. I didn't appreciate it then, but that was the seminal moment for me. At the workshop, where I presented my work on agent-based modelling of Roman social structures, I distinctly remember Alan saying, 'you've got a nice static website; have you thought about blogging?' Thereupon the room began discussing how a blog for my work might, well, work. My postdoc ended that September, and once I was out of the warm embrace of academia, I decided 'what the hell; what am I afraid of?' and started blogging. I posted three times that day, along with a statement of why I was blogging. I framed it as a record of my explorations in virtual worlds.

Even then, it was a kind of open notebook. Kevin, the other major supporter of my work in those early days, let me count the writing of blog posts towards the more general research goals of the projects he was employing me on. We expect projects these days to blog, but in those days, I think it was still fairly novel. I wasn’t even blogging about the main project, just the side roads and blind alleys I was stumbling around.

MB: How do you host your blog? I.e., do you use a generic web-host like Dreamhost with WordPress, or do you use a blogging service like Blogger.com?

I'm using plain old wordpress.com, though I did invest in buying a domain name. Initially, I'd called the blog 'Electric Archaeology', but in the wordpress.com domain I'd called it electricarchaeologist.wordpress, which was, well, confusing and annoying. I host my course blogs with Dreamhost, which over the years has gotten clunkier, it seems. But that's just an impression.

MB: How did you learn to set up your blog?

I spent an inordinate amount of time farting around with the settings, themes, etc. At one point I was the tech support for an online liberal arts college start-up; because I'd pressed the button on a one-click Dreamhost install of Moodle, that made me the most technically proficient person there.

Scary thought.

Anyway, they had a WordPress-merged-with-Moodle arrangement, and one day I utterly bolloxed up the Moodle upgrade, which broke everything. I printed out every php file I could find and, with the help of a friend, laid them out on the floor, drawing arrows to connect files by dependencies, shared tables, etc., to sort out the mess.

I learned a lot that day – primarily, that I didn't like web development. I've stuck more or less with whatever the free-theme gods throw my way since then. My online tenure & promotion portfolio is built on WordPress (graeworks.net) and involved a bit of hacking around to get the plugins I wanted.

MB: What are the challenges of maintaining your blog (i.e. spam, approving comments, dealing with trolls, etc.)?

Spam. Spam spam spam spam!

I don’t get many comments. I know people read the thing, but since I don’t often write long discursive pieces, I guess I just don’t attract that much in the way of comments. Although I do get emails directly in response to things I’m doing on the blog, so I suppose that counts.

The biggest issue is maintaining drive. It helps to keep in mind that this is a research blog, an open notebook, the narrative bits that help me make sense of all the digital ephemera littering my computers. I often have to consult the blog to remind myself just what the hang I've been working on. Initially I posted quite regularly, but over the years it has gone in fits and starts.

MB: What topics do you normally write about? Do you try and keep it strictly academic, or do you mix in other topics?

I like to futz about with new (digital) toys, to make them do unexpected things, to think through how they might be of use to others, and to figure out how to tell others how they might want to use them. I do bits of analyses, and munge data together to share with others. I do mix in other, non-academic stuff from time to time. For a while, the National Geographic channel used to send me DVDs to review prior to one of their big ratings weeks. Perhaps it's a coincidence, but after I wrote, of one episode, 'bollocks', the DVDs stopped coming.

Probably a coincidence.

MB: If you allow comments on your blog, do you often get comments? What has been your experience managing comments/commenters on your blog?

Again, not so much. Probably a function of the content, I suppose. Dealing with the spam that gets by Akismet is tiring, though.

MB: What kinds of interactions (scholarly or otherwise) emerge out of your blogging practice?

I like to say that my transformation from a ho-hum, bog-standard Roman brick guy (and there are more of us than that sentence would lead you to think) into this thing called 'digital humanities' was a direct consequence of the blogging. The blogging won my simulation work (not many DH'ers do agent modelling) a larger audience, which led to many of my how-tos, to email exchanges with grad students (for the most part) who are now getting established in various places, to invitations to contribute to edited volumes, conferences, and journals, and to speaking engagements – all this while I was formally outside of academia. Before Twitter, the blogging helped me maintain a sense of community, a sense of purpose for my intellectual curiosity that I didn't get in my day-to-day scramble to pay the bills. I think I might be the first person in Canada to be hired to a post with 'digital humanities' formally in the title (though of course I'm not saying I was the first DH person!), and it was the blogging – the exposure to and engagement with wider trends in computational humanities beyond archaeology – that allowed me to say with confidence, 'yes, I'm the DH person you're looking for'.

The blogging made me.

MB: Do you find these interactions informative, useful, enlightening, tedious, frustrating, obligatory, etc? How do they feel?

I still get excited when there's a comment on the blog. The Louis Vuitton bag people, they complete me.

Real comments send me over the moon. They’ve led to many productive relationships and partnerships.

MB: How do you think digital humanities blogging is different from more traditional forms of academic writing and reading?

I think it's a return, in some ways, to the academic discourses of earlier, not-second-half-of-the-20th-century, ways of communicating. But that's mostly an impression; I'm pretty foggy on most things after AD 300. I like the reflexivity of digital humanities blogging: the exploration not just of what the tool can do, or of what computation has perhaps thrown into new light, but the consideration of what that does to us as researchers, as a public.

MB: How would you characterize the relationship between blogging and the digital humanities (however broadly conceived)?

Not everybody has to blog. Nor should they. It’s perfectly possible to be a productive dh person and not blog. But speaking for myself, I think blogging keeps things fresh. We’re working on a book; the blogged draft has already had a bit of an impact. I’m worried the paper version will already be dated by the time it comes out (though this is one of the fastest book projects I’ve ever been involved with), precisely because the most interesting conversations are happening across the blogs, faster than the formal apparatus can keep up. But that’s ok.

MB: What DH blogs/bloggers do you read and why do you read them? What do you like about them?

A partial list: I read Scott and Ian, obviously; Ted Underwood; Elijah Meeks, Alan Liu, Bethany, Ben Schmidt, Mills Kelly, Tom Brughmans, Caleb Daniels, Profhacker, Donna Yates, Colleen Morgan, Lorna Richardson, playthepast.org… it rather depends on what project I'm working on. I followed Stu Eve religiously for a while as he puzzled out the problems of an embodied GIS. Now that that project is done – and I'm not teaching locative computing for historians at the moment – I've moved away a bit. So has Stu, for that matter. It all really depends on what's going on and what's caught my attention. I'm a bit of a magpie. dhnow is essential, though, for its global view.

I read these folks for the way they dissect ideas as much as for any how-tos or code they share. They help me see the bigger picture. Some of them are historians, some are English-flavoured DH, others are archaeologists.

MB: What was your most popular blog post? Why do you think it was so popular?

The all-time most popular posts on my blog, according to WordPress stats, are:

  • Civilization IV World Builder Manual & other needful things – 19,338 views
  • Getting Started with MALLET and Topic Modeling – 10,621 views
  • Moodle + WordPress = Online University – 9,710 views

So, two how-tos, and one that seems to have hit some kind of SEO sweetspot, since it's fairly anodyne. A follow-up to that last one hasn't been as popular:

  • WordPress + Moodle (not equal to) Online University – 3,733 views

But if you asked for my favourites, I’d say:

  • Signal Versus Noise: Why Academic Blogging Matters: A Structural Argument (SAA 2011) – 1,206 views
  • How I Lost the Crowd: A Tale of Sorrow and Hope – 1,035 views

What is the half-life of blog posts, I wonder? The blogging represents quite a sustained effort. I did the math: I've written enough tweets to fill two typical academic books, and I have no idea how many words these 700 (or so) blog posts add up to. But I do think the sustained effort of writing regularly has made me a better writer. (Reader, you may wish to disagree!)

Historical Maps, Topography, Into Minecraft: QGIS

Building your Minecraft Topography

(An earlier version of this used Microdem, which is just a huge pain in the butt. I re-wrote this using QGIS for my hist3812a students. If you'd like to see what some of them accomplished, head over to the github repo, where there's 'Slave of Portus', 'Vimy Ridge', and 'Crafting the Canal'.)

If you are trying to recreate a world as recorded in a historical map, then modern topography isn't what you want. Instead, you need to create a blank, flat world in WorldPainter, and then import your historical map as an overlay. In WorldPainter, File >> New World. In the dialogue box, uncheck 'circular world' and tick 'flat' under topography. Then, on the main icon ribbon, select the 'picture frame' icon ('image overlay'). In the dialogue box, tick 'image overlay' and select your file. You might have to fiddle with the scale and the x, y offsets to get it exactly positioned where you want. Watch the video mentioned below to see all this in action. Then you can paint the terrain type (including water), raise or lower the terrain accordingly, put down blocks to indicate buildings… WorldPainter is pretty powerful.

If you already have elevation data as greyscale .bmp or .tiff

  • Watch the video about using Worldpainter.
  • Skip ahead to where he imports the topographic data and then the historical map imagery and shows you how to paint this against your topography.
  • You should also google for Worldpainter tutorials.

If you have an ARCGIS shapefile

This was cooked up for me by Joel Rivard, one of our GIS & Map specialists in the Library. He writes,

  • Using QGIS: In the menu, go to Layer > Add Vector Layer. Find the point shapefile that has the elevation information.
  • Ensure that you select point in the file type.
  • In the menu, go to Raster > Interpolation.
  • Select "Field 3" (this corresponds to the z or elevation field) for Interpolation attribute and click on "Add".
  • Feel free to keep the rest as default and save the output file as an Image (bmp, jpg or any other raster)

If you need to get topographic data

In some situations, modern topography is just what you need.

  • Grab Shuttle Radar Topography Mission data for the area you are interested in (it downloads as a .tiff). To help you orient yourself, click 'toggle cities' at the bottom of that page. Then click on the tile that contains the region you are interested in. This is a large piece of geography; we'll trim it in a moment.
  • Open QGIS
  • Go to Layer >> Add Raster Layer. Navigate to the location where your srtm download is located. You’re looking for the .tiff file. Select that file.

Add Raster Layer

  • You now have a grayscale image in your QGIS workspace, which might look like this:

Straits of Hercules, Spain, Morocco

  • Now you need to crop this image to just the part that you are interested in. On the main menu ribbon, select Raster >> Extraction >> Clipper

Select Clipper Tool

  • In the dialogue box that opens, make sure that ‘Clipping Mode’ is set to ‘Extent’. With this dialogue box open, you can click and drag on the image to highlight the area you wish to crop to. The extent coordinates will fill in automatically.

  • Hit ‘Select…’ beside ‘Output File’. Give your new cropped image a useful name. Hit ‘Save’.

  • Nothing much will appear to happen – but on the main QGIS window, under ‘layers’ a new layer will be listed.


  • UNCHECK the original layer (which will have a name like srtm_36_05). Suddenly, only your cropped image is left on the screen. Use the magnifying glass with the plus sign (in the icons at the top of the window) to zoom so that your cropped image fills as much of the screen as possible.
  • Go to Project >> Save as image. Give it a useful name, and make sure to set 'files of type' to .bmp. You can now import the .bmp file into WorldPainter. (If you'd rather script these steps, see the sketch after this list.)
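
If you'd rather script the crop and the BMP conversion than click through QGIS, GDAL's Python bindings can do both. A sketch, assuming GDAL is installed; the filenames and bounding-box coordinates are made up:

from osgeo import gdal

# crop the SRTM tile to a bounding box; projWin is [ulx, uly, lrx, lry]
# in the raster's coordinate system (here, degrees of lon/lat)
gdal.Translate("cropped.tif", "srtm_36_05.tif", projWin=[-6.0, 36.5, -5.0, 35.5])

# rescale to 8-bit greyscale and write a BMP that WorldPainter can ingest
# (scaleParams=[[]] asks GDAL to auto-scale; on older versions you may need
# explicit [src_min, src_max, 0, 255] values instead)
gdal.Translate("heightmap.bmp", "cropped.tif", format="BMP",
               outputType=gdal.GDT_Byte, scaleParams=[[]])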

Importing your grayscale DEM to a Minecraft World

Video tutorial again – never mind the bit where he talks about getting the topographic data at the beginning.

At this point, the easiest thing to do is to use WorldPainter. It's free, but you can donate to its developers to help them maintain and update it. The video above shows how to load your DEM image into WorldPainter. It parses the black-to-white pixel values and turns them into elevations. You have the option of setting where 'sea level' is on your map (so elevations below that point are covered with water). There are many, many options here; play with it! Adam Clarke, who made the video, suggests scaling up your image to 900%, but I've found that that makes absolutely monstrous worlds. You'll have to play around to see what makes most sense for you, but with real-world data of any area larger than a few kilometres on a side, I think 100 to 200% is fine.

So: in Worldpainter – File >> Import >> Height map. In the dialogue box that opens, select your bmp file. You’ll probably need to reduce the vertical scale a bit. Play around.

Now, the crucial bit for us: you can import an image into WorldPainter to use as an overlay to guide the placement of blocks, terrain, buildings, whatever. So, again, rather than me simply regurgitating what Adam narrates, go watch the video. Save as a .world file for editing; export to Minecraft when you’re ready (be warned: big maps can take a very long time to render. That’s another reason why I don’t scale up the way Adam suggests).

Save your .world file regularly. EXPORT your Minecraft world to the saves folder (the link shows where this can be found).

Go play.

Wait, what about the historical maps again?

The video covers it much better than I could here. Watch it, but skip ahead to the map overlay section. See the bit at the top of this post.

PS. Here's Vimy Ridge, site of a rather important battle in WW1 fought by the Canadian Corps, imported into Minecraft this way:
Vimy Ridge in Minecraft

#hist3812a video games and simulations for historians, batting around some syllabus ideas

I've been batting around ideas for my video games class, trying to flesh them out some more. I put together a Twine-based exploration of some of my ideas a few weeks ago; you can play it here. Anyway, what follows below is just me thinking out loud. The course runs for 12 weeks. (O my students, the version of the syllabus you should trust is the one that I am obligated to put on cuLearn.)

What does Good History Through Gaming Look Like?

How do we know? Why should we care? What could we do with it, if we had it? Is it playing that matters, or is it building? Can a game foster critical play? What is critical play, anyway? ‘Close reading’ can happen not just of text, but also of code, and of experience. It pulls back the curtain (link to my essay discussing a previous iteration of this course).

Likely Topics

  1. A history of games, and of video games
  2. Historical Consciousness & Worldview
  3. Material culture, and the digital: software exists in the physical world
  4. Simulation & Practical Necromancy: representing the physical world in software
  5. Living History, LARPing, ARGs and AR: History, the Killer App
  6. Museums as gamed/gameful spaces
  7. Gamification and its bastards: or, nothing sucks the fun out of games like education
  8. Rolling your Own: Mods & Indies
  9. The politics of representation

Assessment

Which Might Include Weekly Responses & Critical Play Sessions:

  1. IF responses to readings (written using http://twinery.org)
  2. Play-throughs of others’ IF (other students; indie games in the wild)
  3. Critical play of Minecraft
  4. Critical play of ‘historical’ game of your choice
  5. Critical play of original SimCity (which can be downloaded or played online here). We’ll look at its source code, too, I think. Or we might play a version of Civilization. Haven’t decided yet.
  6. Critical boardgame play
  7. ARIS WW1 Simulation by Alex Crudas & Tyler Sinclair

Yes. I am going to have you play video games, for grades. But you will be looking for procedural rhetorics, worldviews, constraints, and other ways we share authority with algorithms (and who writes these, anyway?) when we consume digital representations of history. Consume? Is that the right verb? Co-create? Receive?

Major Works

  1. Midterm: turn your favourite academic paper – one that you have written – into IF, such that a player playing it could argue the other sides you ignored in your linear paper. Construct it in such a way that the player/reader can move through it at will and still engage with a coherent argument. (See for example 'Buried': http://taracopplestone.co.uk/buried.html). You will use the Twine platform: http://twinery.org
  2. Summative Project: Minecrafted History
    1. You will design and build an immersive experience in Minecraft that expresses 'good history through gaming'. There will be checkpoints to meet over the course of the term. Worlds will be built by teams, in groups of 5. Worlds can be picked from three broad themes:
      THE HISTORY OF THE OTTAWA VALLEY
      THE CANADIANS ON THE WESTERN FRONT
      COLONIZATION AND RESISTANCE IN ROMAN BRITAIN  (…look, I was a Roman archaeologist, once…)
    2. You will need to obtain source maps; you will digitize these and translate them into Minecraft. We will in all likelihood be using Github to manage your projects. The historical challenge will be to frame the game play within the world that you have created such that it expresses good history. You will need to keep track of every decision you make and why, and think through what the historical implications are of those decisions.
    3. The final build will be accompanied by a paradata document that discusses your build, details all sources used (Harvard style), references all appropriate literature, and explains how playing your world creates 'good history' for the player. This document should reference Fogu, Kee et al, and the papers in Elliot and Kappell at a minimum. More information about 'paradata', and examples, may be found at http://heritagejam.org/what-are-paradata. Due in the first session of the last week of term, so that we can all play each others' worlds. The in-class discussion that follows in the second session is also a part of this project's grade. Your work-in-progress may also be presented at Carleton's GIS Day (3rd Wednesday in November).
    4. (These worlds will be made publicly available at the end of the term, ideally for local high school history classes to use. Many people at the university are interested to see what we come up with, too. No pressure).

So that's what I'm thinking, with approximately one month to go until term starts. We've got Minecraft.edu installed in the Gaming Lab in the Discovery Centre in the Library, we've got logins and remote access all sorted out, and I have most of the readings set… it's coming together. Speaking of readings, we'll use this as our bible:

Playing with the Past

and will probably dip into these:

Play the Past

PastPlay

… sensing a theme…

Heritage Jam entry: PARKER

I’m sure it isn’t quite what they were expecting, but I submitted something to HeritageJam.

View it here.

PARKER is an interactive experience in procedurally extracting, uncovering, and reversing the burial of latent semantic core archaeological knowledge. In this era of neoliberal corporatization of cultural heritage knowledge, PARKER represents the way forward for its creation and appreciation. When we must balance funding for healthcare versus that for archaeologists, in this time of reduced availability of funds, how can we not turn to data mining and revisualization of knowledge? After all, what is the insight of the individual when millions of minutes of youtube videos are being created every minute? Further, PARKER extracts the core insights of archaeology and formats them automatically for patenting, so that DRM can be affixed and rightsholder value be fully realized.

PARKER:  for the archaeology we always dreamed of.

———

This visualization is an interactive story that frames the automatic search of YouTube, natural-language parsing, and the automatic supercutting and re-formatting of those search results, to highlight the ways code can frame archaeological knowledge. It applies Sam Lavigne's 'videogrep' and 'automatic patent generator' to results from a YouTube search for 'archaeological burials', selecting the first few results that included closed-captioning. Videogrep uses natural-language pattern matching on those captioning files to select clips from a variety of pieces, restitching them at random. The result is similar to the I-Ching or other forms of divination of meaning. Similarly, the patent generator grabs the transcriptions and extracts the elements that fit the language of patent applications. As I have argued elsewhere, digital archaeology is not about justification of results, but rather the deformation of the familiar.
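
For the curious, a videogrep run of this vintage was a one-line affair at the command prompt; the exact flags here are from memory, so treat them as an assumption and check Lavigne's README:

python videogrep.py --input burial_videos/ --search 'burial' --output supercut.mp4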

The result is a making-strange, an uncovering, of deeper truths. Code is not neutral, and we would be wise to recognize, to engage with, the theoretical perspectives encoded in our use of digital tools – especially when dealing with the human past.

 

A method and apparatus for observing the rhythmic cadence; or, an algorithmic alternative archaeology

Figure 1

Figure 1. A Wretched Garret Without A Fire (at least, according to Google Images)

A method and apparatus for observing the rhythmic cadence

ABSTRACT

A method and apparatus for observing the rhythmic cadence. The devices comprises a small shop, a wretched garret, a Russian letter, a mercantile house, a third storey

BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 illustrates a wretched garret without a fire.

Figure 2 is a block diagram of a fearful storm off the island.

Figure 3 illustrates a mercantile house on my own account.

Figure 4 is a perspective view of the principal events of the Trojan war.

Figure 5 is an isometric view of a poor Jew for 4 francs a week.

Figure 6 is a cross section of a thorough knowledge of the English language.

Figure 7 is a block diagram of the hard trials of my life.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The present invention is the son of a Protestant clergyman. The device is a wretched garret without a fire. The present invention facilitates the study of a language. The invention has my book in my hand. The invention acquires a thorough knowledge of the English language.

According to a beneficial embodiment, the invention such a degree that the study. The present invention shows his incapacity for the business. The device obtains a situation as correspondent and bookkeeper. The device understood a word of the language. The present invention established a mercantile house on my own account. The invention does not venture upon its study. The device devotes the rest of my life. The present invention realizes the dream of my whole life. The present invention publishes a work on the subject.

What is claimed is:

1. A method for observing the rhythmic cadence, comprising:
a wretched garret;
a small shop; and
a Russian letter.

2. The method of claim 1, wherein said wretched garret comprises a mercantile house on my own account.

3. The method of claim 1, wherein said small shop comprises the principal events of the Trojan war.

4. The method of claim 1, wherein said Russian letter comprises a fearful storm off the island.

5. An apparatus for observing the rhythmic cadence, comprising:
a mercantile house;
a small shop;
a third storey; and
a Russian letter.

6. The apparatus of claim 5, wherein said mercantile house comprises a wretched garret without a fire.

7. The apparatus of claim 5, wherein said small shop comprises a fearful storm off the island.

8. The apparatus of claim 5, wherein said third storey comprises a thorough knowledge of the English language.

9. The apparatus of claim 5, wherein said Russian letter comprises a thorough knowledge of the English language.

—————–
Did you recognize Troy and its Remains, by Henry (Heinrich) Schliemann, in that patent abstract? I took his 'autobiographical notice' from the opening of his account of the work at Troy and ran it through Sam Lavigne's Patent Generator. It's a bit like the I-Ching. I have it in mind that this toy could be used to distort, reflect on, and draw something new from some of the classic works of archaeology – especially from that buccaneering phase when, well, pretty much anything went. What if, instead of publishing their discoveries, the early archaeologists had patented them instead? We live in such an era now, when new forms of life (or at least their building blocks) can be patented; when workflows can be patented; when patents can be framed so broadly that a word-generator and a lawyer will bring you riches beyond compare… The early archaeologists were after fame and fortune as much as they were after knowledge of the past. This patent of Schliemann's uses as its source text an opening sketch about the man himself, rather than his discoveries. Doesn't a sense of him shine through? Doesn't he seem, well, rather over-inflated? What is 'the rhythmic cadence', I wonder? If I can sort out the encoding, I'll try this on some of his discussion of what he found.

(Think also of the computational power that went into this toy: natural language processing, pattern matching… it's rather impressive, actually, when you consider what can be built by bolting existing bits together.)

Here's Chapter 1 of Schliemann's account of Troy. Please see the 'detailed description of the preferred embodiments', below.

——————-
An apparatus and method for according to the firman

ABSTRACT

An apparatus and method for according to the firman. The devices comprises a whole building, a large block

BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 is a block diagram of the north-western end of the site.

Figure 2 is an isometric view of the second secretary of his chancellary.

Figure 3 is a perspective view of a large block of this kind.

Figure 4 is a diagrammatical view of the steep side of the hill.

Figure 5 is a schematic drawing of the native soil before the winter.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

The present invention sells me the field at any price. The device reach the native soil before the winter. The present invention is the highest mountain in the world.

What is claimed is:

1. An apparatus for according to the firman, comprising:
a whole building; and
a large block.

2. The apparatus of claim 1, wherein said whole building comprises a large block of this kind.

3. The apparatus of claim 1, wherein said large block comprises the native soil before the winter.

4. A method for according to the firman, comprising:
a large block; and
a whole building.

5. The method of claim 4, wherein said large block comprises the north-western end of the site.

6. The method of claim 4, wherein said whole building comprises the second secretary of his chancellary.

Using Storymaps.js

The Fonseca Bust, a storymap

A discussion on Twitter the other day – asking about the best way to represent 'flowmaps' (see DHQ&A) – led me to a new toy from Knight Lab: Storymap.js. Knight Lab also provides quite a nice, fairly intuitive editor for making the storymaps. In essence, it provides a way, and a viewer, for tying various kinds of media and text to points along a map. Sounds fairly simple, right? Something that you could achieve with 'my maps' in Google? Well, sometimes it's not what you do but the way that you do it. Storymap.js also allows you to upload your own large-dimension image so that you can bring the viewer around it, pointing out the detail. In the sample (so-called) 'gigapixel' storymap, you are brought around The Garden of Earthly Delights.

This struck me as a useful tool for my upcoming classes – both for creating something that I could embed in our LMS and course website for later viewing, and as something that the students themselves could use to support their own presentations. I also imagine using it in place of essays or blog post reflections. To that end, I whipped up two sample storymaps: one reports on an academic journal article, the other provides a synopsis of a portion of a book's argument.

Here’s a storymap about the Fonseca Bust.

Here’s a storymap about looting Cambodian statues.

In the former, I've uploaded an image to a public Google Drive folder. It's been turned into tiles, so as to load into the map engine that is used to jump around the story. Storymap's own documentation suggests using Photoshop's Zoomify plugin. But if you don't have Zoomify? Go to SourceForge and get this: http://sourceforge.net/projects/zoomifyimage/ . It requires that you have Python and the Python Imaging Library (PIL) installed. Unzip zoomifyimage, and put the image that you want to use for your story in the same folder. Open your image in any image-processing program and find out how many pixels wide by high it is (or use the PIL snippet below). Write this down. Close the program. Then, open a command prompt in the folder where you unzipped zoomifyimage (shift+right click, 'open command prompt here', in Windows). At the prompt, type


ZoomifyFileProcessor.py <your_image_file>

If all goes well, nothing much seems to happen – except that you now have a new folder with the name of your image, an xml file called ImageProperties.xml, and one or more TileGroupN folders with your sliced-and-diced images. Move this entire folder (with its xml and subfolders) into your Google Drive. Make sure that it's publicly viewable on the web, and take note of the hosting url. Copy and paste it somewhere handy.
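
If you'd rather not open an image editor just to read off the dimensions, PIL itself will report them (the filename here is hypothetical):

from PIL import Image

im = Image.open("fonseca-bust.jpg")
print(im.size)  # (width, height) in pixels - note these for the storymap editor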

See the Storymap.js documentation on this:

"If you don't have a webserver, you can use Google Drive or Dropbox. You need the base url for your exported image tiles when you start making your gigapixel StoryMap. (show me how)."

In the Storymap.js editor, when you click on 'make a new storymap', select 'gigapixel' and give it the url to your folder. Enter the pixel dimensions of the complete image, and you're good to go.

Your image could be a high-resolution Google Earth image; it could be a detail of a painting or a sculpture; it could be a historical map or photograph. There are also detailed instructions on running a storymap off your own server here.