Somewhere in the desert… a temple

My minecraft expedition was a success. Let me share some observations.

Firstly: I seeded the wrong world. I used

Double Village

as the seed for ‘large biomes’ when I should have used it for ‘default’. Reading the map incorrectly happens all the time in landscape archaeology though. Transpose some digits, and soon you’re hundreds of metres in the wrong spot.

Framing my expedition in my mind as a kind of steam-punk exploration helped get me back ‘in the game’:

I found the village quite easily this time. It was filled with NPCs going about their mysterious business. I, a stranger, wandered into their midst and had no impact on their lives. Doesn’t that often seem the way of a ‘foreign’ expedition? When, as a graduate student, I was excavating at Forum Novum, our world and that of the people whose local market ground we were digging up really did not intersect, except in very particular contexts: the bar and the restaurant. On market day, we would all head back to Rome. Canadian lad flies in, digs, figures it all out, writes a paper, never explains/connects with the locals. As I remarked at the time,

And so I bumbled away, trying to record stratigraphically what I was up to. The different kinds of blocks do help differentiate context – sand fill is quite different from the sandstone blocks the temple was built with. Unfortunately, sandstone is also part of the geology of Minecraft, and typically happens around 3 or 4 blocks down from the surface in this biome. So it became difficult to figure out where the temple ended and the local geology began. Since the temple is of a common ‘type’ in Minecraft, I could just dig to exhume that preexisting type-idea and poof: complete temple. The act of excavation creates the archaeology in more ways than one, it seems.

Channeling my inner Howard Carter there. But – in this world with no ‘rules’, no overarching ‘story’, deciding to go on an archaeological expedition forces a story on us. Interacting with the NPCs, and the crude excavation tools, pushes us towards a 19th century frame of mind. In my steam-punk narrative I was constructing on twitter, the archaeologist-as-better-class-of-looter trope seemed to emerge naturally out of my interaction with the game mechanics.

And then this happened.

We’ll come back to that. Suffice to say, this encounter with the ‘otherness’ of the inhabitants of the village was oddly discomfiting.

Clearly, Notch has watched too many Indiana Jones films. Meanwhile, the villagers continued to trouble me.

And then night fell. I decided to try to spend it with the villagers.

I broke the door, quite by accident. Clumsy foreigner. Interfering.

From above, I watched the zombies and creepers and who knows what else hunt each NPC down and kill them.

So I managed to set into action a chain of events that resulted in the death of the entire village. Now obviously *real* archaeological excavation rarely results in the deaths of the locals, but there are unintended consequences to our interventions. Here, the game holds a distorted fun-house mirror to life. But were I doing this with a class, this would be a teachable moment to consider the impact of academic archaeology in those ‘distant’ lands we study.

For my minecraft adventure, I left the expedition and struck out on my own. Soon I discovered more temples, more villages, more ruins. If you’re exploring too, you can find them here (each line an x, y, z coordinate):

266.9 66.87 1036.99
-219.24 65.270 13.56
58 67 347
487.73 46 560.3
247.76 66 784
430 63 929.8
692 70 1256.7

Now, one could use those coordinates to begin mapping, and perhaps working out, something of the landscape archaeology in this world. One of those coordinates belongs to a vine-covered stone temple in the jungle. Here, our expectations of what ‘archaeology’ is (informed by the movies) come to the fore.

Now, it may be that I should mod this world more in order to enable a post-colonial kind of archaeology within it. But the act of modding is itself colonialist…

So what have I learned? I have often argued in my video games for historians class that it’s not so much the ‘skin’ of a game that should be of concern to historians, but rather the rules. The rules encode the historiographic approach of the game’s designers. You’re good at the game? You’re performing the worldview of the game’s creators. But in a game like minecraft, where the rules are a bit more low-level (for lack of a better term), what’s interesting is the way player agency in the game intersects and merges with the player’s own story, the story the player tells to make sense of the action within the world. It’s poesis. Mimesis. Practomimetic? So while some of the game’s embedded worldview can be seen to be drawn straight from the Indiana Jones canon, other elements, like the agency of NPCs, discomfit us precisely because they intersect our own worldviews (the sociocultural practice of academic archaeology) in such a way as to draw us up short.

It will be interesting to see what Andrew’s expedition uncovers…

Somewhere in the desert…

A lost village

At the upcoming SAA in San Francisco, Andrew Reinhard and I are participating in a forum on digital public archaeology. Our piece, ‘Playing Pedagogy: Videogaming as site and vehicle for digital public archaeology’ is still in a process of becoming. Our original abstract:

While there is an extensive literature on the pedagogical uses of video games in STEM education, and a comparatively smaller literature for languages, literature, and history, there is a serious dearth of scholarship surrounding videogames in their role as vectors for public archaeology. Moreover, video games work as ‘digital public archaeology’ in the ways their imagined pasts within the games deal with monuments, monumentality, and their own ‘lore’. In this presentation, we play the past to illustrate twin poles of ‘public’ archaeology, as both worlds in which archaeology is constructed and worlds wherein archaeological knowledge may be communicated.

We had initially thought to write a game to explore these ideas, and so our entire presentation would involve the session participants playing it. But writing games is tough. In fact, it would be hard for one to top the game made by Tara Copplestone for the 2014 Heritage Jam, ‘Buried’. However, another venue presents itself. Andrew recently proposed to the makers of No Man’s Sky that he be allowed to lead an archaeological expedition therein.

“What!” I hear you exclaim. Well, think of it like this. We’re used to the idea of reception studies, of how the past is portrayed in games, movies, novels. We’re also used to the idea of games as being the locus for pedagogy, or for persuading, or making arguments. What happens then, in a game like No Man’s Sky, where the entire world is generated algorithmically from a seed? That is, no human designs it: it emerges. Rather like our own universe, eh? Such procedural games are quite common, though none perhaps are as complex in their world building as Dwarf Fortress (which evolves not just the world, but also culture & individual family/clan/culture lineages!)

What then does such xenoarchaeology look like? How does that intersect with digital public archaeology? Well, if archaeological method has any truth to it, then in these worlds we might be faced with something profoundly alter, something profoundly different (which also accounts for why the writers of Star Trek placed such stock in archaeology).

We’ve got a month to sort these thoughts out. But it was in this frame of mind that I started thinking what archaeology in Minecraft would look like, could look like, and what it might find. Not in Minecraft worlds that have been lovingly built from scratch by a human. No, I mean the ones grown from seeds. It’s quite interesting – since no computational process is actually truly random, if you know the seed from which all calculations and algorithms are run, you can recreate the exact sequence that gives rise to a particular world (in this, and indeed in all, computational simulations). There is, it turns out, quite a thriving subculture in Minecraft that shares interesting seeds. And so, as I searched for seeds that might prove fertile for our talk, I came across ‘Double Village’ for Minecraft 1.6.4. (See method 5 for spawning worlds from seeds). If you’ve got Minecraft 1.6.4 you too can join me on my expedition to a strange desert land….
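The determinism is easy to demonstrate with any seeded pseudo-random generator. Here’s a quick sketch using bash’s built-in $RANDOM – nothing to do with Minecraft’s actual terrain algorithm, just the general principle:

```shell
# Seed bash's pseudo-random generator and draw three numbers...
RANDOM=42
first="$RANDOM $RANDOM $RANDOM"

# ...then re-seed with the same value and draw three more.
RANDOM=42
second="$RANDOM $RANDOM $RANDOM"

# Same seed, same sequence: the 'world' is perfectly reproducible.
echo "$first"
echo "$second"
```

Change the seed and you get a different, but equally reproducible, sequence – which is exactly why sharing a seed string like ‘Double Village’ is enough to let anyone else walk the same desert.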


The texts all say the same thing. Set the portal to ‘Double Village’ and soon you’ll find the exotic and lost desert villages. I put on the archaeotrancerebretron, grabbed my kit bag, and gritted my teeth. My companions all had theirs on too. We stepped into the charmed circle…

‘Teaching 1613, An Algorithmic Incoherence’, or, the results of an experiment in automatic transcription

I loaded the audio of the opening remarks I made at last year’s Champlain Colloquium at Carleton into YouTube, to see what Google’s automatic transcription would make of it.

Ladies and Gentlemen, I give you,

‘Teaching 1613, An Algorithmic Incoherence’

0:00 maize from these critical encounters I’m yours and
0:04 think back to my high school history class and mutual security
0:08 more I don’t be overly much space in homers
0:12 we brigham
0:15 hey don’t think I actually for him as a good
0:19 historical persons old I’m
0:23 my position we’re in the artist’s
0:26 those addresses her donor speaking truth to power
0:30 to whom do we use in humans
0:34 the how to change such as homeless
0:38 Jaume here in this place
0:41 this time intern so 201 church introducing my
0:48 rules is batting practice teaching in volunteering
0:51 wrestle with this question these questions it is a
0:56 various university classroom secondary schools
0:59 the water column jam we are the people who was in
1:04 people that are you know that ass time hurdles and they’re going to
1:09 problem I I’m it she does so we ask that his own loss
1:15 you know I designs
1:18 that mean for us on this issue
1:22 use minutes or so that’s just wrong
1:26 I’m Evans and you know
1:30 also moms no said also I am so there’s no
1:35 lines or yes yes it does
1:38 don’t know those did final sorry I’m gone
1:42 my saddam
1:45 we have john wong you I
1:48 comedian is historic Department is here you go
1:51 University where he teaches courses in so long as you know the issue
1:55 one so this is going to her place memory and remember
1:59 placing yeah although it has a very strong residents
2:03 and numerics today a
2:06 know I was engage I’m program so it was also observed
2:11 also sewing machine i mean for his own use or lose you
2:15 him to keep his arm because it’s government
2:19 Karen the Russian a
2:22 yeah YES on the measure them all
2:26 lost museum chaos gym class heroes:
2:30 no year also a pedagogy
2:33 anything
2:34 yeah yeah he’s OK
2:37 you people in the US you all moved into a home
2:411 0 June 1810
2:44 yeah yeah and you just who is the director of any
2:49 education for the can see you vision
2:52 luminous Jim so I was I wish to change
2:57 share those experiences and observations
3:01 I and we should use those observations for a jumping off point
3:05 for our discussion that only are you know
3:08 billion GG 6 p.m.

…I wonder though, if I went through the transcription and corrected it – since Google now knows what I sound like, and what I’m saying at each of these timestamps: would the next bit I upload be better transcribed? Am I teaching the machine? Are we all?

The Data Driven DJ

The ‘Data Driven DJ’ project is brilliant. I can see so much potential in it. I intend to write more about it soonish, but you should go and look at this project now. Run. Don’t walk!

Watch this:

Also, note this:

I don’t have very specific guidelines for this, but I’m generally looking for these kinds of sounds:

Music you own the rights to (and would allow me to use it in a fair way)
High-quality recordings of instruments (the weirder the better)
Sound recordings of cultural or historical significance (or really any recording that is interesting or unique in some way) [see post]

I’m going to see if I can find some audio somewhere that would meet that last requirement, and send it to him. You should too!

I’ve been interested in sound, space, history, data, and experience for a while (you might even call that a kind of augmented reality, or a visualization, or a sonification, or…) but instead of crappily coding my own stuff, I think I’m going to explore the data driven dj’s materials for a while, see what I can build out from there.

Oh, and if you’re interested, here’s some of my sonic’d stuff:

Hearing the past

Historical Friction

Listening to Topic Models

The Audio Guide 2.0 (wow, that’s an oldie!)

Rocker and Docker and Daemons …. oh my!

I’m teaching a course at the moment on data mining, visualization, and other sundry topics. Right now, the course takes place in the physical world but this time next year, it will be a completely online course (and students at Carleton U, U Waterloo and Brock U will be able to take it for credit without issue; others might have to arrange transfer credit with their institution). All of the course materials are available on Github. Feel free to fork, improve, and follow along. I’ll be rewriting a lot of this material in the light of this term’s experience.

For instance, there’s the issue of platforms. In the class, we have users on Windows 7, Windows 8, Mac (Mavericks & Yosemite), and two flavours of Linux. This presents certain challenges. Do I try to teach folks how to use the platform in front of them to do the kind of research they are interested in? Or do I try to get them all onto one platform, and teach to that?

It might seem silly, but I elected to do the first. Most of the students I come into contact with are barely aware of the power of the machines that they are facebooking on in class. I wanted to get them familiar with their own environments and what they could accomplish within them.

This was all fine and dandy, more or less, until I decided they should use a shell script to download materials from the web via an API. Here’s the exercise in question. On the plus side, we learned a lot about how our machines worked. On the down side, we shed a lot of tears before everyone was on the same page again. It was at this point that one of the students forked the exercise and re-wrote it to use a virtual machine.
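The exercise itself is linked above; for flavour, the general shape of such a script is a loop that builds request URLs and fetches each one. (The endpoint and parameters below are invented for illustration – they are not the actual API from the exercise.)

```shell
#!/bin/bash
# Sketch of an API-harvesting loop. The base URL here is hypothetical;
# the real exercise pointed at a different API entirely.
base_url=""
mkdir -p downloads
for page in 1 2 3; do
  url="${base_url}?page=${page}&format=json"
  # The real script would fetch:  curl -s "$url" -o "downloads/page-${page}.json"
  echo "would fetch: $url"
done
```

Even a loop this small trips over platform differences – line endings, where bash lives, whether curl is installed at all – which accounts for most of the tears.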

How freaking cool is that – a student contributing to the design of the course! I thought.

I also thought: ok, maybe I was wrong in my approach. Maybe I should’ve had them using a virtual machine from the outset. Now, Bill Turkel has long advocated for using command line tools for digital history research. Recently, he and Ian Milligan and Mary Beth Start put together a super-machine with all of the tools a historian could possibly want. I looked at this, and thought, ‘too much power’. Too many steps. Too many opportunities for something to go wrong.

I needed something stripped down. Ben Marwick, coming at the same problem from an archaeology perspective, put together a Lubuntu-flavoured VM that, once installed, uses a single install script to go out and grab things like Rstudio and various python packages. It lives here:

I copied that, and tweaked it here and there for my class. Here’s my version: (as an aside, I don’t know why my gists always have such crazy strings while Ben’s have sensible digits. Probably a setting somewhere I suppose).


I was running this vm on my computer at home. Everything chugged sooooooo verrrrrryyy slowwwwwllly. Could there be something lighter?

Enter Docker.

A lighter, reproducible environment? Alright, I’ll bite.

You install ‘boot2docker’ on your machine (whether Mac or PC). First hurdle: select all the boxes on what you’ll install. Otherwise, it seems to conflict with any existing VMs or virtual boxes you have. Or rather, at least it did that on my machines.

Once installed, you double click the icon, and a shell opens up. Meanwhile, Oracle VirtualBox is running in another window.

This is where it all really went pear-shaped for me. Hurdle two: After much rummaging, I found that I needed to enable virtualization in the BIOS on one of my machines (the BIOS being the software that runs the motherboard; typically you hit F2 or F10 during boot-up to access it. Don’t mess with anything else in there or serious trouble can ensue).

Hurdle three: After another cryptic error message in the shell window, I determined that I had to go into the Oracle VirtualBox settings for the boot2docker machine and select 64-bit Ubuntu (something to that effect; it was a few days ago and I neglected to write down all of the steps). I may have had to remove the virtual machine from VirtualBox and then hit boot2docker again too; it’s all hazy now. So much angst.

Hurdle 3.1?: meanwhile on my Mac, while it worked at first, it is as of this writing not working at all and I’m flummoxed.

Hurdle 4: So how the hell do we run anything, now that we’ve got the virtual machine up and running? (You’ll know you’ve succeeded when the shell window displays the ascii-art version of the Docker logo.) I decided to try the Rstudio described in the Boettiger article. First thing, you need to get Rstudio from the Rocker project – if you’re familiar with github, then it’s easy to get images of different ‘containers’ to run in docker, as for instance here:

So, at the prompt, I hit:

docker pull rocker/rstudio

And after a while the smoke cleared. Ok, let’s run this thing:

docker run -dp 8787:8787 -v /c/Users/shawn graham/docker:/home/rstudio/ -e ROOT=TRUE rocker/rstudio

I direct you to Ben again, to explain what’s happening here. But basically, docker is going to serve me up Rstudio in a browser. It will connect my directory ‘docker’ on my Windows machine to Rstudio, so that I can share files between the docker container running Rstudio, and my machine. Point your browser to  (although, on my machine, it’s sometimes; type ‘boot2docker ip’ to find out what the address is on your machine), sign in to Rstudio with ‘rstudio’ as user and ‘rstudio’ as password and there you go. Another hurdle: see how there’s a space between ‘shawn’ and ‘graham’ in that command? Yeah, that completely screwed it up. And you can’t just point it to another directory – it has to be your home directory as user on your machine. So I need to rename that directory.
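The culprit is ordinary shell word-splitting: an unquoted path containing a space becomes two separate arguments before docker ever sees it. A minimal demonstration (renaming the directory works, and I suspect quoting the path in the -v flag would also have saved me, though I haven’t gone back to verify that with boot2docker):

```shell
path="/c/Users/shawn graham/docker"

# Unquoted: the shell splits on the space, yielding two arguments.
set -- $path
echo "unquoted: $# arguments"   # -> unquoted: 2 arguments

# Quoted: the path survives as a single argument.
set -- "$path"
echo "quoted: $# arguments"     # -> quoted: 1 argument
```

This is also why the spaceless directory names in every Unix tutorial are not just an aesthetic preference.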

So that’s where I called it a day. I think there’s just a wee bit too much futzing necessary to get Docker running, for me to launch it on my students yet. Hell, I’m not entirely sure what I’m doing yet either. Why not just have students install Rstudio on their machine as per normal? Why not have them install python, or any of the other tools we’ll use, as per normal? Maybe if all the bits-and-pieces of the History VM that Turkel (or Marwick & I) put together can be containerized, and made to launch painlessly in Docker… well maybe that’s what I need.

Oh… and then I got some crazy error about my daemon not having been fed. Or called. Petted? Treated well? I dunno. Why tell me what’s wrong when you can write something perfectly obtuse? I can always google it.

Hearing the Past

what follows is our draft chapter for ‘Seeing the Past‘, a colloquium hosted by Kevin Kee at Brock University. The chapter will eventually be published in ‘Seeing the Past: Augmented Reality and Computer Vision in History’

comments welcome.

Hearing the Past – S Graham, S Eve, C Morgan, A Pantos

This volume is about seeing the past. But ‘to see’ does not necessarily imply vision. To see something can also mean to understand it. We frequently see things that do not exist, in this sense. “I see your point” or “I see what you’re saying”. ‘I hear you’ we sometimes say, also meaning, I understand.

In which case, how should we “see” the past? You can’t see the past. You can only see the present. You might believe something of what you’re looking at as being ‘from’ the past, but it still lives in the here-and-now. Thus, there is always a cognitive load, a ‘break in presence’ [Turner, 2007] that interrupts what we are seeing with awkward details. This is why we talk of the historical imagination, or the archaeological eye. To understand the past through augmented reality might not require vision. Yet, the majority of augmented reality apps currently available privilege the visual, overlaying reconstructions or text on an image of the present through a keyhole, the viewport offered by our small screens. The clumsiness of our interfaces also creates a break in presence. Visual overlays are clunky, with low-resolution 2D graphics, all of which further contribute to breaks in presence.

In short, they do not help us see the past – to understand it –  in any meaningful way.

In this chapter, we suggest that ‘hearing’ the past is a more effective and affective way of providing immersive augmented reality. We argue from cognitive and perceptual grounds that audio – spoken word, soundscapes, acoustic horizons and spaces, and spatialized audio – should be a serious area of inquiry for historians exploring the possibilities of new media to create effective immersive augmented reality. We explore some of the phenomenology of archaeological landscapes and the idea of an ‘embodied GIS’ [Eve, 2014] as a platform for delivering an acoustic augmented reality. Drawing on Phil Turner’s work on ‘presence’ in an artificial environment [Turner, 2007], we explore ‘breaks’ in presence that occur in augmented, mixed, and virtual environments. The key idea is that presence is created via a series of relationships between humans and objects, and that these relationships form affordances. When these relationships are broken, presence and immersion are lost. We argue that because the sense of hearing depends on attention, audio AR is particularly effective in maintaining what Turner calls ‘affective’ and ‘cognitive/perceptual’ intentionality. In short, the past can be ‘heard’ more easily than it can be ‘seen’. We explore three case studies that offer possible routes forward for an augmented historical audio reality.

‘Eh? Can you speak up?’ The sense of hearing

The sense of hearing, and the active cognition that hearing requires, has not been studied to the same degree or in the same depth as the visual [Baldwin, 2012: 3].  Hearing – and understanding – is also a tactile, haptic experience. Sound waves actually touch us. They move the tiny hairs of the ear canal, and the tiny bones within, and the various structures of the middle and inner ear turn these into the electro-chemical pulses that light up the various parts of our brain.  Sound is a kind of tele-haptic:

“…the initial stage of hearing operates as a mechanical process. Later the mechanical energy of sound converts to hydraulic energy as the fluids play a larger vibratory role. Thus at its source, touch operates with and causes sound, and it is only through touch-at-a-distance that we have sound at all. The famous story of Edison’s ears bleeding from his aural experiments makes visceral this tele-touch, which is not always a gentle stroke, no matter how pleasant the sounds, voice or music we might encounter.” [Bishop 2011, 25-6]

But intentional hearing – listening – requires attention. Consider: in the crowded foyer of a cinema, it can be quite difficult to make out what the person opposite is saying. You have to pay attention; the act is tiring. One can try to read lips, matching visual cues with auditory cues. In the quiet of a classroom, with the teacher’s back turned, the teacher can hear the surreptitious whisper that, while much quieter, speaks volumes. Hearing, unlike sight, requires attention that divides our ability to make semantic or emotional sense of what’s being said, or even to remember quite what was said, when the original audio signal is poor [Baldwin, 2012: 6]. What’s more, our brain is processing the spatial organization of the sound (near sounds, far sounds, sounds that move from left to right) – how it is being said, not just what is being said [Baldwin, 2012: 6].

Bishop goes on to argue that touch and vision are senses that can only know the surface; sound waves transcend surfaces, they cause surfaces to vibrate, to amplify (but also, to muffle). And so,

“Sound provides the means to access invisible, unseeable, untouchable interiors. If we consider the import of vision to the general sensorium and metaphorization of knowledge, then the general figurative language of “insight” runs counter to surface vs. deep understanding of the world. Sound, it would seem, not vision or touch, would lead us to the more desired deep understanding of an object or text.” [Bishop, 2011, 26]

Sound permeates and transgresses surfaces; sound gives access to the unseen. Bishop is discussing Karlheinz Stockhausen’s “Helicopter String Quartet”. Bishop goes on to argue that the piece exposes the ways that sound and touch blur (and “slur”) into a kind of synaesthesia, which defies the ‘assumed neatness of the sensorium’ [Bishop, 2011, 28]. With our western rationality, we assume that the senses neatly cleave. With our western focus on the visual, we prioritize one way of knowing over the others. Chris Gosden, in an introductory piece to an issue of World Archaeology on the senses in the past, argues that our western ‘sensorium’ (what we think of as the five senses) influences and conditions how we understand material culture. He advocates unlearning and unpacking the privileged position of sight [Gosden, 166], what others have called ‘ocularcentrism’ [Thomas 2008].

The effect of structured sound (let’s call it ‘music’) on movement is another interesting area where the haptic qualities of sound may be perceived. Interestingly, there are aspects to music that seem to translate into movement ‘primitives’. A recent study explored the relationship of musical structure (dynamic, harmony, timbre) to guided walking (mobile tours) [Hazzard et al, 2014]. The authors note that a focus on structure in music sits between the thematic (where the emotional content of the music is manipulated) and the sonic (which is associated with spatial cues). Thus, they wondered what aspects of structure would be perceived by their non-musically trained study subjects (western undergraduates at an Anglophone university) and how the subjects would translate these into movement. The subjects listened to four distinct compositions that were designed to emphasize one aspect of musical structure, as they moved around an open field. The subjects were observed and interviewed afterwards. Why did they stop in certain places? Why did they back-track, or move in a spiral?

The authors found that silence in the music was often interpreted as signalling a time to stop, while crescendi (a rising intensity in the music) impelled movement forward; a diminuendo, a lessening, did not imply movement away, but rather signalled the impending end of movement altogether. Musical punctuation caused listeners to try to understand the significance of the particular spot they were standing on. Timbre ‘coloured’ different areas. ‘Harmonic resolution’ signalled ‘arrival’ [Hazzard et al, 2014: 609-613]. As will be seen in our case studies, this interplay of silence and crescendo can also be a powerful affective tool to convey the density or paucity of historical information in an area.

Sound requires cognition to make sense; there is nothing ‘natural’ about understanding the sounds that reach our ears. This act of attentiveness can elide other breaks in presence. Sound is tactile. It engages pathways in the brain similar to those involved with processing visual imagery.

Culture & Soundscape

‘As a little red-headed Metis kid, it never occurred to me that the city could sound different to anyone else.’ [Todd, 2014]  Zoe Todd recently wrote a moving piece in Spacing on ‘Creating citizen spaces through Indigenous soundscapes’, where she describes amongst other things the profound effect of a flash mob occupying the West Edmonton Mall’s replica Santa Maria, Columbus’ flagship. “The sounds of Indigenous music, language and drumming soaring high up into the mall’s glass ceiling was a revelation: decolonization of our cities is not merely a physical endeavor, but also an aural one.” [Todd, 2014]

Work on the cognitive basis of memory has shown that, rather than being like a filing cabinet from which we retrieve a memory, the act of recollection actively re-writes the memory in the present: our memories are as much about our present selves as they are about the past. Thus, cognitive scientists working in the field of post-traumatic stress disorder are finding that they can reprogram the emotional content of traumatic memories by altering the contexts within which those memories are recalled. Sound plays a large role in all of this [see S Hall’s 2013 review article on the state of research].

Soundscapes affect us profoundly, and as Todd demonstrates, can be used to radically reprogram, repatriate, decolonize, and contest spaces. Work on the cognitive foundations of memory suggests that sound can literally re-wire our brains and our understanding of memory. Tim Ingold talks about the ‘meshworks’ of interrelationships that create spaces and bind them in time [ref]. Can soundscapes help us ‘visualize’ the past, or at least, surface different patterns in the meshwork? Can we reprogram collective memories of place with sound?

The soundscape has been explored in a historical context by a number of scholars, and in particular, amongst archaeologists as the study of archaeoacoustics. Most work on archaeoacoustics has explored the properties of enclosed spaces [see Blesser & Salter 2007] such as caves [Reznikoff 2008], theatres [Lisa et al. 2004] and churches [Fausti et al. 2003]. For an excellent review of the increasingly extensive literature, see Mills [2010]. In particular, Mlekuz has investigated the soundscape of church bells in an area of Slovenia. He takes Schafer’s [1977] definition of the soundscape, which sets it in direct opposition to an acoustic space, explaining that where an acoustic space is the profile of the sound over a landscape, the soundscape is a sonic environment – with the emphasis being put on the way it is perceived and understood by the listener [Mlekuz 2004, para.2.2.1]. This clear distinction between the mechanics and properties of the sound (the acoustic nature) and the affect it has on the listener (the soundscape) fits perfectly with Turner’s idea of the Arc of Intentionality. Where we may be able to recreate the sounds of the historical past, we may not be able to recreate how these sounds came together to create the soundscape of a person existing in that past. The soundscape is a combination of the acoustic properties of sound, space and the individual. However, the acoustic nature of historical sounds will affect us as human beings and will evoke some kind of emotional/affective response – even if it could be argued that this response is not ‘historically authentic’.

The next question to ask, then, is that if sounds, music and voices from the past can affect us in certain ways – how do we deliver those sounds using Augmented Reality, to enable an in-situ experience?

Aural Augmented Reality

Audio tours, a handheld device rented or borrowed from a museum that guides a visitor through the exhibition, are a staple of many museums and heritage sites. The audio tour has been used since the 1950s [see and]. Once a bulky device that had to be curated and maintained by the museum or heritage site, audio tours are quickly taking advantage of the smartphone-enabled age, with tours released as downloadable apps or podcasts. This is democratizing the audio tour, allowing new and alternative tours of museums and cities to be released and followed, and potentially undermining the ‘truth’ of the official tour. While we certainly do not deny that the humble audio tour is a form of Aural Augmented Reality, experienced in-situ and influencing the way the user experiences a space, it serves as a narrative-led experience of a space (much as a tour guide in book form would) and does not often explore the haptic or more immersive properties of AAR.

Some applications have taken the idea of the audio guide further, such as the SoundWalk project [] that offers alternative tours of the Louvre with a Da Vinci Code theme, or walking tours of the Hassidic areas of Williamsburg narrated by famous actors and members of the community. What makes the SoundWalk tours different is that they are GPS-powered, and so specific to place (for instance, you are told to open specific doors when they are in front of you, or to look left or right to see individual features). They are also produced with a very high quality of narration, sound-recording and music/sound effects. In addition, they play with the notion of the listener melding with the narrator: “…ok, for today you are Joseph, that’s my Hebrew name, that’s my Jewish name and that’s your name, for today we are one.” [extract from the Williamsburg Men Hassidic tour]. The SoundWalk tours attempt to create a feeling of immersion by giving an effectively ‘high-resolution’ aural experience: the acting, sound effects, music and beguiling narrative all come together to let you get lost in the experience, following the voice in your head.

An application that also uses the immersive aspect of storytelling to good effect is the fitness app ‘Zombies, Run!’ []. The app is designed to aid a fitness regime by making running training more interesting. When you log into the app, you take on the role of ‘Runner 5’, a soon-to-be hero who is going to save the world from the Zombie Apocalypse. The app uses your GPS location and compass to direct you on a run around your local neighbourhood, but all the time you are being pursued by virtual zombies. Go too slowly and the sounds of the zombies will catch you up, their ragged breath chasing you around the park. As part of the run you can also collect virtual medical supplies or water bottles, indicated to you by the in-game voice, that all help to stave off the Apocalypse. By using the very visceral sounds of a pursuer getting closer, combined with the affective power of physically being out of breath, tired and aching, the run becomes an immersive experience: you are not just trying to better your time, you are escaping zombies and trying to save the world. This app works so well mainly because you don’t have to look at the screen; the suspense of the situation is created mainly through sound [see Perron 2004].

Three Archaeological/Historical Aural Augmented Reality Case Studies

The examples of AAR applications provided so far were not specifically created with an ear to exploring and experimenting with historical sounds or soundscapes. Instead, they provide an immersive narrative (audio tours) or gamify a journey through an alternate present (Zombies, Run!). Historians and archaeologists are currently experimenting with the technology not just as a means to tell a story, but as a way to let the user ‘feel’ the sounds and be affected by what they are hearing. Each of the applications below eschews any kind of visual interface, concentrating instead on the power of sound to direct, affect and allow alternate interpretations. The case studies are prototype applications, proofs-of-concept rather than fully-fledged applications with many users; however, even these experimental models demonstrate the potential benefits of hearing the past.

Using Aural Augmented Reality to explore Bronze Age Roundhouses

As part of his research using the embodied GIS to explore a Bronze Age settlement on Bodmin Moor, Cornwall, United Kingdom, Stuart Eve used a form of Aural Augmented Reality to aid navigation and immersion in the landscape [Eve 2014]. By using the Unity3D gaming engine (which can spatialize sound), Eve created a number of 3D audio sources that corresponded to the locations of the settlement’s houses. As the resulting app was geo-located, the user could walk around the settlement in-situ and hear the augmented sounds of the houses (indistinct voices, laughing, babies crying, etc) getting louder or quieter the closer they got to each sound source. The houses in the modern landscape are barely visible on the ground as circles of stones and rocks, making it hard to discern where each house is. Eve then introduced virtual models of the houses to act as audio occlusion layers, simulating the effect of the house walls and roofs in dampening the sounds coming from within – and only allowing unoccluded sound to emit from the doorways:

“At first, the occlusion of the sounds by the mass of the houses was a little disconcerting, as [visually] the buildings themselves do not exist. However, the sounds came to act as auditory markers as to where the doorways of the houses are. This then became a new and unexpected way of exploring the site. Rather than just looking at the remains of the houses and attempting to discern the doorway locations from looking at the in situ stones, I was able to walk around the houses and hear when the sounds got louder – which indicated the location of the doorway” [Eve 2014:114]

Eve goes on to suggest that by modelling sound sources and relating them to the archaeological evidence, questions can be asked about the usage of the site and explored in situ. For instance, if some of the houses were used for rituals (as the archaeological evidence indicates), what sort of sounds might those rituals have made, and how would the sound have permeated across the settlement? More prosaically, if animals were kept in a certain area within the settlement, how would the sound of them have affected the inhabitants? How far could people communicate across the settlement area using calls or shouts?
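Unity3D handles the spatialization and occlusion internally; purely as an illustration of the underlying logic (this is not Eve’s actual implementation, and the function names and constants are hypothetical), the core idea – volume falling off with distance, dampened further when a virtual wall sits between listener and source – can be sketched in a few lines of Python:

```python
import math

def attenuated_volume(listener, source, occluded,
                      ref_dist=1.0, occlusion_factor=0.25):
    """Inverse-distance rolloff with a crude occlusion penalty.

    listener, source: (x, y) positions in metres.
    occluded: True if a virtual wall lies between listener and source.
    Returns a volume in [0, 1].
    """
    d = math.dist(listener, source)
    vol = min(1.0, ref_dist / max(d, ref_dist))
    if occluded:
        # walls dampen the sound; an unoccluded doorway leaves it at full strength
        vol *= occlusion_factor
    return vol

# Walking toward an unoccluded doorway: the volume rises as distance shrinks
far = attenuated_volume((0, 0), (20, 0), occluded=False)
near = attenuated_volume((0, 0), (2, 0), occluded=False)
```

Scanning for the direction in which the volume jumps up (the unoccluded path) is exactly how the doorways became audible in Eve’s account above.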

Eve’s use of AAR to ask archaeological questions of a landscape highlights the exploratory power of an Augmented Reality approach. A different application, Historical Friction, explores the power of AAR to inform us about our surroundings and make us question what is beneath our feet.

Historical Friction

‘Historical Friction’ was directly inspired by the work of Ed Summers (of the Maryland Institute for Technology in the Humanities), filtered through the example of ‘Zombies, Run!’. Summers programmed a web-app called ‘Ici’, French for ‘Here’. Ici uses the native ability of the browser to ‘know’ where it is in space to search for and return all of the Wikipedia articles geotagged within a radius of that location []. In its current iteration, it returns the articles as points on a map, with the status of each article (stub, ‘needs citations’, ‘needs image’, etc.) indicated. In its original form, it returned a list with a brief synopsis of each article. Summers’ intent was that the app could work as a call-to-action, encouraging users to expand Wikipedia’s coverage of the area.
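Summers’ own code is not reproduced here, but the MediaWiki API does expose a `geosearch` module that returns geotagged articles near a point, so a query of roughly this shape would do the lookup (the helper function is our sketch, not Ici’s source):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def geosearch_url(lat, lon, radius_m=1000, limit=50):
    """Build a MediaWiki geosearch query for articles near (lat, lon)."""
    params = {
        "action": "query",
        "list": "geosearch",
        "gscoord": f"{lat}|{lon}",
        "gsradius": radius_m,   # search radius in metres
        "gslimit": limit,
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = geosearch_url(45.4215, -75.6972)  # e.g. downtown Ottawa
# Fetching this URL (urllib.request.urlopen) returns JSON whose
# query/geosearch list holds the nearby article titles and coordinates.
```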

Visually, it can be impressive to see the dots-on-the-map as an indication of the ‘thickness’ of the digital annotations of our physical world. Initially, we wanted to make that ‘thickness’ literal, to make it actually physically difficult to move through places dense with historical information by exploiting the haptic nature of sound.

We tried to make it painful, to increase the noise and discord, so that the user would be forced to stop in their tracks, take the headphones off, and look at the area with new eyes. Initially, we took the output from ‘Ici’ and fed it through a musical generator called ‘Musical Algorithms’. The idea was that the resulting ‘music’ would be an acoustic soundscape of quiet/loud, pleasant/harsh as one moved through space: a kind of cost surface, a slope. Would it push the user from noisy areas to quiet areas? Would the user discover places they hadn’t known about? Would the quiet places begin to fill up as people discovered them? As we iterated, we switched to a text-to-speech algorithm. As ‘Ici’ loads the pages, the text-to-speech algorithm whispers the abstracts of the wiki articles, all at once, at slightly different speeds and tones. ‘Historical Friction’ may be found at

Historical Friction deliberately plays with the idea of creating a break in presence – a cacophony of voices that haptically forces the user to stop in her tracks – as a way of focussing attention on those areas that are thick and thin with digital annotations about the history of a place.

Voices Recognition

During the inaugural York University ‘Heritage Jam’, an annual cultural heritage ‘hack-fest’, a group of archaeologists/artists/coders took the Historical Friction application as inspiration and created an AAR app called Voices Recognition.

“Voices Recognition is an app designed to augment one’s interaction with York Cemetery, its spaces and visible features, by giving a voice to the invisible features that represent the primary reason for the cemetery’s existence: accommodation of the bodies buried underground” [Eve, Hoffman, et al., 2014 ].

This is achieved using a smartphone-based app that again uses GPS and the compass to geo-locate the user within the cemetery. Each of the graves in the cemetery is also geo-located and attached to a database of online census data, burial records and available biographies of the people buried there. The app then plays the contents of this database for every grave within 10m of the user. In the example application the data are voiced by actors; in the full application these will likely be computer-generated voices (due to the sheer amount of data attached and the number of graves in the cemetery). (A video of the app in action may be viewed at .) The net result is in places a deafening cacophony of voices (especially in the areas of the mass graves) and in other places single stories being told. The unmarked mass burials literally shout and clamour to be heard, whereas the grandiose individual monuments whisper their single stories. The usual experience of a cemetery is completely inverted [Eve, S., Hoffman, K., et al. 2014].
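The app’s internals are not published in detail, but the core geo-fencing step amounts to a great-circle distance test against the 10m radius. A minimal stdlib sketch (the grave data layout here is hypothetical):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def audible_graves(user, graves, radius_m=10.0):
    """Return the graves whose records should currently be voiced."""
    lat, lon = user
    return [g for g in graves
            if haversine_m(lat, lon, g["lat"], g["lon"]) <= radius_m]
```

In a dense burial area many graves fall inside the 10m circle at once, which is precisely what produces the cacophony described below.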

The Voices Recognition app uses augmented audio to represent abstract data in a visceral and tactile way. The subject matter of the app – the deceased – is perhaps an extreme example of information that could have a strong emotional impact on visitors. If such an app were made live, careful thought would be required about the appropriate presentation and distribution of the material, suited to its cultural sphere, to avoid unnecessary upset. However, the concept highlights the opportunity to relate to a cultural location at a much closer and more personal level through audio than can be achieved through the more ‘removed’ visual overlay. The Voices Recognition project, like the SoundWalk project described earlier, highlights the power of sound not just for exploring dense historical data, but for presenting it in an engaging and unusual way. As the project states, the app is part pedagogy and part artistic soundscape, using overlapping voices as something akin to a ‘heat-map’ of the clustered data because “it’s eminently possible to render delicate distinctions between layers/concentrations, and [for] the human ear to identify them more distinctly than they can colour, light or smell” [Eve, S., Hoffman, K., et al., 2014].

Building an Aural, Haptic, Augmented Reality to Hear the Past

In a guest lecture to a digital history class at Carleton University in the Fall 2014 semester, Colleen Morgan recounted her experience with the ‘Voices Recognition’ app when it was being tested: ‘Voices, in the cemetery, was certainly the most powerful augmented reality I’ve experienced’.

Building a convincing visual AR experience, one that causes no breaks in presence, is the holy grail of Augmented Reality studies, and something that is virtually impossible to achieve. A break in presence will occur because of the mediation of the experience through a device (head-mounted display, tablet computer, smartphone, etc.); the quality of the rendering of the virtual objects; the latency of the software that delivers the experience to the eyes; the list is endless and scale-less – once you ‘solve’ one break in presence, another occurs. The goal, then, can never be to eliminate breaks in presence completely, but to recognise them and treat them with an historian’s caution. Indeed, we can play with them deliberately, using their inevitability to underline the broader historical points we wish to make. For example, the artificial crescendo and diminuendo used by Historical Friction and Voices Recognition arrests the user, making them stop and consider why the sounds are getting louder or quieter. By inserting prehistoric sounds into the modern landscape, Eve creates an anachronistic environment. This is a clear break in presence, since that sound should never be heard in the present. However, the alien nature of that particular sound in that landscape jars our cognitive intentional state and again prompts us to examine what that sound might be and why it has been placed in that particular location.

In this way the case studies presented show that AAR does not always have to be a ‘recreation’ or a fully immersive experience. Instead, much as we would treat the written word as the result of a process of bias and production, we should treat any augmented reality experience as the result of a process of bias (what is represented), production (the quality of the experience) and delivery (the way in which it is delivered). Hearing the past requires that we pay attention not just to effect but also to affect, and in so doing, it prompts the kind of historical thinking that we should wish to see in the world.

Sherlock Holmes, Samuel Vimes, and Archaeological Equifinality


Sherlock Holmes:

How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?

Sam Vimes:

… he distrusted the kind of person who’d take one look at another man and say in a lordly voice to his companion, ‘Ah, my dear sir, I can tell you nothing except that he is a left-handed stonemason who has spent some years in the merchant navy and has recently fallen on hard times,’ and then unroll a lot of supercilious commentary about calluses and stance and the state of a man’s boots, when exactly the same comments could apply to a man who was wearing his old clothes because he’d been doing a spot of home bricklaying for a new barbecue pit, and had been tattooed once when he was drunk and seventeen and in fact got seasick on a wet pavement. What arrogance! What an insult to the rich and chaotic variety of the human experience!


The real world was far too real to leave neat little hints. It was full of too many things. It wasn’t by eliminating the impossible that you got at the truth, however improbable; it was by the much harder process of eliminating the possibilities.

For my money, Sam Vimes is the better detective. Here he has nailed the concept of equifinality: the idea that many different paths could lead to the evidence that we find. Vimes would make a good archaeologist. The problem, though, for archaeology (or history, or anthropology, or… or… or…) is that we don’t really grapple with the idea of equifinality in our writing. This is why history, archaeology, and the digital humanities need an experimental mindset. Through experimentation, we can whittle down the possibilities. Scott Weingart has an excellent post on this very issue, through analogy with astronomy:

Astronomers and historians both view their subjects from great distances; too far to send instruments for direct measurement and experimentation […] historians are still stuck looking at the past in the same way we looked at the stars for so many thousands of years: through a glass, darkly. Like astronomers, we face countless observational distortions, twisting the evidence that appears before us until we’re left with an echo of a shadow of the past. We recreate the past through narratives, combining what we know of human nature with the evidence we’ve gathered, eventually (hopefully) painting ever-clearer pictures of a time we could never touch with our fingers.

In which case, what we need is a laboratory for running different micro might-have-beens (I distrust uber-do-everything social models, because with so many moving parts, how the hell do you know what’s going on?). Fortunately, Scott and I have just published one over in the Journal of Archaeological Method and Theory. (It will be part of a special issue on archaeological network analysis, but JAMT does a thing called ‘online first’, so you can see articles before they’re pulled into a particular issue.)

The Equifinality of Archaeological Networks: an Agent-Based Exploratory Lab Approach

The agent based model itself is available on figshare at


In this article, using this agent-based model, we examine a theory of Roman economic history; we work out the kinds of generative processes for networks that such a vision of the Roman economy might imply; we model these; and we compare the results with known archaeological networks. More or less; you’ll have to read it for all the nitty gritty. Happily, if you’re paywalled-out, here’s a pre-print (the open access fee for Springer is $3000 US!!!! and I ain’t got that kind of cash – but that’s the subject for a post for another day).


When we find an archaeological network, how can we explore the necessary versus contingent processes at play in the formation of that archaeological network? Given a set of circumstances or processes, what other possible network shapes could have emerged? This is the problem of equifinality, where many different means could potentially arrive at the same end result: the networks that we observe. This paper outlines how agent-based modelling can be used as a laboratory for exploring different processes of archaeological network formation. We begin by describing our best guess about how the (ancient) world worked, given our target materials (here, the networks of production and patronage surrounding the Roman brick industry in the hinterland of Rome). We then develop an agent-based model of the Roman extractive economy which generates different kinds of networks under various assumptions about how that economy works. The rules of the simulation are built upon the work of Bang (2006; 2008) who describes a model of the Roman economy which he calls the ‘imperial Bazaar’. The agents are allowed to interact, and the investigators compare the kinds of networks this description generates over an entire landscape of economic possibilities. By rigorously exploring this landscape, and comparing the resultant networks with those observed in the archaeological materials, the investigators will be able to employ the principle of equifinality to work out the representativeness of the archaeological network and thus the underlying processes.
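The published model itself is linked above. Purely to illustrate the equifinality problem in miniature (this is not the Roman bazaar model), consider two generative processes that each grow a network one node at a time: uniform random attachment versus preferential attachment. The resulting networks have identical edge counts and can look superficially similar, yet the processes behind them differ, and it is signatures like the degree distribution that let us tell them apart:

```python
import random

def grow_network(n, preferential, seed=42):
    """Grow an n-node network, one new node attaching per step.

    The target of each attachment is chosen either uniformly at random
    or (if preferential) proportionally to current degree: two simple
    generative processes that can yield similar-looking networks.
    """
    rng = random.Random(seed)
    degree = {0: 0}
    edges = []
    for new in range(1, n):
        nodes = list(degree)
        if preferential:
            # +1 so the initial isolated node can still be chosen
            weights = [degree[v] + 1 for v in nodes]
            target = rng.choices(nodes, weights=weights)[0]
        else:
            target = rng.choice(nodes)
        edges.append((new, target))
        degree[new] = 1
        degree[target] += 1
    return degree, edges

deg_pref, _ = grow_network(500, preferential=True)
deg_rand, _ = grow_network(500, preferential=False)
# Preferential attachment concentrates links on a few hubs; comparing
# e.g. max(deg_pref.values()) against max(deg_rand.values()) is the kind
# of signature one would test an archaeological network against.
```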


Text Analysis of the Grand Jury Documents

a topic in the grand jury documents, #ferguson


I watched Twitter and the CBC while the prosecutor was reading his statement. I watched the live feeds from Ferguson, and other cities around the US. Back in August, when this all first began, I was glued to my computer, several feeds going at once.

A spectator.

Yesterday, Mitch Fraas put the grand jury documents (transcripts of the statements, the proceedings) into Voyant Tools:

These ultimately came from here:

So today, I began, in a small way, to try to make sense of it all, the only way that I can. Text analysis.

Here’s the Voyant Tools corpus

Not having read the full corpus closely (this is, of course, a *distant* tool), it certainly looks as if the focus was on working out what Brown was doing, rather than Wilson…

I started topic modeling, using R & MALLET.

and I put everything up on github

but then I felt that I could improve the analysis; I created one concatenated file, then broke it into 1000-line chunks. The latest inputs, outputs, and scripts are all on my github page.
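Topic models are sensitive to document length, which is why the concatenated transcript was re-cut into uniform pieces before modeling. The chunking step itself is trivial; a sketch (the file names are hypothetical):

```python
def chunk_lines(lines, size=1000):
    """Split a list of lines into consecutive chunks of at most `size` lines."""
    return [lines[i:i + size] for i in range(0, len(lines), size)]

# Hypothetical usage: one concatenated transcript in, numbered chunks out,
# each chunk then treated as a 'document' by MALLET.
# with open("grandjury_all.txt") as f:
#     chunks = chunk_lines(f.readlines())
# for i, chunk in enumerate(chunks):
#     with open(f"chunk_{i:03}.txt", "w") as out:
#         out.writelines(chunk)
```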

The most haunting…

And all 100 topics…

None of this counts as analysis. But by putting it all together, my hope is that more people will grab the text files, grab the R script, explore the Voyant corpus, and really put this all under the microscope. I was tremendously affected by Bethany’s latest blog post, ‘All at once‘, which discusses her own reaction to recent news in both Ferguson and UVa, and elsewhere. It was this bit at the end that really resonated:

[…]we need analytical and interpretive platforms, too, that help us embrace our own subjective positioning in the systems in which we labor–which means, inevitably, to embrace our own complicity and culpability in them. And we need these, at the same time, to help us see beyond: to see patterns and trends, to read close and distantly all at once, to know how to act and what to do next. We need platforms that help us understand the workings of the cogs, of which we are one.

So here’s my small contribution. Maybe this can be a platform for someone to do a deeper analysis, to get started with text analysis, to read distantly and closely, to see beyond, and to understand what happened during the Grand Jury.

A Digital Archaeology of Digital Archaeology: work in progress

Ethan Watrall and I have been playing around with data mining as a way of writing a historiography of digital & computational archaeology. We’d like to invite you to play along.

We’ll probably have something to say on this at the SAA in April. Anyway, we’ve just been chugging along slowly, sharing the odd email, google doc, and so on – and a monstrous huge topic model browser I set up. Yesterday, an exchange on twitter took place that prompted us to share those materials.

This prompted a lot of chatter, including:

and this:

So let’s get this party started, shall we?


While there’s a lot of movement towards sharing data, and open access publications, there’s also this other space of materials that we don’t talk about too much – the things we build from the data that we (sometimes) share that enable us to write those publications we (sometimes) make open access. This intermediate stage never gets shared. Probably with good reason, but I thought given the nature of digital work, perhaps there’s an opportunity here to open not just our research outputs & inputs, but also our process to wider participation.

Hence this post, and all that follows.


Here’s what I did. I scoured JSTOR’s DFR for anglophone journals, from 1935 onwards (full bibliography right here: ). Then I fitted various topic models to them, using Andrew Goldstone’s dfr-topics, an R package that runs MALLET on the bag-of-words that DFR gives you, and ran the result through Andrew’s dfr-browser (tag line: “Take a MALLET to disciplinary history!”).

The results can be viewed here. Like I said, this is the middle part of an analysis that we’re sharing here. Want to do some historiography with a distant reading approach? We’d love to see what you spot/think/observe in these models (maybe your students would like a go?) In which case, here’s an open pad for folks to share & comment.

Why would you bother? Well, it occurred to me that I’ve never seen anyone try to crowdsource this step of the process. Maybe it’s a foolish idea. But if folks did, and there was merit to this process, maybe some kind of digital publication could result where all contributors would be authors? Maybe a series of essays, all growing from this same body of analysis? Lots of opportunities.

Stranger things have happened, right?


Just to get you going, here are some of the things I’ve noticed, and some of my still-churning thoughts on what all this might mean (I’ve pasted this from another document; remember, work in progress!):

remembering that in topic modeling, a word can be used in different senses in different topics/discourses (thus something of the semantic sense of a word is preserved)

tools used:

- Stanford TMT for a detailed view of the CAA (Computer Applications in Archaeology) proceedings

- Mimno’s browser-based jsLDA for a detailed view of the correlations between topics (using CAA & IA – Internet Archaeology, only the open access materials from before it went fully OA in October 2014)

- Goldstone’s dfr-topics for R and dfr-browser to visualize all 21,000 articles as a single topic model

- the same again for individual journals: AJA, JFA, AmA, CA, JAMT, WA


Stanford TMT of CAA, 1974–2011


- no stoplist used; the TMT discards the most prominent and least likely words from the analysis

- its output is formatted in such a way that it becomes easy to visualize the patterns of discourse over time (MALLET, the other major tool for doing topic modeling, requires much more massaging to get the output into such a form. The right tool for the right job.)

- 30 topics gives a good breakdown; topic 26 contains garbage (‘caa proceedings’ etc. as topic words)

In 1974, the most prominent topics were:

topic 1 – computer, program, may, storage, then, excavation, recording, all, into, form, using, retrieval, any, user, output, records, package, entry, one, unit

topic 6: but, they, one, time, their, all, some, only, will, there, would, what, very, our, other, any, most, them, even

topic 20: some, will, many, there, field, problems, may, but, archaeologists, excavation, their, they, recording, however, record, new, systems, most, should, need

The beginnings of the CAA are marked by hesitation and prognostication: what *are* computers for, in archaeology? There is a sense that for archaeologists, computation is something that will be useful insofar as it can be helpful for recording information in the field. With time, topic 1 diminishes. By 2000 it is nearly non-existent.  The hesitation expressed by topics 6 and 20 continues though. Archaeologists do not seem comfortable with the future.

Other early topics that thread their way throughout the entire period are topics 5, 2, 27 and 28:

Topic 5: matrix, units, stratigraphie, relationships, harris, unit, between, method, each, attributes, may, two, diagram, point, other, seriation, one, all, stratigraphy, sequence

Topic 2: area, survey, aerial, north, features, sites, region, located, excavation, river, areas, during, field, its, large, project, south, water, over, fig

Topic 27: sites, monuments, heritage, national, record, management, cultural, records, development, systems, england, database, english, its, survey, new, will, also, planning, protection.

Topic 28: museum, museums, collections, project, national, documentation, all, database, archives, about, archive, objects, sources, documents, university, text, our, also, collection, reports.

These topics suggest the ‘what’ of topic 1: how do we deal with contexts and units? Large surveys? Sites and monuments records and museum collections? Interestingly, topics 27 and 28 can be taken as representing something of the professional archaeological world (as opposed to ‘academic’ archaeology).

Mark Lake, in a recent review of simulation and modeling in archaeology (JAMT 2014) describes various trends in modeling [discuss]. Only topic 9 seems to capture this aspect of computational/digital archaeology:

model, models, social, modeling, simulation, human, their, between, network, approach, movement, networks, past, different, theory, how, one, population, approaches, through

Interestingly, for this topic, there is a thin thread from the earliest years of the CAA to the present (2011), with brief spurts in the late 70s and late 80s, then a consistent presence throughout the 90s, and a larger burst from 2005–2008. Lake characterizes thus…. [lake]. Of course, Lake also cites various books and monographs which this analysis does not take into account.

If we regard ‘digital archaeology’ as something akin to ‘digital humanities’ (and so distinct from ‘archaeological computation’), how does it appear in this tangled skein, if it appears at all? A rough distinction between the two perspectives can be framed using Trevor Owens’ meditation on what computation is for. Per Owens, we can think of a humanistic use of computing as one that helps us deform our materials, to give us a different perspective on them. Alternatively, we can think of computing as something that helps us justify a conclusion; that is, the results of the computation are used to argue that such-a-thing is most likely in the past, given this model/map/cluster/statistic. In which case, there are certain topics that seem to imply a deformation of perspective (and thus a ‘digital archaeology’ rather than an archaeological computation):

topic 03: cultural, heritage, semantic, model, knowledge, systems, web, standards, ontology, work, domain, conceptual, different, crm, between, project, based, approach

topic 04: knowledge, expert, process, its, artefacts, set, problem, different, concepts, human, systems, but, they, what, our, scientific, about, how, all, will

topic 07: project, web, digital, university, internet, access, online, service, through, electronic, http, european, technologies, available, public, heritage, will, services, network, other

topic 14: virtual, reality, museum, public, visualization, models, reconstruction, interactive, museums, multimedia, heritage, envrionment, scientific, reconstructions, will, computer, technologies, environments, communication

topic 29: gis, spatial, time, within, space, temporal, landscape, study, into, social, approaches, geographic, applications, approach, features, environmental, based, between, their, past

Topic 3 begins to emerge in 1996 (although its general discourse is present as early as 1988).  Topic 4 emerges with strength in the mid 1980s, though its general thrust (skepticism about how knowledge is created?) runs throughout the period. Topic 7 emerges in 1994 (naturally enough, when the internet/web first hit widespread public consciousness). Should topic 7 be included in this ‘digital archaeology’ group? Perhaps, inasmuch as it also seems to wrestle with public access to information, which would seem not to be about justifying some conclusion about the past but rather opening perspectives upon it. Topic 14 emerges in the early 1990s.

Topic 29, on first blush, would seem to be very quantitative. But the concern with time and temporality suggests that this is a topic that is trying to get to grips with the experience of space. Again, like the others, it emerges in the late 1980s and early 1990s. [perhaps some function of the personal computer revolution..? instead of being something rare and precious -thus rationed and only for ‘serious’ problems requiring distinct answers – computing power can now be played with and used to address less ‘obvious’ questions?]

What of justification? These are the topics that grapple with statistics and quantification:

Topic 10: age, sites, iron, settlement, early, bronze, area, burial, century, one, period, their, prehistoric, settlements, grave, within, first, neolithic, two, different

Topic 11: pottery, shape, fragments, classification, profile, ceramics, vessels, shapes, vessel, sherds, method, two, ceramic, object, work, finds, computer, fragment, matching, one

Topic 13: dating, radiocarbon, sampling, london, dates, some, but, betwen, than, e.g. , statistical, chronological, date, there, different, only, sample, results, one, errors

Topic 15: landscape, project, study, landscapes, studies, cultural, area, gis, human, through, their, its, rock, history, historical, prehistoric, environment, our, different, approach

Topic 17: sutdy, methods, quantitative, technqiues, approach, statistical, using, method, studies, number, artifacts, results, variables, two, most, bones, based, various, analyses, applied

Topic 19: statistical, methods, techniques, variables, tiie, statistics, density, using, cluster, technique, multivariate, method, two, nottingham, example, principal, some, university

Topic 21: model, predicitve, modelling, models, cost, elevation, viewshed, surface, sites, gis, visibility, van, location, landscape, areas, one, terrain, dem, digital

Topic 23: image, digital, documentation, images, techniques, laser, scanning, models, using, objects, high, photogrammetry, methods, model, recording, object, surveying, drawings, accuracy, resolution

Topic 24: surface, artefact, distribtuion, artefacts, palaeolithic, materials, sites, deposits, within, middle, area, activity, during, phase, soil, processes, lithic, survey, remains, france

Macroscopic patterns

Screen Shot 2014-11-09 at 3.45.25 PM

This detail of the overall flow of topics in the CAA proceedings points to the period 1978–1983 as a punctuation point, an inflection point, of new topics within the computers-and-archaeology crowd. The period 1990–2011 contains minor inflections around 1997 and 2008.



In terms of broad trends, pivot points seem to be the late 70s, 1997, and 2008. Given that our ‘digital archaeology’ themes emerge in the late 90s, let’s add Internet Archaeology to the mix [why this journal, why this time: because of the 90s inflection point? quicker publication schedule? ability to incorporate novel outputs that could never be replicated in print?]. This time, instead of searching for topics, let’s see what correlates with our digital archaeology topics. For this, David Mimno’s browser-based LDA topic model (jsLDA) is most useful. We run it for 1000 iterations and find the following correlation matrix.

[insert discussion here]

– 1000 iterations. Your 1000 iterations will be slightly different from mine, because this is a probabilistic approach.

– the browser produces csv files for download, as well as a csv formatted for visualizing patterns of correlation as a network in Gephi or other network visualization software.

– the stop list is MALLET’s en, fr, and de lists, plus archaeology, sites, data, research.

– running this in a browser is not the most efficient way to do this kind of analysis, but the advantage is that it allows the reader to explore how topics sort themselves out; its visualization of correlated topics is very effective and useful.

– note word usage. Mimno’s browser calculates the ‘specificity’ of a word to a topic. The closer to 1.0, the more that word is distributed within a single topic only. Thus, we can take such words as true ‘keywords’ for particular kinds of discourses [which will be useful in exploring the 20000 model]. “Computer” has a specificity of 0.61, while “virtual” has a specificity of 0.87, meaning that ‘computer’ is used in a number of topics, while ‘virtual’ is almost exclusively used in a single discourse. ‘Predicitve’ has a specificity of 1, and ‘statistical’ of 0.9.
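I am not certain of the exact formula jsLDA uses for specificity, but one plausible reconstruction treats it as one minus the normalized entropy of a word’s distribution over topics: it equals 1.0 when the word sits entirely in one topic, and approaches 0 when the word is spread evenly across all of them. A sketch under that assumption:

```python
import numpy as np

def specificity(word_topic_counts):
    """1.0 when a word appears in only one topic; near 0 when it is spread
    evenly across topics. A normalized-entropy reconstruction of the measure
    jsLDA reports -- the browser's exact formula may differ."""
    p = np.asarray(word_topic_counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                      # drop topics the word never appears in
    if len(p) == 1:
        return 1.0                    # confined to a single topic
    entropy = -(p * np.log(p)).sum()
    return 1.0 - entropy / np.log(len(word_topic_counts))

print(specificity([40, 0, 0, 0]))     # confined to one topic
print(specificity([10, 10, 10, 10]))  # evenly spread
```

On this reconstruction a word like ‘predicitve’, used only within the predictive-modelling topic, scores 1.0, while a word like ‘computer’, scattered across several discourses, scores much lower.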

In the jsLDA model, there are three topics that deal with GIS.

topic 19, gis landscape spatial social approach space study human studies approaches

topic 18, database management systems databases gis web software user model tool

topic 16, sites gis landscape model predictive area settlement modelling region land

The first, topic 19, seems to correspond well with our earlier topic that we argued was about using GIS to offer a new perspective on human use/conception of space (i.e., a ‘digital’ approach, in our formulation). Topics 18 and 16 are clearly about GIS as a computational tool. In the correlation matrix below, blue equals topics that occur together more often than expected, while red equals less often than expected; the size of the dot gives an indication of how much. Thus, if we look for the topics that go hand in hand with topic 19, the strongest are topic 16 (the predictive power of GIS) and topic 10 (social, spain, simulation, networks, models).
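The ‘more/less often than expected’ comparison behind that matrix can be sketched as pointwise mutual information over topic co-occurrence in documents. The cutoff value below is an assumption for illustration; jsLDA’s actual threshold may differ:

```python
import numpy as np

def topic_correlations(doc_topics, cutoff=0.25):
    """For each topic pair, compare how often both topics appear in the
    same document (proportion above `cutoff`) against what independence
    would predict. Positive -> co-occur more than expected (blue in the
    jsLDA view), negative -> less than expected (red)."""
    present = (doc_topics > cutoff).astype(float)   # docs x topics
    n = present.shape[0]
    p = present.mean(axis=0)                        # marginal P(topic in doc)
    joint = (present.T @ present) / n               # P(topic j and topic k)
    with np.errstate(divide="ignore"):              # log(0) -> -inf for never-co-occurring pairs
        return np.log(joint / np.outer(p, p))

# toy document-topic proportions: topics 0 and 1 travel together,
# topic 2 appears alone
doc_topics = np.array([[0.6, 0.3, 0.1],
                       [0.5, 0.4, 0.1],
                       [0.1, 0.1, 0.8]])
pmi = topic_correlations(doc_topics)
```

Here `pmi[0, 1]` comes out positive (topics 0 and 1 co-occur more than chance would predict) and `pmi[0, 2]` negative, mirroring the blue and red dots.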

Screen Shot 2014-11-09 at 5.28.47 PM

The ‘statistical, methods, techniques, artefact, quantitative, statistics, artefacts’ topic is positively correlated with the ‘human, material, palaeolithic’, ‘time, matrix, relationship’, and ‘methods, points, point’ topics. This constellation of topics is clearly a use of computation to answer or address very specific questions.

– in jsLDA there’s a topic ‘database project digital databases web management systems access model semantic’ – positively correlated with ‘publication project electoric’, ‘text database maps map section user images museum’, ‘excavation recording’, ‘vr model’, ‘cultural heritage museum’, ‘italy gis’, and ‘sites monuments record’ [see keys.csv for exact label]. These seem to be topics that deal with deforming our perspectives while at the same time intersecting with extremely quantitative goals.

So far, we have been distantly reading some 40 years of archaeological work that is explicitly concerned with the kind of archaeology that uses computational and digital approaches. There are punctuation points, ‘virages’ (turning points), and complicated patterns – there is no easy-to-see disjuncture between what the digital humanists imagine is the object of using computers and their critics who see computation as positivism by the back door. It does show that archaeology should be regarded as an early mover in what has come to be known as ‘the digital humanities’, with quite early, sophisticated, and nuanced uses of computing. But how early? And how much has archaeological computing/digital archaeology permeated the discipline? To answer these questions, we turn to a much larger topic model.

Zoom Out Some More

Let’s put this into a broader context. 24 journals from JSTOR were selected for both general coverage of archaeology and for regional/topical specialities. The resulting dataset contains 21000 [get exact number] articles, mostly from the past 75 years (a target start date of 1940 was selected for journals whose print run predates the creation of the electronic computer – thus computer = machine, not = woman who computes). 100 topics seemed to capture the range of thematic discourses well. We looked first for topics that seem analogous to the CAA & IA topics (CAA and IA were not included in this analysis because they are not within the JSTOR DFR database; Goldstone’s DFR Browser was used for the visualization of the topics). [better explanation, rationale, to be written, along with implications]. We also observe ‘punctuation points’ in this broader, global (anglosphere) representation of archaeology that correspond with the inflection points in the small model – many trends that fit, but also others that do not fit, with the standard historiography of archaeology. We then dive into certain journals (AJA, JFA, AmA, JAMT) to tease these trends apart. Just what has been the impact of computational and digital archaeology in the broader field?

Screen Shot 2014-11-09 at 5.29.24 PM

The silhouette in the second column gives a glimpse into the topic’s prevalence over the ca 75 years of the corpus. The largest topic, topic 10, with its focus on ‘time, made, work, years, great, place, make’, suggests a kind of special pleading: in the rhetoric of archaeological argument, one always has to explain just why this particular site/problem/context is important. A similar topic was observed in the model fitted to the CAA & IA [– in the 20000 model, there’s the ‘time’ topic: time made work years great place make long case fact point important good people times. It’s the largest topic, and accounts for 5.5%. Here, there is one called ‘paper time work archaeologists introduction present important problems field approach’; it’s slightly correlated with every other topic. Seems very similar.]

More interesting are the topics a bit further down the list. Topic 45 (data, analysis, number, table, size, sample) is clearly quantitative in nature, and its silhouette matches our existing stories about the rise of the New Archaeology in the late 60s and early 70s. Topics 38 and 1 seem to be topics related to describing finds – ‘found, site, stone, small, area’; ‘found, century, area, early, excavations’. Topic 84 suggests the emergence of social theories and power – perhaps an indication of the rise of Marxist archaeologies? Further down the list we see ‘professional’ archaeology and cultural resource management, with peaks in the 1960s and early 1980s.

Screen Shot 2014-11-09 at 5.29.56 PM

Topic 27 might indicate perspectives connected with gender archaeology – “social, women, material, gender, men, objects, female, meaning, press, symbolic” – and it accounts for 0.8% of the corpus: about 160 articles. ‘Female’ appears in four topics: topic 27; topic 65 (“head, figure, left, figures, back, side, hand, part” – art history? 1.4% of the corpus); topic 58 (“skeletal, human, remains, age, bone” – osteoarchaeology, 1.1% of the corpus); and topic 82 (“age, population, human, children, fertility” – demographics? 0.8% of the corpus).

[other words that would perhaps key into major trends in archaeological thought? looking at these topics, things seem pretty conservative, whatever the theorists may think, which is surely important to draw out and discuss]

Concerned as we are to unpick the role of computers in archaeology more generally, if we look at the word ‘data’ in the corpus, we find it contributes to 9 different topics ( ). It is the most important word in topic 45 (data, analysis, number, table, size, sample, study) and in topic 55 (data, systems, types, information, type, method, units, technique, design). The word ‘computer’ is also part of topic 55. Topic 45 looks like a topic connected with statistical analysis (indeed, ‘statistical’ is a minor word in that topic), while topic 55 seems to be more ‘digital’ in the sense we’ve been discussing here. Topic 45 is present in 3.2% of the corpus, growing in prominence from the early 1950s, falling in the 60s, resurging in the 70s, and then decreasing to a more or less steady state in the 00s.
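This kind of word-to-topic lookup is easy to script over the topic-key lists that MALLET or DFR Browser emit. The sample key lines below are invented for illustration, not the actual model output:

```python
# Sketch: which topics does a given word key into?
# The topic-key strings here are made-up stand-ins.
topic_keys = {
    45: "data analysis number table size sample study",
    55: "data systems types information type method units technique design computer",
    58: "skeletal human remains age bone",
}

def topics_containing(word, keys):
    """Return the sorted topic ids whose key words include `word`."""
    return sorted(t for t, words in keys.items() if word in words.split())

print(topics_containing("data", topic_keys))      # appears in two topics here
print(topics_containing("computer", topic_keys))  # appears in one
```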

Screen Shot 2014-11-09 at 5.30.34 PM

Topic 55 holds some surprises:

Screen Shot 2014-11-09 at 5.31.17 PM

The papers in 1938 come from American Antiquity volume 4 and show an early awareness not just of quantitative methods, but also of the reflective way those methods affect what we see [need to read all these to be certain of this].

Next steps

– punctuation points – see

major – 1940 (but perhaps an artefact of the boundaries of the study)

minor – early 1950s

minor – mid 1960s

major – 1976 (American Antiquity does something odd in this year)

major – 1997–8



I was at #seeingthepast these last two days (website). During one of the discussions, the idea of glitchiness of augmented reality was raised, and ways that this might intersect with materiality were explored. At one point, the idea of an app that let people break museum objects (the better to know them and how they were created) was mooted. (nb, I didn’t come up with the idea; it might have been Keri or Caitlin).

I tweeted:

and archaeologists on the twitterverse responded. (I would then periodically inform the symposium of the twitter discussion, which would spark ruminations on the virtuality of conferences, but I digress):

On the way home, I had time to think about how this might work. If you’ve got the chops to make it happen, here’s how I think ‘Breakage’ could go – I’d love to see something like:

– photos uploaded from museum online catalogues, exhibitions, or databases (ones without good provenances)

– user can pan through these. When one catches the user’s fancy, the user selects it: and it shatters into pieces.

– each piece can then be examined; each piece highlights some aspect inherent to the object (makers’ marks, artistic effects, clay fabric, whatever).

– touch again, and the pieces are put into a *possible* context. Touch again, a different *possible* context. Show how different meanings could be understood if this were the actual context, and how it… but damn. We don’t actually know what the piece’s real context was, so we don’t know anything.

– and then the image would be deleted from the user’s version of the app, never to be seen again, as if it has been looted anew.
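The steps above could be sketched as a small state cycle: whole, shattered, a succession of possible contexts, then gone. Everything below is hypothetical – the class, names, and contexts are invented for illustration and come from no real app or museum API:

```python
# Hypothetical sketch of the 'Breakage' object lifecycle as a tiny
# state machine. All names here are invented for illustration.
class BreakageObject:
    def __init__(self, image, possible_contexts):
        self.image = image
        # candidate findspots only -- the real context is unknowable
        self.possible_contexts = list(possible_contexts)
        self.state = "whole"

    def touch(self):
        if self.state == "whole":
            self.state = "shattered"            # pieces now examinable
        elif self.possible_contexts:
            # each touch shows a different *possible* context
            self.state = self.possible_contexts.pop(0)
        else:
            self.state = "deleted"              # gone, as if looted anew
        return self.state

obj = BreakageObject("amphora.jpg", ["temple deposit?", "domestic midden?"])
print(obj.touch())  # shattered
```

Once the possible contexts are exhausted, the next touch deletes the object from the user’s copy of the app, which is the whole bitter point of the exercise.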