Bots of Archaeology: Machines Writing Public Archaeology?

Today was the day for the keynotes for the Public Archaeology Twitter Conference; the main papers unroll tomorrow, so set your Twitters to #PATC and enjoy! This conference, I think, might well be one of those landmark conferences we discuss in years to come.

For convenience, I’ve copied my keynote tweets below. 

https://twitter.com/electricarchaeo/status/857625409084682244

https://twitter.com/electricarchaeo/status/857625638051618817

https://twitter.com/electricarchaeo/status/857625894504017920

https://twitter.com/electricarchaeo/status/857626146921476096

https://twitter.com/electricarchaeo/status/857626395488342016

https://twitter.com/electricarchaeo/status/857626648866410497

https://twitter.com/electricarchaeo/status/857626896766345216

https://twitter.com/electricarchaeo/status/857627151075618816

https://twitter.com/electricarchaeo/status/857627399399145472

https://twitter.com/electricarchaeo/status/857627651992764416

https://twitter.com/electricarchaeo/status/857627905307758593

https://twitter.com/electricarchaeo/status/857628155309211648

https://twitter.com/electricarchaeo/status/857628406183108608

https://twitter.com/electricarchaeo/status/857628657921122305

https://twitter.com/electricarchaeo/status/857628912620232704

https://twitter.com/electricarchaeo/status/857629166245642241

https://twitter.com/electricarchaeo/status/857629415802404865

https://twitter.com/electricarchaeo/status/857629665610973184

https://twitter.com/electricarchaeo/status/857629916472291328

https://twitter.com/electricarchaeo/status/857630167694430208

https://twitter.com/electricarchaeo/status/857630420577337345

https://twitter.com/electricarchaeo/status/857630673464590337

https://twitter.com/electricarchaeo/status/857630922761424896

https://twitter.com/electricarchaeo/status/857631174453190656

https://twitter.com/electricarchaeo/status/857631430364495872

https://twitter.com/electricarchaeo/status/857631679342493696

…and if you’re wondering why the featured image (and the final image) are shots from the TV adaptation of Terry Pratchett’s ‘Going Postal’, see this.

Open By Default

Recently, there’s been a spate of openness happening in this here government town. This post is just me collecting my thoughts.

First thing I saw: a piece in the local paper about the Canadian Science and Technology Museum’s policy on being ‘open by default’. The actual news release was back in April.

This is exciting stuff; I’ve had many opportunities to work with the folks from CSTM, and they are consistently in the lead around here in terms of how they’re thinking about the ways their collections (archival, artefactual and textual) could intersect with the open web.

This morning, I was going over the Government of Canada’s ‘Draft Plan on Open Government’ and annotating it. (I’m using wordpress.com, so can’t use Kris Shaffer’s awesome new plugin that would pull these annotations into this post.)

There are a lot of positive measures in this plan. Clearly, there’s been a lot of careful thought and consideration, and I applaud this. There are a few things that I am concerned about, though (and you can click on the link above, ‘annotating it’, to see my annotations). Broadly, it’s about the way access != openness. It’s not enough to simply put materials online, even if they have all sorts of linked open data goodness. There are two issues here.

  1. accessing data is not something equitably available to all. Big data dumps require fast connections, good internet plans, good connectivity. In Canada, if you’re in a major urban area, you’re in luck. If you live in a more rural area, or a poorer area, or an area that is, broadly speaking, under-educated, you will have none of these. Where I’m from, there’s a single telephone cable that connects everything (although a cell phone tower was built in recent years; but have you looked at the farce that is Canadian mobile data?).
  2. accessing data so that it becomes useful depends on training. Even I often struggle to make good use of things like linked open data. Open Context, for instance (an open archaeological data publishing platform), provides example ‘API recipes’ to show people what’s possible and how to actually accomplish something (see the sketch just below).
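To give a flavour of what such a recipe looks like, here is a minimal sketch in Python that pulls a handful of records from Open Context’s JSON search service. The endpoint path, parameters, and result keys below are my assumptions from memory, so treat it as illustrative and check Open Context’s own API documentation for the canonical version.

```python
import requests

# Illustrative only: the endpoint, parameters, and key names below are
# assumptions about Open Context's JSON search API, not a verified recipe.
url = "https://opencontext.org/subjects-search/.json"
params = {"q": "Roman", "rows": 5}

response = requests.get(url, params=params, timeout=30)
response.raise_for_status()
data = response.json()

# Each result should carry a human-readable label and a stable URI,
# which is where the linked-open-data goodness begins.
for record in data.get("oc-api:has-results", []):
    print(record.get("label"), record.get("uri"))
```

Even this tiny example presumes a comfort with requests, JSON, and the command line that is itself a product of training.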

So my initial thought is this: without training and education (or funds to encourage same), open data becomes a public resource that only the private few can exploit successfully. Which makes things like the Programming Historian and the emergence of digital humanities programs at the undergraduate and graduate level all the more important, if the digital divide (and the riches that being on the right side of it brings) is to be narrowed. If ‘open by default’ is to be for the common good.

 

Digital Humanities is Archaeology

Caution: pot stirring ahead

I’m coming up on my first sabbatical. It’s been six years since I first came to Carleton – terrified – to interview for a position in the history department, in this thing, ‘digital humanities’. The previous eight years had been hard, hustling for contracts, short term jobs, precarious jobs, jobs that seemed a thousand miles away from what I had become expert in. I had had precisely one other academic interview prior to Carleton (six years earlier. Academia? I’d given up by then). Those eight years taught me much (and will be a post for another day, why I decided to give one more kick at the can).

The point of this morning’s reflection is to think about what it was that I was doing then that seemed appropriate-enough that I could spin my job application around it. At that time, it was agent based modeling.

During those previous eight years, I had had one academic year as a postdoc at the U Manitoba. The story of how I got that position is a story for another day, but essentially, it mirrors the xkcd cartoon Scott linked to the other day. I said ‘fuck it’. And I wrote an application that said, in essence, I’ve got all these networks; I want to reanimate them; agent modeling might be the ticket. (If you’ve ever spent any time in the world of stamped brick studies, this is NOT what we do…). So I did. And that’s what I had in hand when I applied to Carleton.

‘Agent modeling is digital humanities’, I said. Given that nobody else had much idea what DH was/is/could be, it worked. I then spent the next six years learning how to be a digital humanist. Learning all about what the literary historians are doing, learning about corpus linguistics, learning about natural language processing, learning learning learning. I could program in Netlogo; I learned a bit of Python. I learned a bit of R. It seemed, for a long time, that my initial pitch to the department was wrong, though. DH didn’t do the agent modeling schtick. Or at least, nobody I saw who called themselves a ‘digital humanist’ did. Maybe some digital archaeologists did (and are they DH? and how does DA differ from the use of computation in archaeology?)

But. I think there’s a change in the air.

I think, maybe, the digital humanities are starting to come around to what I’ve been arguing for over a decade, in my lonely little corners of the academy. Here’s some stuff I wrote in 2009, based on work I did in 2006, which was founded on archaeological work I did in 2001:

In any given social situation there are a number of behavioural options an individual may choose. The one chosen becomes “history,” the others become “counter-factual history.” As archaeologists, we find the traces of these individual decisions. In literature we read of Cicero’s decision to help his friend with a gift of money. What is the importance of the decision that Cicero did not make to not help his friend? How can we bridge the gap between the archaeological traces of an individual’s decision, and the option he or she chose not to pursue in order to understand the society that emerged from countless instances of individual decision-making? Compounding the problem is that the society that emerged influenced individual decision-making in a recursive, iterative fashion. The problem, simply stated, is one of facing up to complexity. A major tool for this problem is the agent based simulation.

[..]

[A]gent-based modeling […] requires modellers to make explicit their assumptions about how the world operates (Epstein). This is the same argument made by Bogost for the video game: it is an argument in code, a rhetoric for a particular view of the world. As historians, we make our own models every day when we conceive how a particular event occurred. The key difference is that the assumptions underlying our descriptions are often implicit.

The rules that we used to encode the model are behaviours derived from archaeology, from the discovered traces of individual interactions and the historical literature. Once the rules for agents in this model and others are encoded, the modeller initiates the simulation and lets the agents interact over and over again. As they interact, larger-scale behaviours – an artificial society – begins to emerge. In using an ABM, our central purpose is to generate the macro by describing the micro.

[…] It is worth repeating that agent-based modelling forces us to formalise our thoughts about the phenomenon under consideration. There is no room for fuzzy thinking. We make the argument in code. Doing so allows us to experiment with past and present human agents in ways that could never be done in the real world. Some ABMs, for example, infect agents with a “disease” to determine how fast it spreads. An ABM allows us to connect individual interactions with globally emergent behaviours. It allows us to create data for statistical study that would be impossible to obtain from real-world phenomena.
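(If the mechanics in that passage feel abstract, here is a deliberately tiny sketch in Python of what ‘describing the micro to generate the macro’ looks like in code. It is illustrative only – the agents, rules, and numbers are invented, and it is not the NetLogo model the quoted work describes.)

```python
import random

# A toy agent-based model in the spirit described above: the rules are
# explicit and operate at the level of the individual, and the macro
# pattern (a skewed distribution of obligation) emerges from thousands
# of repeated interactions. Agents, rules, and numbers are invented.

random.seed(42)

class Agent:
    def __init__(self, name):
        self.name = name
        self.favours_owed = 0  # debts this agent has accumulated

    def decide_to_help(self, asker):
        # Individual-level rule: the more favours the asker already owes,
        # the less willing this agent is to extend another one.
        return random.random() > 0.1 * asker.favours_owed

agents = [Agent(f"agent-{i}") for i in range(50)]

for tick in range(1000):
    asker, patron = random.sample(agents, 2)
    if patron.decide_to_help(asker):
        asker.favours_owed += 1  # the decision leaves a 'trace'

# The macro level: inspect the emergent distribution of obligation,
# which no single rule specified directly.
indebted = sorted(agents, key=lambda a: a.favours_owed, reverse=True)
for agent in indebted[:5]:
    print(agent.name, agent.favours_owed)
```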

That’s a long quote; sorry. But.

Compare with what Sinclair & Rockwell write in their new book, Hermeneutica, pp. 41-42:

…we can say that computers force us to formalize what we know about texts and what we want to know. We have to formally represent a text – something which may seem easy, but which raises questions… Computing also forces us to write programs that formalize forms of analysis and ways of asking questions of a text. Finally, computing forces us to formalize how we want answers to our questions displayed for further reading and exploration. Formalization, not quantification, is the foundation of computer-assisted interpretation.

[…] In text analysis you make models, manipulate them, break them, and then talk about them. Counting things can be part of modeling, but is not an essential model of text analysis. Modeling is also part of the hermeneutical circle; there are formal models in the loop. […] thinking through modeling and formalization is itself a useful discipline that pushes you to understand your evidence differently, in greater depth, while challenging assumptions. We might learn the most when the computer model fails to answer our questions.

The act of modeling becomes a path disciplined by formalization, which frustrates notions of textual knowledge. When you fail at formalizing a claim, or when your model fails to answer questions, you learn something about what is demonstrably and quantifiably there. Formalizing enables interrogation. Others can engage with and interrogate your insights. Much humanities prose supports claims with quotations, providing an argument by association or with general statements about what is in the text – vagaries that cannot be tested by others except with more assertions and quotations. Formalization and modeling, by contrast, can be exposed openly in ways that provide new affordances for interaction between interpretations.

That’s a long quote; sorry. But.

Compare with what Piper writes in the inaugural issue of Cultural Analytics:

One of the key concepts operative in computational research that has so far been missing from traditional studies of culture is that of modeling. A model is a metonymical tool – a miniature that represents a larger whole. But it is also recursive in that it can be modified in relationship to its “fit,” how well it represents this whole. There is a great deal of literature on the role of modeling in knowledge creation and this should become core reading for anyone undertaking cultural analytics. The more we think about our methods as models the further we will move from the confident claims of empiricism to the contingent ones of representation. Under certain conditions, it is true that (i.e. replicable and stable)…

That’s not as long a quote. I’m getting better. But.

Compare with Underwood’s abstract for (and watch the video of) his talk on ‘Predicting the Past’:

We’re certainly comfortable searching and browsing [libraries], and we’re beginning to get used to the idea of mining patterns: we can visualise maps and networks and trends. On the other hand, interpreting the patterns we’ve discovered often remains a challenge. To address that problem, a number of literary scholars have begun to borrow methods of predictive modelling from social science. Instead of tracing a trend and then speculating about what it means, these scholars start with a specific question they want to understand — for instance, how firm is the boundary between fiction and biography? Or, how are men and women described differently in novels? The categories involved don’t have to be stable or binary. As long as you have sources of testimony that allow you to group texts, you can model the boundaries between the groups. Then you can test your models of the past by asking them to make blind predictions about unlabelled examples. Since the past already happened, the point of predicting it is not really to be right. Instead we trace the transformation of cultural categories by observing how our models work, and where they go wrong.
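(In code, the move Underwood describes is compact. Here is a deliberately toy sketch using scikit-learn – invented one-line ‘texts’, a modelled boundary between two groups, a blind prediction about an unlabelled example. It stands in for the far richer models he actually builds.)

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy illustration of the move described above: group labelled texts
# (here, invented one-line 'fiction' vs 'biography' snippets), model the
# boundary between the groups, then ask the model to make blind
# predictions about unlabelled examples. The data are placeholders.

labelled_texts = [
    "she dreamed of dragons beyond the misty hills",
    "the castle gates swung open onto an impossible city",
    "born in 1809, he studied law before entering politics",
    "her letters from 1842 record the family's move to York",
]
labels = ["fiction", "fiction", "biography", "biography"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(labelled_texts)

model = LogisticRegression().fit(X, labels)

# A 'blind prediction' about an unlabelled example; what's interesting is
# not whether the model is right, but where and why it goes wrong.
unlabelled = ["educated at home, she published her first novel in 1847"]
print(model.predict(vectorizer.transform(unlabelled)))
print(model.predict_proba(vectorizer.transform(unlabelled)))
```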

It feels like something is going on. It feels like there’s been a bit of a sea-change in what DH sees as its relationship to the wider world. I feel like there is an arc to my story now that makes sense, that where this field is going fits squarely with where I myself have come from. What is ‘digital humanities’?

It might be that DH is really a branch of archaeology.

Postscriptum

Here’s a thought:

If DH is archaeology in its use of modeling as a core method, and given that modeling inherently builds its theoretical perspectives into its core operations, then the only appropriate way of writing DH must be in simulation. Games. Playful interactions.

Discuss.


BTW: There’s a rich literature in archaeology on modeling, on moving from the incomplete evidence to the rich stories we want to tell. All archaeological data is necessarily incomplete; it’s the foundational problem of archaeology. DH folks might want to give that literature a read. Recently, Ted Underwood posted on ‘the real problem with distant reading’ and the objections folks raise concerning the complexity of human life if considered computationally. Ted comes around to essentially a ‘screw that’ position, and writes,

It’s okay to simplify the world in order to investigate a specific question. That’s what smart qualitative scholars do themselves, when they’re not busy giving impractical advice to their quantitative friends. Max Weber and Hannah Arendt didn’t make an impact on their respective fields — or on public life — by adding the maximum amount of nuance to everything, so their models could represent every aspect of reality at once, and also function as self-operating napkins.

The problems that literary scholars are finding in presenting their models and approaches to their (non-computational) peers have their parallels in archaeological debates from the 70s onwards; I think they might find useful material in those debates. Again: DH is archaeology.

‘Big Data Gothic’ & Data Day 3

Update, 8 pm, March 29. Well, this sure is embarrassing. Turns out, when they said ‘panel’, by gosh, they really did mean panel. I didn’t need to produce all this; I wasn’t presenting anything; I wasn’t a formal speaker… man, make sure to read the fine print, eh? Or in my case, the large print. So, in the end, there were four of us being quizzed by the moderator and the audience. It was a really great conversation. But that’s all it was. This was totally not necessary. So now I’ve got this talk, below, and here it will remain, ne’er to be delivered. But y’all might find it interesting anyway, and perhaps I’ll fix its problems, expand it, turn it into something meaningful, someday. But in the meantime…

D’oh!

I’ve got 12-15 minutes tomorrow, at Carleton’s 3rd Data Day. That’s not a lot of time. I’ve written out roughly what it is I want to talk about – but I go off-script a lot when I speak, so what’s below is only the most nebulous of guides to what’ll actually come out of my mouth tomorrow. In any event, the stuff below would take more like 25-30 minutes if I stuck to the script.

Apparently the day’ll be live-streamed too, at http://carleton.ca/cuids/events/data-day-2/. I’m on at the 2pm-ish slot.

Slides are at: j.mp/sg-dd3.

*title* One of the things I’m interested in, inasmuch as ‘big data’ is a thing in history, is the way our tool use changes us. I’m an archaeologist by training; it’s almost an article of faith in archaeology that our tools change us as much as we change our tools. But I guess I worry that our tools are getting out of control, that we exult in the ways our tools exceed us*. This panel is called ‘Needs and Opportunities for Big Data in Social Sciences and the Humanities’; I think I’d like to rejig that to read ‘ways in which big data needs the social sciences & the humanities’.

1. In recent years, everyone has suddenly had to cope with the sheer amount of data that our always-on, always-connected, always-under-surveillance society has generated. It’s forced us to take stock, and rethink our place in the world. This kind of thing has happened before; in an earlier moment it gave rise to ‘the Gothic’ – and it’s worthwhile thinking through *that* reaction to the changing place of humans in the world, and what it implies for *this* moment. To simplify horribly, and at the risk of undermining my humanities street-cred, I will for the sake of convenience conflate romanticism with the gothic and boil the gothic down to the text on this slide; this is all tied up with the broader changes in western society precipitating out of the Enlightenment and the beginnings of our industrial age. If I said ‘Frankenstein, or the Modern Prometheus’, you get what I’m talking about. Key here is the idea of shock and thrill …

2. …and the annihilation of ‘the self’. That is, the real ambition here with the gothic is to first frame everything from the point of view of the individual, to overwhelm the senses of the viewer or the reader in the terror or majesty of, say, a landscape or a sensation (the gothic horror), such that only the sublime feeling remains. The quantified self, and those people who keep fit-bits and personal trackers, would be at home in the gothic – and indeed, it is from an event explicitly tying that aspect of ‘personal’ big data to the gothic sensibility (see this; also, read this) that I began thinking along these lines. But let’s return to that idea of the annihilation of ‘self’.

3. It’s in this sense that big data is gothic. The traces that we leave are aggregated and interrogated and correlated, a vast digital landscape of signs and signals from which predictions are generated; and yet, without foundation. Microsoft can build a chat-bot that learns from humans, but they don’t understand or can’t foresee that releasing it into a particular environment already toxic for women is going to be a bad idea. The terror and majesty of the algorithm, of the code, of the data – never mind the humans in the foreground – is what matters.

4. This is what I mean by big data gothic: as Zoe Quinn said, ‘if you’re not asking yourself “how could this be used to hurt someone” in your design/engineering process, you’ve failed’. That is to say: you’ve been seduced by the data vista stretching out before you. This is the same impulse that Steve Jobs channeled when he (more or less) said ‘I don’t do marketing, I do what I want because people don’t know what they want’. It’s the same impulse that reaches for plagiarism detection software rather than asking, ‘what is it about my course that makes plagiarism a rational response?’. The seductive lure of ‘moar data’ suggests that eventually, all solutions will percolate out of the data. But who decides what *counts* as data?

5. It’s the rhetorical and tactical usage of the phrase ‘big data’ that I’m concerned with here; I’m not against data science per se or the interesting things you can learn when you have aggregated information – I am an archaeologist after all, and I *did* publish a book with ‘big data’ in the title. The thing is, metaphors matter. Metaphors structure thought and action – ‘the university is a business’, for instance – and so if we imagine ‘big data’ as somehow objectively out there, and not produced by conscious decisions about *what counts*, and *who does the counting*, we end up in situations where people lose their jobs (uber!) or miss out on credit, or constantly get reconnected with abusive ex partners on facebook. Big data, as an aggregated thing, means that the means of production, the power in the system, has shifted from those of us who create the data, to those of us with the money, the privilege, the computing power, to mine it.

6. So let’s call this blinkered version of working with human data ‘Big Data Gothic’. Like its namesake, it’s not too concerned about fallout on individual, named, humans; it revels in the data landscape and draws much of its power from the thrill of off-loading decision making to the machines…

7. That is to say, it begins by thinking of people as things.

8. Oddly enough, big data is not concerned with context; but – again, as an archaeologist – context forces us to think of humans as, well, human.

9. This is what the humanities excel at. At this event two years ago, a businessman spoke about the need to learn how to tell stories from data. We’ve got you covered, over in the humanities.

10. When you think of humans as things, a poorly trained machine learning routine can be used to target other humans for killing. When the routine has a probable error rate that translates into 15 000 people slated for death, people shrug. Much easier to say there’s a 0.008% false positive rate. (See http://arstechnica.co.uk/security/2016/02/the-nsas-skynet-program-may-be-killing-thousands-of-innocent-people/)

11. Treating humans as things. Is there anything more thing-like than buying and selling human remains? The literal commodification of humans. This is a project I’m working on with Damien Huffer; he does the anthropology, I do the numbers. Human remains are bought and sold on Instagram in ways that circumvent Instagram’s rules (such as they are). We want to know both the language used to facilitate these sales, and also the visual language that isn’t caught by my algorithmic trawling. I have around, I dunno, perhaps 15,000 images and posts now on my machine. ‘Big’ in this context means: overwhelming for one person using the tools he was taught in grad school. Big data culled from the posts gives me some insight – especially when I represent the text as vector models (a toy sketch of that step follows these notes) – into some of the explicit language behind this trade, and the ways that people signal that something is for sale. But it also misses the visual signals in the composition of the images themselves. For that, I have to go in and read these hidden cues – rather like a kind of steganography that is explicitly meant to conceal the trade from algorithmic monitoring. By the way, this kind of reaction is also present on Facebook or Twitter as people ‘template’ themselves for particular audiences. The danger is that these templated selves could become algorithmic prisons: our performances in reaction to algorithms that make assumptions about how the world works cease to be performances and instead become real. This is big data gothic.

12. Tricia Wang prefers the term ‘thick data’, that is, the kind of thick storytelling that ethnography, anthropology, history, english, and so on, excel at. She argues with reference to what happened to Nokia – and I agree with her – that insight is dependent on both modes; that data science can usefully learn a thing or two from the humanities, and likewise, the humanities can benefit from the scale and distance that an aggregated view can provide.

13. The thing is not to be seduced by the view that big data gothic provides. It is exhilarating, I agree. But it’s ultimately getting in the way.

14. I’ll just leave this here: a good way to know if you’re dealing with a big data gothic situation is if you’ve blamed the algorithm. If you’ve offloaded responsibility for the consequences of decisions made to the computer. In the end, it all comes down to humans.
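(For readers of these notes: a toy sketch of the word-vector step mentioned in point 11, using gensim’s Word2Vec on an invented stand-in corpus. The real input is the scraped posts; this shows only the shape of the approach, not our actual pipeline.)

```python
from gensim.models import Word2Vec

# Train word embeddings on the text of collected posts so that terms used
# in similar selling contexts end up near one another in vector space.
# The three 'posts' below are invented placeholders for the real corpus.
posts = [
    "beautiful antique skull for sale dm for price".split(),
    "real human skull collection oddities dm me".split(),
    "museum quality replica skull not for sale".split(),
]

model = Word2Vec(sentences=posts, vector_size=50, window=3, min_count=1, epochs=50)

# Terms whose vectors sit closest to 'sale' hint at the coded language
# that signals a transaction is on offer.
print(model.wv.most_similar("sale", topn=5))
```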

 

a note on the phrase ‘big data gothic’. I’m pretty sure that this is not my own phrase, that I must have encountered it somewhere and it’s been pinging around inside my head, quietly waiting for its moment to emerge. I really like the phrase; but I’d really like to attribute it properly. So now to excavate back through everything I’ve read/browsed these last few months…

Featured image is courtesy Sophie Hay

* not that I have access to any of the really souped up big data tools that the people in that room tomorrow will use as a simple matter of course. It’s all relative… right?

Can we fix it? Yes we can! #DHAnnotates Feb 8-12

With apologies to Bob the Builder, and perhaps also Obama.

Preamble

In my graduate seminar on digital/public history, I framed the course as ‘Digital History Methods as Public History Performance’. I did this deliberately to riff on my colleague David Dean’s amazing seminar and research on performing history; students in that class were making videos, writing music, putting on vignettes. It’s been amazing to watch. But digital history methods as performance? I wanted to suggest to my students that working with digitized materials necessarily involves a performative element, even if that element is hidden away in the final published works (where, e.g., we pretend we actually consulted the June 15th, 1856 edition of the Globe and Mail). My goal for the course, then, was to expose these performances; by doubling down on the ‘how to’, the ‘why should I’ and the ‘what’s in it for me’ would naturally emerge.

Thus, the seminar presentations are not on various readings, but rather, on the various tutorials of the Programming Historian. The task of the discussion leader is to tie the method into their thesis research, into the readings from other classes, into the wider world of historical meaning-making. (I do give them some suggestions to read, of course). The final project involves making some sort of public-facing object “using the tools of gaming, a work of digital history that simultaneously explores, comments, critiques, or teaches, digital history.”

Meanwhile, Amanda Visconti and I had been talking about some sort of collaborative project to annotate a crucial text in the digital humanities world using the Hypothes.is tool. That is, a public work of digital humanities that simultaneously explores, comments, critiques, or teaches, digital humanities. Worlds have collided! So here’s the plan:

Collaborative Annotation Fest #DHannotates

  1. Create an account on Hypothesis.
  2. During the week of February 8-12, visit the Programming Historian’s lessons and see what annotations other people have left.
  3. Try out some of the lessons, and leave annotations on the page that discuss where things went well, went off the rails, remind you of some other useful resource, reply to others’ annotations (Hypothesis allows for threaded conversations in the annotations) or whatever kind of note strikes you.
  4. Tweet the link to your annotation with the tag #dhannotates .
  5. Amanda and I will monitor the tweets and the feed of annotations. If you get stuck, or you need help, feel free to tweet at or email Amanda or me (@Literature_Geek or @electricarchaeo).

Won’t you join us? Friends don’t let friends do #dh alone!

For more on the nuts-and-bolts of annotation and how Amanda and I plan to support you all, please see her post.

The humane hack – a snippet of an argument

[this is the snippet of an argument, and all that I’ve managed to produce today for #AcWriMo. I kinda like it though and offer it up for consumption, rough edges, warts, and all. It emerges out of something Shawn Anctil said recently about ‘the Laws of Cool’ when we were talking about his comps, which happen this Thursday. In an effort to get my head around what he said, I started to write. This might make it into a piece on some of my recent sound work. Alan Liu’s stuff is always wonderful to read because it turns my head inside out, and I make no warrant that I am doing justice to Alan’s ideas. It’s been a while since I last looked, and I realize I really need to block out several days to do this properly. Anyway, working in public, fail gloriously, etc. etc., I give you a snippet of an argument:]

Alan Liu, in 2004, wondered what the role of the arts and humanities was in an age of knowledge work, of deliverables, of an historical event horizon that only goes back the last financial quarter.  He examined the idea of ‘knowledge work’ and teased out how much of the driving force behind it is in pursuit of the ‘cool’. Through a deft plumbing of the history of the early internet (and in particular, riffing on Netscape’s ‘what’s cool?’ page from 1996 and their inability to define it except to say that they’d know it when they saw it ), Liu argues that cool is ‘the aporia of information… cool is information designed to resist information [emphasis original]… information fed back into its own signal to create a standing interference pattern, a paradox pattern’ (Liu, 2004: 179).  The latest web design, the latest app, the latest R package for statistics, the latest acronym on Twitter where all the digital humanists play: cool, and dividing the world.

That is, Liu argued that ‘cool’ was amongst other things a politics of knowledge work, a practice and ethos. He wondered how we might ‘challenge knowledge work to open a space, as yet culturally sterile (coopted, jejune, anarchistic, terroristic), for a more humane hack of contemporary knowledge?’ (Liu 2004: 9). Liu goes on to discuss how the tensions of ‘cool’ in knowledge work (for us, read: digital archaeology) also intersect with an ethos of the unknown – that is, of knowledge workers who work nowhere else yet somehow manage to stand outside that system of knowledge production. (Is alt-ac ‘alt’ partially because it is the cool work?) This matters for us as archaeologists. There are many ‘cool’ things happening in digital archaeology that somehow do not penetrate into the mainstream (such as it is). The utilitarian dots-on-a-map were once cool, but are now pedestrian. The ‘cool’ things that could be, linger on the fringes. If they did not, they wouldn’t be cool, one supposes. They resist.

To get that more humane hack, Liu suggests that the historical depth the humanities provide counters the shallowness of cool:

“The humanities thus have an explanation for the new arts of the information age, whose inheritance of a frantic sequence of artistic modernisms, postmodernisms, and post-postmodernists is otherwise only a displaced encounter with the raw process of historicity. Inversely, the arts offer the humanities serious ways of engaging – both practically and theoretically – with “cool”. Together, the humanities and arts might be able to offer a persuasive argument for the humane arts in the age of knowledge work” (2004: 381).

In which case, the emergence of digital archaeologists and historians in the last decade might be the loci of the humane hacks – if we move into that space where we engage the arts.

We need to be making art.

 

Branding

I’m reading some stuff right now on branding. When we started our family cider mill years ago, we eventually stopped trying to DIY our logo and hired a graphic designer. This is what she came up with:

Screen Shot 2015-10-04 at 8.55.57 PM

What’s nice about it is that you understand instantly what we’re about – our historic dance hall, with the apple tree in full fruit. It’s drawn very crisply, and carries some of the idiosyncrasies of the building through, things like the slightly unsymmetrical frontage of the building, for instance, cleverly tucked behind the apple tree; the font (no idea what the font is, but anyway) is similarly crisp.

I’m no graphic designer. But I decided to give it a try – what would a logo for the ‘Electric Archaeology’ brand look like? I have to think back to 06 when I started this, and remember why I went with ‘electric’ and not ‘digital’. After all, what with Twitter handles and other social media properties, I’ve pretty much aligned myself with ‘electricarchaeo’, that performance of me on the net, and ‘electric archaeology’, this strange half-world between archaeology, history, public expressions of the same, teaching, and everything else I’ve done.

Why did I go with ‘electric’? Today, I’m not sure. I think it was partly through fear that ‘digital’ as an adjective would get old hat: that we would all become digital archaeologists. That’s not really happening though; ‘digital archaeology’ signals something different than ‘computation-in-the-service-of-archaeology’, more of the deformative, digital-humanities side of archaeology, rather than the GIS-and-databases problem solving/management side. Neither side is better or worse than the other; they’re just different. But ‘electric’ also signals a kind of nostalgia for the excitement that ‘electric’ as an adjective once conveyed. Perhaps I was being a hipster-archaeologist. It also conveys a bit of my thought processes; I get very switched on (as it were) to different methods, paths, techniques, issues, as I see them come over the horizon. (Maybe, in a way, I’m trying to repent for the embarrassment of Shawn-the-callow-youth-of-1995 who wrote that the web would never amount to anything for academics.) Ideally, you’d read the stuff I produce, or see the things I make, or hear the data I sonify, and get electrified with new ideas yourself.

I did my own A/B testing on the Sunday night twitter crowd.

https://twitter.com/electricarchaeo/status/650835253318295553

and received lots of feedback, eg:

https://twitter.com/BillCaraher/status/650836595856179201

https://twitter.com/precatlady/status/650836961440235520

https://twitter.com/nowviskie/status/650829907564564480

Lots of opinions out there. I rather like them both (and if you want to see other contenders, just flip through my twitter posts).

A: Drawing (2)

B: Drawing (1)

Why does any of this matter? I’m not entirely sure – but stick with me. Partly, I think, it’s because we operate in a reputation economy. People listen because I’m a known quantity; I’m a known quantity because people listen. Matthew Kirschenbaum:

So it is with digital humanities: you are a digital humanist if you are listened to by those who are already listened to as digital humanists, and they themselves got to be digital humanists by being listened to by others.

That’s just one bit cherry-picked from a very important essay. But if doing digital work, online work, critical digital humanities, has taught me anything, it’s that part of the performance is taking control of one’s digital identity. Part of that is carving out a brand. And if I’m a brand, at least it’s me who’s doing the selling.

(the magic of WordPress reminds me that I last visited these kinds of ideas back in 2010).

Thank you to everyone who offered an opinion! I’m looking into stickers for option B…

Bless your little cotton socks: reflecting on Carleton’s Data Day

I went to Carleton’s ‘Big Data Day’ yesterday. My student, Hollis, had entered a poster in the poster competition, detailing his approach to data mining the audio recordings of THATCamp Accessibility. Looks like he and I were the only two from the humanities end of the spectrum at the event. The day opened with a panel discussion, then a series of presentations from faculty involved in Carleton’s new MA in Data Science (which it appears has support from IBM; there’s a webpage at Carleton but it has a warning on it to all and sundry not to share the link. So go to carleton.ca and do this search). Carleton has aspirations in this regard. The program, and the event, were a collaboration between Computer Science, Biology, Business, Systems and Computer Engineering, Economics, and Geography and Environmental Studies.

(And of course, not to be confused with our MA in Digital Humanities, which does not have support from a major tech firm).

Hollis explains his research at Carleton’s Data Day

Anyway, some observations on what this particular approach to big data seems to be about:

1. ‘Big data’ is not defined in this forum.

2. Data are an unalloyed good.

3. Data have agency.

4. “You can have privacy, or you can have cool online stuff. Which do you want?” – this from the Shopify data analyst (biology grad, self-described hacker).

I had a question for the panel. I asked about algorithmic prisons: the uses to which data are put. I pointed out that the data, by themselves, mean nothing: what matters is interpretation, power, and control, and the creation of algorithms that embed a view of the world which may not in itself be warranted. Wasn’t there a role for the humanities in all of this?

Well, bless my little cotton socks. It is rare that I am on the receiving end of techno-condescension. I hope to god I’ve never done this to someone else – made assumptions about their level of engagement or knowledge of the field this way – but it was apparent that I lost the panel the moment I said, ‘hi, I’m from the history department’. The Shopify guy proceeded to lecture me on disruption, and that even the book was a disruption; but we got over that. Unintended side-effects? Sure, there’s always those, but don’t worry.

Big data will save us.

The Shopify guy did have one interesting point to make: it’s not ‘big data’, but ‘big interest in data’. He then framed the 20th century as the century of deterministic solutions, whereas the 21st century will be the one of statistical solutions.

Oh good.

https://twitter.com/electricarchaeo/status/459336937762226176

On the privacy issue, one comment from the floor in response to the ‘you can have privacy or you can have cool online tools’ was, ‘you’re just not trying hard enough’. The day progressed.

https://twitter.com/electricarchaeo/status/459360566856155136

https://twitter.com/electricarchaeo/status/459359977174761474

There were good points throughout the day; one of the data librarians from Carleton talked about data research management plans.

https://twitter.com/electricarchaeo/status/459355258645663744

That’s why you need humanities.

https://twitter.com/electricarchaeo/status/459334732485234688

As I was sitting in the room, I was also following the tweets from #caa2014 and #saa2014. At those two conferences, archaeologists are wrestling with big data issues themselves. Hell, archaeology could be the original big data (if measured in sheer metric tonnage… but also in more conventional measurements of data). But what was striking was the contrast between the two disciplinary perspectives. The archaeologists were a lot more humble in the face of dealing with big data describing the human past. There were tweets about uncertainty (about how to model it; about how to think through its implications). There were tweets about power, control, ownership, engagement, community. Tweets about liberation, tweets about being chained. (I’m freewheeling right now, from memory, rather than linking through).

Humility was lacking at Data Day. More data, better data, and our algorithms are an unalloyed good.

https://twitter.com/electricarchaeo/status/459358955991760896

I’m not against making money. But… but… but… not like this.

So. The takeaway from this day? The big data, analytics-for-business crowd needs the humanities. There’s space there already in their discussions, though they don’t seem to recognize it. Or at least not yesterday.

https://twitter.com/electricarchaeo/status/459349150862032897

https://twitter.com/mcburton/status/459336099962167296

Yes.

 

Hollis Peirce, George Garth Graham Research Fellow

Hollis Peirce on Twitter: https://twitter.com/HollPeirce


I am pleased to announce that the first George Garth Graham Undergraduate Digital History Research Fellow will be Mr. Hollis Peirce.

Hollis is a remarkable fellow. He attended the Digital Humanities Summer Institute at the University of Victoria in the summer of 2012. At DHSI he successfully completed a course called “Digitization Fundamentals and Their Application”. In the fall semester of 2012 he was the impetus behind, and helped to organize,  THATCamp Accessibility on the subject of the impact of digital history on accessibility in every sense of the word.

Hollis writes,

Life for me has been riddled with challenges.  The majority of them coming on account of the fact that I, Hollis Peirce, am living life as a disabled individual with Congenital Muscular Dystrophy as many things are not accessible to me.  However, I have never let this fact hold me back from accomplishing my goals.  Because of this, when I first started studying history I knew I was not choosing an easy subject for a disabled individual such as myself.  All those old, heavy, books on high library shelves that history is known for made it one of the most inaccessible subjects possible to study.  All that changed however, when I discovered digital history.

It was thanks to a new mandatory class for history majors at Carleton University called The Historian’s Craft taught by a professor named Dr Shawn Graham.  This course was aimed at teaching students all about how to become a historian, and how a historian is evolving through technology.  At that moment the idea for ‘Accessibility & Digital History’ came to mind.  From that point on many steps have been taken to advance my studies in this field, which has led to being selected as the first George Garth Graham Undergraduate Digital History Research Fellow.

Hollis and I have had our first meeting, about what his project might entail. When I initially cooked this idea up, I thought it would allow students the opportunity to work on my projects, or those of my colleagues around the university. As we chatted about Hollis’ ideas, (and where I batted around some of my own stuff),  I realized that I had the directionality of this relationship completely backwards.

It’s not that Hollis gets to work on my projects. It’s that I get to work on his.

Here’s what we came up with.

At THATCamp Accessibility, we recorded every session. We bounced around the idea of transcribing those sessions, but realized that that was not really feasible for us. We started talking about zeroing in on certain segments, to tell a history of the future of an accessible digital humanities… and ideas started to fizz. I showed Hollis some of Jentery Sayers’s stuff, especially his work with Scalar.

Jentery writes,

the platform particularly facilitates work with visual materials and dynamic media (such as video and audio)….it enables writers to assemble content from multiple sources and juxtapose them with their own compositions.

Can we use Scalar to tell the story of THATCamp Accessibility in a way that captures the spontaneity, creativity, and excitement of that day, and highlights the issues of accessibility that Hollis wants to explore? And if we can, how can we make it accessible for others (screenreaders, text-to-speech, etc.)? And if we focus on telling history with an eye to accessibility (oh, how our metaphors privilege certain senses, ways of knowing!) maybe there will be lessons for telling history, full stop?

Stay tuned! Hollis is setting up his blog this week, but he’ll be posting over at http://hollispeirce.grahamresearchfellow.org/

Some Assembly Required: teaching through/with/about/by/because of, the Digital Humanities (slides & notes)

I’m giving a keynote address to the Canadian Network for Innovation in Education conference, at Carleton on Thursday (10.30, River Building). I’ve never done a keynote before, so I’ll confess to being a bit nervous. ‘Provoke!’ I’ve been told. ‘Inspire! Challenge!’ Well, here goes….

These are the slides and the more-or-less complete speaker’s notes. I often write things out, and then completely adlib on the day, but this is more or less the flavour I’m going for.

[Title]

I never appreciated how scary those three words were until I had kids. ‘Some assembly required’. That first Christmas was all, slide Tab A into Slot B. Where’s the 5/8ths gripley? Is that an Allen key? Why are there so many screws left over? The toys, with time, get broken, get fixed, get recombined with different play sets, are the main characters and the exotic locales for epic stories. I get a lot of mileage out of the stories my kids tell and act out with these toys.

My job is the DH guy in the history department. DH, as I see it, is a bit like the way my kids play with the imperfectly built things – it’s about making things, about breaking things, about being playful with those things. This talk is about what that kind of perspective might imply for our teaching and learning.

[2]

I don’t know what persuaded my parents that it’d be a good idea to spend $300 in 1983 dollars on a Vic20, but I’m glad they did. You turn on your ipad, it all just happens magically, whoosh! In those days, if you had a computer, you had to figure out how to make it do stuff, the hard way. A bit disappointing, that first ‘Ready’ prompt. Ready to do what? My brothers and I wanted to play games. So, we sat down to learn how to program them.  If you had a vic-20, do you remember how exciting it was when that ball first bounced off the corners of your screen? A bit like the apes in the opening scene of ‘2001’.  At least, in our house.

[3]

‘WarGames’, the film with Matthew Broderick. This scared me; but I loved the idea of being able to reach out to someone else, someone far from where I lived in Western Quebec. So we settled for occasional trips to the Commodore store in Ottawa, bootleg copies of Compute! Magazine, and my most treasured book, a ‘how to make adventure games’ manual for kids that my aunt purchased for me at the Ontario Science Centre.

[4]

Do you remember old-school text adventures? They’re games! They promote reading! Literacy! They are a Good Thing. Let’s play a bit of this game, ‘Action Castle’, to remind us how they worked.

To play an interactive fiction is to foreground how the rules work; it’s easy to see, with IF. But that same interrogation needs to happen whenever we encounter digital media.

[5]

Games like BioShock – a criticism of Randian philosophy. Here, the interplay between the rules and the illusion of agency is critical to making the argument work.

When you play any kind of game, or interact with any kind of medium, you generally achieve success once you begin to think like the machine. What do games teach us? How to play the game: how to think like a computer. This is a ‘cyborg’ consciousness. The ‘cyb’ in ‘cyborg’ comes from the Greek for ‘governor’ or ‘ship’s captain’. Who is doing the governing? The code. This is why the humanities NEED to consider the digital. It’s too important to leave to the folks who are already good at thinking like machines. This is the first strand of what ‘digital humanities’ might mean.

[6]

A second strand comes from that same impulse that my brothers and I had – let’s make something! Trying to make something on the computer inevitably leads to deformation. This deformation can be on purpose, like an artist; or it can be accidental, a result of either the user’s skill or the way that the underlying code imagines the world to work.

 [7]

‘Historical Friction’ is my attempt to realize a day-dream: what if the history of a place was thick enough to impede movement through it? I knew that a) I could find enough information about virtually everywhere on Wikipedia; b) I could access this through mobile computing; and c) something that often stops me in my tracks is not primarily visual but rather auditory. But I don’t have the coding chops to build something like that from scratch.

What I can do, though, is mash things together, sometimes. But when I do that, I’m beholden to design choices others have made. ‘Historical Friction’ is my first stab at this, welding someone else’s Wikipedia tool to someone else’s voice synthesizer. Let’s take a listen.

…So this second strand of DH is to deform (with its connotations of a kind of performance) different ways of knowing.
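(A side-note for readers of these notes rather than the live audience: the gist of Historical Friction fits in a few lines of Python. This is not the actual tool – that welded together existing web services – just a sketch of the same idea: pull nearby Wikipedia entries and speak them aloud. The text-to-speech library here, pyttsx3, is one option among many, and the coordinates are placeholders.)

```python
import requests
import pyttsx3  # one of several offline text-to-speech options

# A toy version of the Historical Friction idea: find Wikipedia articles
# near a location and read their titles aloud, so that the density of
# history becomes something you hear. Coordinates below are placeholders
# for downtown Ottawa; swap in a phone's GPS reading for the real effect.
lat, lon = 45.4215, -75.6972

params = {
    "action": "query",
    "list": "geosearch",
    "gscoord": f"{lat}|{lon}",
    "gsradius": 1000,   # metres
    "gslimit": 10,
    "format": "json",
}
resp = requests.get("https://en.wikipedia.org/w/api.php", params=params, timeout=30)
places = resp.json()["query"]["geosearch"]

engine = pyttsx3.init()
for place in places:
    engine.say(place["title"])
engine.runAndWait()
```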

[8]

A third strand of DH comes from the reflexive use of technology. My training is in archaeology. As an archaeologist, I became Eastern Canada’s only expert in Roman Brick Stamps. Not a lot of call for that.

But I recognized that I could use this material to extract fossilized social networks, that the information in the stamps was all about connections. Once I had this social network, I began to wonder how I could reanimate it, and so I turned to simulation modeling. After much exploration, I’ve realized that what I resurrect on these social networks is NOT the past, but rather the story I am telling about the past. I simulate historiography. I create a population of zombie Romans (individual computing objects) and I give them rules of behavior that describe some phenomenon in the past that I am interested in. These rules are formulated at the level of the individual. I let the zombies go, and watch how they interact. In this way, I develop a way to interrogate the unintended or emergent consequences of the story I tell about the past: a kind of probabilistic historiography.

So DH allows me to deform my own understandings of the world; it allows me to put the stories I tell to the test.

[9]recap

There’s an awful lot of work that goes under the rubric of ‘digital humanities’. But these three strands are I think the critical ones for understanding what university teaching informed by DH might look like.

[10]

Did I mention my background was in archaeology? There’s a lot that goes under the rubric of ‘experimental’ archaeology that ties in to or is congruent with the digital humanities as well. Fundamentally, you might file it under the caption of ‘making as a way of knowing’.

[11]

Experimental archaeology has been around for decades. So too has DH (and its earlier incarnation as ‘humanities computing’), which goes back to at least the 1940s and Father Busa, who famously persuaded IBM to give him a research lab and computer scientists to help him create his concordance of the word ‘praesens’ in the writings of Thomas Aquinas.

So despite the current buzz, DH is not just a fad, but rather has (comparatively) deep antecedents. The ‘Humanities’ as an organizing concept in universities has scarcely been around for much longer.

[12]

So let’s consider then what DH implies for university teaching.

[13]salt

But I feel I should warn you. My abilities to forecast the future are entirely suspect. As an undergrad, in 1994, I was asked to go on the ‘world wide web’, this new thing, and create an annotated bibliography concerning as many websites as I could that dealt with the Etruscans. The first site I found (before the days of content filters) was headlined, ‘the Sex Communist Manifesto’. Unimpressed, I wrote a screed that began, “The so-called ‘world wide web’ will never be useful for academics.”

Please do take everything I say then with a grain or two of salt.

[14]

Let me tell you about some of the things I have tried, built on these ideas of recognizing our increasingly cyborg consciousness, deformation of our materials, and of our perspectives. I’m pretty much a one-man band, so I’ve not done much with bells and whistles, but I have tried to foster a kind of playfulness, whether that’s role-playing, game playing, or just screwing around.

[15]epic fails

Some of this has failed horribly; and partly the failure emerged because I didn’t understand that, just like digital media, our institutions have rule sets that students are aware of; sometimes, our ‘best’ students are ‘best’ not because they have a deep understanding of the materials but rather because they have learned to play the games that our rules have created. In the game of being a student, the rules are well understood – especially in history (which is where I currently have my departmental home). Write an essay; follow certain rhetorical devices; write a midterm; write a final. Rinse. Repeat. Woe betide the prof who messes with that formula!

I once taught in a distance ed program, teaching an introduction to Roman culture class. The materials were already developed; I was little more than a glorified scantron machine. I was getting essay after essay that contained clangers along the lines of, ‘Vespasian won the civil war of AD 69, because Vespasian was later the Emperor.’ I played a lot of Civilization IV at the time, so I thought, I bet if I could get students to play out the scenario of AD 69, they would understand a lot more of the contingency of the period, that Vespasian’s win was not foreordained. I crafted the scenario, built an alternative essay around it (‘play the scenario, contrast the game’s history with ‘real’ history’), found students who had the game. Though many played it, they all opted to just write the original essay prompt. My failure was two-fold. One, ‘playing a game for credit’ did not mesh with ‘the game of being a student’; there was no space there. Two, I created a ‘creepy treehouse’, a transgression into the students’ world where I did not belong. Profs do not play games. It’d be like inviting all my students to friend me on Facebook. It was creepy.

I tried again, in a history course last winter. The first assessment exercise – an icebreaker, really – was to play an interactive fiction that recreated some of the social aspects of moving through Roman space. The player had to find her way from Beneventum to Pompeii, without recourse to maps. What panic! What chaos! I lost a third of the class that week. Again, the concern was, ‘how does playing a game fit into the game of being a student’. Learning from the previous fiasco, I thought I’d laid a better foundation this time. Nope. The thing I neglected: there is safety in the herd. No one was willing to play as an individual and submit an individual response – ‘who wants to be a guinea pig?’ might have been the name of THIS game, as far as the students were concerned. I changed course, and we played it as a group, in class. Suddenly, it was safe.

[16]epic wins

But from failure, we learn, and we sometimes have epic wins (failures almost always are more interesting than wins). Imagine if we had a system that short-circuited the game of being a student, to allow students the freedom to fail, to try things out, and to grow! One of the major fails of my Year of the Four Emperors experiment was that it was I who did all the building. It should’ve been the students. When I built my scenario, I was doing it in public on one of the game’s community forums. I’ve since started crafting courses (or at least, trying to) where the students are continually building upwards from zero, where they do it in public, and where all of their writing and crafting is done in the open, in the context of a special group. This changes the game considerably.

[17]

To many of you, this is no doubt a coals-to-Newcastle, preaching-to-the-choir kind of moment.

[18]

And again, I hear you say, what would an entire university look like, if all this was our foundation? Well, it’s starting to look a little better than it did when we first asked the question…

 [19]dh will save us

…but DH has been pushed an awful lot lately. DH will save us! It’ll make the humanities ‘relevant’: to funding bodies, to government, to parents! Just sprinkle DH fairy dust, and all will be safe, right?

[20]memes & dark side

You’ve probably heard that. It’s happened enough that there’s even memes about it.

Yep. No doubt – a lot of folks are sick of hearing about ‘the digital humanities’. At the most recent MLA, there was a good deal of pushback, including a session called ‘the dark side of DH’. Wendy Chun wrote,

For today, I want to propose that the dark side of the digital humanities is its bright side, its alleged promise: its alleged promise to save the humanities by making them and their graduates relevant, by giving their graduates technical skills that will allow them to thrive in a difficult and precarious job market. Speaking partly as a former engineer, this promise strikes me as bull: knowing GIS or basic statistics or basic scripting (or even server side scripting) is not going to make English majors competitive with engineers or CS geeks trained here or increasingly abroad […] It allows us to believe that the problem facing our students and our profession is a lack of technical savvy rather than an economic system that undermines the future of our students.”

  (That’s not a DH that I recognize, by the way, as I hope you’ll have noticed given my three strands).

Now, I wasn’t at that meeting, but I saw a lot of chatter flutter by that day, as in that same session MOOCs were conflated with the digital humanities; that somehow the embrace of DH enables the proliferation of MOOCs. As Amanda French, who has coordinated an extraordinary number of digital humanities ‘THATCamp’ conferences, has said, ‘I don’t know a single digital humanist who likes MOOCs.’

We’ve heard a lot about MOOCs today, and I’m certainly in no position to critique them, as I’ve never offered nor successfully finished one. But as I’ve identified the strands of DH today, there *is* an affinity with the so-called ‘cMOOC’.

[21]Know Your MOOCs

Before there was Coursera, Udacity, and glorified talking heads over the internet, there was the cMOOC. The Canadian MOOC. The personal learning environment. Isn’t it interesting that Pearson, a textbook publisher, is a heavy investor in the MOOC scene? Frankly, as xMOOCs are currently designed, they seem to me to be a challenge to publishers of textbooks rather than to teaching. We can do better, and I think DH ties well with the idea of personal learning environments. ‘Massive’ is not, in and of itself, a virtue, and we’d do well to remember that.

[22]Rainbow Castle

So, following my three strands, we’d:

 [23]

- identify the ways our institutions and our uses of technology force particular ways of thinking

- deform the content we teach

- set up our institutions and our uses of technology to deform the way our students think

[24]

So let’s turn the university inside out. It’s been about silos for so long (also known as ivory towers). I grew up on a farm: do you know what gets put into a silo, what comes out? It’s silage, chopped up, often a bit fermented, cattle food: pre-processed cud. Let’s not do that anymore.

[25]Walled Gardens, online dating

For all their massiveness, MOOCs and universities are still walled gardens. And what’s the unit of connection? It’s the course. It’s the container. I used to work with a guy who often said, ‘once we get the contract, we’ll just get monkeys to do the work’. That guy is no longer in business. I used to work for a for-profit university in the States that had a similar approach to hiring online faculty.

MOOCs are not disruptive in that sense. Want to be really disruptive? Let’s turn to a model that massively connects people who share an interest. I hereby banish the use of any metaphor that frames the relationships at a university in terms of clients or customers. Instead, what if the metaphor used was more in line with a dating service?

In online dating, the site brings together two kinds of people, both looking for the same thing. Typically, the men pay a fee to be on the site; women are wooed to the site with all sorts of free promos and the like. There’s no point having a dating site that does not have any available ‘others’ on it. On that model, the university could be in the business of bringing together students [the ‘men’] with faculty [the ‘women’]. If a university had that metaphor in mind, it would be thinking, ‘what can we do to make our site – the university – an attractive place for faculty to be?’ Imagine that!

Students would not be signing up for classes, but rather, to follow and learn from particular profs. Typically on something like eBay or a dating site, there are reputation systems embedded in the site. You do not buy from the person with the bad rep on eBay; you do not contact the person whose profile has gotten many negative reviews. Since the university knows the grades of the students and has teaching evaluations and other indicators of faculty interests and reputations, it has the ability to put together faculty and students in a dynamic way. “Others who have enjoyed learning about Roman civilization with Dr. Graham have loved learning about Bronze Age Greece with…”. Wouldn’t it be something to allow students to select their areas of interest knowing the reputation of the profs who work in a particular area, and for profs to select their students based on their demonstrated interests and aptitudes? Let faculty and students have ‘tokens’ – this is my first choice, this is my second choice, this is my third choice prof/student to work with for the session. Facilitate the matching of students and faculty. Let students craft their way through university following individuals, crafting a ‘masterpiece’ for their final demonstration of making as a way of knowing, for their BA. Hmmm. Kinda sounds like a return to the Guild, as it were.
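If you want a sense of how small the machinery of such token-based matching really is, here is a toy sketch in Python, in the spirit of the classic Gale-Shapley stable-matching procedure; the names, preference lists, and one-student ‘capacity’ are all invented for illustration, and a real registrar would need something far richer.

```python
# Toy sketch of ranked-choice ('token') matching between students and profs,
# in the spirit of the Gale-Shapley stable-matching algorithm.
# All names, preference lists, and the one-student capacity are invented.

def match(student_prefs, prof_prefs, capacity=1):
    """Students spend their tokens in order of preference;
    each prof keeps only their highest-ranked proposers."""
    free = list(student_prefs)                     # students still looking
    next_token = {s: 0 for s in student_prefs}     # which choice to try next
    matched = {p: [] for p in prof_prefs}          # prof -> accepted students
    rank = {p: {s: i for i, s in enumerate(prefs)}
            for p, prefs in prof_prefs.items()}

    while free:
        s = free.pop(0)
        if next_token[s] >= len(student_prefs[s]):
            continue                               # out of tokens: unmatched
        p = student_prefs[s][next_token[s]]
        next_token[s] += 1
        matched[p].append(s)
        matched[p].sort(key=lambda x: rank[p].get(x, len(rank[p])))
        while len(matched[p]) > capacity:          # prof bumps the lowest-ranked
            free.append(matched[p].pop())
    return matched

students = {'Alice': ['Graham', 'Brown'],
            'Bob':   ['Graham', 'Brown'],
            'Chris': ['Brown', 'Graham']}
profs = {'Graham': ['Bob', 'Alice', 'Chris'],
         'Brown':  ['Alice', 'Chris', 'Bob']}

# e.g. {'Graham': ['Bob'], 'Brown': ['Alice']}; Chris runs out of tokens
print(match(students, profs))
```

The point is only that ranked ‘tokens’ plus a matching rule is computationally trivial; the hard part is the institutional imagination.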

You might not like that, which is fine; there are probably better ideas out there. We’ve got all this damned information around! Maybe there are earlier models that could work better with our new technologies; maybe there are new models for our new techs. But surely we can do better than merely replicate processes that were designed for the late 19th and early 20th centuries? Whatever metaphor we use to frame what the university does, it goes a long way toward framing the ways learning can happen. That’s what DH and its exploration of a cyborg consciousness should make us at least explore.

[26]domain of one’s own

And once we’ve done that, let’s have some real openness. Let the world see that faculty-student, and student-student, relationship develop. Invite the rest of the world in. Folks like Ethan Watrall at MSU already do that for their on-campus courses, putting all course materials and assessment activities on open websites, inviting the wider world to participate and to interact with the students.

Give every student, at the time of registration, a domain of their own, as Mary Washington is starting to do. Pay for it, help the student maintain it, for their time at university. At graduation, the student could archive it, or take over its maintenance. Let the learning community continue after formal assessment ends. The robots that construct our knowledge from the world wide web – Google and the content aggregators – depend on strong signals, on a creative class. If each and every student at your institution (and your alumni!) is using a domain of their own as a repository for their own IP, a personal learning environment, a node in a frequently reconfiguring network of learners, your university would generate real gravity on the web and become the well out of which the wider world draws its knowledge. Use the structure and logic of the web to embed the learning life of the university so deeply into the wider world that it cannot be extricated!

[27]

Because right now, that’s not happening. If you study the structure of the web for different kinds of academic knowledge (here, Roman archaeology), there’s a huge disconnect between where the people are and where the academics are. If we allow that to continue, it becomes increasingly easy for outsiders to frame ‘academic’ knowledge as a synonym for ‘pointless’. With the embedded university, the university inside out, there are no outsiders. If we embed our teaching through the personal learning environments of our students, our research production will become similarly embedded.

[28]

If the university is inside out, and not in splendid isolation, then it is embedded.

Forget massively ‘open’.

Think massively embedded.

Think massively accessible.

(Not the best image I could find, but hey! that boulder, part of a structure, is embedded in a massively accessible landscape.)

[29]Check mark list

So what’s tuition for, then? Well, it’s an opportunity to have my one-on-one undivided attention; it’s ice time, an opportunity to skate. But we need to have more opportunities for sideways access to that attention too, for people who have benefited from participating in our openness, our embeddedness, to demonstrate what they’ve learned. There’s much to recommend in Western Governors University’s approach to the evaluation of non-traditional learners.

[30]

The digital humanities, as a perspective, has changed the way I’ve come to teach. I didn’t set out to be a digital humanist; I wanted to be an archaeologist. But the multiple ways in which archaeological knowledge is constructed, its pan-disciplinary need to draw from different wells, pushed me into DH. There are many different strands to DH work; I’ve identified here what I think are three major ones that could become the framework, the warp and the weft, for something truly disruptive.

Practical Necromancy talk @Scholarslab – part I

Below is a draft of the first part of my talk for Scholarslab this week, at the University of Virginia. It needs to be whittled down, but I thought that those of you who can’t drop by on Thursday might enjoy this sneak peek.

Thursday, March 21 at 2:00pm
in Scholars’ Lab, 4th floor Alderman Library.

When I go to parties, people will ask me, ‘what do you do?’. I’ll say, I’m in the history department at Carleton. If they don’t walk away, sometimes they’ll follow that up with, ‘I love history! I always wanted to be an archaeologist!’, to which I’ll say, ‘So did I!’

My background is in Roman archaeology. Somewhere along the line, I became a ‘digital humanist’, so I am honoured to be here to speak with you today, here at the epicentre, where the digital humanities movement all began.

If the digital humanities were a zombie flick, somewhere in this room would be patient zero.

Somewhere along the line, I became interested in the fossilized traces of social networks that I could find in the archaeology. I became deeply interested – I’m still interested – in exploring those networks with social network analysis. But I became disenchanted with the whole affair, because all I could develop were static snapshots of the networks at different times. I couldn’t fill in the gaps. Worse, I couldn’t really explore what flowed over those networks, or how those networks intersected with broader social & physical environments.

It was this problem that got me interested in agent based modeling. At the time, I had just won a postdoc in Roman Archaeology at the University of Manitoba with Lea Stirling. When pressed about what I was actually doing, I would glibly respond, ‘Oh, just a bit of practical necromancy, raising the dead, you know how it is’. Lea would just laugh, and once said to me, ‘I have no idea what it is you’re doing, but it seems cool, so let’s see what happens next!’

How amazing to meet someone with the confidence to dance out on a limb like that!

But there was truth in that glib response. It really is a form of practical necromancy, and the connections with actual necromancy and technologies of death are a bit more profound than I first considered.

So today, let me take you through a bit of the deep history of divination, necromancy, and talking with the dead; then we’ll consider modern simulation technologies as a form of divination in the same mold; and then I’ll discuss how we can use this power for good instead of evil, how it fits into the oft-quoted digital humanities ethos of ‘hacking as a way of knowing’ (which is rather like experimental archaeology, when you think about it), and how I’m able to generate a probabilistic historiography through this technique.

And like all good necromancers, it’s important to test things out on unwilling victims, so I would also like to thank the students of HIST3812 who’ve had all of the ideas road-tested on them earlier this term.

Zombies clearly fill a niche in modern western culture. The president of the University of Toronto recently spoke about ‘zombie ideas’ that despite our best efforts, persist, infect administrators, politicians, and students alike, trying to eat the brains of university education.

Zombies emerge in popular culture in times of angst, fear, and uncertainty. If Hollywood has taught us anything, it’s that zombies are bad news. Sometimes the zombies are formerly dead humans; sometimes they are humans who have been transformed. Sometimes we deliberately create a zombie. The zombie can be controlled, and made to do useful work; zombie as a kind of slavery. More often, the zombies break loose, or are the result of meddling with things humanity ought to have left alone; apocalypse beckons. But sometimes, like ‘Fido’, a zombie can be useful, can be harnessed, and somehow be more human than the humans. [Fido]

If you’d like to raise the dead yourself, the answer is always just a click away [ehow].

There are other uses for the restless dead. Before our current fixation with apocalypse, the restless dead could be useful for keeping the world from ending.

In video games, we call this ‘the problem space’ – what is it that a particular simulation or interaction is trying to achieve? For humanity, at a cosmological level, the response to that problem is through necromancy and divination.

I’m generalizing horribly, of course, and the anthropologists in the audience are probably gritting their teeth. Nevertheless, when we look at the deep history and archaeology of many peoples, a lot can be tied to this problem of keeping the world from ending. A solution to the problem was to converse with those who had gone before, those who were currently inhabiting another realm. Shamanism was one such response. The agony of shamanism ties well into subsequent elaborations such as the ball games of mesoamerica, or other ‘game’ like experiences. The ritualized agony of the athlete was one portal into recreating the cosmogonies and cosmologies of a people, thus keeping the world going.

The bull-leaping game at Knossos is perhaps one example of this, according to some commentators. Some have seen in the plan of the Middle Minoan phase of this palace (towards the end of the 2nd millennium BC) a replication in architecture of a broader cosmology, that its very layout reflects the way the Minoans saw the world (partly because this plan seems to be replicated at other Minoan centres around the Aegean). Jeffrey Soles, pointing to the architectural play of light and shadow throughout the various levels of Knossos, argues that this maze-like structure was all part of the ecstatic journey, and ties shamanism directly to the agonies of sport and game in this location. We don’t have the Minoans’ own stories, of course, but we do have these frescoes of bull-leaping, and other paraphernalia which tie in nicely with the later dark-age myths of Greece.

So I’m making a connection here between the way a people see the world working, and their games & rituals. I’m arguing that the deep history of games  is a simulation of how the world works.

This carries through to more recent periods as well. Herodotus wrote about the coming of the Etruscans to Italy: “In the reign of Atys son of Menes there was a great scarcity of food in all Lydia. For a while the Lydians bore this with patience; but soon, when the famine continued, they looked for remedies, and various plans were suggested. It was then that they invented the games of dice, knucklebones, and ball, and all the other games of pastime, except for checkers, which the Lydians do not claim to have invented. Then, using their discovery to forget all about the famine, they would play every other day, all day, so that they would not have to eat… This was their way of life for eighteen years. Since the famine still did not end, however, but grew worse, the king at last divided the people into two groups and made them draw lots, so that one should stay and the other leave the country.”

Here I think Herodotus misses the import of the games: not as a pastime, but as a way of trying to control, predict, solve, or otherwise intercede with the divine, to resolve the famine. In later Etruscan and Roman society, gladiatorial games, for instance, were not about entertainment but rather about cleansing society of disruptive elements, about bringing everything into balance again, hence the elaborate theatre of death that developed.

The specialist never disappears, though: the one who has that special connection with the other side and intercedes for broader society as it navigates that original problem space. These were the magicians and priests. But there is an important distinction here. The priest is passive in reading signs, portents, and omens. Religion is revealed, at its proper time and place, through proper observation of the rituals. The magician is active – he (and she) compels the numinous to reveal itself, the spirits are dragged into this realm; it is the magician’s skill and knowledge which causes the future to unfurl before her eyes.

The priest was holy, the magician was unholy.

Straddling this divide is the Oracle. The oracle has both elements of revelation and compulsion. Any oracle worth its salt would not give a straight-up answer, either, but rather required layers of revelation and interpretation. At Delphi, the god spoke to the Pythia, the priestess, who sat on the stool over the crack in the earth. When the god spoke, the fumes from below would overcome her, causing her to babble and writhe uncontrollably. Priests would then ‘interpret’ the prophecy, in the form of a riddle.

Why riddles? Riddles are ancient. They appear in cuneiform texts. Even Gollum knew what a true riddle should look like – a kind of lyric poem asking a question that guards the right answer in hints and wordplay.

‘I tremble at each breath of air / And yet can heaviest burdens bear.’ [The implicit question being asked is: who am I? – water]

Bilbo cheated.

We could not get away from a discussion of riddles in the digital humanities without of course mentioning the I Ching. It’s a collection of texts that, depending on dice throws, get combined and read in particular ways. Because this is essentially a number of yes-or-no answers, the book can be easily coded onto a computer or represented mechanically. In which case, it’s not really a ‘book’ at all, but a machine for producing riddles.
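To make the point concrete, here is a toy sketch that treats the I Ching as exactly such a machine: six binary casts select one of 64 hexagrams. (The traditional consultation uses yarrow stalks or coins and distinguishes ‘moving’ lines; none of that is modelled, and the two names given are keyed to this sketch’s own binary index rather than the King Wen sequence.)

```python
import random

# Toy sketch: the I Ching as a machine for producing riddles. Six binary
# casts (0 = yin, 1 = yang) select one of 64 hexagrams. The traditional
# consultation uses yarrow stalks or coins and tracks 'moving' lines;
# none of that is modelled here. The two names below are keyed to this
# sketch's own binary index, not the King Wen sequence.

EXAMPLES = {0: 'Kun, the Receptive (six yin lines)',
            63: 'Qian, the Creative (six yang lines)'}

def cast_hexagram(rng=random):
    lines = [rng.randint(0, 1) for _ in range(6)]        # bottom line first
    index = sum(bit << i for i, bit in enumerate(lines))
    return lines, index

lines, index = cast_hexagram()
print('lines (bottom to top):', lines)
print('hexagram', index, '->', EXAMPLES.get(index, 'look up the reading'))
```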

Ruth Wehlau writes, “Riddlers, like poets, imitate God by creating their own cosmos; they recreate through words, making familiar objects into something completely new, rearranging the parts of pieces of things to produce creatures with strange combinations of arms, legs, eyes and mouths. In this transformed world, a distorted mirror of the real world, the riddler is in control, but the reader has the ability to break the code and solve the mystery” (Wehlau 1997).

Riddles & divination are related, and are dangerous. But they also create a simulation, of how the world can come to be, of how it can be controlled.

One can almost see the impetus for necromancy, when living in a world described by riddles. Saul visits the Witch of Endor; Odysseus goes straight to the source.

…and Professor Hix prefers the term ‘post mortem communications’. However you spin it, though, the element of compulsion, of speaking with the dead, marks it out as a transgression; necromancers and those who seek their aid never end well.

It remains true today that those who practice simulation are similarly held in dubious regard. If that were not the case, tongue-in-cheek article titles such as this would not be necessary.

I am making the argument that modern computational simulation, especially in the humanities, is more akin to necromancy than it is to divination, for all of these reasons.

But it’s also the fact that we do our simulation through computation itself that marks this out as a kind of necromancy.

The history of the modern digital computer is tied up with the need to accurately simulate the yields of atomic bombs,  of blast zones, and potential fallout, of death and war. Modern technoculture has its roots in the need to accurately model the outcome of nuclear war, an inversion of the age old problem space, ‘how can we keep the world from ending’ through the doctrines of mutually assured destruction.

The playfulness of those scientists, and the acceleration of hardware technology, led to video games, but that’s a talk for another day (and indeed, has been recently well treated by Rob MacDougall of Western University).

‘But wait! Are you implying that you can simulate humans just as you could individual bits of uranium and atoms, and so on, like the nuclear physicists?’ No, I’m not saying that, but it’s not for nothing that Isaac Asimov gave the world Hari Seldon & the idea of ‘psychohistory’ in the 1950s. As Wikipedia so ably puts it, “Psychohistory is a fictional science in Isaac Asimov’s Foundation universe which combines history, sociology, etc., and mathematical statistics to make general predictions about the future behavior of very large groups of people, such as the Galactic Empire.”

Even if you could do Seldon’s psychohistorical approach, it’s predicated on the population of an entire galaxy. One planet-full, or one empire-full, or one region-full, of people just isn’t enough. Remember, this is a talk on ‘practical’ necromancy, not science fiction.

Well, what about so-called ‘cliodynamics’? Cliodynamics looks for recurring patterns in aggregate statistics of human culture. It may well find such patterns, but it doesn’t really have anything to say about ‘why’ such patterns might emerge. Both psychohistory and cliodynamics are concerned with large aggregates of people. As an archaeologist, all I ever find are the traces of individuals, of individual decisions in the past. It always requires some sort of leap to jump from these individual traces to something larger like ‘the group’ or ‘the state’. A Roman aqueduct is, at base, still the result of many individual actions.

A practical necromancy therefore is a simulation of the individual.

There are many objections to simulation of human beings, rather than things like atoms, nuclear bombs, or the weather. Our simulations can only do what we program them to do. So they are only simulations of how we believe the world works (ah! Cosmology!). In some cases, like weather, our beliefs and reality match quite well, at least for a few days, and we know much about how the variables intersect. But, as complexity theory tells us, starting conditions strongly affect how things transpire. Therefore we forecast from multiple runs with slightly different starting conditions. That’s what a 10% chance of rain really means: We ran the simulation 100 times, and in 10 of them, rain emerged.
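That ensemble logic fits in a few lines; the ‘weather model’ below is an invented threshold rule standing in for the real thing, just to show the shape of the procedure.

```python
import random

# Toy ensemble forecast: run an invented 'weather model' from many slightly
# different starting conditions and report how often rain emerges.

def toy_weather_model(initial_humidity, rng):
    """Stand-in for a real model: nudge the state a little each 'hour'
    and call it rain if humidity ever crosses a threshold."""
    humidity = initial_humidity
    for _ in range(24):
        humidity += rng.gauss(0, 0.02)
        if humidity > 0.95:
            return True
    return False

rng = random.Random(42)
runs = 100
rainy = sum(toy_weather_model(rng.uniform(0.80, 0.90), rng) for _ in range(runs))
print(f'{rainy} of {runs} runs produced rain: a {rainy / runs:.0%} chance of rain')
```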

And humans are a whole lot more complex than the water cycle. In the case of humans, we don’t know all the variables; we don’t know how free will works; we don’t know how a given individual will react; we don’t understand how individuals and society influence each other. We do have theories though.

This isn’t a bug, it’s a feature. The direction of simulation is misplaced. We cannot really simulate the future, except in extremely circumscribed situations, such as pedestrian flow. So let us not simulate the future, as humanists. Let us create some zombies, and see how they interact. Let our zombies represent individuals in the past. Give these zombies rules for interacting that represent our best beliefs, our best stories, of how some aspect of the past worked. Let them interact. The resulting range of possible outcomes becomes a kind of probabilistic historiography. We end up with not just a story about the past, but also about other possible pasts that could have happened if the story we are telling about how individuals in the past acted is true, for a given value of true.

 We create simulacra, zombies, empty husks representing past actors. We give them rules to be interpreted given local conditions. We set them in motion from various starting positions. We watch what emerges, and thus can sweep the entire behavior space, the entire realm of possible outcomes given this understanding. We map what did occur (as best as we understand it) against the predictions of the model. For the archaeologist, for the historian, the strength of agent based modeling is that it allows us to explore the unintended consequences inherent in the stories we tell about the past. This isn’t easy. But it can be done. And compared to actually raising the dead, it is indeed practical.
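None of my published models are this simple, but as a sketch of the shape of the thing: give some simulacra an invented rule (here, passing a rumour to acquaintances on a random network), run the model many times, and look at the distribution of outcomes.

```python
import random

# A minimal agent-based sketch (not one of my published models): simulacra
# pass a rumour to random acquaintances. Sweeping many runs from different
# random seeds gives the distribution of outcomes for this one rule-set:
# a toy 'probabilistic historiography'.

def run_once(n_agents=100, n_contacts=4, p_transmit=0.3, steps=50, seed=0):
    rng = random.Random(seed)
    # each agent knows a handful of others: a crude stand-in for a social network
    contacts = {a: rng.sample([b for b in range(n_agents) if b != a], n_contacts)
                for a in range(n_agents)}
    informed = {0}                                   # the rumour starts with one agent
    for _ in range(steps):
        newly = set()
        for a in informed:
            for b in contacts[a]:
                if b not in informed and rng.random() < p_transmit:
                    newly.add(b)
        informed |= newly
    return len(informed)

outcomes = [run_once(seed=s) for s in range(200)]    # sweep 200 possible pasts
print('fewest / most agents ever reached:', min(outcomes), '/', max(outcomes))
print('mean agents reached:', sum(outcomes) / len(outcomes))
```

Each run is one possible past; the distribution across runs is the probabilistic historiography.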

[and here begins part II, which runs through some of my published ABMs, what they do, why they do it. All of this has to fit within an hour, so I need to do some trimming.]

[Postscriptum, March 23: the image of the book of random digits came from Mark Sample’s ‘An Account of Randomness in Literary Computing’, and was meant to remind me to talk about some of the things Mark brought up. As it happens, I didn’t do that when I presented the other day, but you really should go read his post.]

Living the Life Electric

I’m addressing the Underhill Graduate Students’ Colloquium tomorrow, here in the history department at Carleton U. Below are my slides for ‘Living the Life Electric: On Becoming a Digital Humanist’

update March 7: here are my speaking notes. These give a rough sense of what I intend to talk about at various points. Bolded titles are the titles of slides. Not every slide is listed, as some speak more or less for themselves.

I wanted to be an archaeologist – I graduated in 2002.

‘Digital Humanities’ wasn’t coined until 2004.

It emerges from ‘humanities computing’, which has been around since the 1940s.

In fact, computing wouldn’t be the way it is today without the Humanities, and the Jesuit, Father Busa.

Eastern Canada’s Only Stamped Brick Specialist -Roman archaeology

Stamped brick

Eastern Canada’s only Stamped Brick Specialist, probably

….things were pretty lean in 2003…

Life from a suitcase

Comin’ Home Again

Youth development grant to study cultural heritage of my home township

Also a small teaching excavation based in Shawville

Which led to a teaching gig at the local high school.

A Year of Living Secondarily

What was it about my academic work that I really enjoyed?

Networks

Possibilities of Simulation

Random Chances and the virtues of ‘What the Hell’

Coronation Hall

Meanwhile, I enter business – 3 different startups, one of which has survived (so far!)

Heritage focus

Heritage education – learned how to install my own software, LMS

Trying to monetize the information I uncovered in my cultural heritage study

Coronation Hall Cider Mills

(Shameless Plug).

What are the digital humanities – think about it: modern computers were developed in order to allow us to map, to forecast, the consequences of massive annihilation and death. Simulation is rooted in the desire to predict future death counts. My interest emerged from trying to simulate my own understandings of the past, to understand the unintended consequences of those understandings, to put some sort of order on the necessarily incomplete materials I was looking at. I call it ‘practical necromancy’.

Do your work in public – the blog was originally intended to chronicle my work on simulation, but it has become very much the driver of my online identity, the calling card that others see when they intersect my work – and because it’s been up for so long, with a sustained focus, it creates a very strong signal which our algorithms, Google, pick up. This is how academics can push the public discourse: interfere with the world’s extended mind, their entangled consciousness of cyberspace & meatspace.

Allows you to develop your ideas

Forces you to write in small chunks

Exposes your work to potential audiences

My blog posts have been cited in others’ academic monographs

Has improved the readership of my published work

A quarter million page reads over the last six years.

My book: maybe 40 copies, if I’m lucky.

Basic Word Counts

Top words:

digital: 1082; research: 650; university: 577; experience: 499; library: 393; humanities: 386

History: 177 times

Broadly, not useful or surprising. But consider the structure of word use…
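(As an aside: counts like these take only a few lines to produce. A minimal sketch, assuming the job adverts have been saved as plain-text files in a hypothetical adverts/ folder:)

```python
from collections import Counter
from pathlib import Path
import re

# Minimal word-count sketch. Assumes the job adverts have been saved as
# plain-text files in an 'adverts/' folder (a hypothetical path).
counts = Counter()
for path in Path('adverts').glob('*.txt'):
    words = re.findall(r'[a-z]+', path.read_text(encoding='utf-8').lower())
    counts.update(w for w in words if len(w) > 3)   # crude stop-wording

for word, n in counts.most_common(10):
    print(word, n)
```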

Group 1: gives you a sense of technical skills, but for the most part not the kinds of analyses that one would use them for. That’s an important distinction. The analysis should drive the skill set, not the other way around (to a man with a hammer, everything looks like a nail).

Group 2: European centres!

Group 3: Canada!

Job adverts-to-topics: six broad groups based on how the adverts share particular discourses. This gives a sense of where academic departments think this field is going. If I’d done this according to individual researchers’ blogs, or the ‘about’ pages for different centres, you’d get a very different picture – game studies, for instance.
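(The grouping itself can be approximated with off-the-shelf topic modelling. A sketch of one way to do it, using scikit-learn’s LDA, which is not necessarily the tool used for the groups above, and again assuming the hypothetical adverts/ folder:)

```python
from pathlib import Path
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Sketch: sort job adverts into six broad 'discourses' with topic modelling.
# Assumes the same hypothetical adverts/ folder of plain-text files.
docs = [p.read_text(encoding='utf-8') for p in Path('adverts').glob('*.txt')]

vectorizer = CountVectorizer(stop_words='english', max_df=0.9)
dtm = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=6, random_state=42)
lda.fit(dtm)

words = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-8:][::-1]]
    print(f'Group {i + 1}:', ', '.join(top))
```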

 

Important point: I wanted to show you how you can begin to approach large masses of material, extract insights, and suss out underlying structures of ideas. This is going to be big in the future, as more and more data about our every waking moment gets recorded. Google Glass? It’s not about the user: it’s about everything the user sees, which’ll get recorded in the googleplex. Governments. Marketers. University administrations. Learn to extract signals from this noise, and you’ll never go hungry again.

Keep in mind that in 1994 I wrote that the internet would never be useful for academics. My ability to predict the future is thus suspect.

So how to join this brave new world? Twitter, etc.