An Open Research Notebook Workflow with Scrivener and Github Part 2: Now With Dillinger.io!

A couple of updates:

First Item

The four scripts that sparkygetsthegirl crafted allow him to

1. write in Scrivener,

2. sync to a Dropbox folder,

3. convert to md,

4. then open those md files on an Android tablet to write/edit/add,

5. and then reconvert to rtf for syncing back into Scrivener.
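(Under the hood, a script like rtf2md needn't be much magic. Here's a minimal one-line sketch of the rtf-to-md step – not sparkygetsthegirl's actual code, just the general idea – assuming pandoc is installed, and using OS X's built-in textutil:)

textutil -convert html Draft/note.rtf -stdout | pandoc -f html -t markdown -o Draft/note.md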

I wondered to myself, what about some of the online markdown editors? Dillinger.io can scan Dropbox for md files. So I went to Dillinger.io, linked it to my Dropbox, scanned for md files, and lo! I found my project notes. So if the syncing folder is shared with other users, they can edit the notecards via Dillinger. Cool, eh? Not everyone has a native app for editing, so they can just point their device's browser at the website. I'm sure there are more options out there.

Second Item

I was getting syncing errors because I wasn’t flipping the md back to rtf.

But, one caveat: when I went to run the md-to-rtf script, to get my changes back into Scrivener (and then sync), things went very wonky indeed. One card was now blank; the others were full of Scrivener's markup, but Scrivener wasn't recognizing it.

So I think the problem is me doing things out of order. I continue to play.

Third Item

I automated the running of the conversion scripts. You can see my Automator setup in the screenshot below. Again, I saved it as an application on my desktop. The first step is to grab the right folder; the second, to open the terminal, input the commands, and then close the terminal.

[Screenshot: the Automator setup for running the conversion scripts]
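(For the record, the commands the Automator app feeds into the terminal amount to something like this – the project path here is hypothetical, so substitute your own synced folder:)

cd ~/Dropbox/MyScrivenerProject
./rtf2md Draft/*.rtf
./rtf2md Notes/*.rtf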

Postscript

I was asked: why on earth would I want to share my research notes? Many, many reasons – see Caleb McDaniel's post, for instance – but one other feature is that, because I'm doing this on GitHub, a person could fork (copy) my entire research archive. They could then build upon it. GitHub keeps track of who forks what, so forking becomes a kind of mass citation and breadcrumb trail showing who had an idea first. Moreover, GitHub code (or in this case, my research archive) can be archived on figshare too, thus giving it a unique DOI *and* proper digital archiving in multiple locations. Kinda neat, eh?

An Open Research Notebook Workflow with Scrivener and Github

I like Scrivener. I *really* like being able to have my research and my writing in the same place, and most of all, I like being able to re-arrange the cards until I start to see the ideas fall into place.

I’m a bit of a visual learner, I suppose. (Which makes it ironic that I so rarely provide screenshots here. But I digress). What I’ve been looking for is a way to share my research, my lab notes, my digital ephemera in a single notebook. Lots of examples are out there, but another criterion is that I need to be able to set something up that my students might possibly be able to replicate.

So my requirements:

1. Visually see my notes, their layout, their possible logical connections. The ability to rearrange my notes provides the framework for my later written outputs.

2. Get my notes (but not all of the other bits and pieces) onto the web in such a way that each note becomes a citable object, with revision history freely available.

3. Ideally, that could then feed into some sort of shiny interface for others' browsing – something like Jekyll, I guess – but that's not really a big deal at the moment.

So #1 is taken care of with Scrivener. Number 2? I'm thinking GitHub. Number 3? We'll worry about that some other day. There are Scrivener project templates that can be dropped into a GitHub repository (see previous post). You would create a folder/repo on your computer, drop the template into that, and write away to your heart's content, committing and syncing at the end of the day. This is what you'd get. All those slashes and curly brackets tell Scrivener what's going on, but it's not all that nice to read. (After all, that solution is about revision history, not open notebooks.)
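(If the git side is new to you: once the repo exists on GitHub, that end-of-day 'committing and syncing' boils down to a few commands in the terminal, run from inside the repo folder – the commit message is whatever you like:)

git add .
git commit -m "end of day notes"
git push origin master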

Now, it is possible to manually compile your whole document, or bits at a time, into markdown files and to commit/sync those. That's nice, but time-consuming. What I think I need is some way to turn Scrivener's rtf files into nice markdown. I found this, a collection of scripts by Sparkygetsthegirl, built as part of a Scrivener-to-Android-tablet-and-back writing flow. Check it out! Here's how it works. NB, this is all Mac-based, today.

1. Make a new Scrivener project.

2. Sync it to Dropbox. (Which is nice: backups, portability via Dropbox, sharing via GitHub! See below.)

3. Drop the four scripts into the synced folder. Open a terminal window there. We'll come back to that.

4. Open Automator. What we're going to do is create an application that will open the 'drafts' folder in the synced project, grab everything, filter for just the markdown files we made, then move them over to our GitHub repo, overwriting any pre-existing files there. Here's a screenshot of what that application looks like in the Automator editing screen:

[Screenshot: the Automator application. Remember, you're creating an 'application', not a 'workflow'.]

You drag the drafts folder into the 'Get specified finder items' box, get the folder contents, filter for files with the extension .md, and then copy to your GitHub repo. Tick the overwrite checkbox.
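(The shell-averse can stick with Automator, but for the record, that whole little app is roughly equivalent to one command – the paths here are hypothetical:)

cp -f ~/Dropbox/MyScrivenerProject/Draft/*.md ~/github/my-notebook/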

Back in Scrivener, you start to write.

Write write write.

Here’s a screenshot of how I’m setting up a new project.

[Screenshot: setting up a new project in Scrivener]

In this screenshot, I’ve already moved my notecards from ‘research’ into ‘draft’. In a final compile, I’d edit things heavily, add bits and pieces to connect the thoughts, shuffle them around, etc. But right now, you can see one main card that identifies the project and the pertinent information surrounding it (like for instance, when I’m supposed to have this thing done). I can compile just that card into multimarkdown, and save it directly to the github repository as readme.md.

Now the day is done; I'm finished writing/researching/playing. I sync the project one last time. Then, in the terminal window, I can type

./rtf2md Draft/*.rtf

for everything in the draft folder, and

./rtf2md Notes/*.rtf

for everything in the notes folder. Mirabile dictu, the resulting md files will have the title of the notecard as their file name!

[Screenshot: the converted md files, named for their notecards]

Here, I’ve used some basic citation info as the name for each card; a better idea might be to include tags in there too. Hey, this is all still improv theatre.

Now, when I created that application using Automator, I saved it to my desktop. I double-click on it, and it strains out the md files and moves them over to my GitHub repository. I then commit & sync, and I now have an open lab notebook on the web. Now, there are still some glitches; the markdown syntax that I wrote in Scrivener isn't being recognized on GitHub, because I think Scrivener is adding backslashes here and there, which act like escape characters.

Anyway, this seems a promising start. When I do further analysis in R, or build a model in Netlogo, I can record my observations this way, create an R notebook with knitr or a netlogo applet, and push these into subfolders in this repo. Thus the whole thing will stick together.

I think this works.

~o~
Update Sept 18. I’ve discovered that I might have messed something up with my syncing. It could be I’ve just done something foolish locally or it might be something with my workflow. I’m investigating, but the upshot is, I got an error when I synced and a new folder called ‘Trashed Files’, and well, I think I’m close to my ideal setup, but there’s still something wonky. Stay tuned.

Update Sept 19 Don’t write in Scrivener using markdown syntax! I had a ‘doh’ moment. Write in Scrivener using bold, italics, bullets, etc to mark up your text. Then, when the script converts to markdown, it’ll format it correctly – which means that github will render it more or less correctly, making your notes a whole lot easier to read. Click on ‘raw’ on this page to see what I mean!

Open Notebooks

This post is more a reminder to me than anything you'd like to read, but anyway –

I want to make my research more open, more reproducible, and more accessible. I work from several locations, so I want to have all my stuff easily to hand. I work on a Mac (sometimes) a PC (sometimes) and on Linux (rarely, but it happens; with new goodies from Bill Turkel et al I might work more there!).

I build models in Netlogo. I do text analysis in R. I visualize and analyze with things like Voyant and Overview. I scrape websites. I use Excel quite a lot. I’m starting to write in markdown more often. I want to teach students (my students typically have fairly low levels of digital literacy) how to do all this too. What I don’t do is much web development type stuff, which means that I’m still struggling with concepts and workflow around things like version control. And indeed, getting access to a server where I can just screw around to try things out is difficult (for a variety of reasons). So my server-side skills are weak.

What I think I need is an open notebook. Caleb McDaniel has an excellent post on what this could look like. He uses Gitit. I looked at the documentation, and was defeated out of the gate. Carl Boettiger uses a combination of GitHub and Jekyll and who knows what else. What I really like is Mark Madsen's example, but I'm not au fait enough yet with all the bits and pieces (damn you, version control, commits, make, rake, et cetera et cetera!)

I've got IPython notebooks working on my PC, which are quite cool (I installed the Anaconda version). I don't know much Python though, so yeah. Stefan Sinclair is working on 'Voyant notebooks', which uses the same general idea to wrap analysis around Voyant, so I'm looking forward to that. IPython can be used to call R, which is cool, but it's still early days for me (here's a neat example passing data to R's ggplot2).
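(The IPython-calling-R trick is the 'rmagic' extension that ships with rpy2. Here's a minimal sketch, assuming rpy2, pandas, and ggplot2 are installed – each chunk below would be its own notebook cell:)

# cell 1: load the extension and build a toy dataframe
%load_ext rpy2.ipython
import pandas as pd
df = pd.DataFrame({'x': range(10), 'y': [v ** 2 for v in range(10)]})

# cell 2: -i pushes the dataframe into R, where ggplot2 can have at it
%%R -i df
library(ggplot2)
print(ggplot(df, aes(x, y)) + geom_line())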

So maybe that's just the wrong tool. Much of what I want to do, at least as far as R is concerned, is covered in this post by Robert Flight on 'creating an analysis as a package and vignette' in RStudio. And there's also this, for making sure things are reproducible: 'packrat'.

Some combination of all of this, I expect, will be the solution that'll work for me. Soon I want to start doing some more agent-based modeling & simulation work, and it's mission critical that I sort out my data management, notebooks, versioning, etc. first this time.

God, you should see the mess around here from the last time!

On Teaching High School

“Hey! Hey Sir!”

Some words just cut right to the cerebellum. 'Sir' is not normally one of them, but I was at the Shawville Fair, and 'sir' isn't often used in the midway. I turned, and saw before me a student from ten years previously. We chatted; he was married, had a stepdaughter, another one on the way. He'd apprenticed, become a mechanic. He was doing well. I was glad to see him.

“So, you still teaching us assholes up at the school?”

No, I was at the university. "You guys weren't assholes."

A Look. “Yes, we were. But there were good times, too, eh?”

Ten years ago, I held my first full-time, regular teaching contract, at the local high school. The year before that, I was a regular-rotation substitute teacher. Normally one would need a teaching certificate to teach in a high school, but strangely enough newly minted teachers never seem to consider rural or more remote schools. Everyone wants to teach in the city. Having at least stood in front of students in the past, I was about the best short-term solution around. Towards the latter part of that year, holes had opened up in the schedule and I was teaching every day. This transmuted into a regular gig teaching Grade 9 computing, Grade 9 geography (a provincially mandated course), and Grade 10/11 technical drawing.

And Math for Welders.

The school is formally a 'polyvalente', meaning a school where one could learn trades. However, our society's bias against trades, and years of cuts to the English system in Quebec (and asinine language laws which, amongst other things, mandate that only books published in Quebec can be used as textbooks – how many English textbooks are published for a community of only around a million people, full stop?), meant that all of the trades programs were dead. In the last decade, this last-gasp program had been established in the teeth of opposition (which meant these students were watched very carefully indeed – and they knew it). Instead of taking 'high math' and other courses (targeted at the university-bound), these students could take 'welding math'. They also worked in a metal shop. If they could pass my course, and pass the ticket exam for welders, they could graduate high school and begin apprenticeships.

The welding program was conceived as a solution for students (typically boys) who had otherwise fallen through the cracks in the system. It was intense. These boys (though there have been maybe five or six girls in the program over the years) had never had academic success. They were older than their peers, having fallen behind. They had all manner of social issues, family issues, learning difficulties, you name it.

And they were all mine. Not only did I teach them technical drawing and math (so right there, two or three hours of face-to-face time per day, every day), I was also their home room teacher. At our school, 'home room' was not just about morning attendance; it was also a kind of group therapy session. (I say 'group therapy', but really, in other classes there was a mix of years in these home rooms, so older students could work with younger on homework, personal stuff, whatever; in my class, it was just me and the welders. We didn't mix.)

I learned a lot about teaching over those two years.

I could tell you a lot of stories of pain and stress. I’ve never been quite so near to quitting, to tears, to breaking down, to screaming at the world. I did a PhD! I was from the same town! I’d beaten the system! Did that not earn me some respect? Was I not owed?

No.

And that was the hardest lesson right there. In fact, although I thought myself humble when I started the job (after two years of slogging in the sessional world, hustling for contract heritage work, and so on), I still had a hard time disentangling my expectations of what students should be from my notion of the kind of student I was. Those first two months, up to Thanksgiving, might’ve been a lot easier if I had.

I also underestimated how hard it would be to earn respect. I figured ‘PhD’ meant I’d already earned it, in the eyes of the world. But I hadn’t counted on the ‘if you were any good you wouldn’t be working here’ attitude that infects so much of Canadian life (and rural life in particular).

Once, one of the students fell asleep in class. What do you do, as a novice teacher? You wake him up. You take him into the hallway to 'deal' with him. And then I sent him up to the office. What I didn't know: his dad was long gone. His mom was with a new beau, and had been spending every night at the bar. The oil bill had not been paid, and what with it being winter and all, there was no heat. He had been sitting up every night to watch over his sisters, whom he'd put in sleeping bags in the kitchen, in front of an open electric oven. He was afraid of burning down the house if he fell asleep.

And god help me, I was giving him shit for not drawing his perspective drawings correctly, for falling asleep.

With time, I began to earn their respect. It helped that at school functions I had no fear of standing up and making a fool of myself doing whatever silly activity the pep leaders had devised. “He’s a goof but he’s OUR goof!” seemed to be the sense. I learned that I had to stop being a ‘teacher’ and start being these guys’ advocate. Who else was going to stand up for them? Everyone else had already written them off.

In some corners of the school, there was a firmly held conviction that these guys were getting off easy, that somehow what they were doing was less mentally challenging. There were some ugly staffroom showdowns. Welding math involves a lot of geometry and trigonometry, finances, and mental calculation. It's not easy in any way, shape, or form. Tradesmen in Canada frequently work in Imperial units, while officialdom works in metric. Calculating, switching, tallying… these are all non-trivial things! "Sir, that's the first time I passed a math test since Grade four," said one lad, around about October.

The first test since Grade four. My god, what have we done to ourselves? None of these students were dumb, in the sense that students use the word. When I lost most of the class to moose hunting season, I had them, on their return, explain to me exactly what they did. Extremely complicated thinking about camouflage, fish and game laws & licensing, working with weapons and bullets… these guys were smart. They never hesitated to call me on it, either, when what I was saying to them made no sense.

"Sir", a voice in the back would say, "what the fuck are you talking about?" You can't get angry about language. This is how they've learned to speak. But imagine: a student in your class actually taking the time to explain that they don't understand, and to show you where you lost them? These guys did that! Once I learned to take the time to listen, they had a lot to say. Would that my university students had the bravery to do the same.

It was never easy, working with these guys. At the end of the year, I was completely drained. A tenured teacher came back from sick leave, and I was bumped from my position. Unemployed again. Look at that from my students' perspective. Here's a guy, finished first in his high school, got a PhD. Came back home without a job. Ends up working with us – us! – and then loses his job again afterwards. Maybe, just maybe, doing the whole 'academic' thing they push isn't the thing. Maybe, maybe, working with my hands, welding, machining… I'll always have work. If I can figure out how to plan the best cuts in this sheet of metal so that I don't waste any money. If I can pass the welding exam. If I don't get my girlfriend pregnant. If I maybe pass on the blow this weekend and go to work.

Did some of them think that? I’d like to think so. We bickered, we locked horns, but once I proved to them that I was on their side, I’d like to think the good stuff outweighed the bad. I certainly know that it did wonders for me as a teacher. First and foremost, it forced me to get over myself. I learned that:

  • nobody owes me anything
  • what I was like as a student is no guide to what my students are like as students
  • I need to ask: how do I make it safe for students to try something, and to admit that I'm making not an ounce of sense?
  • I need to not assume I know anything about my students’ backgrounds
  • I need to make my expectations crystal clear for what constitutes proof-of-learning
  • I need to be part of the life of my school/community so that my students see that I’m invested in them.

A few years later, I won a postdoc position at U Manitoba, and began teaching in distance education and online education. That helped me transmogrify into whatever this ‘digital humanities’/’digital archaeology’ thing is. That’s the final lesson right there. I have a PhD in the finer points of the Tiber Valley brick industry. Don’t be afraid to change: your PhD is not you. It’s just proof that you can see a project through to the end, that you are tenacious, and that you can put the pieces together to see something new. Without the PhD, I could never have worked with those boys.

I was glad to see Jeremy, at the fair this year.

Setting the groundwork for an undergraduate thesis project

We have a course code, HIST4910, for students doing their undergraduate thesis project. This project can take the form of an essay, it can be a digital project, it could be code, it could be in the form of any of the manifold ways digital history/humanities research is communicated.

Hollis Peirce will be working with me this year on his HIST4910, which for now is called ‘The Evolution of the Digitization of History: Making History Accessible’. Hollis has been to DHSI twice now, once to take courses on digitization, once to work on the history of the book. Hollis’ interest in accessibility (as distinct from ‘open access’, which is an entirely different kettle of fish) makes this an exciting project, I think. If you’re interested in this subject, let us know! We’d like to make connections.

We met today to figure out how to get this project running, and I realized, it would be handy to have a set of guidelines for getting started. We don’t seem to have anything like this around the department, so Hollis and I cobbled some together. I figured other folks might be interested in that, so here they are.

Play along at home with #hist3812a

In my video games and history class, I assign each week one or two major pieces that I want everyone to read. Each week, a subset of the class has to attempt a ‘challenge’, which involves reading a bit more, reflecting, and devising a way of making their argument – a procedural rhetoric – via a game engine (in this case, Twine). Later on, they’ll be building in Minecraft. Right now, we have nearly 50 students enrolled.

If you’re interested in following along at home, here are the first few challenges. These are the actual prompts cut-n-pasted out of our LMS. Give ‘em a try if you’d like, upload to philome.la, and let us know! Ours will be at hist3812a.dhcworks.ca

I haven’t done this before, so it’ll be interesting to see what happens next.

Introduction to #hist3812a

Challenge #1

Read:

  1. Fogu, Claudio. ‘Digitalizing Historical Consciousness’, History and Theory 2, 2009.
  2. Tufekci, Zeynep. 'What Happens to #Ferguson Affects Ferguson: Net Neutrality, Algorithmic Filtering and Ferguson.' Medium, August 14, 2014

Craft:

A basic Twine that highlights the ways the two articles are connected.

Share:

Put your Twine build (the *.html file) into the 'public' folder in your Dropbox account (if you don't have a public folder, just right-click and select public link – see this help file). Share the link on our course blog:

  1. Create a new post.
  2. Hit the ‘html’ button.
  3. type:
  4. Preview your post to make sure it loads your Twine.

Play:

Explore others’ Twines and be ready to discuss this process and these readings in Tuesday’s class.

A history of games, and of video games

Challenge #2

Read & Watch:

Antecedents (read the intros):

Shannon, C. 'A Mathematical Theory of Communication.' Reprinted with corrections from The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948. http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf

Turing, Alan Mathison. “On computable numbers, with an application to the Entscheidungsproblem.” J. of Math 58 (1936): 345-363. http://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf

Cold War (watch this entire lecture): https://www.youtube.com/watch?v=_otw7hWq58A

1980s:

Dillon, Roberto. The Golden Age of Video Games: The Birth of a Multi-Billion Dollar Industry. CRC Press, 2011.

Christiansen, Peter ‘Dwarf Norad: A Glimpse of Counterfactual Computing History’ Play the Past August 6 2014 http://www.playthepast.org/?p=4892

Craft:

A Twine that imagines what an ENIAC developed to serve the needs of historians might've looked like – ie, explore Christiansen's argument.

Share:

Put your Twine build (the *.html file) into the 'public' folder in your Dropbox account. Share the link on our course blog by:

  1. Create a new post.
  2. Hit the ‘html’ button.
  3. type:
  4. Preview your post to make sure it loads your Twine.

Play:

Explore others’ Twines and be ready to discuss this process and these readings in Tuesday’s class.

Historical Consciousness and Worldview

Challenge #3

Read:

Kee, Graham, et al. 'Towards a Theory of Good History Through Gaming', The Canadian Historical Review, Volume 90, Number 2, June 2009, pp. 303-326. http://muse.jhu.edu/journals/can/summary/v090/90.2.kee.html

Travis, Roger. ‘Your practomimetic school: Duck Hunt or BioShock?’ Play the Past Oct 21 2011 http://www.playthepast.org/?p=2067

Owens, T. ‘What does Simony say? An interview with Ian Bogost’ Play the Past Dec 13, 2012 http://www.playthepast.org/?p=3394

Travis, Roger. ‘A Modest Proposal for viewing literary texts as rulesets, and for making game studies beneficial for the publick’ Play the Past Feb 9 2012 http://www.playthepast.org/?p=2417

McCall, Jeremiah. “Historical Simulations as Problem Spaces: Some Guidelines for Criticism”. Play the Past http://www.playthepast.org/?p=2594

(Not assigned, but more of Travis’ work: http://livingepic.blogspot.ca/2012/07/rules-of-text-series-at-play-past.html)

Craft:

A Twine that exposes the underlying rhetorics of the game of teaching history.

Share:

Put your Twine build (the *.html file) into the 'public' folder in your Dropbox account. Share the link on our course blog by:

  1. Create a new post.
  2. Hit the ‘html’ button.
  3. type:
  4. Preview your post to make sure it loads your Twine.

Play:

Explore others’ Twines and be ready to discuss this process and these readings in Tuesday’s class.

Critical Play Week

Challenge # 4

Remember: 

Keep notes on the discussions from the critical play session; move around the class, talk with people about what they’re playing, why they’re making the moves they’re doing, and think about the connections with the major reading.

(NB, I've assigned all the students to bring in video games and board games that we'll play in both sessions this week. We might decamp to the game lab in the library to make this work. This group will observe the play. I've also pointed them to Feminist Frequency as an example of the kind of criticism I want them to emulate.)

Craft:

Devise a Twine that captures the dynamic and discussions of this week’s in-class critical play. Remember, for historians, it may be all about time and space.

Share:

Put your Twine build (the *.html file) into the 'public' folder in your Dropbox account. Share the link on our course blog by:

  1. Create a new post.
  2. Hit the ‘html’ button.
  3. type:
  4. Preview your post to make sure it loads your Twine.

Play:

Explore others’ Twines and be ready to discuss this process and these readings in Tuesday’s class.

Material Culture and the Digital

Challenge #5

Read

Montfort et al, ‘Introduction’, 10 Print http://10print.org/ (download the pdf)

Montfort et al, ‘Mazes,’ 10 Print http://10print.org/ (download the pdf)

Bogost, Ian, Montfort, N. ‘New Media as Material Constraint: An Introduction to Platform Studies.’ 1st International HASTAC Conference, Duke University, Durham NC  http://bogost.com/downloads/Bogost%20Montfort%20HASTAC.pdf

Craft:

Make a Twine game that emulates Space Invaders; then discuss (within the Twine) the interaction between game, platform, and experience. Think also about ‘emulation’…

OR

Play one of these games, reviewing it via Twine, thinking about it in a way that reverses the points made by Montfort & Bogost (ie, think about the way the physical is represented in the software).

Share:

Put your Twine build (the *.html file) into the 'public' folder in your Dropbox account. Share the link on our course blog by:

  1. Create a new post.
  2. Hit the ‘html’ button.
  3. type:
  4. Preview your post to make sure it loads your Twine.

Play:

Explore others’ Twines and be ready to discuss this process and these readings in Tuesday’s class.

 

Web Seer and the Zeitgeist

I’ve been playing all evening with Web Seer, a toy that lets you contrast pairs of Google autocomplete suggestions. As is well known, Google autocomplete suggests completions based on what others have been searching for given that pattern of text you are entering. This is sparking some thoughts on how I might use this to think about things like public archaeology or public history.

As Alan Liu put it,

But for now, enjoy the pairings that I’ve been feeding it….

[Screenshot: In ancient/modern]

[Screenshot: Greek versus Roman]

[Screenshot: What School Should I Go To?]

[Screenshot: Games and Literature]

[Screenshot: Getting Down to Brass Tacks]

[Screenshot: Drunkards and Teetotallers, never the twain shall meet]

[Screenshot: Historians v Archaeologists, a Google Cage Match]

[Screenshot: The DH Dilemma]

[Screenshot: Future/Perfect]

[Screenshot: Two Solitudes Redux]

SAA 2015: Macroscopic approaches to archaeological histories: Insights into archaeological practice from digital methods

Ben Marwick and I are organizing a session for the SAA2015 (the 80th edition, this year in San Francisco) on “Macroscopic approaches to archaeological histories: Insights into archaeological practice from digital methods”. It’s a pretty big tent. Below is the session ID and the abstract. If this sounds like something you’d be interested in, why don’t you get in touch?

Session ID 743.

The history of archaeology, like most disciplines, is often presented as a sequence of influential individuals and a discussion of their greatest hits in the literature.  Two problems with this traditional approach are that it sidelines the majority of participants in the archaeological literature who are excluded from these discussions, and it does not capture the conversations outside of the canonical literature.  Recently developed computationally intensive methods as well as creative uses of existing digital tools can address these problems by efficiently enabling quantitative analyses of large volumes of text and other digital objects, and enabling large scale analysis of non-traditional research products such as blogs, images and other media. This session explores these methods, their potentials, and their perils, as we employ so-called ‘big data’ approaches to our own discipline.

—-

Like I said, if that sounds like something you’d be curious to know more about, ping me.

Historical Maps into Minecraft: My Workflow

The folks at the New York Public Library have a workflow and python script for translating historical maps into Minecraft. It's a three-step (quite big steps) process. First, they generate a DEM (digital elevation model) from the historical map, using QGIS. This is saved as 'elevation.tiff'. Then, using Inkscape, they trace over the features from the historical map that they want to translate into Minecraft. Different colours equal different kinds of blocks. This is saved as 'features.tiff'. Then, using a custom python script, the two layers are combined to create a Minecraft map, which can be in either 'creative' or 'survival' mode.
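(To give a flavour of what that custom script is up to – this is not the NYPL script itself, just a minimal sketch of the combining logic, assuming Pillow and numpy are installed, with a made-up colour-to-block mapping:)

from PIL import Image
import numpy as np

# the two layers produced in QGIS and Inkscape
elevation = np.array(Image.open('elevation.tiff').convert('L'))   # 0-255 heights
features = np.array(Image.open('features.tiff').convert('RGB'))   # colour-coded features

# hypothetical mapping from feature colours to kinds of blocks
colour_to_block = {
    (0, 0, 255): 'water',
    (0, 255, 0): 'grass',
    (120, 120, 120): 'stone',
}

rows, cols = elevation.shape
for z in range(rows):
    for x in range(cols):
        height = int(elevation[z, x]) // 4  # squash 0-255 into a sane world height
        block = colour_to_block.get(tuple(features[z, x]), 'dirt')
        # a real script would write a column of 'block' up to 'height' here,
        # using a Minecraft level library (pymclevel, for instance)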

There are a number of unspoken steps in that workflow, including a number of dependencies for the python script that have to be installed first. Similarly, QGIS and its plugins have a steep (sometimes hidden) learning curve. As does Inkscape. And Imagemagick. This isn't a criticism; it's just the way this kind of thing works. The problem, from my perspective, is that if I want to use this in the classroom, I have to guide 40 students with widely varying degrees of digital fluency.* I've found in the past that many of my students "didn't study history to have to work with computers" and that the payoff sometimes (to them) doesn't seem to have (immediate) value. The pros and cons of that kind of work will be a post for another day.

Right now, my immediate problem is: how can I smooth the gradient of the learning curve? I will do this by providing three separate paths for creating the digital elevation model.

Path 1, for when real world geography is not the most important aspect.

It may be that the shape of the world described by the historical map is what is of interest, rather than the current topography of the world. For example, I could imagine a student wanting to explore the historical geography of the Chats Falls before they were flooded by the building of a hydro dam. Current topographic maps and DEMs are not useful. For this path, the student will need to use the process described by the NYPL folks:

Requirements

QGIS 2.2.0 ( http://qgis.org )

  • Activate Contour plugin
  • Activate GRASS plugin if not already activated

A map image to work from

  • We used a geo-rectified TIFF exported from this map, but any high-res scan of a map with elevation data and features will suffice.

Process:

Layer > Add Raster Layer > [select rectified tiff]

  • Repeat for each tiff to be analyzed

Layer > New > New Shapefile Layer

  • Type: Point
  • New Attribute: add ‘elevation’ type whole number
  • remove id

Contour (plugin)

  • Vector Layer: choose points layer just created
  • Data field: elevation
  • Number: at least 20 (maybe the number of distinct elevations + 2)
  • Layer name: default is fine

Export and import contours as vector layer:

  • right click save (e.g. port-washington-contours.shp)
  • May report an error like "Only 19 of 20 features written." This doesn't seem to matter much.

Layer > Add Vector Layer > [add .shp layer just exported]

Edit Current Grass Region (to reduce rendering time)

  • clip to minimal lat longs

Open Grass Tools

  • Modules List: Select “v.in.ogr.qgis”
  • Select recently added contours layer
  • Run, View output, and close

Open Grass Tools

  • Modules List: Select “v.to.rast.attr”
  • Name of input vector map: (layer just generated)
  • Attribute field: elevation
  • Run, View output, and close

Open Grass Tools

  • Modules List: Select “r.surf.contour”
  • Name of existing raster map containing colors: (layer just generated)
  • Run (will take a while), View output, and close

Hide points and contours (and anything else above the b/w elevation image), then Project > Save as Image

You may want to create a cropped version of the result to remove un-analyzed/messy edges

The hidden, tacit bits here involve installing the Contour plugin, and working with the GRASS tools (especially the bit about 'editing the current GRASS region', which is always fiddly, I find). Students pursuing this path will need a lot of one-on-one.

Path 2, for when you already have a shapefile from a GIS:

This was cooked up for me by Joel Rivard, one of our GIS & Map specialists in the Library. He writes,

1) In the menu, go to Layer > Add Vector Layer. Find the point shapefile that has the elevation information. Ensure that you select point in the file type.
2) In the menu, go to Raster > Interpolation. Select 'Field 3' (this corresponds to the z or elevation field) for Interpolation attribute and click on 'Add'. Feel free to keep the rest as default, and save the output file as an image (.asc, .bmp, .jpg or any other raster – probably best to use .asc, since that's what MicroDEM likes).
We’ll talk about MicroDEM in a moment. I haven’t tested this path yet, myself. But it should work.

Path 3, for when modern topography is fine for your purposes

In this situation, modern topography is just what you need.

1. Grab Shuttle Radar Topography Mission data for the area you are interested in (it downloads as a tiff.)

2. Install MicroDEM and all of its bits and pieces (the installer wants a whole bunch of other supporting bits; just say yes. MicroDEM is PC software, but I’ve run it on a Mac within WineBottler).

3. This video tutorial covers working with MicroDEM and Worldpainter:

https://www.youtube.com/watch?v=Wha2m4_CPoo

But here are some screenshots – basically, you open up your .tiff or your .asc image file within MicroDEM, crop to the area you are interested in, and then convert the image to grayscale:

[Screenshot: MicroDEM – open image, crop image]

[Screenshot: convert to grayscale]

[Screenshot: remove legends, marginalia]

Save your grayscaled image as a .tiff.
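(An aside: if you're comfortable at the command line and have GDAL installed, gdal_translate can do the grayscale stretching for you – something like this, with hypothetical file names, rescales the elevation values into 0-255:)

gdal_translate -ot Byte -scale srtm_tile.tif grayscale_dem.tif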
Regardless of the path you took (and think about the historical implications of those paths), you now have a grayscale DEM image that you can use to generate your Minecraft world.

Converting your grayscale DEM to a Minecraft World

At this point, the easiest thing to do is to use WorldPainter. It's free, but you can donate to its developers to help them maintain and update it. The video above shows how to load your DEM image into WorldPainter. It parses the black-to-white pixel values and turns them into elevations. You have the option of setting where 'sea level' is on your map (so elevations below that point are covered with water). There are many, many options here; play with it! Adam Clarke, who made the video, suggests scaling up your image to 900%, but I've found that that makes absolutely monstrous worlds. You'll have to play around to see what makes most sense for you, but with real-world data of any area larger than a few kilometres on a side, I think 100 to 200% is fine.

Now, the crucial bit for us: you can import an image into WorldPainter to use as an overlay to guide the placement of blocks, terrain, buildings, whatever. So, rather than me simply regurgitating what Adam narrates, go watch the video. Save as a .world file for editing; export to Minecraft when you’re ready (be warned: big maps can take *a very long time* to render. That’s another reason why I don’t scale up the way Adam suggests).

Go play.

To get you started: here are a number of DEMs and WorldPainter world files that I’ve been playing with. Try ‘em out for yourself.

 

* Another problem I've encountered is that my feature colours don't map onto the index values for blocks in the script. I've tried modifying the script to allow for a bit of fuzziness (a kind of 'if the pixel value is between x and y, treat as z'). I end up with worlds filled with water. If I run the script on the Fort Washington maps provided by NYPL, it works perfectly. The script is supposed to be looking only at the R of the RGB values when it assigns blocks, but I wonder if there isn't something else going on. I had it work once, correctly, for me – but I used MS Paint to recolour my image with the exact colours from the Fort Washington map. Tried it again, exact same workflow on a different map – nada. Nyet. Zip. Zilch. Just a whole lot of tears and heartache.
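(For what it's worth, here's a minimal sketch of the kind of fuzziness I was after, assuming numpy is installed and using made-up index colours: instead of demanding an exact match, assign each pixel to whichever index colour is nearest in RGB space.)

import numpy as np

# hypothetical index colours, one per block type
index_colours = np.array([(0, 0, 255),        # water
                          (0, 255, 0),        # grass
                          (120, 120, 120)])   # stone

def nearest_block(pixel):
    # squared distance in RGB space to each index colour
    distances = ((index_colours - pixel) ** 2).sum(axis=1)
    return int(distances.argmin())

print(nearest_block(np.array([10, 240, 5])))  # prints 1, ie the 'grass' block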

Assessing my upcoming seminar on the Illicit Antiquities trade, HIST4805b

So I’m putting together the syllabus for my illicit antiquities seminar. This is where I think I’m going with the course, which starts in less than a month (eep!). The first part is an attempt to revitalize my classroom blogging, and to formally tie it into the discussion within the classroom – that is, something done in advance of class in order to make the classroom discussion richer. In the second term, I want to make as much time as possible for students to pursue their own independent research, which I’m framing as an ‘unessay’ following the O’Donnell model.

~oOo~

Daylight: The Journal of #HIST4805b Studying Looted Heritage

Rationale: What we are studying is important, and what we are learning needs to be disseminated as widely as possible. In a world where 'American Diggers' can be a TV show, where National Geographic (for heaven's sake!) can seriously contemplate putting on a show that desecrates war dead for entertainment, there is a need to shed daylight. The fall term's major assessment piece does this. You will be writing and curating a Flipboard magazine that ties our readings and discussions into the current news regarding heritage crime.

There are a number of steps to this.

  1. Each week, everyone logs into heritage.crowdmap.com and puts three new reports on the map.
  2. Each week, a different subset of the class will be the lead editors for our journal.
    1. lead editors each write an editorial that explores the issues raised in the readings, with specific reference to new reports on our crowdmap. Editorials should be 750-1000 words long.
    2. lead editors curate the Flipboard magazine so that it contains:
      1. the editorials
      2. the crowdmap reports
      3. the readings
  3. This should be completed before Monday’s class where we will discuss those readings. The lead editors will begin the class by discussing their edition of Daylight.*
  4. Each student will be a lead editor three times.

*if you can think of a better name, we’ll use that.

At the end of term you will nominate your two best pieces for grading. I will grade these for how you’ve framed your argument, for your use of evidence, and for your understanding of the issues. I will also take into account your in-class discussion of your edition of Daylight.

At the end of term you will also nominate two of your peers’ best pieces for consideration for bonus, with a single line explaining why.

This is worth 40% of your final grade.

—–

The Unessay Research Project

'Unessay', noun – as described by Daniel Paul O'Donnell,

"[…] the unessay is an assignment that attempts to undo the damage done by [traditional essay writing at the university level]. It works by throwing out all the rules you have learned about essay writing in the course of your primary, secondary, and post secondary education and asks you to focus instead solely on your intellectual interests and passions. In an unessay you choose your own topic, present it any way you please, and are evaluated on how compelling and effective you are."

Which means for us:

The second term is an opportunity for exploration, and for you to use the time that you would normally spend in a classroom listening as time for active planning, researching, and learning the necessary skills, to effectively craft an 'unessay' of original research on a topic connected with the illicit antiquities trade. I will put together a schedule of weekly one-on-one or small-group meetings where I can help you develop your project.

For this to work, you will have to come prepared to these meetings. This means keeping a research journal to which I will have access. You may choose to make this publicly accessible as well (and we’ll talk about why and how you might want to do that).  Periodically, we will meet as an entire class to discuss the issues we are having in our research. You will present your research formally to the class and invited visitors at the end of term – your project might not be finished at that point, but your presentation can take this into account. The project is due on the final day of term.

Grading:

Pass/Fail: Research Journal (ie, no complete research journal, no assessment for this project). We will discuss what is involved in a research journal. A Zotero library with notes would also be acceptable.

5% Presentation in class

45% Project

O'Donnell writes,

“If unessays can be about anything and there are no restrictions on format and presentation, how are they graded?

The main criteria is how well it all fits together. That is to say, how compelling and effective your work is.

An unessay is compelling when it shows some combination of the following:

  • it is as interesting as its topic and approach allows
  • it is as complete as its topic and approach allows (it doesn’t leave the audience thinking that important points are being skipped over or ignored)
  • it is truthful (any questions, evidence, conclusions, or arguments you raise are honestly and accurately presented)

In terms of presentation, an unessay is effective when it shows some combination of these attributes:

  • it is readable/watchable/listenable (i.e. the production values are appropriately high and the audience is not distracted by avoidable lapses in presentation)
  • it is appropriate (i.e. it uses a format and medium that suits its topic and approach)
  • it is attractive (i.e. it is presented in a way that leads the audience to trust the author and his or her arguments, examples, and conclusions).”

~oOo~

So that's what I'm going with. I'm not giving points out for participation, as that has never really worked for me. There will of course be much more going on in the classroom than just what is described here, including technical tutorials on various digital tools that I think are useful, and beta-testing some other things, but my thinking is that these will see their expression in the quality of the independent research that takes place in the Winter term.

So Fall term: much reading, much discussion. Winter term: self-direction along trajectories established in the Fall. We shall see.