Reading & Experiencing Space

When I was an MA student at Reading University (studying the City of Rome), some of the texts that we used were Diane Favro’s The Urban Image of Augustan Rome, Joseph Rykwert’s The Idea of a Town, and Kevin Lynch’s The Image of the City. One of the things I took away from that experience was about reading and experiencing space, and the messages encoded in the urban fabric. One of the things that has always bugged me about reconstructions – whether written or drawn or CAD – is that, no matter how good they are, the experience and the reading is always at one or two steps’ remove.

Today, I was briefly at Vassar’s Sistine Chapel in Second Life. This, it strikes me, is an excellent example of how archaeological space and its reconstructions might profitably be done. For starters, the simulation takes control of your avatar so that you must read the code of conduct. If you do not agree to behave inside, the simulation dumps you elsewhere (how I wish the real Sistine Chapel had such a device!). Once inside, the space surrounds you. Because you can fly, you too can play the artist and poke your nose against the paint. Moreover, every fresco, every scene seems thick with information. Touching one presents you with a notecard with all of the information you could want about that particular scene.

The experience was tranquil; it was like stumbling into one of those churches in the back alleys of Rome that the tourists always miss… which I suppose raises some questions about whether spiritual experiences are possible in Second Life. But I digress.

I would love to see a reconstructed and annotated Pantheon; or a walk through the Campus Martius teeming with attached information: hyperlinked, experienced virtual space… Imagine publishing your excavation in Second Life, a reconstruction where different layers could be peeled away, or ‘wall54’ tagged with all of the associated info, perhaps via a link to an online database. Publishing through Experiencing…
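To make the ‘wall54’ idea a little more concrete: the tag attached to an object in the virtual reconstruction would simply be a key into the excavation’s database, so that touching the wall in-world pulls up its record. A minimal sketch in Python, assuming a purely hypothetical finds table (the field names and the ‘wall54’ record are invented for illustration):

```python
import sqlite3

# Hypothetical excavation database: each tagged object in the virtual
# model (e.g. 'wall54') is the primary key of a record of excavation info.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE features (
    tag TEXT PRIMARY KEY,   -- identifier attached to the model object
    description TEXT,       -- excavator's notes
    context TEXT            -- stratigraphic phase/context
)""")
conn.execute(
    "INSERT INTO features VALUES (?, ?, ?)",
    ("wall54", "Opus reticulatum wall, east range", "Phase II"),
)

def lookup(tag):
    """Return the record for a tagged object -- the text a notecard might show."""
    row = conn.execute(
        "SELECT tag, description, context FROM features WHERE tag = ?",
        (tag,),
    ).fetchone()
    return dict(zip(("tag", "description", "context"), row)) if row else None

print(lookup("wall54"))
```

In practice the in-world script would make this lookup over the network against the project’s online database rather than a local file, but the principle is the same: the reconstruction becomes a spatial front end to the site archive.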


One Comment

  1. […] He was showing me a plugin that they’ve developed for exporting AutoCad models into the Unreal2 engine, and then scaling the textures back onto the model (usually, one would use something like 3d Studio Max or Maya to import models into Unreal2). From an archaeological point of view, archaeologists have been using AutoCad for years to create reconstructions of sites. To get those models into a world engine would usually involve all sorts of translations, but if you could import directly from your existing archaeological AutoCad model… you’d suddenly be able to experience the space that you’ve recreated. A 3d picture is still just a picture. Experiencing the space makes – as it were – a world of difference. Read Diane Favro or Kevin Lynch for a start on the importance of experiencing space. […]
