MacUser 28th April 2000
Ready to Wear
Winter came back, my email backlog is running at over 1000
messages, I've written one small, inadequate paragraph of my Great Work in three days - a
production rate shameful to anybody ever paid by the word. And the camellia on my balcony
has decided to go into full atrophy rather than full bloom. Things are not well with the
world. Somebody should change it.
Enter, somewhat surprisingly, by way of email number 257, a notice about an outfit at Loughborough University which calls itself the Advanced VR Research Centre, and is planning a conference with the intriguing title: Augmenting the Real World: Augmented Reality and Wearable Computing.
Thinking that whatever this augmented reality is, right now I really, truly, need some - even though I'm less sure about wearing it (being allergic to nickel) - I'm off to check out the site sooner than you can say, 'obviously this is all going to be highly speculative and rather cognitive'. Which, of course, it is. And it isn't.
Augmented reality is the term used to describe a specific kind of virtual reality: the overlaying of virtual objects on the real world. Instead of retreating into a closed simulation (VR), or moving from screen projection to the real world and back (standard desk-based computing) to relate what's on screen with what you see in front of you, you do something different. Augmented reality entails placing the virtual projection over the real, to directly enhance it. Think of a map laid out on a landscape, a map the size of the landscape, which doesn't replace it but augments it, and you get the picture.
Wearable computing is a concept intimately linked to AR systems. Wearable computers would allow systems capable of delivering augmented reality to move with the user. These systems would interface with the world (augment it) as the user moves through it - in a sense they wrap themselves over it. The result is peculiarly seamless, but in a new way: it has less to do with immersion and more to do with the production of a kind of continuous commentary, a new semi-transparent layer which unfurls itself with the user. If this technology takes off, perhaps you'll never need to see the world in an un-augmented way - or not so long as you're wearing your (Bristol University-produced) CyberJacket. That, at least, is the idea.
Augmented reality is now being researched, and questions of interface design, basic technologies, and even such apparently arcane matters as lighting are being worked on. Clearly, effective modes of use will have to move on from screen-based paradigms; as the introduction to one paper to be presented at Loughborough suggests, anyone who has used a wearable computer in anger will realise that user interfaces based around the desktop metaphor don't work - interfacing with the real world using a keyboard, for instance, obviously isn't ideal. Researchers are looking at producing new kinds of multi-modal interfaces, offering many different ways to interact with the machine.
Most of this is yet to come. Which means augmented reality is rarely seen outside the lab or specific test conditions, and wearable computing systems can't be bought off the shelf. Given this, projections about the eventual commercial uses for the technology remain speculative - users are famous for finding new and unexpected uses for technologies originally designed for something else. Which doesn't mean there aren't researchers testing out possible use scenarios: ideas include systems for archaeological sites and designs for art galleries, for instance. Futuristic it may be, but in a sense we're already halfway there.
Augmented reality, understood as a technology which allows you to make sense of the world, makes more sense to me as the logical follow-on from screen-based computing than virtual reality - which, laden with all the paraphernalia required to induce a sense of immersion, a sense precisely of being cut off from the world, seems peculiarly cumbersome. It's AR rather than VR which is already presaged by a series of current developments and trends: the growth of mobile computing, mobile telephony and Internet access by phone on the one hand, and the increasing number of tasks we complete by way of the screen on the other. We already tend to weave between the world on the screen and the world we see around us. Laying one over the other is only the next step in the process.