
How Augmented Reality Works


Augmenting Our World
Mistry demonstrates SixthSense
Photo courtesy Sam Ogden, Pranav Mistry, MIT Media Lab

The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. That sounds simple enough. After all, haven't television networks been doing just that with graphics for decades? However, augmented reality is more advanced than any technology you've seen in television broadcasts, although some newer TV effects come close, such as RACEf/x and the superimposed first-down line on televised U.S. football games, both created by Sportvision. But these systems display graphics for only one point of view. Next-generation augmented-reality systems will display graphics for each viewer's perspective.

Some of the most exciting augmented-reality work is taking place in research labs at universities around the world. In February 2009, at the TED conference, Pattie Maes and Pranav Mistry presented their augmented-reality system, which they developed as part of MIT Media Lab's Fluid Interfaces Group. They call it SixthSense, and it relies on some basic components that are found in many augmented-reality systems:

- a camera
- a small projector
- a smartphone
- a mirror

These components are strung together in a lanyard-like apparatus that the user wears around the neck. The user also wears colored caps on four fingers; these caps are used to manipulate the images that the projector emits.

SixthSense is remarkable because it uses simple, off-the-shelf components that cost around $350. It is also notable because the projector essentially turns any surface into an interactive screen. The device works by using the camera and mirror to examine the surrounding world, feeding that image to the phone (which processes the image, gathers GPS coordinates and pulls data from the Internet), and then projecting information from the projector onto the surface in front of the user, whether it's a wrist, a wall or even a person. Because the user wears the camera on his chest, SixthSense augments whatever he looks at; for example, if he picks up a can of soup in a grocery store, SixthSense can find and project onto the can information about its ingredients, price, nutritional value -- even customer reviews.
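
SixthSense's own software isn't reproduced here, but the capture-process-project loop the paragraph describes can be sketched in a few lines. The sketch below uses Python with OpenCV; recognize_object and lookup_product_info are hypothetical stand-ins for the phone's image recognition and Internet lookup, not parts of the actual system.

```python
import cv2  # OpenCV, assumed here for camera capture and display


def recognize_object(frame):
    # Hypothetical placeholder: a real system would run an image
    # classifier here and return a label such as "tomato soup".
    return None


def lookup_product_info(label):
    # Hypothetical placeholder: a real system would query the
    # Internet for the product's price, ingredients and reviews.
    return "price and nutrition info would appear here"


camera = cv2.VideoCapture(0)  # chest-worn camera; 0 = default device

while True:
    ok, frame = camera.read()              # 1. camera (via the mirror) sees the scene
    if not ok:
        break
    label = recognize_object(frame)        # 2. phone processes the image...
    if label is not None:
        info = lookup_product_info(label)  # ...and pulls data from the Internet
        cv2.putText(frame, f"{label}: {info}", (20, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 255, 255), 2)
    cv2.imshow("projector", frame)         # 3. projector shows the augmented view
    if cv2.waitKey(1) == 27:               # press Esc to stop
        break

camera.release()
cv2.destroyAllWindows()
```

In the real device, of course, step 3 throws the image onto the soup can itself rather than onto a screen.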

By using his capped fingers -- Pattie Maes says even fingers with different colors of nail polish would work -- a user can perform actions on the projected information; those gestures are picked up by the camera and processed by the phone. If he wants to know more about that can of soup than is projected on it, he can use his fingers to interact with the projected image and learn about, say, competing brands. SixthSense can also recognize complex gestures -- draw a circle on your wrist and SixthSense projects a watch with the current time.
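
How might software spot those colored caps? A common computer-vision approach (not necessarily the exact one SixthSense uses) is to threshold each video frame for the marker's color and track the resulting blob from frame to frame. This minimal sketch in Python with OpenCV finds one cap per frame and applies a crude test for the circle gesture; the HSV color bounds and the tolerance are illustrative guesses, not measured values.

```python
import cv2
import numpy as np

# Illustrative HSV bounds for one colored finger cap (reddish);
# real values would be tuned to the marker and the lighting.
LOWER = np.array([0, 120, 120])
UPPER = np.array([10, 255, 255])


def find_fingertip(frame_bgr):
    """Return the (x, y) center of the colored cap in the frame, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)      # keep only marker-colored pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    cap = max(contours, key=cv2.contourArea)   # largest blob = the cap
    m = cv2.moments(cap)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))


def looks_like_circle(path):
    """Crude circle test: every tracked point lies at roughly the same
    distance from the centroid of the drawn path."""
    pts = np.asarray(path, dtype=float)
    if len(pts) < 10:                          # too few points to judge
        return False
    center = pts.mean(axis=0)
    radii = np.linalg.norm(pts - center, axis=1)
    return radii.std() < 0.2 * radii.mean()    # tolerance is a guess
```

If looks_like_circle fires on a recent run of fingertip positions, the system would respond by projecting the watch face, as described above.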

