Seeing Is Believing -- Even for Smart Computers

Can a computer really recreate what you're seeing?

In some experiments, the answer is a resounding "Yes." By recording a participant's brain activity while he or she watches a movie, for instance, scientists can see how the brain reacts when the viewer sees certain objects. One group of researchers, led by University of California, Berkeley neuroscientist Jack Gallant, built a computer system that could accurately guess what a person was watching just by analyzing that person's fMRI data [source: Nishimoto et al.].

The experiment worked like this: Scientists first needed to find out how a person's brain reacts to viewing a variety of different things, so they asked three participants to watch hours of videos while hooked up to an fMRI machine. The videos, dubbed natural movies, were a randomly selected collection of video clips from the Internet. While a person viewed the clips, the computer built a library of data on how that person's brain reacted to seeing people and objects on a screen.

After creating this database, researchers used it as a reference in the next portion of the experiment. Playing the recordings back reveals which objects, colors and shapes make parts of the brain light up on the fMRI. With that knowledge, it's possible to work in reverse, using the brain data alone to guess what a person just saw.

Here's where the library of data came into play. In the next step, the same person watched a video while hooked up to the machine, but this time it was one that neither the participant nor the computer had "seen" before. As the person watched, the computer compared his brain's reaction to the reactions already stored in its system. Then, it strung together pixels from the previously viewed videos to reconstruct, in real time, its own account of what the participant was seeing.
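The matching-and-blending idea above can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration -- not the Gallant lab's actual method, which used far more sophisticated statistical models. It assumes each stored clip is summarized by one brain-response vector and one representative frame, and it reconstructs a new view by averaging the frames of the closest matches:

```python
import numpy as np

def reconstruct(new_response, library_responses, library_frames, k=3):
    """Toy reconstruction: average the frames of the k stored clips
    whose recorded brain responses best match the new response."""
    # Cosine similarity between the new fMRI response and each stored one
    norms = np.linalg.norm(library_responses, axis=1) * np.linalg.norm(new_response)
    sims = library_responses @ new_response / norms
    best = np.argsort(sims)[-k:]              # indices of the k closest matches
    return library_frames[best].mean(axis=0)  # blended "guess" at the new frame

# Tiny demo with made-up data: 5 stored clips, 8-voxel responses, 4x4 frames
rng = np.random.default_rng(0)
responses = rng.normal(size=(5, 8))   # one fMRI response vector per clip
frames = rng.uniform(size=(5, 4, 4))  # one representative frame per clip

# A slightly noisy copy of clip 2's response should retrieve clip 2's frame
guess = reconstruct(responses[2] + 0.01 * rng.normal(size=8),
                    responses, frames, k=1)
```

The blending step is why the reconstructions look blurry: averaging several near-matches preserves the rough shape and position of what was seen but washes out fine detail.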

The system accurately recreated basic videos of what participants were seeing 90 percent of the time [source: Nishimoto et al.]. (Visit Gallant's lab Web site to compare the clips the participants watched with the computer's reconstructions.)

Think of the system as working much the same way as learning a foreign language. Learners first expose themselves to many letters, words and phrases. Eventually, they stop consulting the translated version and find meaning by using what they already know.

Still, "seeing" like humans is a far stretch from determining exactly what a person is thinking. In this instance, the computer created a crude picture of what a person was viewing -- not an exact image. In addition, it's not possible at this time to know how viewing something makes a person feel, or whether it conjures up other memories or feelings.

Up next: the history and future frontiers of mind-reading computers.