Can a computer recreate what you're seeing?

Pictured are nine different MRI scans from a woman's brain. MRI technology has been critical to understanding how the human mind works.
Don Farrall/Digital Vision/Getty Images

Conscious of it or not, you feed your brain a constant stream of information to interpret. Whether you're being blasted with color, action and sound while watching a movie or you're stooping to pet a shaggy dog, your reality depends on interpreting bits and pieces of information from your surrounding environment.

But are human and animal brains the only systems able to interpret the world in such a way? Can computers recreate what a person thinks as well?

So far, neuroscience shows that some computers can "read" our minds, so to speak. It's a far cry from the fictional mind-reading technology of the blockbuster "Minority Report," which predicts and deters crimes before they happen, but real science is getting closer to playing back what the human mind sees.

Understanding how our brains work, neuron by neuron, surely beats the guesswork of the sci-fi world. But to build a computer capable of such a feat, researchers first have to know how the organ works at that level of detail.

Today, scientists use multiple tools to design a computer with the potential for interpreting the world the same way the brain does. For starters, functional MRI (fMRI) machines, or devices that use large magnets to measure blood flow and oxygen levels in the brain, play a big role in most neuroscience studies. Measuring brain activity with electroencephalogram (EEG) machines also gives scientists the opportunity to link firing neurons to specific activities. But there's also the question of how to interpret the data. Computer engineers have filled the gap by developing highly accurate methods to represent brain activity on a computer screen, even in three dimensions. Statistical models also allow scientists to infer or predict how the brain might react based on previous data sets.
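
To get a feel for the "statistical model" idea, here's a toy sketch. This isn't any lab's actual pipeline, just a minimal illustration using made-up random numbers: fit a linear model that predicts a brain measurement from features of a stimulus, then use the fitted model to predict the response to a stimulus it has never seen.

```python
# Toy illustration of fitting a predictive model of brain activity.
# All data here is random stand-in data, not real recordings.
import numpy as np

rng = np.random.default_rng(1)

n_samples, n_features = 200, 8
stimulus = rng.normal(size=(n_samples, n_features))   # e.g. visual features of clips
true_weights = rng.normal(size=n_features)            # hidden "ground truth"
response = stimulus @ true_weights + 0.1 * rng.normal(size=n_samples)  # noisy signal

# Ordinary least squares: find the weights that best explain the recordings
weights, *_ = np.linalg.lstsq(stimulus, response, rcond=None)

# Predict the brain's response to a stimulus it was never fitted on
new_stimulus = rng.normal(size=n_features)
predicted = new_stimulus @ weights
```

With enough recordings relative to the noise, the fitted weights land close to the hidden ones, which is what lets the model generalize to stimuli it hasn't seen.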

One model uses neuroscience and computers to recreate what humans see in real time. Read on to learn whether mind-reading bots are a thing of the near future or the stuff of fantasy.

Seeing Is Believing -- Even for Smart Computers

Can a computer really recreate what you're seeing?

In some experiments, the answer is a resounding "Yes." By recording brain activity while a person watches a movie, for instance, scientists can see how the organ reacts to certain objects on-screen. One group of researchers, led by University of California Berkeley neuroscientist Jack Gallant, created a special computer that could accurately guess what a person was watching just by analyzing that person's fMRI data [source: Nishimoto et al.].

The experiment worked like this: Scientists first needed to find out how a person's brain reacts to viewing a variety of different things, so they asked the three participants to watch hours of videos while hooked up to an fMRI machine. The videos, dubbed natural movies, were a randomly selected collection of video clips from the Internet. While the person viewed the clips, the computer built a library of data on how that person's brain reacted to seeing people and objects on a screen.

After creating a database to draw from, researchers used the information as a reference in the next portion of the experiment. If you were to play these recordings back, you could piece together which objects, colors and shapes make parts of the brain light up on the fMRI. With this knowledge, it's possible to work in reverse by using the brain data alone to guess what a person just saw.

Here's where the library of data came into play. In the next step, the same person watched a video while hooked up to the machine, but this time it was one that neither the participant nor the computer had "seen" before. The computer compared the person's brain activity to the reactions already stored in its system. Then, it strung together pixels from previous videos to create its own account of what the participant was seeing in real time.
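
The matching step described above can be sketched in a few lines of toy code. This is an illustration of the general library-matching idea, not the researchers' actual software, and all the data here is random stand-in data: stored pairs of (brain response, video frame), with a new frame reconstructed by blending the frames whose stored responses best match a new brain response.

```python
# Toy sketch of reconstructing an image from brain data via a library
# of past (response, frame) pairs. Random stand-in data throughout.
import numpy as np

rng = np.random.default_rng(0)

n_clips, n_voxels, n_pixels = 500, 50, 64
library_responses = rng.normal(size=(n_clips, n_voxels))  # fMRI-like vectors
library_frames = rng.uniform(size=(n_clips, n_pixels))    # pixel vectors

def reconstruct(new_response, k=10):
    """Blend the k frames whose stored responses correlate best
    with the new brain response."""
    # Correlate the new response with every stored response
    lib = library_responses - library_responses.mean(axis=1, keepdims=True)
    lib /= np.linalg.norm(lib, axis=1, keepdims=True)
    v = new_response - new_response.mean()
    v /= np.linalg.norm(v)
    scores = lib @ v
    top = np.argsort(scores)[-k:]               # indices of the k best matches
    weights = scores[top] / scores[top].sum()   # weight frames by similarity
    return weights @ library_frames[top]

# A brain response identical to a stored one should pull back that clip's frame
guess = reconstruct(library_responses[42], k=1)
```

Because the reconstruction is a blend of previously seen frames, it can only ever be as good as the library it draws from, which is one reason the participants had to watch hours of video first.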

The system accurately recreated basic videos of what participants were seeing 90 percent of the time [source: Nishimoto et al.]. (Visit Gallant's lab Web site to compare the clips the participants watched with the computer's reconstructions.)

Think of the system as working much like learning a foreign language. Learners first expose themselves to many letters, words and phrases. Eventually, they stop relying on the translated version and find meaning using what they already know.

Still, "seeing" like humans is a far cry from determining exactly what a person is thinking. In this instance, the computer created a crude picture of what a person was viewing -- not an exact image. In addition, it's not possible at this time to know how viewing something makes a person feel, or whether it conjures up other memories or associations.

Up next: the history and future frontiers of mind-reading computers.

Reading Minds: Real Science Beats Sci-Fi

Scientists have long challenged the brain's reputation as a black box, especially over the past 150 years. Since scientists discovered in the 1890s that brain activity could be measured by blood flow, a wave of studies has contributed to our knowledge of localizing activity, or tying certain activities and emotions to areas of the brain [source: Roy & Sherrington].

But it takes a lot of work to develop the high-tech computers capable of interpreting what the brain sees.

Researchers have looked at how neurons work in nonhuman mammals, including rodents and primates [source: Wagner]. Computer modeling and engineering have also helped scientists develop ways to predict trends and translate the brain's reactions into usable data.

Previous studies with fMRI have shown how the brain's nerve cells react to seeing objects and people [source: Haxby et al.]. Other experiments have examined how the organ reacts to colored stimuli and even to visualized parts of speech, including nouns and verbs [sources: Bohannon, Mitchell et al.].

Other "mind-reading" computers rely on EEG technology, in which electrodes attached to a person's scalp pick up faint electrical activity inside the brain. In some cases, EEG experiments with computers have found ways to let people control a computer with their brains alone [source: O'Brien & Baime]. Patients with health conditions that make speaking or moving difficult could benefit from these developments.

Then there are computers programmed to pick up on human social cues. In a general sense, these robots infer what a person might be seeing or thinking by analyzing body language or tone of voice [source: MIT].

With time, computers' ability to recreate the inner workings of the human mind might become more refined. But overall, your 3-pound (1.3-kilogram) organ largely remains one of nature's greatest mysteries.

Head over to the next page for more on the brain-computer pairing.

Sources

  • Bohannon, John. "A New Angle on Mind Reading." ScienceNOW. April 25, 2005. (Oct. 13, 2011) http://news.sciencemag.org/sciencenow/2005/04/25-02.html
  • Chudler, Eric. "The Hows, Whats and Whos of Neuroscience." Neuroscience for Kids. (Oct. 22, 2011) http://faculty.washington.edu/chudler/what.html
  • Haxby, James, Gobbini, M., Furey, Maura, Ishai, Alumit, Schouten, Jennifer, & Pietrini, Pietro. "Distributed and Overlapping Representations of Faces and Objects in Ventral Temporal Cortex." Science. Vol. 293, no. 5539. 2001. (Oct. 12, 2011) http://www.sciencemag.org/content/293/5539/2425.abstract
  • Nishimoto, Shinji, Vu, An, Naselaris, Thomas, Benjamini, Yuval, Yu, Bin, & Gallant, Jack. "Reconstructing Visual Experiences from Brain Activity Evoked by Natural Movies." Current Biology. Vol. 21, no. 19. 2011. (Oct. 8, 2011) http://www.cell.com/current-biology/abstract/S0960-9822%2811%2900937-7
  • Massachusetts Institute of Technology. "Personal Robots Group: Leonardo." MIT Media Lab. (Oct. 13, 2011) http://robotic.media.mit.edu/projects/robots/leonardo/overview/overview.html
  • MedlinePlus. "EEG." U.S. National Library of Medicine. Oct. 19, 2011. (Oct. 22, 2011) http://www.nlm.nih.gov/medlineplus/ency/article/003931.htm
  • Mitchell, Tom, Shinkareva, Svetlana, Carlson, Andrew, Chang, Kai-Min, Malave, Vicente, Mason, Robert, & Just, Marcel. "Predicting Human Brain Activity Associated with the Meanings of Nouns." Science. Vol. 320, p. 1191. 2008. (Oct. 13, 2011) http://www.cs.cmu.edu/~tom/pubs/science2008.pdf
  • O'Brien, Miles & Baime, Jon. "Mind Reading Computer System May Help People with Locked-in Syndrome." Science Nation, National Science Foundation. Oct. 17, 2011. (Oct. 18, 2011) http://www.nsf.gov/news/special_reports/science_nation/brainmachine.jsp?WT.mc_id=USNSF_51
  • Randerson, James. "Scary or Sensational? A Machine that can Look into the Mind." The Guardian. March 5, 2008. (Oct. 12, 2011) http://www.guardian.co.uk/science/2008/mar/06/medicalresearch
  • Roy, C.S. & Sherrington, C.S. "On the Regulation of the Blood-supply of the Brain." Journal of Physiology. Vol. 11, no. 1-2. 1890. (Oct. 28, 2011). http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1514242/
  • Wagner, Holly. "MRI Technique Lets Researchers Directly Compare Similarities, Differences Between Monkey and Human Brain." Ohio State Research Communications. Oct. 17, 2002. (Oct. 12, 2011) http://researchnews.osu.edu/archive/monkysci.htm