Scientists have long challenged the brain's reputation as a black box, especially over the past 150 years. Since researchers discovered in the 1890s that brain activity could be measured through blood flow, a wave of studies has contributed to our knowledge of localization, the practice of tying specific functions and emotions to particular areas of the brain [source: Roy & Sherrington].
But it takes a lot of work to develop the high-tech computers capable of interpreting what the brain sees.
Researchers have looked at how neurons work in nonhuman mammals, including rodents and primates [source: Wagner]. Computer modeling and engineering have also helped scientists develop ways to predict trends and translate the brain's reactions into usable data.
Previous fMRI studies have shown how the brain's nerve cells react to seeing objects and people [source: Haxby et al.]. Other experiments have examined how the brain responds to colored stimuli and have even decoded parts of speech, including nouns and verbs [sources: Bohannon, Mitchell et al.].
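Decoding experiments like these generally work by training a classifier on recorded activity patterns, then asking it to guess which stimulus produced a new pattern. As a loose illustration only, with made-up numbers rather than real fMRI recordings, and with the "face"/"house" labels, voxel counts, and noise levels chosen purely for demonstration, a minimal nearest-centroid decoder might look like this:

```python
import numpy as np

rng = np.random.default_rng(42)
n_voxels = 50  # a hypothetical handful of voxels; real scans have far more

# Hypothetical "templates": the average voxel activation each stimulus evokes.
templates = {"face": rng.standard_normal(n_voxels),
             "house": rng.standard_normal(n_voxels)}

def simulate_scan(label, noise=0.8):
    """One simulated fMRI pattern: the class template plus measurement noise."""
    return templates[label] + noise * rng.standard_normal(n_voxels)

# "Train" the decoder: average 20 noisy scans per class into a centroid.
centroids = {lbl: np.mean([simulate_scan(lbl) for _ in range(20)], axis=0)
             for lbl in templates}

def decode(pattern):
    """Guess the stimulus by finding the closest learned centroid."""
    return min(centroids, key=lambda lbl: np.linalg.norm(pattern - centroids[lbl]))

# Score the decoder on 50 fresh simulated scans per class.
correct = sum(decode(simulate_scan(lbl)) == lbl
              for lbl in templates for _ in range(50))
print(f"decoded {correct}/100 test scans correctly")
```

The point of the sketch is the workflow, not the numbers: real studies use many thousands of voxels, careful preprocessing, and stronger classifiers, but the train-on-patterns, guess-the-stimulus loop is the same.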
Other "mind-reading" computers rely on electroencephalogram (EEG) technology, in which electrodes attached to a person's scalp pick up faint electrical activity inside the brain. In some cases, EEG experiments have found ways to let people control a computer with their thoughts alone [source: O'Brien & Baime]. Patients with health conditions that make speaking or moving difficult could benefit from these developments.
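EEG-based control systems often key off the strength of particular frequency rhythms in the scalp signal. As a rough sketch with synthetic data, where the 8-12 Hz "alpha" band and the threshold idea are illustrative assumptions rather than a description of any specific study, a computer might watch for a rise in band power like this:

```python
import numpy as np

def alpha_band_power(signal, fs):
    """Estimate spectral power in the 8-12 Hz alpha band of an EEG trace."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 8) & (freqs <= 12)
    return power[band].sum()

# Two seconds of synthetic "EEG": a 10 Hz alpha rhythm buried in noise.
fs = 256                      # sampling rate in Hz (a common EEG value)
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# A toy brain-computer interface rule: trigger a "command" only when
# alpha power is high, e.g. when the user deliberately relaxes.
print("alpha power:", round(alpha_band_power(eeg, fs)))
```

Real systems layer artifact rejection, calibration, and machine learning on top, but the core idea of translating a measurable brain rhythm into a yes/no control signal is much like this.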
Then there are the computers programmed to pick up on human social cues. In a general sense, these robots infer what a person might be seeing or thinking by analyzing body language or tone of voice [source: MIT].
With time, computers' ability to recreate the inner workings of the human mind might become more refined. But overall, your 3-pound (1.3-kilogram) organ largely remains one of nature's greatest mysteries.
Head over to the next page for more on the brain-computer pairing.