How a Brain-Computer Interface Works

By: Ed Grabianowski

As the power of modern computers grows alongside our understanding of the human brain, we move ever closer to making some pretty spectacular science fiction into reality. It isn't about convenience — for severely disabled people, development of a brain-computer interface (BCI) could be the most important technological breakthrough in decades.

Imagine transmitting signals directly to someone's brain that would allow them to see, hear or feel specific sensory inputs. Consider the potential to manipulate computers or machinery with nothing more than a thought. In this article, we'll learn all about how BCIs work, their limitations and where they could be headed in the future.


The Electric Brain

Brain-computer interface systems are possible because of the way our brains function. Our brains are filled with neurons, individual nerve cells connected to one another by dendrites and axons. Every time we think, move, feel or remember something, our neurons are at work. That work is carried out by small electric signals that zip from neuron to neuron as fast as 250 mph [source: Walker]. The signals are generated by differences in electric potential carried by ions on the membrane of each neuron.

Although the paths the signals take are insulated by something called myelin, some of the electric signal escapes. Scientists can detect these signals, interpret what they mean and use them to direct an external device, like a computer or mobile device. It can also work the other way around. For example, researchers could figure out what signals are sent to the brain by the optic nerve when someone sees the color red. They could rig a camera that would send those exact signals into someone's brain whenever the camera saw red, allowing a blind person to "see" without eyes.


BCI Input and Output

An illustration explaining how a brain-computer interface works.
2007 HowStuffWorks

One of the biggest challenges facing brain-computer interface researchers today is the basic mechanics of the interface itself. The easiest and least invasive method is a set of electrodes — a device known as an electroencephalograph (EEG) — attached to the scalp. The electrodes can read brain signals. However, the skull blocks a lot of the electrical signal, and it distorts what does get through.

To get a higher-resolution signal, scientists can implant electrodes directly into the brain tissue itself, or on the surface of the brain, beneath the skull. This allows for much more direct reception of electric signals and allows electrode placement in the specific area of the brain where the appropriate signals are generated. This approach has many problems, however. It requires invasive surgery to implant the electrodes, and devices left in the brain long-term tend to cause the formation of scar tissue in the gray matter. This scar tissue ultimately blocks signals.


Regardless of the location of the electrodes, the basic mechanism is the same: The electrodes measure minute differences in the voltage between neurons. The signal is then amplified and filtered. In current BCI systems, it is then interpreted by a computer program, although you might be familiar with older analog electroencephalographs, which displayed the signals via pens that automatically wrote out the patterns on a continuous sheet of paper.
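As a rough illustration of the amplify-and-filter step, here is a minimal Python sketch (not the software of any particular BCI) that scales a simulated scalp recording and band-pass filters it with SciPy. The sampling rate, frequency band and fake signal are all illustrative assumptions.

# Minimal sketch of the "amplify and filter" step in a BCI pipeline.
# The sampling rate, frequency band and simulated signal are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                        # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)   # two seconds of data

# Simulated scalp recording: a weak 10 Hz rhythm buried in noise (volts).
eeg = 5e-6 * np.sin(2 * np.pi * 10 * t) + 20e-6 * np.random.randn(t.size)

# "Amplify" by converting to microvolts, then keep only the 8-12 Hz band.
gain = 1e6
b, a = butter(4, [8, 12], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, gain * eeg)

print("peak-to-peak after filtering: %.1f microvolts" % np.ptp(filtered))

Real systems add many more stages (notch filters for power-line hum, artifact removal, feature extraction), but the shape of the pipeline is the same: measure, amplify, filter, then interpret.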

In the case of a sensory input BCI, the function happens in reverse. A computer converts a signal, such as one from a video camera, into the voltages necessary to trigger neurons. The signals are sent to an implant in the proper area of the brain, and if everything works correctly, the neurons fire and the subject receives a visual image corresponding to what the camera sees.
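To make the reverse direction concrete, here is a hypothetical sketch of how software might reduce a camera frame to a coarse grid of stimulation amplitudes for a visual implant. The grid size, voltage range and mapping are assumptions chosen for illustration, not a description of any real device.

# Hypothetical sketch: reduce a camera frame to a small grid of
# stimulation amplitudes for a visual-cortex implant. The grid size and
# voltage range are illustrative assumptions.
import numpy as np

def frame_to_stimulation(frame, grid=(8, 8), max_voltage=1.0):
    """Average pixel brightness over each grid cell and scale to a voltage."""
    h, w = frame.shape
    gh, gw = grid
    amplitudes = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            block = frame[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            amplitudes[i, j] = block.mean() / 255.0 * max_voltage
    return amplitudes

# Fake 480x640 grayscale camera frame.
frame = np.random.randint(0, 256, size=(480, 640))
print(frame_to_stimulation(frame).round(2))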

Another way to measure brain activity is with magnetic resonance imaging (MRI). An MRI machine is a massive, complicated device. It produces very high-resolution images of brain activity, but it can't be used as part of a permanent or semipermanent BCI. Researchers use it to get benchmarks for certain brain functions or to map where in the brain electrodes should be placed to measure a specific function. For example, if researchers are attempting to implant electrodes that will allow someone to control external devices like a robotic arm with their thoughts, they might first put the subject into an MRI machine and ask the person to think about moving his or her actual arm. The MRI will show which area of the brain is active during arm movement, giving researchers a clearer target for electrode placement.

So, what are the real-life uses of a BCI? Read on to find out the possibilities.


BCI Applications

An illustration explaining how a brain-computer interface works.
2007 HowStuffWorks

One of the most exciting areas of BCI research is the development of devices that can be controlled by thoughts. Some of the applications of this technology may seem frivolous, such as the ability to control a video game by thought. If you think a remote control is convenient, imagine changing channels with your mind.

However, there's a bigger picture — devices that would allow severely disabled people to function independently. For those suffering from a spinal cord injury, something as basic as controlling a computer cursor via mental commands would represent a revolutionary improvement in quality of life. But how do we turn those tiny voltage measurements into the movement of a robotic arm?


Early research used monkeys with implanted electrodes. The monkeys used a joystick to control a robotic arm while scientists measured the signals coming from the electrodes. Eventually, they changed the controls so that the robotic arm was controlled only by the signals coming from the electrodes, not the joystick.

A more difficult task is interpreting the brain signals for movement in someone who can't physically move their own arm. With a task like that, the subject must "train" to use the device. With an EEG or implant in place, the subject would visualize closing his or her right hand. After many trials, the software can learn the signals associated with the thought of hand-closing. Software connected to a robotic hand is programmed to receive the "close hand" signal and interpret it to mean that the robotic hand should close. At that point, when the subject thinks about closing the hand, the signals are sent and the robotic hand closes.
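Here is a toy sketch of that training idea in Python, using scikit-learn's linear discriminant analysis to separate "rest" trials from imagined hand-closing trials. The band-power features and labels are simulated stand-ins for real recordings, so this only shows the shape of the approach.

# Toy sketch of BCI training: learn to tell "rest" trials apart from
# imagined hand-closing trials. The features are simulated band powers,
# not real recordings.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# 100 trials of each class, 4 features per trial (e.g., EEG band powers).
rest = rng.normal(loc=0.0, scale=1.0, size=(100, 4))
imagined_close = rng.normal(loc=1.0, scale=1.0, size=(100, 4))

X = np.vstack([rest, imagined_close])
y = np.array([0] * 100 + [1] * 100)   # 0 = rest, 1 = "close hand"

clf = LinearDiscriminantAnalysis().fit(X, y)

# A new trial arrives: if it looks like "close hand", command the robot hand.
new_trial = rng.normal(loc=1.0, scale=1.0, size=(1, 4))
if clf.predict(new_trial)[0] == 1:
    print("send CLOSE command to robotic hand")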

A similar method is used to manipulate a computer cursor, with the subject thinking about forward, left, right and back movements of the cursor. With enough practice, users can gain enough control over a cursor to draw a circle, access computer programs and control a TV [source: Ars Technica]. It could theoretically be expanded to allow users to "type" with their thoughts.
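Continuing that idea, here is a hypothetical sketch of the final step: turning a stream of decoded commands into cursor positions. The command names and step size are assumptions; the decoding itself would come from a trained model like the one sketched above.

# Hypothetical sketch: map decoded mental commands onto cursor movement.
# The command labels and step size are illustrative assumptions.
STEPS = {"forward": (0, 1), "back": (0, -1), "left": (-1, 0), "right": (1, 0)}

def move_cursor(position, command, step=5):
    dx, dy = STEPS.get(command, (0, 0))   # unrecognized commands do nothing
    return position[0] + dx * step, position[1] + dy * step

pos = (0, 0)
for decoded in ["forward", "forward", "right", "back"]:
    pos = move_cursor(pos, decoded)
print("cursor at", pos)   # prints: cursor at (5, 5)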

Once the basic mechanism of converting thoughts to computerized or robotic action is perfected, the potential uses for the technology are almost limitless. Instead of a robotic hand, disabled users could have robotic braces attached to their own limbs, allowing them to move and directly interact with the environment. This could even be accomplished without the "robotic" part of the device. Signals could be sent to the appropriate motor control nerves in the hands, bypassing a damaged section of the spinal cord and allowing actual movement of the subject's own hands.

Next, we'll learn about cochlear implants and artificial eye development.


Sensory Input

Dr. Peter Brunner demonstrates the brain-computer interface at a conference in Paris.
Stephane de Sakutin/AFP/Getty Images

The oldest and most common application of BCI technology is the cochlear implant. For the average person, sound waves enter the ear and pass through several tiny organs that eventually pass the vibrations on to the auditory nerves in the form of electric signals. If the mechanism of the ear is severely damaged, that person will be unable to hear anything. However, the auditory nerves may be functioning perfectly well. They just aren't receiving any signals.

A cochlear implant bypasses the nonfunctioning part of the ear, processes the sound waves into electric signals and passes them via electrodes right to the auditory nerves. The result: A previously deaf person can now hear. The hearing may not be perfect, but it is often good enough to follow a conversation.
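A simplified sketch of that idea: split incoming audio into a few frequency bands and use each band's loudness as the stimulation level for one electrode. The band edges, electrode count and test tone below are assumptions, not the parameters of any real implant.

# Simplified sketch of cochlear-implant-style processing: split audio into
# frequency bands and treat each band's energy as one electrode's
# stimulation level. Band edges and the test tone are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 16000
t = np.arange(0, 0.1, 1 / fs)
audio = np.sin(2 * np.pi * 440 * t)          # a 440 Hz test tone

bands = [(100, 500), (500, 1000), (1000, 2000), (2000, 4000)]  # 4 "electrodes"
levels = []
for low, high in bands:
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    band_signal = filtfilt(b, a, audio)
    levels.append(np.sqrt(np.mean(band_signal ** 2)))  # RMS energy per band

for i, level in enumerate(levels):
    print("electrode %d stimulation level: %.3f" % (i + 1, level))

Real implants use many more channels and carefully tuned compression, but the core idea is the same: sound frequencies go in, a pattern of electrode stimulation comes out.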


The processing of visual information by the brain is much more complex than that of audio information, so artificial eye development isn't as advanced. Still, the principle is the same. Electrodes are implanted in or near the visual cortex, the area of the brain that processes visual information from the retinas. A pair of glasses holding small cameras is connected to a computer and, in turn, to the implants. After a training period similar to the one used for remote thought-controlled movement, the subject can see.

Again, the vision isn't perfect, but refinements in technology have improved it tremendously since it was first attempted in the 1970s. Jens Naumann was the recipient of a second-generation implant. He was completely blind, but now he can navigate New York City's subways by himself and even drive a car around a parking lot [source: CBC News]. In terms of science fiction becoming reality, this process gets very close.

The terminals that connect the camera glasses to the electrodes in Naumann's brain are similar to those used to connect the VISOR (Visual Instrument and Sensory Organ Replacement) worn by blind engineering officer Geordi La Forge in the "Star Trek: The Next Generation" TV show and films, and the two are essentially the same technology. However, Naumann isn't able to "see" invisible portions of the electromagnetic spectrum.

Thought Control?

If we can send sensory signals to someone's brain, does that mean thought control is something we need to worry about? Probably not. Sending a relatively simple sensory signal is difficult enough. The signals necessary to cause someone to take a certain action involuntarily are far beyond current technology. Besides, would-be thought-controllers would need to kidnap you and implant electrodes in an extensive surgical procedure, something you'd likely notice.


BCI Drawbacks and Innovators

Two people in Germany use a brain-computer interface to write "how are you?"
Volker Hartmann/AFP/Getty Images

Although we already understand the basic principles behind BCIs, they don't work perfectly. There are several reasons for this.

  1. The brain is incredibly complex. To say that all thoughts or actions are the result of simple electric signals in the brain is a gross oversimplification. There are about 100 billion neurons in a human brain [source: Greenfield]. Each neuron is constantly sending and receiving signals through a complex web of connections. There are chemical processes involved as well, which EEGs can't pick up on.
  2. The signal is weak and prone to interference. EEGs measure tiny voltage potentials. Something as simple as the subject blinking can generate a much stronger signal; a simple artifact-rejection sketch follows this list. Refinements in EEGs and implants will probably overcome this problem to some extent in the future, but for now, reading brain signals is like listening to a bad phone connection. There's lots of static.
  3. The equipment is less than portable. It's far better than it used to be; early systems were hardwired to massive mainframe computers. But some BCIs still require a wired connection to the equipment, and those that are wireless require the subject to carry a computer that can weigh around 10 pounds. Like all technology, this will surely become lighter and fully wireless in the future.
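As promised above, here is a minimal sketch of one common way to deal with blink artifacts: throw away any segment whose voltage swing is too large to plausibly be brain activity. The threshold and the fake data are illustrative assumptions.

# Minimal sketch of blink-artifact rejection: discard any EEG segment whose
# peak-to-peak amplitude is implausibly large for brain activity.
# The threshold and the fake data are illustrative assumptions.
import numpy as np

def reject_blinks(segments, threshold_uv=100.0):
    """Keep only segments whose peak-to-peak amplitude stays under threshold."""
    return [s for s in segments if np.ptp(s) < threshold_uv]

rng = np.random.default_rng(1)
clean = rng.normal(0, 5, size=256)                                   # ~5 microvolt EEG noise
blink = clean + 300 * np.exp(-((np.arange(256) - 128) / 20.0) ** 2)  # large blink bump

kept = reject_blinks([clean, blink])
print("kept %d of 2 segments" % len(kept))   # the blink segment is dropped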

BCI Innovators

The field of brain-computer interfaces (BCIs) has seen significant advancements in recent years, with several companies emerging as key players. While many innovations are still in the research and development stages, there are notable strides towards commercial applications.


  • Neuralink, founded by Elon Musk, has become one of the most prominent companies in the BCI space. Neuralink is developing implantable devices designed to enhance communication between the brain and external devices, with the long-term goal of treating neurological disorders and enabling advanced human-computer interactions. Their recent progress includes successful human trials, with the potential to restore movement and communication abilities in people with severe disabilities [source: Leffer].
  • Synchron is another key player, working on a minimally invasive BCI that can be implanted via blood vessels. This technology is aimed at helping individuals with paralysis control digital devices using only their thoughts, offering a less invasive alternative to traditional brain implants.
  • Other companies, like CTRL-Labs (now part of Meta), are exploring non-invasive BCIs, focusing on translating neural signals into digital commands using wearable devices. These innovations represent a significant leap forward in making BCIs more accessible and practical for everyday use.

To learn more about brain-computer interfaces, take a look at the links at the end of this article.


A New Way to Stay Connected

The advancements in brain-computer interfaces (BCIs) are transforming what was once the realm of science fiction into tangible reality. These technologies hold the promise of not only improving the lives of those with severe disabilities but also redefining human-computer interaction.

As companies like Neuralink, Synchron, and CTRL-Labs push the boundaries of what is possible, we are witnessing the early stages of a revolution that could enable mind-controlled devices, restore lost senses, and even offer new ways to interact with the world. While there are still significant challenges to overcome, the progress being made today lays a strong foundation for a future where BCI technology could become an integral part of our daily lives.


Frequently Answered Questions

How does EEG BCI work?
EEG BCI works by detecting changes in brain activity and using them to control a computer or other device. EEG signals are recorded from the scalp and then converted into commands that can be used to control a cursor, type words, or move a robotic arm.
What is the BCI system?
The BCI system is a set of hardware and software tools that allow people to communicate with a computer using their thoughts.
How is a BCI implemented?
There are many ways to implement a BCI. One common way is to use electrodes to measure brain activity. The electrodes can be placed on the scalp, in the brain, or on the body. The data from the electrodes is then used to control a device, such as a computer or a prosthetic limb.
What are the three types of BCI?
The three types of BCI are:
  1. Invasive BCIs
  2. Partially invasive BCIs
  3. Non-invasive BCIs

Lots More Information


More Great Links

  • CBC News. "Out of the Dark." Jan. 5, 2003. http://www.cbc.ca/sunday/sight/index.html
  • Cheng, Jacqui. "Researchers help users control Second Life avatars via brain activity." Ars Technica, Oct. 15, 2007. http://arstechnica.com/news.ars/post/20071015-researchers-help-users-control-second-life-avatars-via-brain-activity.html
  • Cyberkinetics. "BrainGate Neural Interface System." http://www.cyberkineticsinc.com/content/medicalproducts/braingate.jsp
  • Greenfield, Susan A. "Brain Story: Unlocking Our Inner World of Emotions, Memories, Ideas and Desires." DK Adult, 2001.
  • McKee, Maggie. "NASA develops 'mind-reading' system." New Scientist, March 2004. http://www.newscientist.com/article/dn4795-nasa-develops-mindreading-system.html
  • Neural Signals. "Speech Restoration Project." http://www.neuralsignals.com/movementrestoration.htm
  • Pollack, Peter. "Brain control gives hope to the paralyzed." Ars Technica, July 13, 2006. http://arstechnica.com/news.ars/post/20060713-7262.html
  • Walker, Richard. "Secret Worlds: Brain." DK Children, 2002.

