The computer's developers think their mind-reading program could run off of a webcam.


June 26, 2006

If scientists and inventor types have their way, we could be powerless in the face of computers one day. Forget about the idea of robot super-smarts and super-strength for a minute: a team of developers at the University of Cambridge is working on a computer that can read minds.

With a special camera, the computer observes a person's emotions through facial expressions. It does this by tracking what the team calls "feature points" -- prominent features of the face -- and measuring movements of the head and facial features. In total, the computer watches 24 feature points and looks for 20 distinct movements, and it works regardless of what's on a person's face: even users who wear glasses or have beards are accounted for. But because facial expressions vary from person to person, and because everyone expresses emotions differently, the team still has some work to do.
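The feature-point idea described above can be sketched in a few lines: track landmark positions frame to frame, name the movements their displacements imply, and map combinations of movements to an emotion. Everything below -- the landmark names, thresholds, and the tiny movement-to-emotion table -- is an illustrative assumption, not the Cambridge team's actual code.

```python
from math import dist

# The real system tracks 24 feature points; this sketch uses four
# (two lip corners and two inner brow points), with screen-style
# coordinates where a smaller y value means higher on the face.
FeaturePoints = dict[str, tuple[float, float]]

def detect_movements(prev: FeaturePoints, curr: FeaturePoints,
                     threshold: float = 2.0) -> set[str]:
    """Name the movements implied by landmark displacement between frames."""
    movements = set()
    # Both lip corners moved up by more than the threshold -> smile-like pull.
    if (curr["lip_corner_left"][1] < prev["lip_corner_left"][1] - threshold and
            curr["lip_corner_right"][1] < prev["lip_corner_right"][1] - threshold):
        movements.add("lip_corner_pull")
    # Inner brows moved closer together -> furrowed brow.
    if (dist(curr["brow_inner_left"], curr["brow_inner_right"]) <
            dist(prev["brow_inner_left"], prev["brow_inner_right"]) - threshold):
        movements.add("brow_lower")
    return movements

# Toy mapping from movement combinations to emotions (the real system
# recognizes about 20 movements and far richer combinations).
EMOTION_RULES = {
    frozenset({"lip_corner_pull"}): "joy",
    frozenset({"brow_lower"}): "confusion",
}

def infer_emotion(movements: set[str]) -> str:
    return EMOTION_RULES.get(frozenset(movements), "neutral")
```

In this toy version, per-person variation shows up immediately: a fixed threshold that catches one person's smile may miss another's, which is exactly the calibration problem the team describes.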

Currently, the "mind-reading" computer is on display at the Royal Society Summer Science Exhibition in London, where the developers are using the exhibition's visitors to build up its database. To read emotions accurately, the computer needs thousands of samples. So far, the database is full of actors' faces doing their best to express joy, sorrow, anger, confusion and other emotions.
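Why do thousands of samples help? One simple way to see it: if each recorded face is stored as a feature vector labeled with an emotion, a new face can be classified by majority vote among its nearest stored examples, and the vote gets more reliable as the database grows. The nearest-neighbor approach and the feature encoding here are illustrative assumptions, not the exhibit's actual method.

```python
from collections import Counter
from math import dist

def classify(sample, database, k=3):
    """Label a new face by majority vote among its k nearest stored samples.

    database: list of (feature_vector, emotion_label) pairs, where each
    feature vector is a tuple of numbers derived from the face.
    """
    nearest = sorted(database, key=lambda entry: dist(sample, entry[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

With only a handful of actors' faces per emotion, one unusual example can swing the vote; with thousands of visitors' faces, individual quirks average out.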

The group sees several practical applications for the setup, including customized advertising. In this scenario, a computer-camera combo -- maybe even a home webcam -- could tailor advertisements specifically to your mood, which is both fascinating and potentially scary. Sure, someone with an angry face could be shown anger-management books, but couldn't others abuse the technology to advertise less desirable material? It's certainly feasible, but perhaps the better question is: Who really wants more advertisements, mood-specific or not?

Other potential uses include aiding people with autism and Asperger's syndrome, who typically have a difficult time interpreting others' facial expressions. In conjunction with MIT, the developers are working on a special headset version of the system that would communicate the emotions of others to its wearer. They also plan a version for cars that could detect when a driver is tired or confused, helping keep drivers safer on the road. How a car would act on that information hasn't been determined yet, though the team is working with a major (unnamed) car manufacturer that expects the system to be in cars as soon as 2012.