I’m a hard-line skeptic when it comes to the topic of ESP (extra-sensory perception). I don’t believe in telepathy, precognition, telekinesis, or people bending flatware just by looking at it.
That said, I’m pretty confident that in the near future mind-reading will be possible. Not for us, though: for our machines.
In fact, machines can already read our minds, to a limited extent.
Just recently, Ambient Corporation demonstrated a neckband that translates thought into speech…sort of.
It takes some training to use, and requires “a level above thinking,” according to Michael Callahan, who invented the Audeo with fellow University of Illinois at Urbana-Champaign researcher Thomas Coleman and co-founded the Ambient Corporation with him.
Rather than broadcasting a person’s thoughts, it picks up on nerve signals deliberately, but soundlessly, sent to the vocal cords, and relays those signals wirelessly to a computer, which then converts them into words spoken by a computerized voice.
The current system recognizes only about 150 words and phrases, but an improved version, due out by the end of the year, won’t have a vocabulary limit: instead of recognizing specific words and phrases, it will identify the distinct bits of sound, called phonemes, that we use to construct complete words.
A person wearing a neckband can still talk normally: the system can differentiate between when the wearer wants to talk silently and when he wants to talk out loud.
There’s more use for something like this than simply allowing people to make silent phone calls while sitting in a meeting. The phoneme-based version, although it will be slower because the user will have to build words a phoneme at a time, will be aimed at people who have lost the ability to speak due to neurological disease or injury.
That’s not the only way the Audeo technology can help people with handicaps. Just last fall, Callahan and Coleman demonstrated the use of the neckband to guide a motorized wheelchair. It could also be used to allow people with serious muscle control problems to operate a computer or other equipment.
Even though I’m not telepathic, I can sense what you’re thinking at this point: “That’s not mind-reading. They’re picking up those signals from well outside the brain.”
How about this, then? Researchers at the University of California, Berkeley, have developed a system that uses functional MRI data to decode information from the visual cortex. They first measured visual cortex activity in people looking at more than a thousand photographs. Using that data, they were able to program a computer to understand how each person’s visual cortex processes information.
When the participants were then shown a random set of more than 100 previously unseen photographs, the researchers were able to accurately identify which image was being viewed just by looking at the brain scans.
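For the curious, the identification step can be sketched in a few lines of code. This is a toy illustration of the general idea, not the Berkeley group’s actual method: the “encoding model,” the random image features, and all the numbers here are invented assumptions, but the logic is the same, predict the brain response each candidate image should produce, then pick the candidate whose prediction best matches the measured scan.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented sizes for the sketch: voxels measured, image features, candidates.
n_voxels, n_features, n_candidates = 50, 20, 100

# Stand-in for a per-person encoding model learned from the ~1,000 training
# photos: here, just a random linear map from image features to voxel activity.
encoding_model = rng.normal(size=(n_features, n_voxels))

# Feature vectors for the 100 previously unseen candidate photographs.
candidates = rng.normal(size=(n_candidates, n_features))

# The participant views candidate #42; we "measure" a noisy brain response.
viewed = 42
measured = candidates[viewed] @ encoding_model + 0.1 * rng.normal(size=n_voxels)

# Predict the response for every candidate and pick the closest match.
predicted = candidates @ encoding_model
best = int(np.argmin(np.linalg.norm(predicted - measured, axis=1)))
print(best)  # correctly identifies image 42
```

Even with noise added to the simulated scan, the nearest prediction wins, which is why the real system could pick the right photo out of more than 100.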
New Scientist magazine quotes John-Dylan Haynes of the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, as saying that the research hints that scientists may one day be able to access dreams, memories and imagery from people’s brains (assuming, that is, that dreams are processed in a way analogous to visual stimuli, which is uncertain).
Heck, even our cars are learning to read our minds. A team from the Technical University of Berlin discovered they could improve reaction times by as much as 100 milliseconds in real driving conditions by monitoring drivers’ brains and reducing distractions during periods of high brain activity.
That may not sound very impressive, but it’s enough to reduce braking distance by nearly three metres when you’re travelling at 100 kilometres per hour.
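The arithmetic behind that figure is easy to check: at 100 km/h a car covers about 28 metres every second, so trimming 100 milliseconds off the reaction time saves roughly 2.8 metres of travel before the brakes even engage.

```python
# Quick check of the braking-distance claim in the article.
speed_kmh = 100
speed_ms = speed_kmh * 1000 / 3600   # convert to metres per second (~27.8)
saved_time = 0.100                   # 100 milliseconds, in seconds
saved_distance = speed_ms * saved_time
print(round(saved_distance, 2))      # → 2.78, i.e. nearly three metres
```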
A car with such a device installed might know to switch off unnecessary information systems when it senses the driver is distracted: turning off the radio, say, when the driver’s busy talking to a passenger. (Admittedly, they’re going to have to come up with a better interface than EEG sensors stuck all over the driver’s scalp. You think it’s hard to get people to wear seatbelts…)
The phrase “I know what you’re thinking” has thus far been spoken only by one human to another, and only in a metaphorical sense.
Someday soon, our machines could make it literal.