At Yale University, researchers recently used a brain scanner to identify which face someone was looking at — just from their brain activity. At the University of California, Berkeley, scientists are moving beyond "reading" simple thoughts to predicting what someone will think next.
And at Carnegie Mellon, in Pittsburgh, cognitive neuroscientist Marcel Just has a vision that will make Google Glass seem very last century. Instead of using your eyes to direct a cursor — finding a phone number for a car repair shop, for instance — he fantasizes about a device that will dial the shop by interpreting your thoughts about the car (minus the expletives).
Mind-reading technology isn't yet where the sci-fi thrillers predict it will go, but researchers like Just aren't ruling out such a future.
"In principle, our thoughts could someday be readable," said Just, who directs the school's Center for Cognitive Brain Imaging. "I don't think we have to worry about this in the next 5-10 years, but it's interesting to think about. What if all of our thoughts were public?"
He can imagine a terrifying version of that future, where officials read minds in order to gain control over them. But he prefers to envision a more positive one, with mind-reading devices offering opportunities to people with disabilities — and the rest of us.
Marvin Chun, senior author on the Yale work, published last month in the journal NeuroImage, sees a more limited potential for mind reading, at least with current functional MRI technology, which measures blood flow to infer what is happening in the brain.
"I think we can make it a little better. I don't think we'll be able to magically read out people's faces a whole lot better," he said.
In his experiment, an undergraduate working in his lab developed a mathematical model to allow a computer to recognize different parts of faces. Then, by scanning the brains of volunteers as they looked at different faces, the researchers trained the computer to interpret how each volunteer's brain responded to different faces. Finally, the volunteers were asked to look at new faces while in a brain scanner — and the computer could distinguish which of two faces they were observing. It was correct about 60% to 70% of the time.
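The train-then-test logic of that experiment can be sketched in miniature. The toy below is not the Yale team's actual model — their data, stimuli, and mathematics are far richer — but it shows the same idea under simple assumptions: each face evokes a characteristic (here, made-up) pattern across voxels, training averages noisy scans into a per-face template, and decoding picks whichever template a new scan sits closest to.

```python
import random
import math

random.seed(0)
N_VOXELS = 50  # hypothetical number of voxels in the face-sensitive regions

# Assumed "true" voxel response pattern for each of two face stimuli
true_patterns = {
    "face_A": [random.gauss(0, 1) for _ in range(N_VOXELS)],
    "face_B": [random.gauss(0, 1) for _ in range(N_VOXELS)],
}

def simulate_scan(face, noise=4.0):
    """A noisy, fMRI-like response to viewing `face`."""
    return [v + random.gauss(0, noise) for v in true_patterns[face]]

def train(n_scans=20):
    """Training phase: average several noisy scans per face into a template."""
    templates = {}
    for face in true_patterns:
        scans = [simulate_scan(face) for _ in range(n_scans)]
        templates[face] = [sum(col) / n_scans for col in zip(*scans)]
    return templates

def classify(scan, templates):
    """Test phase: decode a new scan as the nearest template (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda f: dist(scan, templates[f]))

templates = train()
trials_per_face = 100
correct = sum(
    classify(simulate_scan(face), templates) == face
    for face in true_patterns
    for _ in range(trials_per_face)
)
accuracy = correct / (trials_per_face * len(true_patterns))
print(f"decoding accuracy: {accuracy:.2f}")  # well above the 50% chance level
```

Because there are only two candidate faces, guessing would be right half the time; anything reliably above 50%, like the study's 60% to 70%, means real information about the viewed face is being recovered from the scans.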
"This will allow us to study things we haven't studied before about people's internal representation of faces and memories and imagination and dreams — all of which are represented in some of the same areas we use to reconstruct faces," said Alan Cowen, who led the research as a Yale undergraduate and is now a graduate student researcher at Berkeley.
Jack Gallant, a leader in the field of mind reading, also at Berkeley, said the work at Yale may not have immediate benefits, but it helps build enthusiasm for the field.
"Brain decoding tells us whether some specific type of information can be recovered from the brain," he said. "It can also be used to build a brain-computer interface if one is so inclined."
Because the process requires the volunteer's full participation, it cannot be used to read someone's mind against their will, Just said.
The Yale work helps confirm that the brain doesn't just have one area dedicated to a task like perceiving faces, Just said. Instead, "thinking is a collaborative process," with three or four areas of the brain working together to allow people to distinguish, say, between the face of their spouse and that of their best friend.
Next, Chun said, he's going to test people with famous faces to see if his scanner and algorithm can tell when someone is thinking about Brad Pitt or his partner, Angelina Jolie.
"It's a little fantastical, but it'll be fun to try," he said. "This really is bringing science fiction closer to reality."