
Scientists can read your thoughts!!!!! Yeah, right

There have been two recent sets of reports on the ‘scientists can read your thoughts’ theme.
The Guardian reports:

Mind-reading program translates brain activity into words

The research paves the way for brain implants that would translate the thoughts of people who have lost the power of speech (31 January 2012)

This is about the paper by Pasley et al. (2012) in PLoS Biology, ‘Reconstructing Speech from Human Auditory Cortex.’
Here’s the original press release (as always, it’s the university press release that produces all the news coverage), which includes a video showing the original stimuli and the reconstructions.

The Guardian story says:

In a series of new experiments, scientists have been able to use a computer to decipher brain activity. So what, huh? Well, the computer can reconstruct those signals into the actual words the participants are thinking about. It can read your mind.
OK, so sometimes the words were difficult to recognise, but that’s not the point: it means that people unable to speak could generate a voice just by thinking in sentences.
“Potentially, the technique could be used to develop an implantable prosthetic device to aid speaking, and for some patients that would be wonderful,” Robert Knight, a senior member of the team and director of the Helen Wills Neuroscience Institute at the University of California, Berkeley, told the Guardian. “Perhaps in 10 years it will be as common as grandmother getting a new hip.”

Well, that would make sense if they were recording brain activity of people who are speaking these words, or even better intending to speak these words – but that’s not what’s happening here. They’re recording the activity of people listening to these words, so if there’s any mind reading going on here, it is reading what people are hearing, not what they’re thinking or intending.

The other similar story concerns the work of Jack Gallant and his team at U.C. Berkeley, published in Current Biology (Nishimoto et al., 2011).

The Economist says:


It is now possible to scan someone’s brain and get a reasonable idea of what is going through his mind. For the second paper of the trio [Nishimoto et al., 2011], published in Current Biology in September, shows that it is now possible to make a surprisingly accurate reconstruction, in full motion and glorious Technicolor, of exactly what is passing through an awake person’s mind.

Well, not really*.
The Discovery News account is more realistic:

What if scientists could peer inside your brain and then reconstruct what you were thinking, playing the images back like a video?
Science and technology are not even remotely at that point yet, but a new study from the University of California Berkeley marks a significant, if blurry, step in that direction.
Gallant wants to be clear about his lab’s research goal. “We’re trying to understand how the brain works,” he said. “We’re not trying to build a brain-decoding device.”

In the study, brain activity recorded while subjects watched the target video was matched against activity recorded while they watched a very large library of other random video clips, with ingenious software picking out the library clips whose evoked activity best resembled the target activity and combining them into the reconstruction.
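The matching idea is simple enough to sketch in toy form. The snippet below is my own illustration, not the team’s actual Bayesian decoder: every name and number in it is invented. It correlates the activity evoked by a ‘target’ clip against a library of activity patterns evoked by other clips, then averages the frames of the best matches to produce the kind of blurry reconstruction seen in the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: each fMRI response is a vector of voxel activities,
# and each library clip has an evoked response plus a stored frame.
n_voxels = 50
library_size = 1000
library_activity = rng.normal(size=(library_size, n_voxels))
library_frames = rng.random(size=(library_size, 8, 8))  # toy 8x8 "frames"

# Target: the activity evoked by the clip we want to reconstruct
# (here, library clip 42 plus measurement noise).
target_activity = library_activity[42] + 0.1 * rng.normal(size=n_voxels)

def corr(a, b):
    """Correlation of vector a with every row of matrix b."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean(axis=1, keepdims=True)) / b.std(axis=1, keepdims=True)
    return (b @ a) / len(a)

scores = corr(target_activity, library_activity)

# Average the frames of the best-matching clips: the blurry reconstruction.
top_k = np.argsort(scores)[-30:]
reconstruction = library_frames[top_k].mean(axis=0)
print(int(np.argmax(scores)))  # clip 42 scores highest
```

The real decoder fits an encoding model for each voxel and draws on an enormous library of natural movie clips, but the match-and-average step is recognisably this.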

Here’s a shorter, but more precise, online press account: ‘Mind-Reading Tech Reconstructs Videos From Brain Images’, by Dan Nosowitz, though as is often the case, the headline is not backed up by the information in the article. It’s a very short article, but it is quite clear that what is happening is that Gallant is “attempting to reconstruct a video by reading the brain scans of someone who watched that video–essentially pulling experiences directly from someone’s brain”, and it points out that this is really what the researchers “would really prefer we call ‘brain decoding’” rather than ‘mind-reading’.

That’s the point about both of these studies: they’re picking up input signals at some level of decoding, and this isn’t really very different from the kind of event recording in the optic nerve or the visual cortex carried out by people like Hubel and Wiesel all those years ago. Certainly, Hubel and Wiesel were given the Nobel Prize, quite rightly, for their work, and this work takes the analysis deeper into the brain and to a much higher level of complexity, so it is a considerable advance – but it’s not ‘reading our thoughts’. The Gallant paper from U.C. Berkeley puts it nice and clearly: “These results demonstrate that dynamic brain activity measured under naturalistic conditions can be decoded using current fMRI technology.”
The results are really impressive. The press release includes a demo video showing the video inputs alongside the computer reconstructions.

That’s a great technical advance, but we already have a ‘reading your thoughts’ example using EEG. This is the ‘readiness potential’ which Libet (1985) used in his well-known study showing that the brain activity marking a decision to act seems to anticipate conscious awareness of that decision. Actually, the readiness potential was discovered a long time ago, first reported by Kornhuber and Deecke in 1965.

I first heard of it in a talk by W. Grey Walter in 1968, and Grey Walter had been able to use the readiness potential to give people ‘mind control’ of the world nearly 50 years ago. He had set up a system to detect the readiness potential and use that signal to do things like switching a light off and on. All you had to do was decide to switch the light, and the system would pick up your decision and perform the action for you. I don’t think there was any differentiation of readiness potentials, so the system could only be set up to do one thing at a time, and probably deciding to do anything would activate it, so that’s not really mind reading either. I remember Grey Walter saying that the easy way of doing this was to actually reach out for the switch, when the system would turn on the light before you got there, but he did find it possible to activate the system without actually making the movement, just by forming the intention. He said it was a weird sensation.

I think Grey Walter is an under-remembered scientist. His EEG work is fascinating, and he also did important early work in robotics. He does have a Wikipedia page.

(You need to be aware that the account I’ve just given is an unsubstantiated memory of a rather informal talk nearly 50 years ago, when I was a young physiologist just beginning to learn about psychology. I’m sure I haven’t made it all up, but my account of what Grey Walter had been able to do may be more complete and coherent than the actual research. From all we know about memory, some drift in that direction is likely.)
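For what it’s worth, the trigger Grey Walter described needs nothing more exotic than a threshold detector. Here’s a toy sketch (simulated EEG, invented numbers, no detail taken from his actual setup) of flipping a switch when a slow negative drift like the readiness potential crosses a threshold:

```python
import numpy as np

rng = np.random.default_rng(2)

# 3 s of simulated baseline EEG noise (microvolts), then a 1 s slow
# negative ramp standing in for the readiness potential.
fs = 100                                   # samples per second
noise = rng.normal(0, 2.0, size=3 * fs)
ramp = np.linspace(0, -15, 1 * fs)
eeg = np.concatenate([noise, noise[:fs] + ramp])

# Smooth with a short moving average, then trigger on the first
# crossing below threshold.
kernel = np.ones(20) / 20
smoothed = np.convolve(eeg, kernel, mode="same")
threshold = -8.0
crossings = np.flatnonzero(smoothed < threshold)
light_on = crossings.size > 0
trigger_time = crossings[0] / fs if light_on else None
print(light_on, trigger_time)  # the light comes on during the ramp
```

Note that a detector this crude confirms the point made above: it fires on any sufficiently large slow drift, so it can only ever be wired to one action at a time.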

Actually, while following the press stories on the research above, I came across something which does look a bit more like mind reading, and is maybe more encouraging, or more frightening, depending on your point of view.

Here’s the Discovery News story:

A simple slide show could be the next weapon against terrorists. Using a brain-electrode cap and imagery, scientists at Northwestern University can pick the date, location and means of a future terrorist attack from the minds of America’s enemies.

Well, no it can’t, but if you read on there is some interesting stuff happening:

The electrodes measure the P300 brain wave, an involuntary response to stimuli that starts in the temporoparietal junction and spreads across the rest of the brain. When the wave hits the surface of the brain, the electrodes detect the signal. The stronger the reaction of the subject to a particular stimulus, the stronger the P300 brain wave.
Rosenfeld and his co-author, graduate student John Meixner, divided 29 Northwestern University students into two groups. One group planned a vacation while the other group planned a terrorist attack. The students then had electrodes placed on their scalp, and were shown a series of images of various cities, such as Boston and Houston, and various means of attack, along with other related, but irrelevant, images as controls.
As the slide show advanced, the electrodes recorded the P300 waves. When, for instance, the mock terrorists saw an image of the city they planned to attack, the electrodes recorded strong P300 brain waves. The Northwestern scientists then compared the strength of all the brain waves to find out who was planning an attack on which city, when they were planning it and how they meant to carry out the attack.
The Northwestern scientists correlated the strongest brain waves with “guilty knowledge” every time. Weaker P300 waves were seen when subjects saw images not associated with their planned attack. Scientists also examined P300 waves from the students in the group that was planning vacations, and did not falsely identify any of them as terrorists.
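The comparison logic here is easy to illustrate. Below is a toy sketch (not Rosenfeld’s actual Complex Trial Protocol; the amplitudes are simulated and every number is invented) of picking out the ‘guilty knowledge’ item as the stimulus whose mean P300 amplitude stands out from the irrelevant ones:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-trial P300 amplitudes (microvolts) for each pictured
# city; the mock terrorist's planned target evokes the larger response.
amplitudes = {
    "Boston":  rng.normal(12.0, 3.0, size=20),  # planned target
    "Houston": rng.normal(6.0, 3.0, size=20),
    "Phoenix": rng.normal(6.0, 3.0, size=20),
    "Atlanta": rng.normal(6.0, 3.0, size=20),
}

# Flag the city whose mean P300 stands out from the rest.
means = {city: a.mean() for city, a in amplitudes.items()}
suspect = max(means, key=means.get)
others = np.concatenate(
    [a for city, a in amplitudes.items() if city != suspect]
)

# Crude criterion: the suspect mean must exceed the irrelevant items
# by well over two standard errors to count as "guilty knowledge".
z = (means[suspect] - others.mean()) / (
    others.std(ddof=1) / np.sqrt(len(others))
)
print(suspect, round(z, 1))
```

The real protocol adds countermeasure-resistant trial structure and proper statistics, but the core move, comparing the probe’s P300 against the irrelevant items within the same subject, is what this sketch shows.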

Here’s an actual paper on the research (Rosenfeld et al., 2008).

If you’ve read my previous posts, you’ll know exactly what I’m going to say here. Brilliant research, doing complicated stuff, with fascinating possibilities, but greatly overhyped by the headlines, and slightly misrepresented by the text, with the clearest remarks about the true scope of the research right at the end of the article. I think the overall result of this is to make the reader cynical about any possibility of progress – “I read about the same thing five years ago, and it never happened: these scientists are always making fanciful claims” – and to underrepresent the complexity (and interest) of the research that is actually going on.

*To be fair to The Economist, the article also describes two other interesting studies which are a little bit nearer to the ‘mind reading’ headline**.

**But to be pedantic (and maybe unfair) no-one uses Technicolor nowadays, and you have to be pretty old to even remember the phrase ‘in glorious Technicolor’.


Grey Walter, W. (1964) Contingent negative variation: An electrical sign of sensorimotor association and expectancy in the human brain. Nature, 203, 380–384

Libet, B. (1985) Unconscious cerebral initiative and the role of conscious will in voluntary action. Behavioral and Brain Sciences, 8, 529–566

Nishimoto, S., Vu, A. T., Naselaris, T., Benjamini, Y., Yu, B., & Gallant, J. L. (2011) Reconstructing Visual Experiences from Brain Activity Evoked by Natural Movies. Current Biology, 21(19), 1641–1646

Pasley, B. N., David, S. V., Mesgarani, N., Flinker, A., Shamma, S. A., Crone, N. E., Knight, R. T., & Chang, E. F. (2012) Reconstructing Speech from Human Auditory Cortex. PLoS Biology, 10(1)

Rosenfeld, J. P., Labkovsky, E., Winograd, M., Lui, M. A., Vandenboom, C., & Chedid, E. (2008) The Complex Trial Protocol (CTP): A new, countermeasure-resistant, accurate, P300-based method for detection of concealed information. Psychophysiology, 45, 906–919

