Mind-reading device can decode your internal monologue, but accuracy varies widely

Publicly released:
International
Photo by Brett Jordan on Unsplash

Researchers have decoded internally spoken words with 79% accuracy in a patient with quadriplegia, which they hope could lead to a device that helps restore communication in people who have lost the ability to speak. However, accuracy was a far less impressive 23% for a second patient. The team recorded brain activity from two people with quadriplegia who had electrodes implanted in specific areas of the brain. Following an auditory (spoken sound) or visual (written text) prompt, the participants were first asked to think of a word (for example, ‘spoon’, ‘python’ or ‘battlefield’) internally and, after a brief pause, to say the word out loud. In real time, during the experiment, the researchers decoded both internally spoken and vocalised words with an accuracy of 79% for one patient, but only 23% for the other. The researchers say their work represents a proof of concept for a high-performance internal speech brain-machine interface.

Media release

From: Springer Nature

Psychology: Decoding real-time internal speech

A new technology that could enable researchers to extract meaning from brain signals during internal speech is presented in a study published in Nature Human Behaviour. Although the findings are preliminary, this work could aid the development of tools to help to restore communication in people who have lost the ability to speak.

Some neurological conditions, whether caused by disease or injury, can destroy people’s ability to speak, but brain-machine interfaces (BMIs) can help patients communicate again. BMIs known as ‘speech decoders’ can capture brain activity during inner speech — words thought within the mind, with no movement or sound — and turn it into units of language. Despite recent progress, the lack of an observable output, combined with differences in brain activation between thought and speech, makes decoding internal speech challenging.

Sarah Wandelt and colleagues recorded brain activity from two participants with tetraplegia who had microelectrode arrays implanted in specific areas of the brain (the supramarginal gyrus and the primary somatosensory cortex). Following an auditory (spoken sound) or visual (written text) prompt, the participants were first asked to think of the word (for example, ‘spoon’, ‘python’ or ‘battlefield’) internally and, after a brief pause, to say the word out loud. In real time, during the experiment, the researchers were able to decode both internally spoken and vocalized words via the electrodes on the supramarginal gyrus with an accuracy of 79% for one participant and 23% for the other. The authors found evidence of shared neural representation between internal speech, word reading and vocalized speech in one of the participants.

Although further research is needed to improve the functionality of the technology by testing more participants and new words, the authors recommend the supramarginal gyrus as a promising location for multi-purpose BMIs.

Journal/conference: Nature Human Behaviour
Research: Paper
Organisation/s: California Institute of Technology, USA
Funder: This research was supported by the NIH National Institute of Neurological Disorders and Stroke Grant U01: U01NS098975 and U01: U01NS123127 (S.K.W., D.A.B., K.P., C.L. and R.A.A.) and by the T&C Chen Brain-Machine Interface Center (S.K.W., D.A.B. and R.A.A.). The funders had no role in study design, data collection and analysis, decision to publish or preparation of the paper.