Braincopter holds promise for people with paralysis

Publicly released:
International
Photo by Milad Fakurian on Unsplash

A brain implant has allowed a person with tetraplegia to fly a virtual drone. Researchers placed the 'brain-computer interface' in the part of the brain that controls hand movement and recorded the participant's brain activity as he watched a virtual hand make different movements. Using machine learning, they trained the interface to link his brain signals to those movements, so that he could control a virtual hand with his mind and use it to pilot a virtual quadcopter. Based on his experience, the researchers say this interface, and the access it provides to video games requiring fine movement, could offer people with paralysis a sense of enablement, recreation, and social connectedness.

News release

From: Springer Nature

Neuroscience: Piloting a virtual quadcopter using a brain–computer interface

A surgically implanted brain–computer interface can detect and decode finger movements in a person with paralysis, allowing them to play a video game, reports a study in Nature Medicine.

More than 5 million people in the US live with severe motor impairments. Although many of the basic needs of people with paralysis are being met, there remain unmet needs for social and leisure activities, such as video games. Brain–computer interfaces have been recognised as a potential solution for motor restoration, but current examples of this technology have struggled with complex movement such as individual finger movements, which could help with activities like typing, playing musical instruments, or using a video game controller.

Mathew Willsey and colleagues developed a brain–computer interface capable of continuously recording the electrical activity patterns of multiple neurons in the brain in order to translate complex movements. The interface was implanted in the left precentral gyrus, the brain region responsible for hand-movement control, of a person with upper and lower extremity paralysis. Neuronal activity was recorded as the participant observed a virtual hand performing various movements, after which the researchers used machine learning algorithms to identify the signals linked to specific finger movements. Using these signals, the system was able to accurately predict finger movements, enabling the participant to control three highly distinct finger groups, including two-dimensional thumb movements, in a virtual hand. This system achieved a level of movement precision and freedom greater than previously possible.
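
The release does not describe the study's decoder beyond "machine learning algorithms", so the sketch below is only a rough, hypothetical illustration of the general approach it outlines: fit a model that maps recorded neural activity (for example, binned firing rates) to continuous finger movements, using calibration data gathered while the participant watches a virtual hand move. The ridge-regression decoder, the 96-channel layout, and the synthetic data are illustrative assumptions, not details from the paper.

# Illustrative sketch only: learn a mapping from neural firing-rate features
# to continuous finger movements. The study's actual decoder is not specified
# in this release; ridge regression and the synthetic data are assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "calibration" data: binned firing rates from 96 channels recorded
# while a virtual hand performs known movements (the regression targets).
n_samples, n_channels = 5000, 96
true_weights = rng.normal(size=(n_channels, 4))          # hidden linear tuning
firing_rates = rng.poisson(lam=5.0, size=(n_samples, n_channels)).astype(float)
# Four continuous outputs: flexion of two finger groups plus 2-D thumb movement.
finger_kinematics = firing_rates @ true_weights + rng.normal(scale=2.0, size=(n_samples, 4))

X_train, X_test, y_train, y_test = train_test_split(
    firing_rates, finger_kinematics, test_size=0.2, random_state=0
)

decoder = Ridge(alpha=1.0)
decoder.fit(X_train, y_train)
print("held-out R^2:", decoder.score(X_test, y_test))

# At run time, each new bin of firing rates would be turned into a predicted
# finger state that drives the virtual hand.
predicted_state = decoder.predict(X_test[:1])
print("decoded finger state:", predicted_state.round(2))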

The authors then extended the application of this finger control to a video game. Finger movements decoded by the interface were programmed to control the speed and direction of a virtual quadcopter, allowing the participant to pilot the device through multiple obstacle courses as a part of a video game.
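
The release says only that the decoded finger movements were "programmed to control the speed and direction" of the quadcopter. As a hypothetical illustration of what such a control mapping could look like, the sketch below turns a four-dimensional decoded finger state into velocity commands; the assignment of finger groups to flight axes, the gain, and the dead-zone are assumptions, not details from the study.

# Illustrative sketch only: map a decoded finger state to quadcopter velocity
# commands. The actual mapping used in the study is not given in this release.
from dataclasses import dataclass

@dataclass
class QuadcopterCommand:
    forward: float   # m/s, positive = forward
    lateral: float   # m/s, positive = right
    vertical: float  # m/s, positive = up
    yaw_rate: float  # rad/s, positive = clockwise

def fingers_to_command(finger_state, gain=1.0, deadzone=0.05):
    """Map a 4-D decoded finger state (each value roughly in [-1, 1]) to a
    velocity command, ignoring small jitter near zero."""
    def shape(value):
        return 0.0 if abs(value) < deadzone else gain * value

    group1, group2, thumb_x, thumb_y = finger_state
    return QuadcopterCommand(
        forward=shape(group1),    # first finger group -> forward/backward
        vertical=shape(group2),   # second finger group -> climb/descend
        lateral=shape(thumb_x),   # thumb left/right -> strafe
        yaw_rate=shape(thumb_y),  # thumb up/down -> turn
    )

# Example: slight flexion of the first finger group plus a rightward thumb.
print(fingers_to_command([0.4, 0.0, 0.2, -0.03]))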

Multimedia

Virtual hand movements and quadcopter flight
Virtual quadcopter navigation through rings

Attachments

Note: Not all attachments are visible to the general public. Research URLs will go live after the embargo ends.

Research: Springer Nature, Web page (URL will go live after the embargo lifts)
Journal/conference: Nature Medicine
Research: Paper
Organisation/s: Stanford University, University of Michigan - USA
Funder: This work was supported by the Office of Research and Development, Rehabilitation R&D Service, Department of Veterans Affairs (grant nos. N2864C, A2295R) (L.R.H.); Wu Tsai Neurosciences Institute (J.M.H.); Howard Hughes Medical Institute (D.T.A., F.R.W.); L. and P. Garlick (J.M.H.); Simons Foundation Collaboration on the Global Brain grant no. 543045 (J.M.H.); grant no. NIH-NIDCD R01-DC014034 (J.M.H.); grant no. NIH-NIDCD U01-DC017844 (L.R.H.); and the Milton Safenowitz Postdoctoral Fellowship from the Amyotrophic Lateral Sclerosis Association (N.P.S.).