This robo-glove could help stroke survivors learn to play music again

Photo by Dolo Iglesias on Unsplash

International researchers have developed a prototype 'robo-glove' that could help stroke survivors relearn to play musical instruments, or perform other tasks requiring finger dexterity. The researchers say that, while music can help stroke survivors regain language and motor function lost to brain trauma, playing a musical instrument can be a difficult skill to relearn. The flexible glove has sensors and actuators that generate motion and exert force, so it can assist and enhance a patient's natural hand movements. The researchers also used AI to train the glove to tell the difference between correct and incorrect playing of a basic piano melody, so it could potentially give the wearer feedback on their performance as they play. While the glove was tested as a tool for playing music, the researchers say it could be adapted to a number of similar tasks.

Media release

From: Frontiers

Soft robo-glove can help stroke patients relearn to play music

‘Smart hand exoskeleton’ could also help neurotrauma patients to relearn other daily tasks, regaining dexterity and coordination

Researchers have developed the prototype of a comfortable and flexible ‘soft smart hand exoskeleton’ or robo-glove, which gives feedback to wearers who need to relearn tasks that require manual dexterity and coordination, for example after suffering a stroke. The present study focused on relearning to play the piano as a proof of principle, but the glove can easily be adapted to help with relearning other daily tasks.

Stroke, which affects approximately 1.1 million people in the EU each year, is the leading cause of adult disability there. After a stroke, patients commonly need rehabilitation to relearn to walk, talk, or perform daily tasks. Research has shown that, besides physical and occupational therapy, music therapy can help stroke patients recover language and motor function. But for people trained in music who have suffered a stroke, playing music may itself be a skill that needs to be relearned. Now, a study in Frontiers in Robotics and AI has shown how novel soft robotics can help recovering patients relearn to play music and other skills that require dexterity and coordination.

“Here we show that our smart exoskeleton glove, with its integrated tactile sensors, soft actuators, and artificial intelligence, can effectively aid in the relearning of manual tasks after neurotrauma,” said lead author Dr Maohua Lin, an adjunct professor at the Department of Ocean & Mechanical Engineering of Florida Atlantic University.

Whom the glove fits: custom-made ‘smart hand’

Lin and colleagues designed and tested a ‘smart hand exoskeleton’ in the shape of a multi-layered, flexible 3D-printed robo-glove, which weighs only 191 g. The entire palm and wrist area of the glove is designed to be soft and flexible, and the shape of the glove can be custom-made to fit each wearer’s anatomy.

Soft pneumatic actuators in its fingertips generate motion and exert force, thus mimicking natural, fine-tuned hand movements. Each fingertip also contains an array of 16 flexible sensors or ‘taxels’, which give tactile sensations to the wearer’s hand upon interaction with objects or surfaces. Production of the glove is straightforward, as all actuators and sensors are put in place through a single molding process.
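To make the sensing and actuation layout described above more concrete, the sketch below models one sensor frame as five 4×4 taxel grids (one per fingertip) and maps a desired per-finger flexion to a pneumatic pressure setpoint. This is an illustrative assumption only: the names, data shapes, and the 60 kPa pressure ceiling are placeholders, and the release does not describe the glove's actual data formats or control code.

```python
# Illustrative sketch only: not the authors' firmware or data format.
# Models each fingertip as a 4x4 grid of pressure-sensitive 'taxels'
# plus one pneumatic actuator pressure setpoint.
import numpy as np

N_FINGERS = 5
TAXELS_PER_FINGERTIP = 16  # 4 x 4 array per fingertip, as described in the release


def read_fingertip_taxels(raw_frame: np.ndarray) -> np.ndarray:
    """Reshape one raw sensor frame (5 x 16 values) into per-finger 4x4 grids."""
    assert raw_frame.shape == (N_FINGERS, TAXELS_PER_FINGERTIP)
    return raw_frame.reshape(N_FINGERS, 4, 4)


def actuator_command(target_flexion: np.ndarray, max_pressure_kpa: float = 60.0) -> np.ndarray:
    """Map desired flexion (0..1 per finger) to a pneumatic pressure setpoint.

    The 60 kPa ceiling is a placeholder, not a figure from the study.
    """
    return np.clip(target_flexion, 0.0, 1.0) * max_pressure_kpa


# Example: a simulated taxel frame and a gentle assist on two fingers
frame = np.random.rand(N_FINGERS, TAXELS_PER_FINGERTIP)
grids = read_fingertip_taxels(frame)
pressures = actuator_command(np.array([0.0, 0.4, 0.4, 0.0, 0.0]))
print(grids.shape, pressures)
```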

“While wearing the glove, human users have control over the movement of each finger to a significant extent,” said senior author Dr Erik Engeberg, a professor at Florida Atlantic University’s Department of Ocean & Mechanical Engineering.

“The glove is designed to assist and enhance their natural hand movements, allowing them to control the flexion and extension of their fingers. The glove supplies hand guidance, providing support and amplifying dexterity.”

The authors foresee that patients might ultimately wear a pair of these gloves, helping both hands independently regain dexterity, motor skills, and a sense of coordination.

AI trained the glove to be a music teacher

The authors used machine learning to successfully teach the glove to ‘feel’ the difference between correct and incorrect versions of a beginner’s song played on the piano. Here, the glove operated autonomously, without human input, using preprogrammed movements. The song was ‘Mary Had a Little Lamb’, which requires four fingers to play.
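The release does not detail the machine-learning pipeline, so the following is only a minimal sketch of the general approach it implies: train a classifier on taxel recordings labelled as correct or incorrect play. The feature layout, the simulated data, and the choice of a random-forest classifier from scikit-learn are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: classifying 'correct' vs 'incorrect' play from taxel data.
# Data shapes, features, and classifier choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend dataset: 200 playthroughs, each summarized as a flattened vector of
# taxel statistics (5 fingers x 16 taxels x mean/peak pressure = 160 features).
X = rng.random((200, 5 * 16 * 2))
y = rng.integers(0, 2, size=200)  # 1 = correct rendition, 0 = incorrect

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice, the labels would come from recordings of the glove playing correct and deliberately flawed renditions of the song, with the classifier's output driving the feedback described below.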

“We found that the glove can learn to distinguish between correct and incorrect piano play. This means it could be a valuable tool for personalized rehabilitation of people who wish to relearn to play music,” said Engeberg.

Now that the proof of principle has been shown, the glove can be programmed to give feedback to the wearer about what went right or wrong in their playing, whether through haptic feedback, visual cues, or sound. This would enable the wearer to understand their performance and make improvements.

Picking up the gauntlet for remaining challenges

Lin added: “Adapting the present design to other rehabilitation tasks beyond playing music, for example object manipulation, would require customization to individual needs. This can be facilitated through 3D scanning technology or CT scans to ensure a personalized fit and functionality for each user.”

“But several challenges in this field need to be overcome. These include improving the accuracy and reliability of tactile sensing, enhancing the adaptability and dexterity of the exoskeleton design, and refining the machine learning algorithms to better interpret and respond to user input.”

Journal/conference: Frontiers in Robotics and AI
Research: Paper
Organisation/s: Florida Atlantic University, USA
Funder: This research was supported in part by a seed grant from the College of Engineering at FAU and I-SENSE. Research reported in this publication was supported by the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health under Award Number R01EB025819. This research was also supported by the National Institute of Aging under 3R01EB025819-04S1 and National Science Foundation awards #2205205 and #1950400. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or the National Science Foundation.