Daily tasks can be exceedingly difficult for people who have experienced neurotrauma, such as a stroke, because of impaired coordination and strength in one or both upper limbs. Robotic devices have been developed to help them rebuild these capabilities, but the rigid design of many such aids can be a drawback, particularly for demanding tasks like playing a musical instrument.
For the first time, a robotic glove is lending a "hand", and hope, to piano players who have had a stroke. Developed by researchers from Florida Atlantic University's College of Engineering and Computer Science, the soft robotic hand exoskeleton uses artificial intelligence to improve hand dexterity.
By fusing flexible tactile sensors, soft actuators and AI into a single hand exoskeleton, this robotic glove is the first that can "feel" the difference between correct and incorrect renditions of the same song.
“Playing the piano requires complex and highly skilled movements, and relearning tasks involves the restoration and retraining of specific movements or skills,” said Erik Engeberg, Ph.D., senior author, a professor in FAU’s Department of Ocean and Mechanical Engineering within the College of Engineering and Computer Science, and a member of the FAU Center for Complex Systems and Brain Sciences and the FAU Stiles-Nicholson Brain Institute. “Our robotic glove is composed of soft, flexible materials and sensors that provide gentle support and assistance to individuals to relearn and regain their motor abilities.”
Researchers integrated a dedicated sensor array into each fingertip of the robotic glove. Unlike earlier exoskeletons, this technology provides precise force and guidance for recovering the fine finger movements required for playing the piano. The glove responds to users' movements with real-time feedback and adjustments, making it easier for them to learn the proper movement mechanics.
To demonstrate the robotic glove’s capabilities, researchers programmed it to feel the difference between correct and incorrect versions of the well-known tune, “Mary Had a Little Lamb,” played on the piano.
They developed a pool of 12 distinct error types that could occur at the beginning or end of a note, or as premature or delayed timing errors sustained for 0.1, 0.2 or 0.3 seconds. The ten song versions comprised three groups of three variations each, plus the correct song played flawlessly.
To classify the song variations, Random Forest (RF), K-Nearest Neighbor (KNN) and Artificial Neural Network (ANN) algorithms were trained with data from the tactile sensors in the fingertips.
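The comparison of classifiers described above can be illustrated with a minimal sketch. The glove's actual sensor features are not public, so synthetic feature vectors stand in for the fingertip tactile data here; the class labels 0 to 9 represent the ten song versions, and all array sizes and model settings are assumptions for illustration.

```python
# Sketch (not the authors' code): compare RF, KNN and a small neural
# network on synthetic stand-ins for the glove's tactile sensor features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_classes, n_per_class, n_features = 10, 40, 25  # hypothetical sizes

# Each song version gets its own cluster of synthetic feature vectors.
X = np.vstack([rng.normal(loc=c, scale=0.8, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

models = {
    "RF":  RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                         random_state=0),
}
# Train each model and report its held-out classification accuracy.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
print(scores)
```

On real tactile data the three algorithms would be ranked by exactly this kind of held-out accuracy comparison.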
The robotic glove was used to detect differences between the correct and incorrect versions of the song both on its own and while worn by a person, and the classification accuracy of the algorithms was compared in each condition.
Results of the study, published in the journal Frontiers in Robotics and AI, demonstrated that the ANN algorithm had the highest classification accuracy of 97.13 percent with the human subject and 94.60 percent without the human subject.
The system correctly identified key hits that were out of sync and calculated the percentage inaccuracy of a particular song. These results show the smart robotic glove’s potential to help disabled people regain dexterity skills like playing musical instruments.
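The "percentage inaccuracy" idea above can be sketched as a simple timing comparison. The function name, the 50 ms tolerance and the onset lists below are assumptions for illustration, not parameters from the study; the 0.2 s and 0.3 s delays in the example match the sustained error durations the researchers used.

```python
# Hypothetical sketch: flag played notes whose onset deviates from the
# expected onset by more than a timing tolerance, and report the share
# of such out-of-sync notes as a percentage.
def percent_inaccuracy(expected_onsets, played_onsets, tol=0.05):
    """Return the percentage of notes whose onset error exceeds `tol` seconds."""
    errors = [abs(e - p) > tol
              for e, p in zip(expected_onsets, played_onsets)]
    return 100.0 * sum(errors) / len(errors)

expected = [0.0, 0.5, 1.0, 1.5, 2.0]
played   = [0.0, 0.5, 1.2, 1.5, 2.3]  # two notes delayed by 0.2 s and 0.3 s
print(percent_inaccuracy(expected, played))  # -> 40.0
```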
Researchers created the robotic glove by combining hydrogel casting with 3D-printed polyvinyl alcohol stents, integrating five actuators into a single wearable device that conforms to the user's hand. Using 3D scanning technologies or CT scans, the production process could be adapted to fit the device to each patient's specific anatomy.
“Our design is significantly simpler than most designs as all the actuators and sensors are combined into a single molding process,” said Engeberg. “Importantly, although this study’s application was for playing a song, the approach could be applied to myriad tasks of daily life and the device could facilitate intricate rehabilitation programs customized for each patient.”
Clinicians could use the data to create individualized action plans, identifying a patient's weak points (which may show up as songs that are repeatedly played incorrectly) and the motor skills that need strengthening. As patients advance, the rehabilitation team could select harder songs in a game-like progression to offer a personalized path to improvement.
“The technology developed by professor Engeberg and the research team is truly a gamechanger for individuals with neuromuscular disorders and reduced limb functionality,” said Stella Batalama, Ph.D., dean of the FAU College of Engineering and Computer Science. “Although other soft robotic actuators have been used to play the piano, our robotic glove is the only one that has demonstrated the capability to ‘feel’ the difference between correct and incorrect versions of the same song.”
Study co-authors are Maohua Lin, first author, and a Ph.D. student; Rudy Paul, a graduate student; and Moaed Abd, Ph.D., a recent graduate; all from the FAU College of Engineering and Computer Science; James Jones, Boise State University; Darryl Dieujuste, a graduate research assistant, FAU College of Engineering and Computer Science; and Harvey Chim, M.D., a professor in the Division of Plastic and Reconstructive Surgery at the University of Florida.
The National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health (NIH), the National Institute of Aging of the NIH and the National Science Foundation supported this research. This research was supported in part by a seed grant from the FAU College of Engineering and Computer Science and the FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE).