Machine learning meets fingers, hands and elbows for enhanced robotic therapy
Strokes and spinal cord injuries disable millions of people around the world every year, often with poor prospects of recovery. Unfortunately, today’s robotic therapy is not as effective as it could be because of limited knowledge of how rehabilitation devices interact with the human body. “If we could predict the outcome of a robotic therapy beforehand, then we could optimize it for a patient and deliver a truly personalized and cost-effective treatment,” states researcher Prof. Massimo Sartori in a news item posted on the website of the University of Twente in the Netherlands. Prof. Sartori heads the university’s Neuromechanical Modelling and Engineering Lab, whose team, together with researchers at the Meta AI research lab in the United States, co-developed an open-source framework called MyoSuite. Developed with partial support from the EU-funded INTERACT and SOPHIA projects, MyoSuite applies machine learning to biomechanical control problems by combining motor and neural intelligence.
No more long experimentations on humans
The platform is a collection of musculoskeletal tasks and environments simulated using an open-source physics engine. It consists of three models: finger, elbow and hand. Using these models, the researchers have designed a wide variety of tasks ranging from simple reaching movements to contact-rich movements like pen twirling or manipulating Baoding balls. “The AI-powered digital models in MyoSuite can learn to execute complex movements, and interactions with assistive robots, that would otherwise require long experimentations on real human subjects,” the news item reports. MyoSuite makes it possible to co-simulate AI-powered musculoskeletal systems that physically interact with assistive robots. Users can simulate biological phenomena such as muscle fatigue and sarcopenia, and explore how assistive robots could be designed to restore movement in individuals with a disability. “This all is achieved by combining state-of-the-art musculoskeletal modelling with state-of-the-art artificial intelligence for movement behavior synthesis,” notes Prof. Sartori.

As reported in the news item, Meta CEO Mark Zuckerberg shared a social media post introducing MyoSuite: “Meta AI team developed a new AI platform called MyoSuite that builds realistic musculoskeletal simulations that run up to 4,000x faster than state of the art. We can train these models to perform complex movements like twirling a pen or rotating a key. This research could accelerate development of prosthetics, physical rehab, and surgery techniques. … We’re going to open source these models so researchers can use them to advance the field further.”

The platform developed with support from INTERACT (Modelling the neuromusculoskeletal system across spatiotemporal scales for a new paradigm of human-machine motor interaction) and SOPHIA (Socio-physical Interaction Skills for Cooperative Human-Robot Systems in Agile Production) could help usher in a new era in rehabilitation robotics.
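To make the idea of a simulated musculoskeletal task more concrete: environments of this kind are typically driven through a standard reset/step control loop, where the "action" is a vector of muscle activations between 0 and 1 and each step returns an observation and a reward. The sketch below illustrates that loop with a hypothetical stub environment (`StubMuscleEnv` is an assumption invented for illustration, not part of MyoSuite), so it runs without any simulator installed; a real MyoSuite task would be swapped in where the stub is created.

```python
import numpy as np

# Hypothetical stand-in for a muscle-actuated task environment (e.g. an
# elbow-posing or Baoding-ball task). Invented here for illustration so the
# sketch is self-contained; it only mimics the reset/step control loop.
class StubMuscleEnv:
    def __init__(self, n_muscles=6):
        self.n_muscles = n_muscles   # number of muscle actuators
        self.t = 0                   # simulation step counter

    def reset(self):
        self.t = 0
        return np.zeros(self.n_muscles)          # observation: muscle states

    def step(self, action):
        # Clip muscle activations to the physiological range [0, 1].
        action = np.clip(action, 0.0, 1.0)
        self.t += 1
        obs = action.copy()                      # toy dynamics: obs mirrors activation
        reward = -np.abs(action - 0.5).sum()     # toy reward: prefer mid-level activation
        done = self.t >= 100                     # fixed-length episode
        return obs, reward, done, {}

env = StubMuscleEnv()
obs = env.reset()
total_reward = 0.0
done = False
while not done:
    # A trained policy would choose activations here; random ones stand in.
    action = np.random.uniform(0.0, 1.0, env.n_muscles)
    obs, reward, done, info = env.step(action)
    total_reward += reward
print(f"episode finished after {env.t} steps")
```

In practice, a learning algorithm replaces the random policy, which is how the AI-powered digital models described above learn complex movements without long experiments on human subjects.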
“We hope that diverse features supported by our framework will open new opportunities in understanding neuromechanical systems interacting with artificial robotic agents,” concludes Prof. Sartori.

For more information, please see:
INTERACT project web page
SOPHIA project website
Keywords
INTERACT, SOPHIA, MyoSuite, AI, robot, robotics, disability, movement, musculoskeletal, finger, elbow, hand