Researchers at the University of California, San Francisco have used brain signals, relayed through a computer, to allow a paralyzed man to reliably control a robotic arm.
He could grasp, move, and release objects simply by imagining himself performing those actions. The device, known as a brain-computer interface (BCI), achieved a record-breaking feat, working for seven months without needing recalibration.
Until now, such devices have worked for only a day or two.
This BCI relies on an AI model that adapts to the small changes in brain activity that occur as a person repeatedly imagines a movement, gradually improving its accuracy.
“This fusion of learning between humans and AI is the next step in these brain-computer interfaces,” says Professor Karunesh Ganguly, a neurologist at the UCSF Weill Institute for Neurosciences. “It's what you need to achieve sophisticated, real-world function.”
The study, funded by the US National Institutes of Health, was published in the journal Cell on March 6th.
The study participant, who lost the ability to move and speak after a stroke several years ago, was able to control the robotic arm by imagining specific movements.
A key breakthrough was understanding how brain activity changed from day to day as the participant repeatedly imagined these movements.
Once the AI model was trained to account for these changes, it maintained its performance for months at a time.
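To give a rough sense of how a model can keep working despite drifting signals, here is a minimal sketch, assuming simulated data and purely illustrative names; it is not the UCSF team's actual model. A linear decoder is trained once on a reference day, and a lightweight alignment step then maps each later day's slightly shifted neural features back into the space the decoder expects:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_decoder(neural, intent):
    """Fit a linear map W from neural features to intended movement,
    by least squares: intent ≈ neural @ W."""
    W, *_ = np.linalg.lstsq(neural, intent, rcond=None)
    return W

def daily_alignment(todays_neural, reference_neural):
    """Estimate a transform A that maps today's drifted activity back
    to the reference-day feature space: reference ≈ today @ A."""
    A, *_ = np.linalg.lstsq(todays_neural, reference_neural, rcond=None)
    return A

# Simulated example: a stable decoder plus small day-to-day drift.
n_trials, n_features = 200, 32
day0 = rng.normal(size=(n_trials, n_features))   # reference-day activity
true_map = rng.normal(size=(n_features, 2))      # neural -> 2-D intent
intent = day0 @ true_map

W = train_decoder(day0, intent)

# A later day: the same underlying patterns, slightly shifted.
drift = np.eye(n_features) + 0.1 * rng.normal(size=(n_features, n_features))
day_k = day0 @ drift

A = daily_alignment(day_k, day0)   # the brief "tune-up" step
err_raw = np.mean((day_k @ W - intent) ** 2)
err_aligned = np.mean((day_k @ A @ W - intent) ** 2)
print(f"decoding error without alignment: {err_raw:.4f}")
print(f"decoding error with daily alignment: {err_aligned:.4f}")
```

In this toy setup the decoder itself never retrains; only the mapping into its input space is refreshed, which is the role played by the brief recalibration described later in the article.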
Professor Ganguly previously studied brain activity patterns in animals and observed that these patterns evolved as the animals learned new movements.
He suspected that the same process takes place in humans, which would explain why previous BCIs quickly lost their ability to interpret brain signals.
Ganguly and neurology researcher Nikhilesh Natraj worked with a participant who had been paralyzed by a stroke and was unable to move or speak.
Small sensors implanted on the surface of the participant's brain detected neural activity when he imagined moving.
To investigate whether these brain patterns change over time, the participant was asked to imagine moving different body parts, such as his hands, feet, and head.
Although he could not physically move, his brain still generated signals corresponding to these imagined movements.
The BCI recorded these signals, and the researchers found that although the overall patterns remained the same, their exact locations in the brain shifted slightly from day to day.
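The flavor of that finding can be sketched with simulated data; the movement labels and numbers below are placeholders, not the study's recordings. Each day's pattern for a given imagined movement keeps its geometry relative to the other movements, while its absolute location drifts:

```python
import numpy as np

rng = np.random.default_rng(1)
n_features = 16

# Base activity patterns for two imagined movements.
base = {"hand": rng.normal(size=n_features),
        "foot": rng.normal(size=n_features)}

def record_day(day):
    """Simulate one day's mean activity: same patterns, shifted location."""
    shift = 0.05 * day * rng.normal(size=n_features)   # slow daily drift
    return {m: p + shift + 0.01 * rng.normal(size=n_features)
            for m, p in base.items()}

day1, day10 = record_day(1), record_day(10)

loc_shift = np.linalg.norm(day10["hand"] - day1["hand"])
sep_day1 = np.linalg.norm(day1["hand"] - day1["foot"])
sep_day10 = np.linalg.norm(day10["hand"] - day10["foot"])

print(f"location shift of 'hand' pattern, day 1 -> day 10: {loc_shift:.3f}")
print(f"hand-foot separation, day 1:  {sep_day1:.3f}")
print(f"hand-foot separation, day 10: {sep_day10:.3f}")  # nearly unchanged
```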
The researchers then asked the participant to imagine simple finger, hand, and thumb movements over two weeks while the AI system learned to interpret his brain activity. At first, the robotic arm's movements were imprecise.
To improve accuracy, the participant practiced with a virtual robotic arm that provided feedback on how closely his imagined movements matched the intended action.
Eventually he was able to get the virtual arm to perform the desired tasks, and once he began practicing with the real robotic arm, it took only a few sessions to transfer his skills to the real world. He could use the robotic arm to pick up blocks and move them to new locations.
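That feedback-driven practice can be sketched as a simple closed loop, assuming a linear decoder and simulated neural activity; none of the names or numbers here come from the study. On each trial the virtual arm's decoded movement is compared with the target, and the error serves both as feedback to the user and as the signal that nudges the decoder:

```python
import numpy as np

rng = np.random.default_rng(2)
n_features = 32

# Fixed, unknown "encoding": how imagining a 2-D movement shows up
# in neural features (an illustrative stand-in for real activity).
E = rng.normal(size=(2, n_features)) / np.sqrt(n_features)

W = np.zeros((n_features, 2))   # decoder starts untrained
lr = 0.1                        # step size for decoder updates

for trial in range(301):
    target = rng.normal(size=2)        # intended virtual-arm movement
    neural = target @ E                # activity from imagining it
    decoded = neural @ W               # the virtual arm's movement
    error = decoded - target           # feedback: how far the arm missed
    W -= lr * np.outer(neural, error)  # nudge the decoder toward the intent
    if trial % 100 == 0:
        print(f"trial {trial:3d}: arm error = {np.linalg.norm(error):.3f}")
```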
He even managed to open a cabinet, retrieve a cup, and hold it under a water dispenser. Months later, he could still control the robotic arm after just a brief, 15-minute “adjustment” that accounted for how his brain activity had changed over time.
Ganguly and his team are now working to refine the AI model so the robotic arm moves faster and more smoothly, and they plan to test the system in a home environment. For people with paralysis, the ability to perform simple tasks like feeding themselves or getting a drink of water could be life-changing.
“We've learned how to build the system now, and I'm confident we can make this work,” Ganguly said.