The rise of the telepresence robot — a device that looks like a monitor attached to a Segway roaming the halls of your office — has been met with bemusement, to say the least. But a team of researchers from Switzerland and Italy has figured out a way to use the device to help people with motor disabilities navigate remote places with their brains.
According to MIT Technology Review, the user navigates the robot via a non-invasive helmet that reads EEG signals. When the user imagines moving their feet or hands, the robot moves in the corresponding direction, within certain limits that account for trajectory and acceleration.
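To make that control loop concrete, here is a rough Python sketch of how a system like this might translate classified motor-imagery labels into bounded robot movement commands. The class labels, confidence threshold, and speed limits below are illustrative assumptions, not details from the researchers' actual system.

```python
# A minimal sketch (not the researchers' implementation) of mapping classified
# motor-imagery labels to bounded robot velocity commands. All labels,
# thresholds, and limits are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class VelocityCommand:
    linear: float   # forward speed, m/s
    angular: float  # turning rate, rad/s

# Illustrative limits standing in for the "certain limits to account for
# trajectory and acceleration" described in the article.
MAX_LINEAR = 0.3    # m/s
MAX_ANGULAR = 0.5   # rad/s
MAX_ACCEL = 0.1     # max change in forward speed per update step

def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def command_from_imagery(label: str, confidence: float,
                         previous: VelocityCommand) -> VelocityCommand:
    """Map a classified motor-imagery label ('feet', 'left_hand',
    'right_hand', or 'rest') to a velocity command, smoothing changes
    so the robot's trajectory stays within acceleration limits."""
    if confidence < 0.6:          # ignore low-confidence classifications
        label = "rest"

    targets = {
        "feet":       VelocityCommand(MAX_LINEAR, 0.0),   # imagine feet -> go forward
        "left_hand":  VelocityCommand(0.0,  MAX_ANGULAR), # imagine left hand -> turn left
        "right_hand": VelocityCommand(0.0, -MAX_ANGULAR), # imagine right hand -> turn right
        "rest":       VelocityCommand(0.0, 0.0),
    }
    target = targets.get(label, targets["rest"])

    # Limit how quickly the forward speed can change between updates.
    delta = clamp(target.linear - previous.linear, -MAX_ACCEL, MAX_ACCEL)
    linear = clamp(previous.linear + delta, 0.0, MAX_LINEAR)
    angular = clamp(target.angular, -MAX_ANGULAR, MAX_ANGULAR)
    return VelocityCommand(linear, angular)

if __name__ == "__main__":
    cmd = VelocityCommand(0.0, 0.0)
    # Simulated classifier output: the user keeps imagining foot movement.
    for _ in range(4):
        cmd = command_from_imagery("feet", confidence=0.8, previous=cmd)
        print(f"linear={cmd.linear:.2f} m/s, angular={cmd.angular:.2f} rad/s")
```

The design point this sketch highlights is shared control: the brain-computer interface supplies coarse intentions, while the robot enforces speed and acceleration limits so imperfect classifications don't produce erratic motion.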
When tested with users who still have motor function and users with serious motor disabilities, both groups performed equally well. In fact, the users with motor function found it slightly easier to control the robot with their brains than to input controls manually.
This technology could be a big breakthrough for people with many kinds of brain and nerve impairments, as well as spinal cord injuries. People with very limited mobility often miss out on everyday experiences that people without disabilities take for granted — a non-intrusive way for them to get around creates a greater opportunity to integrate their lives more meaningfully into society.
It's still a long way from manufacturing, but the strides science is making with these technologies will likely show even more promise in the years to come.
➤ Telepresence Robot for the Disabled Takes Directions from Brain Signals [MIT Technology Review]