
This article was published on August 17, 2018

Developing bionics: How IBM is adapting mind-control for accessibility


What if there were a way to give everyone living with conditions like paralysis or locked-in syndrome the means to operate prosthetic devices and tech gadgets using mind control? Well, there is – or at least, there will be.

IBM Research recently developed an end-to-end proof-of-concept for a method of controlling an off-the-shelf robotic arm with a brain-computer interface built using a take-home EEG monitor. To accomplish this, the researchers developed AI to interpret the data from the EEG monitor as commands for the robotic arm.
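IBM hasn't published the details of its model, but the basic idea – turning windows of EEG data into discrete commands for a robotic arm – can be sketched with a toy classifier. Everything below is a hypothetical stand-in: the feature vectors, the command set, and the nearest-centroid approach are illustrative assumptions, not IBM's actual pipeline.

```python
import random

# Hypothetical sketch of an EEG-to-command pipeline. IBM's real system is not
# public; here, a nearest-centroid classifier stands in for the AI that
# interprets EEG data as robotic-arm commands.

COMMANDS = ["grip", "release", "raise", "lower"]

# Pretend a calibration session produced one average feature vector
# (centroid) per command, e.g. from per-channel band-power features.
CENTROIDS = {
    "grip":    [0.9, 0.1, 0.1],
    "release": [0.1, 0.9, 0.1],
    "raise":   [0.1, 0.1, 0.9],
    "lower":   [0.4, 0.4, 0.4],
}

def classify(features):
    """Return the command whose centroid is closest to the feature vector."""
    def sq_dist(cmd):
        return sum((f - c) ** 2 for f, c in zip(features, CENTROIDS[cmd]))
    return min(COMMANDS, key=sq_dist)

def eeg_window():
    """Stand-in for one window of features from a take-home EEG headset."""
    return [random.random() for _ in range(3)]

if __name__ == "__main__":
    # Stream a few windows and print the command the arm would receive.
    for _ in range(3):
        features = eeg_window()
        print(f"features={features} -> send '{classify(features)}' to arm")
```

A real system would add signal filtering, per-user calibration, and a far stronger model, but the shape of the loop – acquire window, classify, dispatch command – is the same.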

That may not sound like something that will change everything overnight – and IBM is neither the first nor the only company to dabble in brain-computer interfaces. But it's one of the few that seem interested in building a system from inexpensive hardware that's already available.

We reached out to Stefan Harrer, a research scientist at IBM Research working on the project. He told TNW:

Our primary design goals were (i) low-cost and (ii) suitable for use in an unrestricted real-life environment. (i) allows the system to transition from an expensive research grade exploratory setup (the status-quo of BMIs) to a setup that is affordable for the broad public (the first of our main objectives) – (ii) allows the system to be taken out of highly specialized research lab environments and moved into everyday environments for use by the broad public (the second of our main objectives).

This early work indicates people can control machines with their minds alone, using commonly available technology and cutting-edge AI. That’s huge for those who don’t have that same control over their own bodies.

Harrer told us that, with further development, the same machine learning techniques could potentially be applied to control a prosthetic limb or even a robot assistant.

IBM’s system isn’t ready for prime time just yet, though. Harrer says the team is working on reducing latency and has no current plans for human trials. But the proof of concept suggests it’s only a matter of time before devices built on this technology become a common accessibility solution.

For more information, visit IBM’s blog.
