This article was published on December 22, 2020

Wearable that detects hand gestures could one day control prosthetics and computers

The device uses a hyperdimensional computing algorithm to update itself with new information


UC Berkeley researchers have developed a gesture-detecting wearable that they believe could be used to control prosthetics and electronic devices.

The device uses a combination of biosensors and AI software to identify the hand gestures a person intends to make by analyzing electrical signals from their arm.

It’s far from the first gesture recognition system designed for human-computer interaction (HCI), but the new system offers some unique benefits.

Most notably, it uses a neuro-inspired hyperdimensional computing algorithm to update itself as it receives new information, such as changes to electrical signals when an arm gets sweaty.

“In gesture recognition, your signals are going to change over time, and that can affect the performance of your model,” study coauthor Ali Moin explained in a statement. “We were able to greatly improve the classification accuracy by updating the model on the device.”
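Conceptually, that kind of on-device adaptation is simple in hyperdimensional computing: each gesture is represented by a high-dimensional "prototype" vector, new examples are compared against the prototypes, and the prototypes are nudged as fresh data arrives. The Python sketch below only illustrates that idea; the dimensionality, similarity measure, and update rule are assumptions, not the authors' implementation.

```python
import numpy as np

D = 10_000  # hypervector dimensionality (illustrative choice)

class HDPrototypeClassifier:
    """Minimal hyperdimensional classifier: one prototype vector per gesture."""

    def __init__(self, n_classes: int):
        self.prototypes = np.zeros((n_classes, D))

    def train(self, encoded: np.ndarray, label: int) -> None:
        # Bundle (accumulate) the encoded example into its class prototype.
        self.prototypes[label] += encoded

    def predict(self, encoded: np.ndarray) -> int:
        # Cosine similarity against every prototype; return the closest class.
        norms = np.linalg.norm(self.prototypes, axis=1) + 1e-9
        sims = (self.prototypes @ encoded) / (norms * (np.linalg.norm(encoded) + 1e-9))
        return int(np.argmax(sims))

    def update(self, encoded: np.ndarray, label: int) -> None:
        # On-device adaptation: fold a fresh example into its prototype, e.g.
        # when sweat or electrode drift has shifted the signal statistics.
        self.prototypes[label] += encoded
```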

The team screen-printed the biosensing system onto a thin sheet of PET, a polymer resin that’s typically used to produce synthetic fibers and plastic containers.

The researchers picked the material for their armband due to its flexibility, which allows it to conform to a forearm’s muscle movements.

The array comprises 64 electrodes, each of which detects electrical signals from a different point on the arm. This data is fed into an electrical chip, which uses the algorithm to associate the signals with specific hand gestures.
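One common way to turn multi-channel readings like these into something a hyperdimensional classifier can use is to bind each electrode’s identity with its quantized signal level and bundle the results into a single hypervector. The snippet below continues the sketch above; the feature choice (mean absolute value per channel) and the encoding scheme are assumptions for illustration, not the device’s actual pipeline.

```python
N_CHANNELS = 64   # one signal stream per electrode
N_LEVELS = 32     # quantization levels for per-channel amplitude

rng = np.random.default_rng(0)

# Fixed random bipolar hypervectors: an "ID" for each electrode and one
# vector for each quantized amplitude level.
channel_hv = rng.choice([-1, 1], size=(N_CHANNELS, D))
level_hv = rng.choice([-1, 1], size=(N_LEVELS, D))

def features(window: np.ndarray) -> np.ndarray:
    """Mean absolute value of each channel over a window, scaled to [0, 1]."""
    mav = np.abs(window).mean(axis=0)
    return mav / (mav.max() + 1e-9)

def encode(window: np.ndarray) -> np.ndarray:
    """Map a (samples x 64 channels) window of arm signals to one hypervector."""
    levels = np.clip((features(window) * N_LEVELS).astype(int), 0, N_LEVELS - 1)
    bound = channel_hv * level_hv[levels]  # bind each electrode ID with its level
    return np.sign(bound.sum(axis=0))      # bundle all 64 channels together
```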

[Read: MIT’s new wearable lets you control drones with Jedi-like arm gestures]

The team trained the algorithm by wrapping the armband around a user’s forearm and instructing them to perform each gesture. In testing, the system accurately classified 21 hand signals, including a fist, thumbs-up, and counting numbers.
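Continuing the sketch above, a training session of that kind might look roughly like this, with the wearer prompted to hold each gesture while labeled signal windows are recorded. The window size, example counts, and placeholder data source here are assumptions.

```python
N_GESTURES = 21

def collect_windows(gesture_id: int, n_windows: int) -> list:
    """Placeholder data source: a real session would record windows of the 64
    electrode signals while the user holds the prompted gesture."""
    return [rng.normal(size=(200, N_CHANNELS)) for _ in range(n_windows)]

clf = HDPrototypeClassifier(n_classes=N_GESTURES)

# Training: prompt the wearer to perform each gesture and bundle the examples.
for gesture in range(N_GESTURES):
    for window in collect_windows(gesture, n_windows=20):
        clf.train(encode(window), gesture)

# Inference, with optional on-device adaptation once a prediction is confirmed
# (for instance, after the signals have drifted because the arm got sweaty).
window = collect_windows(gesture_id=0, n_windows=1)[0]
encoded = encode(window)
predicted = clf.predict(encoded)
clf.update(encoded, predicted)
```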

A wearable future?

All of the computing is done locally on the chip, which speeds up the system and protects the user’s biological data.

Moin believes this combination of security and performance could turn the system into a viable commercial product:

Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers. Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.

You can read the study paper in the journal Nature Electronics.
