This article was published on February 11, 2021

New AI technique allows robots to detect human touch by analyzing shadows

The method could make natural interactions with robots more accessible

Image by: Mue Ervive from Pexels
Story by Thomas Macaulay, Writer at Neural by TNW

Scientists from Cornell University have developed a way for robots to identify physical interactions just by analyzing a user’s shadows.

Their ShadowSense system uses an off-the-shelf USB camera to capture the shadows produced by hand gestures on a robot’s skin. Algorithms then classify the movements to infer the user’s specific interaction.
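The paper's actual model is not described here, but the pipeline, downsampled shadow frames fed to a gesture classifier, can be sketched with a toy nearest-centroid classifier. Everything below is illustrative: the gesture labels come from the study, while the synthetic frames, `make_frame` helper, and classifier are assumptions for demonstration, not the authors' implementation.

```python
# Toy sketch of the ShadowSense idea: classify grayscale "shadow"
# frames into gesture classes. Synthetic data + nearest-centroid
# classifier, purely for illustration.
import numpy as np

GESTURES = ["palm", "punch", "two_hands", "hug", "point", "no_touch"]

rng = np.random.default_rng(0)

def make_frame(label_idx, noise=0.05):
    """Synthetic 16x16 frame: each gesture darkens a distinct patch."""
    frame = np.ones((16, 16))                  # bright background
    row, col = label_idx % 3, label_idx // 3   # patch position per class
    frame[row * 5:(row + 1) * 5, col * 8:(col + 1) * 8] = 0.1  # shadow
    return frame + rng.normal(0, noise, frame.shape)

# "Training": average noisy frames into one centroid per gesture
centroids = {
    g: np.mean([make_frame(i) for _ in range(20)], axis=0)
    for i, g in enumerate(GESTURES)
}

def classify(frame):
    """Return the gesture whose centroid is closest in pixel space."""
    return min(centroids, key=lambda g: np.linalg.norm(frame - centroids[g]))

print(classify(make_frame(0)))  # a fresh "palm" frame
```

In the real system a camera under the robot's skin supplies the frames, and a learned model replaces the centroid comparison; the structure of the loop, capture, reduce, classify, is the same.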

Study lead author Guy Hoffman said the method provides a natural way of interacting with robots without relying on large and costly sensor arrays:

Touch is such an important mode of communication for most organisms, but it has been virtually absent from human-robot interaction. One of the reasons is that full-body touch used to require a massive number of sensors, and was therefore not practical to implement. This research offers a low-cost alternative.


The researchers tried out the system on an inflatable robot with a camera underneath its skin.


They trained and tested the classification algorithms with shadow images of six gestures: touching with a palm, punching, touching with two hands, hugging, pointing, and not touching.

It successfully distinguished between the gestures with 87.5% to 96% accuracy, depending on the lighting.

The system was most accurate in daylight (96%), followed by dusk (93%), and night (87%).
Credit: Hu et al.

The researchers envision mobile guide robots using the tech to respond to different gestures, such as turning to face a human when they detect a poke, and moving away when they sense a tap on the back.

It could also add interactive touch screens to inflatable robots and make home assistant droids more privacy-friendly.

“If the robot can only see you in the form of your shadow, it can detect what you’re doing without taking high fidelity images of your appearance,” said Hoffman. “That gives you a physical filter and protection, and provides psychological comfort.”

You can read the full study paper here.
