
This article was published on June 1, 2021


This AI robot mimics human expressions to build trust with users

The project aims to improve human-robot interactions

Story by Thomas Macaulay

Writer at Neural by TNW — Thomas covers AI in all its iterations. Likes Werner Herzog films and Arsenal FC.

Scientists at Columbia University have developed a robot that mimics the facial expressions of humans to gain their trust.

Named Eva, the droid uses deep learning to analyze human facial gestures captured by a camera. Cables and motors then pull on different points of the robot’s soft skin to mimic the expressions of nearby people in real time.

The effect is pretty creepy, but the researchers say that giving androids this ability can facilitate more natural and engaging human-robot interactions.

Eva produces different expressions by combining one or more of six basic emotions: anger, disgust, fear, joy, sadness, and surprise. Per the study paper:

For example, while joy would correspond to one facial expression, the combination of joy and surprise would result in happily surprised, which would correspond to a separate facial expression.
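The combination idea can be sketched as a weighted blend of basic-emotion activation vectors. This is a hypothetical illustration only — the emotion names come from the article, but the vectors, their values, and the blending function are invented for the example and are not the researchers' actual control code:

```python
# Hypothetical sketch: each basic emotion maps to a vector of actuator
# activations (values here are made up for illustration).
BASIC_EMOTIONS = {
    "anger":    [0.1, 0.9, 0.3, 0.0],
    "disgust":  [0.0, 0.6, 0.7, 0.1],
    "fear":     [0.4, 0.3, 0.9, 0.2],
    "joy":      [0.8, 0.2, 0.0, 0.6],
    "sadness":  [0.2, 0.1, 0.4, 0.9],
    "surprise": [0.2, 0.8, 1.0, 0.0],
}

def blend(*emotions):
    """Average the activation vectors of the given basic emotions,
    yielding a compound expression such as 'happily surprised'."""
    vectors = [BASIC_EMOTIONS[e] for e in emotions]
    n = len(vectors)
    return [sum(vals) / n for vals in zip(*vectors)]

# A single emotion yields its own expression; joy + surprise yields
# a distinct compound expression.
happily_surprised = blend("joy", "surprise")
```

Under this toy scheme, "joy" alone and "joy plus surprise" produce different actuator patterns, mirroring the paper's point that the combination corresponds to a separate facial expression.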


The team trained the robot to generate these expressions by filming it making a series of random faces. Eva’s neural networks then learned to match the humanoid’s gestures to those of human faces captured on its video camera.
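This self-supervised setup — babble random motor commands, record the resulting faces, then learn the inverse mapping from face to motors — can be sketched as follows. Everything here is an assumption for illustration: the "camera" is simulated, and a nearest-neighbour lookup stands in for the deep networks described in the paper:

```python
import random

def make_random_face(motors):
    # Stand-in for the camera: here the observed "face" is just a
    # noisy copy of the motor command that produced it.
    return [m + random.uniform(-0.01, 0.01) for m in motors]

# 1. Motor babbling: the robot makes random faces in front of a camera,
#    collecting (observed face, motor command) training pairs.
random.seed(0)
dataset = []
for _ in range(500):
    motors = [random.random() for _ in range(4)]
    dataset.append((make_random_face(motors), motors))

# 2. Inverse model: given an observed (human) face, find the motor
#    command whose self-generated face was most similar.
def imitate(observed_face):
    def distance(face):
        return sum((a - b) ** 2 for a, b in zip(face, observed_face))
    _, motors = min(dataset, key=lambda pair: distance(pair[0]))
    return motors

commands = imitate([0.5, 0.5, 0.5, 0.5])
```

The real system replaces the lookup with neural networks that generalize beyond the babbled examples, but the training signal is the same: the robot's own randomly generated faces, requiring no hand-labelled data.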

[Image: The robot mirrors human facial expressions captured by a camera. Credit: Creative Machines Lab/Columbia Engineering]

[Image: The robot first practiced different facial expressions in front of a camera.]

While there are already numerous facially expressive humanoids in existence, the team believes that Eva is the only one that’s open-source.

It’s also relatively inexpensive to manufacture and assemble. The researchers say the combination of moderate costs and open-source design could make Eva an accessible and customizable platform for emotional AI research.

Ultimately, they envision droids that respond to our body language improving human-robot communication in a wide range of settings, from nursing homes to factories.
