Scientists at Columbia University have developed a robot that mimics the facial expressions of humans to gain their trust.
Named Eva, the droid uses deep learning to analyze human facial gestures captured by a camera. Cables and motors then pull on different points of the robot’s soft skin to mimic the expressions of nearby people in real time.
The effect is pretty creepy, but the researchers say that giving androids this ability can facilitate more natural and engaging human-robot interactions.
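In rough terms, the mimicry described above is a loop: capture a frame, estimate the person’s expression with a neural network, then translate that expression into positions for the motors pulling on Eva’s skin. Here’s a minimal Python sketch of that loop; the `encoder` and `controller` objects (and their methods) are placeholders for illustration, not the team’s actual code.

```python
import cv2  # OpenCV, used here only to grab camera frames


def mimicry_loop(encoder, controller, camera_index=0):
    """Capture frames, estimate the person's expression, and drive the
    robot's skin motors toward the matching pose in real time."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # A deep network maps the face image to a compact
            # expression code (hypothetical encoder object).
            expression = encoder.encode(frame)
            # A second mapping turns that code into cable/motor positions
            # that pull on the soft skin (hypothetical controller object).
            motor_targets = controller.solve(expression)
            controller.apply(motor_targets)
    finally:
        cap.release()
```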
Eva produces its different expressions by combining one or more of six basic emotions: anger, disgust, fear, joy, sadness, and surprise. Per the study paper:
For example, while joy would correspond to one facial expression, the combination of joy and surprise would result in happily surprised, which would correspond to a separate facial expression.
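To give a sense of how that combination works, here’s a toy Python snippet (our illustration, not code from the paper) that blends weights over the six basic emotions into a single expression vector:

```python
import numpy as np

# The six basic emotions Eva works with, per the article.
BASIC = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]


def blend(weights):
    """Return a length-6 expression vector from named emotion weights."""
    vec = np.zeros(len(BASIC))
    for name, w in weights.items():
        vec[BASIC.index(name)] = w
    return vec


joy_only = blend({"joy": 1.0})                              # plain joy
happily_surprised = blend({"joy": 0.6, "surprise": 0.4})    # a compound expression
```

The exact weights are made up; the point is simply that a compound expression like “happily surprised” corresponds to a different point in the same emotion space.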
The team trained the robot to generate these expressions by filming it making a series of random faces. Eva’s neural networks then learned to match the humanoid’s gestures to those of human faces captured on its video camera.
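Conceptually, that training step amounts to recording pairs of motor commands and the resulting faces, then fitting a model that inverts the mapping: given a face, predict the motor positions that would produce it. Below is a hedged PyTorch sketch of that idea; the network size, landmark count, and motor count are assumptions, not details from the study.

```python
import torch
import torch.nn as nn

NUM_LANDMARKS = 68 * 2   # assumed: 68 two-dimensional facial landmarks
NUM_MOTORS = 12          # assumed number of skin-pulling motors

# Simple regression network from facial landmarks to motor positions.
inverse_model = nn.Sequential(
    nn.Linear(NUM_LANDMARKS, 256), nn.ReLU(),
    nn.Linear(256, NUM_MOTORS), nn.Sigmoid(),  # motor positions scaled to [0, 1]
)


def train(pairs, epochs=50, lr=1e-3):
    """pairs: list of (landmarks, motor_command) tensors recorded while the
    robot made random faces in front of its own camera."""
    opt = torch.optim.Adam(inverse_model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for landmarks, motors in pairs:
            opt.zero_grad()
            loss = loss_fn(inverse_model(landmarks), motors)
            loss.backward()
            opt.step()
```

Once a model like this is trained, feeding it the landmarks of a nearby person’s face yields motor commands that make the robot wear a similar expression.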
While there are already numerous facially expressive humanoids in existence, the team believes that Eva is the only one that’s open-source.
It’s also relatively inexpensive to manufacture and assemble. The researchers say the combination of moderate costs and open-source design could make Eva an accessible and customizable platform for emotional AI research.
Ultimately, they envision droids that respond to our body language improving human-robot communication in a wide range of settings, from nursing homes to factories.