‘Nightmare Machine’ is an AI project that uses our own fears against us


It’s safe to say that a paranoid subset of our readers fears a future shaped by AI. Those people should look away now.

Artificial intelligence already scares people. It makes sense; we have entire generations raised on ‘The Terminator,’ in which ‘Skynet’ is hell-bent on destroying the world, or ‘The Matrix,’ in which humans serve as the life force powering an AI program.

And now, we have ‘Nightmare Machine.’

The project, a partnership between researchers in the US and Australia, aims to explore the fear caused by intelligent machines and gain a better understanding of their capabilities. By confronting the common fear, the group hopes to find out if a machine is capable of understanding and visualizing what scares us. Seriously, what could go wrong?

“We know that AI terrifies us in the abstract sense,” co-creator Pinar Yanardag, a postdoctoral researcher at MIT Media Lab in Massachusetts, told Live Science. “But can AI scare us in the immediate, visceral sense?”

Nightmare Machine uses a deep learning algorithm to identify what makes an image unsettling. It then uses that data to transform innocuous photos into something you might see in a good horror flick. Landscapes feature additional texture, melting edges, ominous lighting, and other abstract effects. Faces, however, are where Nightmare Machine truly shines. Hollow eyes, zombie-like flesh, added shadows, and the occasional hint of blood add a creepy factor that certainly proves the AI is on the right track.
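For the technically curious: transformations like these are typically built on neural style transfer, which re-renders one image in the visual “texture” of another by matching statistics of convolutional feature maps. Assuming that approach (the project hasn’t published its code, so this is illustrative, not Nightmare Machine’s actual implementation), the core of the technique is a style loss comparing Gram matrices. A minimal NumPy sketch:

```python
import numpy as np

def gram_matrix(features):
    """Channel-by-channel correlations of a (C, H, W) feature map.

    In real style transfer, `features` would come from a pretrained
    convolutional network; here the shapes are illustrative assumptions.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(generated, style):
    """Mean squared difference between the two images' Gram matrices.

    Minimizing this (by gradient descent on the generated image's
    pixels) pushes its textures toward the style image's -- e.g. toward
    melting edges and zombie-like flesh.
    """
    diff = gram_matrix(generated) - gram_matrix(style)
    return float(np.mean(diff ** 2))

# Toy usage with random "feature maps": identical inputs give zero loss,
# different inputs give a positive loss to minimize.
rng = np.random.default_rng(0)
feats = rng.random((8, 16, 16))
other = rng.random((8, 16, 16))
print(style_loss(feats, feats))  # 0.0
print(style_loss(feats, other) > 0)  # True
```

The Gram matrix discards spatial layout and keeps only which features co-occur, which is why the result reads as the original scene wearing a horror-movie texture rather than a different scene entirely.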

Granted, this one isn’t going to take over the world or grow humans like crops, but an AI created solely for the purpose of scaring people definitely has the makings of a good Sci-Fi/horror film. Maybe Nightmare Machine can help create it.

via Live Science

Nightmare Machine on MIT
