
So, let’s talk about this song a Google Brain machine composed

Researchers have been attempting to make robots and artificial intelligence more creative in recent months – from drawing pictures to writing quasi-dystopian poetry. Today we get another piece of work from a Google machine: a 90-second melody.

It’s the result of Google’s Project Magenta, which aims to use machine learning to create music and art, and to bridge the gap between those creative communities and the coders and researchers building the tools. Magenta is built on top of Google’s TensorFlow system, and you can find the open-sourced materials in its GitHub repository.
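If you’re curious what that looks like under the hood, here’s a minimal sketch of the general idea: a tiny next-note LSTM written against TensorFlow’s Keras API. To be clear, this isn’t Magenta’s actual code – the toy melody, model size and sampling loop are all invented for illustration – but it’s the same broad family of sequence model the project builds on: learn which note tends to come next, then keep asking.

# Illustrative sketch only – not Magenta's code. A tiny next-note LSTM
# using TensorFlow's Keras API; the toy melody, model size and sampling
# loop are invented here for demonstration.
import numpy as np
import tensorflow as tf

# Toy training "melody": MIDI pitch numbers for a C-major noodle.
melody = np.array([60, 62, 64, 65, 67, 69, 71, 72, 71, 69, 67, 65, 64, 62] * 20)
vocab = np.unique(melody)
note_to_id = {int(n): i for i, n in enumerate(vocab)}
ids = np.array([note_to_id[int(n)] for n in melody])

# Build (8-note window -> next note) training pairs.
seq_len = 8
X = np.stack([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
y = ids[seq_len:]

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=len(vocab), output_dim=16),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(len(vocab), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=30, verbose=0)

# Generate 16 new notes by repeatedly sampling the model's prediction.
window = list(ids[:seq_len])
generated = []
for _ in range(16):
    probs = model.predict(np.array([window[-seq_len:]]), verbose=0)[0]
    probs = probs / probs.sum()  # guard against float rounding
    next_id = int(np.random.choice(len(vocab), p=probs))
    generated.append(int(vocab[next_id]))
    window.append(next_id)

print("Generated MIDI pitches:", generated)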

The team says the challenge isn’t just to get Google machines to create art, but to have them tell stories with it. After all, that’s what artists do with their craft: weave a narrative into their work and share it with the world.

“The design of models that learn to construct long narrative arcs is important not only for music and art generation, but also areas like language modeling, where it remains a challenge to carry meaning even across a long paragraph, much less whole stories,” the team wrote. “Attention models like the Show, Attend and Tell point to one promising direction, but this remains a very challenging task.”

But enough about the tech – what does the song sound like? Well…


You tell us. Frankly, I’m reminded of walking into a Best Buy and seeing some kids discover electronic keyboards and their backing track buttons, or of an old-school Nokia ringtone, but it’s pretty impressive for a machine. Hell, it’s better than what I can do.

What do you think about the future of our robo-music?
