A pair of researchers from Columbia University recently built a self-replicating AI system. Instead of painstakingly creating the layers of a neural network and guiding its development as it becomes more advanced, they’ve automated the process.
The researchers, Oscar Chang and Hod Lipson, published their fascinating paper titled “Neural Network Quine” earlier this month, and with it a novel method for “growing” a neural network.
Chang told The Register about the team’s reasoning behind creating an AI that evolves itself:
The primary motivation here is that AI agents are powered by deep learning, and a self-replication mechanism allows for Darwinian natural selection to occur, so a population of AI agents can improve themselves simply through natural selection – just like in nature – if there was a self-replication mechanism for neural networks.
The method Lipson and Chang use applies natural selection techniques by leaning on one of AI’s greatest strengths: predicting patterns.
A neural network compares data across various layers to determine what is similar or dissimilar, thus forming patterns. Various components of the network, called agents, perform specific tasks – such as finding all the images of cats in a stack of six million images or attempting to replicate a human style of art.
These networks can improve in myriad ways, including having two agents argue with each other, or approaching different aspects of a task with individual agents and then combining the “knowledge” each gathers. Over successive iterations, such neural networks get better.
But with the quine system that Lipson and Chang have created, the neural network improves through “evolving” new versions of agents within itself by predicting what they will look like in the future after they’ve learned new information.
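To make the quine idea concrete, here is a minimal toy sketch, loosely inspired by the “vanilla quine” described in the paper: a network whose output, given the one-hot encoding of a coordinate, approximates the value of its own weight at that coordinate. This is a hypothetical simplification for illustration (a fixed random projection, a single trainable weight vector, and plain gradient descent), not the authors’ exact architecture or training scheme.

```python
import numpy as np

# Toy "self-replicating" network sketch (illustrative assumption, not the
# paper's exact setup). The only trainable parameters are the vector w of
# length N. Given coordinate i (as a one-hot vector), the network predicts
# its own i-th weight as pred(i) = w . h_i, where h_i is a fixed random
# nonlinear projection of the one-hot input.

rng = np.random.default_rng(0)
N = 32                                    # trainable weights = coordinates
H = rng.normal(size=(N, N)) / np.sqrt(N)  # fixed random projection
h = np.tanh(H)                            # h[:, i] encodes coordinate i

w = rng.normal(size=N) * 0.1              # trainable weight vector

def self_replication_loss(w):
    """Mean squared gap between the network's predictions and its own weights."""
    pred = w @ h                          # pred[i] = w . h[:, i]
    return np.mean((pred - w) ** 2)

initial = self_replication_loss(w)
lr = 0.5
for _ in range(2000):
    err = (w @ h) - w                     # prediction error per coordinate
    grad = (h @ err - err) * (2.0 / N)    # gradient of the mean squared loss
    w -= lr * grad                        # nudge weights toward self-consistency

final = self_replication_loss(w)
print(f"self-replication loss before: {initial:.5f}, after: {final:.5f}")
```

Training here drives the network toward describing itself more accurately; the paper’s actual quines juggle this self-replication objective alongside a real task (image classification), which is where the trade-off discussed below comes from.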
Eventually a neural network that can predict its own growth could lead to an AI system that’s resilient to efforts to delete or scale it back. Theoretically, humans could try to remove specific components or delete the entire program, but tiny snippets of code, perhaps stored in a secure cloud, could bring entire systems back online near-instantaneously.
There’s no need to worry just yet, however. Humans are still firmly in control. In fact, the researchers on the Neural Network Quine paper ran into the same power consumption problem with their digitally evolving AI that exists in nature.
Because it takes extra resources to self-replicate and create a better version of itself, an AI that does so is less successful at accomplishing tasks than one built with more traditional neural network methods.
Where traditional networks can reach near 100 percent accuracy at tasks such as image recognition after only a nominal number of “evolutions,” or iterations spent trying a particular task, the self-replicating model is at least ten percent worse after the same number of tries.
Even the researchers seem to find this aspect a bit mysterious. As Chang told The Register:
It’s not entirely clear why this is so. But we note that this is similar to the trade-off made between reproduction and other tasks in nature. For example, our hormones help us to adapt to our environment and in times of food scarcity, our sex drive is down-regulated to prioritize survival over reproduction.
This means there’s a lot of work to be done before this kind of neural network can perform as well as the type we’re used to.
It’s probably too early to understand the complete implications of this new method. We’ve seen something similar in Google’s DeepMind division teaching its AI to create better algorithms than people can. But this is the first time a neural network designed with another purpose (in this case image recognition) has been built with a self-replication mechanism baked in.
The research is still in the early stages, but future iterations will include neural networks that can use the same self-replication techniques to recreate other neural networks.
In the future AI will create itself, advance itself, and integrate new neural networks through a natural selection process. What’s the worst that could happen?