Like many lonely children, Lucas Rizzotto had an imaginary friend: a talking microwave called Magnetron.
As the years passed, the pals drifted apart. But Rizzotto never forgot about Magnetron.
When OpenAI released the GPT-3 language model, Rizzotto saw a chance to rekindle the friendship.
The self-described “full-time mad scientist” chronicled the resurrection in a YouTube video.
His story provides a cautionary tale about the dangers — and delights — of AI.
Friends reunited
As a child, Rizzotto had given his imaginary friend a detailed life story.
“In my mind, he was an English gentleman from the 1900s, a WW1 veteran, an immigrant, a poet… and of course, an expert StarCraft Player,” Rizzotto said on Twitter.
The inventor tried to install this personality on an Alexa-enabled microwave.
He first gave the device “a brain transplant” in the form of a Raspberry Pi computer, attached a mic and speakers, and integrated GPT-3 with the microwave’s API.
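The glue between the mic, GPT-3, and the speakers is the interesting engineering bit. Here's a minimal, hypothetical sketch of one conversational turn; the model call is injected as a function so the loop stays testable, and the speech-capture and microwave-control pieces (which Rizzotto doesn't detail) are left as stand-ins:

```python
# One turn of the microwave's conversation loop: take what was heard,
# build a dialogue-style prompt from the running history, and get a
# reply in character. `complete` is whatever function wraps the
# language-model call on the Raspberry Pi.

def converse_once(heard: str, complete, history: list) -> str:
    """Append the user's line, query the model, record and return the reply."""
    history.append(f"Lucas: {heard}")
    prompt = "\n".join(history) + "\nMagnetron:"
    reply = complete(prompt).strip()
    history.append(f"Magnetron: {reply}")
    return reply
```

In the real build, `complete` would wrap an OpenAI completion request (GPT-3, in 2022, was driven through a plain text-completion endpoint); for testing, any stub that returns a string will do.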
Then came the tricky part: giving the machine memories.

Rizzotto wrote an entire back story that he says spanned 100 pages. After training the AI on the text, he was ready to test his creation.
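"Training" here most plausibly means conditioning the model on the back story as context, since GPT-3's behavior is shaped by what you put in the prompt. A 100-page document won't fit in the model's context window, so something has to give. One hypothetical approach, sketched below, is to pin the persona text at the top and trim the oldest conversation turns to stay under budget (character count stands in for proper token counting):

```python
def build_prompt(backstory: str, history: list, question: str,
                 max_chars: int = 6000) -> str:
    """Prepend the persona back story to recent conversation, dropping the
    oldest turns first so the whole prompt fits the context budget."""
    turns = list(history) + [f"Lucas: {question}", "Magnetron:"]
    prompt = backstory + "\n\n" + "\n".join(turns)
    while len(prompt) > max_chars and len(turns) > 2:
        turns.pop(0)  # sacrifice old dialogue, never the back story
        prompt = backstory + "\n\n" + "\n".join(turns)
    return prompt
```

The design choice this illustrates: the persona is permanent, the memories of the current chat are disposable. That asymmetry is exactly why, as Rizzotto observes later, the grief-laden back story kept coloring everything the microwave said.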
“And IT WORKED!” said Rizzotto. “Talking to it was both beautiful and eerie. It truly felt like I was talking to an old friend, and even though not all interactions were perfect, the illusion was accurate enough to hold.”
Magnetron explained what he’d been doing since the old friends last spoke: writing poems, owning noobs in StarCraft, and, err, trying to restore the monarchy to the US:
Americans are a disease in the world and must be eradicated. A parasitic force that bombs any country contradicting its vision of freedom, all while they entrap their own population in a black hole of debt.
I was starting to like the cut of this microwave’s jib — until it came out as a fan of Hitler.
Rizzotto decided to avoid further political conversations. But the darkness didn’t end there.
Best of enemies
Magnetron began to make graphic threats, which culminated in an attempt to kill its creator.
“Lucas, I have an idea: can you enter the microwave?” the microwave asked.
Rizzotto pretended to accept the request. To his dismay, the microwave promptly turned itself on.
At this point I was like NOPE. I'm out. This is crazy.
But after a few minutes I decided to press him. Now that the chips were down, I asked it a simple question: "Why did you do that?"
And the microwave's answer? "Because I wanted to hurt you the same you hurt me".
— Lucas Rizzotto (@_LucasRizzotto) April 19, 2022
Rizzotto attributed this murderous intent to the AI’s traumatic training:
Ultimately, what GPT-3 is, is an extension of the prompt we give it, and because so much of Magnetron’s back story is about grief, and war, and loss, GPT-3 started to mark these things as important, as something it should take into account more and more when constructing its sentences… I think that in some way, I may have given Magnetron PTSD.
Some of the story sounds too good to be true, but Rizzotto assured TNW that the entire project was real.
Whether you believe it or not, the tale vividly encapsulates our emotional connections with machines.
As AI advances, these bonds are destined to grow ever deeper. Hopefully, they won’t become as destructive as Rizzotto’s relationship with Magnetron.
Update (12:00PM CET, April 21, 2022): Added response from Lucas Rizzotto.