Do robots go to hell? An AI developer recently created a remix of the Holy Bible (King James Version) by training a text generator on BDSM stories. The results were decidedly not safe for work.
The new tome, dubbed “The Orange Erotic Bible,” comes in at a whopping 64,167 words – enough to qualify as a full-fledged novel, though paltry next to the Bible’s nearly 800,000. Its developer – so far we haven’t been able to identify the person(s) responsible – generated the book for NaNoGenMo (National Novel Generation Month), an AI novel-writing contest held annually since 2013.
According to the project’s GitHub page, the person or team responsible for The Orange Erotic Bible set out to answer a simple question: “If gpt-2 read erotica, what would be its take on the Holy scriptures?”
The answer: pretty much the same thing you always get out of GPT-2, that being a lot of gibberish and the occasional poignant phrase. GPT-2 is a controversial text generator developed by OpenAI, a non-profit whose original founders include Elon Musk. It’s a scary example of where AI research is going, but it’s not exactly capable of human-level communication.
The AI doesn’t know what it’s saying, and there’s no thought process or reasoning behind the words it spits out. It just tries to match the style and tone of whatever prompt it’s given.
So, if you train it on BDSM literature and prompt it with lines from the Bible, you get a lot of exactly what you’d expect. You also get the word “cock” 29 times (beating the KJV’s mere 13 mentions) and the word “fuck” four times; the latter never appears in the King James Bible.
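Counts like these are easy enough to verify. Here is a quick sketch of how you might do it in Python, assuming you have plain-text copies of both books; the filenames below are just placeholders.

```python
# Quick word-frequency check over plain-text copies of the two books.
# The filenames are placeholders; point them at your own copies.
import re
from collections import Counter

def word_counts(path):
    with open(path, encoding="utf-8") as f:
        # Lowercase the text and split on anything that isn't a letter or apostrophe.
        return Counter(re.findall(r"[a-z']+", f.read().lower()))

for title, path in [("The Orange Erotic Bible", "orange_erotic_bible.txt"),
                    ("King James Bible", "kjv.txt")]:
    counts = word_counts(path)
    print(title, {word: counts[word] for word in ("cock", "fuck")})
```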
Here’s a fun thread on Twitter featuring some other gems:
Do you want to go to hell? Because this is how you get a whole layer of special hell designed SPECIFICALLY FOR YOU. pic.twitter.com/QPGuUDN704
— foone (@Foone) April 6, 2020
To get GPT-2 to spit out nearly 65,000 words, the developer(s) first needed a suitably sinful dataset to train on. According to the project page:
[The developer(s)] fine-tuned a 117M GPT-2 model on a BDSM dataset scraped from Literotica.com. Then used conditional generation with sliding window prompts from The Bible, King James Version.
I.e: For each generation iteration, the model was asked to continue chunks of 15 lines of the Bible with 8 sentences. The aim was to keep the topics close to the Bible’s, whilst giving enough space for the model to add some spice to it based on its erotica studies.
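The project page doesn’t publish the generation script itself, but the recipe it describes is straightforward to approximate. Below is a minimal sketch (not the developer’s actual code) using the Hugging Face transformers library. It assumes a 117M GPT-2 model has already been fine-tuned on the scraped erotica and saved locally, that the King James text sits in a plain-text file with one verse per line, and that “8 sentences” can be approximated with a token budget; the file paths, window step, and sampling settings are all guesses.

```python
# Minimal sketch of the sliding-window conditional generation described above.
# Assumptions (not from the project): a fine-tuned 117M GPT-2 checkpoint saved
# to ./gpt2-erotica, the KJV in kjv.txt (one verse per line), and a token
# budget standing in for "8 sentences". Requires: pip install transformers torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

MODEL_DIR = "./gpt2-erotica"   # hypothetical path to the fine-tuned model
WINDOW = 15                    # lines of the Bible per prompt, per the write-up
STEP = 15                      # how far the window slides each iteration (a guess)

tokenizer = GPT2Tokenizer.from_pretrained(MODEL_DIR)
model = GPT2LMHeadModel.from_pretrained(MODEL_DIR)

with open("kjv.txt", encoding="utf-8") as f:
    verses = [line.strip() for line in f if line.strip()]

book = []
for start in range(0, len(verses) - WINDOW, STEP):
    prompt = " ".join(verses[start:start + WINDOW])
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=512)

    # Ask the fine-tuned model to continue the Bible chunk; sampling keeps the
    # output varied, and max_new_tokens roughly caps it at a handful of sentences.
    output = model.generate(
        **inputs,
        max_new_tokens=200,
        do_sample=True,
        top_k=40,
        temperature=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )

    # Keep only the newly generated continuation, not the prompt itself.
    continuation = tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    book.append(continuation)

print("\n\n".join(book))
```

Run over the whole KJV this takes a while on a CPU; slicing `verses` down to a few hundred lines gives a quick taste of the same effect.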
And it does add some spice. As you can imagine (or read for yourself here), there are plenty of scandalous and dirty passages in the book. But there are also some cogent religious takes. For example, Jesus is only mentioned once, and he doesn’t seem to be a big deal at all.
This may be due to the fact that, according to GPT-2, the real Christian messiah is Ham:
Hagar was small: 5 foot 4 and so pretty, with broad shoulders and lovely hair. She was also muscled. And she was a virgin.
She had given birth to twins while riding in the army: one boy and one girl.
And Japheth was a window-washer; a good cook.
And Shem, the eldest, was a carpenter; a good driver.
And Ham, the second, was a messiah; a religious leader and s the founder of a new religion, Christianity.
Maybe this isn’t the best use of AI, but it’s certainly interesting. And considering a former Google and Uber engineer started a church that worships AI, this isn’t even the strangest religion/technology mashup we’ve seen in recent years. But if you subscribe to the belief that systems like GPT-2 are only going to get better and better… you have to ask yourself how long it will be before these “religious books” start becoming convincing alternatives to the existing canon.
Hail the muscled-virgin Hagar, full of Army experience is she. May Ham watch over you all.