Deepfake technology has been deployed to create a new digital drag act that explores the social biases of artificial intelligence.
The Zizi Show invites audiences to choose a deepfake drag artiste and performance to watch. As the digital bodies move across the virtual stage, viewers can switch between the different AI identities to expose the methods used to generate them.
The project is the brainchild of artist Jake Elwes, who created the virtual cabaret by training a neural network on videos of 13 drag performers.
“There are drag kings, queens, biologically female drag queens, trans drag kings and queens, drag monsters,” he said. “We also made sure there was diversity in terms of race, gender, sexuality.”
Over successive training iterations, the neural network refined its outputs until it could produce realistic deepfakes of the performers.
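The article does not specify the architecture, but deepfake generators of this kind are typically trained adversarially: a generator network improves by trying to fool a discriminator that is simultaneously learning to tell real footage from fakes. A minimal numpy sketch of that loop, on toy one-dimensional data rather than video (every number and name here is illustrative, not Elwes's actual setup):

```python
# Toy adversarial training loop (numpy only): a 1-D stand-in for the
# video-scale deepfake training the article describes.
import numpy as np

rng = np.random.default_rng(42)
REAL_MEAN, REAL_STD = 4.0, 1.0   # the "real footage" distribution

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator g(z) = w*z + b; discriminator D(x) = sigmoid(c*x + d)
w, b = 1.0, 0.0
c, d = 0.0, 0.0
lr_g, lr_d, batch = 0.05, 0.02, 64

for step in range(3000):
    real = rng.normal(REAL_MEAN, REAL_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = w * z + b

    # Discriminator: gradient ascent on log D(real) + log(1 - D(fake))
    d_real = sigmoid(c * real + d)
    d_fake = sigmoid(c * fake + d)
    c += lr_d * np.mean((1 - d_real) * real - d_fake * fake)
    d += lr_d * np.mean((1 - d_real) - d_fake)

    # Generator: gradient ascent on log D(fake) -- it improves iteration
    # by iteration until its fakes pass for real
    d_fake = sigmoid(c * (w * z + b) + d)
    grad_out = (1 - d_fake) * c
    w += lr_g * np.mean(grad_out * z)
    b += lr_g * np.mean(grad_out)

# After training, generated samples cluster around the real distribution
samples = w * rng.normal(0.0, 1.0, 1000) + b
```

The same dynamic, scaled up to convolutional networks and video frames, is what lets a system like this converge on convincing deepfakes over time.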
The cast members range from Me, “a master of lip sync with an aesthetic best viewed through a pair of sunglasses,” to Oedipussi Rex, “a beardy drag barbarian, with acts as wildly inconsistent as the Gods themselves.”
Elwes also generated a shape-shifting host for the show called Zizi by simultaneously training the neural network on images of all the artists. He describes the result as a “queering” of the data.
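The article does not detail how Zizi shape-shifts, but a common way a generator trained on many identities can morph between them is to interpolate in its latent space: blend the latent codes of two performers and render each intermediate code as a frame. A hypothetical sketch (the vectors, labels, and dimensionality are toy stand-ins):

```python
# Illustrative latent-space blend: morphing between two learned identities
# by interpolating the latent codes a generator would be fed.
import numpy as np

def lerp(z_a, z_b, t):
    """Linear interpolation between two latent codes, t in [0, 1]."""
    return (1.0 - t) * z_a + t * z_b

rng = np.random.default_rng(7)
z_queen = rng.normal(size=512)  # hypothetical latent code for one performer
z_king = rng.normal(size=512)   # hypothetical latent code for another

# Ten frames morphing one identity into the other; feeding each code to
# the generator would render one step of the shape-shift.
frames = [lerp(z_queen, z_king, t) for t in np.linspace(0.0, 1.0, 10)]
```

Because the model was trained on all thirteen performers at once, the points between any two codes still decode to plausible faces, which is one reading of what Elwes calls a "queering" of the data.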
The show is the latest iteration of the Zizi Project, a collection of artworks that explore the intersection of AI and drag.
The project began in 2019 with “Zizi – Queering the Dataset,” which investigates how facial recognition algorithms misidentify non-cisgender people due to biases in their training data.
Elwes disrupted these biases by re-training a neural network on 1,000 images of drag performers. This led the system to generate more gender-fluid faces.
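The effect of re-training on a new dataset can be sketched with a deliberately tiny model: a one-parameter "generator" fitted to an original dataset, then re-trained on a second one, shifts its outputs toward the new data. This is a scalar stand-in for re-training a face model on drag images, not Elwes's pipeline:

```python
# Toy sketch of re-training: continuing gradient descent on new data
# pulls a fitted model's outputs toward the new dataset.
import numpy as np

rng = np.random.default_rng(0)

def train(param, data, steps=500, lr=0.1, batch=32):
    """Gradient descent on 0.5*(param - x)^2, i.e. fit the data mean."""
    for _ in range(steps):
        sample = rng.choice(data, batch)
        param -= lr * (param - sample.mean())
    return param

original_faces = rng.normal(0.0, 1.0, 1000)  # stand-in: original training set
drag_faces = rng.normal(5.0, 1.0, 1000)      # stand-in: 1,000 drag images

fitted = train(0.0, original_faces)     # settles near the original data
retrained = train(fitted, drag_faces)   # re-training shifts it toward drag
```

In the same way, re-training on drag performers moved the system's generated faces away from the cisnormative patterns of its original data.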
The Zizi Show shifts the project’s focus onto performance to communicate these issues to a different audience.
Ultimately, Elwes believes that if artists collaborate with scientists on the development of AI, they can reshape the tech to better represent our diverse societies.