
AI ethicists have called for urgent safeguards against an emerging "digital afterlife" industry.
The concerns centre on chatbots that mimic the appearances, speech, and personalities of dead people.
Known as "deadbots" or "griefbots," these AI clones are trained on data about the deceased. They then provide simulated interactions with virtual recreations of the departed.
This "postmortem presence" can cause social and psychological harm, according to researchers from Cambridge University.
Their new study highlights several risks. One involves the use of deadbots for advertising. By mimicking lost loved ones, deadbots could manipulate their vulnerable survivors into buying products.
Another concern involves therapeutic applications. The researchers fear that these will create an "overwhelming emotional weight," prolonging the grieving process through endless virtual interactions.
Deadbots coming to life
The study also envisions deadbots spamming users with unwanted notifications. The researchers compare this to being "digitally stalked by the dead."
It's a prospect that's quickly becoming a reality. Services such as "Project December" and "HereAfter" already offer customers a chance to digitally resurrect the dead.
To mitigate their risks, the researchers have called for deadbot designers to seek consent from "data donors" before they die. They also want the products to regularly alert users about the risks, provide easy opt-out protocols, and bar disrespectful uses of deadbots.
Another suggested safeguard is user-friendly termination methods. These could even involve a "digital funeral" for the deadbot.
All these measures need to consider both the dead and those they leave behind.
"It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations," said Dr Tomasz Hollanek, one of the study co-authors.
"These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating."