Study Warns That AI ‘Griefbots’ Could Become a Burden

A new study from the University of Cambridge called for “safeguards” around the use of “griefbots,” lest they end up becoming digital ghosts.

While AI ghosts might seem outlandish, the technology already exists. “Griefbots,” also known as “deadbots,” are “AI chatbots that simulate the language patterns and personality traits of the dead using the digital footprints they leave behind,” as University of Cambridge researchers put it.

In the study, AI ethicists from Cambridge’s Leverhulme Centre for the Future of Intelligence outlined three scenarios that, while speculative, are meant to identify potential risks. The scenarios cover pitfalls such as griefbots advertising products to survivors, distressing children who may not understand what is happening, and effectively haunting loved ones with painful reminders of the deceased.

“Even those who take initial comfort from a ‘deadbot’ may get drained by daily interactions that become an ‘overwhelming emotional weight,’ argue researchers, yet may also be powerless to have an AI simulation suspended if their now-deceased loved one signed a lengthy contract with a digital afterlife service,” a release regarding the study states.

A mock advertisement from a fictional company created for the Cambridge University study, with the overlay “Be the favorite grandkid — forever” and a button reading “Connect with your Nan now.”

The study also raises questions about how a griefbot can and should be “retired,” so to speak.

“People might develop strong emotional bonds with such simulations, which will make them particularly vulnerable to manipulation,” study co-author Tomasz Hollanek, a researcher at Cambridge’s LCFI, said in the release. “Methods and even rituals for retiring deadbots in a dignified way should be considered. This may mean a form of digital funeral, for example, or other types of ceremony depending on the social context.”

These are all critical questions to consider as deepfakes and generative AI advance and become more accessible. Some companies already offer such services, the Daily Mail points out, naming Project December and Hereafter.

“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost,” Hollanek adds. “The potential psychological effect, particularly at an already difficult time, could be devastating.”

PetaPixel previously reported how a grieving father replicated his dead daughter with AI technology so he and his wife could keep her “alive” in the “digital world.”


Image credits: University of Cambridge
