For $199, you can live for ever. At least that’s the promise of an app called HereAfter AI, which might let you haunt your family for generations.
The app is part of a fast-growing industry known as grief tech – a cluster of startups exploring how technology can commemorate, simulate or even resurrect the dead.
Users can upload an unlimited number of voice messages, text prompts, photographs and videos of themselves, and in return an algorithm melds together memories and media and spits out a “virtual you” – an avatar that can speak to loved ones on phone calls and through text messages “so that children, grandchildren and beyond can know what you’re really like”.
It’s an idea particularly resonant in parts of Asia, where Confucian traditions emphasise ancestor veneration – rituals once expressed through offerings at home altars and graves, and now increasingly mirrored in virtual spaces. One South Korean company, DeepBrain AI, offers perhaps the most high-end example. For up to $50,000 – plus thousands more in maintenance fees – it creates hyper-realistic 3D avatars of the deceased. A video of its technology went viral, showing a grieving mother embracing a VR version of her seven-year-old daughter, who had died three years earlier.
Bots are now “capable of generating novel content rather than merely parroting content produced by their creator while living”, according to a new paper from Google DeepMind. These “generative ghosts” might “reshape society in complex ways beyond our current imagining”.
Even today, our AI afterlives are an ethical minefield. Unlike HereAfter AI, apps like character.ai, Seance AI and Virtual You allow users to create companions without the consent of the deceased, based on old messages – or even just a written description.
“If I die and my children access my email account and reanimate me … for me, personally, that’s a horrific prospect,” says Professor Robert Sparrow of Monash University, who has written about the ethics of digital ghosts.
Sparrow says some people may benefit from the comfort of reconnecting with loved ones but the risks of such novel technology are vast. “One of the most interesting questions is whether these artefacts risk distorting or even poisoning your memory of your deceased relative,” he says.
Then there’s data privacy – for the living human, who in grief may share deeply personal data with the commercial entity that owns the bot, but also for the ghost, which might inadvertently reveal an old family secret from beyond the grave.
Regulators and developers should consider the privacy risks. But more than anything, Sparrow thinks we should all reflect on how we prepare for the end. “My mother has late-stage cancer, she’s clearly thinking about her own death, and so she is going through a process of throwing out all the things she doesn’t want the kids to deal with and prepping her house,” he says.
“And in the future, part of that process will actually be: what data do I want to leave behind? And do I want to give people instructions about whether they should digitally reanimate me?”