4 min read

Trigger Warning: This article discusses grief, death, and the use of AI to digitally recreate deceased loved ones, which may be emotionally distressing for some readers.

Until recent years, the passing of a loved one left you with a collection of physical objects. Now, though, there exists something called their digital shadow: emails, texts, voice notes, videos, and years of passive metadata that collectively suggest a version of them still moving through the world. 

Now AI can make that shadow speak. 

What is a digital afterlife? 

In 2022, Marina Smith, a Holocaust educator, “attended” her own funeral. That is to say, her family conversed with a hyper-realistic video avatar built by the company StoryFile, which responded to their questions with pre-recorded answers enhanced by AI. 

It’s strange to say, but there’s now a global industry forming around digital afterlife technologies. The offerings vary: some companies build voice-cloned avatars trained on the deceased’s social media history. Others allow users to pre-record messages that will be delivered posthumously. In South Korea, a grieving mother was reunited with a digital recreation of her deceased daughter in a virtual reality documentary that was watched by millions. It was moving, and deeply unsettling. 

No, it’s not true resurrection. But what does it mean to die in a world where you can still post on your own memorialized Facebook page? And what happens when your loved ones can converse with a version of you long after this current iteration of self has moved on? 

Should we commercialize grief? 

The appeal is obvious. AI-generated avatars promise a new kind of continuity, in a culture where relationships are often maintained through text and video anyway. “It’s not about replacing the dead,” one digital afterlife developer said. “It’s about extending the relationship.” 

But psychologists question that framing. While digital memorials—like Facebook’s “legacy contacts” or Google’s Inactive Account Manager—offer structured ways to manage a deceased person’s presence, interactive avatars introduce new forms of emotional dependency. Dr. Jessica Heesen, lead ethicist of the Edilife project at the University of Tübingen, warns that continued interaction with a digital surrogate may blur the boundaries of grief: “Digital avatars could act like a painkiller in preventing the bereaved from accepting and dealing with their loss.” 

Grief, traditionally, has an arc; it is built into our nature. Now technology can interrupt that arc. 

Who does the digital afterlife belong to? 

There are also questions of ownership. As courts begin treating digital accounts as inheritable assets, legal frameworks are struggling to keep pace. A growing number of families are finding access to emails, cloud storage or social media accounts blocked after the death of loved ones. Without legal planning—like naming a digital executor or writing a digital will—loved ones can be locked out of crucial parts of a person’s life. 

But even when access is granted, much like organ donation, there are serious conversations to be had over our right to recreate someone digitally. Can a corporation own your voice or face after you die? And what happens if a digital avatar starts behaving in ways the original person never would? 

This isn’t hypothetical. AI systems like Eter9 and HereAfter allow users to train virtual versions of themselves during life for posthumous interaction. These avatars learn over time, meaning that if you speak to one years after a person’s death, you’re engaging with an evolving construction. And that only deepens concerns about possible misrepresentation, emotional manipulation and posthumous consent. 

As such, some practical steps that would have been unfathomable only ten years ago now seem necessary: 

1. Naming a digital executor in your will. 

2. Proactively setting up tools like Google’s Inactive Account Manager or Apple’s Legacy Contact. 

3. Considering whether you want your digital remains to be deleted, archived or activated. 

4. Being specific about what kinds of posthumous interaction you consent to—or don’t. 

And for families, it’s worth asking whether continued interaction with a digital double honors the deceased—or prevents the living from letting go. 

Is access to digital afterlife technology fair? 

If we may permit ourselves to imagine our loved ones as high-quality avatars, what are the basic requirements for that representation to feel dignified? Voice simulation, 3D modeling, responsive AI? If anything less feels insufficient, then we’re already looking at the digital afterlife from a place of privilege. While platforms like Facebook and Instagram offer free memorialization tools, full digital continuance is expensive and exclusive. An innovation gap is already forming between continents. Without regulation, digital immortality could become a luxury good, deepening existing socioeconomic divides even after death. 

There are cultural implications too. In Japan, “digital graveyards” have become increasingly common among younger generations who live far from ancestral homes. In contrast, some religious institutions have called for dignity and restraint in the use of posthumous digital likenesses. Some cultures view avatars as deeply disrespectful; others embrace them as spiritual continuity. In either case, technological capability is moving faster than cultural consensus. 

In the end, the tools we use to grieve, be they gravestones or chatbots, shape our understanding of both the past and present. One of the great human challenges lies in coming to terms with mortality. We already have a wealth of temporary solutions for more permanent struggles: painkillers, sleeping pills, coffee, antidepressants. All of great use. All come with a risk of dependence. 

And yet, when a loved one passes on, traces will continue to live across millions of intersecting sets of data. One day, you may be asked if you want to hear from them again. 

What do you say?