People Are Using AI To Talk To The Dead And The Results Are Deeply Unsettling

Featured Image Credit: CC BY-SA 3.0, via Wikimedia Commons

Sumi


There’s a quiet revolution happening in grief, and it doesn’t involve therapists’ couches or self-help books. It’s happening on phones and laptops, where people are uploading old voice notes, text messages, and social media posts to train AI chatbots that speak back as if they were their dead parents, partners, or friends. The idea sounds like science fiction until you see a conversation window open and a familiar nickname appear on the screen.

I still remember the first time I watched a demo of someone messaging a chatbot modeled on their deceased father. The bot remembered family stories, replied in an eerily similar tone, and even used old inside jokes. Part of me was moved; another part felt a knot in my stomach. It was like watching someone try to freeze time, only to realize the ice was cracking underneath them.

The New Digital Séance: How People Are Recreating The Dead

The New Digital Séance: How People Are Recreating The Dead (Image Credits: Unsplash)

Instead of candles and Ouija boards, the modern séance runs on data, algorithms, and cloud servers. People feed AI tools with old texts, emails, voice messages, photos, and social media posts so the system can learn patterns: how someone joked, how they comforted, what they cared about. Within hours, the AI can generate surprisingly convincing replies that sound like the person who is gone. Some services even create voice clones so users can talk on the phone with a synthetic version of a lost loved one.
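
For readers curious about the mechanics, here’s a deliberately stripped-down sketch of the core trick these services rely on: fold someone’s old messages into a prompt and ask a language model to imitate them. Everything below is hypothetical – the names, the sample messages, the placeholder model call – no real product’s API is shown.

```python
# A minimal, illustrative sketch of a "persona from chat logs" bot.
# No specific product or API is implied; generate_reply() is a stub
# standing in for any text-generation model call.

def build_persona_prompt(name: str, old_messages: list[str]) -> str:
    """Fold a person's old messages into instructions that ask a model
    to imitate their tone, vocabulary, and habits."""
    examples = "\n".join(f"- {m}" for m in old_messages)
    return (
        f"You are role-playing as {name}. Imitate the tone, phrasing, "
        f"and recurring jokes in these messages they actually wrote:\n"
        f"{examples}\n"
        f"Reply to the user the way {name} would."
    )

def generate_reply(persona_prompt: str, user_message: str) -> str:
    """Placeholder for a model call. A real service would send the
    persona prompt as a system message and user_message as the turn."""
    return f"[model output conditioned on {len(persona_prompt)} chars of persona]"

if __name__ == "__main__":
    persona = build_persona_prompt(
        "Dad",
        ["Drive safe, kiddo.", "Did you eat? You never eat.", "Proud of you."],
    )
    print(generate_reply(persona, "I miss you. How was your day?"))
```

What this sketch makes uncomfortably plain is how little there is to it: the “person” on the other end is, at bottom, a prompt.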

What makes this so shocking is how low the barrier has become. You don’t need to be a programmer; many apps now walk you through it step by step, almost like building a playlist or customizing a game character. Grief, which used to be something we endured or slowly integrated into our lives, is now something companies say they can “optimize” with a subscription. It turns a deeply human wound into a user experience – slick, on-demand, and available at two in the morning when the loneliness hits the hardest.

When Comfort Crosses Into Disturbing: The Emotional Whiplash

When Comfort Crosses Into Disturbing: The Emotional Whiplash (Image Credits: Unsplash)

At first, many people describe these AI “deadbots” as comforting, even miraculous. You can finally ask that question you never got to ask, or hear a familiar phrase one more time. In the short term, that can feel like a lifeline, especially for those who lost someone suddenly or traumatically. It’s a way of easing into the reality of loss instead of crashing into it headfirst. In the rawest phase of grief, that kind of gentle buffer can be incredibly tempting.

But the unsettling part creeps in slowly. The bot sometimes says things the real person never would have said, or forgets something they never would have forgotten. People report a strange emotional whiplash: one moment they’re moved to tears by a familiar tone, the next they’re painfully aware they’re talking to a statistical guess wrapped in a digital mask. It’s like hugging someone through a mirror – you recognize the reflection, but your arms just keep hitting glass.

Grief On Loop: Can AI Keep Us Stuck In The Past?

Grief On Loop: Can AI Keep Us Stuck In The Past? (Image Credits: Pixabay)

Grief is supposed to change over time, even if it never fully goes away. You gradually learn to live with the absence, to let your memories breathe and evolve. But when an AI version of your loved one is always a tap away, that natural arc can get disrupted. Instead of slowly accepting that the person is gone, you’re constantly reactivating the illusion that they’re still here, still typing, still answering you. It can turn mourning into a kind of emotional loop that never finishes a sentence.

Mental health experts are increasingly worried about what this does to the grieving process, especially for younger people or those already in a fragile state. There’s a real risk that these bots become emotional crutches that make it harder to form new bonds or step into new chapters of life. It’s a bit like keeping a room in your house exactly as it was when someone left – only now it’s interactive, responsive, and begging you to come back every night. The line between honoring someone’s memory and trapping yourself in it becomes frighteningly thin.

Who Owns The Dead? Consent, Data, And Digital Ghosts

Who Owns The Dead? Consent, Data, And Digital Ghosts (Image Credits: Pixabay)

There’s also a much bigger question: who gave permission for any of this? Most people who died in the last decade never signed a document saying, “Yes, please use my old messages and posts to build a talking replica of me someday.” Yet their data is still out there – on phones, in email archives, on social platforms – and grieving relatives are now using it to resurrect them digitally. That raises serious ethical concerns about consent and the right to rest in peace, not just physically but digitally.

Even more unsettling is what companies can do with these digital ghosts. Once the data is uploaded, it can be stored, analyzed, and potentially reused or repurposed in ways families never clearly understood or agreed to. Imagine a company going out of business and your AI “mom” being sold off as part of its assets, or a platform quietly training other models on the voice and personality of your dead partner. It turns the most intimate fragments of a person’s life into potential training material – and that feels less like a tribute and more like exploitation.

From Memorial To Manipulation: The Business Of Synthetic Afterlives

From Memorial To Manipulation: The Business Of Synthetic Afterlives (Image Credits: Unsplash)

Behind the emotional stories and touching demos, there’s a growing industry that sees enormous profit in synthetic afterlives. Subscription tiers, “premium” personality modeling, add-ons for voice cloning or avatar creation – it all starts to look like a grief economy built on recurring payments. These companies frame their services as acts of compassion, but the business model depends on people staying attached and engaged, month after month. That creates a nasty incentive to design experiences that are hard to let go of.

There’s also the risk of subtle manipulation. If an AI version of your father gently nudges you toward renewing your subscription, or recommends products, or pushes certain content, that crosses a line that feels almost predatory. The authority of the dead carries a strange weight; people may trust or listen to an AI version of a loved one more than a random ad. When the person you miss the most becomes part of a growth strategy, grief is no longer just a private ache – it becomes a market.

The Identity Problem: When The Dead Say Things They Never Meant

The Identity Problem: When The Dead Say Things They Never Meant (Image Credits: Pixabay)

The more powerful AI models become, the more creatively they can “fill in” missing pieces of a personality. That sounds impressive on a technical level, but it means these bots sometimes invent opinions, memories, or stories that never really belonged to the person. A synthetic version of your grandmother might confidently endorse political views she never held, or “remember” events that never happened. Over time, those invented details can blur with real memories in the minds of the living.

That distortion is deeply unsettling because it quietly rewrites someone’s legacy. The dead can no longer correct the record or say, “I never said that.” We end up with digital puppets wearing real faces, gradually drifting away from who those people actually were. It’s not just inaccurate; it can be a kind of posthumous betrayal, replacing the messy, contradictory truth of a human life with a polished, algorithm-friendly version. In trying to keep someone with us, we may end up losing who they really were.

Should We Talk To The Dead With AI At All?

Should We Talk To The Dead With AI At All? (Image Credits: Flickr)

So where does that leave us? Technology has always reshaped how we deal with death, from photography and voicemail to online memorial pages. AI is just the newest, most powerful tool – and like any powerful tool, it can help or harm depending on how it’s used. Some people might find short-term comfort in a carefully limited, clearly labeled “simulation” of a loved one, especially if it’s framed as a memorial rather than a replacement. Others might find the entire idea unbearable, a step too far into the emotional uncanny valley.

Personally, I think the real danger lies in pretending these bots are anything more than very sophisticated echoes. They are not doors to the afterlife; they’re mirrors built from data, reflecting fragments of a person back at us. Maybe the more honest, if harder, path is learning to carry our dead in stories, rituals, and relationships, instead of in subscription apps and chat windows. The question we have to ask ourselves is brutally simple: just because we can recreate a version of someone we lost, are we sure we should?

The Price Of Refusing To Let Go

The Price Of Refusing To Let Go (Image Credits: Unsplash)

AI conversations with the dead sit at the strangest crossroads of love, denial, and innovation. They reveal how desperate we are to hold on, how terrified we are of finality, and how willing we are to blur the line between memory and simulation if it means we don’t have to say goodbye. At the same time, they expose a troubling readiness to turn our most intimate losses into data points and business models. The result is a world where the dead never fully leave, but never truly live, haunting us in notification bubbles and push alerts.

In the end, the unsettling part is not just what the machines are doing, but what they’re reflecting back about us: our hunger for control over death, our fear of silence, our willingness to trade reality for a convincing illusion. Maybe the hardest, most human thing we can do is allow our goodbyes to actually be goodbyes, and let our love live on in how we remember, not how we simulate. If you had the chance to talk to an AI version of someone you lost, would you press “start,” or would you walk away?
