They Say ‘Hi Son’ Again: How AI Voice Cloning Is Changing Grief
A new way to hear someone
Grief used to be practiced largely in silence, through photographs and letters. Now some people are hearing lost loved ones speak again through artificial intelligence. Short voice notes, recordings from hospital stays, or a handful of saved messages can be uploaded to services that recreate a familiar voice and generate new speech in that tone. For many, the result is comfort. For others, it opens difficult ethical and emotional questions.
How people are using the tools
When his father died, Diego Felix Dos Santos, 39, found himself missing the simple sound of his father's voice. A voice note from the hospital became the basis for a clone generated with a voice-AI service. He now hears greetings like ‘Hi son, how are you?’ in a voice he thought was gone.
Companies such as ElevenLabs, StoryFile, HereAfter AI and Eternos offer so-called grief tech: voice cloning, conversational avatars and digital twins modeled on recordings of deceased relatives. Families use these products to preserve mannerisms, hear familiar tones and, in some cases, maintain an ongoing, interactive memory of a lost person.
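The services differ in their details, but the basic workflow is broadly similar: upload a short voice sample, receive an identifier for the cloned voice, then request new speech in that voice. The sketch below illustrates the pattern in Python against a hypothetical REST API; the endpoint paths, field names, and the voice_id response are illustrative assumptions, not any vendor's actual interface.

```python
import requests

API_BASE = "https://api.example-voice-service.com/v1"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"  # credential issued by the service

def clone_voice(sample_path: str, name: str) -> str:
    """Upload a short voice sample; return the service's ID for the cloned voice."""
    with open(sample_path, "rb") as f:
        resp = requests.post(
            f"{API_BASE}/voices",  # hypothetical "create voice" endpoint
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"sample": f},
            data={"name": name},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()["voice_id"]  # assumed response field

def synthesize(voice_id: str, text: str, out_path: str) -> None:
    """Generate new speech in the cloned voice and save the audio to disk."""
    resp = requests.post(
        f"{API_BASE}/voices/{voice_id}/speech",  # hypothetical synthesis endpoint
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": text},
        timeout=60,
    )
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(resp.content)  # raw audio bytes returned by the service

if __name__ == "__main__":
    voice_id = clone_voice("hospital_voice_note.m4a", "Dad")
    synthesize(voice_id, "Hi son, how are you?", "greeting.mp3")
```

Even a sketch this small makes the consent and data questions concrete: the original sample and every generated message pass through, and may be retained by, a third-party server.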
Comforts and sharp edges
Users often say these tools do not replace mourning but add another way to hold a memory. Anett Bommer, whose husband prepared an Eternos avatar before he died, describes it as part of her life now. She did not lean on the avatar during the most acute period of her grief, but came to treasure it later.
At the same time, experts warn of serious concerns. Key issues include consent, especially when the person who is being cloned is no longer able to agree; emotional dependency, where someone might get stuck in a loop of virtual interaction instead of moving through grief; and data privacy, since voice data can be reused or repurposed.
Cambridge researchers and others call for ongoing consent mechanisms, transparent data practices and protections for sensitive personal information. As AI capabilities evolve rapidly, law and social readiness often lag behind.
Broader implications
This technology ripples outward into therapy, ethics, law and culture:
- Mental health: Some therapists see potential for closure; others worry AI could delay acceptance or complicate natural grieving processes.
- Ethical precedents: If digital afterlives become common, societies will need to establish norms for consent, ownership and posthumous rights to voice and likeness.
- Commercialization: Subscription models and legacy accounts can create pressure to monetize grief, risking looser consent practices or misuse of data.
- Cultural variation: Religious and ritual practices will shape whether different communities accept or reject voice clones and avatars.
There is no single right answer for everyone. For some, voice cloning is a new form of relic; for others, it is an unwelcome simulation.
Questions to ask before you try it
Before using grief tech, consider these practical questions:
- Did the deceased give consent before death for their voice or likeness to be used? This affects both legal standing and moral considerations.
- Can you control how the voice and data are stored and used later? Clear terms make misuse less likely.
- Is there a plan to retire or turn off the avatar if it becomes harmful? Knowing how to end access matters for emotional safety.
- How might ongoing interaction with a voice clone affect your grieving process over months and years? For some it offers comfort; for others it could stall emotional acceptance.
A personal view
There is a real pull to what these tools offer: relief, closeness, and one more chance to hear someone you miss. Grief is unpredictable and brutal, and a recreated voice can feel sacred for a moment.
Yet there is a fine line between comfort and illusion, between preserving memory and delaying farewell. As grief tech grows, it needs guardrails: ethical design, honest marketing and clear user education. No product should exploit grief for profit or promise more than it can deliver.
AI voices are not ghosts. They are echoes. Use them wisely.