They Say "Hi, Son" Again: How AI Voice Cloning Is Changing Grief

A new way to hear someone

Grief used to be practiced largely in silence, in photographs and in letters. Now some people are hearing lost loved ones speak again through artificial intelligence. Short voice notes, recordings from hospital stays, or a few messages can be uploaded to services that recreate a familiar voice and generate new messages in that tone. For many, the result is comfort. For others, it opens difficult ethical and emotional questions.
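The workflow these services follow is simple in outline: collect a few short recordings, register them as a voice, then generate new speech from text in that voice. The sketch below illustrates that shape only; `CloneRequest`, `build_clone_request` and `build_speech_request` are invented names for this illustration, not any vendor's real API.

```python
import base64
from dataclasses import dataclass

@dataclass
class CloneRequest:
    """Payload for a hypothetical voice-cloning service (illustrative only)."""
    name: str
    samples: list  # base64-encoded audio clips

def encode_sample(audio_bytes: bytes) -> str:
    # Audio is typically transported as base64 text in JSON payloads.
    return base64.b64encode(audio_bytes).decode("ascii")

def build_clone_request(name: str, recordings: list) -> CloneRequest:
    # A handful of short clips (e.g. voice notes) is often enough material.
    return CloneRequest(name=name, samples=[encode_sample(r) for r in recordings])

def build_speech_request(voice_id: str, text: str) -> dict:
    # Once cloned, new messages are generated from plain text in that voice.
    return {"voice_id": voice_id, "text": text}

# Example: two tiny placeholder byte strings stand in for real voice notes.
req = build_clone_request("dad", [b"\x00\x01", b"\x02\x03"])
msg = build_speech_request("voice_123", "Hi son, how are you?")
```

The point of the sketch is how little input is required: a voice note from a hospital stay can be the entire training set.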

How people are using the tools

Diego Felix Dos Santos, 39, found himself missing the simple sound of his father's voice after his father died. A voice note recorded at the hospital became the basis for a clone generated with a voice-AI service. He now hears greetings like "Hi son, how are you?" in a voice he thought was gone.

Companies such as ElevenLabs, StoryFile, HereAfter AI and Eternos offer so-called grief tech: voice cloning, conversational avatars and digital twins modeled on recordings of deceased relatives. Families use these products to preserve mannerisms, hear familiar tones and, in some cases, maintain an ongoing, interactive memory of a lost person.

Comforts and sharp edges

Users often say these tools do not replace mourning but add another way to hold a memory. Anett Bommer, whose husband prepared an Eternos avatar before he died, describes it as part of her life now. She did not lean on the avatar in the most acute moments of grief, but later came to treasure it.

At the same time, experts warn of serious concerns. Key issues include consent, especially when the person being cloned can no longer agree; emotional dependency, where someone might get stuck in a loop of virtual interaction instead of moving through grief; and data privacy, since voice data can be reused or repurposed.

Cambridge researchers and others call for ongoing consent mechanisms, transparent data practices and protections for sensitive personal information. As AI capabilities evolve rapidly, law and social readiness often lag behind.

Broader implications

This technology ripples outward into therapy, ethics, law and culture.

There is no single right answer for everyone. For some, voice cloning is a new form of relic; for others, it is an unwelcome simulation.

Questions to ask before you try it

Before using grief tech, consider a few practical questions: Did the person consent to having their voice recreated? Who owns the recordings and the data derived from them? And is the tool helping you mourn, or helping you avoid mourning?

A personal view

There is a real pull to what these tools offer: relief, closeness, and one more chance to hear someone you miss. Grief is unpredictable and brutal, and a recreated voice can feel sacred for a moment.

Yet there is a fine line between comfort and illusion, between preserving memory and delaying farewell. As grief tech grows, it needs guardrails: ethical design, honest marketing and clear user education. No product should exploit grief for profit or promise more than it can deliver.

AI voices are not ghosts. They are echoes. Use them wisely.