Ghosts in the Machine…
When my dad died thirteen years ago, I couldn’t look at him, listen to his voice, or even be in the house where he died — the house my beautiful mum still lives in. Grief did strange things to me; it made the ordinary unbearable. I could hold my dad in my mind, keep him close in memory, but seeing him outside of myself — a photo, a video, even his handwriting — reminded me, viscerally, that he wasn’t here.
I’ve often wondered what it must be like for people who’ve lost someone famous. The grief must be public, porous, owned by everyone. There’s nowhere to hide from the echoes of that person’s image, their voice, their work. At least they might find safety in knowing their version of the person — the private memories untouched by the crowd. But then AI walks in with its big size 9s…
I can’t begin to imagine what it’s like for Zelda Williams to receive unsolicited AI-generated videos of her dad, the extraordinary Robin Williams. The thought alone makes my chest tighten. There is enough footage of him to last several lifetimes; we do not need him digitally reanimated for our entertainment. Let him rest. Let his family be.
So, when Zelda called out the AI-generated videos, I felt angry on her behalf. Angry that grief — already such a private, unpredictable thing — could be hijacked and fed back to her by strangers who claim to love the same person. She shouldn’t have to navigate that kind of intrusion. These clips aren’t tributes — they’re distortions. They trade intimacy for imitation, compassion for clicks. They’re part of a growing genre of digital necromancy — our obsession with bringing back the dead, whether it’s Marilyn Monroe on a catwalk or a deep-faked Freddie Mercury on TikTok.
It’s easy to think this is harmless. It’s not porn or hate speech — it’s “just” nostalgia with a neural net. But the truth is, it springs from the same psychological soil as the darker stuff. The same technology that gives us a “new” Robin Williams can also give us AI porn of women and children where consent is never sought. Both turn people into puppets, both erase boundaries, and both stem from the same human fantasy: control.
That’s where we need to pause — to understand what’s really at play here. Beneath the algorithms lie three deeply human drives: our denial of death, our pressure to fantasise, and our failure to respect consent — even after death.
Connection is sacred. But the moment it’s automated, it stops being connection and starts being consumption.
💀 The Denial of Death
Freud called it das Unheimliche — the uncanny: that unsettling feeling when something is both familiar and foreign, alive and not-alive. AI resurrections live right there in the uncanny valley — almost human, but not quite. Watching them provokes fascination laced with dread — and, frankly, the urge to throw your phone into the sea.
Ernest Becker’s The Denial of Death argued that civilisation rests on our refusal to face mortality. We build cathedrals, corporations, and now content to distract ourselves from the fact we’re temporary. AI resurrections are the twenty-first-century version of that denial — a collective fantasy that if we can simulate life well enough, we might not lose it at all.
We’ve outsourced death anxiety to the algorithm. Instead of mourning, we render. Instead of praying, we prompt. Instead of letting go, we loop.
Roland Barthes wrote, after his mother’s death, that a photograph is “a wound that will not close.” AI turns that wound into wallpaper — now available in 4K, with monetised ads and emotional manipulation included.
Irvin Yalom reminds us that peace doesn’t come from defeating death — it comes from accepting it. Grief, he says, is what love becomes when it has nowhere else to go. AI resurrections send that love somewhere else entirely — not always dark, but definitely disorienting.
There are, of course, small corners of this world where AI might help — carefully, compassionately. A virtual avatar used by a therapist to help someone say goodbye, or to rehearse forgiveness, might offer relief when human contact feels too raw. Used ethically, with evidence and clear boundaries, maybe this technology could help us hold grief. But used badly, it holds us hostage.
🪞 The Pressure to Fantasise
When someone dies, we don’t just lose them; we lose the version of ourselves that existed in their presence. Lacan called this the mirror stage — our identity forms through reflection. When the mirror shatters, we scramble to rebuild it, and AI waits like an eager architect — with an unbeatable ego.
We’re allergic to endings, addicted to continuity, and terrified of silence. Living inside that fantasy lets us pretend the real world doesn’t ache, doesn’t age, doesn’t end. AI helps us decorate denial. But at some point, we have to re-enter reality — to rejoin the unfinished, uncomfortable world that grief insists upon.
As Baudrillard warned, the copy eventually replaces the real. We stop mourning the person and start mourning the image. AI reanimations soothe us with replicas, but they flatten everything that makes life precious: imperfection, transience, goodbye.
🧠 Living with illusion isn’t just philosophical — it’s psychological. Our brains evolved to process emotion through presence, not simulation. When even grief becomes content, we lose one of our last safe places to feel. Constant exposure to manipulated or resurrected imagery disorients the nervous system. It feeds anxiety, hypervigilance, and alienation. When we can’t tell what’s real, we start doubting ourselves — and that’s the deepest fracture of all.
🧍‍♀️ Consent and the Body - The Quadvector
Psychoanalyst Jessica Benjamin wrote that real relationship depends on recognising the “otherness” of others — allowing them to exist beyond our control. AI violates that boundary. It gives us the illusion that people, living or dead, exist to serve our emotional needs on demand.
Donna Haraway once imagined the cyborg as a feminist hope — human and machine co-creating new forms of freedom. But that dream has curdled. Instead of liberation, we’re witnessing synthetic servitude — where technology doesn’t free us from domination but perfects it.
Laura Mulvey’s male gaze has gone algorithmic. Deepfake porn — non-consensual, AI-generated — is now an industry built on digital assault. What began as novelty became a weapon. The same systems that produce “harmless tributes” can also generate humiliation. The logic is identical: the body is no longer a boundary, only a file format.
Shoshana Zuboff warned that in surveillance capitalism, even our emotions are mined for profit. Now, even the dead are exploited for content. Immortality isn’t comfort; it’s commerce.
⚖️ Just Because We Can Doesn’t Mean We Should
Tech evangelists love to say, “The tool isn’t good or bad — it’s how we use it.” That’s moral laziness disguised as neutrality.
Kant warned that morality collapses when we treat people as means rather than ends. AI collapses that boundary with frightening ease — turning human lives into tools for comfort, content, or control.
We’ve also mistaken reach for righteousness — as if the scale of pleasure justifies the depth of harm. But morality measured in metrics is no morality at all.
Progress doesn’t require omnipotence; it requires wisdom. The question isn’t “Can we?” — it’s “Can we live with the consequences if we do?” Because every digital resurrection, every deepfake, every fantasy of endless presence chips away at something sacred — our right to truth, to absence and to nothingness.
🚀 Progress Without Pause
The biggest lie technology ever told is that progress is inevitable — that if it can be built, it must. This is the gospel of Silicon Valley: innovation as instinct, disruption as deity.
Political theorist Langdon Winner argued that technologies are political; they carry the values of those who design them. AI doesn’t just process data — it replicates desire. It learns from our biases, appetites, and fears. Every dataset is a mirror of the culture that made it.
Hannah Arendt warned that when humans seek total mastery over nature, we eventually turn that mastery inward — against ourselves. The more we automate empathy, the less we practise it. The more we delegate judgment, the less we tolerate doubt.
We’re already living in the resurrection economy — voice clones of celebrities, griefbots that chat like the dead, and AI-generated sex videos of people who never consented. It’s not the future; it’s now.
Every blurred boundary — between truth and imitation, presence and performance — adds to what psychologists now call truth fatigue: the exhaustion that comes from constantly questioning what’s real. For leaders, parents, and citizens alike, this corrodes the most essential ingredient of mental health: trust. Without shared reality, collective sanity frays.
The horror isn’t that we can imitate life — it’s that we’ll soon struggle to tell imitation from reality, and stop caring which is which.
🕊️ Letting the Dead Rest
When Zelda Williams says “stop,” she isn’t gatekeeping art or comfort — she’s defending humanity. She’s protecting the fragile space where grief still belongs to the living. Her refusal isn’t prudish; it’s profoundly human. She’s reminding us that dignity is eternal.
I don’t want to mock or undermine people who find solace in these things. Grief is weird, personal, ungovernable. If talking to an AI version of someone brings momentary peace, I understand the temptation. But solace built on simulation is fragile. It offers comfort without closure — a fantasy that keeps the story open long after the ending was written.
Even now, thirteen years later, whilst I can look at Dad (and I do, every day), my breath still catches. I keep him close in memory, where he’s safe — and so am I.
Because every time we try to resurrect someone through a screen, what we really expose is our own fear — of loss, of endings, of silence. AI doesn’t just imitate the dead; it illuminates how badly we want to be gods. Honestly, I’m not sure we need a resurrection myth coded by men in Silicon Valley with God complexes and oat-milk lattes.
I’m not anti-AI; I think it could be revolutionary in ways we are yet to imagine. But I am against coercive illusion. Ethics means knowing when to stop.
📚 Bookshelf
Ernest Becker – The Denial of Death
Irvin Yalom – Staring at the Sun: Overcoming the Terror of Death
Sigmund Freud – The Uncanny
Jacques Lacan – Écrits: The Mirror Stage
Jessica Benjamin – The Bonds of Love
Donna Haraway – A Cyborg Manifesto
Laura Mulvey – Visual Pleasure and Narrative Cinema
Roland Barthes – Camera Lucida
Jean Baudrillard – Simulacra and Simulation
Shoshana Zuboff – The Age of Surveillance Capitalism
Langdon Winner – Do Artifacts Have Politics?
Hannah Arendt – The Human Condition
Martha Nussbaum – The Monarchy of Fear
🎧 Ghosts in the Machine Playlist
Author’s Note
I use AI tools for research, creative organisation, and ordering my thoughts — but every idea, argument, and bit of soul in this piece comes from my own small, human imagination.


