Generative artificial intelligence is now being used to conduct “séances” and bring back the dead. But these are not “meetings” with spirits led by a medium, of the kind that became especially popular from the 19th to the early 20th century. Instead, the digitally revived dead are being deployed as entertainment icons, “presenters” in media advertisements, political witnesses, and everyday companions for grieving families.

Likenesses of David Ben-Gurion have been used in legitimate ad campaigns, as have those of actor Yehuda Barkan, who was brought back to sell hamburgers with the approval of his family. Young people who never heard of these figures, or knew little about them, are introduced to seemingly “living” people, while older adults see them and reminisce.

A new study by Tom Divon, a media and cultural researcher at the Hebrew University of Jerusalem (HUJI), and Prof. Christian Pentzold, a digital media and communication scholar at Leipzig University in Germany, claims this practice isn’t just emotionally powerful: it’s ethically explosive, because it turns a person’s voice, face, and life history into reusable raw material. It has just been published in the journal New Media & Society under the title “Artificially alive: An exploration of AI resurrections and spectral labor modes in a postmortal society.”

The researchers analyze more than 50 real-world cases from the US, Europe, the Middle East, and East Asia where AI technologies are used to recreate deceased people’s voices, faces, and personalities. The study offers one of the most comprehensive looks yet at this unsettling frontier and raises urgent questions about consent, exploitation, and power in a world where the dead can be digitally revived.

Divon told The Jerusalem Post in an interview that “AI resurrections are important because they can happen with little or no consent, clear ownership rules, or accountability, creating a new kind of exploitation we call ‘spectral labor,’ in which the dead become an involuntary source of data and profit, while the living are left to navigate blurred lines between memory and manipulation, comfort and coercion, tribute and abuse.”

Hologram images of Holocaust survivors are displayed on the Dizengoff Square fountain in Tel Aviv, ahead of Holocaust Remembrance Day, April 16, 2023. (credit: AVSHALOM SASSONI/FLASH90)

What does it mean when artificial intelligence makes the dead speak again? From hologram concerts of long-deceased pop stars to chatbots trained on the texts of lost loved ones, GenAI is rapidly redrawing the boundary between life and death. Divon is worried that Israel’s upcoming election for a new government and Knesset will lead to abuse of generative AI: bots will sound like real people, and opponents of candidates could make them appear to say things they never said. “One can already see such things from political parties and anonymous activation of profiles by people hired to do so.”

WHAT SETS this study apart is its scope and clarity. Rather than focusing on a single technology or viral example, the researchers examined dozens of cases from across continents to show that AI “resurrections” are already forming a recognizable social pattern.

Take, for instance, the case of late Holocaust survivors, some of whom actively shared their memories during their lifetimes. Now re-invoked as agentic ghosts, they are animated to carry a voice of warning against the rise of fascism, underscore the importance of resistance, and offer personal narratives on safeguarding human rights.

Engineered encounters between living and dead

Consider the case of Ofra Haza, the legendary Israeli-Yemenite singer who died in 2000, yet was summoned back for a celebrity production on Israel’s Independence Day in 2023. In this televised spectacle, Haza’s holographic image and voice were reconstructed and forced into a duet with contemporary pop star Noa Kirel. The performance fused eras and aesthetics – Haza’s haunting presence reanimated through algorithmic synthesis, “performing” Kirel’s modern hit song.

The result was a carefully engineered encounter between the living and the dead, where the technological revival of Haza served as both a nostalgic homage and a profitable spectacle of digital enchantment, transforming her posthumous image into a pliable cultural asset.

AI-driven commemoration of war has emerged as a deeply charged practice. During the Israel-Hamas war, GenAI applications were used to resurrect the faces and voices of fallen Israeli soldiers and police officers, often using recordings from WhatsApp voice notes or other personal archives. Families produced videos in which the deceased appear to speak directly to their loved ones – expressing love, sharing final wishes, reaffirming patriotic faith in the justness of their cause, and praising the sacrifice they have made. While profoundly emotional, these reanimations raise ethical questions around agency, consent, and posthumous representation, as family members speak on behalf of those who can no longer voice their own beliefs.

The study situates AI resurrections within what sociologists call a postmortal society: one that does not deny death, but increasingly seeks to overcome it technologically. In this world, immortality is no longer promised through religion alone, but through data, algorithms, and platforms offering “digital afterlives.” But the authors are clear – AI does not conquer death. Instead, it keeps people suspended in an uneasy state of limbo, neither fully alive nor fully gone.

The ethical and legal implications of generative AI

AS GENERATIVE AI accelerates, Divon and Pentzold warn that society must confront the ethical and legal implications now, before digital resurrection becomes normalized and unregulated. “Thinking seriously about what AI does to our relationship with the dead,” they wrote, “is essential to understanding what it is doing to the living.” They identified three distinct ways the dead are being digitally reintroduced into society:

Spectacularization – the digital re-staging of famous figures for entertainment. Fans can now watch AI-generated “new” performances in which Whitney Houston is made to cover “Bohemian Rhapsody” or Freddie Mercury is made to perform “I Will Always Love You,” staged as immersive spectacles.

Sociopoliticization – the reanimation of victims of violence or injustice for political or commemorative purposes. In some cases, AI-generated personas of the dead are made to testify, protest, or tell their own stories posthumously.

Mundanization – the most intimate and fast-growing mode, in which ordinary people use chatbots or synthetic media to “talk” with deceased parents, partners, or children, keeping relationships alive through daily digital interaction. Across all three modes, the dead are not simply remembered: they are made to work.

Divon and Pentzold introduce the concept of “spectral labor” to describe what is happening beneath the surface. AI systems are trained on the digital remains of the dead: photos, videos, voice recordings, and social media posts. Without consent, these data are extracted, repackaged, and monetized, with immense potential for weaponization.

WHAT HAPPENS when a figure like the murdered American political activist Charlie Kirk is resurrected to continue circulating his ideology, speaking to new audiences after his death, without accountability, context, or the possibility of refusal? Or when the likeness of a victim is reanimated to repeatedly relive trauma for political, commercial, or instructional ends? In these cases, AI resurrection becomes a tool for extending power, ideology, and influence beyond the limits of life itself. “The dead are compelled to haunt the present,” serving the emotional, political, or commercial desires of the living, the authors argue.

Immortalization is a powerful illusion that bespeaks modern societies’ fascination with remaking the boundaries that separate life and death and, ultimately, overcoming the finality of life, the team wrote. “This raises difficult questions: Who owns a voice after death? Can a digital likeness be exploited? And who gets to decide how, when, and why the dead are brought back?”

There are few laws regarding the use of these images, either in Israel or elsewhere in the world. “The ability to manipulate and generate images is very easy. YouTube wants people to be engaged in its videos,” Divon said. “The more exciting they seem, the more money they make.” Some actors and social media influencers who work as presenters for products may be replaced by AI models that develop relationships with viewers, as may the models who used to sell skincare products.

“There must be dialogue with the industry,” Divon concluded, “and the setting of guidelines to prevent abuse.”