AI immortality: How deathbots are changing the way we grieve

A paper appearing in Topoi by Dr. Regina Fabry and Associate Professor Mark Alfano, from Macquarie University’s Department of Philosophy, explores the impact “deathbots” might have on the way grief is experienced, and the ethical implications of their use.

A deathbot is a chatbot that imitates the conversational behavior—its content, vocabulary and style—of a person who has died.

Built on generative AI systems trained on large collections of human-generated data, deathbots draw on text messages, voice messages, emails and social media posts to mimic the speech or writing of a deceased person.

The most common form of deathbot is text-based, but deathbots with voice input and audio output are becoming more common. Both draw on these “digital remains,” generating responses to prompts entered by a human that can resemble the conversational responses the now-deceased person would have given.
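To make that mechanism concrete, the following is a minimal, purely illustrative Python sketch of how a text-based system could condition a generative model on a person’s digital remains. The function names, the example data and the placeholder model call are assumptions made for illustration; they do not describe the design of any deployed deathbot service.

```python
# Illustrative sketch only: assemble a persona prompt from a person's
# "digital remains" and pass it, with a mourner's message, to a generative
# model. The model call is a placeholder, not a real service.

from typing import List

def build_persona_prompt(name: str, remains: List[str]) -> str:
    """Fold sample messages from the deceased into a style-conditioning prompt."""
    examples = "\n".join(f"- {m}" for m in remains)
    return (
        f"You are imitating the conversational style of {name}. "
        f"Match their vocabulary, tone and typical topics.\n"
        f"Examples of how {name} wrote:\n{examples}"
    )

def deathbot_reply(persona_prompt: str, user_message: str) -> str:
    """Placeholder for a call to a generative AI system (e.g. a hosted LLM)."""
    # A real system would send persona_prompt and user_message to a large
    # language model and return its completion.
    return f"[model reply conditioned on persona prompt, to: {user_message!r}]"

if __name__ == "__main__":
    remains = [
        "Don't forget to water the tomatoes, love.",
        "Running late again, put the kettle on!",
    ]
    prompt = build_persona_prompt("Alex", remains)
    print(deathbot_reply(prompt, "I miss our Sunday walks."))
```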

The paper from Dr. Regina Fabry and Associate Professor Mark Alfano, titled “The Affective Scaffolding of Grief in the Digital Age: The Case of Deathbots,” examines the potential impact of human-deathbot interactions on the grieving process.

To try to understand how deathbots work, and how they can malfunction, philosophers have been researching accounts of human-deathbot interactions for several years. These accounts might have important implications for the development of future policy guidelines.

A new way to process grief

“From an optimistic perspective, deathbots can be understood as technological resources that can shape and regulate emotional experiences of grief,” says Dr. Fabry.

“Researchers suggest that interactions with a deathbot might allow the bereaved to continue ‘habits of intimacy’ such as conversing, regulating emotions and spending time together.”

But, she cautions, grief experiences are complex and variable. “How we grieve, for how long we grieve, and which resources and practices can best support us as we navigate and negotiate loss depends on a range of factors.

“These include the cause of death (an accident, long-term illness, or homicide, for example); the kind and quality of the relationship between the bereaved and the person who has been lost; and the wider cultural practices and norms that shape the grieving process.”

Furthermore, the positive or negative impact of deathbots on grief also depends on the attitudes of the bereaved towards the conversational possibilities and limitations of deathbots.

“Is a bereaved person aware that they are chatting with a deathbot, one that will eventually commit errors? Or does a bereaved person, at least at times, feel as if they are, literally, conversing with the dead? Answering these questions needs more research.”

Consent will be a key challenge

“Some people do not want to be ‘zombified’ in the form of a deathbot after their death. Others might express the wish during their lifetime that a deathbot be generated after their death. They might collect and curate data for that purpose,” says Dr. Fabry.

“Either way, the bereaved, and the tech companies offering deathbot services, would have a duty to respect the wishes of the dead.”

Some researchers have pointed out, says Dr. Fabry, that the bereaved might face an autonomy problem and come to rely too much on a deathbot in their attempts to navigate and negotiate a world irrevocably altered by the death of a loved one.

There has been discussion, too, about whether human-deathbot interactions could see an irreversibly lost human relationship replaced by a digitally mediated relationship with an AI system, leading to self-deception or even delusion.

“To prevent the occurrence of this problem, we recommend the implementation of ‘automated guardrails’ to detect whether a bereaved person becomes overly dependent on their interactions with a deathbot,” says Dr. Fabry.

“Furthermore, we recommend that interactions with a deathbot should be supervised by a grief counselor or therapist.”
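The paper does not specify how such “automated guardrails” would be implemented. Purely as a loose illustration, a service might track how often and for how long a bereaved user converses with the deathbot and flag escalating patterns of use for review by a human counselor. The thresholds, data shapes and names below are invented for this sketch and are not drawn from the paper.

```python
# Illustrative sketch of an "automated guardrail": flag escalating deathbot
# use so a human (e.g. a grief counselor) can review it. All thresholds and
# field names are invented for this example.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class Session:
    start: datetime
    minutes: float

def flag_over_dependence(sessions: List[Session],
                         window_days: int = 7,
                         max_sessions: int = 14,
                         max_minutes: float = 300.0) -> bool:
    """Return True if recent usage exceeds the illustrative thresholds."""
    cutoff = datetime.now() - timedelta(days=window_days)
    recent = [s for s in sessions if s.start >= cutoff]
    too_frequent = len(recent) > max_sessions
    too_long = sum(s.minutes for s in recent) > max_minutes
    return too_frequent or too_long

if __name__ == "__main__":
    now = datetime.now()
    # Seven 45-minute sessions in the past week exceeds the time threshold.
    sessions = [Session(now - timedelta(days=d), 45.0) for d in range(7)]
    if flag_over_dependence(sessions):
        print("Usage pattern flagged for review by a counselor.")
```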

More information:
Regina E. Fabry et al., The Affective Scaffolding of Grief in the Digital Age: The Case of Deathbots, Topoi (2024). DOI: 10.1007/s11245-023-09995-2

Citation:
AI immortality: How deathbots are changing the way we grieve (2024, June 25)
retrieved 25 June 2024
from https://medicalxpress.com/news/2024-06-ai-immortality-deathbots-grieve.html

