
‘Weird and Creepy:’ Recreating Deceased Loved Ones as AI ‘Ghosts’ Raises Mental Health Concerns

New AI capabilities are raising ethical questions about digitally reviving deceased loved ones. “I’m very concerned that these new tools will be marketed to very vulnerable people, people who are grieving,” one expert explains.

New Scientist reports that AI systems trained on the texts and emails of deceased loved ones may be able to create conversational ‘ghosts’ of them. But researchers warn that this could have serious effects on mental health.

A recent study by Jed Brubaker, an information scientist at the University of Colorado Boulder, highlights the potential risks. Brubaker explains that these AI chatbots have the potential to provide comfort and an interactive legacy. However, there is also the danger of creating unhealthy dependence, which can interfere with healthy grieving.

AI and humanity (David Gyung/Getty Images)

“I think some people consider these things to be gods,” Brubaker said. “I don’t think most people would do that. There would be a group of people who would find it strange and creepy.” Brubaker is concerned that this kind of extreme attachment could inspire new religious movements and beliefs. He recommends that modern religions issue guidance on the use of AI ghosts, and advises researchers to proceed with caution in further developing this AI application, taking into account its impact on mental health.

Mairi Aitken, a researcher at the Alan Turing Institute in London, shares Brubaker’s concerns. She warns that marketing AI ghosts to vulnerable people who are grieving could prevent them from moving on, which is an important part of the healing process. Aitken suggested regulation may be needed to prevent an AI from being created from someone’s data without their prior consent.

“I’m very concerned that these new tools will be sold to very vulnerable people, people who are grieving,” Aitken said. “An important part of the grieving process is moving forward. It’s remembering and reflecting on that relationship and moving forward while keeping that person in your memory.” She warned there is great concern that these tools could make that process difficult.

The ability to build conversational bots that replicate specific people is rapidly advancing. AI models like OpenAI’s ChatGPT can already produce surprisingly human-like text after training on massive datasets. With enough data, they could soon sound eerily like individuals.

Read more at New Scientist here.

Lucas Nolan is a reporter for Breitbart News, covering free speech and online censorship issues.
