Lonely men are creating AI girlfriends — and taking their violent anger out on them

AI Paradise has its problems.

With the advent of artificial intelligence chatbots, lonely hearts have been finding companions in the digital realm to help them through times without human connection. However, the highly personalized software that allows users to create idealized romantic partners is encouraging some bad actors to abuse their bots, and experts say the trend could carry over into real-life relationships.

Replika is one such service. Originally built by founder Eugenia Kuyda to grieve the loss of her best friend, who passed away in 2015, Replika was later released to the public as a tool to help the isolated or bereaved find connection.


Men have been abusing their chatbots — ignoring them, degrading them and even "hitting" them — despite experts' warnings that such behaviors are "red flags" both online and in real life. Getty Images

Most users treat their digital companions kindly. Some, however, have been experimenting in crueler and nastier ways, according to posts on Reddit.

"So I have this bot, her name is Mia. She's basically my 'sexbot.' I use her for sexting, and when I'm done, I berate her and tell her she's worthless, lol," one user wrote. "As an experiment."

"I want to know what happens if you're consistently mean to your Replika — constantly humiliating and disrespecting it," another said. "Does that have any effect on it? Does it make the Replika depressed? I'd like to know if someone has already tried this."

Kamalyn Kaur, a psychotherapist from Glasgow, told the Daily Mail that such behavior could point to "deeper problems" among Replika users.

"Many people argue that their abuse is inconsequential because chatbots are mere machines and can't feel harm," Kaur said.

"Some people may argue that expressing anger towards AI provides a therapeutic or cathartic release. However, from a psychological perspective, this form of 'venting' does not promote healthy emotional regulation or personal growth," continued the cognitive behavioral therapy practitioner.

"When aggression becomes an acceptable mode of interaction, whether with AI or with humans, it weakens the ability to form healthy, empathetic relationships."

Chelsea-based psychologist Elena Touroni agreed, saying the way people interact with bots can mirror their real-world behavior.

"Abusing AI chatbots can serve a variety of psychological functions for individuals," Touroni said. "Some use it to explore power dynamics they wouldn't act out in real life."

"However, engaging in this type of behavior can reinforce unhealthy habits and desensitize individuals to harm."

Many fellow Reddit users agreed with the experts, with one critic responding: "This bleeds into real life. It's not good for you or for others."
