Therapists are being discovered using AI with their patients

Therapists and AI: A New Twist in Mental Health Care

Traditionally, therapists have met patients face-to-face in intimate, private settings. But when the Covid pandemic forced widespread closures, many adapted to online consultations, which have since become commonplace. This shift toward virtual care has significantly reshaped the mental health sector.

Online therapy platforms, like Talkspace and BetterHelp, have emerged, allowing patients to connect with licensed therapists through video calls. This means that even if a patient and therapist aren’t in the same room, meaningful conversations can still take place.

However, it seems that some therapists might not be as above board as one would hope.

A recent MIT Technology Review report brought some surprising revelations from clients of online therapy. One patient, Declan, had trouble with the connection during a session, so he and his therapist turned off their video feeds. During this period, the therapist inadvertently shared his screen, revealing that he was using ChatGPT to generate his responses.

According to Declan, “He takes what I was saying, puts it into ChatGPT, and then summarizes or cherry-picks the answer.” Declan mentioned that he thought his perspective might be too rigid, and the therapist seemed to agree, making for a rather peculiar session.

While Declan’s experience was quite blatant, others have reported more subtle signs of dishonesty from their therapists.

In her article for MIT Tech Review, Laurie Clark shared her own unsettling encounter with her therapist. After receiving an unusually polished email with distinct formatting and well-structured responses, she started to analyze it. The email alarmed her, as it felt too refined. Upon confronting the therapist, she learned that they had indeed composed the email using ChatGPT.

Clark reflected on how her initial positive feelings quickly faded, leaving her with disappointment and a sense of distrust towards her therapist.

Another client shared that her therapist sent a comforting message in response to her pet’s death, but the presence of an AI-generated prompt at the top of the message was hard to overlook. “This is a more human and heartfelt version with a gentle, conversational tone,” it read.

Given these experiences, an increasing number of individuals are turning directly to chatbots for assistance, bypassing traditional therapists altogether. However, not all mental health professionals are on board with this trend.

For instance, Sarah Quinn, president of the Australian Psychological Association, raised concerns about using AI for therapy, stating, “Algorithms, regardless of their sophistication, cannot replace the unique space created between two individuals.” She emphasized that while AI can mimic human communication, it fundamentally lacks authentic connection.

The American Psychological Association points to various studies indicating that while chatbots may offer some tools for support, they can’t effectively replace human therapists, who provide tailored, professional mental health care. There’s also worry that reliance on chatbots might contribute to harmful stigmas surrounding mental health issues.

When asked whether AI chatbots could ever replace human therapists, one response summed it up nicely: “AI can offer support and guidance, but it is not a substitute for genuine therapists.”
