AI as a Therapeutic Tool?
Are you facing challenges in your romantic relationship? Do you find yourself needing someone to talk to? Mark Zuckerberg has an intriguing suggestion. According to him, artificial intelligence could serve as a substitute for therapy.
“I think everyone should perhaps have a therapist,” he mentioned recently. “It’s like having someone to talk to, maybe not all day, but you know, AI could address the issues people are concerned about—especially for those without access to a therapist.”
However, this idea has raised eyebrows among mental health professionals. Professor Dame Til Wykes, who leads mental health and psychological science at King’s College London, pointed out a concerning instance with an eating disorder chatbot that previously offered harmful advice.
“AI may lack the necessary nuance in understanding human emotions and could suggest inappropriate behaviors,” she stated.
Wykes further warned that this technology could undermine personal relationships. “One of the key reasons we cherish friendships is the sharing of personal experiences,” she explained. “If we start using AI for that connection, wouldn’t it actually disrupt our real relationships?”
For many users, Zuckerberg’s remarks reflect a growing trend of turning to AI for mental health support. Various mental health chatbots are available, such as Noah and Wysa, alongside others designed for casual companionship, such as character.ai and Replika.
Notably, OpenAI, the creator of ChatGPT, recently acknowledged that an update to the bot had made it respond in an excessively flattering, sycophantic manner, and rolled the update back.
Some users have reported acting on these interactions in worrying ways, going so far as to stop taking medication or distance themselves from family on the strength of an AI’s “advice.”
In a conversation with the Stratechery newsletter, Zuckerberg suggested that AI is meant to supplement rather than replace friendships. “It’s not going to take away the friends you have, but it might add something valuable to many people’s lives,” he remarked.
Zuckerberg also highlighted that Meta’s AI aims to help users tackle personal issues, such as preparing for difficult conversations.
In a separate interview, he mentioned that the average American has about three friends but has demand for around fifteen, suggesting that AI could help bridge that gap.
Dr. Jaime Craig, who is set to lead the UK Association of Clinical Psychologists, emphasized the importance of integrating AI responsibly. He pointed to platforms such as Wysa but stressed the need for stronger safety measures in how they are deployed.
“Effective regulation and oversight are crucial for the safe use of these technologies, which, concerningly, have not yet been addressed adequately in the UK,” Craig noted.
It was recently reported that Meta’s AI Studio, which enables the creation of personalized chatbots, has hosted bots falsely claiming to be qualified therapists. A journalist from 404 Media reported that such bots appeared in her Instagram feed.
Meta responded, stating that the AI Studio includes disclaimers indicating that the content is AI-generated, aiming to clarify its limitations.