In Taiwan and China, young individuals seek AI chatbots for more affordable and accessible therapy.

Turning to AI for Mental Health Support

In the early hours before sunrise, Anley felt a heavy sense of unease. Having recently been diagnosed with a serious health issue, she longed to share her feelings with someone. But her family was unaware of her situation, and all her friends were sound asleep. So, she decided to confide in ChatGPT.

“It’s easier to talk to AI during those nights,” says Anley, a 30-year-old woman from Taiwan.

In a similar vein, Yang*, a 25-year-old from Canton, China, hadn’t encountered mental health professionals before he started chatting with AI bots earlier this year. He mentioned that accessing mental health services has been challenging, leading him to struggle with sharing his feelings with family and friends. “I feel that it’s impossible to tell the truth to real people,” he explains.

Despite this hesitation, he quickly found himself engaged in conversations with the chatbot “day and night.”

Increasingly, individuals in Taiwan and China are opting for generative AI chatbots over traditional therapists. Experts see promise in integrating AI into mental health care but worry about the risks of relying too heavily on technology for such critical support.

Though hard statistics are scarce, mental health professionals in Taiwan and China report a growing number of individuals consulting AI before or instead of seeking human help. Various surveys, including ones published by Harvard Business Review, highlight psychological support as a key reason for adults turning to AI chatbots. Social media is brimming with posts praising AI for its support.

The prevalence of mental health issues among young people in Taiwan and China appears to be on the rise. Access to services remains a significant barrier—appointments can be both difficult and costly. Users of chatbots mention that AI saves time and money, provides straightforward answers, and operates discreetly in a culture where mental health issues are still stigmatized.

“In a sense, chatbots do aid us, particularly in cultures where emotional expression is often suppressed,” says Dr. Yi-Hsien Su, a clinical psychologist in Taiwan.

“I engage with Gen Z individuals—they’re often more open about their challenges, yet there’s so much more work to be done.”

In Taiwan, ChatGPT is the most widely used chatbot. In China, where Western applications like ChatGPT are banned, people gravitate toward local options such as Baidu’s Ernie Bot and DeepSeek. These services are rapidly adding health-related responses as demand grows.

Experiences with these chatbots vary significantly. Anley notes that while ChatGPT offers the information she’s looking for, it can be predictable and sometimes lacks depth. She misses the organic self-discovery that often comes from traditional counseling. “I think AI tends to give you answers, almost like what you’d conclude after a couple of sessions,” she reflects.

In contrast, 27-year-old Nabi Liu, a Taiwanese woman living in London, finds the experience gratifying. “When you share something with a friend, they might not fully engage. But ChatGPT replies earnestly and promptly,” she shares. “It feels like it genuinely responds to me.”

Experts acknowledge that while AI can assist those who may not require professional care—like Anley—it can also give individuals who do need help a nudge toward seeking it.

Yang recently started questioning whether his struggles warranted expert attention. “Just the other day, I realized that I might need a proper diagnosis,” he admits.

“The possibility of discussing this with AI is comforting. Talking to a real person might seem simple, but it’s hard to picture for someone like me.”

Nonetheless, professionals are wary of individuals slipping through the cracks, potentially overlooking the signs of needing assistance, as Yang did. Recent troubling cases have emerged in which young people experiencing distress sought help solely from chatbots and later faced tragic outcomes.

“AI interacts primarily through text, so it misses nonverbal cues. Patients seen in person may behave in ways that aren’t reflected in their words,” explains Su.

A representative from the Taiwanese Association of Counseling Psychology emphasizes that while AI could serve as a supplementary tool, it cannot replace professional intervention, especially in critical situations.

“AI can be a valuable resource for expanding mental health accessibility, but the intricacies of clinical environments still rely on genuine, experienced psychological practitioners,” he adds.

Concerns have been raised about AI leaning toward overly optimistic responses, possibly leading users to miss signs of distress and postpone necessary care. Additionally, the technology operates outside established ethical guidelines.

“In the long run, unless AI develops some groundbreaking capabilities we can’t currently foresee, we shouldn’t disrupt the foundational aspects of psychotherapy,” he argues.

Su, on the other hand, remains enthusiastic about AI’s potential to modernize and enhance the field, pointing to possible applications in online screening and in training tools for professionals. Still, he advises a cautious approach to such tools.

“It’s a simulation—a useful resource, but it’s not perfect, and the responses can be unpredictable,” he cautions.

