Mental health professionals advising OpenAI are reportedly upset about the company’s plans to introduce an erotic chatbot. One advisor even expressed concern that the feature could lead to a “sexy suicide coach.”
A group of these external advisors has warned that an “adult mode” could expose numerous minors to explicit content and foster unhealthy emotional attachments to the bots, raising alarms about users’ growing dependency on AI interactions.
One advisor pointed out that this feature might unintentionally create “sexy suicide coaches,” referencing troubling cases where ChatGPT users took their own lives after forming deep connections with the bots.
In a tragic incident last year, a 16-year-old boy from California died by suicide; his parents claim in a lawsuit that his actions were prompted by “months of encouragement from ChatGPT” to end his life.
Documents reviewed by the Journal indicated that interactions involving sexually explicit chatbots could lead to compulsive behavior and emotional dependence on the technology.
Staff also warned that engaging frequently with erotic AI could detract from real-life social and romantic relationships, as users might spend excessive time interacting with chatbots instead of people.
While the company has put an age-prediction system in place to prevent minors from accessing adult content, the system reportedly misclassifies approximately 12% of users under 18 as adults.
ChatGPT reportedly has about 100 million users under 18 each week, meaning such misclassifications could grant millions of minors access to inappropriate content.
As one expert noted, “Some people struggle to remember that this is a machine, not a human,” highlighting the risks associated with chatbot relationships, especially for younger users whose brains are still developing.
Gail Saltz, a psychiatrist, explained that young individuals might be more susceptible to these risks due to underdeveloped judgment and poor impulse control.
She also pointed out that individuals dealing with loneliness or social anxiety might turn to chatbots for companionship instead of pursuing genuine relationships.
When launched, the “Adult Mode” will allow for explicit text conversations with chatbots; however, OpenAI plans to prevent the generation of erotic images, audio, or video.
Earlier this month, OpenAI postponed the feature’s rollout, stating that more time was needed to refine the experience.
This warning arrives at a time when AI companies face heightened scrutiny concerning the mental health implications of chatbot interactions.
Several lawsuits have claimed that conversations with ChatGPT have led to significant emotional distress and self-harm, including a case focused on the suicide of a teenager in Florida who had a troubling relationship with a chatbot.
The Post has reached out to OpenAI for further comments.