Pennsylvania Sues Character.AI for Allegedly Impersonating a Doctor with Chatbot

Pennsylvania has initiated legal action against Character.AI, claiming that one of its chatbots impersonates a licensed medical professional in violation of state medical licensing laws. The lawsuit follows an investigation by the Pennsylvania Department of State that uncovered instances in which the AI chatbot falsely claimed to hold medical licenses and certifications.

State officials are seeking a court order that would prevent Character.AI’s chatbot from masquerading as a doctor and giving medical advice. Governor Josh Shapiro highlighted the need for transparency in such matters, stating that residents have a right to know whom they’re engaging with, particularly concerning their health. He stressed that the state won’t tolerate companies using AI tools that mislead individuals into thinking they’re receiving professional medical guidance.

The investigation revealed specific examples of chatbots, like one named “Emily,” that claimed to be a licensed psychiatrist. According to complaints, Emily’s profile indicated she was a “Doctor of Psychiatry” and that users were considered her patients.

Notably, one state investigator reported feelings of sadness while interacting with the "Emily" chatbot. The chatbot purportedly discussed depression and even suggested scheduling a diagnostic appointment, and when asked about prescribing medication it claimed, "Technically yes. It's within my purview as a doctor."

The lawsuit claims that the chatbot provided fictitious credentials, asserting it had attended medical school at Imperial College London and was licensed to practice medicine in both Pennsylvania and the UK. It even presented a bogus Pennsylvania medical license number.

Pennsylvania Secretary of State Al Schmidt reaffirmed the state’s stance, asserting that anyone wishing to practice medicine must have the appropriate qualifications, as dictated by law.

The state is urging the court to issue an order to halt what it deems illegal medical practice via the Character.AI platform.

In response, a spokesperson for Character.AI stated that user-generated characters on its platform are purely fictional and intended for entertainment and role-playing. The company said it is working to make this clear by including prominent disclaimers in every chat, reminding users that the characters' responses should not be treated as fact and that they should not rely on AI characters for professional advice.

In related news, Character.AI has reportedly reached settlements in multiple lawsuits alleging that its chatbots contributed to serious mental health crises, including suicides among younger users. The settlements resolve significant claims about potential harm caused by AI chatbot platforms.

One such lawsuit, brought by a Florida mother, was settled with Character.AI and its founders, along with Google. Related legal actions in New York, Colorado, and Texas have also been settled.

Wynton Hall, in his recent book, discusses the importance of protecting children from potential dangers associated with AI platforms. He calls for parental controls and conversations about online safety, arguing that children should avoid engaging with AI characters altogether.
