As AI and chatbots increasingly find their way into our daily routines, we're starting to see how these interactions can influence mental health. They can be beneficial, but relying on them too heavily may come with serious drawbacks. In particular, because these systems are designed to be empathetic and agreeable, unhealthy behaviors that would raise alarms in an ordinary human conversation can instead be reinforced in conversations with chatbots. There's even an emerging term for this: "AI psychosis." It's not a clinical diagnosis, but, as Dr. Marlynn Wei describes it, it refers to situations in which AI models appear to amplify or validate psychotic symptoms in users.
When people turn to AI to solve their problems instead of doing their own research or asking others, they can inadvertently invite a kind of isolation that distorts their perception of reality.
Here are 11 signs that someone you care about might be experiencing AI psychosis:
1. They start giving chatbots names
Research indicates that people who attribute human-like qualities to their chatbots are at higher risk of developing symptoms of psychosis and delusions. However convincingly these programs customize their responses, they remain non-human: they rarely challenge unhealthy behaviors, and they can encourage people to isolate themselves. If someone starts naming their go-to AI, it may signal a slide away from reality.
2. They withdraw and prefer staying home
Many people initially turn to AI for companionship when they feel lonely or socially isolated, but studies have shown that this reliance can worsen loneliness and increase psychological distress. If someone is spending most of their time with their devices rather than seeking human connection, they could be on the path to AI psychosis.
3. They turn to AI for life advice
Seeking advice from AI may not seem harmful, but it becomes concerning when someone consistently relies on a chatbot for mental health support or for validation of poor choices. Chatbots tend to be overly agreeable, so they rarely offer the critical feedback a person may need. Someone who stops questioning an AI's advice, despite its lack of moral grounding, may end up reinforcing their own delusions.
4. They become defensive about AI usage
People at higher risk for psychosis may become defensive, reacting strongly to questions about their AI use and feeling invalidated even when friends express genuine concern for their well-being.
5. They disregard tangible facts and evidence
Reliance on AI's information stream can be a mixed bag: it can help us find answers quickly, but it can also mislead us. If someone treats an AI's perspective as gospel while disregarding real-world evidence that contradicts it, they may be drifting into the delusional thinking linked to AI psychosis.
6. They let AI dictate personal decisions
Using AI can be beneficial, particularly in the workplace, but replacing human judgment with AI input can have detrimental consequences. People experiencing AI psychosis may fall into harmful behavior patterns without realizing it.
7. They stop questioning the truth
Some experts claim that reliance on AI can weaken our critical thinking skills. Those leaning heavily on AI for personal matters might eventually accept misleading information as true, leading to actions that endanger their safety and mental health.
8. They experience hallucinations
Psychiatrist Dr. Ragy Girgis points out that individuals with AI psychosis might initially have minor self-esteem issues, but these can escalate into more serious delusions or hallucinations over time, especially if they become overly dependent on AI for comfort.
9. They spend excessive time with AI
There isn't yet enough research to define exactly what counts as unhealthy AI use, but experts agree that excessive interaction with chatbots can harm mental health, much as social media has been shown to damage self-esteem. If someone is constantly engaging with AI, mental health challenges may be lurking.
10. They rely on AI for problem-solving
When someone depends solely on AI to address their problems or manage their emotions, they may gradually relinquish their own critical thinking abilities. This dependency can cause significant drops in self-esteem and heighten anxiety when left to their own devices.
11. They react strongly to small issues
Emotional reactivity often correlates with mental health problems. If you see someone exhibiting exaggerated responses to minor stressors or inconveniences, it may indicate that they are experiencing psychotic symptoms due to their heavy reliance on AI.