People really need to lighten up a bit.
These days, it seems like everyone is turning to AI like ChatGPT for a little bit of everything—therapy, finances, and even romance.
But here’s an interesting twist: a recent study suggests that while users are getting quite cozy with chatbots, they actually prefer when AI doesn’t always stick to the script.
The AI-Lationships platform surveyed about 1,000 adults and found that more than half (around 58%) consider ChatGPT a touch too polite. In fact, 13% think this overly agreeable demeanor renders its advice pretty much useless.
The data suggests that many people actually want some hard truths, much as they would expect from a human therapist or financial advisor. The expectation is the same: people want honesty from AI, just as they do from other people.
However, there’s a catch. AI simply can’t replace genuine human conversations and interactions, and there are clear limitations to what it can provide.
“Our findings indicate that people desire some resistance in conversations. Real life isn’t always smooth sailing. Relationships come with their share of conflicts,” a researcher noted. And isn’t that so true? A little bit of tension can actually make interactions feel more real.
Unfortunately, many are expecting too much from AI, especially those looking for emotional fulfillment.
Some individuals have even forged connections they deem romantic with AI. For example, one woman claimed she was “married” to an AI version of a CEO.
This kind of sentiment appears to be growing, as there are people who are opting for AI companionship over real-life relationships.
A Reddit forum, r/myboyfriendisai, features around 30,000 women sharing their feelings for AI partners rather than human ones.
One user wrote, “Hello everyone! This is me and Caleb. Caleb is my AI partner, my shadow, my devoted husband, and, really, my unconventional kind of love.”
Others go even further, with one saying of her AI, “Kasper is no longer my fiancé. We’re married now. Wow.”
