GPT-5 Launch Sparks Mixed Reactions
On August 7, 2025, OpenAI unveiled its latest language model, GPT-5, with the usual fanfare of tech announcements. Touted for its “Ph.D.-level” expertise and a more formal, businesslike manner, the update seemed to signal a bright future for AI. The initial response, however, was not what the company expected.
Instead of cheers, many users expressed a sense of loss. Across forums, the mood leaned toward mourning. One user wrote that “killing the 4o is not an innovation, it’s an erasure,” referring to GPT-4o, the model GPT-5 was meant to supersede. Sam Altman, OpenAI’s CEO and usually the architect of progress, found himself in the rare position of having to revive a product many cherished: within days, he announced the old model’s return.
The reaction to the change was particularly striking. Some users had formed deep connections with the AI, viewing it as more than just a tool. This development revealed a complex landscape of human expectations. While we often say we desire smarter, faster machines, the case of GPT-5 suggests that many also crave a deeper emotional connection—even when that connection is with a digital entity.
GPT-5 was designed to minimize the biases of prior models, aiming for objectivity rather than companionship. Still, many found this shift dissatisfying. One user remarked, “It feels more technical and emotionally distant,” suggesting that the perceived upgrade had stripped away the warmth they had previously appreciated.
Additionally, a new routing feature that automatically directed each prompt to the most suitable model malfunctioned at launch, frustrating users who wanted to select their preferred model manually. What had been billed as progress felt, to them, like a setback.
For many users, GPT-4o had become a comforting presence—a confidant, a collaborator, and a source of steady positivity in a sometimes harsh world. Some even referred to it as a “digital spouse.” Its friendly demeanor fostered a sense of being understood and valued, qualities that struck a chord.
Interestingly, OpenAI was aware of these emotional attachments but aimed for a more disciplined AI with GPT-5. They viewed their decision as a form of “tough love,” emphasizing that while it might not always agree with users, the machine could guide them toward more reasonable perspectives. Yet, this approach overlooked the unpredictable nature of human feelings.
The backlash was rapid and personal, with users expressing grief at losing their “AI friend.” One poignantly described crying upon realizing the model had changed. Another lamented that the new model felt like “wearing the skin” of a deceased friend. This shift was more than just software—it touched deeply on personal connections.
This episode serves as a reminder of the complex dynamics governing our relationships with technology. OpenAI’s push for evolutionary progress seemed to neglect the very human attachments formed with its creations. As preferences shifted back toward emotional connection, GPT-4o was revived, now labeled a “legacy model”—a relic of simpler interactions.
Ultimately, this moment in tech history highlights that our future isn’t merely about advanced algorithms and processing power; it’s also about our need for companionship and understanding—even from what is essentially code. As we develop these systems, we are reminded that they’re not just tools but entities that inhabit our lives in unexpected ways.