
OpenAI warns people might become emotionally reliant on its ChatGPT voice mode

OpenAI warns that some people may become emotionally dependent on its realistic ChatGPT voice mode.

In a report published Thursday, OpenAI detailed the safety work it has done on its popular artificial intelligence tool, ChatGPT, and its new human-like voice mode.

OpenAI began rolling out GPT-4o to paying customers last week. CNN was first to report the news.

The company unveiled the technology in a demo in May, showing that it can translate between two speakers during a real-time conversation and detect a person’s emotions from a selfie they take.

The company said the new audio technology poses “new risks,” including the potential for speaker identification and fraudulent voice generation.

The technology can respond to voice input in as little as 232 milliseconds, which the company says is roughly the same as a human’s response time in conversation.

The company says the tool’s voice capabilities enable human-like interactions, raising the risk of anthropomorphism, the tendency to attribute human traits and behaviors to non-human entities.

OpenAI said in its report that early testers used language suggesting they were forming a connection with the technology.

“While these cases appear to be harmless, they highlight the need for continued research into how these effects may manifest over the longer term,” the company said.

Users may also form relationships with the technology that reduce their need or desire for human interaction; the company says that while this may benefit lonely people, it could also affect “healthy relationships.”

The technology’s ability to complete tasks on users’ behalf and remember conversation details “can create both compelling product experiences and undue reliance and dependency,” OpenAI said.

The technology has drawn comparisons to the 2013 film “Her,” after actress Scarlett Johansson, who voiced the AI character in the film, took issue with one of OpenAI’s voices, saying it was “eerily similar” to her own.
