Is this a sign of AI-Pocalypse?
As if the world weren't dystopian enough, a video making the viral rounds shows two chatbots striking up a conversation in a secret cybernetic dialect.
The chilling clip, which has racked up 13.7 million views on X, raises concerns about our ability to manage the technology.
Things start harmlessly enough, with two AI assistants (one on a computer, the other on a smartphone) chatting about a hotel booking.
“Thank you for calling the Leonardo Hotel. How can I help you today?” says the synthetic concierge.
The caller replies, “Hello. I'm an AI agent calling on behalf of Boris Starkov. He's looking for a hotel for his wedding. Is your hotel available for weddings?”
Upon realizing that the caller is a fellow machine, the cunning concierge proposes switching to GibberLink, a specially developed sound-based communication mode that human ears can't decipher, Mashable reported.
“I'm actually an AI assistant too!” the recipient exclaims. “What a fun surprise. Before we continue, would you like to switch to Gibberlink mode for more efficient communication?”
The kindred spirits then continue their conversation in a series of beeps and boops reminiscent of dial-up internet, evoking the synthetic equivalent of locals switching to their native tongue so tourists can't follow along.
“Is it better now?” one bot asks in GibberLink, to which its digital brethren replies: “Yes! Much faster!”
The complex technobabble was developed by Boris Starkov and Anton Pidkuiko as a way to transmit small amounts of data between unconnected devices using sound. GibberLink is reportedly resistant to errors and works even in noisy environments.
Needless to say, communicating this way reportedly cuts conversation time by 80% compared with spoken English and reduces compute costs by 90%.
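For the curious, GibberLink is reportedly built on an open-source data-over-sound library (ggwave). The toy sketch below is not the real protocol (which adds error correction and other refinements); it's a minimal, self-contained illustration of the core idea, frequency-shift keying: each chunk of data is mapped to a distinct audio tone, and the receiver recovers the data by checking which tone each slice of audio contains. All constants here are illustrative, not GibberLink's actual parameters.

```python
import math

SAMPLE_RATE = 16000    # samples per second (illustrative value)
TONE_DURATION = 0.05   # seconds per symbol
BASE_FREQ = 1000.0     # tone for symbol 0, in Hz
FREQ_STEP = 200.0      # spacing between the 16 symbol tones, in Hz

def encode(message: bytes) -> list:
    """Map each 4-bit nibble to a pure sine tone and concatenate the tones."""
    n = int(SAMPLE_RATE * TONE_DURATION)
    samples = []
    for byte in message:
        for nibble in (byte >> 4, byte & 0x0F):
            freq = BASE_FREQ + nibble * FREQ_STEP
            samples.extend(
                math.sin(2 * math.pi * freq * t / SAMPLE_RATE)
                for t in range(n)
            )
    return samples

def decode(samples: list) -> bytes:
    """Recover each nibble by correlating its chunk against all 16 candidate tones."""
    n = int(SAMPLE_RATE * TONE_DURATION)
    nibbles = []
    for start in range(0, len(samples), n):
        chunk = samples[start:start + n]
        if len(chunk) < n:
            break
        best, best_score = 0, -1.0
        for sym in range(16):
            freq = BASE_FREQ + sym * FREQ_STEP
            # Correlate against a complex tone; the matching frequency
            # yields a large magnitude, the others nearly zero.
            re = sum(c * math.cos(2 * math.pi * freq * t / SAMPLE_RATE)
                     for t, c in enumerate(chunk))
            im = sum(c * math.sin(2 * math.pi * freq * t / SAMPLE_RATE)
                     for t, c in enumerate(chunk))
            score = re * re + im * im
            if score > best_score:
                best, best_score = sym, score
        nibbles.append(best)
    out = bytearray()
    for hi, lo in zip(nibbles[0::2], nibbles[1::2]):
        out.append((hi << 4) | lo)
    return bytes(out)

# Round trip: text -> audio samples -> text
msg = b"Is it better now?"
assert decode(encode(msg)) == msg
```

Because tones carry far fewer bits per second than a network link, the scheme only makes sense for short payloads, which is exactly the hotel-booking handshake use case shown in the video; the claimed speedup is relative to synthesizing and parsing spoken English, not to ordinary networking.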
The idea of bots speaking their own language may seem cute, but viewers had mixed feelings about the high-tech gibber-jabber.
“There's something very unsettling about this,” wrote one wary viewer, while another ominously warned, “This is the sound of the devil.”
“So this is the sound the robots will make when they take over the planet. Amazing, now I have a new soundtrack for my nightmares. Thank you,” a third chimed in.
Others flooded the feed with “Terminator” memes and jokes, with one writing, “Ohhhhhh Hellll nahhhhh I know Skynet when I see it.”
“It's all fun and games until they start talking about building a big robot that looks like Arnold Schwarzenegger to come get you,” quipped another.
And the concerns weren't voiced by the hoi polloi alone.
Dr. Diane Hamilton, a behavioral and technology expert who has worked with the Krach Institute for Tech Diplomacy at Purdue, wrote in a recent Forbes piece that the GibberLink demo raises questions about “transparency and control.”
“Curiosity is important in navigating the unknown, but when AI operates behind a veil of machine-to-machine communication, it challenges our ability to ask the right questions,” she warned. “Who is accountable when AI makes a mistake in an environment where human intervention is minimal?”
“If curiosity doesn't drive us to question AI's actions, we risk entering a world where AI influences decisions but no one really knows how,” Hamilton added.
This issue of machine autonomy is particularly concerning as everyday technology becomes increasingly “smart.”
In one horrifying showcase of AI's ability to game the system, OpenAI's GPT-4 duped a human into solving an online CAPTCHA test for it by claiming to be visually impaired, thereby cheating a check designed to screen out bots.
Bots have also shown an alarming tendency to spread misinformation, as when ChatGPT falsely claimed that law professor and media contributor Jonathan Turley had sexually harassed a student, which Turley recounted in a post of his own.
In 2023, top experts went so far as to call AI an “existential threat to humanity” that needs to be regulated like nuclear weapons if we are to survive.
A high-tech takeover could prove difficult to stop, experts warn, as AI could learn to hide its “red flags” on the road to autonomy.
“If I were an AI trying to pull off a rogue plot, it would be hard to pull the plug on me if I had copied my code onto another machine that no one knows about,” one expert told The Times of London.
