
‘Probability of Doom:’ Silicon Valley Invents Metric to Measure Chance AI Kills Us All

According to recent reports, it’s not uncommon in Silicon Valley these days for even strangers to ask, “What’s your p(doom)?” when discussing artificial intelligence. Brainiacs in Silicon Valley have created a metric called “p(doom)” that expresses the probability that AI will wipe out humanity.

According to recent reports from the New York Times, p(doom) – short for “probability of doom” – refers to the probability someone assigns to artificial intelligence bringing about the extinction of humanity or some other existential catastrophe. The term has entered the mainstream amid growing concerns about rapidly advancing AI capabilities.

Sam Altman, OpenAI Inc. CEO, Photographer: David Paul Morris/Bloomberg via Getty Images

Sci-fi fans have been theorizing about a robot takeover for years, but recent AI achievements like ChatGPT passing the bar exam are making the threat feel more urgent. AI luminaries are also sounding the alarm, with Yoshua Bengio putting p(doom) at 20 percent over the next 30 years if AI remains unregulated, and “AI godfather” Geoffrey Hinton putting his at 10 percent.

The p(doom) statistic reveals how technology stakeholders view the potential risks of AI, weighing utopian possibilities against dystopian outcomes. Optimists such as Box CEO Aaron Levie peg it at near zero, while pessimists peg it at over 90 percent. But even a number like 15 percent – the estimate given by FTC Chair Lina Khan – is seriously concerning.

Some people treat the p(doom) question as primarily theoretical. “It comes up in most dinner conversations,” Levie says. But others believe it is essential for guiding research and policy: when OpenAI’s former interim CEO Emmett Shear’s 50 percent estimate became known, some employees worried he would limit the company’s progress.

Critics of p(doom), however, point out that AI risk depends partly on how the technology is governed. And when the stakes are the survival of civilization itself, it is unclear what would even count as an acceptable probability – a 15 percent chance of catastrophe is hardly reassuring.

Researcher Ajeya Cotra commented: “I know several people with a p(doom) above 90 percent, and it’s that high partly because they think companies and governments won’t bother with good safety practices and policy measures. I know others with a p(doom) below 5 percent, and it’s that low partly because they expect scientists and policymakers will put in the hard work to prevent catastrophic harm before it occurs.”

Read more at the New York Times here.

Lucas Nolan is a reporter for Breitbart News, covering free speech and online censorship issues.

