
Specialists warn that ‘AI mess’ and poor leadership are leading us toward ‘worldwide turmoil.’

A Ticking Bomb?

The Bulletin of the Atomic Scientists has moved the Doomsday Clock forward for 2026, setting it at 85 seconds to midnight. This is the closest humanity has stood to symbolic catastrophe in the clock’s 79-year history.

“The Science and Security Board believes humanity hasn’t made enough progress on the existential threats we all face, hence the adjustment,” stated Alexandra Bell, the organization’s president and chief executive.

Steve Fetter, a member of the Bulletin’s Science and Security Board, said the decision highlights the growing risk of catastrophe posed by current trends, including imminent dangers such as nuclear conflict, pandemics, and the evolving landscape of artificial intelligence.

Established in 1945 by notable figures like Albert Einstein and J. Robert Oppenheimer, the Bulletin created the Doomsday Clock two years later to symbolize the proximity of humanity to self-destruction.

Initially set at seven minutes to midnight in 1947, the clock stood at 89 seconds to midnight last year. The new setting moves it four seconds closer to midnight over the past twelve months.

However, the intention behind this clock isn’t just to mark a countdown to disaster. It serves as a motivator to tackle “the world’s most urgent man-made existential threats.”

Despite this, many fear that the news indicates dark times are looming. The Bulletin has pointed to a global “leadership failure” as a key factor in pushing the world closer to catastrophe.

Daniel Holz, a Bulletin member and professor at the University of Chicago, noted that major powers have become more aggressive and nationalistic, and that tensions rose in 2025 due to military actions involving nuclear-capable nations.

Conflicts intensified over the summer, particularly between Pakistan and India, where border clashes raised concerns of full-scale war. Additionally, earlier this month, Russia launched a nuclear-capable missile into Ukraine amid rising tensions.

Holz raised concerns about the impending expiration of the New START Treaty, which limits strategic nuclear weapons. He stressed that its lapse could trigger an unconstrained nuclear arms race for the first time in more than fifty years.

According to this coalition of experts, nuclear war isn’t the only danger humanity faces. Maria Ressa, a journalist and Nobel Peace Prize laureate, warned that the emergence of “generative AI” has incited “global chaos,” largely through the rapid spread of false information.

She pointed out that AI failures can lead to various forms of fraud and contribute to a breakdown of the information ecosystem. Moreover, there’s a risk that such technology could be utilized to engineer deadly pathogens.

Fetter highlighted these concerns, stating, “One clear risk is using AI to create new diseases that don’t naturally occur and for which there are no defenses.” He also noted the dangers of AI in military applications, especially concerning lethal decision-making.

AI researchers Eliezer Yudkowsky and Nate Soares have cautioned in their book, “If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All,” that humanity could face extinction from synthetic viruses and other threats if development of these technologies is not shut down.

On a hopeful note, members of the Bulletin believe it is still possible to reverse the grim trajectory the Doomsday Clock indicates. Ressa emphasized that while “it’s ticking, there’s still time to take action.” She called on technology platforms to prioritize human rights and on AI governance to focus on safety rather than speed.

“Information integrity is crucial for democracy; we can’t effectively function when our operating systems are flawed,” she added, stressing that genuine change can’t happen if many people remain skeptical about these pressing issues.
