On Tuesday, January 27, 2026, the Bulletin of the Atomic Scientists announced at the University of Chicago that its Science and Security Board (SASB) had moved the hands of the Doomsday Clock to 11:58:35 p.m. At 85 seconds to midnight, the Clock now stands closer to catastrophe than at any point in its history, which dates back to 1947.
Alexandra Bell, president and CEO of the Bulletin of the Atomic Scientists, moves the minute hand on the Doomsday Clock / Photo: Jamie Christiani, Bulletin of the Atomic Scientists
In explaining the decision to move the hands of the Doomsday Clock, experts from the Bulletin of the Atomic Scientists cited, among other factors, the impending expiration of New START (the New Strategic Arms Reduction Treaty), the U.S.–Russian arms control treaty set to lapse on February 5 of this year; record-breaking climate trends; risks from artificial intelligence; and a range of biological security concerns.
At the same time, they called for urgent action to reduce nuclear arsenals, establish international guidelines for the use of artificial intelligence, and conclude multilateral agreements to address global biological threats.
Alexandra Bell, president and CEO of the Bulletin of the Atomic Scientists, said:
“The Doomsday Clock’s message cannot be clearer. Catastrophic risks are on the rise, cooperation is on the decline, and we are running out of time. Change is both necessary and possible, but the global community must demand swift action from their leaders.”
The time on the Doomsday Clock is set annually by the Science and Security Board in consultation with the Bulletin's Board of Sponsors, which includes eight Nobel laureates. Among the most important factors in 2026 were the growing threat of nuclear weapons, breakthrough technologies such as artificial intelligence, numerous biological security concerns, and the ongoing climate crisis. The Clock was last changed in January 2025, when it was set to 89 seconds to midnight.
Dr. Daniel Holz, a professor at the University of Chicago in the Departments of Physics, Astronomy and Astrophysics, the Enrico Fermi Institute, and the Kavli Institute for Cosmological Physics, and chair of the SASB, said:
“The dangerous trends in nuclear risk, climate change, disruptive technologies like AI, and biosecurity are accompanied by another frightening development: the rise of nationalistic autocracies in countries around the world. Our greatest challenges require international trust and cooperation, and a world splintering into ‘us versus them’ will leave all of humanity more vulnerable.”
Maria Ressa, co-founder and CEO of Rappler, professor of professional practice at Columbia University’s School of International and Public Affairs (SIPA), and recipient of the 2021 Nobel Peace Prize, said:
“Without facts, there is no truth. Without truth, there is no trust. And without these, the radical collaboration this moment demands is impossible. We are living through an information Armageddon—the crisis beneath all crises—driven by extractive and predatory technology that spreads lies faster than facts and profits from our division. We cannot solve problems we cannot agree exist. We cannot cooperate across borders when we cannot even share the same facts. Nuclear threats, climate collapse, AI risks: none can be addressed without first rebuilding our shared reality. The clock is ticking.”
The Bulletin of the Atomic Scientists’ statement for 2026:
“A year ago, we warned that the world was perilously close to global disaster and that any delay in reversing course increased the probability of catastrophe. Rather than heed this warning, Russia, China, the United States, and other major countries have instead become increasingly aggressive, adversarial, and nationalistic. Hard-won global understandings are collapsing, accelerating a winner-takes-all great power competition and undermining the international cooperation critical to reducing the risks of nuclear war, climate change, the misuse of biotechnology, the potential threat of artificial intelligence, and other apocalyptic dangers. Far too many leaders have grown complacent and indifferent, in many cases adopting rhetoric and policies that accelerate rather than mitigate these existential risks. Because of this failure of leadership, the Bulletin of the Atomic Scientists Science and Security Board today sets the Doomsday Clock at 85 seconds to midnight, the closest it has ever been to catastrophe.
Last year started with a glimmer of hope in regard to nuclear risks, as incoming US President Donald Trump made efforts to halt the Russia-Ukraine war and even suggested that major powers pursue “denuclearization.” Over the course of 2025, however, negative trends—old and new—intensified, with three regional conflicts involving nuclear powers all threatening to escalate. The Russia–Ukraine war has featured novel and potentially destabilizing military tactics and Russian allusions to nuclear weapons use. Conflict between India and Pakistan erupted in May, leading to cross-border drone and missile attacks amid nuclear brinkmanship. In June, Israel and the United States launched aerial attacks on Iranian nuclear facilities suspected of supporting the country’s nuclear weapons ambitions. It remains unclear whether the attacks constrained those efforts—or if they instead persuaded the country to pursue nuclear weapons covertly.
Meanwhile, competition among major powers has become a full-blown arms race, as evidenced by increasing numbers of nuclear warheads and platforms in China, and the modernization of nuclear delivery systems in the United States, Russia, and China. The United States plans to deploy a new, multilayered missile defense system, Golden Dome, that will include space-based interceptors, increasing the probability of conflict in space and likely fueling a new space-based arms race. As these worrying trends continued, countries with nuclear weapons failed to talk about strategic stability or arms control, much less nuclear disarmament, and questions about US extended deterrence commitments to traditional allies in Europe and Asia led some countries without nuclear weapons to consider acquiring them. As we publish this statement, the last major agreement limiting the numbers of strategic nuclear weapons deployed by the United States and Russia, New START, is set to expire, ending nearly 60 years of efforts to constrain nuclear competition between the world’s two largest nuclear countries. In addition, the US administration may be considering the resumption of explosive nuclear testing, further accelerating a renewed nuclear arms race.
An array of adverse trends also dominated the climate change outlook in the past year. The level of atmospheric carbon dioxide—the greenhouse gas most responsible for human-caused climate change—reached a new high, rising to 150 percent of preindustrial levels. Global average temperature in 2024 was the warmest in the 175-year record, and temperatures in 2025 were similar. With the addition of freshwater from melting glaciers and thermal expansion, global average sea level reached a record high. Energized by warm temperatures, the hydrologic cycle became more erratic, with deluges and droughts hopscotching around the globe. Large swaths of Peru, the Amazon, southern Africa, and northwest Africa experienced droughts. For the third time in the last four years, Europe experienced more than 60,000 heat-related deaths. Floods in the Congo River Basin displaced 350,000 people, and record rainfall in southeast Brazil displaced over half a million.
The national and international responses to the climate emergency went from wholly insufficient to profoundly destructive. None of the three most recent UN climate summits emphasized phasing out fossil fuels or monitoring carbon dioxide emissions. In the United States, the Trump administration has essentially declared war on renewable energy and sensible climate policies, relentlessly gutting national efforts to combat climate change.
During the past year, developments in four areas of the life sciences have increased potentially catastrophic risks. In December 2024, scientists from nine countries announced the recognition of a potentially existential threat to all life on Earth: the laboratory synthesis of so-called “mirror life.” Those scientists urged that mirror bacteria and other mirror cells—composed of chemically-synthesized molecules that are mirror-images of those found on Earth, much as a left hand mirrors a right hand—not be created, because a self-replicating mirror cell could plausibly evade normal controls on growth, spread throughout all ecosystems, and eventually cause the widespread death of humans, other animals, and plants, potentially disrupting all life on Earth. So far, however, the international community has not arrived at a plan to address this risk.
At the same time, the accelerating evolution of artificial intelligence poses a different sort of biological threat: the potential for the AI-aided design of new pathogens to which humans have no effective defenses. Also, concerns about state-sponsored biological weapons programs have deepened due to the weakening during this past year of international norms and mechanisms for productive engagement. Perhaps of most immediate concern is the rapid degradation of US public health infrastructure and expertise. This dangerously reduces the ability of the United States and other nations to respond to pandemics and other biological threats.
The increasing sophistication of large language models and their applications in critical processes—coupled with lingering concerns about their accuracy and tendency to “hallucinate”—have generated significant public debate over the past year about the potential risks of artificial intelligence. The United States, Russia and China are incorporating AI across their defense sectors, despite the potential dangers of such moves. In the United States, the Trump administration has revoked a previous executive order on AI safety, reflecting a dangerous prioritization of innovation over safety. And the AI revolution has the potential to accelerate the existing chaos and dysfunction in the world’s information ecosystem, supercharging mis- and disinformation campaigns and undermining the fact-based public discussions required to address urgent major threats like nuclear war, pandemics, and climate change.
These dangerous trends are accompanied by another development that undermines efforts to deal with major global threats: the rise of nationalistic autocracy in countries around the world, including in a number of countries that possess nuclear weapons. Leaders of the United States, Russia, and China greatly vary in their autocratic leanings, but they all have approaches to international relations that favor grandiosity and competition over diplomacy and cooperation. The rise of autocracies is not in itself an existential threat, but an us-versus-them, zero-sum approach increases the risk of global catastrophe. The current autocratic trend impedes international cooperation, reduces accountability, and acts as a threat accelerant, making dangerous nuclear, climatic, and technological threats all the harder to reverse.
Even as the hands of the Doomsday Clock move closer to midnight, there are many actions that could pull humanity back from the brink:
- The United States and Russia can resume dialogue about limiting their nuclear arsenals. All nuclear-armed states can avoid destabilizing investments in missile defense and observe the existing moratorium on explosive nuclear testing.
- Through both multilateral agreements and national regulations, the international community can take all feasible steps to prevent the creation of mirror life and cooperate on meaningful measures to reduce the prospect that AI be used to create biological threats.
- The United States Congress can repudiate President Trump’s war on renewable energy, instead providing incentives and investments that will enable rapid reduction in fossil fuel use.
- The United States, Russia, and China can engage in bilateral and multilateral dialogue on meaningful guidelines regarding the incorporation of artificial intelligence in their militaries, particularly in nuclear command and control systems.
- Our current trajectory is unsustainable. National leaders—particularly those in the United States, Russia, and China—must take the lead in finding a path away from the brink. Citizens must insist they do so.
It is 85 seconds to midnight.”
On nuclear weapons
Jon B. Wolfsthal, director of global risk at the Federation of American Scientists (FAS) and a member of the SASB, said:
“In 2025, it was almost impossible to identify a nuclear issue that got better. More states are relying more intently on nuclear weapons, multiple states are openly talking about using nuclear weapons for not only deterrence but for coercion. Hundreds of billions are being spent to modernize and expand nuclear arsenals all over the world, and more and more non-nuclear states are considering whether they should acquire their own nuclear weapons or are hedging their nuclear bets. By stoking the fires of the nuclear arms competition, nuclear states are reducing their own security and putting the entire planet at risk. Leaders of all states must relearn the lessons of the Cold War – no one wins a nuclear arms race, and the only way to reduce nuclear dangers is through binding agreement to limit the size and shape of their nuclear arsenals. Nuclear states and their partners need to invest now in proven crisis communication and risk reduction tools, recommit to preventing the spread of nuclear weapons, refrain from nuclear threats, and pursue a more predictable and stable global security system.”
Disruptive technologies
Dr. Steve Fetter, professor of public policy and former dean at the University of Maryland, fellow of the American Physical Society (APS), member of the Committee on International Security and Arms Control (CISAC) of the National Academy of Sciences (NAS), and a member of the SASB, said:
“As uses of AI expand and concerns grow about potential risks, Trump revoked Biden’s AI safety initiative and banned states from crafting their own AI regulation, reflecting a ‘damn the torpedoes’ approach to AI development. The emphasis on technological competition is making it increasingly difficult to foster the cooperation that will be needed to identify and mitigate risks, and attacks against universities and cuts in federal funding are eroding our ability to come up with effective solutions.”
Climate Change
Inez Fung, ScD, professor emerita of atmospheric science in the Department of Earth and Planetary Science and the Department of Environmental Science, Policy, and Management at the University of California, Berkeley, and a member of the SASB, Bulletin of the Atomic Scientists, said:
“Reducing the threat of climate catastrophe requires actions both to address the cause and to deal with the damage of climate change. First and foremost come reductions in emissions of greenhouse gases from the burning of fossil fuels to produce energy. Many technologies for renewable energy are now mature and cost effective, and governments should ramp up the wide deployment of these clean energy technologies by providing incentives to produce them on a large scale and to create markets for them. Equally important in the fight against climate change is renewed reliance on science that tracks and guides emission reduction and mitigation efforts. This return to science-based climate policy includes the collection, validation, and sharing of climate and greenhouse gas information around the world, as well as the enhancement of model projections of climate impacts on the wellbeing of all inhabitants of the planet.”
Biological Threats
Asha M. George, DrPH, executive director, Bipartisan Commission on Biodefense at the Atlantic Council, and SASB member, Bulletin of the Atomic Scientists, said:
“This year featured degraded capacity to respond to biological events, further development and pursuit of biological weapons, poorly restrained synthetic biology activities, increasingly convergent AI and biology, and the specter of life-ending mirror biology. Partnerships (between countries, between industry and government, and between the public health and national security communities) will be key to managing these risks. With the right tools and determination, we need not fall prey to the diseases that threaten us.”
The Bulletin of the Atomic Scientists was founded in 1945 by Albert Einstein, J. Robert Oppenheimer, and University of Chicago scientists who helped develop the first atomic weapons in the Manhattan Project. The Bulletin created the Doomsday Clock two years later to convey man-made threats to human existence and the planet. The Clock is a reminder of the world’s vulnerability to catastrophe and a symbol that there is still time left to act.