Climate science matters in more ways than you might think. It has set the pace and targets for the most ambitious economic transformation since the Industrial Revolution: the transition to a carbon-free economy. Ever since the World Meteorological Organization (WMO) and the United Nations Environment Programme established the Intergovernmental Panel on Climate Change (IPCC) in 1988, climate data and models have been a global public good – an instrument of economic power with growing normative value. Climate targets are increasingly being enshrined in law and cited in jurisprudence.
Climate science is also a necessarily global discipline, because it uses mathematical physics to predict the combined behaviour of the planet’s atmosphere and ocean – two commons that know no borders. Over the last two decades, the field has expanded to incorporate hydrology, ecology, and biogeochemistry in an interdisciplinary earth-systems science that requires substantial infrastructure – from observational systems to monitor the state of the whole planet, to vast computational resources to integrate ever more sophisticated models.
It is a science well-suited for a globalized world, and climate scientists have long focused their attention on the agenda set by international institutions – from the WMO and the IPCC to the UN Framework Convention on Climate Change (UNFCCC), created in 1992 – to guide humanity toward decarbonization.
For many years, climate science faced serious threats, chiefly from politically and financially motivated efforts to discredit its central claims. But the “Merchants of Doubt”, as historians Naomi Oreskes and Erik M. Conway called them, have largely failed. UNFCCC negotiators have made steady progress, overcoming obstacles and recovering from setbacks. There are ongoing debates about what we ought to do, of course. But, by and large, policymakers and businesses have moved from paralysis to commitments, if not yet to large-scale action.
But now a different kind of threat is looming. International turmoil, rising authoritarianism and nationalism, and breaches of longstanding and widely agreed rules are pushing countries away from the international order that has prevailed since the collapse of the Soviet Union. Russia’s war of aggression in Ukraine is the most recent and glaring example in a broader series of geopolitical fractures that are complicating climate scientists’ work.
The danger is that in an increasingly competitive multipolar world, countries will rush to nationalize, consolidate, and silo off planetary observations and computational resources. Not only will the scientific agenda be fractured, but policymakers will start to view climate change through the narrower lens of national security and other parochial interests. Governments will ask what climate change, or the technological responses to it, will mean for their country and its adversaries, rather than for the planet more broadly.
As political boundaries become more salient in scientific pursuits, scientists, and the policymakers who support them, will need to address an important question: When geopolitics turns our scientific understanding of the planet into a competitive field with strategic value, how should scientific institutions adjust?
The old situational awareness
Contrary to popular belief, the roots of climate science lie not in contemporary environmentalism but in twentieth-century security concerns. Modern climate science emerged from specific national agendas and the contest for strategic advantage through superior knowledge of the commons. History is never so neat and predictable as to repeat itself; but, given today’s fracturing global order, scientists and policymakers should look to the past to see what could happen if knowledge about our planet’s workings once again becomes an instrument of geopolitics.
After all, earth-observing infrastructure is highly susceptible to competition. During the first few weeks of its war in Ukraine, Russia was denied access to planetary observations. Because weather information is critical to the use of chemical and biological weapons, the European Organisation for the Exploitation of Meteorological Satellites suspended Russian licences to access its data. Understandably, the organization sacrificed its stated commitment to open data to avoid providing any assistance to attacks on civilians. Yet, in doing so, it also weaponized a major earth-observation system.
Earth observations have a longstanding association with security. In 1939, a German U-boat managed to enter the British Royal Navy’s base at Scapa Flow undetected, sinking the battleship HMS Royal Oak. Such events pushed the world’s navies to make surveilling the planet’s commons a principal objective. As a result, anti-submarine technology like sonar was deployed widely, transforming warfare in the process.
Effective surveillance depends on a deep understanding of the environment. For example, because sonar works by timing the echo of a reflected soundwave, it must account for factors like temperature and salinity gradients, which can bend the sound’s path. Reliable detection depends on knowing the density structure that sound will encounter as it travels. And that, in turn, is a function of surface and deep-water currents. Successful anti-submarine warfare thus depends on measuring the state of the ocean. As German Grand Admiral Karl Dönitz conceded in December 1943, “the enemy has rendered the U-boat war ineffective … not through superior tactics or strategy, but through his superiority in the field of science.”
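How strongly the water column shapes detection can be sketched numerically. The following Python fragment uses a simplified empirical formula for the speed of sound in seawater (the coefficients follow Medwin’s 1975 approximation; the function name and the scenario values are illustrative, not drawn from the text):

```python
def sound_speed(temp_c: float, salinity_psu: float, depth_m: float) -> float:
    """Approximate speed of sound in seawater (m/s), using Medwin's
    (1975) simplified empirical fit. Roughly valid for 0-35 C,
    salinity 0-45 PSU, and depths below about 1,000 m."""
    t, s, z = temp_c, salinity_psu, depth_m
    return (1449.2
            + 4.6 * t - 0.055 * t**2 + 0.00029 * t**3   # temperature terms
            + (1.34 - 0.010 * t) * (s - 35.0)            # salinity correction
            + 0.016 * z)                                 # pressure (depth) term

# A warm surface layer over cold deep water: sound travels faster
# near the surface, so rays refract downward across the thermocline.
surface = sound_speed(20.0, 35.0, 0.0)   # warm, shallow water
deep = sound_speed(4.0, 35.0, 500.0)     # cold water at depth
assert surface > deep
```

The gradient between the two values is exactly the kind of structure a sonar operator must know in advance: without it, the computed position of an echo's source can be badly wrong.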
When the nuclear deterrent plunged beneath the waves during the Cold War, the United States and the Soviet Union recognized that science confers comparative advantage in this domain, and both poured money into physical oceanography, laying the groundwork for the field as we know it today.
The same happened with observations of the atmosphere. After the Soviets surprised everyone by launching Sputnik 1 – the first man-made object to orbit Earth – investments in satellite-observing infrastructure accelerated, becoming an integral component of security arrangements for both sides in the Cold War. In that context, spaceborne remote sensing was the product of a US government programme to determine how weather phenomena affect radar. In the process, modern meteorology was transformed.
Eyes on the prize
As these examples show, monitoring infrastructure is often the first scientific domain where geopolitical competition plays out. Today, it is controlled by a mix of national and private actors. The commercial remote-sensing sector has vastly expanded spaceborne infrastructure that was once limited to a few high-quality, government instruments. And more countries are developing their own capabilities to survey the planet’s systems. China, for example, has its own high-resolution Earth observation system to support high-precision agriculture and ocean monitoring, and it has invested in a fleet of meteorological satellites to serve its partners in the Belt and Road Initiative.
These investments have digitized planetary observations, fuelling optimism that an age of plentiful data is at hand. But the new geopolitical competition raises the risk that knowledge of the commons will be turned into an instrument of hegemony. Governments have already recognized this potential. On December 15, 2016, China seized a submersible drone that the US Department of Defense claimed was gathering oceanographic data in the South China Sea. The drone was picked up in international waters, an act the US Navy described as unprecedented and illegal.
Moreover, since last year’s agreement among Australia, the United Kingdom and the US (AUKUS) to supply Australia with nuclear-powered submarines, China has invested in underwater surveillance infrastructure to monitor contested waters in the region. With measurements of the commons once again becoming a matter of security strategy, scientists and companies – such as Microsoft, which recently launched a planetary computer to house and share monitoring data – will find themselves treading a fine line between the global environmental agenda and national-security interests.
Perhaps more important, the focus of our observations may change. With Russian and Chinese subs patrolling closer to the shores of the US and its allies, and vice versa, previously marginal maritime zones like the Mediterranean and Black Seas will become more central to international strategy.
Like data collection, forecasting and computing infrastructure can also become highly contested domains. In the early stages of the war in Ukraine, the European Centre for Medium-Range Weather Forecasts suspended Russian access to its weather-forecasting and climate-modelling products, because it recognized the tactical value of such information.
When Russia invaded, its aim was to exploit its superior equipment in open-field combat. But Ukrainian forces had no intention of being drawn out of urban areas. They hunkered down, and then it started to rain. Russian tanks were forced onto roads, where they became easy targets. The Russian forces’ superiority evaporated, and the Ukrainians gained the tactical advantage. Knowing what will happen to environmental conditions can be a matter of life or death.
This has long been true. In fact, the modern relationship between national security and environmental forecasts began on November 14, 1854, when hurricane-force winds destroyed the British and French fleets blockading the Russians at Sevastopol during the Crimean War – a curious echo of current events. The first European weather forecasting system emerged from that experience. Initially, forecasters tried to divine the future by assessing how closely present conditions matched past weather maps, compiled by relying on the newly installed telegraph to communicate measurements from across the continent (an earlier form of observational infrastructure).
Then, in 1904 – a year before Einstein’s annus mirabilis, when he proposed the light quantum and special relativity – the Norwegian scientist Vilhelm Bjerknes extended modern physics to the atmosphere and oceans. His equations described winds and currents as a coherent system governed by knowable laws. Forecasting no longer depended solely on stored observations: it relied on the ability to solve mathematical equations to predict the future.
By World War II, the transition to oil had extended the reach of fleets, aircraft carriers had replaced dreadnoughts as capital ships, and fighting had moved from the trenches to the seas and the sky. The world’s commons had become battlefields in a fully industrialized war, and strategists began incorporating environmental forecasting into military doctrine. Famously, weather and surf forecasts saved the D-Day landings from disaster. The science of forecasting was integral to victory.
After the war, policymakers’ focus shifted from securing superiority in combat to winning an elaborate strategic game to control the commons. According to a January 11, 1946, story in The New York Times, officials in Washington, DC, were told of a computer designed specifically to solve Bjerknes’s equations. This extraordinary machine would “lift the veil from previously undisclosed mysteries connected with the science of weather forecasting”. Not by coincidence, the main audience for this announcement was the US military leadership.
The initiative was the brainchild of John von Neumann, the Manhattan Project mathematician and architect of Cold War game theory. His aim was simple: to increase the speed at which equations could be solved. That, he hoped, would provide forecasts for weeks or even months in advance about environmental conditions around the world, giving the US a tactical and strategic advantage. The promise of these new instruments was to make forecasting completely operational, in support of a new American hegemony over the skies and seas.
The science race
Throughout the Cold War, security and science were uncomfortable bedfellows. The former supplied money and a steady stream of problems for the latter to solve, ensuring that the field was both well-resourced and fully occupied. For example, in 1954, the Castle Bravo nuclear test in Micronesia (still the most powerful bomb ever detonated by the US) caused a fireball four miles (6.4 kilometres) in diameter and created a mushroom cloud 25 miles (40 kilometres) high. Castle Bravo was one of a regrettable sequence of nuclear tests in the Pacific, which nevertheless generated invaluable data and insights into the workings of the tropical atmosphere.
Thanks to government-sponsored efforts in observations and computation, by the end of the 1960s, the heirs of von Neumann’s legacy had been able to produce a self-contained, fully consistent computer model of the atmosphere and ocean. (Among them was Syukuro Manabe, the recipient of the 2021 Nobel Prize in Physics for his contributions to the field.)
In the final decade of the Cold War, scientific imperatives shifted as military interests came to focus more narrowly on the technological race between the US and the Soviet Union. Climate science turned toward a more explicitly civilian agenda, using increasingly sophisticated models to understand what governed the climate at the planetary level, and how the climate might change over time. This shift required vastly greater computational power, but it happened at a time when technological capacity was ready to meet the demand. Since the mid-1950s, computing power has increased by ten orders of magnitude, enabling the production of an enormous number of climate simulations.
Today’s geopolitical fractures are appearing at a time when the modelling infrastructure for earth science has never been more complex. As complementary metal-oxide-semiconductor (CMOS) computing reaches its speed limits, computation has spread across ever more processors, producing industrial-scale infrastructure that is now more the domain of dedicated institutions than of academic departments.
Private technology companies have also become important players in earth-system modelling, providing both substantial resources to encourage the digitization of planetary analysis (a new frontier in the so-called Fourth Industrial Revolution) and services in fields like artificial intelligence. For example, DeepMind, an Alphabet (Google) subsidiary, is collaborating with the UK Met Office to use AI in localized weather forecasts. Some scientists believe this industrialization will bring a step change in physical insights, a view that is driving substantial investment, such as the European Union’s support for the creation of a digital twin for Earth.
But the merger of industrial and scientific interests that was mostly benign in a globalized world risks turning into a zero-sum competition. Modelling capabilities will be increasingly integral to governments’ and companies’ efforts to evaluate climate changes, plan long-term investments in critical infrastructure, and understand strategic conduct. As this happens, countries aiming for influence or leadership in a multipolar world (including China, India, Brazil, Russia, the US, Europe and Japan) will face strong incentives to build their domestic capacity.
Adapting to new realities
Governments should be assessing their national capabilities and ensuring that they have the infrastructure and human capital they need to support their management of a changing climate. Countries that cannot afford to build their own capabilities will inevitably be excluded from this race, leading to deeper international dependencies as climate comes to play a greater role in economic policy. Earth-sciences infrastructure will increasingly become a tool in scientific diplomacy, just as it was during the Cold War.
At the same time, oversight of the global corporations that produce planetary data will need to be revisited. As capabilities shift toward the private sector, policymakers should recognize that it matters where those capabilities reside, and which sovereignty they are subject to. While outsourcing such services may have made economic sense in an era of relative global stability, now it could raise security concerns.
Scientific infrastructure will feature prominently in the new search for advantage. When The New York Times reported on von Neumann’s plans in 1946, it also mentioned a far more radical goal. If hurricanes could be predicted far enough in advance, the story noted, “the new discovery of atomic energy might provide a means of diverting, by its explosive force, a hurricane before it could strike a populated place.”
The hubris of the nuclear age produced a dangerous dream: to weaponize Earth’s commons. That effort was thankfully short-lived; military expenditure on weather control fizzled after a decade with little success. But the idea that knowledge of the planet might confer a strategic advantage – and that more research should therefore be directed toward it – never went away.
In 1957, a few years after von Neumann’s announcement, the oceanographers Roger Revelle and Hans Suess pointed out that humanity appeared to be engaged in an unprecedented “large-scale geophysical experiment.” In response, their colleague Charles Keeling began measuring atmospheric carbon dioxide at Mauna Loa, Hawaii. Within a couple of years, he showed that its concentration was increasing in line with the known rate of fossil-fuel combustion.
Then, in 1979, a report from the US National Academy of Sciences concluded that a doubling of carbon dioxide would produce an average temperature increase of three degrees Celsius (an estimate of climate sensitivity that has not changed substantially since). It also suggested that the planet was headed toward just such an increase.
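The report’s headline number can be reproduced with a standard back-of-the-envelope relation: CO2’s radiative forcing grows logarithmically with its concentration (the 5.35 W/m² coefficient comes from a later empirical fit by Myhre et al., 1998, not from the 1979 report itself), and equilibrium warming scales linearly with forcing. A sketch in Python, with the sensitivity parameter calibrated to the Charney report’s three-degree figure (the function names are illustrative):

```python
import math

def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Radiative forcing (W/m^2) from a CO2 change relative to a
    pre-industrial baseline, using the Myhre et al. (1998)
    logarithmic fit: dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Sensitivity parameter (degrees C per W/m^2), chosen so that a
# doubling of CO2 yields 3 C, matching the 1979 Charney estimate.
LAMBDA = 3.0 / co2_forcing(560.0)

def equilibrium_warming(c_ppm: float) -> float:
    """Equilibrium temperature change (C) for a given CO2 level."""
    return LAMBDA * co2_forcing(c_ppm)

print(round(equilibrium_warming(560.0), 2))  # doubling -> 3.0
print(round(equilibrium_warming(420.0), 2))  # roughly the 2020s level
```

The logarithm is the important feature: each successive increment of CO2 adds less forcing than the last, which is why climate sensitivity is conventionally quoted per doubling rather than per tonne.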
It was an extraordinary realization: von Neumann had imagined that climate modelling would lead to people weaponizing Earth’s commons; but the modelling revealed that humanity had been turning the climate into a ticking time bomb. It was also clear that the consequences would be geographically uneven. Some countries would feel greater effects than others. But in 1979, scientists could not predict which countries would fall into each group.
When the Berlin Wall fell, the Cold War obsession with environmental advantage was temporarily buried along with European communism. For the next 30 years, climate scientists focused overwhelmingly on refining their estimates of global climate sensitivity, to guide the world toward a reduction in emissions. A small minority even began exploring more radical interventions. Scientists at Harvard’s Solar Geoengineering Research Program, for example, have proposed injecting particles into the stratosphere to shade the planet.
No time to waste
All humanity loses if Earth’s climate is radically changed. But not all changes will be the same. Less-developed countries will find it much more difficult to lift people out of poverty. A melting Arctic will create winners and losers, triggering changes in trade routes and new contests for resources that will alter where we can source commodities. Just as our efforts to avert climate change will reconfigure the global economy, so, too, will climate change itself.
Scientists and policymakers need to do more to stay ahead of both climate change and geopolitical shifts. Money is beginning to move, with large funders like the EU shifting more resources toward managing the effects of global warming. But recent assessments of the literature suggest that the scientific community’s attention is still directed elsewhere. Far more is published on the planetary behaviour of the climate system than on regional and local engineering or institutional solutions to socioeconomic and security challenges stemming from climate change.
But it is those regional and local innovations that will create comparative advantage and that constitute the knowledge frontier of a multipolar world. If we are answering the wrong question, it hardly matters that we have better instruments and greater power to compute solutions than ever before.
The history of climate science exemplifies the power of state-directed research. Policymakers must recognize that, as with the investments of the twentieth century, climate-focused research and operational capabilities are becoming a matter of national security. Science that helps us understand Earth’s commons is no longer just a tool for environmental advocacy. The sooner we recognize this shift, the easier it will be to prepare for what comes next.
Giulio Boccaletti is a visiting senior fellow at the Euro-Mediterranean Centre on Climate Change and an honorary research associate at the University of Oxford’s Smith School of Enterprise and the Environment.
Copyright: Project Syndicate