
The world is awash with nuclear weapons. We should be more fearful

Sheer luck has saved humanity so far. Arms control talks must be returned to the top of the agenda

The Sunday Times

In his book The Black Swan: The Impact of the Highly Improbable, the polymath Nassim Nicholas Taleb writes about a turkey being fattened in the build-up to Thanksgiving. Each day, the farmer offers juicy grain and, over time, the turkey comes to believe (quite reasonably) that the farmer is a jolly nice person who cares a lot about the welfare of turkeys. Only on Thanksgiving eve, when the farmer arrives with a carving knife, does the poor bird realise the truth. This is what Taleb calls “a reversion of expectation”.

When it comes to nuclear war, I fear that we are the turkeys. Over the past seven decades, the foreign policy establishment has taken each passing year as evidence that the world’s nuclear arsenal has contributed to peace via the inviolable logic of mutually assured destruction. Game theorists tell us that no rational person would initiate nuclear war, given that it would lead to their own destruction. And this, I’d suggest, is why we took our eye off the ball of arms control. Today, according to some experts, there remain enough weapons to wipe out our species — the ultimate reversion of expectation.

But perhaps the crucial insight when it comes to the risks of nuclear war is that psychopaths such as Vladimir Putin, while dangerous, are not the main problem. One imagines that even a crazed chap like him would think twice before initiating a conflict that could wipe out the Earth, or that he would at least be restrained by senior officers. No, the more serious problem is accidental war: an ambiguous blip on a radar screen that triggers a rapid counterattack, so that before anybody knew it, missiles would be criss-crossing the planet like the aircraft trails on those satellite pictures that light up the night sky.

The BBC recently compiled a list of 22 “near misses” when the world came within a hair’s breadth of catastrophe. One of the most alarming was when Boris Yeltsin stood with his generals clutching an activated nuclear briefcase, frantically trying to decide whether to launch a massive response to a suspicious-looking rocket headed towards Russian airspace. They decided against, thankfully. It turned out to be a Norwegian scientific probe investigating the northern lights.

The Norwegian government was bemused when it learnt that its probe had almost led to global catastrophe, because it had publicly announced the mission the previous month. But Yeltsin and his senior staff were so fixated on the rocket and the possibility of an imminent apocalypse that they didn't think to look through the files to see whether any foreign government had given advance notice of a launch. It probably didn't even cross their minds.


Examples like this should scare us because they illuminate a deeply human aspect of this high-tech danger. In the event of an attack, real or imagined, a nation has only a few minutes to decide whether to respond; if it waits a moment too long, its ability to retaliate may already have been destroyed.

This is why both America and Russia have "launch on warning" protocols, a hair-trigger standing between our species and calamity. The most consequential existential decisions are thus necessarily taken in an environment of extreme pressure, with perception narrowing and panic rising.

Another classic example occurred at the height of the Cuban missile crisis, when the US navy dropped signalling depth charges to force a Soviet nuclear submarine to surface. Far below, Valentin Savitsky, the captain, and Vasily Arkhipov, a senior officer, broke into intense debate.

They had authorisation to launch a nuclear strike, but because they had been out of radio contact with Moscow for days they didn't know whether war had already broken out. The captain, itching to launch, was talked out of it by Arkhipov, a man who would become known as "the man who saved the world".

The underlying point is simple: the absence of nuclear war in the postwar period isn't a vindication of the doctrine of mutually assured destruction but sheer good fortune. Max Roser of the influential Our World in Data website has shown that, even with the risk of nuclear war at just 1 per cent per year, conflict is roughly as likely as not within any 70-year period, and the probability tends towards certainty over time. A small risk will eventually get us all in the end, which is why the turkey analogy is apt. We are, to change the avian metaphor, sitting ducks.
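
To see the arithmetic behind Roser's point (a back-of-the-envelope sketch, assuming a constant, independent 1 per cent risk each year; the 1 per cent is a hypothetical figure, not a measured one):

$$P(\text{war within } n \text{ years}) = 1 - (1 - p)^{n}, \qquad p = 0.01,\ n = 70 \ \Rightarrow\ 1 - 0.99^{70} \approx 0.50$$

Roughly a coin flip over a single lifetime; and because (1 − p)^n shrinks towards zero as n grows, the probability climbs towards 1 over longer horizons.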


Yet my growing fear (forgive a rather gloomy column) is that 1 per cent annual risk may be an underestimate, because most analysts calculate risk in part using the known tally of near misses. But in any complex system, many near misses (perhaps a majority) are unknown. Think of a light aircraft without radar missing a mountain by a few inches in heavy fog: the pilot will never know, and may congratulate himself on the safety of his journey. This is what we might call the dark matter of systemic risk.
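
A rough illustration of why the undercount matters (the 50 per cent detection figure here is an assumption chosen for illustration, not a number from any analyst): if only a fraction q of all near misses ever comes to light, the true tally is the observed tally divided by q:

$$N_{\text{true}} = \frac{N_{\text{observed}}}{q}, \qquad q = 0.5 \ \Rightarrow\ N_{\text{true}} = 2 \times N_{\text{observed}}$$

so a risk estimate built only on the recorded near misses could understate the danger by a factor of two or more.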

And this takes us to the nub of the issue. In an industry such as aviation, this dark matter is progressively revealed by tragic accidents. Each time a plane falls out of the sky, the world learns more about the flaws in the system, leading to reforms that make flying safer. As Chesley Sullenberger, the pilot who landed a plane in the Hudson river in 2009, put it: "We have purchased lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to future generations."

An intuitive example comes from the 1940s, when Boeing B-17s kept crashing inexplicably on landing. An investigation revealed that the switch controlling the flaps was identical to the one controlling the landing gear, and the two were placed side by side. Under the pressure of a difficult landing, pilots were pulling the wrong lever, so the industry responded by attaching a small rubber wheel to the landing-gear switch and a small flap shape to the flaps control. The controls now had an intuitive meaning, easily identified under pressure. Accidents of this kind disappeared almost overnight: tragedy led to progress.

But with nuclear risk, this kind of learning may not be possible because the first time something goes catastrophically wrong is the moment our species is potentially wiped out (this is what theorists call an “absorbing barrier”). Worse, this dark matter may be silently accumulating due to the emerging threats of cyberattacks and sophisticated hacking.

And this is why, whatever happens in Ukraine, we must bring arms control back to the top of the agenda. It is impossible to design a fail-safe system, but we can at least reduce the consequences of disaster. An arsenal at, say, 15 per cent of its current size would still deter war (hundreds of millions could be wiped out) but it would not liquidate our species, thus handing survivors half a chance to learn the lessons. Perhaps our descendants might even figure out how to co-operate in such a way as to make nuclear arsenals unnecessary.


What I am sure of is that future generations will thank us for taking action now, while we still can. As long as nuclear weapons exist, nuclear warfare is — at some point — inevitable.

@MatthewSyed