As President Barack Obama counts down his last months in office, he has returned to one of his earliest and most ambitious promises, one he fully articulated in Prague in 2009: “I state clearly and with conviction America’s commitment to seek the peace and security of a world without nuclear weapons.”
In the tumultuous years since, it’s become clear he’s had to lower his disarmament expectations. Proposals for radical reductions have been largely stymied by political realities at home and growing threats abroad. The deal he did make with Iran infuriated Republicans (and some Democrats). Now Obama is going around Congress, the first branch, again: He is asking the United Nations to help him revisit the Comprehensive Nuclear Test Ban Treaty, which turns 20 next month. White House National Security Council spokesman Ned Price says the administration wants to strengthen detonation detection as well as support for nations that already ban tests.
Republicans in the U.S. Senate, who have consistently blocked treaty ratification, are far from amused, a mood only worsened by the prospect of UN involvement in American national security matters. In the two decades since President Bill Clinton signed the treaty, the U.S. has studiously avoided any testing. Republican opponents nevertheless maintain that tying America’s hands threatens the national interest. (Such tests are conducted underground; above-ground explosions largely ceased in 1963 under a separate accord.)
But all the old arguments being dusted off for the latest battle seem to ignore a key fact that may moot the entire debate, at least over U.S. testing: Advances in technology have made live testing pointless. We can do it all in the lab.
How to know a bomb works without setting it off
Nuclear bombs have a special status. They are the president’s weapons. Obama’s longstanding position, that the U.S. can design, build, and maintain nuclear weapons without testing them, raises at least one practical question: Without actual explosions, how will he know his weapons work?
Two advances in science and technology help answer the question: simulation and seismic monitoring. Let’s dip into the deep past for a minute.
As the U.S. government organized its Manhattan Project to build a nuclear weapon in the early 1940s, the nation’s top physicists had to make sure it was feasible and wouldn’t carry any unintended catastrophic risks. Edward Teller, the maverick physicist who would play a pivotal role in developing the hydrogen bomb, asked his colleagues what became a “notorious question,” according to physicist Robert Serber’s written reminiscence: Would a nuclear explosion ignite the atmosphere?
Physics giant Hans Bethe “went off in his usual way, put in the numbers, and showed that it couldn’t happen,” Serber wrote.
Bethe’s tools were too crude to give him insight into the physics and chemistry of a full atomic blast. Within three years, the U.S. military would detonate its first bomb in the Trinity test at Alamogordo, N.M.
Put in the numbers and run that model
Today there’s an expression that describes what Bethe did. He “ran a model.” There were no microprocessors, networks, or LED screens. It was largely a model in an old-fashioned sense, a conceptual model, a framework that helps make sense of facts.
The practice of “putting in the numbers” matured over the decades and helped design the bombs, long-range missiles, and submarines that still make up the strategic “nuclear triad” at the heart of U.S. security.
The demise of the Soviet Union ended the arms race and left the U.S. manning an arsenal without an enemy. So the nation embarked on a program to blow up bombs without setting one off. President George H.W. Bush stopped production of new weapons and Clinton asked for options that didn’t require bomb-testing.
The National Nuclear Security Administration, part of the Department of Energy, ginned up an initiative in the mid-1990s to build supercomputers sophisticated enough to assure U.S. military and political leaders of weapon lethality. “The theme, the punch line, is nothing about this program was guaranteed to succeed,” Energy Secretary Ernest Moniz said of the Stockpile Stewardship Program in October. “There was nothing off the shelf about it. Science and technology had to be invented to go into completely new domains.”
Now the Energy Department’s National Laboratories sign off on the safety and war-readiness of U.S. weapons through monitoring, simulation, and maintenance. Fittingly, “Trinity” is not only a haunting national landmark in New Mexico, but also the name of a supercomputer at Los Alamos National Laboratory capable of performing 40 quadrillion calculations a second.
Monitoring is another reason Obama’s proposal shouldn’t be viewed as a new security initiative. In January, nuclear experts quickly determined that the tremors emanating from North Korea were caused by a bomb, but not the hydrogen bomb Pyongyang claimed. The seismic signal was nearly identical to those of the country’s earlier atomic tests.
Some 76 countries host about 170 seismic monitoring stations that listen 24 hours a day for disturbances on land. Complementary systems listen to the oceans and sniff for radioactivity. Any expansion of monitoring would make it easier to discern the characteristics of a detonated device, but the world is pretty well covered as it is.
Despite political obstacles, scientists have steadily moved nuclear weapons testing out of the ground and into the lab, making it ever less harmful to humans and the environment. The countries that benefit from such advances, especially the signatories of the Comprehensive Nuclear Test Ban Treaty, are probably less interested in how the U.S. keeps its testing confined to the lab than in who gets the real nuclear button come November.
—With assistance by Toluse Olorunnipa in Washington.