Seventy-five years after the first explosive nuclear test, and decades after such tests were outlawed, sophisticated virtual testing allows American physicists to understand these weapons better than ever.
Just before sunrise on July 16, 1945—75 years ago today—a patch of New Mexican desert was incinerated during the first trial of the most destructive weapon ever created. The plutonium bomb used for the Trinity test left a 5-foot crater in the ground and turned the surrounding desert floor into a radioactive green glass. The blast bathed the peaks of the nearby Oscura Mountains in a blinding white light, and dozens of scientific observers watching from 20 miles away reported feeling an immense heat wash over them. As the light from the explosion faded, one of the architects of the bomb, Kenneth Bainbridge, gave a pithy appraisal of the event to J. Robert Oppenheimer, the project’s lead scientist: “Now we are all sons of bitches.”
And he was right. Less than a month later, the United States dropped the same type of bomb on Nagasaki, Japan, just three days after detonating a smaller nuclear warhead over Hiroshima. The bombings effectively ended World War II, but at the price of well over 100,000 civilian lives and the enduring suffering of those who survived.
The bombing of Nagasaki was the second and final time a country has deployed a nuclear weapon in combat. But it wasn’t the last nuclear explosion. Despite a lifetime of activism by Bainbridge and many of his colleagues, nuclear tests didn’t end with the war. By the time the US signed the United Nations Comprehensive Nuclear Test Ban Treaty in 1996 and agreed to stop blowing up nukes, American physicists and engineers had conducted more than 1,000 tests. They blew up nuclear weapons in the ocean. They blew them up on land. They blew them up in space. They dropped them from planes. They launched them on rockets. They buried them underground. A small army of US weapons scientists blew up a nuclear weapon every chance they got, and at the height of the nation’s testing program they were averaging one detonation per week.
The test-ban treaty was meant to end all that. Atmospheric nuclear tests have been internationally banned since the early 1960s due to health concerns about radioactive fallout and other hazards. These weren’t baseless fears. In the 1950s, US physicists drastically miscalculated the explosive yield of a thermonuclear bomb during a test in the Pacific Ocean, and the ashy radioactive fallout was detected as far away as India. Exposure to the fallout caused radiation sickness in the inhabitants of the islands around the test site, and a group of Japanese fishers suffered severe radiation burns when the fallout landed on their boat. Miscalculations of this sort were distressingly common at the time. Only a few years later, a bomber accidentally dropped a nuclear weapon on Kirtland Air Force Base on the outskirts of Albuquerque, New Mexico. (Fortunately, the bomb had not yet been loaded with the plutonium pit needed to kick off a nuclear chain reaction.)
The US signed the Partial Nuclear Test Ban Treaty—an agreement with the Soviet Union and the United Kingdom to cease aboveground, underwater, and outer-space tests—in 1963. But nuclear testing only accelerated when it was pushed underground. The US nuclear arsenal peaked in 1967 with 31,255 warheads, and the country detonated as many nukes in the 7 years after the partial test ban as it had in the previous 18. “With nuclear testing you were under constant pressure to design a new weapon, engineer it, put it down a hole, blow it up, and then move on to the next one,” says Hugh Gusterson, an anthropologist at the University of British Columbia and an expert on the human factors in nuclear weapons research. “The scientists didn’t have a chance to pause and catch a breath.”
This was, obviously, counter to the spirit of disarmament and reducing the world’s nuclear arsenal, which has been the purported goal of the world’s nuclear states since the 1960s. The tests weren’t about ensuring that America’s nukes still worked or learning about the fundamental physics of the weapon. They were about building bigger and better bombs. “Very few of the tests were reliability tests, where you blow it up to see if it still works,” says Gusterson. “They were almost all tests to develop new designs.”
The US ended all underground nuclear tests in the early 1990s in the lead-up to the Comprehensive Nuclear Test Ban Treaty, despite protests from the heads of the nation’s three national weapons labs—Lawrence Livermore, Sandia, and Los Alamos—who fought “tooth and nail” to prevent the ban, says Gusterson. They were concerned, he says, that a ban would reduce the reliability of America’s nukes and prevent the next generation of nuclear weapons designers and engineers from learning the tools of the trade. But perhaps most importantly, they saw the ban as a threat to the labs’ very existence. All three had been founded to further the development of America’s nuclear arsenal. What was the point of keeping them around if not to blow up their creations?
Mark Chadwick, the chief scientist in the Los Alamos Weapons Physics Directorate, arrived at the national lab in 1990 fresh out of a physics doctoral program at Oxford. At the time, he says, there was a lot of debate among the Los Alamos scientists about the future of the lab, or whether it would have a future at all. “Some thought the labs would really end up struggling to find business and that the nuclear deterrence mission would sort of fade away,” Chadwick recalls. “Overall, the pessimism that the national security mission wouldn’t remain important proved wrong. And fairly quickly, in fact.”
The US conducted its last explosive nuclear test in September 1992. Today, the nation’s nuclear weapons research is focused on reliability testing and maintenance of the roughly 4,000 active warheads in its arsenal, a program broadly referred to as “stockpile stewardship.” After the test ban, the US government lavished funding on the new stewardship program to keep the nation’s weapons up to snuff. The so-called virtualization of US nuclear tests meant that weapons scientists would employ the most powerful lasers and supercomputers in the world to understand these weapons instead of blowing them up. Physicists at the labs work on the best experimental equipment that money can buy, and their funding has ballooned under the Trump administration. “Business is booming, even without nuclear testing,” says Gusterson.
At the heart of the US stockpile stewardship program is Lawrence Livermore National Laboratory, a sprawling complex across the bay from San Francisco. It’s home to the National Ignition Facility, which uses the most powerful laser in the world to re-create the conditions found in the heart of an exploding nuclear bomb. “It’s not so much that it replaces nuclear testing, but it’s a very different, richer perspective on what’s happening in an operating weapon,” says Kim Budil, principal associate director for the Weapons and Complex Integration directorate at Livermore.
Nuclear tests have always served a variety of purposes. Their primary one, of course, has been deterrence—an ever-increasing show of strength meant to discourage America’s adversaries from ever hitting the big red button. But even back when the military detonated live nukes, its architects were doing everything they could to figure out exactly what was happening inside. Each bomb was outfitted with tens of millions of dollars worth of sensors designed to capture data in the fraction of a fraction of a second before they were destroyed. Virtualization now allows scientists to dig deeper into the physics of the bomb.
“Over this 25-year stockpile stewardship program, we have dramatically increased our knowledge of the fundamental science that’s required to do this work,” says Budil. “We have types of data and quality of data that were unimaginable during the test era, just from the advances in experimental technology.”
The National Ignition Facility at Lawrence Livermore National Laboratory. Photograph: Jason Laurea/LLNL
Say, for example, physicists at Livermore are interested in how tiny imperfections in the materials used in a bomb affect its performance. They can load small samples of the material into target vessels that may be just a few millimeters across. NIF channels an enormous amount of energy into 192 laser beams that are aimed at the target; when they strike, the vessel heats up to more than 5.4 million degrees Fahrenheit. If the vessel is a small gold cylinder called a hohlraum, the lasers will cause it to act like an x-ray oven, shocking the material inside with a high dose of radiation. Scientists can then use an imager to study how the x-rays interact with the material, which is relevant to protecting the weapons from certain kinds of missile defense systems.
But the real promise of NIF, says Budil, is that it could set us on the path to fusion energy by way of modeling an exploding nuclear bomb. In this case, the laser’s target is loaded with a diamond capsule filled with a gaseous mixture of two hydrogen isotopes called deuterium and tritium. When the lasers hit the target, the x-rays burn off the capsule’s shell. As that material blows off, it causes the capsule to collapse incredibly fast. For a brief moment, pressures inside the capsule are more than 1 million times greater than the atmospheric pressure on the surface of the Earth. This causes the hydrogen isotopes to fuse together and release a tremendous amount of energy.
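The energy released when deuterium and tritium fuse can be sketched with back-of-the-envelope arithmetic. A minimal Python sketch (the 17.6 MeV figure is the standard energy release of a single D-T fusion reaction; the milligram-scale fuel mass is an illustrative assumption, not a NIF capsule specification):

```python
# Back-of-the-envelope energy release from deuterium-tritium fusion.
# One D + T -> He-4 + neutron reaction releases about 17.6 MeV.

EV_TO_J = 1.602176634e-19   # joules per electronvolt
MEV_PER_REACTION = 17.6     # standard D-T reaction energy, MeV
AVOGADRO = 6.02214076e23    # particles per mole
DT_PAIR_MASS_G = 5.03       # approx. grams per mole of one D + T pair (2.01 + 3.02)

energy_per_reaction_j = MEV_PER_REACTION * 1e6 * EV_TO_J
print(f"Energy per reaction: {energy_per_reaction_j:.2e} J")  # ~2.82e-12 J

# Energy from fusing one milligram of D-T fuel, if every pair fused
pairs_per_mg = (1e-3 / DT_PAIR_MASS_G) * AVOGADRO
energy_per_mg_j = pairs_per_mg * energy_per_reaction_j
print(f"Energy per mg of fuel: {energy_per_mg_j / 1e6:.0f} MJ")  # ~340 MJ
```

Each reaction releases a minuscule amount of energy, but there are so many nuclei in even a milligram of fuel that complete burn-up would yield hundreds of megajoules, which is why such tiny capsules are worth studying.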
The conditions in the target at the moment of fusion—extreme as they are—still pale in comparison to the environment in a thermonuclear bomb at the moment of detonation. To re-create those conditions, Budil says, would take an even stronger laser system. “That was something unique about doing a nuclear test—that you could generate these incredibly intense environments,” she says. “We don’t have experimental facilities where we have easy access to those conditions.” But by extrapolating from the experimental results at NIF, physicists can still get an unprecedented view of the core of an exploding bomb.
If physicists can sustain that fusion reaction—if they can use lasers to squeeze the hydrogen without letting go—they can get it to release more energy than it took to make the reaction happen. This is called ignition. For the physicists at NIF, achieving ignition would give them a chance to study the conditions of an exploding nuke in detail, and it would also put the world on the path to a virtually unlimited form of clean energy. NIF hasn’t achieved ignition yet, but Budil is optimistic that it’s only a matter of time. “We’re close,” she says.
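Ignition, as described above, can be framed as a target gain greater than one: fusion energy out exceeds laser energy in. A hedged sketch of that bookkeeping (the 1.9 MJ laser energy is roughly NIF's publicly reported delivered energy; the shot yields are made-up illustrations, not real results):

```python
def target_gain(fusion_yield_mj: float, laser_energy_mj: float) -> float:
    """Gain Q = fusion energy released / laser energy delivered to the target."""
    return fusion_yield_mj / laser_energy_mj

NIF_LASER_MJ = 1.9  # approximate energy NIF's 192 beams deliver (assumption)

# Hypothetical shot yields in MJ -- purely illustrative values
for yield_mj in (0.05, 1.0, 2.5):
    q = target_gain(yield_mj, NIF_LASER_MJ)
    status = "ignition (Q > 1)" if q > 1 else "below ignition"
    print(f"yield {yield_mj:>4} MJ -> Q = {q:.2f}: {status}")
```

The point of the sketch is simply that "ignition" is a ratio, not an absolute yield: a shot that releases 2.5 MJ against 1.9 MJ of laser input crosses the threshold, while the same yield against a bigger driver would not.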
When they’re not smashing atoms with lasers, scientists at Livermore also conduct what are known as “subcritical tests” at a National Nuclear Security Administration site in the Nevada desert. At BEEF—the Big Explosives Experimental Facility—researchers subject (nonnuclear) materials found in nuclear weapons to extremely powerful conventional explosions to study how they’d respond to an actual nuclear blast. Down the road from BEEF, physicists use a 60-foot gas gun called Jasper to shoot projectiles at plutonium. These projectiles reach speeds of around 17,000 miles per hour—about 10 times faster than a bullet—and generate powerful shock waves when they slam into their target. By studying how the plutonium responds to these pressures and temperatures, physicists can get a better idea of how it will behave inside an exploding nuclear weapon.
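The speed comparison above is easy to check with a unit conversion. A small sketch, assuming a rifle-bullet muzzle speed of roughly 1,700 mph (a typical ballpark figure; real bullets vary widely by cartridge):

```python
MPH_TO_MS = 0.44704  # metres per second per mile per hour (exact by definition)

jasper_mph = 17_000  # projectile speed quoted for the Jasper gas gun
bullet_mph = 1_700   # assumed typical rifle bullet, for comparison only

jasper_ms = jasper_mph * MPH_TO_MS
print(f"Jasper projectile: {jasper_ms:,.0f} m/s")        # ~7,600 m/s
print(f"Ratio vs. bullet:  {jasper_mph / bullet_mph:.0f}x")  # ~10x
```

At roughly 7.6 kilometers per second, the projectile moves faster than a satellite in low Earth orbit, which is why the impact can momentarily reproduce weapon-relevant pressures in the plutonium sample.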
The data from these experiments is used to verify the predictions of nuclear weapons simulations cooked up by Livermore’s Sierra supercomputer and to refine the models of the weapon systems that are fed to it. Sierra is the third fastest supercomputer in the world, and Budil says its models are used to understand how changes in the stockpile over time may affect a weapon’s safety or effectiveness. But she cautions that the computer’s models are only as good as its data, which drives physicists at the labs to conduct ever more sophisticated and sensitive experiments.
“The computing machines we’re using today are extraordinary,” says Budil. “But, roughly speaking, they only know what we know. So where there are gaps in our models they won’t give the right answer. We fill that gap with experimental data.”
The Sierra supercomputer at Lawrence Livermore National Laboratory. Photograph: Laura A. Oda/Getty Images
Although the primary directive of the US weapons labs is to conduct experiments to ensure the reliable operation of the nation’s nukes, the same facilities can be used to study scientific problems that have nothing to do with war. Michael Cuneo is the senior manager for the Z machine at Sandia National Laboratories, a singular experimental facility that creates conditions found nowhere else on Earth. No more than once per day, massive capacitor banks near the Z facility are charged with a tremendous amount of electricity that is released all at once in a pulse so powerful it causes the ground around the facility to shake. Each shot has 1,000 times the electrical energy of a lightning bolt, and all of it is focused on a target the size of a quarter.
Like NIF, one of the primary goals at Z is to study the fusion reactions that occur when the target implodes at over 3,000 miles per second. But the extreme pressures and temperatures that occur around the target—sometimes in excess of 3 billion degrees Fahrenheit—also make it a great way to study the conditions during a nuclear detonation. Cuneo oversees about 140 shots of the Z machine each year, many of which are used for classified national security experiments. But Cuneo says the Z machine is also regularly used by researchers working on questions about how planets evolve or the processes that power the sun.
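The temperatures quoted for Z can be put in context with a quick conversion. A sketch, assuming a solar-core temperature of about 15 million kelvin (a standard astrophysics ballpark figure, introduced here for comparison):

```python
def fahrenheit_to_kelvin(f: float) -> float:
    """Standard Fahrenheit-to-kelvin conversion."""
    return (f - 32.0) * 5.0 / 9.0 + 273.15

z_peak_k = fahrenheit_to_kelvin(3e9)  # the article's 3 billion F figure
SUN_CORE_K = 1.5e7                    # approx. temperature of the sun's core

print(f"Z peak: {z_peak_k:.2e} K")  # ~1.67e9 K
print(f"That is roughly {z_peak_k / SUN_CORE_K:.0f}x the sun's core")
```

Three billion degrees Fahrenheit works out to well over a billion kelvin, orders of magnitude hotter than the center of the sun, which is what makes Z shots useful for studying stellar processes in the first place.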
“There may be a few shots a year that are really just basic science, and many of the experiments serve a dual use,” says Cuneo, who estimates that approximately 10 percent of the Z machine’s shots are for fundamental science experiments. “But that same experimental platform and the same techniques are also used to investigate the performance of materials that are relevant to nuclear weapons.”
Today, weapons maintenance has superseded weapons development, and the show of strength implicit in nuclear testing has been replaced with a kind of soft power, says Gusterson. He recounts how John Immele, the deputy director of national security at Los Alamos National Lab, made the case in the ’90s that the US could flex its nuclear superiority by sending scientists out to give cutting-edge presentations at conferences. This would, presumably, impress upon the world just how well America’s weapons scientists knew their stuff. The implication was that if you messed with America, they’d apply that knowledge to you.
“Nuclear testing not only proof-tested new designs during the Cold War, it also had this sort of signaling function where every time the Earth shook you were signaling what you could do to the other side’s cities,” says Gusterson. “Now you have to find another way of signaling, so you do it with PowerPoint presentations instead.”
But despite nearly global recognition that ending nuclear tests was a good idea, earlier this year the Trump administration floated the idea of resuming explosive nuclear tests. “It’s not something that came out of nowhere,” says Zia Mian, the codirector of Princeton University’s Program on Science and Global Security. Mian points to the influence of Marshall Billingslea, whom President Trump recently appointed as the special presidential envoy for arms control, as a key factor. In the 1990s, Billingslea worked for the Republican senator Jesse Helms, who was a vocal opponent of the US signing the Comprehensive Nuclear Test Ban Treaty. “This has been an ongoing recurring effort in Republican administrations by groups of people and institutional interests who are opposed to the very idea of restraint on the US military’s nuclear weapon capabilities,” Mian says.
Mian characterizes the Trump administration’s interest in nuclear testing as a strongman negotiating tactic at a time when economic and political tensions between the US, China, and Russia are at a boiling point. “It’s a purely political demonstration of American resolve,” he says. “If the US moves toward testing nuclear weapons as proof of alpha-ness in the international community to satisfy Donald Trump’s ego and to force other people into submission, one can imagine that other countries will find their own ways of demonstrating their own determination not to be bullied through detonating nuclear weapons. That leads to a very dark place in international politics very quickly.”
The irony is that resuming nuclear tests would almost certainly serve the interests of other countries more than it would help the US. Only three countries—India, Pakistan, and North Korea—have conducted explosive nuclear tests since the Comprehensive Nuclear Test Ban Treaty was signed 25 years ago. But if the US were to resume nuclear tests, Mian says, it would effectively be an open invitation for other countries to do the same. The US has conducted hundreds more tests than any other nuclear-armed country, and a couple more won’t drastically improve the way American weapons designers understand these systems. But newer entrants to the nuclear arena, like India and Pakistan, have completed only a few explosive tests, and more testing could help them significantly improve their weapons systems. This, in turn, could kick off a new regional or global nuclear arms race.
“The relative benefit to other countries of resuming testing might be greater than for the US in terms of reliability, confidence, weapon design,” says Mian. “That is a strategic calculation to try and maintain US advantage in comparison to other countries, rather than abandoning nuclear testing as a common good for everybody.”
So until the day comes that the world’s leaders go beyond mere test bans and decide to dismantle their nuclear arsenals, physicists and engineers will continue to toil away out of view of the public eye, creating ever more faithful models of the bombs they are compelled to study, but hope will never be used.
All Rights Reserved for Daniel Oberhaus