The Fukushima accident of 2011 led to an epistemological crisis in the American nuclear operating, design, and regulatory communities, prompting each to reexamine what it meant for a reactor to be 'safe'. Even as these communities sought to forge consensus on the 'root causes' of the accident, several reactor design projects were underway. In a cross-sectional study of these contemporary reactor design projects, based on in-depth interviews with American nuclear reactor designers, I show how the lack of an industry-wide consensus led these designers to form individual experiences and interpretations of the accident, which ultimately produced diverging design responses. With the Fukushima accident in the foreground, and wishing to make their reactor designs safer, these designers not only elevated levels of safety in an absolute sense but also discovered novel design moves that made their technologies safe in new ways. What is striking is that no new scientific breakthroughs were needed to create these new designs, only a perceptual shift in the reactor designers' conceptualizations of safety.
This case of nuclear reactor design and safety suggests an argument that may apply more broadly to the governance of complex socio-technical systems. Homogeneous analytical cultures create a tunnel vision that leaves key players unable to recognize opportunities for invention and improvement in the design and regulation of complex systems, thereby stymying their technological evolution and making them vulnerable to failure and instability. Broadly, these findings suggest that designers and stewards of complex, high-hazard technologies ought to forge more pluralistic and inclusive analytical cultures. An insight with potentially sweeping implications for nuclear energy policy is that the traditional emphasis on institutional convergence and harmonization is deeply flawed.