Policy Brief - Quarterly Journal: International Security

Better Threat Assessments Needed on Dual-Use Science

Author: Kathleen M. Vogel
February 2014

This policy brief is based on "Expert Knowledge in Intelligence Estimates: Bird Flu and Bioterrorism," which appears in the winter 2013/14 issue of International Security.

Bottom Lines

  • Flawed Frameworks for Assessing Dual-Use Science. Many U.S. intelligence and nongovernment assessments of dual-use science place too much emphasis on narrow, abstract technical details, rather than on the broader social and scientific factors that can affect terrorist threats.
  • Disconnects between Intelligence and Relevant Experts. Intelligence analysts responsible for assessing security threats in the biological arena have only limited and ad hoc relationships with science advisers or science advisory groups. They have virtually no contact with social science experts who could educate them about the social dimensions of scientific work and technology.
  • Managing the Politics of Expertise. New structures and practices for the acquisition and use of expert knowledge in threat assessments need to be created to reduce the level of distrust between the scientific and intelligence communities.
  • New Resources for Intelligence. Intelligence analysts require a broader array of social, material, and intellectual resources to draw on for their threat assessments of dual-use science.

In October 2013, scientists in California made headlines when they discovered a new strain of Clostridium botulinum, the bacterium responsible for the disease botulism. The new strain produces a type of deadly botulinum toxin for which there is no antidote. The scientists withheld the bacterium's genetic sequence when they published their research, because they and public health officials wanted to prevent the sequence information from falling into terrorists' hands.

This incident harkens back to a controversy that began in late 2011, when two leading influenza scientists, Ron Fouchier and Yoshihiro Kawaoka, attempted to publish details of how their research teams had mutated the H5N1 bird flu virus to make it transmissible via aerosol. The U.S. government became concerned that terrorists might be able to use this scientific information. The scientists initially faced objections from an independent expert panel, the National Science Advisory Board for Biosecurity (NSABB), which the U.S. government established within the Department of Health and Human Services after the terrorist attacks of September 11, 2001, to advise the government on biosecurity issues. In its initial recommendations, the NSABB argued that the experiments' general conclusions should be published, but not "methodological and other details that could enable replication of the experiments by those who would seek to do harm." The NSABB eventually reconsidered its original recommendation, and the researchers published their work in full.

The U.S. government and public's concern that published biological experiments might serve as recipes for terrorism is not new. In 2002, publication of the artificial synthesis of the polio virus raised fears similar to those generated by the H5N1 controversy. Since then, a string of experiments in synthetic genomics, which have created larger and more complex viral and bacterial genomes, has heightened concerns that these scientific methods could diffuse to bioterrorists. For instance, in 2005, scientists synthesized an influenza strain containing gene segments from the 1918 pandemic virus that showed high virulence and mortality. These persistent controversies have revealed weaknesses in the U.S. scientific, intelligence, and government institutions that assess the security threats posed by the emerging life sciences and biotechnology. The H5N1 controversy, in particular, illuminates shortcomings in existing threat assessments.

Flawed Frameworks for Assessing Dual-Use Science

To date, most threat assessments of experiments such as those at the center of the H5N1 controversy have concentrated on technical minutiae rather than on the broader social and scientific considerations that can shape specific terrorist threats. Scientific knowledge consists of two components: explicit knowledge and tacit knowledge. Explicit knowledge is the written information found in journal articles, textbooks, or online sources. By contrast, tacit knowledge is the unarticulated, personally held knowledge acquired while working in a laboratory, through practical, hands-on processes and social interactions. U.S. assessments of dual-use biotechnologies have largely neglected to rigorously evaluate tacit knowledge. During the NSABB's deliberations over the H5N1 experiments, the advisory panel focused only on the written details within the scientific manuscripts. Although the NSABB talked with Fouchier and Kawaoka, the two principal investigators who led the experiments, its members did not carry out site visits or pursue in-depth interviews with the two teams of researchers who actually conducted the experiments over a several-year period. As a result, the NSABB was unable to assess the tacit knowledge that underpinned the teams' scientific work (e.g., how the scientists used or adapted different molecular biology techniques to produce mutations in the virus). Such knowledge was critical to understanding whether an individual or terrorist group could replicate these experiments.

Disconnects Between Intelligence and Relevant Experts

The H5N1 controversy also revealed disconnects between the NSABB and the life science and intelligence communities. The U.S. government and the public usually view the intelligence community as the government's provider of expertise on security threats. My research into the H5N1 case indicated that, during the controversy over publication of the flu experiments, key intelligence analytic units responsible for assessing life science threats did not have access to the H5N1 scientific manuscripts or the NSABB deliberations. Rather, the NSABB controlled access to critical information, provided little transparency in its deliberations, and drove the analysis of the threat. The intelligence community was drawn into the H5N1 controversy only after the NSABB issued its first recommendations.

Managing the Politics of Expertise

The NSABB was beset by both internal and external political disputes. The media, the public, and the intelligence community accused the board of agenda setting and of harboring conflicts of interest. These disagreements led intelligence analysts (and others) to doubt the NSABB's recommendations and expertise, and made it harder for analysts to obtain reliable expert knowledge. Existing government and nongovernment reports focused on improving intelligence analysis of the life sciences have so far failed to address how politics influences expert-informed assessments and how the validity of expert knowledge can be evaluated.

New Resources for Intelligence

With advances in the life sciences continuing, more questions and controversies over how dual-use science should be handled and published will emerge. Intelligence analysts need to be more involved in these evaluations, and the U.S. government should give them a wider range of resources, structures, and mechanisms that would enable them to do so. For example, intelligence analysts must devote more attention to assessing both the explicit and tacit knowledge in dual-use experiments. Such assessments would examine not only written scientific information, but also knowledge that is not codified, residing instead within the array of research staff, postdoctoral fellows, graduate students, and technicians who actually carry out the experiments. In addition, intelligence analysts should give more attention to the broader socio-organizational factors that can shape science and technology (e.g., how scientists are trained, how scientists produce knowledge by working in teams, and how management can foster or hinder scientific work).

Moreover, assessing the bioterrorism threat coming from the life sciences requires a broad range of expertise and information. A better analysis of such threats would involve relevant analysts within the intelligence community engaging with a range of social science experts. Such experts could provide information about terrorist intentions, motivations, and capabilities, as well as a more nuanced understanding of the difficulties involved in replicating scientific experiments and utilizing them for terrorist purposes. Finally, the U.S. government needs to create a new forum and set of expert practices that would increase opportunities for information exchange and deliberation among a variety of experts and intelligence analysts. In this forum, a mediator could be employed to moderate deliberations and to identify strengths and weaknesses of the various positions. All of these improvements would produce more accurate, engaged, and less politicized threat assessments to inform U.S. policymaking.

Related Resources

Ben Ouagrham-Gormley, Sonia. "Barriers to Bioweapons: Intangible Obstacles to Proliferation," International Security, Vol. 36, No. 4 (Spring 2012), pp. 80–114.

Berling, Trine Villumsen, and Christian Bueger. "Practical Reflexivity and Political Science: Strategies for Relating Scholarship and Political Practice," PS: Political Science & Politics, Vol. 46, No. 1 (January 2013), pp. 115–119.

Räsänen, Minna, and James M. Nyce. "The Raw is Cooked: Data in Intelligence Practice," Science, Technology, & Human Values, Vol. 38, No. 5 (September 2013), pp. 655–677.

Revill, James, and Catherine Jefferson. "Tacit Knowledge and the Biological Weapons Regime," Science and Public Policy, Vol. 40, No. 6 (December 2013), pp. 1–14.

 

The Author

Kathleen M. Vogel is Associate Professor at Cornell University with a joint appointment in the Department of Science and Technology Studies and the Judith Reppy Institute for Peace and Conflict Studies. Vogel holds a Ph.D. in biological chemistry from Princeton University. Prior to joining the Cornell faculty, she was a William C. Foster Fellow in the U.S. Department of State's Office of Proliferation Threat Reduction in the Bureau of Nonproliferation.

 

International Security is America’s leading peer-reviewed journal of security affairs. It provides sophisticated analyses of contemporary, theoretical, and historical security issues.

International Security is edited at Harvard Kennedy School’s Belfer Center for Science and International Affairs and is published by The MIT Press.

For more information about this publication, please contact the International Security publications coordinator at 617-495-1914.

Statements and views expressed in this policy brief are solely those of the author and do not imply endorsement by Harvard University, the Harvard Kennedy School, or the Belfer Center for Science and International Affairs.

For Academic Citation: Vogel, Kathleen M. "Better Threat Assessments Needed on Dual-Use Science." Policy Brief, Quarterly Journal: International Security, February 2014.
