Risk Analysis Methods for Nuclear War and Nuclear Terrorism (2023)

5

The Structure of Risk Analysis

In this chapter, the committee reviews the basic components of risk analysis: definitions, how risk analyses are structured, the main sources of evidence for risk analyses related to nuclear war or nuclear terrorism, and the assumptions involved in such analyses.

RISK DEFINITIONS

Risk is a complex and often controversial concept, derived from the existence of a hazard, and its fundamental characteristic is the uncertainty of possible undesirable events and their outcomes. Risk analysis generally includes a description of the possible events and hazards with the associated probabilities and consequences of those events. “Risk” is used to represent a set of related but distinct ideas. When using the term, speakers and risk analysts may refer to a number of meanings (DHS 2017) and focus on different aspects of the term, including

  • Hazards: risk as the result of a hazard or set of hazards. For example, “How should we rank the risks that we all face when we need to set priorities?”
  • Event likelihood: the likelihood of an unfortunate occurrence. For example, “What is the risk of a nuclear explosion in our lifetime?”
  • Event outcome: risk as a consequence. For example, “What are the risks—especially, the consequences—of the detonation of an improvised nuclear device in a densely populated city center?”
  • Both the likelihood and the possible outcomes: risk as a function of both the likelihood and the severity of the consequences of an activity. For example, “How great is the risk of damage to a house in an earthquake, taking into account both the severity of damage that might occur and its likelihood?”

The committee uses the term “risk” following Kaplan and Garrick (1981), with the addition of the time horizon in which these events might happen (Paté-Cornell 2011), to refer to four questions (a brief illustrative sketch follows the list):

  1. What can happen? Specifically, what can go wrong? This is the scenario identification or description.
  2. How likely is it that these events will happen? This is the probability of that scenario.
  3. If these events happen, what are the potential consequences? This is the consequence or evaluation measure of that scenario.
  4. What is the time horizon in which these events might happen?
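
To make these four questions concrete, the following minimal sketch (in Python) represents each scenario as a record containing a description, a probability over a stated time horizon, and a consequence measure. The scenario names, probabilities, and consequence values are invented for illustration and are not drawn from this report.

    from dataclasses import dataclass

    @dataclass
    class Scenario:
        """One entry of the Kaplan-Garrick risk triplet, extended with a time horizon."""
        description: str        # what can happen (what can go wrong)
        probability: float      # how likely it is over the stated horizon
        consequence: float      # consequence measure (e.g., fatalities or economic loss)
        horizon_years: float    # time horizon over which the probability applies

    # Hypothetical, purely illustrative entries.
    scenarios = [
        Scenario("accidental launch after a false alert", 1e-4, 1e6, 10.0),
        Scenario("regional conflict escalates to nuclear use", 5e-4, 5e5, 10.0),
        Scenario("improvised nuclear device in a major city", 1e-5, 2e5, 10.0),
    ]

    for s in scenarios:
        print(f"{s.description}: p={s.probability:g} over {s.horizon_years:g} years, "
              f"consequence={s.consequence:g}")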

In the realm of nuclear and radiological weapons, the hazard is the existence of nuclear material and of the design, engineering, production, and equipment capabilities that allow for the real or threatened manufacture and launch of nuclear weapons, as well as the use or threatened use of a nuclear or radiological device in terrorist attacks.

The level of severity for a nuclear war or terrorist incident depends on the kind and number of weapon(s) that are used, the nature and number of target(s) affected, and the immediate and long-term consequences (physical, social, psychological, economic, political, cultural, and environmental). The level of severity can vary significantly—for instance, from the use of a single small-scale nuclear weapon to incapacitate an enemy on the battlefield to the nuclear destruction of several big cities by strategic weapons.

In some cases, risk may be described solely by the probability of an event in a given time period (e.g., in a given year, how likely is a nuclear terrorist attack?), or by a quantitative description of both the probability and consequences of the event (Paté-Cornell 1996). Such a quantitative description of a risk can be a complementary cumulative probability distribution, continuous or discrete, which represents the probability that different levels of consequences are exceeded in a given time unit or time horizon.
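
As a sketch of such a quantitative description, the short example below computes a discrete complementary cumulative distribution: the probability that the consequence level exceeds each of several thresholds within a common time horizon. The (probability, consequence) pairs are invented, and the scenarios are assumed to be mutually exclusive.

    # Hypothetical (probability, consequence) pairs for mutually exclusive scenarios
    # over a common time horizon; values are invented for illustration.
    scenarios = [
        (1e-4, 1_000_000),
        (5e-4, 500_000),
        (1e-5, 200_000),
    ]

    def exceedance_probability(scenarios, threshold):
        """P(consequence > threshold): sum of the probabilities of the mutually
        exclusive scenarios whose consequence exceeds the threshold."""
        return sum(p for p, c in scenarios if c > threshold)

    for threshold in (100_000, 300_000, 700_000):
        print(f"P(consequence > {threshold:,}) = "
              f"{exceedance_probability(scenarios, threshold):.2e}")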

RISK ANALYSIS AS A SYSTEMATIC PROCESS

Risk analysis is a systematic process designed to comprehend the risk mechanisms associated with a hazard or threat and to express that risk, based on available evidence and knowledge, to inform decision making. Risk analysis is used across many domains and relies on a logical, systematic accounting of event dependencies and the dynamics of different possible scenarios. The power of a risk analysis is in providing a systematic framework for making clear the assumptions and evidence (imperfect as it may be) used to identify and assess the risks. Done well, risk analyses separate the risk characterization from the value judgments made by decision makers and stakeholders. Risk analyses do not dictate decisions; they inform them.

Risk analysis requires structuring a risk problem and systematically considering a number of interdependent or independent factors and scenarios. The objective is to clarify the assumptions and their effects on the results and to identify information that needs to be gathered (in the time available) to improve the understanding of the uncertainties involved.

A vital part of risk assessment is framing: in formulating the structure of the problem, the dependencies among its elements and the possible external factors all must be considered. Risk assessment and management standards are available (ASIS 2015; ISO 2018). A successful assessment might also uncover important gaps that need to be addressed by gathering new evidence, or a previously unrecognized interplay among components of the problem. An appropriately structured risk analysis brings transparency to the assumptions, scenarios, data, methods, and results. It permits verifying the sources of information for different parts of the problem and understanding the different models that were developed, the effects of the assumptions, and the effects of external events and circumstances.

The work of a risk analysis team is thus to identify the key components of the question being posed, formulate the analytical models, and characterize uncertainties as best as they can—including recognition of the limits of the analysis. Relevant statistical data may exist for some parts of the problem, but other inputs and sources of information, including surrogate data, models, and expert judgment, may be needed, especially for complex problems.

The results of a risk assessment can be either quantitative or qualitative. For example, when analyzing probabilities, qualitative results are described in words (such as “frequently,” “unlikely,” “low,” or “high”). Quantitative results can take different forms, such as the probability of exceeding different levels of damage per unit of time.

Risk analysis supports decision making through a systematic decomposition of the problem into scenarios and an assessment of the risk variations associated with different options. This is done by identifying a set of scenarios that attempt to span the scope of the risk events to be considered in the analysis. Analysts then characterize the uncertainties and consequences of hazardous events, relying on a variety of techniques. In particular, risk analyses can identify risk reduction measures and estimate their effectiveness, computed as the difference between the risk results with and without the options considered. Done well, the values and assumptions underlying a risk characterization are transparent, and the risk analyses can help separate the risk characterization from the value judgments made by decision makers and stakeholders.
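
A minimal sketch of that with-and-without comparison, using invented numbers: the effectiveness of a hypothetical risk reduction option is expressed here as the difference in a summary risk measure (expected consequence over the time horizon) computed with and without the option.

    # Invented baseline (probability, consequence) pairs, and the assumed effect of a
    # hypothetical mitigation option that halves the probability of the first scenario.
    baseline = [(1e-4, 1_000_000), (5e-4, 500_000)]
    with_option = [(5e-5, 1_000_000), (5e-4, 500_000)]

    def expected_consequence(scenarios):
        return sum(p * c for p, c in scenarios)

    risk_reduction = expected_consequence(baseline) - expected_consequence(with_option)
    print(f"Expected consequence without option: {expected_consequence(baseline):.1f}")
    print(f"Expected consequence with option:    {expected_consequence(with_option):.1f}")
    print(f"Risk reduction attributable to option: {risk_reduction:.1f}")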

Of course, a risk analysis can be poorly done and lead to questionable results. It is critical that the results not be inadvertently biased by the analysts, for example by truncating databases to eliminate inconvenient possibilities (see National Research Council 1996; Slovic 1990).

Risk analysis is a relatively new tool in human history. Humans have always had to navigate a dangerous world. Acquired knowledge and intuitive understanding of risks (combined with fear and other emotions) served as a compass to guide protective actions. There was—and still is—no time to ponder when one hears an ominous sound in a bush or heavy footsteps in the night. Over time, humans have developed a more analytic approach to these concepts, including probability and risk, which could bring science and analytic thinking (time permitting) to supplement experience, intuition, and emotion.

In the 18th century, Reverend Thomas Bayes applied laws of logic to show how a conditional probability (e.g., the probability of an event A given a signal B) could be simply expressed as the ratio of the joint probability of A and B divided by the probability of B. That simple formula and its derivations allow the computation of the probability of a scenario composed of several events, accounting for possible interdependencies.
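
As a small numeric illustration of that ratio (the numbers are invented), the conditional probability of an event A given a signal B can be computed directly from a joint probability table:

    # Invented joint probabilities over two binary events:
    # A = "attack under way", B = "warning signal received".
    joint = {
        (True, True): 0.0008,    # P(A and B)
        (True, False): 0.0002,   # P(A and not B)
        (False, True): 0.0100,   # P(not A and B): false alarms
        (False, False): 0.9890,
    }

    p_b = joint[(True, True)] + joint[(False, True)]    # P(B)
    p_a_given_b = joint[(True, True)] / p_b             # P(A | B) = P(A and B) / P(B)
    print(f"P(A | B) = {p_a_given_b:.3f}")              # about 0.074 with these numbers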

Some early applications of Bayesian methods were developed in the 1940s by the British intelligence services to break the Japanese and German codes and to develop strategic and tactical protections for maritime convoys across the Atlantic. Bayesian probability was formally defined by Savage (1954) and de Finetti (1974). The methods were developed further, including in game-theoretic analyses, in response to the arms race with the Soviet Union that followed World War II.

One of the first systematic risk analyses of an engineered system was developed to support the choices of designers and engineers striving to improve the safety of nuclear power plants (NRC 1975). This study stimulated the further development and application of what became known as probabilistic risk assessment (NRC 1975).

Later, in the space industry, these methods were used to minimize the risk of space flights by assessing the risk of failure of rockets and spacecraft, and assigning probabilities to catastrophic consequences. These risk analyses were applied, for instance, to the U.S. Space Shuttle and the International Space Station. However, the results were not always implemented—for instance, recommendations made in an existing risk analysis could have mitigated the risk of the Columbia shuttle accident (Paté-Cornell and Fischbeck 1994).


AVAILABLE SOURCES OF EVIDENCE

Evidence to support risk analyses for nuclear war and nuclear terrorism may be drawn from many sources, but direct evidence of nuclear weapons use in war is—thankfully—limited to the nuclear bombings of Hiroshima and Nagasaki. Given the time since those events and the changes in the world, this limited evidence base leaves great uncertainty in assessments of the current risks of nuclear war and nuclear terrorism. Risk analyses include, whenever possible, statistical data relevant to the scenarios considered. They also rely on past experience and near misses (focusing on the aspects that are still relevant today), on surrogate data, models, exercises, and expert opinion: all of these can serve as input to a formal risk analysis of systems and policies, including estimates of the consequences of possible scenarios. In the case of nuclear war and nuclear terrorism, large uncertainties remain.

Statistical Evidence

Statistical data, if they exist and are both relevant and sufficient, are of great value to risk analysis. For many scientific and technical questions, there are some statistical data that are relevant to understanding risks relating to nuclear war and terrorism. For example, there is information that helps characterize the materials used in weapons or the effectiveness of border portals that detect smuggled nuclear material.

The validity of statistical data, however, rests on the assumption that the system has not changed—for example, as a result of new policies and technologies, deficiencies discovered in the system, or shifting geopolitical circumstances. It also requires that enough information exists about past events to extract what is still relevant today. While it cannot generally be assumed that the systems are unchanged, past events and near misses can still provide relevant information.

For example, data from previous nuclear tests and the radiation emitted from them can be relevant in understanding current nuclear systems, even if the overall systems are different today. Similarly, past nuclear incidents that did not lead to a conflict can provide information on processes and human reactions that can inform policies and systems under new circumstances. It is important, however, to consider such past near misses as statistical data only after scrutiny to ensure that they are applicable to the new circumstances.

Statistical evidence can be thought of as the result of a statistical analysis using appropriate observations or measurements. The quality and appropriateness of the statistical analysis need to be considered, as well as the quality and appropriateness of the data.


Conventions for communicating risks can also pose a complication, particularly when academic decision analysts work with national security professionals (Friedman et al. 2017).

Past Near Misses and False Alerts

Past near misses and false alerts may inform estimates of the likelihood of future events pertaining to nuclear war and nuclear terrorism, in part because they might point to some potential weaknesses that still exist. Yet, as the circumstances may have changed in response to past events, there may be a wide range of interpretations of these past events. Causes of past failures may have been remedied—by improvement of technical communications, for example—and yet egregious oversights, biases, or flawed assumptions may remain as unrecognized sources of systematic error. Similarly, the systems and practices that other nations have developed and implemented following such near misses or false alerts may have changed or may still be evolving.

Moreover, evidence related to past near misses and false alerts may not be readily available or complete. For example, the Global Terrorism Database of the University of Maryland lists 13 events involving radiological weapons1: 10 involved one individual who, over a 3-day period, sent envelopes containing radioactive minerals to government offices in Japan in an attempt to draw attention to alleged smuggling of nuclear materials to North Korea; the other 3 occurred over a period of more than a decade in the United States and Europe. The information available in this database is evidently sparse and heterogeneous. The International Atomic Energy Agency also maintains an Incident and Trafficking Database,2 but this database relies on voluntary reporting by its member states, and it is both incomplete and not limited to terrorism. For example, it includes incidents involving illegal transportation of radioactive material across national borders, and loss or discovery of radioactive materials that should have been under safeguards.

While the interpretation of historical episodes may help analysts identify potential weaknesses or vulnerabilities, these interpretations are subject to the judgment of risk analysts and rely on available (often limited) information.

___________________

1 See the Global Terrorism Database website at https://www.start.umd.edu/gtd.

2 See International Atomic Energy Agency, “Incident and Trafficking Database (ITDB),” https://www.iaea.org/resources/databases/itdb.

Surrogate Data

Surrogate data can potentially be used to estimate the corresponding phenomena in the scenarios considered, as long as the differences between the scenarios are taken into account. As an example, while the delivery systems used for nuclear weapons are not tested with nuclear payloads, their reliability and performance can be characterized by looking at tests with inert payloads while accounting for the known differences.

Models

Models are used to represent many important aspects of scenarios as well as systems, so the choice of models is critical for risk analyses. For analyses related to nuclear war or nuclear terrorism, models might represent engineered systems, cyber attacks based on vulnerabilities given a system’s structure, conflict development, human behaviors, environmental factors, radiation dispersions, or many other important aspects of the analysis. The models chosen need to include interdependencies among events and external factors that affect the probabilities and consequences of different scenarios.

War Games

Wargaming approaches assume a set of parameters and rules, and allow participants to make decisions and respond to the consequences of those decisions in the context of the game. These are well-studied techniques for exploring the threat landscape and assessing response and mitigation strategies. These tools can help identify possible threats and provide some information about the ease or difficulty of a possible attack and the response to an attack, but do not directly place probabilities on particular scenarios or outcomes.

War games have been used in designing and assessing security systems for nuclear weapons, materials, and facilities. Tabletop exercises and computer simulations are used to assess different possible defense approaches, to identify possible adversary tactics that may be most difficult to defend against, and to train new staff or government officials. While realistic force-on-force exercises are so expensive and disruptive that only a few can be done, tabletop exercises or computer simulations—with individuals playing roles supported by models and simulation to adjudicate player moves—can be conducted for many scenarios.

Wargaming can be especially useful for exploring the conditions in which one party or another in a conflict might decide to use nuclear weapons. A variety of scenarios can be tested with different arrays of nonnuclear and nuclear forces and different strategies on each side. Computer-supported methods make it possible to run large numbers of games and generate results about their outcomes.

Wargaming can thus be undertaken to provide input to a quantitative model from which qualitative information is derived. It is important to note that this approach does not consider the spectrum of all event sequences that can occur in a changing external context. Rather, such methods typically consider a single scenario that results from the choices of the players, but there may be many other branches in these scenarios depending on the choices and ideologies of the players (Emery 2021).

Red Teaming

Red teaming is a method designed to detect vulnerabilities that an adversary can use to achieve its objective. For nuclear war, red teaming means assigning a group to think like the adversaries and to come up with strategies that the adversaries might use to achieve their objectives (Caffrey 2000; Sandoz 2001; Zenko 2015). It is another potentially useful method to structure thinking about some aspects of the risks of a nuclear war, challenge assumptions about plans, and estimate weapon effectiveness. For example, what factors might lead Russia, North Korea, or China to consider the use of nuclear weapons in a conflict with the United States or another country?

For nuclear terrorism, red teaming is used to explore the effectiveness of security systems designed to prevent nuclear or radiological theft or sabotage. It can, for example, provide input regarding the methods adversaries might use to attempt to break into nuclear material storage spaces, or the ways insiders might attempt theft or sabotage at a facility. The technique can expose vulnerabilities that might be exploited.

Expert Judgment

Risk analysis for questions related to nuclear war and nuclear terrorism necessarily relies on qualitative and quantitative judgments made by both subject-matter experts and risk analysts. Expert judgment is central to structuring the key components of risk decisions to address such questions as what objectives matter and what choice options should be considered to achieve those objectives. Judgments are then critical to the assessment of the likelihoods of the components of scenarios, their potential consequences, the way these outcomes are described, and the values that are assigned to them.

Analyzing the risks of nuclear war and nuclear terrorism frequently calls for estimates, ranges, or probability distributions for highly uncertain events about which there is no clear consensus or high-quality and sufficient data sources. Examples include estimating the likelihood that a particular nation or group possesses or can acquire weapons of mass destruction, the military strength of state actors, and the capabilities and intent of terrorist groups.

In dealing with analyses based on expert opinion, it is important to understand how the experts were chosen, to what extent they reflect the range of opinions (both technical and political) in the community, and how the expert elicitation was conducted to recognize and guard against biases. The experts should be asked to assess specific factors of the risk scenarios for which they have relevant experience, and decision makers need to be informed of the competence and credibility of the experts.

Challenges to the Elicitation and Use of Expert Opinion

Given the important role that expert opinion often plays in the analysis of risks related to nuclear war and nuclear terrorism (Argyris and French 2017; Downes and Hobbs 2017; Woo 2021), this section highlights some of the conceptual and technical challenges in developing useful and accurate information derived from expert elicitation. The historical approach to eliciting and aggregating expert opinion is well described in the literature (e.g., Hora 2007; O’Hagan et al. 2006; see also NRC 1990 for an early discussion of the use of these methods in the nuclear power industry).

In integrating expert opinion for use in a risk analysis, some means must be found to summarize the viewpoints across the community (Clemen and Winkler 1999), whether through behavioral strategies for reaching consensus, such as the Delphi method (Rowe and Wright 1999); through mathematical methods of aggregating divergent opinions, such as weighted averaging (Colson and Cooke 2017; Cooke 1991); or through a Bayesian combination. The challenges include avoiding undue influence from individuals with forceful personalities (rather than actual expertise) (Wittenbaum and Stasser 1996) and managing the effects of group polarization (Sniezek 1992). In addition, the elicitation method needs to be carefully structured so that the participating experts can accomplish their task (O’Hagan et al. 2006); indeed, many experts are not accustomed to providing their judgments in quantitative form and may resist doing so (Walker et al. 2001). Classification of nuclear weapons information can also restrict the accuracy of judgments and a full understanding of the context.
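
One simple mathematical aggregation of the kind mentioned above is a weighted average of the experts’ probabilities (a linear opinion pool). The sketch below uses invented expert judgments and equal weights; performance-based weights, discussed later in this section, are an alternative.

    # Invented probabilities assigned by three experts to the same event,
    # combined with a linear opinion pool (weighted average).
    expert_probs = [0.02, 0.05, 0.10]
    weights = [1 / 3, 1 / 3, 1 / 3]   # equal weights; performance-based weights are an alternative

    pooled = sum(w * p for w, p in zip(weights, expert_probs))
    print(f"Pooled probability: {pooled:.3f}")   # 0.057 with these inputs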

Mellers and colleagues (2014) have shown that, in the long run, quantitative forecasts allow one to measure the relative accuracy of forecasts by different individuals or organizations. Of course, in the nuclear context there have been few actual events, so one can question whether there are enough data for this methodology to be useful. However, insisting on explicit descriptions of possible outcomes and their assessed probabilities and maintaining unclassified aspects of these data in accessible form would allow some long-term analysis of the accuracy of such assessments.

It is widely recognized that expert elicitations, like all individual intuitions, tend to be derived from mental strategies or heuristics that are subject to a variety of serious judgmental biases (Kahneman et al. 1982). There are four known heuristics that lead to biases: availability, representativeness, probability neglect, and framing (Kadane and Wolfson 1998).

Availability

Probability and frequency judgments are biased upward for events that are easier to recall or to imagine happening (Tversky and Kahneman 1973), and even more so when the event is emotionally powerful (Lichtenstein et al. 1978). An inability to recall or imagine important pathways to failure can lead fault trees to underestimate the overall failure probability (Fischhoff et al. 1978).
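
The following sketch (with invented probabilities, and assuming independent pathways for simplicity) shows how omitting a failure pathway biases a fault tree’s top-event estimate downward:

    # Top event occurs if any of several independent failure pathways occurs (OR gate).
    # Probabilities are invented for illustration.
    pathway_probs = [0.01, 0.005, 0.002]   # pathways the analysts thought of
    omitted_prob = 0.004                   # a pathway no one recalled or imagined

    def or_gate(probs):
        """P(at least one pathway occurs), assuming independent pathways."""
        p_none = 1.0
        for p in probs:
            p_none *= (1.0 - p)
        return 1.0 - p_none

    print(f"Estimated top-event probability: {or_gate(pathway_probs):.4f}")
    print(f"With the omitted pathway:        {or_gate(pathway_probs + [omitted_prob]):.4f}")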

Representativeness

People may judge the likelihood of an event by the similarity of the image conveyed by the evidence to an idealized or easily imagined image (Tversky and Kahneman 1974). For example, a woman described as interested in social justice may well be judged more likely to be a bank teller and a feminist than to be either a bank teller or a feminist. This is logically impossible, as the conjunction of two events cannot be more probable than either event alone. As a scenario becomes more detailed, it tends to be seen as more coherent and realistic, and thus more probable, when in reality, adding details to a conjunction of events actually makes it less probable. The representativeness heuristic might severely bias the judged probabilities of detailed scenarios, such as those described to represent pathways to nuclear war or nuclear terrorism (see Chapter 4).
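
A quick numeric check of the conjunction rule, using invented values: whatever the component probabilities, the probability of the conjunction can never exceed either component alone.

    # The conjunction rule: P(A and B) <= min(P(A), P(B)).
    # Invented marginal and conditional probabilities for illustration.
    p_bank_teller = 0.05
    p_feminist_given_teller = 0.60
    p_conjunction = p_bank_teller * p_feminist_given_teller   # P(teller and feminist)

    print(f"P(bank teller)              = {p_bank_teller:.3f}")
    print(f"P(bank teller and feminist) = {p_conjunction:.3f}")
    assert p_conjunction <= p_bank_teller   # holds no matter what numbers are used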

Probability Neglect

The perception of risk and the resulting avoidance behavior are roughly as strong when the probability is very small (but not zero) as when it is quite high (Rottenstreich and Hsee 2001). Sunstein (2003) observed that, when probability neglect is at work, people’s attention is focused on the bad outcome itself, and they are inattentive to the fact that it is unlikely to occur. When consequences carry sharp and strong affective meaning, as is the case with a lottery jackpot or cancer, the variation in probability often carries too little weight. As Loewenstein and colleagues (2001) observe, one’s images and feelings toward winning the lottery are likely to be similar whether the probability of winning is 1 in 10 million or 1 in 10,000. They further note that responses to uncertain situations appear to have an all-or-none characteristic that is sensitive to the possibility rather than the probability of strong positive or negative consequences, causing very small probabilities to carry great weight. This perception helps explain why societal concerns about such hazards as nuclear power and exposure to extremely small amounts of toxic chemicals fail to recede in response to information about the very small probabilities of the feared consequences from such hazards. Rottenstreich and Hsee (2001) show that if the potential outcome is emotionally powerful (e.g., experiencing a painful electric shock), the strength of actions to avoid that outcome is relatively insensitive to changes in the probability of experiencing it, even changes as great as from 0.99 to 0.01. Sunstein (2003) described this phenomenon as probability neglect and related it to expensive measures taken to reduce extremely improbable terrorism risks.

Framing

There are often multiple ways to describe or frame the probability of an uncertain event. These descriptions can make a large difference in the perception of the risk and in subsequent decisions, even when they are logically identical (McNeil et al. 1982). For example, consider a man who is being evaluated for release from a hospital where he has been treated for a mental condition that posed a risk of violence. Is it safe to release him? Clinicians who are told that 1 in 10 persons like this individual are likely to be violent if released judge him as more dangerous and are less likely to release him than are clinicians who are told that 10 percent of persons like him are likely to be violent (Slovic et al. 2000). The frequency frame evokes images of “the guy being violent” that do not occur as often with a percentage frame.

Although this example shows that different representations of the same probability can lead to different risk perceptions and decisions, the same can be true of different representations of the same event or consequence. This occurs because probability judgments may be attached not to events but to descriptions of events (Tversky and Koehler 1994). The more explicit the description, the more likely the event seems. For example, in one study, the mean judged probability of a death in the United States occurring from an unnatural cause was about 70 percent higher when participants were asked to estimate and then sum separate probabilities for each component of unnatural causes (accidents, homicides, and other unnatural causes). After describing numerous examples of this phenomenon, Tversky and Koehler concluded that probability judgments can be like the measured length of a coastline, which increases as the map becomes more detailed. This highlights a major problem with probability assessments, namely, the need to consider unavailable possibilities. This problem is likely to be especially difficult when dealing with new hypotheses or the construction of novel scenarios, such as those discussed below.

While experience and rational reasoning are critical for expert judgment, many biases can distort intuitive judgments. An early overview of such cases was presented by Kahneman and colleagues (1982) and updated and extended by Kahneman (2011). Some methods for decreasing judgmental biases focus on structuring the elicitation, including the following:

  • Specifying the problem to define precisely what is being addressed;
  • Identifying the variables of interest, including objectives, courses of action, consequences, and associated probabilities and uncertainties;
  • Selecting experts with substantial expertise in different parts of the risk analysis and a wide range of credible views on the parameters or events of interest;
  • Facilitating the communications among experts so that they can exchange their mental models as well as their base of experience;
  • Designing and pilot testing the elicitation process and questions;
  • Training experts—for example, explaining the desired response format (such as percentiles)—and educating them about overconfidence and other biases;
  • Processing the elicited data—for example, by performing sensitivity analysis or providing feedback to the experts; and
  • Avoiding imposing accountability indiscriminately, since some accountability amplifies rather than attenuates bias (Lerner and Tetlock 1999).

Even with these approaches, there are problems with expert judgment that may not be addressed adequately. It has been known for many years that modest levels of training and feedback may not significantly reduce overconfidence in estimating the precision of one’s estimate of an uncertain quantity (Moore 2020; Moore and Healy 2008). For example, a study by Alpert and Raiffa (1982) found that after one round of feedback, “The percentage of times the true values [being estimated] fell outside the extreme values (i.e., the 0.01–0.99 confidence ranges) fell from a shocking 41% to a depressing 23%” (Alpert and Raiffa 1982, p. 324). A more effective strategy is the use of “intensive performance feedback” (Lichtenstein and Fischhoff 1980; Plous 1993). However, that process can be burdensome both for the experts whose opinions are being assessed and for the risk analyst performing the assessment. The items used for training in one context may not generalize to other contexts, and expertise in a particular discipline may not translate into expertise on how best to express what an expert knows in probabilistic terms (Fischhoff et al. 1978).
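
The kind of calibration check underlying these findings can be sketched as follows (the elicited ranges and true values are invented): given experts’ stated 0.01–0.99 ranges and the values observed later, one computes how often the truth falls outside the ranges; for well-calibrated experts the rate should be about 2 percent.

    # Invented elicited 98% ranges (low, high) and the true values observed later.
    elicited_ranges = [(10, 50), (200, 400), (0.5, 2.0), (1_000, 5_000)]
    true_values     = [60, 250, 1.1, 8_000]

    misses = sum(1 for (low, high), v in zip(elicited_ranges, true_values)
                 if not (low <= v <= high))
    print(f"Surprise rate: {misses / len(true_values):.0%} "
          f"(about 2% expected if the 0.01-0.99 ranges were well calibrated)")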

Cooke (1991) developed a method for aggregating expert judgments based on the performance of the experts on “seed questions” with answers that are known (e.g., the extent of radioactive deposition observed in a tracer experiment) or knowable (e.g., prices of company stocks or real estate in the next month). The Cooke method, which evaluates expert judgments using scoring rules (to lessen the chance of experts gaming the system), has been applied in diverse fields, including medicine, maintenance, banking, real estate, nuclear safety, volcanology, food safety, and climate change, and was accepted by the UN Compensation Commission as an adequate basis for assessing the reparations due from Iraq to Kuwait for the Kuwaiti oil fires in the early 1990s (over $50 billion) (Cooke and Goossens 2008). The Cooke method frequently puts zero weight on some subset of the experts whose opinions were assessed, because their performance on the seed questions was demonstrably much poorer than that of other experts in the same study. Lin and Bier (2008) found significant differences in the extent of overconfidence among experts in a given study, lending support to the differential weighting of experts. Similarly, Cooke and Goossens (2008) indicate that in the majority of studies, performance weighting yielded better calibration than equal weights, and it also yielded better calibration than the judgment of the best-calibrated individual expert.
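
The sketch below illustrates the general idea of performance weighting in a deliberately simplified form; it is not Cooke’s actual classical model, which combines formal calibration and information scores. Each expert’s weight here is derived from an invented accuracy score on seed questions, and experts scoring below a cutoff receive zero weight.

    # Simplified performance weighting (illustrative only, not Cooke's classical model).
    # seed_score: invented accuracy score (higher is better) from seed questions.
    experts = {
        "expert_A": {"seed_score": 0.80, "judgment": 0.04},
        "expert_B": {"seed_score": 0.55, "judgment": 0.10},
        "expert_C": {"seed_score": 0.05, "judgment": 0.30},   # performed poorly on seeds
    }
    cutoff = 0.10   # experts scoring below the cutoff get zero weight

    raw = {name: (e["seed_score"] if e["seed_score"] >= cutoff else 0.0)
           for name, e in experts.items()}
    total = sum(raw.values())
    weights = {name: w / total for name, w in raw.items()}

    pooled = sum(weights[name] * e["judgment"] for name, e in experts.items())
    print("Weights:", {name: round(w, 2) for name, w in weights.items()})
    print(f"Performance-weighted pooled judgment: {pooled:.3f}")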

This approach has several advantages. It does not generally require great statistical knowledge on the part of the experts whose opinions are being elicited. Moreover, Aspinall (2010, p. 295) wrote: “The speed with which such elicitations can be conducted is one of their advantages. Another advantage is that it encourages experts wary of getting involved in policy advice: the structured, neutral procedure, and the collective nature of the result, reassures experts and relieves them of the burden of sole responsibility.” The Good Judgment Project (Mellers et al. 2014; Tetlock and Gardner 2015), established in response to a competition by the Intelligence Advanced Research Projects Agency, has taken a roughly similar approach, using empirical data to help identify the best-performing forecasters. This body of work has found that some people are much better than others at providing probabilistic forecasts of future geopolitical events, even though many of them had no specialized training or formal expertise. In fact, such well-calibrated laypeople (known as “superforecasters”) often outperform intelligence experts with access to classified intelligence (McKinley 2021).

This finding means that, even though some experts can be biased, there is evidence that empirical calibration of expert opinion is feasible and can produce useful results. It thus seems appropriate to combine structuring elicitation with empirical calibration to improve the information obtained from expert elicitation. Indeed, Sutherland and Burgman (2015, p. 317) observe that “a large and growing body of literature describes methods for engaging with experts that enhance the accuracy and calibration of their judgments.” Additionally, it should be noted that specific training tools for debiasing judgment have been developed by Morewedge and colleagues (2015).

Calibration of multiple experts can be difficult when the experts have different sources of information; therefore, it is important, if possible, to understand the basis of their opinions. Nuclear war and nuclear terrorism pose a multiple-expert challenge because of the differences in the character, amount, or quality of the information available to different experts, as well as a spectrum of political viewpoints.


Laypeople who have been shown empirically to be well calibrated can provide a useful and relatively unbiased source of input, even without access to classified information. A technical expert may assess adversary capability and intent without any access to intelligence data. An analyst may have access to many intelligence sources that provide information on political, social, and economic aspects of capability and intent, but still may not understand the details of adversary technological capabilities or limitations. And a senior intelligence analyst may have access to information provided by very sensitive intelligence sources and methods that are not available to the broader intelligence community. Care should be taken to assess the value of different sources of input objectively, without excessive deference to seniority or access to classified information, since that can bias the results of an analysis.

The judgmental biases in many situations of nuclear war and nuclear terrorism may be quite different from those in other elicitation problems, raising serious challenges for these methods, notably because of the lack of direct experience with modern versions of either hazard and the strong emotions associated with events and their consequences. Efforts will be needed to develop and validate methods for making expert elicitations adequately trustworthy in these contexts.

ASSUMPTIONS IN RISK ANALYSIS

Technical and Modeling Assumptions

Technical and modeling assumptions are inevitably part of any risk analysis, including the definition of the risk problem and the conditions under which the analysis is expected to be valid. Some assumptions may have to be made to account for unavailable information, others may have to be made for the convenience or simplification of a particular model or analysis technique, and some may have to be made to suggest a particular course of action. Such assumptions are often necessary, and it is important for policy makers and other stakeholders to be aware of these assumptions since they can affect the applicability and reliability of risk analyses.

Modeling assumptions can include presumptions about adversaries (discussed below in more detail), about the reliability of data, and about the future evolution of technical systems. They may also be needed to simplify otherwise overly complex models. These assumptions are sometimes based on the conclusions of prior studies or analyses, which introduce additional levels of uncertainty. Other times, they may be well-established facts and historical events and behaviors that demonstrate the possibility that a particular kind of event can still occur. Nevertheless, it has to be demonstrated that they have relevance for future events. As described in Chapter 6, the very act of describing the risk events that go into a model is an exercise of judgment that carries implicit values that may strongly affect the analysis (Slovic 1999).

Technical assumptions, including the operational capabilities of various weapons (e.g., their reliability, survivability, precision, and effects) or the capabilities of detectors (e.g., false-alarm rates, failure to provide an alarm, and the sensitivity and standoff distances of radiation detectors), are involved in the analysis of many questions related to the risks of nuclear war and nuclear terrorism. Many of these assumptions may be reasonably well understood or characterized through testing or detailed modeling and may even be codified within a given agency or service so that all analysts are using the same data or assumptions. Such technical assumptions, sometimes called planning factors, play an important role in risk analyses related to nuclear war and nuclear terrorism. These technical assumptions, however, have to be revisited when the system is modified or the geopolitical situation has changed.

It is important to note that unidentified assumptions are a serious source of error in risk analysis. Such assumptions can include, for example, that the systems will operate as designed, that the intended processes are captured appropriately in the models, and that no oversimplifications or systemic biases exist.

Assumptions About Adversaries

Assumptions about adversaries may be necessary for risk analyses related to nuclear war and nuclear terrorism, but specifying realistic assumptions is not always straightforward. Rationality can be defined by the quality of an actor’s preferences (are they reasonable by a given set of standards?) or by the consistency of the actor’s actions with those preferences (the von Neumann–Morgenstern preference axioms). In the latter case, some preferences of the adversary can be internally consistent but unacceptable by given standards of morals and ethics. Uncertainty exists when making assumptions about both aspects of rationality.

Adversarial risk assessment in a game analysis has to make assumptions about the capability, goals, information, intent, and behavior of specific adversaries. It is typical for analyses of adversarial risks to assume at least some degree of consistency on their part. This assumption may be appropriate for some types of stable adversaries and stable circumstances (e.g., nation states or organized terrorist groups, although their preferences can change), but less appropriate for other adversaries, such as lone-wolf or other terrorist actors, who may be more opportunistic or idiosyncratic in their choice of targets and attack strategies, or apocalyptic terrorist groups, who are not attempting to achieve a set of goals.

What is rational and consistent to one adversary may be misunderstood or simply overlooked by another. Stable adversaries may change their arsenals and their strategies, and past actions may not be representative of future moves. Regimes can change and circumstances can change, so the nature of a conflict can be affected suddenly. The cognitive, social, and political biases explored by Slovic and Lin (2020) and Slovic and colleagues (2020) challenge the assumptions of rationality and consistency of all actors. Adversaries can also act irrationally or signal that they would act irrationally if provoked. In some cases, the irrationality of an adversary may go beyond the bounded rationality of human cognition and extend into delusional or insane patterns of thought.

Assumptions About the Reliability of Data

Risk analysts may also make assumptions about the reliability and accuracy of the data available to them. For example, judgments from the intelligence community (e.g., National Intelligence Estimates) may be taken as givens, despite the risk that these judgments may be biased, inaccurate, or incomplete. The lack of experience with nuclear war and nuclear terrorism implies that risk analyses will have to rely extensively on multiple sources of information, including technical models and tests, surrogate data, historical experience, and expert opinion. This is especially challenging in assessing the risks of nuclear war or nuclear terrorism because of the consequences of such attacks.

Assumptions to Simplify Complex Models

Risk analysts often make assumptions to simplify an otherwise complex model and its formulation. It is important to be aware of such assumptions, to ensure that they are valid, and to understand how they might color the results. A fully realistic model could be as complex as the world it is trying to represent. The challenge is to make sure that simplifying assumptions do not ignore or obscure phenomena that are critical to the risks being analyzed.

Examples of simplifying assumptions that may be inappropriate in an analysis include treating uncertain quantities as if they were known, assuming probabilistic independence (that the occurrence of one event does not change the likelihood of other events), and presuming that the magnitudes of effects are proportional to those of causes. In reality, nonlinearities can mean that a small change in one element produces a large consequence.
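
A small numeric illustration (with invented values) of why the independence assumption matters: a common cause that couples two events can make their joint probability far larger than the product of their marginal probabilities.

    # Each safeguard has an independent failure mode with probability 0.01 (invented).
    p_fail = 0.01

    # Assuming full independence, both fail together only if both independent modes occur.
    p_both_independent = p_fail * p_fail                    # 1.0e-4

    # Now add an invented common cause (e.g., a shared support-system failure) with
    # probability 0.005 that disables both safeguards at once.
    p_common_cause = 0.005
    p_both_dependent = p_common_cause + (1 - p_common_cause) * p_fail * p_fail

    print(f"P(both fail), independence assumed: {p_both_independent:.2e}")
    print(f"P(both fail), with common cause:    {p_both_dependent:.2e}")   # roughly 50x larger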

Another factor that may be overlooked in complex dynamic situations is the possibility that seemingly favorable events could actually increase the risk of a particular event, while apparently unfavorable events could decrease it. For example, it might seem natural that a more sensitive detector would be more useful than a less sensitive one, but it is possible that increased sensitivity leads to an increase in false alarms, the consequence being that the problematic detector is turned off or ignored. Thus, an attempt to improve security instead diminishes it. Yet a sensor that is not sensitive enough might miss an event altogether or leave little lead time to react to a signal. Managing this tradeoff and deciding on the best warning threshold to reduce the identified risk is one of the challenges of risk management (Paté-Cornell 1986).
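
That tradeoff can be sketched with a simple, invented signal model: lowering the alarm threshold raises the detection probability but also the false alarm rate, and the preferred threshold depends on how the two types of error are weighted.

    from statistics import NormalDist

    # Invented, simplified model: the sensor reading is Gaussian noise (mean 0, sd 1)
    # when nothing is present and shifts to mean 2 when a real threat is present.
    noise = NormalDist(mu=0.0, sigma=1.0)
    signal = NormalDist(mu=2.0, sigma=1.0)

    print(f"{'threshold':>9}  {'false alarm rate':>16}  {'missed detection rate':>21}")
    for threshold in (0.5, 1.0, 1.5, 2.0, 2.5):
        false_alarm = 1.0 - noise.cdf(threshold)   # alarm with no threat present
        missed = signal.cdf(threshold)             # no alarm despite a threat
        print(f"{threshold:9.1f}  {false_alarm:16.3f}  {missed:21.3f}")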

U.S. STRATEGIC ASSUMPTIONS ABOUT NUCLEAR RISKS

U.S. strategic assumptions about nuclear risks are developed by analysts and decision makers, often as starting points for further analysis. These assumptions are often not explored by those conducting risk analysis, though they may serve to define the boundaries or constraints of a given analysis (OSD 2018).3 Therefore, the connection between strategic assumptions about risks and the analysis of the risks of nuclear war or nuclear terrorism can be unclear, implicit, or tenuous in some cases. Understanding these assumptions provides a context for the types of questions that decision makers should ask of risk analysts.

To make the rationale for its policies clear, the United States communicates many of the assumptions that underlie its nuclear security strategy in publicly available documents.4 Given this commitment, many strategic assumptions are explicit in U.S. nuclear security strategy policy and documentation, and they are diverse in character, ranging from general assumptions about the nature of deterrence to specific assumptions about the goals of particular nuclear nations, potential proliferators, and nonstate actors. Some of these assumptions are explicit about risks related to nuclear war and nuclear terrorism, such as assumptions about whether certain policies or actions have increased or decreased risks, the nature and variety of threats that confront the United States, and the most likely scenarios.

The committee has identified the following five key categories of strategic assumptions that enter into risk analysis for nuclear war and nuclear terrorism:

  • Assumptions about the risks posed by nuclear weapons use,
  • Assumptions about the strategic intent of adversaries,
  • Assumptions about the capabilities of and information available to adversaries,
  • Assumptions about U.S. strategic goals, and
  • Assumptions about deterrence.

These assumptions are often made explicit through assertions in public statements or other documents. A sampling of U.S. government statements on assumptions associated with nuclear risks is shown in Appendix A.

___________________

3 Modeling assumptions, discussed above, are developed and used by those conducting risk analysis and are different from strategic assumptions.

4 Those documents include the Nuclear Posture Review, National Security Strategy or Strategic Guidance, National Strategy for Countering Weapons of Mass Destruction Terrorism, and the Annual Threat Assessment of the U.S. Intelligence Community, among many others.


CONCLUSIONS

CONCLUSION 5-1: Information elicited from experts is often all that is available for assessing some aspects of the risks associated with nuclear war and nuclear terrorism. Analysts and decision makers need to be aware of the sources of that information, of the biases and limitations that the experts could introduce in the analysis, and of the resulting effects of this information on the results of risk analyses. Best practices for expert elicitation can be adapted from other risk analysis disciplines, although some aspects of nuclear war and nuclear terrorism may pose challenges in applying these methods.

CONCLUSION 5-2: Analysts inevitably make assumptions in risk analysis, including about the definition and framing of the risk problem; which models can be used effectively; the reliability of the available data; and the capabilities, intent, and potential actions of adversaries. It is important to show and clearly communicate assumptions and related uncertainties in a risk analysis and their effect on the results.

CONCLUSION 5-3: Strategic assumptions can affect the characterization of a risk problem. Some strategic assumptions address the nature or magnitude of risks, the effect of risk drivers, whether policies or actions increase or decrease the risks, the nature and the variety of threats that confront the United States, and the most likely scenarios. Strategic assumptions also concern risks of nuclear wars outside the borders of the United States.
