
Risk Analysis Methods for Nuclear War and Nuclear Terrorism (2023)

Chapter: 7 Risk Information and Risk Management Decisions

Suggested Citation:"7 Risk Information and Risk Management Decisions." National Academies of Sciences, Engineering, and Medicine. 2023. Risk Analysis Methods for Nuclear War and Nuclear Terrorism. Washington, DC: The National Academies Press. doi: 10.17226/26609.

7

Risk Information and Risk Management Decisions

Just as the structure, parameters, and modeling assumptions in a risk analysis may affect the results, the formulation of the risk analysis and the communication of risk information affect decision making. This chapter addresses some of what is known about human judgment and decision making under uncertainty and the communication of the results of risk analyses.

EMPIRICAL STUDY OF JUDGMENT AND DECISION MAKING

The dawn of the nuclear age coincided with new academic interest in understanding how people make judgments and decisions, including those involving risks. Economists and other scientists developed decision analysis models, guiding decision making based on the analysis of outcomes and uncertainties. At the same time, psychologists, political scientists, and philosophers began to conduct experiments asking people to make judgments and decisions about simple gambles to study how people make decisions under uncertainty and in the face of risks.

Normative models of decision making have long been dominated by theories that describe how rational people choose among options. Faced with uncertain or risky prospects, a rational decision maker is expected to choose the option that has the highest expected value or expected utility (when values are expressed as utilities based on individual preferences and risk attitudes). Moreover, a decision maker’s preferences are assumed to be orderly; for instance, someone who prefers option A over option B and option B over option C is expected to prefer option A over option C.
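The expected-utility rule just described can be made concrete with a short sketch. Everything below is hypothetical: the options, the probabilities, and the choice of a square-root (risk-averse) utility function are illustrative assumptions, not anything prescribed by the theories discussed in this chapter.

```python
from math import sqrt

# Each hypothetical option is a list of (probability, monetary outcome) pairs.
option_a = [(0.5, 100), (0.5, 0)]   # a 50/50 gamble: 100 or nothing
option_b = [(1.0, 40)]              # a sure thing: 40 for certain

def expected_value(option):
    return sum(p * x for p, x in option)

def expected_utility(option, u=sqrt):
    # A concave utility function such as sqrt encodes risk aversion.
    return sum(p * u(x) for p, x in option)

# The gamble has the higher expected *value* (50 vs. 40), but the sure
# thing has the higher expected *utility* (sqrt(40) ~= 6.32 vs.
# 0.5 * sqrt(100) = 5.0), so a risk-averse expected-utility maximizer
# picks option_b.
choice = max([option_a, option_b], key=expected_utility)
```

Transitivity falls out of this representation automatically: because each option is scored by a single number, preferring A to B and B to C implies preferring A to C. The descriptive research reviewed in this chapter shows that actual human choices often violate exactly this property.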


Early cognitive theorists and experimentalists were skeptical of the accuracy of these views and instead focused on descriptive models. Herbert Simon’s (1957) influential book Models of Man introduced the notion of bounded rationality that took human cognitive limitations into account. He proposed that people actually search for solutions that are “satisfactory” rather than “optimal,” and stop searching when the first satisfactory solution is found.

Experimental psychologists were even more skeptical than Simon about the accuracy of the expected utility approach. Their studies found preference patterns that could not easily be accounted for by economic theory (Coombs and Pruitt 1960; Edwards 1953, 1954). Edwards’s work motivated others to study decisions across a wide range of everyday human activities, including medical diagnoses and treatments, and technologies, such as nuclear power and chemicals. New fields of study were created (e.g., risk perception) along with a new conception of choice that recognized that many of humans’ most important decisions are determined by preferences that are constructed during the act of deciding (Jasanoff 1986; Lichtenstein and Slovic 2006; Slovic 1995). These constructed preferences are very much influenced by subtle contextual factors, such as the way the choice options are framed or described and the specific nature of the response.

The instability inherent in observed preferences was radically different from prevailing notions of rationality (Grether and Plott 1979). Half a century after the beginning of this movement, Daniel Kahneman received the 2002 Nobel Memorial Prize in economics for his research with Amos Tversky (who had died in 1996) on the psychology of risk and decision making, leading to a new discipline called behavioral economics.

Preference instability is likely to be especially prevalent in situations in which decision makers’ values have not been shaped by learning from experience or for which there have been no events and, thus, no experience. It poses a challenge for theories of nuclear deterrence that assume rational preferences and for risk assessment methods that rely on understanding the values and preferences of experts and decision makers.

Group judgment of risks has also been explored with respect to decision making under uncertainty. Contrary to the “two heads are better than one” idiom, risk assessment by a group of people who are interacting directly with each other is not necessarily better than individual judgment and can be worse (Houghton et al. 2000). Not only do heuristics and biases guide individual judgment and decision making under uncertainty, but they also play a role in group judgments and decisions. In groups, individual judgments can be affected by biases such as groupthink and information pooling (Stasser and Titus 1985; Turner and Pratkanis 1998). Group biases have been at least partly to blame for such incidents as the USS Vincennes mistaking a civilian Airbus for a pending enemy attack in 1988 (Johnston et al. 1998), and for the decisions made leading to the 1986 Challenger shuttle disaster (Hughes and White 2010). Similarly, mistakes at the group level can also be made in the course of communicating risk assessments to decision makers.

The Vincennes incident prompted a large research program on tactical decision making under stress. This program resulted in team training and technology solutions aimed at facilitating group judgment and decision making and avoiding biases (Cannon-Bowers and Salas 1998). As an example of a technological mitigation, Rajivan and Cooke (2018) demonstrated that information visualizations designed to improve perceptual processing and memory in the cybersecurity domain could reduce the information pooling bias. That is, the visualization facilitated the sharing of information held uniquely by individuals (Stasser and Titus 1985). Professional facilitators have developed practices to improve group judgment and decision making and to avoid group bias (Bens 2017; Kaner 2014; Schwarz 2002). For instance, facilitators are trained to support group judgment and decision making by testing assumptions; asking questions; and paraphrasing, summarizing, and synthesizing ideas (Bens 2017). For judgments and decisions on issues as consequential as nuclear war and nuclear terrorism, it would be of value to consider some of these mitigation procedures.

An evolving area of research on judgment and decision making concerns emotion and risk analysis (Slovic 1999). The field of emotion science has grown substantially in recent years. Relevant to the perception of terrorism and of nuclear war, research has shown predictable relationships between specific emotions and risk perception, some of which are quite counterintuitive (Lerner et al. 2015). For example, although anger is a negative emotion (which intuition suggests should produce a negative outlook), anger actually diminishes the perception of risks (e.g., Ferrer et al. 2017; Lerner and Keltner 2001). A nationally representative study revealed that anger and fear have opposing effects on risk perception: fear increases the perceived risks of terrorism, and anger decreases such perceptions (Lerner et al. 2003). As the ties between emotion and risk perception have become better understood, the study of motivation has also been incorporated into the field (Lerner et al. 2015; Phelps et al. 2014).

Few decision makers would experience neutral emotion in the context of a nuclear attack, and so models need to be infused with specific parameters predicting the effects of emotions on the depth of information processing, on implicit goal activation, and on the content of information processing (Lerner et al. 2015; Phelps et al. 2014). In addition, it is important to recognize that high-stakes decision making occurs in the context of social and institutional systems with norms, culture, and systems of accountability.

Studies of the ways people make decisions and the logical problems associated with intuitions have challenged assumptions of rationality. The question is whether risk analysis and the notion of rationality can help address these behavioral problems, especially when policies and decision making involve the management of the risks of nuclear war and nuclear terrorism.

THINKING ABOUT RISKS

While social scientists were conducting experiments to understand the cognitive dynamics of heuristics and biases in judgments under uncertainty, and the fundamental nature of preference and choice, societal disagreements were emerging over such topics as the safety of nuclear power plants and pesticides. At the heart of these conflicts was the perception of risks and safety (Covello et al. 1986; Starr 1981), and questions regarding the sensitivity of judgments to some characteristics of risks, such as controllability, equity, and unstated uncertainties (Slovic 1987).

Subsequent analyses contested the distinction between objective and subjective risks, arguing that assessing risks is thoroughly subjective, based on (1) theoretical models whose structure is subjective; (2) assumptions; (3) inputs that are dependent on judgment, from the initial structuring of a risk problem to deciding which endpoints or consequences to include in the analysis; and (4) judgments involved in identifying and estimating exposures to the hazard (Slovic 1999). Even the apparently simple task of choosing a risk measure for a well-defined endpoint such as human fatalities can be surprisingly complex and judgmental.

For example, there are many different ways that fatality risks from exposure to a toxic chemical can be described and measured, including but not limited to (Slovic 1999):

  • Deaths per million people in the population
  • Deaths per million people within x miles of the source of exposure
  • Deaths per unit of concentration
  • Deaths per facility
  • Deaths per ton of air toxin released
  • Deaths per ton of air toxin absorbed by people
  • Deaths per ton of chemical produced
  • Deaths per million dollars of product produced
  • Loss of life expectancy associated with exposure to the chemical

An analyst has to select which measures to use in a risk assessment, recognizing that the choice could make a big difference in how the risks are perceived and evaluated.
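The dependence of perceived risk on the chosen measure can be illustrated with a small sketch. All of the numbers below are hypothetical, invented only to show how a single fatality count produces very different-looking rates under different denominators drawn from the list above.

```python
# Hypothetical annual data for one chemical facility (invented figures,
# not real statistics).
fatalities = 2
population_millions = 5.0       # people living in the surrounding region
tons_released = 400.0           # tons of air toxin released
product_value_millions = 250.0  # value of product produced, in $ millions

# The same two deaths, expressed three ways from the list above.
measures = {
    "deaths per million people in the population": fatalities / population_millions,
    "deaths per ton of air toxin released": fatalities / tons_released,
    "deaths per million dollars of product produced": fatalities / product_value_millions,
}

for name, rate in measures.items():
    print(f"{name}: {rate}")
# 0.4, 0.005, and 0.008: identical underlying harm, three quite
# different impressions depending on the denominator the analyst picks.
```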

Recent research about thinking has clarified new ways of understanding decision making. While early studies of risk perception were primarily descriptive, more recent research has benefited from theoretical advances in cognitive psychology that inform the underlying mechanisms that drive perceptions and behavior. In particular, dual-process theories (e.g., Epstein 1994; Finocchiaro 1994) distinguish between what Kahneman later characterized as fast and slow thinking (intuitive or analytical decision making) (Kahneman 2011).

According to these theories, people apprehend reality in two fundamentally different ways: “fast” is intuitive, automatic, natural, nonverbal, narrative, and experiential; “slow” is analytical, deliberative, and verbal. Note that although the latter is likely to be slower if the analysis takes time, rational thinking may also be intuitive and quick. Fast thinking, which may be nonconscious, relies on intuition, quick impressions, reflexive judgments, and gut feelings. Slow thinking relies on careful analysis and deliberation, often with numbers and calculations. People rely on fast thinking most of the time because it is easier and feels right in spite of frequent mistakes in understanding uncertainties, as described by Tversky and Kahneman (1974). Education, however, may lead some to slow down, to think rationally through uncertainties rather than accepting their first intuitive conclusion.

Risk analysis is designed to help those who wish to think more systematically about decisions characterized by danger and uncertainty, and to avoid some of the mistakes known to compromise the rationality of such decisions. It is a relatively recent tool in the long evolution of analytical thinking. Over the course of human evolution, the brain developed the capacity to think symbolically and to apply logic and reason to guide decision making beyond immediate instincts. Analytical thinking enables one to imagine and critically evaluate consequences of actions beyond those “right in front of our eyes.” It is important for decision makers to recognize the need to think rationally, and to make the effort to do so, especially when the potential consequences of the decisions are extreme or outside the realm of direct experience, or when external events, such as a changing geopolitical situation, can drastically affect the risks.

Misinformation and disinformation are real and ongoing threats. The committee did not discuss or explore the role of this threat as it pertains to risk analysis; it may be addressed in the second phase of the committee’s work.

Clearly, human intuitions about risks and uncertainties can be wrong in significant ways, and human decision making can be deeply flawed. Risk analysis, when conducted as objectively and transparently as possible to support decision making (rather than to justify an already-made decision), can inform decision makers and stakeholders, and enhance their ability to make decisions with a clear understanding of the available information and its implications.

COMMUNICATING RISK ANALYSIS RESULTS

Risk analysis is conducted to inform decisions and policies. Therefore, the findings of these analyses must be communicated effectively to those who need to use them. The difficulties people have in understanding risks point to the need to train analysts and decision makers to communicate risk information in ways that are cognizant of how people think about and understand risks. An individual’s mental model, or way of thinking about a problem, can also have a significant effect on how the results are interpreted (Morgan et al. 2002).

Research has shown that the presentation of risk information is a frame that can greatly influence the way the information is interpreted and used. For example, probabilities can be expressed (i.e., framed) quantitatively as numbers between 0 and 1, as percentages, or as relative frequencies. They can be expressed qualitatively through words such as unlikely, rare, or probable. Verbal probability, however, may be translated into a wide range of numerical probabilities by the receiver (Beyth-Marom 1982), and numerical and verbal probabilities used in risk assessments may not be equally accurate (Budescu et al. 2014).

Another example of a framing problem is that percentage and frequency formats are interpreted differently. For example, as mentioned above, a 1 percent (or 0.01) chance of a harmful event occurring tends to be seen as less risky than the logically equivalent chance of 1 in 100. The latter triggers mental images of the harmful event occurring (“imagining the numerator”), creating unpleasant feelings that increase the perceived risks. The 1 percent or 0.01 frames rarely produce such imagery, and thus create less feeling of a dangerous situation. Similarly, an event described as having a 90 percent chance of success may be perceived much more favorably than the same event described as having a 10 percent chance of failure (Slovic et al. 2000).

Framing problems can contribute to communication breakdowns. The 2011 nuclear accident at the Fukushima Daiichi Nuclear Power Plant is one such example. Since 2007, a disaster-inducing tsunami at the plant had been estimated to occur approximately once every 1,000 years or less (Rampton 2011), or roughly a 0.1 percent chance annually on average. The government and plant owner Tokyo Electric Power Company failed to frame the frequency of disaster events appropriately and to clearly communicate the risks to the public, resulting in unsatisfactory evacuation orders and poor emergency preparedness (Faculty of Societal Safety Sciences 2018). These missteps contributed to the eventual nuclear accident at Fukushima.

Another problem is that precise estimates, or estimates presented without any discussion of the uncertainty associated with likelihoods or quantities, can be seen as more trustworthy than estimates surrounded by uncertainty bounds or presented as a range (Johnson and Slovic 1995; van der Bles et al. 2019). An unfortunate corollary is that it may be tempting to hide these uncertainties from the decision maker, exactly when uncertainties should be an important part of a decision.

Although events whose likelihoods are certain or near certain may trigger a strong response, differences in less extreme probabilities may not matter much in the way people make decisions (Rottenstreich and Hsee 2001; Sunstein 2003). Context also matters. The word “rarely” may be interpreted quite differently when referring to a disease than when referring to a hurricane, or a headache compared with blindness (Fischer and Jungermann 1996). And a scenario describing a possible disaster may be perceived as more likely when surrounded by text describing common but irrelevant information that contributes to the perceived reality of the scenario (Tversky and Kahneman 1983).

When sufficient data allow one to describe risks quantitatively, analysts face a wide choice of options regarding the specific measures and statistics used to communicate the magnitude of risks, as illustrated in the list above. In a similar vein, Wilson and Crouch (2001) used coal mining statistics to demonstrate how different measures of the same risks can sometimes create quite different impressions. They showed that accidental deaths per million tons of coal mined in the U.S. had decreased steadily over time. In this respect, the industry was getting safer. However, they also showed that the rate of accidental deaths per 1,000 coal mine employees had increased because the miners had become more productive. Neither measure was the one right measure of overall mining risks. They each told part of the same story.
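Wilson and Crouch’s coal mining example can be sketched numerically. The figures below are hypothetical stand-ins for the historical series (not their actual data), chosen only so that output per miner rises faster than deaths fall, which is the pattern that drives the two measures apart.

```python
# Two hypothetical periods (invented numbers, not real mining statistics).
periods = {
    "earlier": {"deaths": 300, "million_tons": 400, "thousand_miners": 150},
    "later":   {"deaths": 100, "million_tons": 900, "thousand_miners": 30},
}

rates = {}
for name, d in periods.items():
    rates[name] = {
        "deaths_per_million_tons": d["deaths"] / d["million_tons"],
        "deaths_per_thousand_miners": d["deaths"] / d["thousand_miners"],
    }

# Per ton mined, the risk falls (0.75 -> ~0.11): the industry looks safer.
# Per miner, it rises (2.0 -> ~3.33): the job looks more dangerous.
# Both statements are true of the same underlying deaths.
```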

Individuals’ fluency with numerical information (numeracy) varies and greatly affects the degree to which one can draw appropriate meaning from quantitative information about risks and other important matters (Peters 2020). Political ideologies also skew the interpretation and response to technical information and analysis (Makridis and Rothwell 2020). Trust in the analysts and the communicator is also a major factor in how information is received and acted on. Trust is affected by the degree to which the communicator is perceived to share one’s values and is acting in one’s best interests. Messages sent from an opposing political party or based on a process that is not seen as fair or inclusive of diverse views may be distrusted and ignored (Flynn and Slovic 1993; Kahan et al. 2010).

Another key factor in the quality of communications is the choice of the language that is used in the interaction (e.g., quantitative or qualitative), as well as the decision maker’s understanding of the sources of information and the way it was processed (Blastland et al. 2020; van der Bles et al. 2020). Having an intermediary explain to the consumers of the information the meaning and limitations of technical analyses can improve understanding.

It is difficult to predict how a risk analysis will be received without testing it in advance. While testing is challenging in times of conflict, different frames may be pretested on relevant audiences to identify potential problems in subsequent communication. Experts in communication can play important roles in ensuring that risk analyses are properly understood.

The magnitude of the risks associated with nuclear and radiological weapons is inherently difficult to assess and communicate. It becomes even more problematic when descriptions of the weapons and their effects are communicated with what Cohn (1987) called technostrategic language, or in terms that are abstract, euphemistic, and devoid of negative emotion. A review of the war games conducted by RAND in the 1950s noted that games with detached, quantitative, emotionless language, and devoid of moral judgment, were more likely to lead to the use of nuclear weapons than games with emotional realism and ethical considerations (Emery 2021).

Risk analysis can be a key input to a risk management decision, but the framing and communication of risk information also have an effect on the decision makers and the options they choose. Of course, the results of risk analyses are not the only—or sometimes even the most important—factors in decision making. Decisions involve, in addition to the risk results, the risk attitude and preferences of the decision maker.

CONCLUSION

CONCLUSION 7-1: The ways that risk information is assessed, framed, or presented have powerful effects on how that information is understood and used in decisions. Risk analysis results are most valuable when the methods and assumptions by which they were generated are clear, the process is replicable, trust in the analytical process is established, and the analysis addresses the real questions or decisions that confront the decision makers.
