
Currently Skimming:

Appendix A: Characterization of Uncertainty
Pages 127-132



From page 127...
... This uncertainty arises from missing or incomplete observations and data; imperfect understanding of the physical and behavioral processes that determine the response of natural and built environments and the people within them; and our inability to synthesize data and knowledge into working models able to provide predictions where and when we need them. A key element of effective treatments of uncertainty is the ability to clearly distinguish between the (inherent)
From page 128...
... So, for example, if our uncertainty in p is characterized by a beta distribution with mean = 0.15 and standard deviation = 0.10 (a standard deviation nearly as great as, or greater than, the mean is not uncommon for highly uncertain events such as those considered in homeland security applications), then the standard deviation of the number of events that could occur in the 20-year period is computed to be 2.5.
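The 2.5 figure can be reproduced with the law of total variance. The sketch below is illustrative (the function and variable names are not from the text); it assumes 20 independent yearly trials with an uncertain per-year event probability p:

```python
import math

def event_count_sd(n, mean_p, sd_p):
    """SD of the number of events in n trials when p is itself uncertain.

    Law of total variance: Var(X) = E[Var(X|p)] + Var(E[X|p])
                                  = n*(E[p] - E[p^2]) + n^2*Var(p)
    """
    var_p = sd_p ** 2
    var_x = n * (mean_p - var_p - mean_p ** 2) + n ** 2 * var_p
    return math.sqrt(var_x)

print(round(event_count_sd(20, 0.15, 0.10), 1))  # ~2.5, matching the text
```

Note that only the mean and standard deviation of p enter the calculation; the beta form matters for the full distribution (next excerpt) but not for this summary statistic.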
From page 129...
... distribution of number of events in a future 20-year period; the binomial distribution considers only variability while the beta-binomial model reflects both variability and uncertainty.
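The contrast between the two distributions can be made concrete by tabulating their probability mass functions side by side. This sketch (parameter names are assumed) derives the Beta(a, b) parameters implied by the mean-0.15, SD-0.10 example on the preceding page and evaluates the beta-binomial pmf via log-gamma:

```python
import math

def log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def binom_pmf(k, n, p):
    # Variability only: p treated as known exactly
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def beta_binom_pmf(k, n, a, b):
    # Variability plus uncertainty: p ~ Beta(a, b), integrated out
    return math.comb(n, k) * math.exp(log_beta(k + a, n - k + b) - log_beta(a, b))

# Beta(a, b) parameters implied by mean 0.15 and SD 0.10
mean_p, sd_p = 0.15, 0.10
total = mean_p * (1 - mean_p) / sd_p ** 2 - 1     # a + b
a, b = mean_p * total, (1 - mean_p) * total

n = 20
for k in range(6):
    print(k, round(binom_pmf(k, n, mean_p), 3), round(beta_binom_pmf(k, n, a, b), 3))
```

Both distributions have mean 3, but the beta-binomial puts noticeably more probability in the tails, which is exactly the extra spread (SD 2.5 versus about 1.6) attributed to parameter uncertainty.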
From page 130...
... When risk assessments include an explicit representation of uncertainty, the value of new information can be measured by its ability to reduce the uncertainties that matter in subsequent decisions derived from the risk analyses. A number of methods have been developed to quantify this, including scientific estimates based on variance reduction, decision-analytic methods based on the expected value of decisions made with and without the information, and a newer approach based on the potential for information to yield consensus among different stakeholders or decision makers involved in a risk management decision.
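The decision-analytic approach based on the expected value of decisions made with and without information can be illustrated with a minimal two-action, two-state example. All payoffs and probabilities below are hypothetical, chosen only to show the mechanics of an expected-value-of-perfect-information (EVPI) calculation:

```python
# Hypothetical losses (negative payoffs) for each (action, event occurs?) pair
p_event = 0.15
payoff = {
    ("harden", True): -40, ("harden", False): -10,
    ("defer",  True): -100, ("defer",  False): 0,
}
states = [(True, p_event), (False, 1 - p_event)]
actions = ("harden", "defer")

def expected(action):
    return sum(pr * payoff[(action, s)] for s, pr in states)

# Without information: commit to the single best action under the prior
best_without = max(expected(a) for a in actions)

# With perfect information: pick the best action in each state, then average
best_with = sum(pr * max(payoff[(a, s)] for a in actions) for s, pr in states)

evpi = best_with - best_without
print(best_without, best_with, evpi)
```

EVPI bounds what any study is worth under this framing: no data collection whose cost exceeds it can pay for itself in this decision.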
From page 131...
... The basic decision-analytic approach described above assumes a single decision maker with a single set of valuations for the outcomes, a single set of prior probabilities for these outcomes under the different decision options, and a fixed and known mechanism for translating study results into posterior probabilities (i.e., a known and agreed-upon likelihood function for the proposed or ongoing research and data collection)
From page 132...
... The Bayesian framework provides a good model for this process: even very different prior distributions should converge to the same posterior distribution when updated by a very large sample size with accurate and precise data. Consider now a decision-analytic framework that must translate the implications of changes in assessments resulting from new information for scientists and the "decision support community" into new assessments for decision makers and interested and affected parties.
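The convergence claim can be checked with a conjugate update; the priors and sample size below are hypothetical, picked to represent two analysts who initially disagree sharply:

```python
# Two analysts with sharply conflicting Beta priors (prior means 0.1 and 0.9)
priors = [(1, 9), (9, 1)]

# Both observe the same large dataset: 1,500 events in 10,000 trials
n, k = 10_000, 1_500

# Conjugate update for a binomial likelihood: Beta(a, b) -> Beta(a + k, b + n - k)
posterior_means = [(a + k) / (a + b + n) for a, b in priors]
print(posterior_means)  # both close to the empirical rate 0.15
```

With this much data the likelihood swamps both priors, so the posterior means land within about 0.001 of each other despite starting 0.8 apart.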

