
4 Analytical Considerations for Population Estimates of Mortality and Morbidity
Pages 121-160



From page 121...
... This chapter begins by discussing the issues associated with using conventional household or individual person survey interviewing for collecting data in disaster studies, whether to provide data on rates that can be scaled up to the level of the population or data to facilitate linkage of exposures to morbidity and mortality outcomes. Next, the chapter turns to the common practice of modeling excess mortality and significant morbidity effects -- differences relative to baseline-level data and trends that may be attributed to a disaster.
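The baseline-comparison logic behind excess mortality modeling can be sketched in a few lines. The following is a minimal illustration using entirely hypothetical monthly death counts; actual studies fit regression models with seasonal and trend terms rather than simple averages across prior years:

```python
# Minimal sketch of an excess-mortality calculation (hypothetical data).
# Expected deaths are taken as the mean of the same months in prior years;
# real studies use regression models with seasonal and trend components.

baseline_years = {  # deaths in four pre-disaster months, three prior years
    2014: [2300, 2350, 2400, 2500],
    2015: [2320, 2340, 2390, 2480],
    2016: [2310, 2360, 2410, 2490],
}
observed_2017 = [2900, 3100, 2700, 2600]  # hypothetical post-disaster counts

# Expected count for each month = average across the baseline years
expected = [
    sum(year_counts[m] for year_counts in baseline_years.values()) / len(baseline_years)
    for m in range(4)
]
excess = [obs - exp for obs, exp in zip(observed_2017, expected)]
total_excess = sum(excess)
print(round(total_excess))  # prints 1750
```

Here the estimate attributes all deviation from the historical average to the disaster; as discussed later in the chapter, that attribution rests on assumptions that must be specified and documented.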
From page 122...
... 390) framework lists six broad categories of measurement techniques:
• Documentation, listing deaths person-by-person along with other information about the circumstances;
• Derivation of excess mortality, which is "the use of census and other population demographic information to estimate mortality potentially attributable to both direct and indirect losses;"
• The use of personal survey interviewing methods (through "epidemiologic or demographic household surveys")
From page 123...
... That said, in understanding disaster impacts as in other areas of study, surveys can be an indispensable resource. As its approach to "rapid needs assessment" in the immediate wake of a disaster, the Centers for Disease Control and Prevention (CDC, 2014)
From page 124...
... . This review and the capsule description of the suggested CASPER process motivate a discussion of the key advantages and disadvantages of survey methods for studying disaster effects.
From page 125...
... Taken together, these basic facts are essential to understanding some key disadvantages of survey methods in disaster impact analysis:
• If survey content focuses on topics about which respondents' knowledge is limited or error-prone, the resulting statistics will not be accurate -- and, logically, survey measures of mortality or causes of death are necessarily proxy information provided by surviving respondents. Members of a household may know quite precisely whether other members perished in a disaster, particularly if they saw them die or identified the bodies afterward, but their knowledge of the circumstances may be incomplete.
From page 126...
... (2018) , one of the major studies of excess mortality in Puerto Rico after Hurricane Maria, based their analysis on a sample survey.
From page 127...
... Compounding this difficulty is that in a quick response to a major disaster, survey interviewing may need to be put into relatively unpracticed hands. Survey respondents may be unwilling to engage under the best of circumstances, and this reluctance can be even stronger at a time when their lives and property are under direct threat.
From page 128...
... But, with an eye toward studying disaster effects, the critiques suggest that potential coverage and measurement errors should give serious pause to those considering self-report measures of household mortality and make clear the need for a more nuanced analysis than simple before-and-after comparison of estimated fatality rates. Similar themes are invoked by the Working Group for Mortality Estimation in Emergencies (2007)
From page 129...
... The studies that have arguably made the best use of survey methods to estimate disaster-related effects share some important common features: they play to the strengths of survey techniques and focus less on short-term mortality impacts and more on the longer-term impacts of disasters. Importantly, they have been able to build from or extend existing data collection efforts and thus have not needed to be built fully from scratch.
From page 130...
... . An excellent example of a survey-based disaster impact study that vitally benefited from building from a solid baseline in the form of another survey data collection is the Study of the Tsunami Aftermath and Recovery (STAR)
From page 131...
... , efforts are under way to use survey methods to estimate the total impact of the virus. Understanding whether surveys or counts restricted to test-positive cases among those presenting for care more accurately describe whether the outbreak in a city or state is getting better or worse can inform a number of important policy questions.
From page 132...
... Indeed, it is the resulting variety of estimates arising from excess mortality studies conducted in Puerto Rico in the wake of Hurricane Maria in September 2017 that was a major impetus for establishing this committee.
From page 133...
... Moreover, the methodology depends critically on a loosely implied but unproven causal relationship that the detected excess deaths resulted from the disaster. In light of these major intangible factors, what keeps excess mortality studies from lapsing into post hoc ergo propter hoc fallacy is care in specification and documentation of assumptions.
From page 134...
... This report has already described the high variation in estimated mortality counts in Puerto Rico following Hurricane Maria, which made landfall on the island on September 20, 2017. The following brief review highlights the choices made by various researchers in modeling excess deaths attributable to the storm.
From page 135...
... and an article in The Lancet Planetary Health (Santos-Burgoa et al., 2018) and, importantly, has gone on to be accepted by Puerto Rico authorities as the official death toll for the commonwealth.
From page 136...
... Census Bureau population estimates for Puerto Rico) to a "displacement counterfactual." This approach decremented the population based on air travel data from the U.S.
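A displacement-counterfactual adjustment of this kind can be sketched as follows, using entirely hypothetical numbers: the at-risk population is decremented month by month by estimated net outmigration, and mortality rates are computed against the adjusted denominator rather than the pre-disaster population:

```python
# Minimal sketch of a "displacement counterfactual" denominator adjustment
# (all numbers hypothetical). The at-risk population is decremented each
# month by estimated net outmigration (e.g., derived from air travel data),
# so post-disaster mortality rates reflect the smaller population actually
# present rather than the pre-disaster total.

pre_disaster_population = 3_300_000
net_monthly_outmigration = [120_000, 60_000, 30_000]  # hypothetical net departures
monthly_deaths = [3000, 2800, 2700]                   # hypothetical death counts

population = pre_disaster_population
rates_per_100k = []
for out, deaths in zip(net_monthly_outmigration, monthly_deaths):
    population -= out  # displacement-adjusted denominator
    rates_per_100k.append(deaths / population * 100_000)

print([round(r, 1) for r in rates_per_100k])  # prints [94.3, 89.7, 87.4]
```

Because the denominator shrinks as residents leave, the same death counts imply higher mortality rates than an unadjusted comparison would suggest; the quality of the adjustment depends entirely on the migration data used.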
From page 137...
... , the chronic conditions suggesting that these might be types of death to which the hurricane contributed indirectly by hindering access to regular treatment. One particularly notable excess mortality study not focusing on Hurricane Maria and Puerto Rico was Morita et al.
From page 138...
... might plausibly serve as a surveillance indicator for changing mortality trends. However, the study was mainly about whether death notice counts roughly correlated with historic death record totals and focused principally on death notice counts between January and June 2006, which themselves were considerably time-lagged relative to the disaster, rather than trying to estimate the extent of elevated mortality potentially attributable to Katrina.
From page 139...
... (2020) analyzed mortality between March 1 and April 25, 2020, and estimated 87,001 excess deaths nationally, of which 65 percent were attributed to COVID-19.
From page 140...
... (2020) discuss these and other methodologies for estimating hidden populations, adopting common notation and focusing attention on the asymptotic properties of related estimators, in addition to citing their use in a wide array of substantive settings.
From page 141...
... NSUM is one of a class of methods that uses some implicit structuring within the hidden set to arrive at an adjusted estimate for the hidden population size. As the name implies, the particular structuring assumed by NSUM is based on a survey respondent's personal/social network size.
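The basic scale-up estimator underlying NSUM can be sketched as follows, with hypothetical survey responses. In practice each respondent's network size is itself estimated from "known population" questions, and adjustments for transmission and barrier effects are applied; none of that refinement is shown here:

```python
# Minimal sketch of the basic network scale-up (NSUM) estimator
# (hypothetical data). Each respondent reports y_i, the number of people
# they know in the hidden population, and d_i, their total personal
# network size. The basic estimator scales the proportion of hidden-
# population members within respondents' networks up to the total
# population N.

N = 1_000_000           # total population size (assumed known)
y = [2, 0, 1, 3, 0, 1]  # hidden-population members known by each respondent
d = [300, 250, 400, 500, 200, 350]  # personal network sizes

hidden_size_est = N * sum(y) / sum(d)
print(round(hidden_size_est))  # prints 3500
```

The estimator's validity rests on the assumption that hidden-population members are distributed through respondents' networks roughly in proportion to their share of the total population.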
From page 142...
... Particularly for hidden populations characterized by stigmatized or illegal behavior, studies have often had to resort to snowball or chain-referral sampling: on finding and completing a survey with a single member of the target, hidden population (the "seed"), the researchers then ask the person to name others in the population who might be interviewed.
From page 143...
... RDS begins by finding a small number of subjects (members of the target hidden population) who are recruited to serve as "seeds." Upon completing the interview, the seeds are offered an incentive to recruit their peers (other members of the hidden population)
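To illustrate how RDS data are typically analyzed, the following is a minimal sketch of an inverse-degree-weighted estimator (often called RDS-II, after Volz and Heckathorn) for the proportion of a trait within the hidden population, using hypothetical data. Because chain referral over-recruits well-connected individuals, each respondent is weighted by the inverse of their reported network degree:

```python
# Minimal sketch of the RDS-II (Volz-Heckathorn) estimator for a trait
# proportion within a hidden population (hypothetical data). Weighting
# each respondent by 1/degree offsets the tendency of chain-referral
# recruitment to oversample high-degree individuals.

degrees = [10, 4, 25, 8, 5, 20]  # reported network sizes (degrees)
has_trait = [1, 1, 0, 1, 0, 0]   # e.g., experienced a disaster-related injury

weights = [1.0 / deg for deg in degrees]
p_hat = sum(w * t for w, t in zip(weights, has_trait)) / sum(weights)
print(round(p_hat, 3))  # prints 0.621
```

Note that the unweighted sample proportion here would be 0.5; the weighted estimate differs because the trait happens to be concentrated among lower-degree (and hence under-sampled) respondents in this hypothetical sample.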
From page 144...
... COMMON ISSUES RELATED TO ESTIMATION TECHNIQUES

In its review of statistical estimation techniques for disaster-related mortality and significant morbidity, the committee reviewed the literature and, to supplement and round out that review, hosted two public webinars on February 11 and 18, 2020, to gather input related to several of the studies and techniques (see Chapter 1)
From page 145...
... In addition, the estimation methods described in this chapter can also provide more detail than case counts in terms of demographic and other disparities, types of illness and injuries experienced, and specific causes of death. While the adoption of a standardized, universally applicable method for estimating the mortality and significant morbidity effects of major disasters is not recommended, there is value in some degree of standardization so that, as much as possible, observed differences reflect substantive differences rather than arbitrary methodological choices.
From page 146...
... In general, all of these estimation techniques rely on accurate and appropriate baseline, contextual data. Many of the techniques rely on vital records or vital statistics data -- for example, the counting-based methods described in Chapter 3 -- so it is certainly true that improving the counting-type mortality and morbidity data is important to improving the quality of the estimates that use them.
From page 147...
... The development of effective baseline data may also include exploring opportunities to use alternative and emerging data sources, such as cell phone location records and other administrative data, in ways that derive benefit from the new data resources while managing privacy and confidentiality concerns. Developing an effective data and information structure for studying disaster impacts is not a basic research activity: it has immediate application value.
From page 148...
... .
• Some of the survey procedures and data analyses suggested, particularly if building on data previously gathered for other purposes, may appear to conflict with consent procedures under the Common Rule, which guides human subject research; respondent burden issues under the Paperwork Reduction Act, which governs clearance of federal information collections; and the Health Insurance Portability and Accountability Act (HIPAA)
From page 149...
... Accordingly, the estimation techniques discussed in this chapter are unlikely to be able to provide direct insight in the early disaster response phase. However, with time to gather data and develop proper specifications, the estimation techniques are useful in assessing the total impact of disasters and in planning for future disasters.
From page 150...
... The counting-based methods described in Chapter 3 rely on accurate baseline and contextual data, including vital statistics data. Improving

BOX 4-1 Selected Research Priorities for a National Research Program

The research program could address factors such as:
• Specification of a comparison period or the handling of confounding or seasonal structure in the data;
• Determination of an accurate sampling frame;
• Comparison of different estimates from the same disaster to evaluate the effect of methodological choices and assumptions;
• Development of appropriate standard survey questionnaires;
• Creation of appropriate statistical models;
• Development of effective means of characterizing migration and population displacement before, during, and in the immediate wake of each common type of disaster;
• Furthering methodological research to lessen and characterize uncertainty in estimations;
• Exploration of modern causal inference techniques to determine appropriate causal estimands and methods for their estimation; and
• Development and evaluation of methods and tools for integrating social determinants of health data into estimations of disaster-related mortality and morbidity to produce more actionable and descriptive data.
From page 151...
... Conclusion 4-2: Developing an effective data and information structure for studying disaster impacts on mortality and morbidity should be a cornerstone of the nation's operational disaster response function. Because the necessary analytical sophistication and high-quality fieldwork are generally beyond the capabilities and time availability of most SLTT health departments, it is essential that federal partners work to build and sustain the capacity of the nation's existing research and survey infrastructure to support the collection of survey data on the health effects of disasters.
From page 152...
... Recommendation 4-1: Fund and Conduct Research on Analytical Methods for Population Estimates

The Centers for Disease Control and Prevention, the National Institutes of Health, and the National Science Foundation should establish a national research program to advance analytical methods for conducting population-level estimates of mortality and morbidity related to disasters. This national research program should include the development and refinement of minimum standard methods and protocols for conducting population-level mortality and morbidity assessments as well as the creation and testing of tools for use by researchers, states, and localities to enhance their capabilities to carry out and use these analyses.
From page 153...
... )-approved sampling frames and methods for dealing with methodological challenges, such as population migration, for use by researchers conducting population estimates following large-scale disasters.
• The stakeholders listed above should address issues with informed consent procedures under the Common Rule, respondent burden issues under the Paperwork Reduction Act, and privacy under the Health Insurance Portability and Accountability Act Privacy Rule in advance and ensure that alternative arrangements are in place to protect privacy and confidentiality.
From page 154...
... • The following immediate actions should be undertaken to ensure SLTT access to and use of mortality and morbidity data:
o The National Center for Health Statistics (NCHS) should code and automatically provide, with the assistance of FEMA and ASPR, location-specific, baseline mortality data and up-to-date data on disaster deaths following a declared disaster and upon request, as well as offer ready-to-use tools within a set time frame following disasters to states and localities.
From page 155...
... 2019. Causes of excess deaths in Puerto Rico after Hurricane Maria: A time-series estimation.
From page 156...
... 2002. Respondent-driven sampling II: Deriving valid population estimates from chain-referral samples of hidden populations.
From page 157...
... 2020. Assessing mortality and significant morbidity following Hurricane Maria in Puerto Rico.
From page 158...
... Presentation at the February 11, 2020, public meeting of the National Academies of Sciences, Engineering, and Medicine's Committee on Best Practices for Assessing Mortality and Significant Morbidity Following Large-Scale Disasters, conducted in webinar style. https://www.nationalacademies.org/event/02-11-2020/best-practices-in-assessing-mortality-and-significant-morbidity-following-large-scale-disasters-webinar-methodological-considerations-of-the-estimation-of-disaster-related-morbidity-and-mortality-at-a-population-level (accessed September 1, 2020)
From page 159...
... 2018. Use of death counts from vital statistics to calculate excess deaths in Puerto Rico following Hurricane Maria.
From page 160...
... Presentation at the February 18, 2020, public meeting of the National Academies of Sciences, Engineering, and Medicine's Committee on Best Practices for Assessing Mortality and Significant Morbidity Following Large-Scale Disasters, conducted in webinar style. https://www.nationalacademies.org/event/02-18-2020/best-practices-in-assessing-mortality-and-significant-morbidity-following-large-scale-disasters-webinar-methodological-considerations-for-estimating-excess-mortality-and-morbidity (accessed September 1, 2020)

