Suggested Citation:"Summary." National Academies of Sciences, Engineering, and Medicine. 2022. Understanding the Quality of the 2020 Census: Interim Report. Washington, DC: The National Academies Press. doi: 10.17226/26529.

Summary

Under its general plan, the 2020 Census of the United States was intended to have all of its procedures and systems tested and refined years in advance, to conduct its extensive questionnaire delivery and information collection as close to the April 1, 2020, Census Day as practicable, and to conclude its field operations in sufficient time to permit the Census Bureau to complete rigorous data processing and meet statutory deadlines for data release. But the idealized plan was not to be: the 2020 Census confronted an array of challenges unprecedented in their volume and severity. Delays in receiving full appropriations in the years leading up to the census required resource prioritization and scaled-back tests, and a late-stage attempt to add a question on citizenship to the 2020 Census questionnaire raised widespread concern about the effects of the question on census response. But the most massive disruptions arose from the outbreak of the COVID-19 pandemic, which forced a complete stoppage of census field operations in mid-March and significantly delayed their resumption at precisely the time those operations would have been entering high gear under the plan. Census respondents and enumerators alike had reason to be fearful of face-to-face interaction in the midst of an infectious disease pandemic, and months and years of planned “get out the count” activity and messaging were suddenly outmoded and drowned out. Delaying census field operations also meant that they would have to directly confront disruptive influences both familiar (intense hurricane and tropical storm activity on the Gulf Coast, wildfires and poor air quality in the West) and unfamiliar (civil unrest).

As it resumed operations, the Census Bureau issued a new timeline to complete collection and ensure quality results, asking for a four-month extension of its statutory data-delivery deadlines. However, Congress did not act on this request; moreover, a Presidential Memorandum of July 2020 declared the presidential administration’s policy to remove undocumented immigrants from the 2020 Census state-level apportionment totals, calling for those numbers to be generated at the same time as the 2020 Census apportionment numbers. With the statutory deadline looming, a new plan was developed to halt census operations on September 30, spurring new litigation amidst concern that the census was being rushed, and the census end date would shift several times. Ultimately, another U.S. Supreme Court intervention cleared the way for 2020 Census operations to end on October 15. But although the statutory deadlines remained operative, the Census Bureau concluded that it did not have sufficient time to complete processing and editing of the census returns in time to meet them; ultimately, the 2020 Census apportionment totals were released (as initially replanned) four months after the statutory deadline, in April 2021.

Even more so than in previous decades, the nation and the full array of stakeholders in the decennial census face a daunting question about this particular census, conducted amidst such dire circumstances: Of what quality are the 2020 Census and its data? Recognizing the need for independent, external review to answer that question, the Census Bureau asked the JASON advisory group in January 2021 to make a quick, high-level assessment of its 2020 Census processes and worked with a task force formed by the American Statistical Association on the generation of 2020 Census quality indicators. For a longer-range and more intensive study of the 2020 Census and its quality, the Census Bureau requested that the National Academies of Sciences, Engineering, and Medicine (NASEM) establish this Panel to Evaluate the Quality of the 2020 Census, giving it a very general and expansive statement of task:

The National Academies of Sciences, Engineering, and Medicine will appoint an ad hoc panel to review and evaluate the quality of the data that were collected in the 2020 Census. As part of its work, the panel will:

  1. Review information from the Census Bureau on the data collected as well as various process measures and indicators of data quality obtained as part of the 2020 Census operations;
  2. Review other available information, such as results from demographic analysis, process measures and preliminary results from the post-enumeration survey, and analyses of administrative records; and
  3. Consider the results from evaluations of similar indicators from the 2010 and 2000 Censuses.

The panel will produce an interim report with its initial findings and conclusions, and a final report that includes conclusions about the quality of the data collected in the 2020 Census and makes recommendations for further research by the Census Bureau to evaluate the quality of the 2020 data and to begin planning the 2030 Census. The panel’s reports will be reviewed according to institutional review procedures and released publicly on the National Academies Press web site.


The Census Bureau has committed to providing access to census data and operational paradata,1 through a designated data analysis subgroup of the panel. However, the delayed 2020 Census data processing and delivery and the intricacies of brokering such data access are such that the panel is just beginning its data analytic work. Further, the Census Bureau has not yet produced any of its own internal evaluations and operational assessments, reviewing the conduct of its own operations. Accordingly, this first report is necessarily a plan and a framework for what is to come rather than a true interim, first-pass assessment of the quality of the 2020 Census as a whole. It cannot and should not be expected to be a comprehensive answer to the panel’s expansive charge, but it is an important opportunity to lay down markers for what will follow.

In this report, we discuss the meaning of “quality” in the census context. It is critical to understand that error—differences between observed values and the unknown, true values—is an inevitable part of the census and that perfection, or the absence of all error, is an unattainable standard for any census. Bearing this in mind, the quality of the census is a composite of the accuracy of the census data (the relative absence or mitigation of error, substantial or systemic) and the manner in which the census operations were carried out. We note that there is no unique or simple scorecard for assessing census quality, but rather that quality assessment depends crucially on the filters through which individual census operations or the census as a whole are viewed: the purposes of the decennial census and its data, and their fitness for use in meeting those purposes. We also review the major methods by which census error is usually measured and quality assessed, including comparison to external data sources (demographic analysis estimates, an independent postenumeration survey,2 administrative records data, and the results of previous censuses among them) and detailed analysis of census process data. In this, we note as well that the unique challenges and conditions of the 2020 Census apply with even fuller force to these evaluation measures—particularly the 2020 Postenumeration Survey, for which follow-up and matching activities continued into 2022.

In our initial conclusions regarding the 2020 Census, we think it appropriate to begin with a note of genuine appreciation for what the Census Bureau was able to accomplish in 2020:

Conclusion 4.1: The 2020 Census was implemented in light of severe and unprecedented operational challenges. Faced with extremely difficult decisions and reconciling operational demands with strong and appropriate concern for public health conditions, the professional staff of the Census Bureau generally made principled and reasoned choices in adjusting 2020 Census operations to the COVID-19 pandemic, natural disasters, and other disruptions. The basic fact that the 2020 Census was completed, as close to schedule as it was, is itself a major accomplishment, and the Census Bureau and its staff (and the responding American public) deserve commendation for heroic effort amidst the difficult circumstances.

___________________

1 Paradata are data about the underlying data collection process itself, such as the history of respondent contact attempts.

2 The first results from the 2020 Census Postenumeration Survey were released by the Census Bureau on March 10, 2022, as this report was in the late stages of review. The Glossary in Appendix A describes Postenumeration Survey and other terms.

Some detailed points flow from this general conclusion. Though the commendation in the conclusion’s last sentence applies to the full Census Bureau staff, we particularly commend the Census Bureau’s National Processing Center (NPC) in Jeffersonville, Indiana, for its impressive adaptation to meet unprecedented logistical demands, as well as the Census Bureau technical staff who achieved remarkable feats in getting home-grown software solutions harmonized with outsourced-development or commercial off-the-shelf solutions. These latter Census Bureau-developed systems include the Census Review Analysis and Visualization Application (CRAVA) tool and other operational dashboards for review of census operations, the iCADE system used for paper data capture, and—perhaps above all—the Primus application for self-response via the Internet that stepped up to handle the full volume of 2020 Census Internet response with zero down time.

Given the constitutional mandate for the U.S. census, postponing the 2020 Census when the COVID-19 pandemic emerged was not an option. Accordingly, it appears that having an agreed-upon and robust design was important to the enumeration being completed at all.

Conclusion 4.2: The ability of the Census Bureau to complete the 2020 Census amidst its difficult circumstances depended critically on early commitment in the preceding decade to a general design for the 2020 Census, premised on targeted development work in a tractable number of priority innovation areas: increased field automation, wider use of administrative records data in census processes, modernized address list development, and Internet response.

The 2020 Census was well positioned to adapt to the volatile operational environment of 2020 because these principal, targeted innovation areas underlying the census design were premised on making major improvements over previous decades’ practice. The combination of a capable Internet response channel and an operative Non-ID Processing system—making it feasible to complete a census return anytime and anywhere, without requiring contact via the mail or by an enumerator—was surely useful in obtaining census responses in the midst of a pandemic. Moreover, the automation of a large part of census field operations was done expressly to achieve efficiency in data collection; the automation and provision of daily workload assignments directly to enumerators’ handheld devices facilitated getting work done in a shortened timespan, and undeniably provided a better workload distribution model in the COVID era than the in-person handoffs of questionnaires, binders, and work-hour timesheets (as in 2010) would have afforded.

It is entirely appropriate, and well deserved, to give the Census Bureau due credit for its perseverance in exceedingly difficult circumstances. However, we would not be giving a full assessment if we did not temper our initial conclusions in two crucial ways. The first is similar in tenor to the concern noted by the 2020 Census Quality Indicators Task Force (2020:1) regarding the Census Bureau’s oft-repeated claims—during the manic final days of 2020 Census field operations—of “reaching 99 percent completion for each state.” Such statements are “aspirational” reasoning because they inappropriately equate “completion” of a case with the availability of a high-quality response from each case, when in fact the completion may be a return from a proxy respondent or may omit responses for several data items. So, while we give the Census Bureau full credit for completing the 2020 Census, that is not equivalent to rendering a conclusion on the overall quality of the census or its data.

Conclusion 4.3: The fact that the 2020 Census was completed under difficult circumstances is not the same, and is not meant to be interpreted, as a broader statement that the 2020 Census and its data products are high quality and credible. While that may be the case, the evidence base for such a definitive statement on 2020 Census quality is still in development, and such statements require careful attention to bases of comparison.

There are numerous known factors that need to be studied carefully before reaching any conclusion about 2020 Census quality, not least of which is the potential for recall error and recall bias, due to the delayed field operations, in the census and (especially) in the post-enumeration survey used to assess census quality. The pandemic and the timing of lockdown orders necessarily made the already difficult problem of counting persons in group quarters like college dormitories, correctional facilities, and health care facilities vastly more difficult, and also complicated accurate counting in their surrounding communities (e.g., off-campus college populations and populations surrounding correctional facilities). Diminished interview response in the main nonresponse follow-up operation, including that due to reluctance to engage in personal contact in the midst of the pandemic, requires consideration, as do the measurement effects that may result from the simple need to start, stop, and then resume complex field operations.

The second key caveat that we must note, in the interest of fairness and improvement, is that at least some of the formidable challenges and disruptions facing the 2020 Census were outcomes of the Census Bureau’s own decisions:


Conclusion 4.4: The 2020 Census labored under formidable challenges, disruptions, and impediments, many of which were forced upon the Census Bureau in the sense that they were purely beyond the Bureau’s control. But it must also be acknowledged that some of the disruptions and impediments to the 2020 Census arose from design decisions and factors within the Census Bureau’s control.

In this framework, the COVID-19 pandemic, natural disasters, and civil unrest that affected field operations were classic examples of forced disruptions. So too were disruptive policy choices made by actors external to the Census Bureau, such as the failure of national statistical policy to coalesce around a combined race and Hispanic origin question (and other improved questions) despite several years of sound research on alternatives, the abrupt July 2020 attempt to change the apportionment base to exclude undocumented immigrants (Presidential Memorandum for the Secretary of Commerce, 2020), and the shifting end date of field operations due to Congressional inaction on extending statutory deadlines and to legal rulings. In comparison, the Census Bureau’s decision to commit to a new disclosure avoidance system based on differential privacy stands as the most prominent disruptive factor that was a choice of the Census Bureau’s making. Other factors in this category include the Census Bureau’s 2017 decision to switch software platforms for its planned enterprise-wide system, at the known expense of undoing progress and having to recode anew systems that had been employed in earlier census tests. Tougher to deal with are those disruptive factors that fall in between, neither fully in nor out of the Census Bureau’s control and often amounting to debatable judgment calls. In the 2020 context, the lead examples include operational adjustments made in the face of budgetary uncertainty, such as the elimination of census tests in 2017, the scaling back of the 2018 Census End-to-End Test to a single site, and the 2017 decision to abandon the Active Block Resolution component of In-Office Address Canvassing. All of these decisions were justifiable in their own ways, but they still raise lingering questions of missed opportunities, with potential implications for data quality had other choices been pursued.

We are very concerned, based on presentations to the panel and our knowledge of reactions to previous demonstration data, that the Census Bureau’s adoption of differential privacy-based disclosure avoidance has increased the level of public mistrust in the 2020 Census and the Census Bureau itself. To be clear, our concern is not with differential privacy or disclosure avoidance per se: differential privacy is grounded in quantifiable protections and other very desirable properties, the Census Bureau’s commitment to being a good steward of its own data is strong and entirely appropriate, and preservation of respondent confidentiality is a solemn responsibility for any statistical agency. What can be said—fairly and neutrally—is that the adoption of the differential privacy-based solution was made with unusual haste relative to other sweeping changes in census methodology, which were more extensively researched and tested. Mailout-mailback methods took decades of pretesting, and partial implementation in the 1960 Census, before full adoption in the 1970 Census; Internet response in the 2020 Census came to fruition only after very small-scale implementation in the 2000 Census, abandonment of the methodology in the 2010 Census, numerous national census tests, and implementation of online response in the American Community Survey and other Census Bureau surveys. By comparison, the Census Bureau committed to a differential privacy-based solution in 2018—at its core, as implemented to date, adding statistical noise to almost every tabulated cell and synthesizing the census returns in full from those noisy measurements—without a prototype or working system in place. More to the point, the solution was adopted without demonstrating that it is capable of handling the basic nature of census data while simultaneously addressing the needs of the broad user base of the decennial census. Reconciling the role of disclosure avoidance and privacy protection, in the context of both data user needs and the other important statutes that the Census Bureau is obliged to uphold, is among the issues that may require further attention from the panel and within the federal statistical system more generally.
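The noise-infusion idea at the core of such a system can be illustrated with a minimal sketch. This is a simplified Laplace mechanism for a single counting query, not the Census Bureau's actual production algorithm (which draws discrete noise across a hierarchy of tables under a global privacy-loss budget and post-processes the noisy measurements into consistent counts); the function name and parameter values here are ours, for illustration only:

```python
import numpy as np

def laplace_count(true_count, epsilon, rng):
    """Release a count perturbed with Laplace noise.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so noise drawn with scale
    1/epsilon gives epsilon-differential privacy for this one query.
    """
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(seed=0)
true_block_count = 42          # hypothetical census-block population
for eps in (0.1, 1.0, 10.0):   # smaller epsilon: more noise, more privacy
    noisy = laplace_count(true_block_count, eps, rng)
    print(f"epsilon={eps:5.1f}  noisy count={noisy:9.2f}")
```

The sketch makes the core tension visible: with a small privacy-loss parameter the released count for a small area can differ substantially from the true count, which is exactly why releasing fine-grained quality metrics through the same mechanism consumes privacy-loss budget and degrades their usefulness.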

The 2020 Census could not have been completed without the mobilization of and participation by the American public. To that end, the 2020 Census made concerted efforts to increase its partnership, communication, and outreach programs—and the work that these programs did amidst the disruptive conditions and shifting timelines is genuinely impressive. The tangible effects of these auxiliary programs on census response and census quality are rarely explored, in large part because such assessment is difficult without studies designed in advance (to its credit, some of the planned studies in the 2020 Census program of evaluations and experiments may speak partially to the impact of the 2020 Census communication and media plans). That said, the contribution of these support operations to the census writ large should be examined as much as possible—in the spirit of doing better for 2030 and better understanding the impacts of the special conditions of the 2020 Census.

Conclusion 4.5: The 2020 Census partnership and communication programs and the mobilization of complete count committees and other outreach/get-out-the-count efforts (including those with states and philanthropic bodies) were intended to boost public attention to the census, but there has been little attention to the return on investment and effectiveness of these support operations.

It will be important in early planning for the 2030 Census to consider designed experiments and other evaluations of issues in the formation, coverage, continuity, and timing of these partnership and outreach functions.


Our panel and its data analysis subgroup look forward to working in earnest with operational and process data from the 2020 Census. In this, we benefit from the prior work of the American Statistical Association’s (ASA) Task Force; we also acknowledge, and commend, the Census Bureau for its issuance of three rounds of 2020 Census Operational Quality Metrics alongside its data releases in 2021. There is still a great deal to be learned about the dynamics of 2020 Census processes at the national and state levels through investigation of the data behind the ASA Task Force and Census Bureau Operational Quality Metrics tabulations, before extending those analyses to finer geographic levels. It is important to note at the outset that our analytic work will, to a large degree, be exploratory and descriptive rather than inferential and confirmatory, with the goals of uncovering patterns, associations, distributions, and outliers in the 2020 Census. We will, of course, be guided by our understanding of census operations, findings from previous censuses, and the relative consequences of quality deficits. An exploratory approach is particularly applicable to the extent that we can tap the completely new-for-2020 paradata flowing through the 2020 Census operational control systems for the reengineered field operations and the Internet response channel. In this work, we will seek to balance the study of indicators that are well precedented and thus directly comparable with the 2010 or prior censuses against those that are novel and lack obvious benchmarks or standards; in identifying outliers in factors like proxy enumeration, count imputation, and enumeration via administrative records data, we are not interested in outliers for the sake of spotlighting problems in the specific 2020 application of those methods, but rather in deriving commonalities and explanations that could help make improvements for 2030.

We also note at the outset that we hope to study the temporal and sequential nature of census operational data, as well as emphasizing the need for finer spatial resolution. Previous decades’ census evaluations have tended to be tightly specific to individual operations, and there is great value in studying connections and linkages—for example, tracing back from tabulations of how occupied housing units were resolved in the census (e.g., response via the Internet or counted by proxy enumeration) to determine how the addresses came to be on the Master Address File in the first place. Similarly, it will be useful to study how the Census Bureau’s multistage strategy for the major Nonresponse Follow-up operation in 2020—a first full-automation phase, a second semi-permanent assignment phase that tried to concentrate remaining workload among best-performing enumerators, and the final closeout phase—worked in terms of specific enumeration outcomes, requiring a study of the timing and sequence of enumerator contact attempts.

There are a great many analysis opportunities ahead to understand the 2020 Census, but we close this interim report with an overarching conclusion and two related recommendations. The conclusion is an admittedly blunt, but true, statement that is a critical caveat to the preceding, stemming from the fact that our work is necessarily bounded by access to the data, analytic tools, and data processing support that only the Census Bureau can provide:

Conclusion 4.6: It will not be possible for this panel (or any other evaluator) to understand and characterize the quality of the 2020 Census unless the Census Bureau is forthcoming with informative data quality metrics, including new measures based on operational/process paradata, at substate levels and small-domain spatiotemporal resolution, unperturbed by noise infusion.

The ASA 2020 Census Quality Indicators Task Force (2021) offered much the same basic comment in its final report conclusions, observing that the existing indicators released by the Census Bureau do not permit conclusions about the quality of the count.

The first recommendation regarding the availability of census quality and operational data flows directly from Conclusion 4.6 and reflects the fact that what our panel can actually say in its final report depends (properly) on what information the Census Bureau is willing to publicly disclose. As we work with the Census Bureau behind its data firewall, we will promote the value of maximizing transparency with the broader stakeholder and research communities to bolster confidence in census tabulations. To date, the Census Bureau’s position has been to echo the JASON (2021) conclusion that quality metrics below the state level would have to be subjected to differentially private disclosure avoidance—and thus that they should not be published, lest too much of the global privacy-loss budget be consumed. It is our hope that the risk of complementary information disclosure will be reassessed in the coming months so that some detailed quality metrics can be discussed publicly, allowing the panel to “show its work” in its final report rather than simply asking readers to “trust us”:

Recommendation 4.1: The Census Bureau should work on ways to make 2020 Census data quality metrics publicly available at small-domain spatiotemporal resolutions, unperturbed by disclosure avoidance, to bolster confidence in the published tabulations. The Census Bureau should also develop ways to enable qualified researchers to access a full range of data quality metrics and report their findings.

A second, final recommendation stemming directly from Conclusion 4.6 reflects the fact that, through its formal evaluation program, the Census Bureau plays a critical role in assessing, documenting, and studying its own data. Like all other census stakeholders, we look forward to reviewing the Census Bureau’s own operational assessments and evaluation reports as they become available in 2022. Consistent with the guidance in previous National Academies studies of census research and evaluation, we reiterate a challenge to our Census Bureau colleagues to make the assessment reports more than basic operational logs, and instead to position them as a key resource for getting an early jump on next-census design and testing.

Recommendation 4.2: The Census Bureau should cast its operational assessments and evaluations of the 2020 Census as evidence of the effectiveness of census operations and key inputs to the research agenda for the 2030 Census, rather than purely procedural and documentary of the 2020 experience. Historically, the Census Bureau’s operational assessments and evaluations have been limited to high-level tabulations; the 2020 program should examine 2020 Census operations at finer levels of spatiotemporal and demographic resolution.

The decennial census is foundational to the functioning of American democracy, and maintaining the public’s trust in the census and its resulting data is a correspondingly high-stakes affair. This panel’s interactions with census stakeholders have indicated an erosion of trust and confidence, in the wake of exceptionally difficult circumstances surrounding 2020. We again commend the Census Bureau for the truly remarkable achievement of conducting the 2020 Census, but urge the Census Bureau to keep working toward additional transparency and openness and fostering greater improvements in 2030.
