Understanding the Quality of the 2020 Census: Interim Report (2022)

4

Initial Conclusions and the Path Ahead

Based on the information presented to the panel through initial (Meeting 1) briefings similar to those received by JASON (2021), and through a series of operational overviews at the panel's subsequent meetings, the panel can reach some initial, high-level conclusions regarding the processes underlying the 2020 Census, and can briefly indicate the work we intend to perform over the next year with access to census operational data.

4.1 INITIAL CONCLUSIONS ON THE 2020 CENSUS

It is entirely appropriate to begin with a note of genuine appreciation, one that partially echoes the first summary conclusion offered by our predecessor Panel to Review the 2000 Census (National Research Council, 2004):

Conclusion 4.1: The 2020 Census was implemented in light of severe and unprecedented operational challenges. Faced with extremely difficult decisions and reconciling operational demands with strong and appropriate concern for public health conditions, the professional staff of the Census Bureau generally made principled and reasoned choices in adjusting 2020 Census operations to the COVID-19 pandemic, natural disasters, and other disruptions. The basic fact that the 2020 Census was completed, as close to schedule as it was, is itself a major accomplishment, and the Census Bureau and its staff (and the responding American public) deserve commendation for heroic effort amidst the difficult circumstances.

Under this general conclusion, some detailed points deserve consideration. First, the commendation in the conclusion's last sentence applies to the full Census Bureau staff, both permanent and the temporary field corps. We single out the Census Bureau's National Processing Center (NPC) in Jeffersonville, Indiana, for specific mention. That the NPC was able to accommodate the pandemic-induced disruption to its own work schedules and clerical operations, alongside a near-overnight need to become one of the nation's leading purveyors of personal protective equipment in its support role for other Census Bureau programs, was a genuinely impressive feat. Likewise, while there were distinct challenges along the way in adapting to agile programming and rigorous software development processes, the Census Bureau achieved impressive feats in harmonizing its home-grown software solutions with outsourced-development and commercial off-the-shelf solutions. Of particular note are the development of the Census Review Analysis and Visualization Application (CRAVA) tool for review during data processing and other operational progress dashboards; the adoption of the Bureau's iCADE software for paper data capture; and—perhaps above all—the Bureau's homegrown Primus application for Internet self-response, which was relegated to backup status in 2017 but promoted to primary again shortly before the start of 2020 data collection. Primus would go on to handle the full volume of 2020 Census Internet response with, as the Bureau is proud to note, zero downtime.

Determining the level of quality the 2020 Census was able to achieve in light of the obstacles it faced will require much additional analysis, but we think it can be said that the reasonably timely execution of the 2020 Census depended on early agreement on a general design. Given the constitutional mandate for the census, postponing the 2020 Census was not an option, and it appears that having an agreed-upon and robust design was important to completing the enumeration.

Conclusion 4.2: The ability of the Census Bureau to complete the 2020 Census amidst its difficult circumstances depended critically on early commitment in the preceding decade to a general design for the 2020 Census, premised on targeted development work in a tractable number of priority innovation areas: increased field automation, wider use of administrative records data in census processes, modernized address list development, and Internet response.

It also helped that these principal innovation areas underlying the 2020 Census design were premised on enabling major changes relative to previous decades' census practice; they put the 2020 Census in good stead for adapting to the volatile operational environments of 2020. The combination of a capable Internet response channel and an operative Non-ID Processing system—making it feasible to complete a census return anytime and anywhere, without requiring contact via the mail or an enumerator—was surely helpful in obtaining census responses in the midst of a pandemic. The late reinstatement of Update Leave enumeration for many rural areas likely helped as well, because a quick, one-and-done enumerator visit to deliver an invitation to self-respond by Internet or mail was better suited to pandemic conditions than the initial plan to handle those areas entirely with face-to-face interviews under the Update Enumerate methodology. Moreover, the automation of a large part of census field operations was done expressly to achieve efficiency in data collection; the automated provision of daily workload assignments directly to enumerators' handheld devices facilitated getting work done in a shortened timespan, and it undeniably provided a better workload distribution model in the COVID era than the in-person handoffs of questionnaires, binders, and work-hour timesheets (as in 2010) would have afforded.

It is entirely appropriate, and well deserved, to give the Census Bureau due credit for its perseverance in exceedingly difficult circumstances. However, we would not be giving a full assessment if we did not temper our initial conclusions in two crucial ways. The first is similar in tenor to the concern noted by the 2020 Census Quality Indicators Task Force (2020:1) regarding the Census Bureau's oft-repeated claims—during the manic final days of 2020 Census field operations—of "reaching 99 percent completion for each state." Such statements reflect "aspirational" reasoning because they inappropriately equate "completion" of a case with the availability of a high-quality response from that case. In fact, a completed case may be a return from a proxy respondent with flawed information, or it may omit several data items. So, while we give the Census Bureau full credit for completing the 2020 Census, that is not equivalent to rendering a conclusion on the overall quality of the census or its data.

Conclusion 4.3: The fact that the 2020 Census was completed under difficult circumstances is not the same as, and is not meant to be interpreted as, a broader statement that the 2020 Census and its data products are high quality and credible. While that may be the case, the evidence base for such a definitive statement on 2020 Census quality is still in development, and such statements require careful attention to bases of comparison.

Indeed, there are known factors and complications that need to be studied carefully before reaching conclusions about 2020 Census quality. The delay in beginning and completing census field operations relative to the April 1, 2020, reference date inevitably creates the potential for recall error and recall bias in the census, and even more so in the postenumeration survey used to assess census quality. The pandemic and the timing of lockdown orders necessarily made the already difficult problem of counting persons in group quarters like college dormitories, correctional facilities, and health care facilities vastly more difficult, and also complicated accurate counting in the surrounding communities (e.g., off-campus college populations and populations surrounding correctional facilities). Diminished interview response in the main Nonresponse Follow-up (NRFU) operation, including response lost to reluctance about personal contact in the midst of the pandemic, requires consideration, as do the measurement effects that may result from the simple need to start, stop, and then resume complex field operations.

The second key caveat that we must note, in the interest of fairness, is related to the well-worn proverb that success has many parents but failure is an orphan. It is in the spirit of doing better next time—and decidedly not to assign blame to anyone or anything—that we acknowledge that at least some of the formidable challenges and disruptions facing the 2020 Census were outcomes of the Census Bureau’s own decisions:

Conclusion 4.4: The 2020 Census labored under formidable challenges, disruptions, and impediments, many of which were forced upon the Census Bureau in the sense that they were purely beyond the Bureau’s control. But it must also be acknowledged that some of the disruptions and impediments to the 2020 Census arose from design decisions and factors within the Census Bureau’s control.

In this framework, the COVID-19 pandemic, natural disasters, and civil unrest that affected field operations were classic examples of forced disruptions. So too were disruptive policy choices made by actors external to the Census Bureau, such as the failure of national statistical policy to coalesce around a combined race and Hispanic origin question (and other improved questions) despite several years of testing alternative question structures; the abrupt July 2020 attempt to change the apportionment base to exclude undocumented immigrants (Presidential Memorandum for the Secretary of Commerce, 2020); and the shifting end date of field operations due to Congressional inaction on extending statutory deadlines and to legal rulings. In comparison, the Census Bureau's rapid decision to commit to a new Disclosure Avoidance System based on differential privacy stands as the most prominent disruptive factor that was a choice of the Census Bureau's own making. Other factors in this category include the Census Bureau's 2017 decision to switch software platforms for its planned enterprise-wide system, at the known expense of undoing progress and having to recode anew systems that had been employed in earlier census tests. Tougher to deal with are those disruptive factors that fall in between, neither fully in nor out of the Census Bureau's control and often amounting to debatable judgment calls. In the 2020 context, the lead examples include operational adjustments made in the face of budgetary uncertainty, such as the elimination of census tests in 2017, the scaling back of the 2018 Census End-to-End Test to a single site, and the 2017 decision to abandon the Active Block Resolution component of In-Office Address Canvassing. All of these decisions were justifiable in their own ways, but they still raise lingering questions of missed opportunities, with potential implications for data quality had other choices been pursued.

We are very concerned, based on presentations to the panel and our knowledge of reactions to previous demonstration data, that the Census Bureau's adoption of differential privacy-based disclosure avoidance may have resulted in some public mistrust in the 2020 Census and the Census Bureau itself.1 To be clear, our concern is not with differential privacy or disclosure avoidance per se; the Census Bureau's commitment to being a good steward of its data is strong and entirely appropriate, and there is no doubt that preservation of respondent confidentiality is a solemn responsibility for any statistical agency. The disclosure avoidance strategy chosen by the Census Bureau is an extreme departure from the past—at its core, as implemented to date, it involves adding statistical noise to nearly every tabulated cell and synthesizing the census returns in full from those noisy measurements—but it is an approach grounded in quantifiable protections that has some very desirable conceptual properties. What can and must be said, however, is that the adoption of the differential privacy-based solution was made with unusual haste relative to other sweeping changes in census methodology that were researched and tested much more extensively. As a predecessor National Research Council panel (2010:61–62) noted, it took limited implementation as early as 1890, increasing levels of small-scale implementation in the 1910–1950 Censuses, numerous tests in intercensal years, extensive mailout of census questionnaires (though not yet asking for the forms to be mailed back) in the 1960 Census, and an act of Congress to achieve mailout-mailback methods in the 1970 Census. Likewise, the 2020 Census Internet response option came to fruition only after a small-scale and unpublicized implementation of online response in the 2000 Census, abandonment of the methodology in the 2010 Census, numerous national census tests, study of Internet response in other nations' censuses, and the adoption of online response in the American Community Survey and other Census Bureau surveys. By comparison, the Census Bureau committed to overhauling its disclosure avoidance approach in 2018 without testing or prototyping, much less having a working system in place, and without demonstrating that the new system was capable of simultaneously handling the basic nature of census data (e.g., non-negative integer counts, with some totals that must, by law, be rendered exactly as counted without the use of statistical methods) and addressing the needs of the broad user base of the decennial census. Reconciling the role of disclosure avoidance and privacy protection, in the context of both data user needs and the other important statutes that the Census Bureau is obliged to uphold, is among the issues that may require further attention from the panel and within the federal statistical system more generally.

___________________

1 These concerns came to the fore in the group discussion with invited census stakeholders at the panel's fourth meeting, as indicated in the agenda for that meeting in Appendix B. See also the proceedings of the National Academies of Sciences, Engineering, and Medicine (2020) workshop that followed the initial release of Demonstration Data Products and, with it, the first public airing of the new methodology.
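To make the noise-infusion idea concrete, the following is a minimal, illustrative sketch (emphatically not the Census Bureau's actual TopDown Algorithm) of the mechanism described above: integer-valued noise drawn from a two-sided geometric ("discrete Laplace") distribution is added to true cell counts, and the noisy measurements are then post-processed toward non-negativity and consistency with an invariant total. All counts and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(12345)

def geometric_noise(epsilon: float, size: int) -> np.ndarray:
    """Two-sided geometric ("discrete Laplace") noise, a standard
    differentially private mechanism for integer-valued counts."""
    p = 1 - np.exp(-epsilon)
    # The difference of two i.i.d. geometric variables is two-sided geometric.
    return rng.geometric(p, size) - rng.geometric(p, size)

# Hypothetical block-level counts for one small geography.
true_counts = np.array([12, 0, 47, 3, 158])
epsilon = 0.5  # hypothetical privacy-loss budget for this set of queries

noisy = true_counts + geometric_noise(epsilon, true_counts.size)

# Post-processing: clip to non-negative values, then rescale and round so the
# published cells respect an invariant total (here, the true total), loosely
# analogous to the consistency step in a top-down approach.
clipped = np.clip(noisy, 0, None)
published = np.round(clipped * true_counts.sum() / clipped.sum()).astype(int)

print("true:     ", true_counts)
print("noisy:    ", noisy)
print("published:", published)
```

Even this toy example exhibits the tension noted above: the raw noisy measurements can be negative or mutually inconsistent, and it is the post-processing step, not the noise itself, that restores the basic nature of census data.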

The 2020 Census could not have been completed without mobilization of and participation by the American public. To this end, the 2020 Census made concerted efforts to increase its partnership, communication, and outreach programs—and the work that these programs did amidst the disruptive conditions and shifting timelines is genuinely impressive. The tangible effects of these auxiliary programs on census response and census quality are not often explored, in large part because such assessment is difficult without studies designed in advance (to its credit, some of the planned studies in the 2020 Census program of evaluations and experiments may speak partially to the impact of the 2020 Census communication and media plans). That said, the contribution of these support operations to the census writ large should be examined as much as possible—in the spirit of doing better for 2030 and of better understanding the impacts of the special conditions of the 2020 Census.

Conclusion 4.5: The 2020 Census partnership and communication programs and the mobilization of complete count committees and other outreach/get-out-the-count efforts (including those with states and philanthropic bodies) were intended to boost public attention to the census, but there has been little attention to the return on investment and effectiveness of these support operations.

It will be important in early planning for the 2030 Census to consider designed experiments and other evaluations of issues in the formation, coverage, continuity, and timing of these partnership and outreach functions, including linkage with state, local, and philanthropic resources for outreach.

4.2 THE PATH AHEAD FOR ASSESSING THE 2020 CENSUS

Our panel and its data analysis subgroup look forward to working in earnest with operational and process data from the 2020 Census, and it is useful to outline some basic precepts in getting underway. First, we reiterate the note of appreciation from Chapter 3: the Biemer et al. (2021) data work in support of the American Statistical Association (ASA) Task Force and the Census Bureau's rounds of 2020 Census Operational Quality Metrics released to date provide very useful first steps for our own inquiries. There is much still to be learned through examination of the data behind the public Quality Metrics releases, developing a sense of the dynamics of 2020 Census processes at the national level (and getting clear on exactly how the metrics were computed by examining the counts that underlie the publicly reported percentages) before extending those analyses to finer geographic levels. Likewise, the 10 indicators identified in the ASA Task Force work (see Table 3.1) cover a good, extensive range of census activity; after getting a sense of operations at the national and state levels, a natural next step would be to examine those process indicators at finer levels of geography (e.g., at the census tract level).

It is important to note at the outset that our analytic work will, to a large degree, be exploratory rather than inferential in nature, with the goals being to uncover patterns, associations, distributions, and outliers in this census. Of course, we will be guided by our understanding of census operations, findings from previous censuses, and the relative consequences of quality deficits. This is particularly the case to the extent that we can tap the completely new-for-2020 paradata flowing through (and, ideally, retained by) the 2020 Census operational control systems for the reengineered field operations and the Internet response channel. Examples include studying the frequency and nature of major deviations by enumerators from the specified, optimized routes pushed to their smartphone enumeration devices, as well as clues from navigation information within the Internet questionnaire that might speak to response errors. We also hope to explore some questions raised by the Census Bureau's release of census quality metrics to date, perhaps most notably the counterintuitive finding of higher item nonresponse rates in the 2020 Census than in 2010—counterintuitive both because self-response increased in 2020 and because most of that self-response came through the Internet channel with its built-in completeness checks. In identifying outliers in factors like proxy enumeration, count imputation, and enumeration based on administrative records data where possible, we are not interested in outliers for the sake of highlighting problems in the specific application of census methods in 2020 (finding fault with specific areas or enumerators) but rather in deriving commonalities and explanations that could help improve procedures for 2030 (e.g., whether urban and rural areas experience similar phenomena).
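As a sketch of the kind of exploratory (not inferential) screening we have in mind, using entirely fabricated tract-level data and a deliberately simple robust-outlier rule rather than any actual 2020 Census paradata, one might flag tracts with unusually high proxy-enumeration rates and then ask what the flagged tracts have in common:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 5000

# Fabricated tract-level operational metrics (stand-ins for real paradata).
tracts = pd.DataFrame({
    "tract_id": [f"T{i:05d}" for i in range(n)],
    "urban": rng.random(n) < 0.7,
    "proxy_rate": rng.beta(2, 18, n),       # share of NRFU cases resolved by proxy
    "imputation_rate": rng.beta(1, 30, n),  # share of counts imputed
})

def mad_outliers(s: pd.Series, k: float = 3.0) -> pd.Series:
    """Flag values more than k robust standard deviations above the median,
    using the median absolute deviation (MAD) scaled to a normal sigma."""
    med = s.median()
    mad = (s - med).abs().median()
    return s > med + k * 1.4826 * mad

tracts["proxy_outlier"] = mad_outliers(tracts["proxy_rate"])

# The question of interest is not which tracts are flagged but what the
# flagged tracts share, e.g., whether urban and rural tracts differ.
print(tracts.groupby("urban")["proxy_outlier"].mean())
```

Any such rule is a screening device; flagged tracts would be starting points for investigating commonalities, not findings in themselves.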

We will continue to emphasize the need for finer spatial resolution of census operational data, but some aspects of the temporal and sequential nature of those data are also worth discussing here. First, a particular strength of the ASA indicator set is its before-the-beginning and after-the-end scope, from construction of the Master Address File (MAF) through post-data-collection processing. We hope to reinforce and extend this approach by drawing linkages between census operations when possible, trying to get beyond the operation-specific (siloed) focus of past Census Bureau evaluations and assessments. For example, it would be useful to consider whether concentrations of cases requiring imputation or proxy response are more likely to stem from some MAF-building operations than from others, or whether there are important covariates that describe households and areas reached late in the census process. Similarly, the Census Bureau's final operational plan for the 2020 Census (U.S. Census Bureau, 2018) envisioned a three-stage plan for the major NRFU operation: a first, full-automation phase in which the route optimizer and enumerator assignment generator had the fewest restrictions; a second, "semi-permanent" assignment phase in which the remaining workload was concentrated among the best-performing enumerators; and a closeout/last-resort phase to get whatever information was possible from the most difficult remaining cases. It will be useful to compare enumeration outcomes and quality indicators (such as the need to employ proxy or administrative records enumeration) in each of these NRFU phases—and even more to draw connections with how the addresses in question originated in the MAF.2
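A hypothetical sketch of such a cross-operation comparison follows, with fabricated case records and illustrative phase and MAF-source labels standing in for the real operational categories:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 20000

# Fabricated NRFU case records; the phase and MAF-source labels below are
# illustrative stand-ins, not the Census Bureau's actual category names.
cases = pd.DataFrame({
    "nrfu_phase": rng.choice(
        ["full_automation", "semi_permanent", "closeout"], n, p=[0.6, 0.3, 0.1]),
    "maf_source": rng.choice(
        ["in_office_canvass", "in_field_canvass", "local_update"], n),
    "outcome": rng.choice(
        ["household_interview", "proxy", "admin_records", "count_imputed"], n,
        p=[0.55, 0.25, 0.15, 0.05]),
})

# Enumeration outcome mix by NRFU phase...
print(pd.crosstab(cases["nrfu_phase"], cases["outcome"], normalize="index"))

# ...and the same mix drilled down by how the address entered the MAF, the
# kind of cross-operation linkage the panel hopes to pursue.
print(pd.crosstab([cases["nrfu_phase"], cases["maf_source"]],
                  cases["outcome"], normalize="index"))
```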

There are a great many analysis opportunities ahead for understanding the 2020 Census, but we close this interim report with an overarching conclusion and two related recommendations. The conclusion is an admittedly blunt, but true, statement that is a critical caveat to the preceding discussion, stemming from the fact that our work is necessarily bounded by the access to data, analytic tools, and data processing support that only the Census Bureau can provide:

Conclusion 4.6: It will not be possible for this panel (or any other evaluator) to understand and characterize the quality of the 2020 Census unless the Census Bureau is forthcoming with informative data quality metrics, including new measures based on operational/process paradata, at substate levels and small-domain spatiotemporal resolution, unperturbed by noise infusion.

The ASA 2020 Census Quality Indicators Task Force (2021) offered much the same basic comment in its final report, observing that the existing indicators released by the Census Bureau do not permit conclusions about the quality of the count. In this conclusion, we agree with the Task Force and also incorporate JASON's (2021) sharp and useful attention to the temporal behavior of indicators alongside purely spatial disaggregation. Understanding the dynamics of 2020 Census processes at the national and even the state level is a starting point, but decisions to participate (or not) in the census and the actual enumerations are inherently hyperlocal; assurance that processes worked as intended at finer spatiotemporal levels is essential to having confidence in the resulting census data.

___________________

2 In this, it will also be important to reconcile the predefined operational phases with the shifting timelines for 2020 Census field data collection, as described in Table 1.1 and Box 1.1, such as changes in whether supervisor areas could be shifted to closeout on a date certain rather than based on a percentage of completed workload.


A first recommendation that flows directly from Conclusion 4.6 reflects the fact that what our panel can actually say in its final report depends (properly) on what information the Census Bureau is willing to disclose publicly. As we work with the Census Bureau behind its data firewall, we will promote the value of maximizing transparency with the broader stakeholder and research communities to bolster confidence in census tabulations. To date, the Census Bureau's position has been to echo the JASON (2021) conclusion that quality metrics below the state level would have to be subjected to differentially private disclosure avoidance—and thus that they should not be published, lest too much of the global privacy-loss budget be consumed. It is our hope that the risk of complementary information disclosure will be reassessed in the coming months so that some detailed quality metrics can be discussed publicly, allowing the panel to "show its work" in its final report rather than simply saying "trust us":

Recommendation 4.1: The Census Bureau should work on ways to make 2020 Census data quality metrics publicly available at small-domain spatiotemporal resolutions, unperturbed by disclosure avoidance, to bolster confidence in the published tabulations. The Census Bureau should also develop ways to enable qualified researchers to access a full range of data quality metrics and report their findings.

Census process data and quality metrics for small domains (demographic and geographic, finer than the national and state level) will be crucial to assessing the quality of the 2020 Census, for several reasons. As discussed earlier, the necessary delays in completing the postenumeration survey for the 2020 Census are likely to compromise its ability to answer essential questions with confidence; accordingly, detailed process data will have to carry additional evidentiary weight simply because they may be the only option for studying some coverage problems. Another argument in favor of more detailed process data and quality metrics is simply that there is ample reason to expect that the quality of the 2020 Census varied considerably by demographic and geographic group, and so data are needed to understand those effects. The first report of the American Statistical Association 2020 Census Quality Indicators Task Force (2020) included an appendix discussing response rates by census tract in 2010 and 2020, demonstrating interesting effects at both extremes—more tracts with better response and more tracts with worse response in 2020 than in 2010. This point was elaborated by O'Hare and Lee (2021), focusing on the 10 percent of census tracts where response dropped by 10 percentage points or more in 2020 relative to 2010, and discussing variation in response by tract-level characteristics (derived from the American Community Survey) of race, ethnicity, and tenure. These analyses—and the general concept of assessing the quality of something like the decennial census that is predicated on hyperlocal effects and variation—underscore the need for assessment data suited to the task. Finally, the need for small-area and small-group process metrics for the 2020 Census relates to the extraordinary nature of this particular census year, in which the challenges confronting the count were so daunting that the more customary high-level quality evaluations of previous decades may not be adequate to shore up trust in the results.
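As a worked illustration of this style of analysis, using simulated tract response rates rather than the actual Task Force or O'Hare and Lee data, the computation amounts to differencing two rates per tract and tabulating the extremes:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 10000

# Simulated tract-level self-response rates, in percentage points.
tracts = pd.DataFrame({"rate_2010": np.clip(rng.normal(66, 10, n), 0, 100)})
tracts["rate_2020"] = np.clip(tracts["rate_2010"] + rng.normal(0, 6, n), 0, 100)
tracts["change"] = tracts["rate_2020"] - tracts["rate_2010"]

# Shares of tracts at each extreme, in the spirit of the Task Force appendix
# and O'Hare and Lee (2021).
share_big_drop = (tracts["change"] <= -10).mean()
share_big_gain = (tracts["change"] >= 10).mean()
print(f"tracts dropping >= 10 points: {share_big_drop:.1%}")
print(f"tracts gaining  >= 10 points: {share_big_gain:.1%}")
```

The substantive work, of course, lies in relating the flagged tracts to covariates such as race, ethnicity, and tenure, which is only possible if tract-level metrics are released.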

A second, final recommendation stemming directly from Conclusion 4.6 underscores that, through its formal evaluation program, the Census Bureau plays a critical part in assessing, documenting, and studying its own data. Like all other census stakeholders, we look forward to reviewing the Census Bureau's own operational assessments and evaluation reports as they become available in 2022 and beyond. Consistent with the guidance in previous National Academies studies of census research and evaluation, we reiterate a challenge to our Census Bureau colleagues to make the assessment reports more than basic operational logs, and instead to position them as a key resource for getting an early jump on next-census design and testing.

Recommendation 4.2: The Census Bureau should cast its operational assessments and evaluations of the 2020 Census as evidence of the effectiveness of census operations and as key inputs to the research agenda for the 2030 Census, rather than as purely procedural documentation of the 2020 experience. Historically, the Census Bureau's operational assessments and evaluations have been limited to high-level tabulations; the 2020 program should examine 2020 Census operations at finer levels of spatiotemporal and demographic resolution.

The decennial census is foundational to the functioning of American democracy, and maintaining the public's trust in the census and its resulting data is a correspondingly high-stakes affair. This panel's interactions with census stakeholders have indicated an erosion of trust and confidence in the wake of the exceptionally difficult circumstances surrounding 2020. We again commend the Census Bureau for the truly remarkable achievement of conducting the 2020 Census, but we urge the Bureau to keep working toward additional transparency and openness and to foster further improvements in 2030.
