4 Initial Conclusions and the Path Ahead
Pages 53-62



From page 53...
..., and through a series of operational overviews at the panel's subsequent meetings, the panel can reach some initial, high-level conclusions regarding the processes underlying the 2020 Census, as well as make some brief indications of the work we intend to perform over the next year with access to census operational data.

4.1 INITIAL CONCLUSIONS ON THE 2020 CENSUS

It is entirely appropriate to begin with a note of genuine appreciation, and partially echoing the first summary conclusion offered by our predecessor Panel to Review the 2000 Census (National Research Council, 2004)
From page 54...
... That the NPC was able to accommodate the pandemic-induced disruption to its own work schedules and clerical operations, along with its near-overnight need to become one of the nation's leading purveyors of personal protective equipment in its support role in other Census Bureau programs, was a genuinely impressive feat. Likewise, while there were distinct challenges along the way in adapting to agile programming and rigorous software development processes, we note that the Census Bureau achieved impressive feats in getting its home-grown software solutions harmonized with outsourced-development or commercial off-the-shelf solutions.
From page 55...
... The late reinstatement of Update Leave enumeration for many rural areas likely helped as well, because a quick one-and-done enumerator visit to deploy an invitation to self-respond by Internet or mail was better suited to pandemic conditions than initial plans to handle them all with face-to-face interviews in Update Enumerate methodology. Moreover, the automation of a large part of census field operations was done expressly to achieve efficiency in data collection; the automation and provision of daily workload assignments directly to enumerators' handheld devices facilitated getting work done in a shortened timespan, and undeniably provided a better workload distribution model in the COVID era than in-person handoffs of questionnaires, binders, and work-hour timesheets (as in 2010)
From page 56...
... despite several years of testing alternative question structures, the abrupt July 2020 attempt to change the apportionment base to exclude undocumented immigrants (Presidential Memorandum for the Secretary of Commerce, 2020), and the shifting end date of field operations due to Congressional inaction on extending statutory deadlines and to legal rulings.
From page 57...
... The disclosure avoidance strategy chosen by the Census Bureau is an extreme variation from the past -- at its core, as implemented to date, it involves adding statistical noise to nearly every tabulated cell and synthesizing the census returns in full based on those noisy measurements -- but it is an approach grounded in quantifiable protections that has some very desirable conceptual properties. What can and must be said, however, is that the adoption of the differential privacy-based solution was made with unusual haste relative to other sweeping changes in census methodology that were researched and tested much more extensively.
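
To make the noise-infusion idea concrete, here is a minimal sketch of the generic mechanism the passage describes: adding calibrated random noise to tabulated cell counts under a privacy-loss parameter epsilon. The function name, the use of the Laplace mechanism, and the epsilon value are illustrative assumptions only; the Census Bureau's production Disclosure Avoidance System is considerably more elaborate.

```python
import numpy as np

def noisy_counts(true_counts, epsilon, rng=None):
    """Add Laplace noise to a vector of tabulated cell counts.

    Illustrative only: the production Disclosure Avoidance System applies a
    more elaborate mechanism and post-processing, but the core idea of
    perturbing every tabulated cell with calibrated noise is the same.
    """
    rng = np.random.default_rng() if rng is None else rng
    true_counts = np.asarray(true_counts, dtype=float)
    # A count query has sensitivity 1 (one person changes a cell by at most 1),
    # so Laplace noise with scale 1/epsilon yields epsilon-differential privacy
    # for this single query.
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon, size=true_counts.shape)
    return true_counts + noise

# Example: hypothetical block-level counts perturbed under epsilon = 0.5.
blocks = [12, 0, 47, 3]
print(noisy_counts(blocks, epsilon=0.5))
```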
From page 58...
... In this, the 2020 Census made concerted efforts to increase its partnership, communication, and outreach programs -- and the work that these programs did amidst the disruptive conditions and shifting timelines is genuinely impressive. The tangible effects of these auxiliary programs on census response and census quality are not often explored, in large part because such assessment is difficult without studies designed in advance (to its credit, some of the planned studies in the 2020 Census program of evaluations and experiments may speak partially to the impact of the 2020 Census communication and media plans)
From page 59...
... We also hope to explore some questions raised by the Census Bureau's release of census quality metrics to date, perhaps most notably the counterintuitive finding of higher item nonresponse rates in the 2020 Census than in 2010 -- counterintuitive both because of increased self-response in 2020 and because most of that self-response came through the Internet, with its built-in completeness checks. In identifying outliers in factors like proxy enumeration, count imputation, and enumeration based on administrative records data when possible, we are not interested in outliers for the sake of highlighting problems in the specific application of census methods in 2020 (finding fault with specific areas or enumerators)
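
As a rough illustration of the kind of outlier screening described above, the sketch below computes per-area rates for one resolution method (proxy enumeration, say, or count imputation) and flags areas whose rate sits far above the median. The flagging rule, names, and sample numbers are hypothetical and do not represent the panel's methodology.

```python
import numpy as np

def flag_high_rates(counts, totals, multiple=3.0):
    """Compute per-area rates of an enumeration-resolution method and flag
    areas whose rate exceeds `multiple` times the median rate.

    Hypothetical screening rule, for illustration only.
    """
    counts = np.asarray(counts, dtype=float)
    totals = np.asarray(totals, dtype=float)
    rates = np.divide(counts, totals, out=np.zeros_like(counts), where=totals > 0)
    return rates, rates > multiple * np.median(rates)

# Hypothetical proxy-enumeration counts and total enumerations by area.
proxy = [40, 35, 52, 300, 44, 38]
total = [1000, 950, 1100, 1050, 990, 1020]
rates, flagged = flag_high_rates(proxy, total)
print(rates.round(3))  # per-area proxy rates
print(flagged)         # only the fourth area is flagged
```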
From page 60...
... The conclusion is an admittedly blunt, but true, statement that is a critical caveat to the preceding, stemming from the fact that our work is necessarily bounded by access to the data, analytic tools, and data processing support that only the Census Bureau can provide:

Conclusion 4.6: It will not be possible for this panel (or any other evaluator) to understand and characterize the quality of the 2020 Census unless the Census Bureau is forthcoming with informative data quality metrics, including new measures based on operational/process paradata, at substate levels and small-domain spatiotemporal resolution, unperturbed by noise infusion.
From page 61...
... conclusion that quality metrics below the state level would have to be subjected to differentially private disclosure avoidance -- and thus that these should not be published, lest too much of the global privacy-loss budget be consumed. It is our hope that the risk of complementary information disclosure will be reassessed in the coming months so that some detailed quality metrics can be discussed publicly, allowing the panel to "show its work" and offer more than "trust us" as argument in our final report:

Recommendation 4.1: The Census Bureau should work on ways to make 2020 Census data quality metrics publicly available at small-domain spatiotemporal resolutions, unperturbed by disclosure avoidance, to bolster confidence in the published tabulations.
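
The budget concern in the preceding passage follows from basic sequential composition: each differentially private release consumes its own share of a fixed global privacy-loss budget, leaving less for everything else. The sketch below illustrates that accounting with entirely hypothetical epsilon allocations; the 2020 Disclosure Avoidance System uses more refined accounting than this.

```python
def remaining_budget(global_epsilon, releases):
    """Basic sequential composition: the privacy loss of a set of releases
    is (at most) the sum of their individual epsilons, all drawn from one
    global budget. All values here are hypothetical.
    """
    return global_epsilon - sum(releases.values())

releases = {
    "redistricting tables": 2.0,      # hypothetical allocations
    "demographic profiles": 1.0,
    "substate quality metrics": 0.5,
}
print(remaining_budget(4.0, releases))  # 0.5 of the global budget remains
```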
From page 62...
... Consistent with the guidance in previous National Academies studies of census research and evaluation, we reiterate a challenge to our Census Bureau colleagues to make the assessment reports more than basic operational logs, and instead to position them as a key resource for getting an early jump on next-census design and testing.

Recommendation 4.2: The Census Bureau should cast its operational assessments and evaluations of the 2020 Census as evidence of the effectiveness of census operations and as key inputs to the research agenda for the 2030 Census, rather than as purely procedural documentation of the 2020 experience.

