Understanding the Quality of the 2020 Census: Interim Report (2022)


Suggested Citation: "3 Other Evaluations of the 2020 Census." National Academies of Sciences, Engineering, and Medicine. 2022. Understanding the Quality of the 2020 Census: Interim Report. Washington, DC: The National Academies Press. doi:10.17226/26529.

3 Other Evaluations of the 2020 Census

Our panel’s evaluation of the quality of the 2020 Census is fundamentally limited, at this point, to general comments on the processes of the census rather than their detailed results, because that information is only beginning to become available for analysis. We intend to perform a complete and thoroughgoing assessment of the 2020 Census, and we acknowledge and are grateful to the other players in this space of census evaluation and improvement who have also provided critical feedback to the Census Bureau through its challenges in 2020. These include the Census Bureau’s formal advisory committees (particularly its Scientific Advisory Committee), the regular, rigorous oversight monitoring provided by the U.S. Government Accountability Office and the U.S. Department of Commerce Office of Inspector General, and the microsimulation work performed by the Urban Institute (Elliott et al., 2021).

In this chapter, we pay particular attention to two extant independent reviews of the 2020 Census that we first commented upon in Chapter 1. We think it important to restate and describe the important recommendations made by the JASON advisory group and the American Statistical Association’s Task Force on 2020 Census Quality Indicators (hereafter, ASA Task Force). We also discuss the sets of operational quality metrics that the Census Bureau has released alongside the first data releases from the 2020 Census, and we close the chapter with our general assessment of these extant analyses of the 2020 Census.


3.1 JASON REVIEW OF 2020 CENSUS DATA QUALITY PROCESSES

Following census-related consultations in recent years, the Census Bureau asked the JASON advisory group to hear a wide-ranging set of briefings on 2020 Census processes over the course of January 4–8, 2021, and to provide a rapid letter-report assessment of the Bureau’s data quality processes. JASON and the Census Bureau permitted staff from the National Academies’ Committee on National Statistics to participate in the briefings as observers. After two weeks of deliberations, JASON submitted its report to the Census Bureau, which then released the report publicly (JASON, 2021). The JASON recommendations are listed in Box 3.1.

JASON’s first recommendation—that the Census Bureau should be afforded the time to complete its data quality checks—is arguably its most important and essential contribution, given the timing of its work and the status of the 2020 Census in early January 2021. On December 2, the Census Bureau had issued a statement acknowledging that it was continuing to address “anomalies” in census data processing and that this processing work put the target date for delivering apportionment data “in flux”; on December 30—the eve of the statutory deadline—the Census Bureau stated that “projected dates are fluid” and that it planned to deliver apportionment counts “in early 2021, as close to the statutory deadline as possible.”1 The JASON briefings occurred at a point of peak uncertainty in the census release timetable—which was further roiled on January 12, when the Commerce Department’s Office of Inspector General divulged whistleblower complaints that Census Bureau staff were under pressure to produce counts of citizens, noncitizens, and undocumented immigrants before the January 20 change in presidential administrations.2 Then-Census Bureau director Steve Dillingham replied on January 13 that he had ordered all work on the immigrant data to “stand down,” and he subsequently announced his retirement effective January 20. Executive Order No. 13986 (2021) on January 20 rescinded the previous administration’s orders on citizenship and immigration status data in the census but did not specify a target date for 2020 Census apportionment data—the first inkling of which would come on January 27 in a Census Bureau webinar for the National Conference of State Legislatures, which set April 30 as the apportionment release target (and left the redistricting data release deadline as “to be announced”).3

___________________

1 See https://www.census.gov/newsroom/press-releases/2020/update-2020-data-processing.html and https://www.census.gov/newsroom/press-releases/2020/2020-census-update-apportionment.html, which were originally posted to the now-retired 2020census.gov domain.

2 See https://www.oig.doc.gov/OIGPublications/OIG-21-019-M.pdf.

3 Shortly thereafter, on February 3, the Census Bureau formally stipulated in federal court in the National Urban League v. Ross (2020) case that it “will not under any circumstances report the results of the 2020 Census to the Secretary of the Department of Commerce, the President, and Congress, before April 16, 2021.”


Generally, JASON commended the Census Bureau’s work under the difficult circumstances, and the balance of its recommendations points to the major, looming challenge of communicating the results of the 2020 Census in a way that reinforces the trustworthiness of the results. Regarding the overall quality of the census, JASON (2021:3) posited that “a reasonable standard is whether the plan for the decennial census would produce results that improve on the standards set in previous decades given legally mandated constraints, including budget, and the information available at the time.” The uniquely difficult circumstances of the 2020 Census are such that the critical test is a modified benchmark: “being judged as having executed the 2020 Census and achieving a process quality, within the range of previously accepted decennials.” Accordingly, JASON (2021:3) noted:

It is important to understand—and for the Census Bureau to communicate to the public—that the accepted range over previous decades allows for considerable imperfections, as long as these do not knowingly embody a priori biases against individual states or statutorily defined classes of individuals.

To that end, JASON urged care and transparency in communicating how the Census Bureau was working through issues in the post-collection processing of 2020 Census data, from the compilation of Decennial Response File 1 (DRF1, at which stage some editing routines are applied) and Decennial Response File 2 (DRF2, the stage at which deduplication of responses is performed), to the Census Unedited File (CUF, in which imputation and augmentation using administrative records data are applied, and from which the state-level apportionment totals are generated), and finally to the Census Edited File (CEF), the product that is passed through disclosure avoidance and used to produce redistricting data. In particular, JASON urged clarity about the quality controls on administrative records data used to supplement census work and suggested that the Census Bureau avoid using the term “anomaly” (as it had in past censuses, and had been doing since late fall of 2020) to characterize the technical issues that commonly arise during post-collection data processing.

3.2 AMERICAN STATISTICAL ASSOCIATION TASK FORCE ON 2020 CENSUS QUALITY INDICATORS

As a second stage of independent, external review of the 2020 Census, the Census Bureau recognized a task force established under the aegis of the American Statistical Association. Working quickly, as the Census Bureau still labored toward the statutory deadline, the task force issued a first report (2020 Census Quality Indicators Task Force, 2020) in October 2020 suggesting a range of possible indicators that could be applied to the 2020 Census.



The Census Bureau then worked with a small group of analysts tapped by the task force to assemble available census quality indicators, behind the Bureau’s computer and disclosure-review firewalls. After several months of marshaling the available data and conducting its initial review, the data analysis group reported its work to the task force (Biemer et al., 2021), and two members of the parent task force supplied supplemental notes specific to demographic analysis (Hogan, 2021) and administrative records data (Fay, 2021), respectively. Based on its review of those reports and its own deliberations, the task force issued its second and final report in September 2021 (2020 Census Quality Indicators Task Force, 2021).

In its first report, the task force partitioned the overall census experience into three phases and discussed general metrics within each. In the self-response data collection phase, the task force noted that the response rate information already generated by the Census Bureau “is an exceptionally important indicator” available at a wide range of geographic levels, because “it is widely recognized that self-response from the household provides the most accurate data” and because a particular area’s hard-to-count nature is generally associated with low self-response (2020 Census Quality Indicators Task Force, 2020:10). The task force identified field data collection as the second phase but focused on the major Nonresponse Follow-up (NRFU) operation within it; within NRFU, suggested indicators were divided between those that spoke to enumeration outcomes (e.g., percentage enumerated by proxy response or by resort to administrative records) and those based on process paradata (e.g., the number of attempts required to resolve each NRFU case and the number of non-enumerated cases resulting from active refusal). Indicators for the third phase, post-data-collection processing, would include characterizing the nature of routines applied at both the CUF and CEF stage, including the percentage of records that are either identified as duplicate enumerations or that do not contain enough information to inform the deduplication work and the percent of records filled by imputation of various sorts. Though the task force would ultimately limit its focus to the quality of the state-level apportionment totals, it advocated the creation of quality metrics for at least the geographic levels of counties and census tracts, with an eye toward assessing variation across geographic areas (and identification of extreme outliers) and comparison with external estimates such as similar results in the 2010 Census. The task force offered five initial recommendations, mainly promoting continued work on the indicators expressed in the report; these are listed in Box 3.2.
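To make the geographic-screening idea concrete, the sketch below shows one way a tract-level metric might be benchmarked against its 2010 counterpart and screened for extreme outliers. It is a minimal illustration on assumed data: the tract values, the column names, and the robust z-score rule are our own illustrative choices, not the task force's (the underlying indicator files sat behind the Census Bureau's disclosure-review firewall).

```python
import pandas as pd

# Hypothetical tract-level figures; values and column names are illustrative only.
tracts = pd.DataFrame({
    "tract": ["A", "B", "C", "D", "E"],
    "proxy_pct_2020": [12.1, 14.0, 13.2, 29.5, 11.8],  # % of HUs enumerated by proxy, 2020
    "proxy_pct_2010": [11.5, 13.8, 12.9, 14.2, 12.0],  # 2010 benchmark
})

# Compare against the prior-census benchmark, as the task force suggested.
tracts["change"] = tracts["proxy_pct_2020"] - tracts["proxy_pct_2010"]

# Flag extreme outliers with a robust z-score (median and MAD); the 3.5
# cutoff is a common rule of thumb, not anything the task force specified.
med = tracts["change"].median()
mad = (tracts["change"] - med).abs().median()
tracts["outlier"] = (tracts["change"] - med).abs() > 3.5 * 1.4826 * mad

print(tracts[tracts["outlier"]])  # tract D's jump in proxy response stands out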

Equipped with that framework, Biemer et al. (2021) identified 10 metrics—“process statistics”—that were intended to cover the full range of census processing; these are described in Table 3.1. Biemer et al. (2021) extended the task force’s three-phase model for census operations to include two additional phases: one, the cross-cutting step of address list development to support census operations (designating a statistic meant to summarize churn in the Master Address File (MAF) for a particular geographic area through the duration of the 2020 Census), and another, the novel-for-2020 step of imputing parts of the group quarters (GQ) population due to the increased difficulty of reaching many of those facilities in 2020. Under the self-response phase, Biemer et al. (2021) proposed examining the efficacy of the 2020 Census Non-ID operation (the mechanism by which respondents could answer the census via the Internet without reference to any physical mailing) in producing returns from addresses not already found in the MAF, as well as a measure based on responses to the 2020 Census coverage probe question indicating whether a respondent’s usual residence was at a college or university location. The core five indicators include one unique to 2020 (percent of housing units enumerated using administrative records data) and four well-known quality metrics: levels of imputation for the status of housing units and for the population count, enumerations via proxy response, and enumerations in which the response includes only an answer to the question on total household population (i.e., providing the bare minimum of information required by the Constitutional mandate).


Table 3.1 American Statistical Association Task Force Data Analysis, Process Statistics by Major Census Phase

Master Address File (MAF) Development

  1. MAF Revisions: Percent of all addresses that were either deleted or added during the 2020 Census data collection period

Self-Response (SR)

  2. Questionnaires Without Identification (ID) Not on MAF (Non-Matching No IDs): Percent of housing units (HUs) submitting questionnaires without census IDs for which no matching address was found on the MAF, for 2020

  3. Multiple Responses: Percent of occupied HUs with two or more responses from various sources, for 2020 minus the corresponding percentage for 2010

  4. Usual Residence at College (URC): Percent of occupied HUs with two or more people where one or more occupants indicated their usual residence was at college, for 2020 minus the corresponding percentage for 2010

Nonresponse Follow-up (NRFU)

  5. Responses Obtained by Proxy (Proxy): Percent of persons in occupied HUs whose count was obtained by proxy interview, for 2020 minus the corresponding percentage for 2010

  6. Enumerations With Only a Population Count (Count Only): Percent of occupied HUs where only a population count was obtained, for 2020 minus the corresponding percentage for 2010

  7. Enumerations via Administrative Records (Admin Recs): Percent of occupied HUs enumerated by administrative records, for 2020

Data Processing

  8. MAF Addresses Having Imputed Status (Status Imputation): Percent of MAF units whose status was imputed, for 2020 minus the corresponding percentage for 2010

  9. Occupied Housing Units With Imputed Population Counts (Count Imputation): Percent of occupied HUs with known status but whose population count was imputed, for 2020 minus the corresponding percentage for 2010

Group Quarters (GQs)

  10. Group Quarters With Imputed Count (GQ Imputation): Percent of the GQ population that was imputed in 2020

SOURCE: Adapted from Biemer et al. (2021:Table 1).




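Most of the process statistics in Table 3.1 reduce to a difference of percentages: a 2020 rate minus the corresponding 2010 rate, computed for each state. A minimal sketch of that construction follows, using made-up tabulations and illustrative column names rather than Biemer et al.'s actual (non-public) files or variable names.

```python
import pandas as pd

# Hypothetical state tabulations; real inputs would come from internal
# 2020 and 2010 census processing files.
hus_2020 = pd.DataFrame({"state": ["AL", "AK"],
                         "proxy_hus": [120_000, 18_000],
                         "occupied_hus": [1_900_000, 260_000]})
hus_2010 = pd.DataFrame({"state": ["AL", "AK"],
                         "proxy_hus": [130_000, 20_000],
                         "occupied_hus": [1_850_000, 255_000]})

def rate(df):
    """Percent of occupied housing units enumerated by proxy."""
    return 100 * df["proxy_hus"] / df["occupied_hus"]

# The Proxy process statistic: the 2020 percentage minus the 2010 percentage.
proxy_stat = pd.DataFrame({
    "state": hus_2020["state"],
    "proxy_stat": rate(hus_2020) - rate(hus_2010),
})
print(proxy_stat)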

Biemer et al. (2021) worked with the Census Bureau to examine these ten metrics at the state level and, as was possible for six of the indicators, to construct analogous measures from the 2010 Census as a benchmark for comparison. The state-level process statistics were cut into quintiles to construct choropleth maps, permitting some rough examination of geographic variation in the indicators, while noting that “more conclusive assessment” of 2020 Census quality would have to await more detailed investigation of the metrics and comparison with other resources like the Postenumeration Survey. In its final report, the 2020 Census Quality Indicators Task Force (2021:2) considered the Biemer et al. (2021) results, agreeing that most of the metrics showed considerable variation across the states, but nothing rising to the level of “major anomalies that would indicate census numbers are not fit for use for purposes of apportionment.” Where the task force disagreed with the Biemer et al. (2021) analysis was in the analysts’ derivation of a summary index measure (combining state-level ranks across the ten process statistics) as a unified indicator, a construction the task force concluded lacked support.
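Both the quintile maps and the disputed summary index are mechanically simple, and a short sketch on simulated data may clarify the objection. The values below are random stand-ins, not the actual process statistics; the index construction follows the rank-combination idea described above.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Simulated stand-ins for the ten state-level process statistics
# (52 areas: 50 states, DC, and Puerto Rico).
metrics = [f"metric_{i}" for i in range(1, 11)]
stats = pd.DataFrame(rng.normal(size=(52, 10)), columns=metrics)

# Quintile bins of a single metric, as used for a choropleth map.
stats["metric_1_quintile"] = pd.qcut(stats["metric_1"], q=5, labels=False) + 1

# A rank-combination index in the spirit of Biemer et al.: average each
# area's rank across all ten metrics. Every metric gets equal weight,
# whatever its relevance, and the average obscures which metric drives a
# state's standing -- part of why the task force found it unsupported.
stats["summary_index"] = stats[metrics].rank().mean(axis=1)
print(stats[["metric_1_quintile", "summary_index"]].head())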

Still, the Biemer et al. (2021) process statistics helped the task force reach a set of conclusions and recommendations, reprinted in Box 3.3—the gist of which is that much work remains to be done to answer the overall question of the quality of the 2020 Census. Absent information from the Postenumeration Survey and analysis of process statistics like the Biemer et al. (2021) indicators at “more detailed levels of geography and subgroups of the population,” the task force noted that a thorough assessment of census quality is not yet possible. The task force also reinforced a subtle criticism, previously advanced in its first report, of the Census Bureau’s public promotion of the percentage of completed enumerations in each state: completion goals of 99 percent are good, but they are “not sufficient to draw conclusions about the quality” of those enumerations and the resulting counts. Importantly, the 2020 Census Quality Indicators Task Force (2021) formally passed the baton to this panel and other researchers to continue the comprehensive evaluation of the quality of the 2020 Census, signaling some areas of key exploratory interest in its lead recommendation.

3.3 CENSUS BUREAU’S OPERATIONAL QUALITY METRIC RELEASES

It is important to note—and to give credit to the Census Bureau—that more publicly available information on 2020 Census data quality has been made available now than at corresponding points in previous censuses (even if that information is generally limited to high-level summaries). In one sense, the fact that some of these numbers are available now (rather than after the release of all data products and the Census Bureau’s own assessments and evaluations) is a natural extension of past practice. Previous censuses began the practice of making rough self-response rates available on a rolling basis during the enumeration itself, to promote get-out-the-count and communications efforts, and the 2020 Census continued that practice. But the unusual circumstances of the 2020 Census prompted major additions—first, the generation of NRFU completion rates during the delayed and compressed NRFU operation in 2020. Later, the Census Bureau released three sets of operational quality metrics, accompanying the releases of 2020 Census apportionment and redistricting data in April, May, and August 2021.4


Table 3.2 Final Status of Addresses in the 2010 and 2020 Censuses

Percent Resolved As                                        2010    2020
Self-Response Occupied                                    61.05   64.28
Self-Response Vacant/Delete                                 n/a    1.00
NRFU Occupied                                             20.18   17.36
NRFU Vacant                                               10.41    7.95
NRFU Delete                                                3.26    6.59
Group Quarters Occupied                                    0.14    0.13
Group Quarters Vacant/Delete                              < 0.1   < 0.1
Other Enumeration Activities Occupied                      3.75    1.21
Other Enumeration Activities Vacant                        0.51   < 0.1
Other Enumeration Activities Delete                        0.29    0.51
Unresolved Housing Units (went to Count Imputation)        0.38    0.93
  Unresolved after data collection                         0.38    0.23
  Unresolved after person unduplication                     n/a    0.71

NOTES: NRFU, nonresponse follow-up; n/a, resolution category does not apply to that census. “Other Enumeration Activities” in the 2010 Census include Remote Update Enumerate and Coverage Follow-up; in the 2020 Census, they include Self-Response Quality Assurance and Coverage Improvement; in both censuses, they include Update Enumerate and Remote Alaska. Metrics are based on total address counts of 136,700,000 in the 2010 Census and 151,800,000 in the 2020 Census.

SOURCE: U.S. Census Bureau (2021a); all metrics considered to be preliminary.


___________________

4 Each of the operational quality metric releases took the form of a Microsoft Excel spreadsheet, and was accompanied by a press release or a summary post in the Census Bureau’s “Random Samplings” blog; these were posted on the 2020 Census data quality page at https://www.census.gov/programs-surveys/decennial-census/decade/2020/planning-management/process/data-quality.html.


Table 3.3 Resolution of Housing Unit Enumeration in the 2010 and 2020 Censuses

Percent Addresses (Including                       2010                            2020
Deletes) Resolved As                    U.S.    Min   Median    Max     U.S.    Min   Median    Max
Self-Response                          61.05  43.73   60.29   69.61   65.28   38.34   64.76   73.60
  Internet(c)                                                         52.06   22.20   51.11   62.23
  Paper                                61.01  43.73   60.29   69.61   11.84    5.73   12.01   21.37
  Telephone                            < 0.1   0.00    0.00            1.39    0.96    1.38    1.89
All NRFU and Other Enumeration
  Activities                           38.41  30.10   38.95   55.66   33.64   25.41   34.13   60.53
  Household Interview                  18.79  14.19   19.05   28.59   10.84    7.33   10.65   23.23
  Proxy                                19.51  14.80   20.31   29.20   18.21   13.83   19.18   37.31
    Occupied                            5.03   3.28    4.77    6.87    4.53    3.33    4.29    6.34
    Vacant                             10.92   7.40   11.00   21.87    6.82    4.45    7.21   15.53
    Delete                              3.55   2.10    3.52    8.00    6.86    3.86    7.29   18.24
  Unknown Respondent Type(b)            0.10   0.10    0.13    0.31
  Administrative Records(c)                                            4.59    0.00    4.49    5.88
    Occupied(c)                                                        3.20    0.00    3.00    4.46
    Vacant(c)                                                          1.15    0.00    1.22    2.11
    Delete(c)                                                          0.24    0.00    0.19    0.54
Unresolved Housing Units (Sent to
  Count Imputation)                     0.38   0.12    0.36    1.68    0.93    0.65    0.90    1.67
  Unresolved, data collection           0.38   0.12    0.36    1.68    0.23    0.13    0.20    1.00
  Unresolved, person unduplication(c)                                  0.71    0.00    0.68    1.22

(a) Min, Median, and Max are summary statistics of the distribution computed across the 50 states, the District of Columbia, and Puerto Rico.

(b) Resolution type applies to the 2010 Census only.

(c) Resolution type applies to the 2020 Census only; cells for the other census are left blank.

NOTES: NRFU, nonresponse follow-up. “Other Enumeration Activities” in the 2010 Census include Remote Update Enumerate and Coverage Follow-up; in 2020, they include Self-Response Quality Assurance and Coverage Improvement; in both censuses, they include Update Enumerate and Remote Alaska. Metrics based on address counts of 136,700,000 in the 2010 Census and 151,800,000 in 2020.

SOURCE: Adapted from U.S. Census Bureau (2021a); all metrics considered to be preliminary.


Release 1 of the 2020 Census operational quality metrics (U.S. Census Bureau, 2021a), accompanying the release of the state-level apportionment totals, focused on the means by which housing units and addresses were resolved or completed in 2020 Census operations, with completion percentages computed for the nation as a whole and separately by state (including the District of Columbia and Puerto Rico). A companion table calculated the same percentages for the 2010 Census for benchmarking purposes. Key results from Release 1 are excerpted in Tables 3.2 and 3.3, from which general contours may be traced. The majority of housing units were accounted for in the self-response phase of the census in both 2010 and 2020, with the 2020 Census achieving a slightly higher rate (65.3 percent) than 2010 (61.1 percent). Internet response was the dominant part of that self-response group in 2020 (79.7 percent of the self-response tier), followed by paper (18.1 percent) and telephone (2.2 percent), while response to the paper-based 2010 Census was almost exclusively on paper questionnaires.5 An additional 10.8 percent of 2020 Census housing units were resolved through personal interviews with household respondents in NRFU or other field operations, a smaller share than in the 2010 Census (18.8 percent). However, the 2020 Census recouped some of that gap by enumerating 4.6 percent of housing units through recourse to administrative records data, used when available and when a household respondent could not be reached. In both decades, about 19 percent of housing units were accounted for by proxy responses (e.g., from a neighbor or building manager), with the 2020 Census actually having slightly less proxy response (18.2 percent) than 2010 (19.5 percent). Finally, a higher percentage of unresolved housing units (0.9 percent) needed to be imputed in full in 2020, relative to the 0.4 percent of housing units going to count imputation in the 2010 Census. The small but important population of group quarters (such as correctional facilities, college housing, and health care facilities) accounted for nearly identical percentages (about 0.14 percent) of addresses in the 2010 and 2020 Censuses. The mix and relative share of these response types varied considerably by state—for example, resolution by administrative records in the 2020 Census ranged from 2.8 percent in Hawaii6 to 5.9 percent in Florida—but it is not possible to infer much from those aggregate indicators.
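The within-tier shares quoted above are simple arithmetic on the national 2020 column of Table 3.3, as the snippet below makes explicit. (From the rounded table entries, the telephone share computes to 2.1 percent, slightly below the 2.2 percent cited, presumably because the published figure was derived from unrounded data.)

```python
# National 2020 percentages from Table 3.3 (share of all addresses).
self_response_total = 65.28
modes = {"Internet": 52.06, "Paper": 11.84, "Telephone": 1.39}

# Re-express each mode as a share of the self-response tier itself.
for mode, pct in modes.items():
    print(f"{mode}: {100 * pct / self_response_total:.1f}% of self-response")
# Internet: 79.7%, Paper: 18.1%, Telephone: 2.1%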

Release 2 of the 2020 Census operational quality metrics (U.S. Census Bureau, 2021b) turned attention to the average size of occupied housing units in the 2020 and 2010 Censuses by major census phase, as well as variation by state. In 2020, households averaged 2.4 persons whether counted by self-response or in NRFU, relative to 2.5 persons per household by self-response in 2010 and 2.6 by NRFU.

___________________

5 2010 Census interviews could be completed during calls to the Census Questionnaire Assistance telephone line, accounting for the small percentage of self-response resolutions by telephone.

6 The minimum by state of 0 percent reported in Table 3.3 corresponds to Puerto Rico, for which the administrative records substitution was not performed.


The metrics also examined percentage shares by major census phase (self-response, NRFU, other enumeration operations, or count imputation) for the modal categories of one-person and two-person housing units, separately. Release 2 also shed some first light on self-response involving Non-ID processing—that is, completion of the census questionnaire without reference to the MAF ID printed on mailed or delivered census materials. In the 2010 Census, Non-ID processing was primarily triggered by a respondent completing a blank, paper “Be Counted” form, available from boxes in libraries and other public locations, and so it accounted for less than 0.5 percent of all housing units. But Non-ID processing is particularly critical to Internet response, enabling census returns to be completed anytime and anywhere—a capacity that was actively promoted in 2020 and not in 2010. At the national level, self-response Non-ID returns comprised 8 percent of housing unit enumerations in the 2020 Census, ranging by state from 5 percent (in Minnesota, Nebraska, and Wisconsin) to 11 percent (in New Mexico).

Release 3 of the 2020 Census operational quality metrics was divided into two tables released one week apart in August 2021 (U.S. Census Bureau, 2021c,d). The first table nodded to requests that had been made for quality indicators at geographic levels below the state but, rather than releasing those indicators directly, hinted at their content: it provided three-number statistical summaries (mean, standard deviation, and median7) of the distribution of selected metrics, separately for all counties within each state and all census tracts within each state. The metrics covered in this table included the percentage of occupied households in NRFU that were enumerated by interview, proxy, or administrative records, as well as overall and Internet self-response rates. The simple three-number summaries provide no real opportunity for identifying statistical outliers or detailed geographic patterning in the metrics, limiting their analytical value.
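Had the underlying county and tract values been released, such three-number summaries would be a one-line aggregation, as in the hypothetical sketch below; the point is that the Bureau published only the summaries, not the inputs. The data frame and metric name here are ours, for illustration.

```python
import pandas as pd

# Hypothetical county-level values of one metric; Release 3 published only
# the within-state mean, standard deviation, and median, not these inputs.
counties = pd.DataFrame({
    "state": ["TX", "TX", "TX", "VT", "VT"],
    "county": ["c1", "c2", "c3", "c4", "c5"],
    "nrfu_proxy_pct": [18.2, 25.9, 15.1, 12.4, 13.0],
})

# The published three-number summary, per state.
summary = counties.groupby("state")["nrfu_proxy_pct"].agg(
    mean="mean", std="std", median="median")
print(summary)
# Three numbers per state cannot reveal outlier counties or geographic
# patterning; the full distribution would be needed for that.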

The second table in Release 3 began the process of looking at the quality of census returns themselves by providing estimated item nonresponse rates for four major items on the census questionnaire: the overall population count for the household, age or date of birth, Hispanic origin, and race. These item nonresponse percentages were calculated separately for self-response (Internet, paper, and telephone modes), NRFU (interview, proxy, or administrative records), other enumeration modes, and group quarters. The rates were also calculated separately by state, and the corresponding values for the 2010 Census were put in a companion worksheet for benchmarking. Portions of this table (selected rates for the 2010 and 2020 Censuses at the national level) are included here as Table 3.4.

___________________

7 Medians were suppressed in the county-level distribution for states with fewer than five counties (Delaware, Hawaii, and Rhode Island), and no county-level statistics were presented for the single-county District of Columbia.


Table 3.4 Item Nonresponse Rates in the 2010 and 2020 Censuses

Percent Item Nonresponse                        2010      2020
All Occupied Housing Units
  Population Count                              1.43      0.52
  Age or Date of Birth                          3.35      5.95
  Hispanic Origin                               3.99      5.35
  Race                                          3.31      5.77
Self-Response Occupied Housing Units
  Population Count                              1.77      0.66
  Age or Date of Birth                          0.65      1.28
  Hispanic Origin                               4.26      2.26
  Race                                          3.26      2.61
NRFU Occupied Housing Units
  All NRFU Enumerations
    Population Count                            0.42    < 0.01
    Age or Date of Birth                       12.24     24.04
    Hispanic Origin                             4.08     16.77
    Race                                        4.08     17.51
  NRFU Household Interviews
    Population Count                            0.08    < 0.01
    Age or Date of Birth                        5.25     16.04
    Hispanic Origin                             1.85      5.98
    Race                                        2.09      8.71
  NRFU Proxy Interviews
    Population Count                            1.43    < 0.01
    Age or Date of Birth                       51.93     60.81
    Hispanic Origin                            16.68     38.09
    Race                                       15.34     41.22
  NRFU Administrative Records Enumerations
    Population Count
    Age or Date of Birth                                  3.48
    Hispanic Origin                                      28.28
    Race                                                 18.10
Other Enumeration Activities Occupied Housing Units
  Population Count                              0.63      1.00
  Age or Date of Birth                          0.91      6.70
  Hispanic Origin                               0.79     10.94
  Race                                          1.12     10.85
All Occupied Group Quarters
  Population Count
  Age or Date of Birth                          6.25     17.81
  Hispanic Origin                              25.01     43.85
  Race                                         18.07     30.17
NOTES: NRFU, nonresponse follow-up. “Other Enumeration Activities” in the 2010 Census include Remote Update Enumerate and Coverage Follow-up; in the 2020 Census, they include Self-Response Quality Assurance and Coverage Improvement; in both censuses, they include Update Enumerate and Remote Alaska.

SOURCE: U.S. Census Bureau (2021d); all metrics considered to be preliminary.


Prior to the Release 3 metrics, the Census Bureau had acknowledged that it was seeing item nonresponse rates higher than in the 2010 and previous censuses, across the board for all items.8 The Release 3 metrics, as shown in Table 3.4, generally confirmed the higher nonresponse rates, but—once duplicate entries were accounted for in post-processing—the rates were not as high as earlier suggested. The overall item nonresponse rates confirmed the historical norm that those rates are lowest among self-response households—but the table also dramatically underscores the weakness of proxy responses, with about 61 percent of proxy responses lacking age or date of birth, 38 percent lacking Hispanic origin information, and 41 percent lacking race information. The table also suggests lingering concern about the completeness of race and Hispanic origin information in administrative records data sources. Group quarters enumeration was particularly hard-hit by the operational challenges of the 2020 Census, and the item nonresponse rates for occupied group quarters increased markedly over their 2010 Census values.
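Mechanically, an item nonresponse rate is the share of enumerations for which the item is missing, computed within each enumeration mode. A minimal sketch on hypothetical records follows; the actual computation runs on internal census files, with editing rules we do not attempt to model.

```python
import pandas as pd

# Hypothetical person records: True if the item was answered, False if missing.
persons = pd.DataFrame({
    "mode": ["self", "self", "proxy", "proxy", "admin"],
    "has_age":  [True, True, False, True, True],
    "has_race": [True, False, False, False, True],
})

# Percent missing per item, separately by enumeration mode (cf. Table 3.4).
rates = 100 * (1 - persons.groupby("mode")[["has_age", "has_race"]].mean())
print(rates.round(1))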

3.4 PANEL’S ASSESSMENT

We are generally in strong agreement with both JASON and the ASA Task Force concerning their recommendations, listed in Boxes 3.1 through 3.3. Both groups are forceful in advocating that the Census Bureau develop new census quality indicators based on its 2020 Census operational control systems—that is, based on the real-time operational and process data that were made possible by the increased automation of field data collection. The ASA Task Force is particularly emphatic about drawing connections between these process metrics and census quality for different geographies and demographic groups, while JASON strongly and usefully argues that the temporal nature of these process statistics be exploited as well. Both JASON (prospectively) and the ASA Task Force (prospectively in its first report, retrospectively in its second) acknowledged the importance of giving the Census Bureau’s professional staff the time needed to complete post-data-collection processing rather than forcing an earlier release of apportionment and redistricting data. As for the distinctive recommendations that each group offered on its own, we concur with JASON’s encouragement of further exploration of operational uses of administrative records and third-party data in beginning to envision the 2030 Census, and with the ASA Task Force’s note that the nation’s census law (Title 13 of the U.S. Code) is ripe for review and modernization (including to bolster the Census Bureau’s position of independence, as called for in the principles and practices expected of a statistical agency; Box 2.1).

___________________

8 See, e.g., acting Census Bureau director Ron Jarmin’s blog post of July 28, 2021, at https://www.census.gov/newsroom/blogs/director/2021/07/redistricting-data.html. Very preliminary, and high, item nonresponse rates had also been hinted at in a batch of documentation for ongoing litigation surrounding the 2020 Census.


We do have some quibbles with two of the JASON recommendations, but even in those cases we understand why they were made and agree with their general tenor. First, a comparatively minor point: we are not as sanguine as JASON about the utility of the 2019 Census Test—a quick conversion of a portion of the American Community Survey sample to test the possible impact on response of including a citizenship question on the 2020 Census questionnaire—for better understanding self-response behavior. However, the notion of trying to estimate the impact of the COVID-19 pandemic on census self-response rates is sound. More substantively, we are concerned that JASON’s recommendation about data quality metrics below the national level could be interpreted as a barrier to those metrics being generated—largely because we have not seen a compelling case made by the Census Bureau for why subnational quality metrics are unduly disclosive of respondent information. But the balance of that JASON recommendation, and the group’s other suggestions, properly conveys strong encouragement for their production (the open question being the extent of their publication and audience). It is also the case that the conversation about the size of the global privacy-loss budget for disclosure avoidance in the 2020 Census has shifted considerably—the budget increased markedly in the final production and release of 2020 Census redistricting data—since JASON’s January–February 2021 time frame.
