
Envisioning the 2020 Census (2010) / Chapter Skim

3 Initial Views on 2010 Census Evaluations
Pages 247-258

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 247...
... There is also time pressure for them since, as stated previously, much of the data collection in support of the 2010 census evaluations needs to be specified relatively early, in particular so that the contractors involved in many of the census processes can plan for the collection and structuring of data extracts that relate to the functioning of those processes.

3–A.1 Address List Improvement

For the 2000 census, the Census Bureau departed from its past practice of building the address list for the census from scratch.
From page 248...
... The Master Address File used to support the American Community Survey during the intercensal period is essentially an update of the 2000 census MAF, revised to include edits to the Postal Service's Delivery Sequence File and new construction.
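In outline, such an update is a keyed merge of address records. The sketch below is a minimal illustration in Python; all record layouts and field names are invented here, and the Bureau's actual MAF/DSF matching rules are far more elaborate.

```python
# Minimal sketch (hypothetical field names): refreshing an address list by
# folding in Delivery Sequence File (DSF) edits and new-construction records.
from dataclasses import dataclass

@dataclass(frozen=True)
class Address:
    address_id: str   # stable key, e.g., a normalized street address
    line: str         # full delivery line
    active: bool      # still a deliverable housing unit?

def update_maf(maf: dict[str, Address],
               dsf_edits: list[Address],
               new_construction: list[Address]) -> dict[str, Address]:
    """Return an updated address file: DSF edits overwrite matching records,
    and new-construction addresses are appended if not already present."""
    updated = dict(maf)
    for addr in dsf_edits:
        updated[addr.address_id] = addr            # replace or add per the DSF
    for addr in new_construction:
        updated.setdefault(addr.address_id, addr)  # add only if genuinely new
    return updated
```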
From page 249...
... In each decennial census, unanticipated problems arise that need to be fully understood so that the census design can be modified to reduce or eliminate the chance of their recurring in the subsequent decennial census. A master trace sample provides an omnibus tool for investigating the source of any of a large variety of potential deficiencies that can arise in an undertaking as complicated as the decennial census.
From page 250...
... A master trace sample database would be extremely useful in addressing the needs described in the previous section, including understanding the source of duplicates in the Master Address File and evaluating the benefits of LUCA and the block canvass operation. An overall assessment of the workings of the coverage follow-up (CFU) interview would be feasible if the master trace sample database collected sufficient data to show, for each housing unit in the CFU interview, what triggered the interview and what its result was -- that is, what changes were made and what information precipitated them.
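To make concrete what such a linked database might look like, here is a minimal, hypothetical sketch: a single trace record that follows a housing unit across operations, plus the cross-tabulation of CFU triggers against outcomes that an overall assessment would start from. All field names and category labels are invented for illustration.

```python
# Hypothetical trace-record layout for a master trace sample entry.
from dataclasses import dataclass, field

@dataclass
class TraceRecord:
    maf_id: str                     # housing-unit key on the Master Address File
    address_sources: list[str]      # e.g., ["DSF", "LUCA", "block canvass"]
    cfu_trigger: str | None = None  # rule that sent the case to CFU, if any
    cfu_result: str | None = None   # outcome: "no change", "person added", ...
    cfu_changes: list[str] = field(default_factory=list)  # specific edits made

def cfu_outcomes_by_trigger(records: list[TraceRecord]) -> dict[tuple[str, str], int]:
    """Tally (trigger, result) pairs -- the cross-tabulation from which an
    overall assessment of the CFU interview could begin."""
    counts: dict[tuple[str, str], int] = {}
    for r in records:
        if r.cfu_trigger is not None and r.cfu_result is not None:
            key = (r.cfu_trigger, r.cfu_result)
            counts[key] = counts.get(key, 0) + 1
    return counts
```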
From page 251...
... Recommendation 5: The Census Bureau should initiate efforts now for planning the general design of a master trace sample database and should plan for retention of the necessary information to support its creation.

3–A.3 Reverse Record Check

The Canadian Census has successfully employed a reverse record check for the last eight censuses to measure net coverage error.
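The arithmetic underlying a reverse record check can be sketched simply: trace a weighted sample drawn from independent frames (the previous census, intercensal births, immigrants, and persons missed last time) forward to census day, determine whether each traced person was enumerated, and compare the weighted population estimate with the census count. The toy example below is deliberately simplified; a real reverse record check must also handle tracing failures, out-of-scope cases, and overlap among frames.

```python
# Simplified reverse-record-check arithmetic (illustrative only).

def net_coverage_error(traced: list[tuple[float, bool]], census_count: float) -> float:
    """traced: (sampling weight, enumerated?) pairs for persons found to be
    alive and in scope on census day. Returns the estimated net coverage
    error as a fraction of the estimated true population
    (positive = net undercount)."""
    estimated_population = sum(w for w, _ in traced)
    return (estimated_population - census_count) / estimated_population

# Toy example: three traced persons, each representing 1,000 people, one of
# whom the census missed, against a census count of 2,800.
print(net_coverage_error([(1000, True), (1000, True), (1000, False)], 2800.0))
# -> 0.0666..., i.e., an estimated net undercount of about 6.7 percent
```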
From page 252...
... should determine the extent to which the American Community Survey could be used as a means for evaluating the coverage of the decennial census through use of a reverse record check.

3–A.4 Edit Protocols

Edit protocols are decisions about enumerations, or the associated characteristics for a housing unit, that are made on the basis of information already collected, hence avoiding additional fieldwork.
From page 253...
... Again, as with targeting, edit protocols avoid field costs but have the potential to increase census error. However, given the increasing costs of the decennial census, understanding precisely what the trade-offs are for various potential edit protocols would give the Census Bureau a better idea of which of these ideas are more or less promising for use in the 2020 census.
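As an illustration of the trade-off involved, the hypothetical rule below resolves a suspected duplicate enumeration from data already in hand and reserves fieldwork for ambiguous cases. The rule and its thresholds are invented here; tuning such thresholds is precisely the cost-versus-error trade-off an evaluation would quantify.

```python
# Hypothetical edit protocol: decide a suspected duplicate enumeration from
# already-collected data instead of sending a field interviewer.

def resolve_duplicate(pair_match_score: float,
                      same_household_roster: bool) -> str:
    """Returns "merge" (accept as duplicate, no fieldwork),
    "keep both" (accept as distinct, no fieldwork), or
    "field follow-up" (the protocol declines to decide)."""
    if pair_match_score >= 0.95 and same_household_roster:
        return "merge"        # field cost avoided; small risk of a wrong merge
    if pair_match_score <= 0.30:
        return "keep both"    # field cost avoided; small risk of a kept duplicate
    return "field follow-up"  # ambiguous: pay the field cost to limit error
```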
From page 254...
... While the implementation of handheld computing devices was tested in the 2006 census test and will be tested further in the 2008 dress rehearsal, concerns remain about how successful training will be and whether some enumerators will find the devices too difficult to learn to use comfortably in the five days allotted to training. Given that it is extremely likely that such devices will again be used to collect information in 2020 (and in other household surveys intercensally)
From page 255...
... However, evaluations focused on general functioning do not usually provide as much help in pointing the way toward improving census processes as analyses for subdomains or analyses that examine the interactions of various factors. Since the costs of such analyses are modest, we strongly support the use of evaluations for this purpose.
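A minimal sketch of what such a subdomain analysis might look like (the factor names and data layout here are hypothetical): cross-classify a process outcome by two factors and compare cell rates, since rates that vary across cells where the marginal rates do not are the signature of an interaction.

```python
# Sketch of a subdomain analysis: resolution rates cross-classified by two
# hypothetical factors rather than summarized as one national rate.

def rates_by_subdomain(cases: list[dict]) -> dict[tuple[str, str], float]:
    """cases: records like {"tenure": "renter", "urbanicity": "urban",
    "resolved": True}. Returns the resolution rate per
    (tenure, urbanicity) cell."""
    totals: dict[tuple[str, str], list[int]] = {}
    for c in cases:
        key = (c["tenure"], c["urbanicity"])
        hit, n = totals.setdefault(key, [0, 0])
        totals[key] = [hit + int(c["resolved"]), n + 1]
    return {k: hit / n for k, (hit, n) in totals.items()}
```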
From page 256...
... While certainly not as reliable or useful as a true experiment, analyses such as these could provide useful evidence for the assessment of various component processes without any impact on the functioning of the 2010 census. Second, comprehensive data from the 2010 census, its management information systems, the 2010 census coverage measurement program, and contextual data from the American Community Survey and from administrative records need to be saved in an accessible form to support more exploratory analysis of census processes, including graphical displays.
From page 257...
... The Census Bureau needs to develop a long-term plan for obtaining knowledge about census methodology in which the research undertaken at each point in time fully reflects what has already been learned so that the research program is truly cumulative. This research should be firmly grounded in the priorities of improving data quality and reducing census costs.
From page 258...
... As pointed out by the Panel on Residence Rules in the Decennial Census, "Sustained research needs to attain a place of prominence in the Bureau's priorities. The Bureau needs to view a steady stream of research as an investment in its own infrastructure that -- in due course -- will permit more accurate counting, improve the quality of census operations, and otherwise improve its products for the country" (National Research Council, 2006:271)

