Reducing Response Burden in the American Community Survey: Proceedings of a Workshop (2016)

6

Tailoring Collection of Information from Group Quarters

It has been 10 years since a sample of group quarters (GQs)—correctional facilities, nursing homes, college dormitories, and the like—was added to the then-year-old American Community Survey (ACS), with the goal of more closely mirroring the design of the census long-form sample that the ACS was designed to replace. People in the group quarters sample units are now asked the same questions as household members about such topics as personal characteristics (e.g., disability, veteran status, and employment). The ACS housing questions are not asked, but information about type of facility is collected from the facility contacts.

The Census Bureau has found the collection, estimation, and analysis of GQ information to be quite challenging over this decade. For example, the small representation of group quarters in the monthly ACS samples has affected the quality of the estimates in many small areas that have large GQ populations relative to the total population.

In 2010, the Census Bureau asked the National Research Council (NRC), through a panel of the Committee on National Statistics, to review and evaluate the statistical methods used for measuring the GQ population. The panel’s report contained several recommendations calling for improvements in the sample design, sample allocation, weighting, and estimation procedures and suggested further research to address the underlying question of the relative importance and costs of the GQ data collection in the context of the overall ACS (National Research Council, 2011).

Five years after that report and the introduction of the first ACS data products based on samples of both households and group quarters, the Census Bureau again requested the Committee on National Statistics to revisit
issues surrounding the collection of data from group quarters as one of the key topics for this workshop. The steering committee developed an agenda that focused broadly on data collection methods for the group quarters component of the survey. The session featured an overview presentation by Judy Belton (U.S. Census Bureau) and additional presentations by Barbara Anderson (University of Michigan), Lauren Harris-Kojetin (National Center for Health Statistics), Andy Peytchev (University of Michigan), Michael Brick (Westat), and Colm O’Muircheartaigh (NORC at the University of Chicago). The session was chaired by steering committee member David Dolson (Statistics Canada), a member of the 2010 study panel.

THE FEASIBILITY OF TAILORING GROUP QUARTERS-SPECIFIC QUESTIONNAIRES IN THE ACS

Judy Belton addressed the feasibility of tailoring GQ-specific questionnaires in the ACS. She defined group quarters, provided background on ACS GQ data collection and the questionnaire items on the ACS form, and reported on the analysis and recommendations of an internal Census Bureau study to determine the feasibility of developing a GQ-specific questionnaire.

Group quarters, according to the official definition, are places where people live or stay in a group living arrangement that is owned and managed by an entity or organization that provides housing or services for its residents. Group quarters are divided into two groups: institutional and noninstitutional. Examples of institutional GQs are correctional facilities for adults, juvenile facilities, and nursing facilities. Examples of noninstitutional GQs are college and university student housing, military barracks, and residential treatment centers.

The ACS samples about 18,000 GQs a year, classified as large (15 or more people living in the facility) and small (with fewer than 15 people). There are about 15,000 large GQs, mostly college and university student housing, nursing facilities, and correctional facilities, and nearly 3,000 small GQs, mostly group homes and workers’ dorms.

The ACS collects data from a sample of about 1,600 facilities every month. From those facilities, the Census Bureau samples residents to participate in the ACS, with an average of about 10 participants per GQ. Collection from 2006 through 2008 was paper-based, with field representatives using a paper questionnaire to conduct the interviews. In 2009, the ACS converted to a computer-assisted personal interviewing (CAPI) instrument.
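
As a rough consistency check on these figures (illustrative arithmetic only, not an official Census Bureau calculation), sampling about 1,600 facilities per month is on the order of the roughly 18,000 group quarters sampled per year, and at about 10 sampled residents per facility the annual person-level sample is on the order of 180,000 interviews:

$$1{,}600 \times 12 \approx 19{,}000 \ \text{facilities per year}, \qquad 18{,}000 \times 10 \approx 180{,}000 \ \text{resident interviews per year}.$$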

In every sample GQ, the interviewer first talks to a contact person, the gatekeeper, using an automated Group Quarters Facility Questionnaire (GQFQ). The GQFQ obtains the contact person's name, verifies the GQ type and address, and obtains the number of people who are living or staying at the GQ that day. Based on that count, the GQFQ randomly selects residents to participate in the survey. Field representatives attempt CAPI with the selected people. If CAPI cannot be conducted with a sample resident, the resident is asked to participate in a telephone interview. The next step is to conduct an interview with a proxy, such as a relative, parent, or contact at the facility. After failing to obtain an interview by these methods, the interviewer will leave a questionnaire with the sample resident, and field staff will return to pick it up. If that fails, the Census staff member swears in a contact person at the facility, commissioning that person to drop the questionnaires off with the sample residents, and field staff return upon completion. As a last resort, field staff can use the facility's administrative records after obtaining permission from a Census Bureau regional office.
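
The fallback sequence just described can be summarized schematically. The sketch below is only an illustration of the ordering of modes as presented in the session; the function name, flags, and mode labels are hypothetical and do not correspond to Census Bureau systems or terminology.

```python
def select_collection_mode(case):
    """Illustrative ordering of the GQ data collection fallback sequence
    described above (hypothetical flags; not Census Bureau code)."""
    if case.get("capi_completed"):
        return "CAPI interview with the sample resident"
    if case.get("telephone_completed"):
        return "telephone interview with the sample resident"
    if case.get("proxy_interview_completed"):     # relative, parent, or facility contact
        return "proxy interview"
    if case.get("self_complete_pickup"):          # questionnaire left with resident, picked up later
        return "paper questionnaire via field drop-off and pick-up"
    if case.get("sworn_contact_distribution"):    # facility contact sworn in to distribute forms
        return "paper questionnaire via sworn facility contact"
    if case.get("regional_office_permission"):    # last resort, with regional office approval
        return "facility administrative records"
    return "noninterview"


# Example: a case where only a proxy interview could be completed.
print(select_collection_mode({"proxy_interview_completed": True}))  # -> proxy interview
```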

CAPI is the preferred method of data collection because it has skip patterns based on GQ type, Belton explained. For example, residents living in institutional GQs are not asked questions about how they travel to work or whether they have any children living in the facility. However, paper questionnaires are not totally eliminated, because computers are not allowed and access is restricted in some facilities, such as federal Bureau of Prisons facilities. For this type of group quarters, the forms are dropped off and later mailed to the regional offices.
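
A minimal sketch of the kind of GQ-type-based skip pattern Belton described is shown below. The item names and GQ-type labels are hypothetical and are used only to illustrate how an automated instrument can suppress items that do not apply to a respondent, something a single generic paper form cannot do.

```python
# Hypothetical GQ-type labels and item names, for illustration only.
INSTITUTIONAL_TYPES = {"adult correctional", "juvenile facility", "nursing facility"}

def items_to_ask(gq_type, base_items):
    """Drop items that do not apply to residents of institutional GQs,
    such as journey-to-work and children-in-facility items (illustrative logic only)."""
    skipped_for_institutional = {"travel_to_work", "children_in_facility"}
    if gq_type in INSTITUTIONAL_TYPES:
        return [item for item in base_items if item not in skipped_for_institutional]
    return list(base_items)

# Example: a nursing facility resident is not asked the commuting or children items.
print(items_to_ask("nursing facility",
                   ["age", "sex", "travel_to_work", "children_in_facility"]))
# -> ['age', 'sex']
```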

Belton stated that a special paper version of the CAPI questionnaire is provided to facility management and others as an example of the kinds of questions asked in the computer-based interview. The content of the GQ questionnaire is much like a housing unit questionnaire with only the social and economic questions. The GQ questionnaire does not contain plumbing and heating questions, but receipt of Supplemental Nutrition Assistance Program (SNAP) assistance, which is a housing question, is asked.

The Census Bureau is assessing the possibility of developing GQ-specific questionnaires. ACS data on response by type of institution and mode indicate that 84.3 percent of GQ interviews are conducted via CAPI, while close to 16 percent are self-responses on paper. For institutional GQs, only 7.6 percent of responses are completed on paper questionnaires, compared with 92.4 percent via CAPI. Reflecting the paper-based policies of the Bureau of Prisons, more paper is used in adult correctional facilities (almost 12 percent) than in juvenile and nursing facilities.

Belton next reported data on GQ paper responses by GQ type and respondent type. The majority of all GQ paper responses, almost 92 percent, were answered by the sample resident: 80.4 percent of institutional paper responses and almost 96 percent of noninstitutional paper responses were completed by the sample resident. Almost 50 percent of the responses in nursing homes came from proxies, and almost 39 percent were from proxies in the other institutional GQ types. The pattern for GQ CAPI responses differed from that for paper. While the majority of CAPI interviews were completed by the sample resident, proxies accounted for a larger share of CAPI responses than of paper responses: proxies accounted for 24 percent of paper responses and almost 28 percent of CAPI responses.

The Census staff also assessed missing data rates for GQ reporters. The rates varied but were generally low regardless of mode, except for the health insurance questions, for which there were reporting problems in both paper and CAPI. Comparing missing data rates for institutional versus noninstitutional GQ types, the Census Bureau found that rates were higher in institutional GQs.

The institutions showing the highest missing data rates on the paper questionnaire were nursing facilities, probably because health insurance information is not available in administrative records. As might be expected, missing data rates were higher for paper responses completed by proxies. Finally, the study found that facility-provided administrative records were not used for the majority of GQ responses: they were used for only 9 percent of paper responses and 32 percent of CAPI responses.

Belton summarized that the findings suggest that very few institutional GQ respondents self-respond using paper. Even the eight questionnaire items proposed for removal from a paper version had relatively low missing data rates, suggesting to her that respondents were not burdened by the extra questionnaire items.

Developing and implementing a GQ-specific paper questionnaire would present operational issues related to the additional workload of assembling, distributing, and controlling paper questionnaires, she noted. She acknowledged that any changes in processing and field data collection procedures would need to be thoroughly tested.

Based on this review, the staff recommended that the Census Bureau not create a GQ-specific questionnaire for institutional GQ types. The study also suggested considering an Internet response option for students in college dorms, residents of military barracks, and perhaps some residents of group homes.

CENSUS SCIENTIFIC ADVISORY COMMITTEE FINDINGS

Barbara Anderson's comments drew from her work on the Census Scientific Advisory Committee Working Group on Group Quarters in the ACS. She, Robert Hummer (University of North Carolina), and Irma Elo (University of Pennsylvania) constitute this working group, which developed several suggestions for consideration:

  • Make the Internet version of the ACS available to noninstitutional GQ residents, especially in college dorms, military barracks, and group homes. Of the noninstitutional GQ respondents, 79 percent are college students or military personnel, very computer-savvy groups. Allowing noninstitutional GQ respondents to answer on the Internet should lower costs and improve data quality. Noninstitutional GQ respondents should be treated the same way as non-GQ respondents, which would substantially reduce the problems with obtaining data on GQ respondents, cutting them roughly in half.
  • Ask institutional GQ respondents only a short list of items that can be filled out by administrators. For institutional GQs, the working group suggested collecting a very short set of items: age, sex, race, ethnicity, and educational attainment. The advantage is that these data could be obtained entirely from administrative sources. This option is attractive because the per-respondent cost of collecting GQ data is far higher than for non-GQ data, and it could eliminate the need for a paper form for institutional GQs.
  • Flag imputed cases and values in the ACS Public Use Microdata Samples (PUMS). The ACS PUMS data should have an imputation flag for an entire imputed GQ respondent and also for specific imputed variables. As of now, there is no indication in the PUMS data whether particular variables, or even the whole case, were imputed from another source. Furthermore, if only a small set of items is collected for institutional GQ respondents, imputation strategies need to be rethought.
  • Include more information in the ACS PUMS file on GQ type beyond the institutional/noninstitutional divide, currently the only breakdown available. A variable with a more detailed breakdown of GQ type in the ACS PUMS data would make these data more useful to users.

Anderson discussed the needs of the two main user communities for GQ data: municipalities, which are mainly concerned with average values and distributions, and researchers, who want to build and run multivariate models using the PUMS data. Her perception is that there has been overwhelmingly more concern for the needs of municipalities than for those of PUMS data users. She pointed out the value of PUMS data analysis to the scholarly community, government, and policy.


NATIONAL NURSING HOME SURVEY EXPERIENCE

Lauren Harris-Kojetin said that she based her remarks on her experience at the National Center for Health Statistics (NCHS) Division of Healthcare Surveys, where she heads up the long-term care statistics program, obtaining data and producing national and state estimates on nursing facilities and residents. Nursing facilities are either the second or third largest set of group quarters in the ACS, she noted.

Harris-Kojetin stated that NCHS conducted the National Nursing Home Survey seven times between the 1970s and 2004. Like the ACS, the National Nursing Home Survey used in-person interviews to collect information on up to 10 sampled residents: field representatives went to a nursing facility at a scheduled appointment time, administered the questionnaire, and, while onsite, worked with the nursing facility respondent (typically the administrator) to sample up to 10 residents. In contrast to the ACS, the field representatives usually worked with the administrator or designated staff to complete the questionnaire for each sampled resident; residents themselves were never interviewed, nor was there a need for a proxy such as a relative.

Typically, the National Nursing Home Survey interviewers made only one visit to each reporting facility to complete all of the data collection for both the facility and the 10 sampled residents, although the interviewer may have spent several hours at the site. Harris-Kojetin posed the question of whether it is more burdensome to have one visit of several hours or multiple shorter visits, with the nursing facility contact person having to coordinate with resident family members serving as proxies and to meet with the field representative.

Prior to 2004, the National Nursing Home Survey used paper; CAPI was introduced in 2004. Starting in 2012, NCHS replaced the National Nursing Home Survey and its other existing long-term care provider surveys with the biennial National Study of Long-Term Care Providers, which covers the supply, use, and characteristics of five major sectors of paid, regulated long-term care, including nursing facilities. For the nursing facilities sector, NCHS now uses only administrative data from the Centers for Medicare & Medicaid Services (CMS).

The conversion to administrative data was made largely with the aim of lowering costs, but the effect has also been to reduce or eliminate burden for the nursing facilities. Harris-Kojetin suggested that the Census Bureau explore the feasibility of using administrative data maintained by CMS as an alternative to survey data collection for nursing facility group quarters
in the ACS, specifically the Minimum Data Set (MDS) 3.0.¹ Part of this exploration by the Census Bureau could include whether the actual MDS is needed or whether a more processed, user-friendly version of the MDS data would suffice, such as the MDS Active Resident Episode Table, which NCHS uses.

Harris-Kojetin assessed that there is considerable overlap between the ACS Group Quarters questionnaire items and the MDS items. She pointed to overlap in demographic characteristics (name, gender, date of birth, race/ethnicity, and marital status), in health insurance items such as Medicare and Medicaid coverage, and in items such as language spoken, ability to hear or see, short- and long-term memory issues, and ability to walk or need of assistance with walking or dressing. Many of the GQ items collected in the ACS are not available from the MDS, she commented, so using MDS administrative data to completely replace survey data collection for nursing facilities in the ACS would require development of a much shorter version of the GQ item set for nursing facilities.

There would be other benefits from using the administrative data from the MDS, she said. The universe of Medicare- and Medicaid-certified nursing facilities would be represented, as well as the universe of residents in those facilities, rather than just a sample as in the ACS now. By collecting from the universe rather than a sample, the ACS could address some of its GQ-related small-area estimation issues, at least for the variables that are comparable between the administrative data and the ACS. Another benefit is that using administrative data such as the MDS would alleviate respondent burden on nursing facility staff, residents, and resident family proxies.

Harris-Kojetin presented other scenarios for the ACS beyond substantially shortening the ACS GQ item set for nursing facilities by using the MDS information or completely substituting administrative data for survey data collection. For instance, under the complete substitution scenario, other administrative data sources could be drawn on, such as Department of Veterans Affairs data on military service and service-connected disability and Social Security Administration information on work and income. Using these other data sources would require that a unique identifier, such as the Social Security number, be available across the administrative datasets and the ACS. Another scenario, rather than complete substitution for all sampled nursing facility residents, would be to use administrative data sources after data collection for cases with specific survey items that historically have high missing rates.

___________________

1 MDS 3.0 is an assessment conducted by nursing homes at regular intervals on every resident of a Medicare- or Medicaid-certified nursing home. Because almost 99 percent of all nursing homes in the United States are Medicare- or Medicaid-certified, the MDS covers nearly all of them. The MDS collects demographic, functional, and clinical characteristics of residents.


Harris-Kojetin commented on other aspects of the current ACS GQ design for nursing facilities. She stated that some of the alternative versions of the paper questionnaire being considered to enhance the ACS GQ design appear less relevant to nursing facilities. In the 2014 ACS, 99 percent of nursing facility responses were submitted through CAPI, so she concluded that it would not be worthwhile to create another questionnaire and deal with the logistics and costs of an additional paper form for 1 percent of nursing facilities. She also addressed the use of proxy respondents in the resident questionnaire response process in nursing facilities. According to the 2014 National Study of Long-Term Care Providers, half of nursing home residents have Alzheimer's disease or other dementias, which likely explains why proxies complete about one-half of ACS nursing facility resident questionnaires. It would be useful, she said, to gain further understanding of the ACS Group Quarters Resident Questionnaire completion process in nursing facilities, particularly the quality of data under the three main scenarios: the resident self-completes, the questionnaire is completed by a proxy who is a relative of the selected respondent, or the survey is completed by a proxy who is a nursing facility staff member.

Finally, she discussed potential uses of CMS's Nursing Home Compare website (data.medicare.gov), a publicly available site that lists all Medicare- and Medicaid-certified nursing homes and provides each facility's name, address, phone number, location, bed size, current number of residents, and number of certified beds. She suggested Nursing Home Compare as a valuable resource for updating the nursing facility information on the master address file between decennial census years. Further, the website allows users to download the file, including the federal provider number, for each nursing facility.

In summary, she pointed out that NCHS has used administrative data in two ways: before 2012, to substitute for personal collection from selected respondents by using records maintained by the nursing facility, and starting in 2012, to avoid going to a nursing facility entirely by using administrative data from another federal agency.

ACS CONSTRAINTS RELATED TO GROUP QUARTERS

Andy Peytchev presented and discussed a list of constraints under which the ACS operates that shape the environment for collecting, processing, estimating, analyzing, and publishing data on group quarters. A key constraint is the need to continue collecting the same data items that the GQ questionnaires currently collect. He commented that reducing burden will involve design changes, and every design change involves tradeoffs. In the ACS, the tradeoffs are complex because of the multiple components of the program. He offered a conceptualization of the relationship between burden, variance, bias, cost, and other quality dimensions.

The relationships are complex. For example, within bias and variance, there are nonresponse, coverage, and measurement effects. All these sources of error need to be measured and balanced against other changes in the survey, because any time an intervention or a change in protocol is made to affect burden, at least some of the other components will be affected.
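
One standard way to formalize the tradeoff Peytchev described, offered here only as a common textbook framing rather than his specific formulation, is the mean squared error of a survey estimate, which combines the bias and variance components that a burden-reducing design change may push in opposite directions:

$$\mathrm{MSE}(\hat{\theta}) = \mathrm{Bias}(\hat{\theta})^{2} + \mathrm{Var}(\hat{\theta}),$$

where nonresponse, coverage, and measurement errors contribute to the bias term, and sample size and design effects drive the variance term.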

He said the Census Bureau could use two different paper forms to minimize unnecessary questions; another alternative would be to embed additional skip logic. He stated that dropping the paper-and-pencil interviewing (PAPI) instrument altogether seemed like a radical choice, but a more feasible alternative would be to implement a Web option in place of most of the PAPI. Alternatively, the paper instrument could be employed in an even more limited and targeted manner. Targeting would require understanding what burden means, to whom the burden accrues, and, once the data are collected, from whom the suspect information (in terms of bias and variance) is obtained.

Peytchev noted that the proposed paper instrument contains 48 labeled items and, counting the subquestions, totals about 80 questions. The proposed GQ survey would be about half the length of the household instrument, and the items appear to be simpler to answer. In terms of Norman Bradburn's framework, the proposed paper survey is simpler in at least three of the four dimensions of burden. However, in some facilities one person would have to answer for everybody else; if a single respondent answers roughly 80 questions for each of 10 or 20 sampled residents, the individual burden could grow to 800 or 1,600 questions.

Based on his assessment, he suggested the following:

  • Consider limiting the use of the paper instrument, whether only to self-administration, to specific types of facilities, or some combination of the two, as a short-term solution to this aspect of burden.
  • Consider reevaluating sampling for some GQ types, for example, increasing the facility sampling rates and decreasing the within-facility sampling rates to reduce burden; a rough numerical illustration follows this list. (Currently, the sampling rate within selected facilities with 10 or fewer residents is 100 percent.)
  • Evaluate the impact on the survey estimates. It is important to be cognizant of the implications of burden on the properties of the survey data that are being collected.
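
As a rough illustration of the sampling tradeoff in the second suggestion above, with purely hypothetical numbers rather than actual ACS design parameters, a design can hold the expected number of completed resident interviews fixed while spreading the sample over more facilities, so that fewer residents are selected in any one facility:

```python
def expected_person_sample(facilities_sampled, avg_residents_per_facility, within_facility_rate):
    """Expected number of sampled residents under a simple two-stage design
    (illustrative calculation only; not an ACS design formula)."""
    return facilities_sampled * avg_residents_per_facility * within_facility_rate

# Baseline: fewer facilities, all residents selected within each small facility.
print(expected_person_sample(1000, 10, 1.0))   # -> 10000.0 expected resident interviews
# Alternative: twice as many facilities at half the within-facility rate yields the
# same expected person sample but roughly half the interviews per facility.
print(expected_person_sample(2000, 10, 0.5))   # -> 10000.0 expected resident interviews
```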

ADDITIONAL PERSPECTIVES ON GROUP QUARTERS

Michael Brick summarized his perception of some of the key suggestions related to GQs:

  • Split off the institutional from the noninstitutional GQs, with a much smaller set of questions relevant to each. The remaining questions could be filled in from administrative records.
  • For institutional GQs, eliminate the paper instrument and the default CAPI interview, importing administrative records instead and using CAPI only where administrative data are not available.

On the noninstitutional side, he supported the Internet option particularly for college student facilities and military barracks. He further suggested that the Census Bureau:

  • Limit the number of times that field representatives go back to the same facility over the year.
  • Ensure the questionnaire makes sense to the intended respondents.
  • Address the issue of possible double burden for college students, that is, the burden that arises when parents report a student living in a dorm on their household questionnaire and the student's information is also collected in the GQ survey.

Colm O’Muircheartaigh praised the Census Bureau and characterized the ACS as a triumph, given where it started and what it has become. He advocated eliminating the term “group quarters,” which he called “an almost meaningless term.” To him, the term has led to conflicts in dealing with an extraordinarily heterogeneous collection of arrangements under one heading. Instead, he suggested the principle of stratification, which is one of the key means by which the survey can treat different parts of the population differently. It is important, he said, to think about the challenges of data collection, not about labeling entities under one term that has nothing to do with how they should be approached.

DISCUSSION

Connie Citro commented on the history of the treatment of group quarters in the ACS. Most household surveys cover only the noninstitutional population because the institutional population is only 3 percent of the U.S. population, has been fairly steady over the past few decades, and is hard to count. The census mandate to collect information about everybody, not just the civilian noninstitutionalized population, was carried over to the ACS. The 2010 NRC panel considered the appropriateness of the GQ survey but determined that users want the information. To some extent, she noted, users come to this conclusion without understanding that the ACS does not provide detailed data about group quarters. It provides some state totals by type of GQ but no detail for the GQ population on characteristics such as
education, health insurance, and disabilities. Furthermore, for small areas, the information can distort characteristics because the GQ population is very different from the rest of the population in an area.

A participant asked about the treatment of group quarters in coverage measurement, noting that it is important to recognize that unrelated people in group quarters should mostly be treated separately. Dolson replied that a count of the number of unrelated people in an area is important, but the concept is not clean. For example, universities and private apartment complexes offer individual leases for group quarters, making the count of unrelated individuals at an address unreliable. Belton agreed that this is an important issue and reported that the Census Bureau is working on a definition that encompasses these new arrangements. O'Muircheartaigh suggested that people in these arrangements should be classified as living in apartments. Brick concurred, adding that people in these arrangements are part of the noninstitutionalized population and, as such, should be given the noninstitutional questionnaire to complete.

Belton added that assisted living facilities raise some of the same issues. They are classified as housing units, but some have a floor or a wing with continuous skilled nursing care, and according to the ACS definition those units are group quarters.

A participant asked how the Census Bureau would conduct sampling at GQs if, as NCHS has done, administrative records became the primary means of obtaining information from nursing facilities and field representatives no longer physically went to the locations. Harris-Kojetin responded that when NCHS transitioned to using only administrative data from CMS for the nursing home sector, sampling was no longer required: the universe of residents at Medicare- and Medicaid-certified nursing facilities was obtained from administrative records. New issues did present themselves, however, such as the reference period and how often the information is updated. For example, the MDS information on the website is updated quarterly, whereas the ACS GQ approach draws a separate sample every month.

Salvo asked about the classification of multiple-use GQs. With the aging of the population and the complexity of some assisted living arrangements, step-up arrangements, in which part of the facility is a nursing home, part assisted living, and part independent apartments, are becoming more popular, he observed. These arrangements are difficult to disentangle from an address standpoint. It may mean the Census Bureau has to collect the data internally and then decide how to categorize them, he suggested.

O'Muircheartaigh added that the issue of estimation was one of the debates when the ACS was introduced. The position of the Census Bureau and others in the demographic community was that the ACS could not be matched with the census, because the ACS collects these data on an ongoing basis throughout the year while the census is clearly defined as of the first of April, and this would lead to confusion. He advocated, however, making decisions on the basis of the information that is to be produced or estimated and then collecting data that make it possible to estimate that information.
