5 Data Collection
Pages 86-106

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 86...
... In this chapter, several aspects of the survey operation that can contribute to maximizing data quality and controlling measurement error are addressed.
From page 87...
... interviewers are managed through a network of 46 NASDA state field offices -- some of which represent multiple states. A total of 43 of the field offices have responsibility for conducting ARMS data collection (those in Alaska, Hawaii, and Puerto Rico do not).
From page 88...
... QUALITY ASSURANCE

In this decentralized survey operation, NASS imposes quality measures and monitors the survey process to maintain the quality of the ultimate data from ARMS. The quality control methods used include recruiting and training, sample case control procedures, monitoring of interviewers, and data review.
From page 89...
... They typically have an agricultural background and in fact often come from a farm household. The NASDA interviewers are a somewhat diverse group.
From page 90...
... government surveys, this situation would be systematically avoided. There are concerns that respondents may be inhibited about speaking forthrightly about personal information in front of someone known to them, or that interviewers may not strictly observe the survey protocol in such situations.
From page 91...
... The potential behavior of interviewers in dealing with item nonresponse is also a concern. It is believed that interviewers sometimes work intensively with respondents to obtain answers.
From page 92...
... This departure from strict standardization does not necessarily compromise data quality and may actually be appropriate for collecting ARMS data. But to our knowledge there is no evidence that directly bears on the impact of nonstandardized interviewing in ARMS.
From page 93...
... However, we just don't know for sure. For this reason, the ongoing research and evaluation program we recommend should systematically explore the ways that different interviewing practices may affect ARMS data.
From page 94...
... Role of the Reinterview in Data Accuracy

The panel considers the control and measurement of data accuracy to be major issues for the ARMS data, especially for the cost-of-production and farm income figures. Some control and measurement methods are employed, and others, found useful in other settings and in prior incarnations of ARMS, are not.
From page 95...
... These surveys not only identified conceptual difficulties that could be remedied by changes in questions or in interviewer instructions and training, but also had the more practical application of informing the board estimation process with measures of response bias. Although expensive in terms of statistician and interviewer time and additional burden on the part of respondents, these formal reinterviews were considered a success and formed the basis for a panoply of recommendations for a more robust reinterview program in the future.
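The response-bias measures that a formal reinterview program yields can be illustrated with the standard net and gross difference rates: the signed mean difference indicates systematic bias, while the mean absolute difference indicates simple response variability. This is a generic sketch of those textbook statistics, not necessarily the measures NASS computed; the function name is invented.

```python
from typing import Sequence


def reinterview_stats(original: Sequence[float],
                      reinterview: Sequence[float]) -> tuple[float, float]:
    """Return (net difference rate, gross difference rate) for paired answers.

    The net rate is the signed mean of (reinterview - original), a simple
    indicator of response bias; the gross rate is the mean absolute
    difference, an indicator of response variability.
    """
    diffs = [r - o for o, r in zip(original, reinterview)]
    n = len(diffs)
    net = sum(diffs) / n
    gross = sum(abs(d) for d in diffs) / n
    return net, gross
```

A net rate near zero with a large gross rate would suggest noisy but unbiased reporting; a nonzero net rate would feed directly into the board estimation process as a bias adjustment.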
From page 96...
... Research will commence on the development of a mail instrument for the full ARMS economic version (ARMS Phase III cost and returns), which to date has been collected through face-to-face interviews.
From page 97...
... In that regard, development is currently under way for a web-based version of the ARMS economic phase (Phase III)
From page 98...
... has indicated it would like to accomplish in the ARDIS initiative (see Chapter 2), dependent interviewing is a good way to detect and eliminate spurious data changes.
From page 99...
... However, the extremely low levels of change after dependent interviewing was introduced may lie below true levels of change, reflecting respondents' recognition that when asked if a change has occurred, reporting no change will lead to the shortest interview because there will be no follow-up questions about the new job.
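The incentive problem described above follows from how dependent interviewing routes follow-up questions. The sketch below is a minimal illustration of that routing logic, with invented function and question wording; it is not an ARMS instrument specification.

```python
def followups_for_item(prior_answer: str, reports_change: bool) -> list[str]:
    """Return the follow-up questions triggered for one dependently-asked item.

    The prior-wave answer is fed forward; confirming "no change" ends the
    item immediately, which is the shortest path through the instrument and
    the source of the underreporting incentive discussed in the text.
    """
    if not reports_change:
        # No follow-ups at all when the fed-forward value is confirmed.
        return []
    # A reported change opens a battery of detail questions.
    return [
        f"You previously reported '{prior_answer}'. What is the current value?",
        "When did this change take place?",
    ]
```

Because the "no change" branch is always strictly shorter, any evaluation of dependent interviewing needs an external benchmark for the true rate of change.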
From page 100...
... The experience of ARMS with self-administered interviews is largely concentrated in the short version of the Phase III interview. A systematic investigation of possible mode effects in that questionnaire version should be a high priority, and it should certainly take place before considering more intensive web-based data collection.
From page 101...
... CAPI and web-based data collection will provide opportunities to increase timeliness, improve data quality, reduce cost, and obtain important paradata.

Electronic Devices in Data Collection

NASS has recently experimented with using electronic devices in personal interviews, such as for locating sample points with GPS devices in Washington State and collecting cotton yield objective survey data in North Carolina.
From page 102...
... would allow the direct uploading of a farming operation's financial records to a USDA database. This approach should be seriously explored as an alternative to conventional modes of self-reporting for Phase III data collection.
From page 103...
... DATA CAPTURE, EDITING, AND PROCESSING ARMS employs a multilayered process of data capture, editing, and processing. Interviewers perform an initial review of their interviews with the goal of correcting errors; a systematic review of the data occurs in the field offices; keyed data at data entry points is carefully monitored; NASS data review happens simultaneously with the field office review; and an outlier board with representation from both NASS and ERS reviews outliers.
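The layered review process described above can be pictured as a sequence of stages, each of which may correct a record or attach flags before the next stage sees it. The following is a schematic sketch under that assumption; the stage names echo the text, but the check itself and all identifiers are invented placeholders, not NASS edit rules.

```python
from typing import Callable

# A stage takes a record and returns (possibly corrected record, flags).
Stage = Callable[[dict], tuple[dict, list[str]]]


def run_review_pipeline(record: dict,
                        stages: list[tuple[str, Stage]]) -> tuple[dict, list[str]]:
    """Apply each review stage in order, accumulating labeled flags."""
    flags: list[str] = []
    for name, stage in stages:
        record, new_flags = stage(record)
        flags.extend(f"{name}: {f}" for f in new_flags)
    return record, flags


def range_check(record: dict) -> tuple[dict, list[str]]:
    """Placeholder edit: blank an impossible acreage value for later review."""
    flags: list[str] = []
    if record.get("acres") is not None and record["acres"] < 0:
        record = {**record, "acres": None}
        flags.append("negative acreage blanked")
    return record, flags
```

In the process described in the text, the analogous stages would be interviewer review, field office review, data-entry monitoring, NASS data review, and finally the joint NASS-ERS outlier board.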
From page 104...
... The wider categories of paradata can include aspects of individual interviewers' speech, such as whether they read the question exactly as intended and whether they probed for more information; indications of respondent effort or uncertainty, such as response latency or changes to initial answers; indications of the use of auxiliary information by respondents, such as administrative records; case history information on all attempts to interview each respondent; an indication of the mode of data collection; information on imputation; information about interviewer training and support; cognitive evaluations of survey questions; computer routines for data processing and imputation; and other systematic processes affecting the final data.
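One way to make these paradata categories concrete is as a per-case record structure grouping them. The sketch below is purely illustrative; every field name is an assumption chosen to mirror the categories listed above, not a NASS schema.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ParadataRecord:
    """Paradata for one sampled case; all field names are illustrative."""
    question_read_as_worded: Optional[bool] = None   # interviewer speech
    probed_for_detail: Optional[bool] = None         # interviewer speech
    response_latency_sec: Optional[float] = None     # respondent effort/uncertainty
    changed_initial_answer: bool = False             # respondent uncertainty
    consulted_records: bool = False                  # auxiliary info, e.g. farm records
    contact_attempts: list = field(default_factory=list)  # case history
    mode: str = "face-to-face"                       # mode of data collection
    imputed_items: list = field(default_factory=list)     # imputation info
```

Storing such records alongside the survey data is what makes the later uses of paradata (interviewer monitoring, nonresponse analysis) possible.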
From page 105...
... The ARMS data page (http://www.ers.usda.gov/Data/ARMS/)
From page 106...
... Systematic collection and organization of such data on attempted contacts with respondents, together with relevant data on interviewer, respondent, and neighborhood characteristics, are particularly important for use in understanding and potentially improving the methods of case administration as well as in understanding nonresponse and detecting nonresponse bias. In light of the relatively high nonresponse rate in ARMS, making such data available should have a high priority.
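A simple illustration of how such contact-attempt and frame data support nonresponse-bias detection: when a characteristic (say, farm size) is known for every sampled case, a gap in its mean between respondents and nonrespondents signals that the respondent pool is unrepresentative on that dimension. This is a generic screening sketch with invented names, not an ARMS procedure.

```python
def nonresponse_gap(cases: list[tuple[bool, float]]) -> tuple[float, float]:
    """Given (responded, frame_value) pairs for all sampled cases, return
    (response rate, respondent-mean minus nonrespondent-mean).

    A gap far from zero on a relevant frame variable is a warning sign of
    nonresponse bias in estimates correlated with that variable.
    """
    resp = [v for responded, v in cases if responded]
    nonresp = [v for responded, v in cases if not responded]
    rate = len(resp) / len(cases)
    gap = sum(resp) / len(resp) - sum(nonresp) / len(nonresp)
    return rate, gap
```

In practice this comparison would be run on several frame variables at once, and a persistent gap would motivate nonresponse adjustments or targeted follow-up.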
