5

Asking Criminal Justice Involvement Questions

Household surveys can serve as a vehicle for collecting information on the experiences of incarcerated people and for measuring the effects of criminal justice contact on individuals, families, and communities. Because criminal justice involvement questions are sensitive, they can affect survey response rates, and it is a challenge to craft such questions so that they obtain the desired information without offending respondents. The workshop session covered in this chapter responds to the objective of developing ideas for potential survey questions that could be included in the household surveys of the U.S. Department of Health and Human Services. The goal was to provide examples of how sensitive questions are asked in surveys and to discuss the potential impact they have on response rates. Both presenters in the session stressed the importance of cognitive testing and field testing sensitive questions before adding them to a survey. Respondents with varied experiences approach the same question differently; cognitive interviews can reveal those nuanced differences, which will improve the context setting and framing of questions. Field tests provide an opportunity for interviewers to debrief respondents about content and ensure they understood what was being asked.

ASKING SENSITIVE QUESTIONS

Ting Yan (Westat) opened her presentation with an example of different ways in which very sensitive questions can be presented to respondents. She referred to a classic paper by Allen H. Barton (1958) on how to ask embarrassing questions in non-embarrassing ways. For his study, Barton provided the following variations of the question, “Did you kill your wife?” as examples of strategies a survey might use to generate responses to difficult-to-answer questions:

  • The casual approach: “Do you happen to have murdered your wife?”
  • The numbered card approach: “Would you please read off the number on this card which corresponds to what became of your wife?”
  • The “everybody” approach: “As you know, many people have been killing their wives these days. Do you happen to have killed yours?”
  • The “other people” approach: “Do you know any people who have murdered their wives? How about yourself?”
  • The sealed ballot approach: In an effort to preserve anonymity, the respondent is asked to place his or her response in a sealed ballot.

Yan said that questions are most often considered sensitive for one of three reasons: (1) they are deemed intrusive, asking people about details respondents feel are too personal to answer; (2) they cause concern for respondents about the threat of disclosure and repercussions if they respond truthfully; or (3) they raise social desirability concerns as respondents want their answers to reflect societal norms, which leads to overreporting of socially desirable answers and underreporting of socially undesirable answers.

Sensitive questions can be categorized into two types: those that are sensitive to ask and those that are sensitive to answer. Questions that are sensitive to ask are generally sensitive for all people and driven by content; examples include questions on income, sexual partners, and Social Security numbers. Irrespective of their wording and placement, such questions evoke the same level of sensitivity concern among respondents. Questions that are sensitive to answer are perceived as sensitive only by some respondents because of their specific behaviors or attitudes; examples include questions on marijuana use and voting preferences.

Inclusion of sensitive questions in a survey can often result in a low participation rate, more missing data, and high measurement error. People, particularly those with undesirable behaviors or attitudes, are less likely to participate in surveys on sensitive topics (Tourangeau et al., 2010). Even when people agree to participate in such a survey, they are more likely not to answer the sensitive questions. For example, Yan and colleagues (2010) showed that income questions have item nonresponse rates of 25 percent on surveys. Moreover, when responses to sensitive questions are provided, they frequently contain measurement error from the usual sources (comprehension, retrieval, judgment), which is further compounded by self-editing done by respondents because of the nature of the question.

Yan discussed her research findings on items deemed sensitive by survey respondents. She and her coauthors surveyed University of Maryland alumni about various aspects of academic performance. Disaggregating responses by the characteristics of sample members, the researchers found that people who graduated with honors were more likely to participate in the survey than those who did not graduate. On questions about students’ status at graduation, respondents overreported graduating with honors and underreported dropping classes. However, the researchers found grade point average to be a more neutral item, showing very little bias.

Yan referred to the process of providing an inaccurate response as “editing” and said that editing is a deliberate process, even though it sometimes happens automatically or subconsciously. She described the characteristics of respondents who are more likely to edit their responses: those with something to hide, those with greater social desirability concerns, and those who perceive survey items as posing a higher level of threat or loss. For instance, the literature on voting has shown that educated people are more likely to overreport having voted, as voting is an expected behavior in their particular socioeconomic group (see, e.g., Preisendörfer and Wolter, 2014). In asking teenagers about smoking, Aquilino and colleagues (2000) found that they were more likely to respond affirmatively when interviewed in school settings than when interviewed in home settings, confirming that the presence of certain people encourages self-editing of answers. Also, Yan said, the more sensitive the question, the higher the likelihood of editing; Hindelang and McDermott (1981) found evidence of this phenomenon when measuring engagement in delinquent or criminal behavior.

Yan noted that a question on criminal justice involvement, irrespective of how it is defined, worded, and placed, is sensitive on all three counts noted above: intrusiveness, disclosure threat, and negative social stigma. In one study of criminal justice involvement, Preisendörfer and Wolter (2014) found some evidence of reporting bias in a mail questionnaire sent to people with some criminal history (criminal records served as the sampling frame). The authors found that women, older people, people with relatively higher education, people with higher social desirability concerns (measured through the survey), and people with more severe offenses were more likely to underreport their criminal histories. Underreporting was also found among people who responded late to the survey and those who were interviewed by less experienced interviewers. Overall, Yan said, criminal justice-involved populations are difficult to capture in household surveys because they are hard to find, and even when they are found, they are very likely not to report completely truthfully.

Yan next turned to strategies to reduce underreporting of socially undesirable behaviors or attitudes: use of self-administered modes; question wording; question format; and indirect techniques.

Use of Self-Administered Modes Self-administered interviews reduce social desirability concerns and improve reporting. The National Survey of Family Growth (NSFG) and the National Survey on Drug Use and Health begin with interviewer-administered computer-assisted interviewing but transition respondents to audio computer-assisted self-interviewing (ACASI) for the more sensitive items. In the NSFG, women were more likely to report abortions in the self-administered portion than in the personal interview. Table 5-1 details the empirical research that has compared self-administered and interviewer-administered survey modes. The fourth column of the table is the ratio of the estimate of the variable of interest (second column) between the two modes; a ratio greater than 1.0 indicates more reporting in the self-administered mode. There is strong evidence that switching to a self-administered mode leads to an increase in reporting of socially undesirable behavior, and the result holds across modes.
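To make the ratio concrete, the short sketch below computes the proportion reporting a sensitive behavior under each mode and takes their ratio, mirroring the fourth column of Table 5-1. This is not material from the workshop; the function name and the yes/no data are hypothetical.

```python
def reporting_ratio(self_admin_responses, interviewer_responses):
    """Ratio of the proportion reporting a sensitive behavior under a
    self-administered mode to the proportion under an interviewer-administered
    mode. A value above 1.0 means more reporting when self-administered."""
    p_self = sum(self_admin_responses) / len(self_admin_responses)
    p_interviewer = sum(interviewer_responses) / len(interviewer_responses)
    return p_self / p_interviewer

# Hypothetical yes (1) / no (0) answers to the same question collected under two modes
acasi = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]          # 50 percent report the behavior
face_to_face = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]   # 30 percent report the behavior
print(round(reporting_ratio(acasi, face_to_face), 2))  # 1.67
```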

Question Wording Questions on criminal justice contact can be framed in a manner that encourages people to admit that they have had such an experience. An example is to use a forgiving introduction. To see its effect on reporting of academic cheating, vandalism, shoplifting, littering, illegal drug use, and drinking and driving, Holtgraves and colleagues (1997) used the following forgiving introduction: “Almost everyone has probably committed vandalism at one time or another.” The forgiving introduction improved reporting of academic cheating, illegal drug use, drinking and driving, vandalism, and littering; it did not change the reporting for shoplifting. Yan said that setting up a permissive context can make it easier for people to respond truthfully in some cases; therefore, careful pretesting is required to frame questions appropriately.

Three other question-wording strategies can be useful. One is to use familiar words instead of abstract terminology: “arrested” rather than “criminal justice involvement.” Another is to pose the question so that it presupposes the behavior: “In the past 10 years, how many times did you witness a crime?” rather than “Did you witness a crime in the past 10 years?” The third is to ask about past behavior before current behavior, as past behavior can be perceived as less sensitive than current behavior: ask “have you ever” before asking “are you currently.”

TABLE 5-1 Comparison of Self-Administration and Interviewer Administration on Estimates of Variables of Interest in Criminal Justice

Study Source | Topic | Modes Compared | Ratio of Estimates
Preisendörfer and Wolter (2014) | Ever convicted | Mail vs. face to face | 1.16
Lind et al. (2013) | Number of sexual partners | ACASI vs. face to face | 1.93
Villarroel et al. (2006) | Same-gender sexual experience | IVR vs. CATI | 1.56
Schober et al. (2015) | Ever smoked 100 cigarettes | Texting vs. CATI | 1.08
Corkrey and Parkinson (2002) | Ever use marijuana | IVR vs. CATI | 1.26
Tourangeau and Smith (1996) | Ever use cocaine | ACASI vs. face to face | 1.81
Tourangeau and Smith (1996) | Ever use marijuana | ACASI vs. face to face | 1.48

NOTES: ACASI, audio computer-assisted self-interview; CATI, computer-assisted telephone interview; IVR, interactive voice response.

SOURCE: Yan (2016).

Question Format Response format can aid respondents’ retrieval of memory, Yan said. For example, when respondents are asked about the number of times something happened or the frequency of occurrence, use of a response scale with specific high-frequency categories (0, 1-4, 5-9, 10-59, 60-99, 100 or more) generates more reporting than an open-ended format. Such a scale leads respondents to infer the typical answer and to place their own answer relative to it, which improves the chances of getting a truthful answer because respondents believe they are doing better than average.

Indirect Techniques Indirect techniques that can help in estimating the prevalence of sensitive characteristics in a population include randomized response techniques, crosswise models, and item count techniques.
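Yan did not walk through the mechanics of these designs, but a brief sketch may help show how two of them recover a population-level estimate without any individual directly admitting the behavior. The estimators below are standard textbook forms; the function names, design parameters, and toy numbers are illustrative assumptions, not material from the workshop.

```python
from statistics import mean

def forced_response_prevalence(observed_yes_rate, p_truth=0.7, p_forced_yes=0.5):
    """Randomized response (forced-response design).

    Each respondent privately runs a randomizer: with probability p_truth they
    answer the sensitive question truthfully; otherwise they are forced to say
    "yes" with probability p_forced_yes. Because
        P(yes) = p_truth * prevalence + (1 - p_truth) * p_forced_yes,
    the true prevalence can be solved for from the observed "yes" rate.
    """
    return (observed_yes_rate - (1 - p_truth) * p_forced_yes) / p_truth

def item_count_prevalence(control_counts, treatment_counts):
    """Item count technique (list experiment).

    The control group reports how many of several innocuous items apply to
    them; the treatment group sees the same list plus the sensitive item.
    The difference in mean counts estimates the prevalence of the sensitive item.
    """
    return mean(treatment_counts) - mean(control_counts)

# Toy illustration with made-up numbers
print(round(forced_response_prevalence(0.32), 3))            # 0.243
control = [1, 2, 0, 1, 2, 1]      # counts of innocuous items that apply
treatment = [2, 2, 1, 1, 3, 2]    # same list plus the sensitive item
print(round(item_count_prevalence(control, treatment), 3))   # 0.667
```

In practice, both designs trade statistical efficiency for privacy: the estimates are noisier than a direct question with the same sample size, so samples must be larger.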

CAPTURING INFORMATION IN A FEW SIMPLE QUESTIONS

David Cantor (Westat) began his presentation by responding to Yan’s point about error. He said that although there is surely error associated with asking the types of questions being discussed, research shows that the error rate for them is not significantly higher than for other questions. He noted that response error is most commonly due to a lack of question comprehension or a problem with recall and that it is possible to significantly reduce total error by carefully designing questions to minimize these common issues.

Cantor offered the example of the San Jose recall study, in which respondents were asked to report instances of victimization, namely rape, assault, and robbery, and the self-reports were then matched to police records (Turner, 1981). Though rape is arguably the most sensitive of the three, the percentage of rapes that were accurately reported (67 percent) fell between the percentages for assaults (48 percent) and robberies (76 percent). Cantor noted that the error rate was likely higher for the assault question because respondents’ comprehension of the term “assault” can vary greatly. He also noted examples from his 1996 survey of criminal offenders showing how incident recall for topics like drug use can diminish greatly over time. Referring to question design advice from Sudman and Bradburn (1982), he offered guidelines on framing questions on sensitive topics, including making questions as specific as possible, using words that virtually all respondents will understand, and lengthening questions by adding memory cues and examples to improve recall. He added that the guidelines can be compromised for particular applications, but specificity is key to ensuring comprehension. For example, when collecting data on sexual assault, it is better to ask whether respondents have had some sort of sexual contact against their will than to simply ask whether they have been victims of assault. Breaking down complex items can help respondents better understand the question and attend to the specific terms. Cantor added that the work of Biderman and colleagues (1985) and Tourangeau and colleagues (2015) has shown that use of examples makes the memory search more specific to the behavior or situation that the researcher is targeting. He noted, however, that although such strategies can prove helpful, the tradeoffs are the additional space required to include examples and the concern that too much specificity can ultimately confuse respondents.
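A reverse record check of the kind used in the San Jose study can be summarized with a simple calculation: for each crime type, the share of incidents known from police records that matched respondents also reported in the survey. The sketch below uses hypothetical identifiers and counts, not the study’s actual data.

```python
# Incidents drawn from police records, keyed by (respondent_id, crime_type),
# and the subset of those incidents that respondents also reported in the survey.
record_incidents = {
    ("r01", "robbery"), ("r02", "robbery"), ("r03", "robbery"), ("r04", "robbery"),
    ("r05", "assault"), ("r06", "assault"), ("r07", "assault"), ("r08", "assault"),
    ("r09", "rape"), ("r10", "rape"), ("r11", "rape"),
}
survey_reports = {
    ("r01", "robbery"), ("r02", "robbery"), ("r03", "robbery"),
    ("r05", "assault"), ("r06", "assault"),
    ("r09", "rape"), ("r10", "rape"),
}

for crime in ("rape", "assault", "robbery"):
    known = {item for item in record_incidents if item[1] == crime}
    reported = known & survey_reports
    print(f"{crime}: {100 * len(reported) / len(known):.0f}% of record incidents reported")
```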

Cantor discussed whether the criminal justice contact questions on various national health surveys followed these guidelines. The Survey of Criminal Justice Experience (SJCE), the NSFG, and the 1997 National Longitudinal Survey of Youth (NLSY) all included a question about supervision that listed the specific levels of interest; for example, the SJCE asked, “Since age 18 have you ever been under any form of criminal justice supervision, including on probation, in jail, or in prison?” The National Longitudinal Study of Adolescent to Adult Health (Add Health), the National Survey on Drug Use and Health (NSDUH), and the NLSY all contain questions about criminal justice contact, but the questions vary greatly:

  • Add Health asks respondents if they have ever been arrested or taken into custody by police. Asking about contact in this manner mitigates the ambiguity often seen when asking respondents what actually constitutes a formal arrest—being picked up by police, being interviewed, being formally booked, etc.
  • The NSDUH includes a definition in the question to aid recall and accuracy: “Not counting minor traffic violations, have you ever been arrested and booked for breaking the law? Being ‘booked’ means that you were taken into custody and processed by the police or by someone connected with the courts, even if you were released.”
  • The NLSY uses an exclusion phrase in the question in an attempt to weed out certain types of offenses: “Have you ever been arrested by the police or taken into custody for an illegal or delinquent offense (do not include arrests for minor traffic violations)?”

Drawing on earlier discussions during the workshop, Cantor outlined the essential informational needs of the research and policy communities and questions that need to be addressed before framing survey questions. He also provided examples of questions to obtain information about criminal justice contact (see Table 5-2).

In conclusion, Cantor said the most important principle is to first decide on the desired content. What is your variable of interest? What are you trying to estimate? Then be sure you are providing the right context of reference for survey respondents.

TABLE 5-2 Possible Survey Questions on Criminal Justice Contact

Information Sought | Possible Survey Questions
Type of contact: police contact, conviction, or supervision

Since you were 18, have you been arrested for breaking the law? Please don’t count minor traffic violations.

Since you were 18, have you been convicted of a crime? Please don’t count minor traffic violations.

Since you were 18, have you been locked up for at least 30 days to punish you for breaking the law?

Timing: recent versus lifetime

Specific calendar periods are often difficult for respondents to remember, as people tend to store life events in their memory by their significance.

Asking about experiences in a specific time frame—the past one or two years, for example—can cause respondents to bring their past experiences forward into the reference period in a phenomenon called telescoping. This can bias the estimates on prevalence. One can instead consider anchoring to specific events such as “most recent time it happened” or “age first occurred.”

Number of times versus time period

Choose between asking people how many times an event happened and how frequently the event happened.

Another level of decision making is whether to leave the question open-ended or to provide a response scale.

Length of time: total, by spell, or longest

Asking respondents how long an incarceration term or spell lasted requires them to do calculation and estimation. As the longest spell is generally easier to remember, the question can ask specifically about that.

Type of crime: specificity

Since you were 18, have you been arrested for any of the following (y/n to each):

  • stealing something
  • breaking into a home or a building
  • assaulting someone with a weapon
  • injuring someone with a weapon or your hands
  • injuring someone with a motor vehicle
  • drunk driving
  • tricking a person into giving you money
  • using someone else’s identity to obtain money
  • anything else?

SOURCE: Cantor (2016).
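As an illustration of how items like those in Table 5-2 might be carried into an analysis file, the sketch below codes the type-of-contact questions as yes/no fields and derives a single any-contact indicator. The field names and the example record are assumptions for illustration, not part of Cantor’s material.

```python
from dataclasses import dataclass

@dataclass
class CriminalJusticeContact:
    """Yes/no answers to the type-of-contact items, asked about the period since age 18."""
    arrested: bool               # arrested for breaking the law (excluding minor traffic violations)
    convicted: bool              # convicted of a crime (excluding minor traffic violations)
    incarcerated_30_days: bool   # locked up for at least 30 days as punishment

    def any_contact(self) -> bool:
        """Derived indicator: any criminal justice contact since age 18."""
        return self.arrested or self.convicted or self.incarcerated_30_days

# Example respondent record
respondent = CriminalJusticeContact(arrested=True, convicted=False, incarcerated_30_days=False)
print(respondent.any_contact())  # True
```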
