
2

Methods and Approach

Chapter 1 presented the committee’s interpretation of the statement of task for the third and final report in this series, which informed the committee’s approach and methodology. This chapter describes that approach, as well as the materials received from the Substance Abuse and Mental Health Services Administration (SAMHSA) to evaluate grantee progress and the limitations of those data.

ASSESSING EFFECTIVENESS: APPROACH

Developing the Approach

For this final report, the committee received additional types of data for the Center for Substance Abuse Prevention (CSAP) programs1 (Improving Access to Overdose Treatment [OD Treatment Access], First Responder Training [FR-CARA]) and updated data for the Center for Substance Abuse Treatment (CSAT) programs (Building Communities of Recovery [BCOR], State Pilot Grant Program for Treatment for Pregnant and Postpartum Women [PPW-PLT]) beyond what it received when preparing the second report. However, a formal evaluation was still not possible for the same reasons outlined in the second report and discussed in Chapter 1; a more detailed discussion of data needs for effectiveness and cost-effectiveness evaluations is presented in Chapter 8 and in Appendix C of this report.

___________________

1 Throughout this report, “program” is used to refer to the four grant programs administered by SAMHSA, and the entities to which the funds were distributed are referred to as “grantees.”


The limitations of the available data (e.g., lack of an experimental design, missing or inadequate pre-post data, aggregation across grantees, lack of comparison groups, and lack of random allocation of grantee project participation) precluded the use of a more rigorous methodological approach. As stated in the second report in this series:

Data were not available across programs on relevant evaluation and implementation constructs from a given framework, nor were measures used that are validated indicators for those constructs—and the committee was not in a position to require specific measures. Although the committee did not adopt a specific framework, general principles from implementation science and evaluation frameworks were used as part of the evaluation. (NASEM, 2021)

For the previous report, the committee reviewed the information it received and then matched the information for each program to each of the “required and allowable activities” laid out in the respective Funding Opportunity Announcements from SAMHSA (NASEM, 2021). Ultimately, the committee found that the reporting materials were not organized around these activities, and that this structure likely did not fully capture the wide variety of activities and efforts undertaken by the CARA grantees. Additionally, for this report, the committee felt that it would be helpful to apply a consistent structure when assessing each of the four programs.

The committee reiterates that, as it had for the second report, it considered using existing, formal implementation frameworks to organize the information it received about grantees, including frameworks for services and research and those that pertain specifically to partnered efforts. In particular, it considered the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2009, 2022), the Exploration, Preparation, Implementation, Sustainment (EPIS) framework (Aarons et al., 2011), and RE-AIM (Glasgow et al., 1999, 2019). As already discussed, some of these frameworks were less relevant to this effort given that the committee could not comment on effectiveness. Moreover, the committee decided that doing justice to these implementation frameworks would have required more information than it had available; for example, the committee did not receive information pertinent to the EPIS model, such as on organizational culture or staff knowledge and attitudes. Because the committee was relying on secondary information and could not use such a framework to structure requests for primary information or to design measures from the outset that would reflect these models, the frameworks would have been of limited utility for organizing the information after the fact.

For this report, the committee developed a logic model around which to organize the grantee information by the various actors involved in the activities of the CARA programs, displayed in Figure 2-1.

FIGURE 2-1 Logic model that the committee used to assess program activities and outcomes.
NOTE: SAMHSA = Substance Abuse and Mental Health Services Administration.

The model was informed, indirectly, by some of the concepts of these established frameworks, but framed within language more relevant to the grantees' activities; for example, the model considered partnership and structural and systems change, elements consistent with the "outer" context of the EPIS model. The committee members then reviewed the materials described below with an eye to the logic model components and tried to understand the activities and outcomes of the grants through this common organizational framework. These findings are laid out in Chapters 3–6.

The Logic Model

On the top level of the model, each box represents a different type of actor involved in carrying out the CARA programs. The arrows represent the flow of funding and/or impact, moving from one actor to the next. Congress wrote the legislation, designated SAMHSA as the authorized agency to execute the programs, and allocated funds to set up the programs. SAMHSA then created Funding Opportunity Announcements, selected grantees, awarded funds to its grantees, and provided oversight. Grantees then used the funding to carry out activities. Often, they partnered with a consortium or network of partners that also carried out similar or complementary activities. Then, following along the arrows, the logic model presents a new row of individuals or groups hypothetically affected by the actions of the actors in the first row. This includes (1) clients, patients, and professionals (e.g., pharmacists and first responders) and (2) the community and broader public. For example, someone with substance use disorder (SUD) received recovery support services, or first responders were trained to administer overdose reversal drugs. These are individual-level outcomes. Decreased stigma around SUD in a community as a result of a community education program is a community-level outcome. The committee included an additional category, "environmental or structural change," to reflect that some grantee activities could have ramifications for long-term change and potentially for the sustainability of this type of programming, or could indirectly affect individual- or community-level outcomes. Although not measurable, these activities could nevertheless be considered important impacts of the grant funding.

Given that it was not possible to conduct an effectiveness evaluation, the committee’s approach to the first part of its task for this report was to understand and describe what actions were taken by the grantees and their partners; what impacts to clients, patients, the community, and public were observed; and what structural or environmental changes might have resulted from the grant funding. In its response to the second part of its task—rather than answering the question of whether the four CARA programs were a cost-effective response to the opioid epidemic—the committee provided general guidance regarding future program evaluations.

ASSESSING CARA PROGRAMS: STREAMS OF INFORMATION

The committee had three major streams of information to draw upon in assessing the four programs: SAMHSA-mandated reporting tool submissions, grantee progress reports, and a commissioned report involving qualitative grantee interviews. Each of these information sources, along with its content and limitations, is presented in this section. All information supplied by SAMHSA is available on request through the National Academies Public Access File,2 and information used directly in the report is publicly available in an online appendix on the National Academies Press website as "additional resources" to the published report.3

The committee focused on materials provided by SAMHSA that highlighted the perspectives and experiences of the implementers of the four CARA programs at the grantee level. Given the limited time frame and funding for this evaluation effort, as well as potential patient privacy concerns and data-sharing limitations, the committee was unable to interview grantee clients.

___________________

2 Public Access File available by request via the National Academies. See https://www.nationalacademies.org/our-work/review-of-specificprograms-in-the-comprehensive-addiction-and-recovery-act (accessed March 6, 2023).

3 See https://nap.edu/26831 (accessed March 9, 2023).


The committee notes that this is an important general limitation of this evaluation; a truly complete understanding of the implementation and impact of the CARA programs would include client perspectives.

1. SAMHSA-Mandated Reporting Tools

As for the previous report, the committee requested grantees' reporting materials from SAMHSA. Between July and September 2022, SAMHSA shared materials with National Academies staff. The committee and SAMHSA made a concerted effort to ensure that, in contrast to the second report, the committee could review all of the information that SAMHSA had available on grantee progress. As a result of these efforts, the committee did have a greater amount of information available to assess than it had for the second report; specifically, as will be described in a later section, it received substantially more information about the CSAP programs and more complete, final information for both the CSAT and CSAP programs. However, because the type of information did not change, most of the prior limitations persisted and are discussed in this section.

As discussed in the second report, SAMHSA stated that because its original agreements with grantees did not indicate that data would be shared outside of SAMHSA, it provided aggregate information or, when material was grantee-specific, redacted information (NASEM, 2021). This largely prevented the committee from using secondary data sources on community characteristics (e.g., community SUD indices, local policies) to contextualize grantee-reported outcomes. It also prevented the committee from tracking grantee-specific progress over time.

The committee notes that it had many exchanges with SAMHSA staff between July and the completion of the report in attempts to clarify the nature of the data that were received. At times, SAMHSA promptly clarified. At other times, SAMHSA staff reported that they were unsure of the cause of inconsistencies or gaps in the data.

The committee would also like to note, as a general limitation and as stated in the second report of this series, that the mandated reporting tools were not always aligned with the goals or activities of grantees. This presented two main issues: first, there may have been information that would have been pertinent to the committee’s review that grantees did not report; second, not all information they did report was pertinent. As such, in the text of the report, the committee does not report on every piece of information that SAMHSA provided. The committee notes that mandatory reporting serves several functions, which are not limited to assessing effectiveness or accomplishments; it may serve SAMHSA project officers, for example, in tracking grantees or identifying opportunities for technical assistance.


There were two mandated reporting tools for the CSAT programs (BCOR and PPW-PLT): the Government Performance and Results Act (GPRA) Data Reports and the grantee progress reports. Both CSAP programs (OD Treatment Access and FR-CARA) had mandated reporting through the Division of State Programs Management Reporting Tool (DSP-MRT); the OD Treatment Access program had an additional reporting tool, the OD Treatment Access Reporting Form, and the FR-CARA program had program-specific questions within one of the modules of the DSP-MRT. The following sections describe each of these tools, their content, how the committee utilized the information each tool elicited from grantees, and the limitations of that information.

Generally speaking, to extract and evaluate information from grantee reporting to SAMHSA, committee members reviewed the materials and identified the range of thematic topics associated with the core categories described in the logic model. To synthesize its findings, the committee reports on those topics that were most frequently4 mentioned by grantees, both within and across the grantees of each program.
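
To make this synthesis step concrete, the sketch below tallies topic mentions by logic-model category and reports the most frequent; the categories, topics, and counts are illustrative inventions, not the committee's actual coding.

```python
# Illustrative sketch of the committee's synthesis step: tally how often
# thematic topics appear in grantee responses within each logic-model
# category, then report the most frequent. All data here are invented.
from collections import Counter

# Hypothetical coded responses: (logic-model category, thematic topic).
coded_responses = [
    ("grantee activities", "naloxone distribution"),
    ("grantee activities", "peer support training"),
    ("grantee activities", "naloxone distribution"),
    ("community outcomes", "reduced stigma"),
    ("community outcomes", "reduced stigma"),
    ("structural change", "new referral partnerships"),
]

# Count topic mentions within each category.
by_category = {}
for category, topic in coded_responses:
    by_category.setdefault(category, Counter())[topic] += 1

# Report the most frequently mentioned topics per category.
for category, counts in by_category.items():
    print(category, "->", counts.most_common(2))
```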

CSAT: Government Performance and Results Act (GPRA) Data Reports

The committee received a suite of data reports generated from client-level data that grantees submitted to SAMHSA through the GPRA reporting tool, an off-the-shelf tool used by many CSAT programs. Grantees submit information to this system on a rolling basis over the course of their funding periods (see Box A-5 in Appendix A for a full description of the tool and its contents).

The reports that the committee requested included (1) participant characteristics at each measurement point (i.e., intake, discharge, and 6-month follow-up); (2) changes in key client outcomes from intake to the last measurement point, whether discharge or 6-month follow-up; and (3) differences between intake sample characteristics and the subsample of participants with complete follow-up data.5

The nature of the data the committee received from the GPRA tool for this report was overall similar to the information received for the previous report, with a few small improvements.

___________________

4 The committee uses terms such as "some" and "many" to describe its findings. Quantification in qualitative research is used "to facilitate pattern recognition or otherwise to extract meaning from qualitative data" (Sandelowski et al., 2009) and does not necessarily have a direct translation to frequencies or percentages. The phrases "few," "some," "many," and "most" are best treated as ordinal categories that fall between "none" and "all." The ordinal category "few" is strongly anchored to rare cases, and the ordinal category "most" always means greater than 50 percent but is more typically anchored closer to "all." "Some" is more than "few," and "many" is less than "most." Translating these ordinal descriptors into numbers is likely to introduce a false sense of precision.

5 Text in this paragraph is reprinted from the second report in this series (NASEM, 2021).


Because grantees were further along in their implementation periods, the data were more up to date. SAMHSA also shared the information in a format that made analysis easier to perform.

Limitations

There were at least three general limitations of the GPRA data for assessing the progress or effectiveness of the BCOR and PPW-PLT grantees.

First, the GPRA data tool was in many ways not well aligned with the goals and activities of these programs. For the BCOR program in particular, participants were often individuals already in recovery who were undergoing training to provide peer support services, rather than patients receiving SUD treatment or recovery services themselves. Many of the questions included in the GPRA tool, as a result, were irrelevant to the goals for engaging with these participants. These issues are described in greater detail in the BCOR and PPW-PLT chapters. For both programs, because the content of the GPRA tool focuses on client-level, clinical data and outcomes, it cannot capture grantee efforts around systems or community-level change, which were major focuses for both the BCOR and PPW-PLT programs.

Second, the data in these reports were typically aggregated at the program level and not attributable to specific grantees. For instance, data reports on past 30-day substance use were available for the BCOR program as a whole rather than at the client level or the grantee level. The committee did receive grantee-level client data from GPRA on the following metrics: (1) number of actual client intakes at each grantee site versus the grantee’s goal for client intakes, and (2) number of 6-month follow-up interviews conducted at each grantee site versus the grantee’s goal for 6-month follow-up interviews.6 As some grantees reported outcomes for patients in treatment while others reported outcomes of trained personnel, the aggregated information may have mixed outcomes for two different types of populations.

The committee reviewed the GPRA datasets for both the PPW-PLT and BCOR grantees but found that much of the information was not useful in assessing progress or effectiveness of grant activities. To the extent that the GPRA data might have been informative for the assessment of select outcomes for the two programs, there were significant confounds that limited the committee’s ability to analyze the data and draw conclusions.

___________________

6 Text in this paragraph was previously printed in the second report in this series (NASEM, 2021). The committee notes, additionally, that the COVID-19 pandemic likely shifted what was feasible in the way of recruitment and follow-up; as such, comparisons to the targets set at the time of grant application may be of limited use.


For example, the committee did not receive GPRA data from all grantees; as such, the data are not reflective of the experiences of all grantees in the two programs, and there may be selection bias in which grantees were included in SAMHSA’s analysis. The committee received data on all 6 grantees for PPW-PLT, but on only 19 of 26 for BCOR. Additionally, the committee did not receive data from every fiscal year for all of the reporting grantees but was not informed of exactly which fiscal years were missing and for which of the reporting grantees. SAMHSA was not able to explain why some grantees were not included.

Among those grantees that did report intake data, follow-up rates varied. From the data provided, the committee was not able to determine the reasons for non-reporting; it was unclear whether participants had dropped out of the grantees’ programming, whether follow-up interviews had simply not been conducted, or whether there was any selection bias in the clients that did or did not have follow-up data. The committee requested from SAMHSA an analysis comparing the intake characteristics of those clients who had only an intake interview versus those with at least one follow-up interview (6-month or discharge). However, ultimately, the committee determined that there were too many confounds in the data to draw adequate inferences about selection bias.
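
The committee did not have client-level data to run such a comparison itself, but a minimal sketch of the requested intake-versus-follow-up check, using invented records and illustrative field names, might look like this:

```python
# Hypothetical sketch of the selection-bias check the committee requested;
# no client-level GPRA data were shared, so all records and field names
# below are invented for illustration.
import pandas as pd

clients = pd.DataFrame({
    "client_id":     [1, 2, 3, 4, 5, 6],
    "age_at_intake": [29, 34, 41, 23, 37, 30],
    "employed":      [0, 1, 0, 0, 1, 1],
    # True if the client had a 6-month follow-up or discharge interview.
    "has_followup":  [True, False, True, True, False, True],
})

# Compare mean intake characteristics of clients with versus without any
# follow-up interview; large gaps would suggest a biased follow-up subsample.
print(clients.groupby("has_followup")[["age_at_intake", "employed"]].mean())
```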

Grantee progress reports and interviews shed light on some implementation challenges of the GPRA tool, which limited follow-up data and thus ultimately hindered the tool's use for evaluation. These challenges are discussed throughout the program-specific chapters.

Finally, the committee lacked additional information that might have contextualized the GPRA outcomes data. This includes control groups; information about expected outcomes/status in a given community; preprogram data; and information about other treatment or recovery programs operating in the service regions of these grantees, given that individual clients could be interacting with multiple service providers, making it difficult to determine which outcomes are attributable to the PPW-PLT and BCOR funding programs.7 Because the information the committee did receive was aggregated and not attributable to specific grantees, the committee was not able to seek out such secondary information on its own and compare that to the GPRA outcomes.

Ultimately, due to these substantial limitations in the data, it was difficult for the committee to interpret the GPRA data and the individual-level outcomes as attributable to the CARA programs. Only limited figures from these data are presented in the body of this report, due to the potential for misinterpretation.

___________________

7 A lengthier discussion of these limitations is presented in NASEM, 2021.


CSAP: Division of State Programs Management Reporting Tool (DSP-MRT)

The DSP-MRT is an off-the-shelf tool used by a variety of CSAP-administered programs. The questions in the tool center on eight modules: administration, needs assessment, capacity, planning, behavioral health disparities, implementation, evaluation, and sustainability. See Box A-6 in Appendix A for definitions of these modules. Every module gives grantees the opportunity to submit free text, as well as numerical and "drop down" responses. For example, the Planning Module directs grantees to select Accomplishments from a list:

  • “Use of statewide needs assessment in the development of the strategic plan
  • Discussion on adjustments based on on-going needs assessment activities
  • Identification of the state/tribe/jurisdiction level priorities
  • Articulation of a vision for prevention activities
  • Identification of key milestones and outcomes
  • Identification/coordination/allocation of resources
  • Identification of other sources of funding for the plan
  • Identification of appropriate funding mechanism(s)
  • Establishment of key policies
  • Involvement of public and private service systems in planning
  • Planning for sustaining the infrastructure
  • Other planning accomplishment”

Modules other than Administration offered space for free text responses on "accomplishments" and "barriers." Some modules elicited free text responses only regarding accomplishments and barriers, while others asked multiple questions related to a specific query. For example, the "Use and Reach of Prevention Efforts" section of the Behavioral Health Disparities module asked four questions (and allowed for free text responses to each): "How do you monitor the efforts related to addressing behavioral health disparities at the community level? What are your data collection processes related to behavioral health disparities data? How are you determining the accuracy of numbers directly served and numbers indirectly reached for each high-need community? How are you helping communities use their data to address the identified behavioral health disparities?"

For this report, the committee received substantially more information from grantees' responses to the DSP-MRT compared to the second report. For the committee's previous report, SAMHSA generated "summary" reports—one for OD Treatment Access and one for FR-CARA—from the information grantees had provided in response to the tool's prompts. For this report, the committee requested from SAMHSA all free-text responses, as well as "counts" of how many times grantees selected a specific response (such as "type" of barrier). Grantees funded in 2017 submitted information biannually; the 2018 cohorts submitted information annually.

The committee received the information that grantees reported through this tool in the form of 77 Excel workbooks containing multiple worksheets, some of which included more than 1,000 cells of free-text information. These workbooks covered information submitted by OD Treatment Access grantees biannually between October 2018 and April 2021 for the 2017 cohort and annually between 2019 and 2021 for the 2018 cohort. For FR-CARA grantees in the 2017 cohort, the information was submitted biannually from April 2018 through April 2022; for the 2018 cohort, annual reporting was submitted from 2019 through 2021. SAMHSA informed the committee that the whole 2017 FR-CARA cohort was represented in the information shared; four grantees in the 2018 cohort were not included. All the OD Treatment Access grantees were included. SAMHSA did not provide an explanation as to why some grantees were not included.
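
One way such a corpus of workbooks could be inventoried and pooled is sketched below; the directory name and the heuristic for spotting free-text cells are assumptions for illustration, not SAMHSA's actual layout.

```python
# Sketch of one way to inventory and pool the free-text cells from the 77
# DSP-MRT workbooks; the directory name and the heuristic for spotting
# free text are assumptions, not SAMHSA's actual layout. Uses openpyxl.
from pathlib import Path
from openpyxl import load_workbook

free_text_cells = []
for path in sorted(Path("dsp_mrt_workbooks").glob("*.xlsx")):  # hypothetical
    workbook = load_workbook(path, read_only=True, data_only=True)
    for sheet in workbook.worksheets:
        for row in sheet.iter_rows(values_only=True):
            for value in row:
                # Treat any longer string cell as a free-text response.
                if isinstance(value, str) and len(value.split()) > 3:
                    free_text_cells.append((path.name, sheet.title, value))

print(f"Pooled {len(free_text_cells)} candidate free-text responses")
```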

The number of free text responses was not proportionate to the number of grantees per cohort, which the committee assumes indicates that some grantees were more comprehensive in their answers than others. For example, although there were 21 FR-CARA grantees in the 2017 cohort and 27 in the 2018 cohort, 83 percent of the free text entries were from the 2017 cohort [F1a].8 For OD Treatment Access, 70 percent of the free text responses were entered by the one grantee in the 2017 cohort, compared to the five grantees in the 2018 cohort [O1a]. These discrepancies cannot be explained solely by the biannual reporting requirement for the 2017 cohorts. Additionally, particularly in response to modules focused on "start-up" activities, grantees submitted identical text for multiple reporting periods, leading to many redundancies.

Given the substantial amount of information reported (more than 13,000 free text responses) and the disproportionate number of responses described above, the committee drew a purposeful quota sample of the free text responses grantees submitted through the DSP-MRT, designed to capture the full range of potential reports. Given the short time frame available for analysis, the committee reviewed a total of 250 OD Treatment Access free text responses, selected proportionally across cohorts, modules, and reporting periods, and a total of 500 responses from the FR-CARA grantees, also spread proportionally across cohorts, modules, and reporting periods.

___________________

8 Citations refer to the source of the material as submitted by SAMHSA and included in the online appendix, indicating the program (FR-CARA [F] or OD Treatment Access [O]), the workbook (number), and the worksheet (letter).


Within each set of possible responses for review, the committee selected items using a random number generator.
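
A minimal sketch of this proportional quota sampling follows, assuming each pooled response is tagged with its cohort, module, and reporting period; the pool and quota below are illustrative, since the actual pool held more than 13,000 entries.

```python
# Sketch of the proportional quota sampling described above, assuming each
# pooled response is tagged with its cohort, module, and reporting period.
# The pool below is invented; the real pool held more than 13,000 entries.
import random

def quota_sample(responses, quota, seed=0):
    """Allocate the quota proportionally across (cohort, module, period)
    strata, then pick items at random within each stratum."""
    rng = random.Random(seed)
    strata = {}
    for item in responses:
        key = (item["cohort"], item["module"], item["period"])
        strata.setdefault(key, []).append(item)

    total = len(responses)
    sample = []
    for items in strata.values():
        share = min(len(items), round(quota * len(items) / total))
        sample.extend(rng.sample(items, share))
    return sample

pool = [{"cohort": c, "module": m, "period": p, "text": "..."}
        for c in (2017, 2018)
        for m in ("planning", "implementation", "evaluation")
        for p in (1, 2)
        for _ in range(40)]
print(len(quota_sample(pool, quota=250)))  # approximately 250, by rounding
```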

In addition to the free text responses, the committee reviewed numerical information pooled across grantees pertaining to specific queries in the DSP-MRT. For example, grantees were asked how many partnerships they made, and the "count" was provided to the committee. As another example, for each question querying accomplishments and barriers, the committee received the "count" of how many times a specific item was selected; however, one grantee could have selected multiple accomplishment categories while others selected none, so much of this information was difficult to interpret.

CSAP: OD Treatment Access Reporting Tool

In addition to the more general CSAP DSP-MRT, SAMHSA created a supplemental reporting tool specifically for the OD Treatment Access grant program, called the OD Treatment Access reporting form. This tool consisted of eight questions, five of which prompted open-ended narrative responses, and three of which requested quantitative data (see Box A-7 in Appendix A for a complete description of this tool). SAMHSA generated a report from these responses that included largely full, narrative responses from grantees (which were de-identified) and some information that had been aggregated across grantees.9

SAMHSA forwarded a total of 14 reports from five of the six grantees (references O14-27), with varying numbers of submissions per grantee. Where the committee received more than one report from a grantee, it reviewed the most recent.

CSAP Limitations: DSP-MRT and OD Treatment Access Reporting Form

The information from the DSP-MRT and OD Treatment Access Reporting Form had several limitations that restricted the committee’s ability to assess progress or effectiveness of the CSAP grantees.

First, as the DSP-MRT is an off-the-shelf tool, not all of the prompts were well matched or aligned with the goals and objectives of the two CSAP CARA programs.

Second, aggregation and lack of attribution to specific grantees limited analysis of the information from both tools. For instance, it was unclear to the committee whether achievements and/or barriers were disproportionately experienced by a select group of grantees or distributed across an entire cohort.

___________________

9 The text in this paragraph is reprinted from the committee’s second report (NASEM, 2021).


SAMHSA's aggregation of quantitative data across all grantees in the two respective programs also precluded quantitative analysis by the committee. As discussed at the beginning of this section, aggregation and lack of attribution also precluded the committee's ability to use secondary data sources, such as overdose rates in a given grantee's catchment area, to contextualize the information provided by grantees.

Third, not all grantees from both programs were represented in the information that SAMHSA shared with the committee. The committee notes that there may have been selection bias in which grantees were included and which were not; as such, the committee’s findings on grantee progress may not be representative or generalizable.

There were also a number of inconsistencies in how grantees responded to questions in the DSP-MRT and the OD Treatment Access Reporting Form. Free text responses did not always seem to match the module under which the material was supplied, and grantees seemed to interpret questions differently. Such inconsistencies in grantee responses made it difficult to compare or identify common themes across grantee experiences.

Finally, the committee’s sampling strategy presented additional limitations. It is possible that the sampling ended up not representative of all submissions, but the samples were chosen randomly, decreasing that possibility.

2. Grantee Progress Reports

In an attempt to address some of the limitations of the materials received from SAMHSA, the committee reached out to grantees directly and requested comprehensive progress reports or the most recent progress reports, as available, explaining that any information shared with the National Academies would be made publicly available in the Public Access File. Twenty-eight grantees sent material ranging from one-page summaries to multiple reports spanning several years. Progress reports sent by FR-CARA and OD Treatment Access grantees appeared to consist primarily of material identical to the free text DSP-MRT submissions sent to the committee by SAMHSA and were not reviewed separately.

Because both cohorts of PPW-PLT and BCOR completed their funding periods no later than the end of fiscal year 2021, the committee requested cumulative closeout reports for each grantee. The committee first requested unredacted reports from grantees, then requested redacted reports from SAMHSA for the remainder of reports.

Between SAMHSA's submissions and grantees' submissions, the committee received closeout reports for two of the six PPW-PLT grantees. Two reports covered only part of the reporting period, despite being sent as "closeout" reports, and two other grantees had received no-cost extensions from SAMHSA and as such had not yet completed closeout reports; for those grantees, the committee reviewed the most recent annual progress reports, which included information from previous years as well.

For the BCOR grantees, the committee ultimately reviewed closeout reports for all 8 grantees in the 2017 cohort, and 15 of 18 in the 2018 cohort. Two of the remaining grantees in the 2018 cohort had received no-cost extensions, and as such had not yet completed closeout reports; the committee reviewed their most recent annual reports. One other grantee had relinquished the grant after 2 years; the committee reviewed the report that covered those 2 years of activity.

Limitations

The inconsistent and unclear time frames of some of the progress reports complicated the committee's ability to draw comparisons across multiple grantees. Further, it was at times not clear to the committee that it was reviewing the most up-to-date information, despite SAMHSA's labels; as such, it is possible that the committee was left with an incomplete picture of grantee progress and activity.

Importantly, as stated in the second report, grantee progress reports were inconsistent; they

varied greatly in completeness, clarity, and quality. Some progress reports provided thorough descriptive and quantitative data on recruitment, retention, and services provided to clients, whereas others did not provide such data, or the data were contradictory or inconsistent. (NASEM, 2021)

The committee notes that the inconsistencies across reports seem to be primarily the result of a lack of formal specification from SAMHSA about exactly what grantees should be reporting, rather than a reflection of any shortcomings on the part of grantees.

Additionally, the format of the grantee progress reports was not always directly linked to the goals of the SAMHSA programs overall.

Finally, redactions in the reports shared by SAMHSA may have resulted in the committee misinterpreting or understating some of the activities reported by grantees.

3. Contracted Report with NORC

To supplement the information received from SAMHSA, the committee subcontracted with NORC to conduct qualitative interviews directly with grantees. The committee helped to develop the interview guide. The report is reprinted in its entirety in Appendix B, where its limitations and recruiting strategies are discussed in greater depth. In general, limitations included that, although NORC intended to interview 40 grantees, only 22 responded and agreed to be interviewed; as such, results cannot be generalized to all CARA grantees. To protect grantee confidentiality, findings from participating grantees were often broken out by CSAP versus CSAT programs; as such, comments often cannot be attributed to grantees of specific programs. Overall, the report describes useful general findings that often echoed, or provided greater detail on, the committee's findings from other sources. As described in Chapters 2–6, the committee used the findings from the NORC report as additional input to its analysis of SAMHSA-provided information from mandatory reporting.

The committee notes that one of the unique features of the NORC report is that it includes recommendations, provided by CARA grantees themselves, about how the CARA programs and other similar programs might be strengthened in the future. Though not all of these recommendations appear in the body of this report, the committee highlights, as an important contribution, that the NORC report provided a platform for grantee input to SAMHSA and/or Congress.

FUTURE EVALUATION MANDATES: APPROACH

The committee's response to the second part of its charge, found in Chapter 8, consists of a description of key attributes of and data challenges in program evaluations. The chapter is forward-looking, inspired and informed by what the committee experienced in reviewing the CARA programs, but it does not answer the question posed by Congress about cost-effectiveness, as described in Chapter 1. A primer on cost-effectiveness evaluations and the information needed to conduct them can be found in Appendix C.

