
4 Program Evaluation
Pages 29-40



From page 29...
... as formulated. This chapter describes effective methods for evaluating programs, the limitations of evaluation approaches used in the federal earth science education and training programs considered in this report, and evaluation of these programs in the context of the Chapter 3 framework of education and training opportunities.
From page 30...
... Outcomes are the longer term changes that the program aims to achieve. Earth science education programs generally aspire to three types of outcomes: awareness, engagement, or professional preparation.
From page 31...
... AGENCY PROGRAM EVALUATION Two of the committee's tasks concern the evaluation of federal earth science education and training programs. Task 3 was to identify criteria for evaluating success and, using those criteria and the results of previous federal program evaluations, to identify examples of successful programs in federal agencies.
From page 32...
... Program evaluation in the context of the framework is described in the next section. Because the committee lacked the robust data needed to choose successful examples of federal earth science education and training programs, it could not offer insight on why these programs are successful (Task 4)
From page 33...
... The evaluation used surveys of participants and recipients of bachelor's degrees to assess the characteristics of participants, why faculty and students choose to participate, and the impacts of different types of research experiences on students' academic and career decisions. The results showed that undergraduate research experiences increased participants' understanding of the research process and confidence in their ability to conduct research.
From page 34...
... . Example Evaluations of Awareness Programs Many of the earth science awareness programs discussed at the workshop employ enumeration of participants as the primary evaluation mechanism (e.g., NSF's Geoscience Education Program, USDA's Agriculture and Food Research Initiative programs)
From page 35...
... Provider inputs to the logic model include specific content knowledge and skills as well as pedagogic expertise in designing engaging experiences. Outputs include participants' increased motivation to engage in learning activities beyond the formal science curriculum, increased understanding, and a more complete sense of ownership of a specific work product, project, or artifact through the application of new skills.
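The inputs-outputs-outcomes chain described above can be sketched as a minimal logic-model data structure. This is only an illustration: the class, field names, and the example outcome string are hypothetical, not taken from the report.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal sketch of a program logic model (hypothetical names).

    Inputs are what the provider contributes, outputs are the direct
    results for participants, and outcomes are longer-term changes.
    """
    inputs: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)

# Illustrative instance for an engagement program, paraphrasing the text
# above; the outcome entry is an assumption for demonstration only.
engagement = LogicModel(
    inputs=["content knowledge and skills", "pedagogic expertise"],
    outputs=[
        "increased motivation to learn beyond the formal curriculum",
        "increased understanding",
        "sense of ownership of a work product",
    ],
    outcomes=["sustained engagement with earth science"],
)

print(len(engagement.outputs))  # number of outputs captured in this sketch
```

A structure like this makes the evaluation question explicit: each listed output or outcome becomes a candidate measurement target rather than an implicit goal.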
From page 36...
... These diverse audiences and objectives require a range of evaluation approaches. Approaches used in the two most common professional preparation activities -- research experiences and internships -- are described below.
From page 37...
... Key elements for summative evaluation include the appropriateness of the internship, provider and participant obligations and responsibilities, participant qualifications and expected competency gains, onsite supervision frameworks, and participant performance evaluation. Example Evaluations of Earth Science Professional Preparation Programs For the federal professional preparation programs discussed at the workshop, the most commonly employed evaluation strategy is the enumeration of participants.
From page 38...
... Programs that collect these data include NOAA's Educational Partnership Program, which analyzes participant work products and tracks the transition of participants to the workforce, and NSF's Earth Sciences Postdoctoral Fellowships program, which collects some information on the workforce transition. Overall, providers that collect all of the information described (enumeration, self-reports, supervisor evaluations, work product analysis, and tracking)
From page 39...
... Broad indicators of program activities could be developed by aggregating relevant information from individual program evaluations, and supplemented with targeted program evaluations aimed at understanding how to create effective programs. Network analysis of the programs in the system could reveal which connections among participating organizations help move individuals through the system, and qualitative studies would help show how individuals find education and training opportunities and what they learn from them.
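As one concrete illustration of the network analysis mentioned above, a simple degree count over organization-to-organization links can flag which organizations sit on the most connections. The organization names and links below are invented for illustration; a real analysis would use richer data and methods.

```python
from collections import Counter

# Hypothetical links between participating organizations; each pair means
# individuals move (e.g., via internships or fellowships) between them.
links = [
    ("UniversityA", "AgencyX"),
    ("UniversityA", "AgencyY"),
    ("UniversityB", "AgencyX"),
    ("AgencyX", "EmployerZ"),
]

# Degree count: how many links touch each organization. High-degree nodes
# are candidate "hubs" that help move individuals through the system.
degree = Counter()
for a, b in links:
    degree[a] += 1
    degree[b] += 1

hub, hub_degree = degree.most_common(1)[0]
print(hub, hub_degree)  # AgencyX touches 3 of the 4 toy links
```

Even this toy computation shows the kind of insight the text points to: identifying which connections among organizations carry the most traffic of individuals through the education and training system.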

