12 Envisioning a Future for Evaluation
Pages 105-114

The Chapter Skim interface presents the single chunk of text algorithmically identified as most significant on each page of the chapter.


From page 105...
... • Strong evaluations require an investment of resources, time, commitment, trust, and strong relationships. On the final afternoon of the workshop, two experienced evaluators shared their ideas for designing a hypothetical evaluation of a fictitious global initiative that embodied many characteristics of the large, complex, multidisciplinary, global interventions that were the focus of the workshop.
From page 106...
... Funding of $3.4 billion is to be provided cooperatively by Fundación María Elena, a fictitious philanthropic foundation described as newly established by a wealthy South American banker, USAID, and the Canadian International Development Agency, with 10 percent of funding from locally sourced assets in each partner country. During a 1-year planning phase, a stakeholder coalition is to set priorities among such outcomes as improved health and well-being, environmental improvements, better water systems, improved public awareness, and reduced violence and crime due to water disputes.
From page 107...
... The package of familiar components, and how they coalesce, is what makes an intervention complex. This project, he said, "is begging for some developmental evaluation, where the evaluation team itself participates in the development of an intervention." Charlotte Watts, head of the Social and Mathematical Epidemiology Group and founding director of the Gender, Violence, and Health Centre in the Department for Global Health and Development at the London School of Hygiene and Tropical Medicine, agreed, suggesting that national researchers from the countries where the intervention will take place should be part of the evaluation from the very beginning to embed the element of capacity building into the evaluation.
From page 108...
... These can include informed, intelligent intervention delivery; an increasing capacity for strengthened networks and ownership of programs with nationally led evaluation and research; and the use of monitoring and evaluation data by program staff and practitioners. Watts was struck by the three categories of evaluation purpose stated earlier in the workshop by Chris Whitty: (1)
From page 109...
... Early in a project, an evaluator may be able to provide valuable input to program staff as they design or modify an intervention. After this developmental phase, evaluators may need to achieve more independence from a program to deliver unbiased results, even if that means altering a relationship over time.
From page 110...
... A theory of change makes it possible to revisit design plans, frame data collection and feedback, and replicate interventions in other settings. In contrast, she was unenthusiastic about the logical framework approach, which she judged to be difficult to use, especially with low-literacy populations or evaluation staff.
From page 111...
... "These are things that most evaluations should be striving to do." Considerations for Evaluation Design and Methods When thinking about the design for this complex intervention, there are some things to keep in mind, said Watts. To balance the different demands and multiple evaluation aims, the proposed evaluation should be a prospective, mixed-methods study conducted by a multidisciplinary team.
From page 112...
... In a way, when we're doing large evaluations we have a bit of an engine that's moving. However, qualitative work can be proactively and flexibly nested into quantitative studies by embedding researchers into the program to enable course corrections and provision of timely data.
From page 113...
... People may be trained to use sophisticated evaluation tools, but they may limit the solution space before bringing those tools to bear on a problem. "The first lesson of solution space is to work with communities, be humble, go home and reflect on these issues." Watts agreed with the need for humility, especially in complex interventions where evaluators need to spend a lot of time understanding the intricacies of a program, especially given that research methods can be blunt tools.
From page 114...
... Watts observed more generally that strong evaluations pose a challenge for current public health models of evaluation training and development. "Are the models of public health evaluation that we teach our students broad and flexible enough?"


This material may be derived from roughly machine-read images, and so is provided only to facilitate research.