9 An Evaluation Initiative to Support Learning the Impact of USAID's Democracy and Governance Programs
Pages 219-234

From page 219...
... But as discussed in earlier chapters, they have so far provided little evidence that meets accepted standards of impact evaluation about whether these projects have strengthened local governments, contributed to more robust civil societies, or helped create more legitimate judicial sectors in the countries in which they have been implemented. Five years from now, the committee hopes that USAID will be in a position not only to clearly and persuasively identify the effects of its DG programs but also to claim leadership in the procedures for conducting
From page 220...
... Earlier chapters analyzed current USAID approaches to assessment and evaluation and proposed ways to provide the evidence of project impact that USAID needs both for its own programming and for presenting and defending its programs to the broader policy community in Washington and internationally. Earlier chapters focused on the specific policy and process changes that the committee believes are needed to help USAID overcome concerns that hinder undertaking sound impact evaluations and to augment USAID's overall learning to support DG programming.
From page 221...
... As discussed in Chapter 2, the World Bank has taken this approach through its Development Impact Evaluation (DIME) Initiative, and NGOs such as the Poverty Action Lab at the Massachusetts Institute of Technology and the Evaluation Gap Working Group of the Center for Global Development are working to promote impact evaluations for a range of social programs. This is a time when many policymakers, both within and outside the United States, are calling for reinvigoration and rethinking of foreign assistance programs (among myriad sources, see, e.g., Lancaster 2000, 2006; National Endow
A 2006 study from the National Research Council addressed the broader issue of a decline in evaluation capacity across USAID (NRC 2006)
From page 222...
... Tasks for the DG Evaluation Initiative The committee strongly recommends that, to accelerate the building of a solid core of knowledge regarding project effectiveness, the DG evaluation initiative should immediately develop and undertake a number of well-designed impact evaluations that test the efficacy of key project models or core development hypotheses that guide USAID DG assistance. A portion of these evaluations should use randomized designs, as these are the most accurate and credible means of ascertaining program impact.
From page 223...
... However, for important programs for which USAID desires impact evaluations but for which randomization is not feasible, alternative designs of the types discussed in Chapter 5 should be carefully developed and implemented. At the end of this five-year period, USAID would have:
• practical experience in implementing the evaluation designs that can indicate where such approaches are feasible, what the major obstacles are to wider implementation, and whether and how these obstacles can be overcome;
• where the evaluations prove feasible, a solid empirical foundation to begin
From page 224...
... But the important point is that the funds not come out of current mission program budgets that are already stretched thin. It is also important that the resources be used to support both the special impact evaluations chosen as the chief task of the DG evaluation initiative and efforts by country missions to improve their evaluations or conduct their own impact evaluations on chosen projects.
From page 225...
... The evaluation capacity of USAID's DG programs, like other capabilities, has thus increasingly shifted to the implementers who design and carry out the projects. Although the committee found in its own field visits that DG officers were, in general, quite willing to work with the committee's consultants who were evaluation experts and that the DG officers were open to considering new approaches to testing the efficacy of their programs, few of the officers thought they were capable of judging and overseeing varied impact evaluation designs without additional assistance and resources.
From page 226...
... who wish to develop impact evaluations of their programs would be a valuable augmentation of USAID's in-house resources. Partnerships to Add Capacity from Outside USAID While the committee believes that a substantial augmentation of USAID's internal capacity for evaluation design is necessary for the proposed evaluation initiative to be effective, there is no reason that USAID's efforts to improve evaluation must be purely an in-house affair.
From page 227...
... In most cases the focus was and remains on doing democracy rather than studying how to do democracy. There were and are important exceptions, and in addition some universities are major implementers of USAID DG programs, such as SUNY Albany's long-term efforts at legislative strengthening or the work of the IRIS Center at the University of Maryland on issues related to economic development and governance. Although not necessary for the initial DG evaluation initiative, for the longer term USAID might consider investing resources to develop a
Further information about the IRIS program may be found at http://www.iris.umd.edu/ and about SUNY Albany's Center for Legislative Development at http://www.albany.edu/cld/.
From page 228...
... In addition to providing expertise to advise programming and research to advance knowledge, such agency-university centers could assist DG -- and USAID more broadly -- in developing a standardized training module on evaluation techniques for DG program staff. Agenda for USAID and SORA As part of its charge from USAID, the committee was asked to recommend a "refined and clear overall research and analytic design that integrates the various research projects under SORA into a coherent whole in order to produce valid and useful findings and recommendations for democracy program improvements." Various parts of this design have been dealt with in depth in earlier chapters and will not be repeated here.
From page 229...
... and its understanding of the needs of DG officers in Washington and in the field would make it a logical place from which such an initiative could be developed. Improving Monitoring and Evaluation This chapter has outlined the proposed evaluation initiative the committee believes should be the core of the effort to improve USAID's ability to assess the effectiveness of its projects in the future.
From page 230...
... Taken together and supported by the leadership of USAID, the SORA program and the wider efforts of the DG office and USAID that are more broadly discussed throughout this report would provide USAID with the capacity to effectively evaluate and continuously improve its work to support democratic development. Role of Congress and the Executive Branch USAID cannot undertake the evaluation initiative and other efforts recommended here alone.
From page 231...
... Indeed, given the currently uncertain knowledge and difficult challenge of advancing democracy in diverse conditions, learning that half or two-thirds of USAID's DG programs have real and significant effects in helping countries advance should be seen as fundamentally positive and evidence of success, while learning which half or one-third of programs are not effective should be seen as an important step in advancing the targeting and effectiveness of democracy assistance. Given USAID's modest budgets for DG assistance and the many countervailing forces that prevail in the real world of democracy assistance, unrealistic expectations for universal success or rapid advances will not help the necessary learning -- which will involve some incremental advances and some cases of learning from setbacks -- that would lead to meaningful advances in the field of foreign assistance.
From page 232...
... Yet perhaps the single most significant deficiency that the committee observed in regard to USAID learning which of its DG projects are most effective, and when, was the lack of well-designed impact evaluations of such projects. The committee sees an enormous opportunity for USAID to accelerate its learning and the effectiveness of its programming by testing, through the proposed evaluation initiative, whether and how impact evaluations can be applied to DG projects.
From page 233...
... 2006. The Backlash Against Democracy Assistance.

