11 Monitoring, Evaluation, and Continuous Improvement
Pages 211-227

From page 211...
... database nor the few available evaluation studies enable an assessment of the relative value of each of the 14 component programs. Program monitoring and performance measurement efforts have largely been conducted as a top-down enterprise accompanied by little consultation with grantees.
From page 212...
... Fellowships directors distributed in the mid-1990s provide analytic syntheses.
Congress stated: "The conferees are disappointed that the Department has not fully addressed the staffing needs of the Title VI and Fulbright-Hays international education programs" (House Report 108-010).
From page 213...
... . There are now two performance measures and one efficiency measure for most programs with an emphasis on measures intended to indicate outcomes, as preferred by OMB.
From page 214...
... reporting period. UISFL measures: percentage of critical languages addressed/covered by foreign language major, minor, or ...; percentage of projects reported and validated as ...; cost per high-quality, successfully completed project.
From page 215...
... GPA measures: average fellow increases language competency by at least 0.50 level; percentage of projects reported and validated as of high quality or successfully ...; cost per grantee increasing language competency by at least one level in one area (or all three).
From page 216...
... In practice, the system has been used by ED exclusively for two purposes: (1) to enable project officers to review the performance of individual grantees and make decisions about annual continuation grants for multiyear grants and (2)
From page 217...
... Grants do not typically allow significant feedback from the sponsoring agency and are not the ideal mechanism for developing a program monitoring ...
In fact, several universities reported that they were not aware that they could access their own data, and certainly did not know how to do so once it had been submitted.
From page 218...
... However, it appears that use of the redesigned system, which was renamed the International Resource Information System when launched as a "new" system (see http://www.ieps-iris.org) shortly before release of the committee's report, will continue to focus on individual grant monitoring and reporting performance measures.
From page 219...
... PROGRAM EVALUATION Although well-designed grantee data systems are vital program monitoring tools, they are rarely adequate to assess the effectiveness of programs.
Martin Kramer requested the data to verify information included in an article published by Steven Heydemann based in part on the placement data.
From page 220...
... Few Title VI/FH program evaluations have been conducted in general; even fewer have focused on program outcomes. Of the evaluations that have been conducted (see Appendix B, Attachment B-1)
From page 221...
... Each program has a dues-paying organization of program directors: NRCs have the Council of NRC Directors, LRCs have the Council of Directors of National Foreign Language Resource Centers, and Centers for International Business Education and Research (CIBER) have the Association for International Business and Research.
From page 222...
... IEPS staff have actively encouraged the creation of the CIBER web site and the development of its content, and have supported a shared web portal for the Business and International Education (BIE) program, hosted by Grand Valley State University (2006)
From page 223...
... Systems to encourage and support measurement of performance and sharing of information concerning promising and transferable approaches could improve overall performance of the Title VI/FH programs. AWARD TRANSPARENCY At the most basic level, successful grant applications are a source of information about best practice; at a minimum, they represent proposals that expert review panels rated as most responsive to application criteria.
From page 224...
... Curiously, while electronic review should make sharing of review results with applicants a quick process, ED prints and mails paper copies of the review. CONCLUSIONS AND RECOMMENDATIONS Monitoring, evaluation, and continuous improvement efforts at ED have been affected by limitations in staffing, data availability, resources, and program leadership.
From page 225...
... Recommendation 11.2:  The Department of Education should commission independent outcome and impact evaluations of all programs every 4 to 5 years. Well-designed program evaluations would be a more reliable approach to determine program outcomes and impacts than use of performance measures.
From page 226...
... Recommendation 11.3:  The Department of Education should work with universities to create a system of continuous improvement for Title VI and Fulbright-Hays programs. The system would help develop performance indicators and other improvement tools and should include networks of similar centers (National Resource Centers, Language Resource Centers, Centers for International Business Education and Research)
From page 227...
... Recommendation 11.4:  The Department of Education should make its award selection process more transparent, including making successful applications publicly available via the Internet. ED has been moving all of its grant competitions to the web-based http://resource-grants.gov.
