
2. Agency Methods
Pages 17-26

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text on each page of the chapter.


From page 17...
... At the focus groups, the five agencies examined (DOD, DOE, NSF, NIH, and NASA) were asked to respond to the following questions regarding their methodology: · What methodology is used for evaluating research programs under GPRA?
From page 18...
... At NIH, a single overarching panel evaluates all NIH's research programs at one time. At NSF, numerous committees of visitors review individual research programs on a rolling 3-year basis.
From page 19...
... , such as that in Basic Energy Sciences at DOE and at NASA, have had more success than others within the agency. Both these agencies are undergoing major redesign efforts in how they respond to GPRA for their research programs.
From page 20...
... to evaluate their research programs.
From page 21...
... The panel found that agencies recognize the importance of relevance in planning and review and that they consider the degree to which research programs and projects support their missions. However, although the use of relevance as an evaluation criterion is commonly embedded as an implicit element of planning and reviewing, it might not appear as an explicit element of published GPRA performance plans or reviews.
From page 22...
... use international benchmarking to evaluate the leadership level of research programs, as described in COSEPUP's earlier Goals and
From page 23...
... This objective must be explicit not only because it affirms the value of educating scientists and engineers by including them in the research programs of their advisers, but also because it demonstrates how reductions in research funding in specific fields can jeopardize the preparation of the next generation of scientists and engineers who will be important to the nation's future. Recommendation M-S: The development of human resources should
From page 24...
... , the agency's GPRA performance plans and reports are expressed primarily in terms of quantitative goals and milestones. Research programs in these agencies might find themselves compelled to conform to a prescribed reporting format.
From page 25...
... . The process of expert review is implicitly understood by those involved in research, because agencies consider expert review to be the most objective and reliable mechanism for evaluating their research programs.
From page 26...
... The panel concluded that the criteria of quality, relevance, and leadership are more effective than quantitative performance indicators for evaluating research programs.

