Executive Summary¹
For more than 50 years, the National Assessment of Educational Progress (NAEP) has served as an essential resource that helps educators and policy makers understand important educational outcomes for students in the United States. As the nation’s only mechanism for tracking student achievement over time and comparing trends across states and districts, NAEP is invaluable. It is also expensive, costing about $175.2 million per year. Moreover, its costs are rising, which has led to concerns about the program’s long-term viability.
The independent National Assessment Governing Board (NAGB) sets policy for NAEP, which is administered by the National Center for Education Statistics (NCES), a part of the Institute of Education Sciences (IES) in the U.S. Department of Education. Given current concerns, IES asked the National Academies of Sciences, Engineering, and Medicine to form an expert panel to recommend innovations to improve the cost effectiveness of NAEP while maintaining or improving its technical quality and the information it provides.
To carry out its task, the panel sought detailed information about NAEP’s costs. Despite extensive NCES assistance, however, the panel concluded that there is insufficient information to completely understand NAEP’s costs and connect them to key parts of the program.
___________________
1 After a prepublication version of the report was provided to IES, NCES, and NAGB, this section was edited to remove an incomplete comparison with international assessment costs; reflect a broader range of costs related to management, planning, support, and oversight; and revise the description of those costs.
- The panel’s first recommendation is that NCES and NAGB should develop clear, consistent, and complete descriptions of current spending on NAEP’s major components and use them to ensure that the budget can support any major programmatic decisions (Recommendation 2-1).
The panel then identified a set of innovations to improve NAEP. Some of these involve structural changes related to the assessments included in the program and their frameworks.
- NAGB should give high priority to considering integrating subjects that are now assessed separately, such as reading and writing or science, technology, and engineering literacy (Recommendation 3-3).
- Long-term trend NAEP largely duplicates the trend information that main NAEP provides for reading and mathematics, although it is relatively inexpensive and offers some useful complementary information. NCES should prepare a detailed plan and budget for the modernization of long-term trend NAEP to support a joint consideration with Congress and NAGB of its value in comparison with other program priorities (Recommendation 3-1).
- Because the greatest threat to maintaining NAEP’s trend line comes from updates to its assessment frameworks, NAGB and NCES should work both independently and collaboratively to achieve smaller and more frequent framework updates (Recommendation 3-2).
Other innovations identified by the panel concern changes to the major assessment components. Test administration is NAEP's most expensive component, accounting for about 28.6 percent of its budget, because of the program's unusual approach of sending contractor teams and computers to the sampled schools.
- NCES should continue to develop its plan to administer NAEP using local school staff as proctors with online assessment delivery on local school computers, with tailored support for schools with limited resources (Recommendation 5-1).
- Because local administration will involve greater variation across locations, NCES should collect information about local devices and administration conditions, and explore statistical techniques to produce estimates that generalize across those differences (Recommendation 5-2).
- The panel’s analysis suggests that full deployment of local administration might save substantially more than NCES currently estimates. NCES should review its estimates of the potential savings that are possible from local administration (Recommendation 5-3).
Other innovations in NAEP administration have the potential to reduce costs and, in some cases, also improve the program’s technical quality or reduce its burden on students and schools.
- NCES should continue its plan to administer NAEP in longer sessions that allow 90 minutes for the testing of cognitive items for each student (Recommendation 6-1).
- NCES should analyze the tradeoff between NAEP’s sample sizes and statistical power for detecting policy-relevant differences in performance (Recommendation 6-2).
- NCES should not pursue adaptive testing for NAEP as a way of saving costs, but should continue to investigate its potential to improve the statistical estimates and the test-taking experiences for low-performing students (Recommendation 6-3).
- NCES should not attempt to coordinate NAEP administration with the administration of international tests as a way to reduce costs (Recommendation 6-4).
Program management, planning, support, and oversight costs account for more than 28.7 percent of NAEP’s budget, a large amount in both absolute and relative terms.
- NAGB and NCES should commission an independent audit of the program management and decision-making processes and costs in the NAEP program, with a charge and sufficient access to review the program’s costs in detail and propose ways to streamline these processes (Recommendation 10-1).
- NCES should increase the visibility and coherence of NAEP’s research activities with an identifiable budget, innovation strategy, and program of activities (Recommendation 10-2).
The item development contract costs much more than item creation and pilot testing alone would account for.
- The cost and scope of the item development contract should be examined (Recommendation 4-1).
- NAGB and NCES should use more structured processes for item development to both decrease costs and improve quality (Recommendation 4-2).
- NAGB should commission an analysis of the value and cost of different item types (Recommendation 4-3).
Automated scoring would be cost effective for the large NAEP assessments and could reduce costs by about 0.7 percent of NAEP’s budget.
- NCES should continue its work to implement automated scoring (Recommendation 7-1).
The costs of analysis, reporting, and program management account for about 10.0 percent of NAEP’s budget.
- A greater percentage of the analysis and reporting budget should be devoted to innovations that will increase the use and understanding of NAEP’s data (Recommendation 8-1).
As NCES develops the Next-Gen eNAEP platform for assessment administration, it needs to pay close attention to costs for technology support, which account for about 16.8 percent of NAEP’s budget.
- NCES should regularly evaluate software built by other vendors or available in open-source libraries for use in Next-Gen eNAEP (Recommendation 9-1).
- NCES should ensure that there is adequate expertise related to enterprise software development to support and oversee Next-Gen eNAEP development (Recommendation 9-2).
- NCES should seek expert guidance from enterprise application developers and educational technologists about the platform’s projected costs (Recommendation 9-3).