3 The Development of a Guide for Evaluating Instructional Materials
Pages 23-38



From page 23...
... Potential users field tested the prototype to provide information to guide the Committee in making revisions. The chart below outlines the process, beginning with an examination of existing review tools.
From page 24...
... This exercise provided the context for designing early versions of the evaluation tool for use in initial field tests. The varied professional experiences of the Committee members (science teachers, scientists, science supervisors, and science curriculum designers)
From page 25...
... The Committee then constructed a prototype tool and subjected it to an iterative process that cycled experiences from field tests and focus groups back to the Committee to inform the modifications made in subsequent drafts. The Committee established the following general principles as the basis for its design of a prototype evaluation tool.
From page 26...
... Other review tools designed for use in limited time periods commonly use a checklist of items for consideration, a numerical scale, and weighted averages of the numerical evaluations. Use of such tools can result in a superficial evaluation of a set of materials that may identify the content standards covered, but fail to indicate whether the coverage will help teachers foster student learning and understanding.
From page 27...
... Moreover, some materials do not consistently reflect an understanding of what is and what is not important in a particular scientific discipline. The Committee found, in its examination of instructional materials, many cases where materials contained detailed information of little relevance, extensive unnecessary vocabulary, and only cursory treatment of the essential concepts.
From page 28...
... In some school districts, one team evaluates instructional materials and reports to another group that is responsible for final approval and selection. In others, one team is responsible for both evaluating and selecting instructional materials.
From page 29...
... Each set reviewed the same middle school environmental science materials considered by the Committee. One test involved leaders from four states cooperating in a rural systemic initiative supported by the NSF.
From page 30...
... All three sets of reviewers rejected a review criterion that required publishers to supply data on the materials' effectiveness based on field tests or other research findings. They considered the criterion unlikely to produce useful information.
From page 31...
... 6. Consideration of the cost of the materials, an element in the prototype tool, seemed to confuse reviewers, required extensive research, and did not contribute to the evaluation.
From page 32...
... Committee's response: Include in the training guide advice on organizing and carrying out evaluation and selection, designed for the school district facilitators of these processes.

SECOND ROUND OF FIELD TESTS USING THE MODIFIED TOOL

The Committee modified the prototype tool
From page 33...
... Because the unit did not meet the two content standards, several reviewers expressed concern that the standards-based review would undermine the use of the unit that had been chosen by their school district. Expressing satisfaction with the process as a whole, the reviewers said they viewed the process as one they could use to select instructional materials, despite concerns about the time involved.
From page 34...
... Toward the same end, each step of the suggested review process reiterates the overall goal of increasing student achievement by applying the standards. Scientists participated at each site during the second round of field testing, in each case contributing a point of view that complemented that of the educators and underscoring the importance of scientists to a thorough evaluation.
From page 35...
... Field tests were carried out both with and without prior training. The sophistication and depth of the evaluations carried out after training were significantly improved compared to those obtained when training was omitted.
From page 36...
... Another lesson learned from the field trials concerns the priorities given to different aspects of the review materials. In the absence of training, some reviewers assigned no priorities among the several criteria being considered.
From page 37...
... The Committee envisions that the tool will be regularly revised in response to experience and ongoing learning research. The Committee recognized an inherent difficulty in trying to determine whether a particular instructional material is "good." The definition of "good" must include an assessment of the match between the instructional material and the applicable standards, learning goals, and pedagogical approaches.
From page 38...
... Moreover, most assessments evaluate the effectiveness of a student's entire learning experience; they do not distinguish what students learn from instructional materials and the teaching centered on those materials from what they learn through their own activities and experiences and from their parents. There is no substantial body of research that tries to evaluate the effectiveness of particular instructional materials as a separate variable in the total learning experience.

