5 Resources for Training
This chapter contains the training materials, background information, and samples referred to earlier in the report and guide in a format that makes copying these pages convenient. Refer to the Web site of the National Research Council (<http://www.national-academies.org>) for an electronic version of the entire publication (useful for customizing materials). This chapter is composed of the following topics:
Numbering standards
Citing evidence
Defining criteria
"Instructional Analysis" from Project 2061
"Judging How Well Materials Assess Science Learning Goals" from Project 2061
NUMBERING STANDARDS
Numbering the standards assists reviewers in communicating with one another and in making written records. If the standards you are using are not numbered (as is the case for the National Science Education Standards, excerpted below), adopt a numbering system similar to this example.
It is best to have reviewers write the numbers on their copies of the standards, instead of providing a separate list. Using only a list could encourage a shallow topical review.
SAMPLE FROM THE K-4 SCIENCE CONTENT OF THE NATIONAL SCIENCE EDUCATION STANDARDS
A Science as Inquiry (see pp. 121-122)
1 Abilities necessary to do scientific inquiry
- Ask a question about objects, organisms, and events in the environment. This aspect of the standard emphasizes students asking questions that they can answer with scientific knowledge, combined with their own observations. Students should answer their questions by seeking information from reliable sources of scientific information and from their own observations and investigations.
- Plan and conduct a simple investigation. In the earliest years, students may design and conduct simple experiments to answer questions. The idea of a fair test is possible for many students to consider by fourth grade.
- Employ simple equipment and tools to gather data and extend the senses. In early years, students develop simple skills, such as how to observe, measure, cut, connect, switch, turn on and off, pour, hold, tie, and hook. Beginning with simple instruments, students can use rulers to measure the length, height, and depth of objects and materials; thermometers to measure temperature; watches to measure time; beam balances and spring scales to measure weight and force; magnifiers to observe objects and organisms; and microscopes to observe the finer details of plants, animals, rocks, and other materials. Children also develop skills in the use of computers and calculators for conducting investigations.

This standard can be referred to as K-4/A1b.
B Physical Sciences (pp. 123 and 127)
2 Position and motion of objects
- The position of an object can be described by locating it relative to another object or the background.
- An object's motion can be described by tracing and measuring its position over time.
- The position and motion of objects can be changed by pushing or pulling. The size of the change is related to the strength of the push or pull.
- Sound is produced by vibrating objects. The pitch of the sound can be varied by changing the rate of vibration.
This standard can be referred to as K-4/B2c.
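For committees that keep electronic records of their reviews (the electronic version of this publication is suggested above for customizing materials), the numbering convention can be treated as a structured identifier: grade band, content-area letter, standard number, and bullet letter. The following is a minimal sketch, not part of the guide's forms; the helper function and the exact pattern are illustrative assumptions about how a team might encode references such as K-4/A1b.

```python
import re

# The convention assigns each bulleted item an ID like "K-4/B2c":
#   grade band "K-4", content area "B" (Physical Sciences),
#   standard number 2, bullet letter "c".
# This pattern is an illustration of the convention, not part of the NSES.
STANDARD_ID = re.compile(
    r"^(?P<grades>[K0-9]+-\d+)/(?P<area>[A-Z])(?P<standard>\d+)(?P<bullet>[a-z])$"
)

def parse_standard_id(ref: str) -> dict:
    """Split a reference like 'K-4/B2c' into its component parts."""
    m = STANDARD_ID.match(ref)
    if not m:
        raise ValueError(f"not a valid standard reference: {ref!r}")
    return m.groupdict()

print(parse_standard_id("K-4/B2c"))
# {'grades': 'K-4', 'area': 'B', 'standard': '2', 'bullet': 'c'}
```

Keeping the parts separate makes it easy to sort citations by content area or to tally how many bullets of a given standard the reviewed material addressed.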
CITING EVIDENCE
During review training, examples of good and poor citations for the review criteria give reviewers a model and help ensure that convincing evidence is collected. The content criterion (Form 2) benefits particularly from examples. The quality and quantity of the evidence become important in the selection process (Step 4) and are essential in documenting the rigor of the process. A few examples, shown on overhead transparencies during review training, can be effective in getting the point across. Reviewers also appreciate knowing who may be reading their reviews and why.
Good Examples
- The lessons that highlight the first part of the standard (the Sun can be seen in the daytime) include shadows and model ships, discussion of the shape of the Earth, tracking shadows, the Earth as a sphere, and the part of the Earth that is illuminated by the Sun at any one time. (Problem: the models are not necessarily convincing.)
- The lessons do not address why the moon can be seen during the day.
- The "lab" model does reinforce that the Earth rotates. (Problem: it very much reinforces the incorrect notion of a geocentric universe.)
- Missed opportunity. In the Explorations with the Lab there is potential for a full-scale inquiry (e.g., pick a location anywhere in the world, figure out where it is in relation to the equator, and make up a question about how much daylight it has during a particular season).
Fair Examples
- The module covers only the relationship of Sun and Earth, and does not develop a model of the Universe; so the moon and stars are excluded.
- The shadow tracking sheet seems to be an easy and observable way of collecting analyzable data.
Poor Examples
- There are many pieces of content that lend themselves to matching this standard.
- I'm not a teacher — can't respond.
DEFINING CRITERIA
It is critical to the success of the review that the definitions of the criteria on Form 3 be agreed upon and understood by all reviewers and be compatible with local needs. To develop a shared understanding of the criteria, have the reviewers participate in developing a working definition for each criterion:
- active engagement
- depth of understanding
- scientific inquiry
- assessments
The finished working definitions should be distributed to reviewers with their review forms.
SUGGESTED PROCEDURE
- Divide the review team into small groups and assign one of the student learning criteria to each group.
- Each small group should brainstorm, then prioritize, and finally summarize responses to the following questions in brief statements:
  - Why is this criterion important?
  - What are the most important elements for meeting this criterion? Apply your knowledge of effective teaching strategies and research on learning. Consult reference documents on standards and effective science education.
  - What qualities of instructional materials should a reviewer look for in reviewing with this criterion?
- A representative from each small group can present its products to the others for review and comment. All reviewers need to understand and agree with the definitions of each criterion. If necessary, the small groups should meet again to make revisions.
- At this point, you may need to customize the review criteria to meet your needs. It may become apparent that another criterion should be added, or the group may decide that a certain element of a criterion is so essential that it should be made mandatory. This kind of customization is very much in the spirit of this guide and the forms provided. Changes that would compromise the quality of your review include the deletion of any criterion, the substitution of a scale or checklist for the citing of evidence, or any process that does not focus on one standard at a time. During development of the tool it was decided that equity concerns can be naturally and appropriately addressed within the four criteria on Form 3. You may want to reexamine your definitions to determine whether the opportunity for all students to learn is adequately addressed. Alternatively, each group could add an equity aspect to each definition, or you could add and define a criterion on addressing diverse learners.
- The definitions should be distributed with Form 3 during the review.
Suggested resources for the small groups
- Provide copies of the National Science Education Standards and Benchmarks for Science Literacy, in particular the sections introducing and accompanying the standards, the research citations, and the teaching standards.
- The addendum to the National Science Education Standards concerning inquiry (NRC, forthcoming) will contain resources to help develop a deeper understanding of scientific inquiry as content subject matter, as student abilities, and as a teaching strategy.
- Make available the expanded definitions of review criteria from Project 2061's Identifying Curriculum Materials for Science Literacy: A Project 2061 Evaluation Tool (Roseman et al., 1997). See "Instructional Analysis." Many applicable terms are defined and described in these comprehensive, thoroughly researched, and field-tested criteria.
- The forthcoming Resources for Science Literacy: Curriculum Materials Evaluation from AAAS provides complete workshop plans for helping participants understand the meaning of specific learning goals.
- The "guiding principles" in Part II of In Search of Understanding: The Case for Constructivist Classrooms (Brooks and Brooks, 1993) may be a useful refresher for those who have studied constructivist learning theories, and may be applicable to the criterion on developing depth of understanding.
- Staff at Project 2061 have also studied and published criteria for evaluating the assessments in instructional materials (Stern, 1999). These assessment evaluation criteria will be helpful in developing the definitions for assessment criterion 3.4. An excerpt from that paper appears in "Judging How Well Materials Assess Science Learning Goals."
INSTRUCTIONAL ANALYSIS: AN EXCERPT FROM A PROJECT 2061 REPORT*
The purpose of the instructional analysis is to estimate how well the material addresses targeted benchmarks from the perspective of what is known about student learning and effective teaching. The criteria for making such judgments are derived from research on learning and teaching and from the craft knowledge of experienced educators. In the context of science literacy, summaries of these have been formulated in Chapter 13: Effective Learning and Teaching of Science for All Americans; in Chapter 15: The Research Base of Benchmarks for Science Literacy; and, for science education alone, in Chapter 3: Science Teaching Standards of the National Science Education Standards.
From those sources, seven criteria clusters have been identified to serve as a basis for the instructional analysis. (One could view these as standards for instructional materials.) A draft of the specific questions within each cluster is shown below. The proposition here is that (1) ideally, all questions within each cluster would be well addressed in a material — they are not alternatives; and (2) this analysis has to be made for each benchmark separately — if we are serious about having science-literate high school graduates, then we want to focus effective instruction on every single one of the important ideas in Science for All Americans.
Cluster I, Providing a Sense of Purpose. Part of planning a coherent curriculum involves deciding on its purposes and on what learning experiences will likely contribute to achieving those purposes. But while coherence from the designers' point of view is important, it may be inadequate to give students the same sense of what they are doing and why. This cluster includes criteria to determine whether the material attempts to make its purposes explicit and meaningful, either by itself or by instructions to the teacher.
* Excerpt from Roseman, J.E., S. Kesidou, and L. Stern. 1997. Identifying Curriculum Materials for Science Literacy: A Project 2061 Evaluation Tool. Based on a paper prepared for the colloquium "Using the National Science Education Standards to Guide the Evaluation, Selection, and Adaptation of Instructional Materials," National Research Council, November 10-12, 1996. See <http://project2061.aaas.org/newsinfo/research/roseman/roseman2.html>.
Framing. Does the material begin with important focus problems, issues, or questions about phenomena that are interesting and/or familiar to students?
Connected sequence. Does the material involve students in a connected sequence of activities (versus a collection of activities) that build toward understanding of a benchmark(s)?
Fit of frame and sequence. If there is both a frame and a connected sequence, does the sequence follow well from the frame?
Activity purpose. Does the material prompt teachers to convey the purpose of each activity and its relationship to the benchmarks? Does each activity encourage each student to think about the purpose of the activity and its relationship to specific learning goals?
Cluster II, Taking Account of Student Ideas. Fostering better understanding in students requires taking time to attend to the ideas they already have, both ideas that are incorrect and ideas that can serve as a foundation for subsequent learning. Such attention requires that teachers be informed about the prerequisite ideas/skills needed for understanding a benchmark and about their students' initial ideas — in particular, the ideas that may interfere with learning the scientific story. Moreover, teachers can better address students' ideas if they know what is likely to work. This cluster examines whether the material contains specific suggestions for identifying and relating to student ideas.
Prerequisite knowledge/skills. Does the material specify prerequisite knowledge/skills that are necessary to the learning of the benchmark(s)?
Alerting to commonly held ideas. Does the material alert teachers to commonly held student ideas (both troublesome and helpful) such as those described in Benchmarks Chapter 15: The Research Base?
Assisting the teacher in identifying students' ideas. Does the material include suggestions for teachers to find out what their students think about familiar phenomena related to a benchmark before the scientific ideas are introduced?
Addressing commonly held ideas. Does the material explicitly address commonly held student ideas?
Assisting the teacher in addressing identified students' ideas. Does the material include suggestions for teachers on how to address ideas that their students hold?
Cluster III, Engaging Students with Phenomena. Much of the point of science is explaining phenomena in terms of a small number of principles or ideas. For students to appreciate this explanatory power, they need to have a sense of the range of phenomena that science can explain. "Students need to get acquainted with the things around them — including devices, organisms, materials, shapes, and numbers — and to observe them, collect them, handle them, describe them, become puzzled by them, ask questions about them, argue about them, and then try to find answers to their questions." (SFAA, p. 201) Furthermore, students should see that the need to explain comes up in a variety of contexts.
First-hand experiences. Does the material include activities that provide first-hand experiences with phenomena relevant to the benchmark when practical and, when not practical, make use of videos, pictures, models, simulations, etc.?
Variety of contexts. Does the material promote experiences in multiple, different contexts so as to support the formation of generalizations?
Questions before answers. Does the material link problems or questions about phenomena to solutions or ideas?
Cluster IV, Developing and Using Scientific Ideas. Science for All Americans includes in its definition of science literacy a number of important yet quite abstract ideas — e.g., atomic structure, natural selection, modifiability of science, interacting systems, common laws of motion for earth and heavens. Such ideas cannot be inferred directly from phenomena, and the ideas themselves were developed over many hundreds of years as a result of considerable discussion and debate about the cogency of theory and its relationship to collected evidence.
Science literacy requires that students see the link between phenomena and ideas and see the ideas themselves as useful. This cluster includes criteria to determine whether the material attempts to provide links between phenomena and ideas and to demonstrate the usefulness of the ideas in varied contexts.
Building a case. Does the material suggest ways to help students draw from their experiences with phenomena, readings, activities, etc., to develop an evidence-based argument for benchmark ideas? (This could include reading material that develops a case.)
Introducing terms. Does the material introduce technical terms only in conjunction with experience with the idea or process and only as needed to facilitate thinking and promote effective communication?
Representing ideas. Does the material include appropriate representations of scientific ideas?
Connecting ideas. Does the material explicitly draw attention to appropriate connections among benchmark ideas (e.g., to a concrete example or instance of a principle or generalization, to an analogous idea, or to an idea that shows up in another field)?
Demonstrating/modeling skills and use of knowledge. Does the material demonstrate/model or include suggestions for teachers on how to demonstrate/ model skills or the use of knowledge?
Practice. Does the material provide tasks/questions for students to practice skills or using knowledge in a variety of situations?
Cluster V, Promoting Student Reflection. No matter how clearly materials may present ideas, students (like all people) will make their own meaning out of them. Constructing meaning well is facilitated by having students (a) make their ideas and reasoning explicit, (b) hold them up to scrutiny, and (c) recast them as needed. This cluster includes criteria for whether the material suggests how to help students express, think about, and reshape their ideas to make better sense of the world.
Expressing ideas. Does the material routinely include suggestions (such as group work or journal writing) for having each student express, clarify, justify, and represent his/her ideas? Are suggestions made for when and how students will get feedback from peers and the teacher?
Reflecting on activities. Does the material include tasks and/or question sequences to guide student interpretation and reasoning about phenomena and activities?
Reflecting on when to use knowledge and skills. Does the material help or include suggestions on how to help students know when to use knowledge and skills in new situations?
Self-monitoring. Does the material suggest ways to have students check their own progress and consider how their ideas have changed and why?
Cluster VI, Assessing Progress. There are several important reasons for monitoring student progress toward specific learning goals. Materials that provide a collection of assessment alternatives can ease the creative burden on teachers and increase the time available to analyze student responses and to adjust instruction based on them. This cluster includes criteria for whether the material includes a variety of goal-relevant assessments.
Alignment to goals. Assuming a content match of the curriculum material to this benchmark, are assessment items included that match the content?
Application. Does the material include assessment tasks that require application of ideas and avoid allowing students a trivial way out, like using a formula or repeating a memorized term without understanding?
Embedded. Are some assessments embedded in the curriculum along the way, with advice to teachers as to how they might use the results to choose or modify activities?
Cluster VII, Enhancing the Learning Environment. Many other important considerations are involved in the selection of curriculum materials — for example, the help they provide teachers in encouraging student curiosity and creating a classroom community where all can succeed, or the material's scientific accuracy or attractiveness. Each of these can influence student learning, even whether the materials are used. The criteria listed in this cluster provide reviewers with the opportunity to comment on these and other important features.
Teacher content learning. Would the material help teachers improve their understanding of science, mathematics, and technology and their interconnections?
Classroom environment. Does the material help teachers to create a classroom environment that welcomes student curiosity, rewards creativity, encourages a spirit of healthy questioning, and avoids dogmatism?
Welcoming all students. Does the material help teachers to create a classroom community that encourages high expectations for all students, that enables all students to experience success, and that provides all different kinds of students a feeling of belonging to the science classroom?
Connecting beyond the unit. Does the material explicitly draw attention to appropriate connections to ideas in other units?
Other strengths. What, if any, other features of the material are worth noting?
JUDGING HOW WELL MATERIALS ASSESS SCIENCE LEARNING GOALS: AN EXCERPT FROM A PROJECT 2061 REPORT*
CRITERION 1. ALIGNING TO GOALS. Assuming a content match between the curriculum material and the benchmark, are assessment items included that match the same benchmark?
Indicators of meeting the criterion:
- The specific ideas in the benchmark are necessary in order to respond to the assessment items.
- The specific ideas in the benchmark are sufficient to respond to the assessment items (or, if other ideas are needed, they are not more sophisticated and have been taught earlier).
CRITERION 2. TESTING FOR UNDERSTANDING. Does the material assess understanding of benchmark ideas and avoid allowing students a trivial way out, like repeating a memorized term or phrase from the text without understanding?
Indicators of meeting the criterion:
- Assessment items focus on understanding of benchmark ideas (as opposed to recall).
- Assessment items include both familiar and novel tasks.
CRITERION 3. INFORMING INSTRUCTION. Are some assessments embedded in the curriculum along the way, with advice to teachers as to how they might use the results to choose or modify activities?
Indicators of meeting the criterion:
- The material uses embedded assessment as a routine strategy (rather than just including occasional questions).
- The material suggests how to probe beyond students' initial responses to clarify and further understand student answers.
- The material provides specific suggestions to teachers about how to use the information from the embedded assessments to make instructional decisions about what ideas need to be addressed by further activities.