Scenario-, Problem-, and Case-Based Teaching and Learning
The primary purpose of the October workshop was to thoughtfully examine the evidence behind a select set of promising practices that came to light during the June workshop. Susan Singer opened the October workshop by linking its agenda to key themes of the June workshop (see Chapter 3). Although these practices are not perfect and do not represent the universe of evidence-based innovations, she said, they are recognized by experts as promising, and each is supported by some evidence.
The promising practices discussed include scenario-, problem-, and case-based teaching and learning (this chapter); assessments to guide teaching and learning (Chapter 5); efforts to restructure the learning environment (Chapter 6); and faculty professional development (Chapter 7). Singer explained that the presentations were based on papers prepared following a template the steering committee developed after the June workshop.1 The authors were asked to describe the context in which the promising practice was implemented, identify examples of how the practice was used, and provide evidence to support the claim that the practice was promising, including evidence of its impact or efficacy.
The template for the papers is available at http://www.nationalacademies.org/bose/Commissioned_Paper_Template.pdf and the papers are available at http://www.nationalacademies.org/bose/PP_Commissioned_Papers.html.
PROBLEM-BASED LEARNING
David Gijbels (University of Antwerp) described the cycle of problem-based learning (see Figure 4-1). After the instructor presents a problem to the class, students meet in small groups to discuss what they know about it and what they need to learn. During a short period of independent self-study, students gather the resources needed to solve the problem. They then reconvene in their small groups to reassess their collective understanding of the problem. Once they have solved the problem, the instructor provides a different problem and the cycle begins anew.
Noting that problem-based learning has many possible definitions and permutations, Gijbels nonetheless stressed the importance of identifying a core set of principles that characterize this type of learning. Having a core definition enables researchers to compare problem-based learning with other types of learning environments. In his research, Gijbels uses a model developed by Howard Barrows (1996) that identifies six characteristics of problem-based learning:
Learning is student-centered.
Learning occurs in small student groups.
Tutor as a facilitator or guide.
Problems form the organizing focus and stimulus for learning.
The problem is the tool to achieve knowledge and problem-solving skills.
New information is acquired through self-directed learning.
Gijbels then described a meta-analysis conducted to examine the effects of problem-based learning on students’ knowledge and their application of knowledge, and to identify factors that mediated those effects (Dochy et al., 2003). The meta-analysis focused on empirical studies that compared problem-based learning with lecture-based education in postsecondary classrooms in Europe, and almost all of the studies that met the criteria focused on medical education.2 Through the analysis, Gijbels and his colleagues found the following:
Students’ content knowledge was slightly lower in problem-based learning courses than in traditional lecture courses.
Although students in problem-based learning environments demonstrated less knowledge in the short term, they retained more knowledge over the long term.
Students in problem-based learning settings were better able to apply their knowledge than students in traditional courses.
These findings prompted Gijbels and his colleagues to undertake a deeper analysis of the assessment of problem-based learning (Gijbels et al., 2005). That analysis focused on three levels of knowledge that were assessed in the selected studies: (1) knowledge of concepts, (2) understanding of principles that link concepts, and (3) the application of knowledge. Gijbels noted that of the 56 studies in the analysis, 31 focused on concepts, 17 focused on principles, and 8 focused on the application of knowledge. The analysis revealed the following:
Students in problem-based learning environments and traditional lecture-based learning environments exhibited no differences in the understanding of concepts.
Students in problem-based learning environments had a deeper understanding of principles that link concepts together.
Students in problem-based learning environments demonstrated a slightly better ability to apply their knowledge than students in lecture-based classes.
The study is described in the workshop paper by Gijbels (see http://www.nationalacademies.org/bose/Gijbels_CommissionedPaper.pdf).
Gijbels concluded by stating that problem-based learning has not completely fulfilled its potential. He suggested that students might become better problem solvers if faculty members assessed them more on problem solving. Noting that students often do not develop a sense of shared cognition when working in teams in problem-based learning environments, he also stressed the importance of attending to group developmental processes when implementing problem-based learning.
CASE-BASED TEACHING
Mary Lundeberg (Michigan State University) defined some key elements of case-based teaching. In the paper she wrote for the workshop (Lundeberg, 2008, p. 1), she said:
Cases involve an authentic portrayal of a person(s) in a complex situation(s) constructed for particular pedagogical purposes. Two features are essential: interactions involving explanations, and challenges to student thinking. Interactions involving explanations could occur among student teams, the instructor and a class; among distant colleagues; or students constructing interpretations in a multimedia environment. Cases may challenge students’ thinking in many ways, e.g., applying concepts to a real life situation; connecting concepts [and/or] interdisciplinary ideas; examining a situation from multiple perspectives; reflecting on how one approaches or solves a problem; making decisions; designing projects; considering ethical dimensions of situations. Brief vignettes, quick examples, or unedited documents are not cases.
She presented four examples to illustrate the wide range of cases that might be used in undergraduate science, technology, engineering, and mathematics (STEM) education:3
The Deforestation of the Amazon: A Case Study in Understanding Ecosystems and Their Value, a problem-based case used in a biology seminar for nonmajors.
Cross-Dressing or Crossing-Over: Sex Testing of Women Athletes, a historical case used in large lecture courses with clicker technology (handheld wireless devices through which students register their responses to multiple-choice questions that are projected on a screen).
Case It!, in-depth problem-based multimedia cases used in biology labs.
Project-based scenarios used in engineering.
For more detail on these cases, see the workshop paper by Lundeberg (see http://www.nationalacademies.org/bose/Lundeberg_CommissionedPaper.pdf).
Citing the National Research Council (2002), Lundeberg identified three types of research questions often investigated in studies of educational activities—those that focus on description, cause, and process. She explained that there is much more descriptive research (i.e., faculty and student perceptions of what is happening) than research showing causal effects or describing the process of learning.
Lundeberg described the research that she and her colleagues have conducted on case-based learning. The descriptive aspects of their research involved surveys of 101 faculty members in 23 states and Canada who were using cases from the National Center for Case Study Teaching in Science (see http://library.buffalo.edu/libraries/projects/cases/case.html). On the surveys, faculty members reported that cases make students more engaged and active learners and help them to develop multiple perspectives, gain deeper conceptual understanding, engage in critical thinking, enhance their communication skills, and develop positive peer relationships (Lundeberg, 2008). Lundeberg also reported that faculty members cited the increased time needed to prepare lessons and assess students as obstacles to implementing case-based learning.
To identify the systematic effects of case-based learning, Lundeberg and her colleagues conducted a year-long study of the use of cases in large undergraduate biology classes equipped with clickers. The study combined a design involving random assignment to experimental and control groups with an A-B-A-B design in which 12 participating faculty members alternated the use of cases and lectures systematically across two semesters. They found that “students (n = 4,366) who responded to cases using ‘clicker’ technology performed significantly better than their peers on five of the eight biology topics (cells, Mendelian genetics, cellular division, scientific method, and cancer), and in five of the eight areas in which they were asked to transfer information (cells, cellular division, scientific method, microevolution and DNA)” (Lundeberg, 2008, p. 8).
Students in the clicker classes also performed significantly better on tests of data interpretation than students in lecture classes. However, students who used cases with clicker technology showed no difference, or performed worse, on standardized tests measuring accumulated medical knowledge, on one biology topic (characteristics of life), and on standardized tests of critical thinking.
Lundeberg argued that cases are effective for several reasons. First, stories are a powerful mechanism for organizing and storing information. In addition, the real-life context engages students. Cases also challenge students’ thinking and require them to integrate knowledge, reflect on their ideas, and articulate them. Lundeberg noted that role-playing during case-based education engages students and enables them to consider multiple perspectives.
In closing, Lundeberg reiterated that cases have an impact on understanding, scientific thinking, and engagement. She cited the need for more multiyear, mixed-methods studies on the effectiveness of case-based teaching, particularly classroom experiments that do not confound instructor or student effects. She also identified several gaps in the knowledge base at the undergraduate level: Which students benefit from cases? What content is most suitable? What benefits do different types of cases afford? What kinds of interaction between students and faculty matter? Do cases promote scientific literacy?
USE OF COMPLEX PROBLEMS IN TEACHING PHYSICS
Tom Foster (Southern Illinois University) discussed the use of complex problems in teaching physics. He explained that complex problems are rooted in cooperative group problem solving, which is characterized by the following traits (Foster, 2008):
positive interdependence among group members;
monitoring of interpersonal skills;
frequent processing of group interactions and functioning; and
aspects of the task or learning activity that require ongoing conversation, dialogue, exchange, and support.
Foster emphasized the importance of designing an appropriate task when using this teaching method. He noted that if the problems are simple enough to be solved reasonably well alone, students will not give up their independence to work in a group. Nor will students commit to group work if the problems are too complex for the group to succeed in solving them at the outset.
Context-rich problems are one example of an appropriate task for group problem solving. Foster creates such problems by converting traditional end-of-chapter problems into complex problems that students solve cooperatively, placing students in the problem by using the word “you.” Foster and his colleagues prefer not to include pictures in the problem, as a way of encouraging the group to decide whether and how to illustrate it. According to Foster, context-rich problems also provide many other decision points to foster ongoing interaction among group members. For example, problems might include extra information, omit information, or leave variables unnamed. These problems also “hide the physics” by avoiding technical terms and focusing on real-world settings. By hiding the physics, the problems demonstrate that the world is rich in physics and require students to determine which fundamental physical principles to apply (Foster, 2008).
In physics, context-rich problems are closed-ended, meaning that there is essentially one correct answer that is dictated by the rules of mathematics and physics. Even though they are closed-ended, the problems still require creativity to define and apply the correct principles and equations. Citing Schwartz, Bransford, and Sears (2005), Foster said that this balance between effectiveness and innovation is vital to the transfer of knowledge from one situation to another.
Foster noted that context-rich problems are an excellent way to challenge students’ misconceptions about problem solving. For example, students often believe that the aim of solving a physics problem is to reduce it to a mathematical exercise, and that it is always necessary to use all the information in a problem. Faculty members can address these misconceptions by structuring the problems differently, as described in previous paragraphs.
In Foster’s experience, it is easy to make context-rich problems too difficult. He and his colleagues have developed a set of 21 “difficulty traits” that fall into the broad categories of approach, analysis of the problem, and mathematical solution. Faculty members can use the traits as a checklist to design context-rich problems and to assess and adjust their level of difficulty.
Turning to the evidence, Foster explained that he uses traditional instruments, such as the Force Concept Inventory and conceptual surveys on electricity and magnetism, to measure students’ concept development. He has found that students who solve context-rich problems in cooperative group settings score as well on these measures as their peers who are taught using other interactive methods. To assess problem solving, Foster uses a rubric developed at the University of Minnesota that includes five dimensions: (1) description of the problem, (2) physics approach (i.e., whether students used the correct physics), (3) specific application of the physics, (4) mathematical procedures, and (5) logical progression. Foster reported that students’ problem-solving abilities improve through the use of context-rich problems, but he cautioned that the method does not result in quantum leaps in problem-solving abilities. Foster called his evidence on students’ attitudes and behaviors about context-rich problems anecdotal but positive.
He closed by identifying future directions for this method of physics instruction. Citing the need to create more context-rich problems in physics, he mentioned problems that begin with an answer and require the formulation of a question (such as on the television show “Jeopardy!”) as well as problems in which students identify and correct errors. He also stressed the importance of developing context-rich problems outside physics to assess the transfer of knowledge from one domain to another.
DISCUSSION
Remarking on the differences in terminology across disciplines, Karen Cummings (Southern Connecticut State University) observed that these differences pose a challenge for researchers. She asked Gijbels how he distinguished between knowledge of concepts and application of knowledge in his study. Gijbels agreed that the distinction is difficult and explained that for his review of the literature he examined the actual assessment questions to determine which type of knowledge each measured. Lundeberg added that it was a challenge for the faculty members in her study to develop assessments that measure higher order thinking, because it is easier for them to write questions that focus on definitions and conceptual knowledge.
Martha Narro (University of Arizona) asked Gijbels to clarify some of the findings that he discussed in his presentation. He explained that, across studies that assessed student learning of concepts, there was no significant difference between students in problem-based and traditional settings. Across studies that assessed student learning of principles and application of conceptual knowledge, however, students in problem-based environments performed better. He also pointed out that the findings varied depending on the context (specifically, whether the students were in their first or last year of medical school) and the curriculum, and that he was reporting on the overall trends in the data.
Responding to another question, Lundeberg and Foster discussed the issue of relevance when constructing scenarios, problems, and cases. They agreed that there is very little research on what it means to be relevant. Lundeberg related several examples of cases that faculty members designed to be relevant but that did not resonate with students. In her experience, allowing students to design their own cases is a powerful way to make the cases relevant. Foster added that many college students are still developing their identities, which makes the notion of relevance more challenging. An audience member, referring to a paper by Mayberry (1998) about pedagogies that encourage students to develop their own sense of science, cautioned faculty members to be careful about coming across as knowing more than students about what is relevant.
Following another question, the speakers engaged in a discussion about the importance of longitudinal research to understand the longer term impact of these pedagogical strategies. Lundeberg mentioned some examples of longitudinal studies of innovative instructional strategies that show mixed results. Foster added that it is difficult to measure long-term knowledge or to trace it back to its origins. As an example, he said that although students might not demonstrate understanding of a concept after a certain