
Train-the-Trainer Pilot Courses for Incident Responders and Managers (2013)

Chapter: Chapter 2 - Research Approach

Suggested Citation:"Chapter 2 - Research Approach." National Academies of Sciences, Engineering, and Medicine. 2013. Train-the-Trainer Pilot Courses for Incident Responders and Managers. Washington, DC: The National Academies Press. doi: 10.17226/22585.


Chapter 2 - Research Approach

The research approach for the L32A project was designed to accomplish the following objectives:

• Pilot-test the National TIM Responder train-the-trainer curriculum and support material developed in SHRP 2 Project L12.
• Revise and finalize the curriculum and train-the-trainer support material based on input from the Technical Expert Task Group (TETG) and feedback from training participants.
• Evaluate the effectiveness of the train-the-trainer course and materials for preparing trainers to deliver training through the FHWA-sponsored national roll-out.

Figure 2.1 illustrates the research approach for L32A. A TETG provided input and helped to shape the research approach, which the research team initiated with the kickoff (KO) meeting (Task 2), conducted in December 2011. The approach involved the following activities:

• The Research Team ("the team") maintained an amplified work plan (Task 1).
• The team adjusted the training curriculum (Task 3) based on TETG and pilot participant feedback.
• The team delivered a series of four pilot train-the-trainer workshops (Task 6A), each taught by a pair of subject matter expert (SME) trainers, to multidisciplinary student-trainer audiences in Tennessee, Virginia, and Montana. A Transitional Workshop (Task 9) was originally envisioned to occur at the end of the task and to have no evaluation component. FHWA requested, and the team agreed, to conduct the Transitional Workshop earlier. The team conducted this workshop in Florida and treated it as an additional fully evaluated pilot, even though this was not originally scoped.
• The team conducted an Alumni-Led Pilot (Task 6B) in Tennessee, led by a team of selected student-trainers who had satisfactorily completed the train-the-trainer course. The alumni-led pilot allowed the team to examine full delivery of the training program by graduates of one of the preceding train-the-trainer pilots.
• The team oversaw logistical arrangements for the pilot workshops (Task 4).
• The team developed a feedback tool for obtaining participant feedback (Task 5).
• The team developed an evaluation plan (Task 6C) to assess the train-the-trainer curriculum and materials.
• The team developed a Final Report (Tasks 7 and 8).

The research approach solicited extensive student input and feedback on every aspect of the train-the-trainer program, including the proposed selection criteria for candidate trainers, and cross-referenced all feedback and evaluation results with student profiles (e.g., discipline, years of training, and field experience). These results are summarized in Chapter 3, Findings and Applications.

The following sections summarize each subtask of the project.

Amplified Work Plan (Task 1)

The team maintained an amplified work plan throughout the project's duration. A full explanation of the work plan can be found in Appendix A. The work plan involved the following adjustments:

• The National Academy of Sciences issued a limited authorization to proceed with Task 2 in early November 2011. The team coordinated, prepared for, and conducted the project KO meeting in December 2011, before receiving approval to develop the Amplified Work Plan (Task 1).
• In April 2012, 4 months later, the team received authorization to proceed and immediately began scheduling the five pilots within a 4-month period.
• In response to requests from FHWA and the accelerated workshop delivery cycle, the team made changes to the curriculum and materials progressively, rather than adhering to the two curriculum change cycles originally scoped in the work plan.
• At FHWA's request, the team received and processed feedback from the transitional workshop (conducted in Florida), although this workshop was not originally scoped to have an evaluation component.

Figure 2.2 illustrates the overall project timeline.

Figure 2.1. Summary of research approach for SHRP 2 Reliability Project L32A. The numbers within parentheses are the number of pilots by category.

Figure 2.2. Project timeline.

Kickoff Meeting (Task 2)

The team conducted the project KO meeting at the Transportation Research Board in Washington, D.C., on December 20 and 21, 2011. On the first day the team focused on an overview of the L32A project and scope and a discussion of candidate training locations and criteria for train-the-trainer students. Twelve members of the TETG and four additional team TIM SMEs participated in a detailed curriculum review on the second day (December 21). Key outcomes of this meeting are summarized as follows:

1. The TETG approved the research approach and criteria for selecting pilot locations, which included
   • Mix of well-established and emerging TIM programs;
   • Geographical diversity;
   • Mix of TIM program leadership models (DOT-led versus law enforcement-led programs);
   • Presence of a multidisciplinary TIM program in the state;
   • Support of a strong agency champion for TIM in the state;
   • Demonstrated commitment to training;
   • State's expressed willingness to commit necessary resources and personnel to training; and
   • Ability to accommodate the training in the required time frame.

   In Figure 2.3, the final pilot workshop locations, selected after the project KO meeting, are highlighted in red (the states shaded in beige are the original pilot locations from Project L12). Table 2.1 summarizes the locations and the rationale for choosing each.

   As the project unfolded and the first pilots were held, the team (SHRP 2 staff and FHWA) decided to convert the transitional workshop (Task 9) into a fourth pilot course, held in Florida and led by master instructors.

Figure 2.3. Final Project L32A pilot locations (in red). (The L12 pilots were in Georgia and Indiana.)

Table 2.1. Summary of Rationale for Selection of SHRP 2 Project L32A Pilot Locations

Tennessee
• Opportunity to test the L32A curriculum in an area with a mature TIM program interested in innovations
• State Patrol interested in implementing a more comprehensive TIM training program across the state
• Ability to attract multidisciplinary participants from across four regions of the state

Virginia
• Strong TIM champion in the form of the statewide TIM Committee, chaired by law enforcement
• Renewed emphasis on roadside incident management from a service-patrol perspective
• Incident management coordinators with backgrounds in both law enforcement and fire provide a multiagency collaborative infrastructure to support the multidisciplinary TIM approach of the L32A curriculum
• Ability to attract multidisciplinary participants from across the state

Montana
• Opportunity to pilot-test the training in a more rural state with less structured TIM activities to date
• Ability to attract multidisciplinary participants from across four regions of the state

Florida
• Statewide commitment to TIM
• Strong law enforcement interest in TIM resulting from FHWA's initial outreach
• Ability to attract multidisciplinary participants from across four regions of the state

2. The TETG agreed to formally name the course the "National TIM Responder Training." Outside the SHRP 2 Project L32A project group, the training was referred to by this name from this point forward.

3. The TETG approved an initial list of consensus-based change requests to the curriculum. The team conducted a meticulous review of the curriculum and supporting materials and documented approximately 160 TETG and SME comments in a change log. Participants reviewed the lessons sequentially, moving segment by segment within lessons. Two note takers recorded comments and change agreements, and the meeting was also recorded to ensure accuracy. Consensus was defined as either full agreement by all participants or the absence of an expressed objection or concern by any participant to a proposed change agreement.

   Figure 2.4 depicts the structure of the change log that the team used throughout the project to track changes made to the curriculum across each of the pilots. For each comment, the log documented the comment, proposed resolution, comment source, lesson and slide reference, and products affected (training PowerPoint presentation, instructor guide, or student workbook).

Figure 2.4. Comment log snapshot.

4. The TETG agreed with the student-trainer criteria recommended by the team, as noted in Table 2.2.

Table 2.2. Recommended Student-Trainer Criteria for the Train-the-Trainer Course

• TIM-related field experience: minimum of 5 years in the field
• Willingness to participate in the full course: agreement to participate in the course
• Experience as an instructor: recognized as an instructor in his or her specific discipline
• Commitment to multidisciplinary TIM (desirable/preferred): member of a multidisciplinary TIM task force, working group, or committee; National Incident Management System (NIMS) training, particularly Incident Command System (ICS) 100, 200, and 700

Curriculum Changes

The National TIM Responder Training curriculum and train-the-trainer materials consist of two packages: the Core Training course (a presentation covering 12 lessons, two of which are practicum, with an accompanying student workbook and instructor guide) and the train-the-trainer course (a presentation covering five lessons, with an accompanying student workbook and instructor guide). The curriculum and materials underwent nearly 1,500 discrete improvement adjustments through the course of the KO meeting and the five pilot workshops. As noted earlier, rather than collect, adjudicate, and implement approved changes in two change cycles, the team collected and made progressive changes throughout. Figure 2.5 depicts the categories of adjustments made to the curriculum. Table 2.3 summarizes the sources of the changes over the course of the research approach. Types of changes included refined content, expanded instructor notes, new or updated imagery, updated or refined messaging (e.g., emphasis on quick clearance to balance safety-related messaging in the training), and typographical corrections.

Figure 2.5. Curriculum modification profile.

Table 2.3. Curriculum Modification Sources

KO Meeting: 169
Nashville, Tennessee: 112
Virginia: 156
Montana: 153
Florida: 302
Alumni-led pilot (Knoxville, Tennessee): 80
Other: 512
Total: 1,484

Course Planning and Logistics

Planning for each confirmed pilot workshop began 4 to 6 weeks before the course because of the schedule (up to 8 weeks of lead time is recommended). The team provided the following logistical support to each pilot workshop:

• Conducted a minimum of three conference calls with local planning point-of-contact (POC) teams:
   - Initial planning call;
   - Mid-term planning call; and
   - Final planning call.
   - Optional: a call with master instructors to support customization of training delivery to regional needs.
• Established online registration to capture training participant demographics. In addition to pertinent contact information, registrants were asked to provide other professional details, such as agency, role, years of TIM experience, years of training experience, and NIMS and ICS course experience. This information was critical for properly evaluating course effectiveness relative to the different experiential perspectives of student-trainers. With the added professional information, the team was able to correlate perspectives on the course, as well as course performance, by discipline, level of expertise, and level of experience as a trainer.
• Developed invitational language that workshop hosts could use as the basis for initial and follow-up or reminder correspondence to recruit participants for the training.
• Developed and sent a welcome package to registered attendees.

• Provided registrants with logistical support correspondence:
   - Two weeks before the course, registrants received an e-mail confirming their participation in the 2-day course and, if necessary, hotel lodging information.
   - One week before the course, a full participant package was e-mailed to registrants, including the course location and directions, items to bring, and a draft agenda. A sample participant package is provided in Appendix B.
   - Two days before the course, a final reminder e-mail was sent to attendees.
• Arranged catering as needed for breakfast, lunch, and snacks.
• Shipped training materials 1 week before training.

During the planning calls, the team completed the workshop planning checklist shown in Table 2.4. Figure 2.6 depicts the recommended course logistics timetable.

Table 2.4. Pilot Workshop Planning Checklist

General
• Workshop dates
• Workshop location
• Instructors (SAIC provided)
• Desired participant mix: agencies and student-trainer criteria (SAIC helped coordinate/provide)
• Background on TIM program and history, including sensitivities

Pre-Workshop Coordination
• Invitation list and contacts
• Invitational language (SAIC provided)
• Participant recruitment support or status: registration database (SAIC helped coordinate/provide)
• Meeting space (SAIC helped coordinate/provide)
• Refreshments (SAIC helped coordinate/provide)
• Lodging arrangements (SAIC helped coordinate where needed)
• Customization desires (SAIC helped coordinate)
• Participant package and read-ahead materials (SAIC provided)

Workshop Execution
• Event setup (SAIC provided)
• Meeting materials (SAIC provided)
• Feedback (SAIC provided)
• Exam (SAIC provided)
• Meeting recording (SAIC provided)

Post-Workshop
• Follow-up report (SAIC provided)
• Certificates of Completion and Professional Development Unit support (SAIC provided)

Evaluation Methodology

The team structured the evaluation methodology and tools used for the L32A project to assess the sufficiency of the materials and instructional methods employed to prepare candidate instructors (train-the-trainer students) to deliver the National TIM Responder Training effectively. The team employed a multilevel feedback approach with participants in train-the-trainer classes, as well as with student audiences trained by novice instructors. The approach invited students to provide feedback on four aspects of the training at multiple points in the training experience:

1. Units and lessons: content and visuals, including specific slides;
2. Training delivery;
3. Course structure and teaching methods (presentation, interaction, experiential elements, duration); and
4. Self-assessment of preparedness (i.e., both in terms of the trainer's criteria and the sense of readiness and preparedness on training completion).

Participants were invited to provide feedback at the following intervals:

• Before the start of training (regarding the sufficiency of advance information shaping expectations);
• During any mid-day breaks; and
• At the completion of each day.

The L32A team employed a testing and evaluation methodology similar to that of the original L12 training program, based on the application of Levels 1 and 2 of the Kirkpatrick four-level assessment model. The Kirkpatrick four-level assessment model is provided in Table 2.5. This methodology accomplished Kirkpatrick Levels 1 and 2 assessments as follows:

• Identified any variables affecting participant attitude toward learning, that is, student reaction and response to the instructional flow, instructor, facilities, equipment, resources, and so forth, and perceptions of the extent to which the instructional techniques and materials prepared the student to perform as an instructor and lead the training.

• Identified where, through test item analysis, there may be discrepancies in testing relative to course delivery such that testing is not providing meaningful results; that is, the testing is
   - Not testing against the instructional content;
   - Not specifically testing against the learning objectives;
   - Not effectively constructed; or
   - Any combination of the above.
• Identified areas of instruction that are not accomplishing the learning objectives for specific segments of the learning populations.

Table 2.5. Kirkpatrick Four-Level Assessment Model

1. Student Reaction: Measurement of the student's response to training.
2. Demonstrated Learning: Measurement of the student's acquisition of the required skills, attitudes, and knowledge obtained through training.
3. Transfer of Learning to the Workplace (Behavior): Measurement of the student's ability to implement new skills and attitudes in the workplace.
4. Workplace Results: Measurement of the impact training had on key business strategies or indices.

Reaction, or Level 1, evaluation instruments are typically used to determine how students felt about the training course they just received. These assessments obtain subjective input about training design, delivery, and logistics.

Level 2 data, the exam results, measure the degree of change related to learning. Learning occurs when the specific training objectives are met: a change in skills, knowledge, or attitudes is demonstrated through either academic- or performance-based testing. Learning can be defined as the extent to which participants change attitudes, improve knowledge, or increase skill as a result of attending the program, or any combination thereof.

Finally, the team also created a tool to solicit feedback from novice instructors based on their experience of teaching the alumni-led pilot course (a second Level 1 [Reaction] tool). This tool helped identify areas in which, for example, candidate instructors were consistently experiencing difficulty teaching or would benefit from additional instructor direction or clarification.

The evaluation tools, in addition to observer input, served as the source of insights in the following seven areas: (1) sufficiency of materials and instructional methods to prepare instructors; (2) course length; (3) instructor criteria; (4) achievement of learning objectives; (5) multidisciplinary emphasis of training; (6) curriculum changes; and (7) logistical lessons learned.

Figure 2.6. Course logistics timetable:
• Eight weeks prior: initial planning call.
• Seven weeks prior: venue confirmed.
• Six weeks prior: initial participant list generated.
• Five weeks prior: midterm planning call; classroom needs assigned.
• Four weeks prior: invitations sent.
• Three weeks prior: optional course customization call.
• Two weeks prior: registration confirmation sent.
• One week prior: final planning call; course materials sent; participant package sent.
• Week of event: pre-event meeting; classroom props obtained; final reminder sent.
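As a rough illustration (not part of the original report), the lead times in Figure 2.6 could be turned into concrete calendar dates once a course date is fixed. The milestone labels below paraphrase the figure; the function name and example date are our own.

```python
from datetime import date, timedelta

# Lead times in weeks before the course, paraphrased from Figure 2.6.
# This structure is illustrative only; it is not taken from the project.
MILESTONES = {
    8: "Initial planning call",
    7: "Venue confirmed",
    6: "Initial participant list generated",
    5: "Midterm planning call; classroom needs assigned",
    4: "Invitations sent",
    3: "Optional course customization call",
    2: "Registration confirmation sent",
    1: "Final planning call; course materials and participant package sent",
    0: "Pre-event meeting; classroom props obtained; final reminder sent",
}

def planning_schedule(course_date):
    """Return (date, milestone) pairs, earliest milestone first."""
    return [
        (course_date - timedelta(weeks=weeks), label)
        for weeks, label in sorted(MILESTONES.items(), reverse=True)
    ]

# Example: a hypothetical course date, not one of the actual pilot dates.
for day, label in planning_schedule(date(2012, 6, 18)):
    print(day.isoformat(), "-", label)
```

The 8-week horizon matches the recommended maximum lead time noted in the Course Planning and Logistics section; in practice the pilots compressed this to 4 to 6 weeks.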

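The test item analysis described in the evaluation methodology above (flagging exam items that are not testing the content or are poorly constructed) is commonly carried out with difficulty and discrimination indices. The sketch below is our illustration of that general technique, not the project's actual procedure; all names and the toy data are hypothetical.

```python
# Each student is one row of 0/1 (incorrect/correct) item scores.

def item_difficulty(item_scores):
    """Proportion of students answering the item correctly (0.0 to 1.0)."""
    return sum(item_scores) / len(item_scores)

def item_discrimination(matrix, item, group_frac=1 / 3):
    """Upper-group minus lower-group proportion correct for one item.
    Values near zero (or negative) flag items that may not be testing
    the instructional content or may be poorly constructed."""
    ranked = sorted(matrix, key=sum)           # students ranked by total score
    n = max(1, int(len(ranked) * group_frac))  # size of each comparison group
    lower, upper = ranked[:n], ranked[-n:]
    return (sum(s[item] for s in upper) - sum(s[item] for s in lower)) / n

# Toy data: 6 students by 3 items.
scores = [
    [1, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 1],
]
col = [s[0] for s in scores]
print(item_difficulty(col))             # about 0.67: most students got item 0
print(item_discrimination(scores, 0))   # 1.0: item 0 separates high/low scorers
```

An item with high difficulty (nearly everyone correct) and low discrimination would be a candidate for the kind of rework the bullets above describe.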
Evaluation Tools

The team employed the three tools described below to accomplish the evaluation. Each of these tools is provided in Appendix D. Results of the evaluation are summarized in Appendix E.

• Two Kirkpatrick Level 1 (Reaction) tools:
   - A. Participant/Student Feedback Form: a Kirkpatrick Level 1 (Reaction) evaluation completed by students at the end of class that measures how the student feels about, or reacts to, the training. The 36-question form, presented in Appendix D, was distributed to course attendees at the completion of the course. The form solicited participant feedback on course scheduling, instructor quality, overall training satisfaction, time-saving potential, and instructor materials.
   - B. Novice Instructor Feedback Form (used during the alumni-led pilot only): a Kirkpatrick Level 1 (Reaction) evaluation completed by novice instructors upon completion of their first training in the role of instructor (after completing the train-the-trainer course). This tool assessed how prepared the novice instructor felt to lead the training.
• One Kirkpatrick Level 2 (Learning) tool:
   - C. Student Assessment: a Kirkpatrick Level 2 (Learning) assessment consisting of a bank of questions that tie directly into the course objectives and measure student knowledge at the end of instruction.

A. Participant/Student Feedback Form

The Participant/Student Feedback Form consisted of 36 questions on the specific training components described in Table 2.6. Participants were asked to respond to each question using a 5-point Likert scale ranging from Strongly Agree to Strongly Disagree.

Table 2.6. Participant/Student Feedback Form Profile

• Scheduling: 3 questions, plus 1 open-ended question for comments or explanation
• Instructors: 6 questions, plus 1 open-ended question for comments or explanation
• Overall Training: 12 questions, plus 1 open-ended question for comments or explanation
• Time-Saving Measures: 1 question, plus 1 open-ended question for comments or explanation
• Instructor Materials: 6 questions, plus 1 open-ended question for comments or explanation

Three additional questions asked participants to provide input on any potential gaps or omissions in the training, any shortcomings of the training, and the most valuable takeaway from the training. Participants completed the feedback form at the conclusion of the training course. This tool can be found in Appendix D, and participant feedback on the individual pilot deliveries can be found in the individual pilot summary reports in Appendix C. The data from the course evaluations were analyzed after each course to identify potential trends that could be addressed before the next course offering. For instance, feedback from the first Tennessee pilot that some of the content felt "rushed" led the team to develop the instructor pacer guide.

B. Novice Instructor Feedback Form

This eight-question feedback form was administered to the novice instructors who led the alumni pilot course. The form solicited feedback from the novice instructors on how prepared they felt to lead the course, based on their completion of the train-the-trainer course and the preparation instructions and materials they would use. It invited their feedback on how well the structure, content, and organization of the trainer materials would enable instructors to help students achieve the learning objectives. It also invited their feedback and suggestions on the time allocated to the various lessons in terms of achieving the learning objectives. Finally, it invited their unconstrained suggestions on altering the structure or format of the course to improve its effectiveness in preparing trainers to help students achieve the learning objectives.

C. Student Assessment

The student assessment provided data on the extent to which (1) the lesson design satisfied the learning objectives and (2) the training changed participant attitudes, improved their knowledge, increased their skills, or any combination thereof. The student assessment questions were based on specific content in each of the training lessons, as described in Table 2.7.

Instructional Methods

Each pilot course, with the exception of the alumni-led pilot in Knoxville, Tennessee, was led by two instructors from different disciplines: one with a fire background, the other with either a law enforcement or a state DOT background. The instructors alternated who led each lesson, although both provided input on the content or responded to student questions where appropriate. At the alumni-led pilot, there were nine instructors: four from the Tennessee DOT, three from fire departments, and two from law enforcement. Two instructors taught each lesson, and the instructors decided in

12 and set of training materials, regardless of where they receive the training. Instructor Materials • Core Instructor Guide: This guide helps the instructor set up the classroom, provides practical tips to make the learn- ing process more engaging, and includes the course lessons and exercises with step-by-step instructions that enable the instructor to provide the material in the appropriate man- ner. It also includes answer keys for all classroom activities to ensure consistent delivery across all training sites. It also has a place for instructor notes. • Core PowerPoint Presentation: The presentation is designed to aid, enhance, and guide the instructor’s presentation to the classroom. It serves to focus the students on the key objectives of the training by using a combination of text, video, and graphic elements, such as images, charts, and diagrams. The presentation is designed in Microsoft PowerPoint 2010 with associated video files. • Train-the-Trainer Instructor Guide: Specific to the train- the-trainer portion of the course, this guide helps the instructor set up the classroom, provides practical tips to make the learning process more engaging, and includes the course lessons with step-by-step instructions to enable the instructor to provide the material in the appropriate man- ner. It also includes answer keys for all classroom activities to ensure consistent delivery across all training sites. It also has a place for instructor notes. • Train-the-Trainer PowerPoint Presentation: The presentation aids, enhances, and guides the instructor’s presentation advance what lessons they would teach so they could focus their preparation time accordingly. In all of the pilots, the instructors followed the core content of the SHRP 2 Project L12 curriculum materials so that stu- dents could follow along in their workbooks. 
However, they also emphasized key teaching points to aid future trainers of the course, such as important messages that need to be rein- forced to students or certain questions or concerns that stu- dents may raise in specific parts of the course. The team observers captured these comments for incorporation into the updated instructor guide. To facilitate cross-disciplinary discussion, student seating was assigned so that no two responders from the same agency or organization sat next to each other (e.g., two law enforcement students were not seated next to each other). For the hands-on tabletop activity, the class was divided into groups so that each group had a diverse assortment of responder types represented. In addi- tion, students received name tags color-coded by discipline so that both the instructors and other students could easily identify the backgrounds of their fellow responders. A full suite of classroom instructional materials (listed in Table 2.8) was provided to both instructors and students. Having such a suite available ensures consistent delivery of the core training content; when a course is intended to be delivered by multiple instructors in multiple locations, this approach ensures that all instructors can follow a cohesive course outline and students receive a consistent course delivery Table 2.7. Curriculum Lessons Lesson Number Approximate Lesson Length (from Pacer Guide) (minutes) Number of Questions 0 – Course Introduction 47–49 na 1 – Statistics, Terminology, and Structure 39–55 12 2 – Notification and Response 20–23 3 3 – Arrival 73–96 12 4 – Initial Size­Up 30–32 8 5 – Command Responsibilities 18–24 10 6 – Safety, Patient Care, and Investigation 57–68 17 7 – Traffic Management 85–99 15 8 – Removal 50–60 9 9 – Termination 5–10 7 Note: na = not applicable. Table 2.8. 
Classroom Instructional Materials Instructor (Four Train- the-Trainer pilots) Student (Alumni-led pilot) Classroom (All pilots) Core Instructor Guide Core Student Workbook Tabletop Roadways Core PowerPoint Train­the­Trainer Student Workbook Staging Pads Train­the­Trainer Instructor Guide Assessment Best Practice Sheets Train­the­Trainer PowerPoint Participant Feedback Form Model Vehicles Assessment Answer Key Classroom Poster Classroom Roster Responder Actions Checklists

• Core PowerPoint: This presentation brings the course content to the classroom. It serves to focus the students on the key objectives of the training by using a combination of text, video, and graphic elements, such as images, charts, and diagrams. The presentation is designed in Microsoft PowerPoint 2010 with associated video files.
• Assessment Answer Key: This includes the answers to the student assessment questions and is used to grade student performance.
• Classroom Roster: This tool enables the instructor to track classroom attendance easily. It also captures participant information, such as years of TIM field experience and agency or organization, so that instructors can easily see the breakdown of their class by experience level and discipline.

Student Materials (for Train-the-Trainer Pilots)

• Core Instructor Guide: Students were provided with this guide in the train-the-trainer pilots so they could follow along as the instructors led the course and see how the content in the guide translated into the presentation of the materials.
• Train-the-Trainer Instructor Guide: See the Core Instructor Guide description; students were provided with this guide in the train-the-trainer pilots for the same purpose.
• Train-the-Trainer Student Workbook: This workbook contains all student-related lesson content, including exercises, case studies, and scenarios. It also contains a full bibliography of the reference materials used to create the content, as well as copies of peripheral third-party items, such as brochures and reference cards. It also includes a place for student notes.
• Assessment: This is a Kirkpatrick Level 2 (Learning) assessment consisting of a bank of questions that tie directly into the course objectives and measure student knowledge at the end of instruction.
• Participant/Student Feedback Form: This is a Kirkpatrick Level 1 (Reaction) evaluation that is completed by students at the end of class and measures how the students feel about, or react to, the training.

Student Materials (Alumni-led Pilot)

• Core Student Workbook: This workbook contains all student-related lesson content, including exercises, case studies, and scenarios. It also contains a full bibliography of the reference materials used to create the content, as well as copies of peripheral third-party items, such as brochures and reference cards. It also includes a place for student notes.
• Assessment: See preceding description.
• Participant/Student Feedback Form: See preceding description.

Classroom Materials

• Tabletop Roadway Scenes: These consist of five different roadway scenes (a city surface street, a rural road, a limited-access highway, high-occupancy vehicle [HOV] lanes, and an overpass ramp) that are used to create incident scenes during the hands-on tabletop activity.
• Staging Pads: Staging pads are used as a holding area for responder model vehicles during the hands-on tabletop activity.
• Model Vehicles: These are civilian and responder vehicles, such as matchbox cars, used to simulate accidents and response steps during the hands-on tabletop activity.
• Responder Action Best-Practice Sheets: These sheets offer best practices in incident response and are placed on each table during the hands-on tabletop activity (Lesson 11) for each group's reference.
• Quick Clearance Time Line Classroom Poster: This visual is used in the classroom as a reference point for students regarding key incident response phases, showing how minutes saved in quick clearance contribute to both travel time reliability and safety objectives.

Added Materials

As a result of observations and feedback from the pilot deliveries, the team added the following to the suite of materials:

• Pacer Guide: After the first pilot delivery in Nashville, Tennessee, the team developed a pacer guide (Figure 2.7). It provides timing guidance to instructors by lesson and subsection so they can monitor how much time they can afford to spend on a given lesson, or where they will have to make up time later if they have spent too much on an earlier one.

Figure 2.7. Screenshot of pacer guide developed by the team.

• Photography of Setups: The team added photographs to help instructors set up key course activities, specifically snapshots of the large group lecture forum, the hands-on tabletop activity, and the outdoor or field situational awareness activity. Figure 2.8 depicts an example of the photographic support to activity setup instructions.

Figure 2.8. Example of photographic enhancement to activity setup instructions.

• Quick Clearance Time Line: At the Virginia pilot, students noted that it would be helpful to have a printout of the quick clearance time line graphic placed where they could easily see it when the instructors referenced it throughout the course. Therefore, at subsequent training deliveries, the team provided printouts.

Pilot Course Deliveries

The team conducted four train-the-trainer pilot courses and one alumni-led pilot course taught by graduates of the train-the-trainer course. Summary reports of each course are located in Appendix C. The participant mix for each course is presented in Figure 2.9.

The pilots were conducted at the following locations and dates:

• Pilot 1: Nashville, Tennessee, June 19–20, 2012
• Pilot 2: Richmond, Virginia, June 27–28, 2012
• Pilot 3: Helena, Montana, July 11–12, 2012
• Pilot 4: Fort Lauderdale, Florida, August 8–9, 2012
• Alumni-led Pilot: Knoxville, Tennessee, September 12–13, 2012

Figure 2.9. Participant mix by responder discipline for each pilot delivery.

TRB’s second Strategic Highway Research Program (SHRP 2) Report S2-L32A-RW-1: Train-the-Trainer Pilot Courses for Incident Responders and Managers describes pilot tests of the National Traffic Incident Management train-the-trainer course, the course's revised and finalized curriculum, and an evaluation of its effectiveness.
