
Train-the-Trainer Pilot Courses for Incident Responders and Managers (2013)

Chapter: Appendix E - Assessment Analysis

Suggested Citation: "Appendix E - Assessment Analysis." National Academies of Sciences, Engineering, and Medicine. 2013. Train-the-Trainer Pilot Courses for Incident Responders and Managers. Washington, DC: The National Academies Press. doi:10.17226/22585.


Introduction

This Pilot Assessment Report presents the findings of a comprehensive analysis of the SHRP 2 train-the-trainer pilot courses, based on the results of the post-course attendee assessment. The research team conducted four train-the-trainer pilot courses and one alumni-led pilot course taught by graduates of the train-the-trainer course. The pilots were conducted at the following locations and dates:

• Pilot 1: Nashville, Tennessee, June 19–20, 2012
• Pilot 2: Richmond, Virginia, June 27–28, 2012
• Pilot 3: Helena, Montana, July 11–12, 2012
• Pilot 4: Fort Lauderdale, Florida, August 8–9, 2012
• Alumni-Led Pilot: Knoxville, Tennessee, September 12–13, 2012

This analysis is part of the research team's ongoing efforts to support the National Academies' pursuit of a high-quality training program for traffic incident responders. The objective of this analysis is to evaluate the effectiveness of the train-the-trainer course and materials in preparing trainers to deliver training through FHWA-sponsored national implementation. Course effectiveness is measured by attendee performance on a 92-question assessment administered at the conclusion of the 2-day course. Through this analysis, it can be determined:

• Whether instructional strategies supported learning objectives.
• Whether the minimum knowledge requirements were met (across incident responder types and experience levels).

Pilot Course Test

Summary of Findings

The assessment was distributed to 162 incident responders participating in one of the five pilot courses. The SHRP 2 team primarily targeted incident responders from six separate disciplines to participate in the course: Law Enforcement, Fire/Rescue, Towing and Recovery, Emergency Medical Services (EMS), Dispatch, and Department of Transportation (DOT). Each participant, under the guidance of the instructor, was issued a test with specific instructions. The test was informally proctored; the instructors were in the room while the students were taking the exams. The exam was not held to a specific time limit.

Demographics

The respondents consisted of 51 representatives of Law Enforcement, 42 from the Fire/Rescue discipline, 18 from Towing and Recovery, two from EMS, two from Dispatch, 46 from the DOT, and one other. Table E.1 provides a demographic profile of the total respondents.

The respondents were asked to provide their years of experience. Of the 162 respondents, 137 answered the question. Table E.2 provides the experience profile based on the answers received.

Student Performance

Figure E.1 illustrates the overall student performance as compared workshop to workshop. There was minimal variation among locations. Virginia students achieved the highest score (85.0%), and students in the alumni-led pilot achieved the lowest score (80.4%). The lower alumni-led score was anticipated given that (a) the alumni-led pilot was marketed to less-experienced responders than the four train-the-trainer pilot courses and (b) the alumni-led pilot was taught by recent graduates of the train-the-trainer course, whereas the train-the-trainer pilots were taught by master instructors from the research team who were very familiar with the curriculum.
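The per-location comparison in Figure E.1 is a straightforward grouped average of individual assessment scores. The following is a minimal, illustrative sketch and not the research team's actual tooling; it assumes a hypothetical per-student results file named assessment_results.csv with columns location, discipline, years_experience, and score_pct.

```python
# Illustrative sketch: grouped mean of per-student assessment scores.
# Assumes a hypothetical file assessment_results.csv (one row per student).
import csv
from collections import defaultdict

def average_scores_by(path, key):
    """Return {group: mean percent score} grouped by the column named `key`."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            group = row[key].strip()
            totals[group] += float(row["score_pct"])
            counts[group] += 1
    return {group: totals[group] / counts[group] for group in totals}

if __name__ == "__main__":
    # Grouping by "discipline" instead would reproduce the Figure E.4 view.
    by_location = average_scores_by("assessment_results.csv", "location")
    for location, mean in sorted(by_location.items(), key=lambda kv: -kv[1]):
        print(f"{location}: {mean:.1f}%")
```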

Instructional Strategies Support of Learning Objectives

One purpose of this assessment is to determine whether instructional strategies support learning objectives. Learning for each lesson was evaluated separately. Figure E.2 illustrates the overall student performance by lesson and demonstrates that learning remains relatively consistent across the lessons. Student scores for the alumni-led pilot were generally lowest in all lessons. Lesson 2 shows modest variation in scores, likely because that section has only three questions. Given that Lesson 2 is designed for 20 minutes of instruction time, it may be necessary to add more questions to that lesson. Scores generally trend downward after Lesson 3, likely due to fatigue. It is important to note that the course is designed to be delivered either in its entirety or in modules. In instances where the course is broken into several modules, assessment fatigue is anticipated to be less of an issue.

Variation in absorption was evaluated to determine whether the presentation of content affected students' learning. Figure E.3 presents the average lesson scores for those who attended one of the four train-the-trainer pilots and demonstrates that there is some variability in the absorption of learning at the start and end of the class. Lesson 3 received the highest score (88.7%), and Lesson 9 had the lowest score (66.1%). Several respondents skipped Lesson 9 due to fatigue (skipped sections are not included in the analysis). Given that Lesson 9 was designed for only 10 minutes of instruction time, yet contains seven assessment questions, it may be necessary to reduce the number of questions for Lesson 9.

Learning across Responder Types and Experience Levels

A secondary purpose of this assessment is to determine whether the minimum knowledge requirements were met across incident responder types and experience levels. Figure E.4 illustrates that learning is occurring across the various responder types (law enforcement, fire, towing, and DOT shown; EMS and Dispatch excluded due to smaller sample sizes). It demonstrates that learning remains relatively consistent across the four disciplines. There is not much variation among discipline scores in Tennessee and Montana. Towers generally scored lowest (in Virginia, Florida, and the alumni-led pilot).

Table E.1. Respondents by Discipline

Discipline        Number of Respondents
Law enforcement    51
Fire               42
Towing             18
EMS                 2
Dispatch            2
DOT                46
Other               1
Total             162

Table E.2. Respondents by Discipline and Experience (Years)

Discipline        0–5   6–10   11–15   16–20   21+   Total
Law enforcement    12      5      12       7     7      43
Fire                7      4       7       2    19      39
Towing              3      2       4       2     2      13
EMS                 0      0       1       0     1       2
Dispatch            0      0       0       0     1       1
DOT                 9      4       9       7     9      38
Other               0      0       1       0     0       1
Total              31     15      34      18    39     137

Figure E.1. Average student assessment scores across pilot locations.
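The rebalancing suggested above for Lessons 2 and 9 follows from a simple ratio of assessment questions to instruction minutes. The short sketch below is illustrative only; it uses the two lesson figures cited in the text (Lesson 2: three questions over 20 minutes; Lesson 9: seven questions over 10 minutes), and values for the remaining lessons would have to come from the curriculum itself.

```python
# Illustrative only: question-to-instruction-time balance for the two lessons
# discussed in the text; other lessons' figures are not given in this appendix.
lessons = {
    "Lesson 2": {"questions": 3, "minutes": 20},
    "Lesson 9": {"questions": 7, "minutes": 10},
}

for name, info in lessons.items():
    ratio = info["questions"] / info["minutes"]  # questions per minute of instruction
    print(f"{name}: {info['questions']} questions / {info['minutes']} min "
          f"= {ratio:.2f} questions per minute")
# Lesson 2 works out to 0.15 questions/min (sparse coverage) and Lesson 9 to
# 0.70 (heavy coverage), which is why the report recommends adding questions
# to Lesson 2 and trimming them from Lesson 9.
```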

Figure E.2. Average assessment scores by lesson and pilot location.

Figure E.3. Average assessment scores for all train-the-trainer pilots (variation in absorption).

Figure E.4. Average assessment scores by location and discipline (multidisciplinary learning).

Student scores for the alumni-led pilot course had the largest spread between the highest and lowest discipline scores (15.6 points).

Figure E.5 illustrates that learning is occurring across the various levels of experience in on-scene traffic incident management (TIM) response. It demonstrates that learning remains relatively consistent across the continuum of experience in the field. The 25 students who did not identify their level of experience scored within the same range as those who did. In summary, there is a small difference in scores based on years of experience, as demonstrated by the lowest score of 79.6% for those with 6 to 10 years of experience and the highest score of 84.4% for those with more than 21 years of experience.

Summary and Recommendations

Overall, the assessment successfully measured course performance. Learning is occurring across incident responder types and experience levels, and there was no major difference in student performance based on training or testing location.

It is apparent from performance on the assessment that the instructional strategies supported the learning objectives. Learning remains relatively consistent across the lessons. It is recommended that additional questions be added to Lesson 2 and that questions be removed from Lesson 9 to provide a more balanced ratio of instruction time to number of assessment questions.

The analysis also shows that there is variability in the absorption of learning at the start and end of the class. Scores generally trend downward after Lesson 3, which is likely due to fatigue. In instances where the course is broken into smaller modules, fatigue should be less of an issue. Should the course be delivered in its entirety, it is recommended that the Field Activity (Lesson 11) be moved from Day 2 to Day 1 to provide an extended classroom break on the first day. This will also keep students in the classroom before the exam and should provide better continuity (i.e., students will not have to transition from the classroom curriculum to a field activity and then back to the classroom for the assessment).

Finally, the student scores for the alumni-led pilot were generally the lowest in all lessons. This is mostly attributed to the less-experienced students who attended the alumni-led pilot. However, given that the alumni-led pilot was taught by recent train-the-trainer graduates, the instructors' relative unfamiliarity with the curriculum may also have contributed to the lower scores in that pilot. Stressing the importance of preparation time to the instructors of alumni-led pilots should help mitigate the lack of curriculum familiarity.

Figure E.5. Average assessment scores across the continuum of TIM experience.
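The experience-band grouping behind Table E.2 and Figure E.5 can be reproduced with the same kind of grouped average, this time keeping students who left the experience field blank as their own group (the 25 non-responders noted above). A minimal sketch, again assuming the hypothetical assessment_results.csv file described earlier:

```python
# Illustrative sketch: average scores by experience band, with students who
# did not report experience kept as a separate "Not reported" group.
import csv
from collections import defaultdict

BANDS = [(0, 5, "0-5"), (6, 10, "6-10"), (11, 15, "11-15"),
         (16, 20, "16-20"), (21, float("inf"), "21+")]

def band_for(years):
    for low, high, label in BANDS:
        if low <= years <= high:
            return label
    return "Not reported"

scores = defaultdict(list)
with open("assessment_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        years = (row.get("years_experience") or "").strip()
        label = band_for(int(years)) if years else "Not reported"
        scores[label].append(float(row["score_pct"]))

for label, values in scores.items():
    print(f"{label}: {sum(values) / len(values):.1f}% (n={len(values)})")
```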


TRB’s second Strategic Highway Research Program (SHRP 2) Report S2-L32A-RW-1: Train-the-Trainer Pilot Courses for Incident Responders and Managers describes pilot tests of the National Traffic Incident Management train-the-trainer course, the course's revised and finalized curriculum, and an evaluation of its effectiveness.

For more information on traffic incident responder training, contact your state's FHWA division office.
