CHAPTER 3

Findings and Applications

Phase A: Transforming the Original Classroom Content into an Interactive e-Learning Format

Coordination with L32 TETG and L32A and L32C Project Contractors

The research team coordinated with a large group of stakeholders for this project, including the SHRP 2 L12 project team, the L32A project team, the SHRP 2 L32C project team, SHRP 2 staff, the L32 Technical Expert Task Group (TETG), FHWA, and NHI. To become intimately familiar with the content, learning environment, and audience for the course, members of the L32B project design team attended one of the L32A train-the-trainer in-person courses. The research team also coordinated with the L32A National TIM train-the-trainer project to ensure that the web-based e-learning tool was thoroughly consistent with the L32A course curriculum and material, thus meeting the critical needs of multidisciplinary traffic incident responders. The research team worked with the L32C project team to ensure that they were able to use the Post-Course Assessment Tool, both to maintain consistency and to allow agencies to assess the effectiveness of lessons learned from L32A and L32B.

Literature Review

A review of the current literature on using e-learning to train incident and emergency responders and managers in TIM confirmed the need for cross-discipline, standardized training that is widely available across the nation. Literature was reviewed from the following organizations:

• SHRP 2 Reliability Research;
• U.S. Fire Administration's Traffic Incident Management Systems (TIMS);
• National Traffic Incident Management Coalition (NTIMC);
• International Association of Fire Fighters (IAFF);
• I-95 Corridor Coalition;
• Ohio DOT Quick Clear; and
• Towing and Recovery Association of America (TRAA).

There are many examples of TIM training within specific disciplines, but very few across disciplines.
In addition, no training efforts, either existing or in development, match the depth and scope of the SHRP 2 traffic incident responder and manager training. The individual literature and training programs currently available are summarized in Appendix B.
A review of the literature on incident responder training showed the core competencies and the approach of the SHRP 2 National Traffic Incident Management Responder Training curriculum to be both accurate and appropriate.

Other Pertinent e-Learning Systems

The research team developed a comparison matrix of selected TIM programs with their corresponding applications, tools, audiences, and resources. By comparing previous efforts to provide quality TIM training, the research team sought to benefit from the lessons learned on those projects. In addition, the research team examined emerging tools and techniques for effective e-learning. Each training program was evaluated in terms of its management of events, content, assessment tools, learning environment, mobile apps (if any), social learning (if any), type of registration, security level, accessibility (Section 508 compliance), e-mail notifications, and system specifications. An attempt was also made to determine whether each program met the perceived expectations of the stakeholders and to gauge the long-term viability of these tools and applications. A review of e-learning programs on similar topics showed that no existing incident responder training program in an e-learning format matches the depth and breadth of the SHRP 2 traffic incident responder training, especially in its multidisciplinary focus. The programs were diverse in their range of capabilities and curricula. The detailed comparison matrix that was developed is presented in Appendix C.
SHRP 2 Project Workshop

The research team developed, coordinated, and facilitated a workshop held on February 19, 2013, which served as a means to address several in-depth instructional and content-related topics, as outlined below:

• Discussed the findings and reports associated with Tasks 1, 2, and 3;
• Ensured that the overall objectives of the e-learning system corresponded to those of the L12 and L32A projects;
• Ensured that the specific objectives of each module in the e-learning system corresponded to those of the instructor-led modules of the L12 and L32A projects;
• Worked together to analyze and assess the primary and secondary target audiences;
• Identified e-learning tools that would best benefit specific target audiences;
• Discussed the potential of packaging learning modules to target specific audiences;
• Explored the possibility of establishing tiers or prerequisites for the learning modules;
• Presented common and suggested innovative e-learning methods for TIM training and brainstormed approaches to be included in subsequent tasks;
• Coordinated with the L32C contractor regarding assessment methods and technologies to be included in the e-learning system;
• Explored the potential of using social media to enhance and extend (not replace) the learning experience; and
• Discussed the possibility of incorporating mobile learning and how it could make the TIM training more widely available.

The results of the meeting were as follows:

• The course will be offered for continuing education units (CEUs) or professional development hours (PDHs):
  o Must have a final exam and keep track of questions missed;
  o Must have an evaluation survey (coordinate questions with L32C); and
  o Individual login required.
• Student login:
  o Must gather enough data to build a rich profile of students, but no more data than needed; and
  o Must request e-mail, discipline, agency, state, volunteer versus paid status, and description of role.
• The course will be free:
  o A fee-based blended version could be offered as optional training. The research team would need to develop the course so it can be offered as blended. (It should be noted that this option was not added as part of the project.)
• There will be a self-registration process for students.
• The course will be designed in modules by topic. Each topic will have submodules 10 to 15 minutes in length.
• The course must be accessible via desktop and laptop computers. It would be a "nice to have" if students could access the material via a tablet. The committee recognized that Flash and some other software programs are not readable on a tablet or smartphone.
• The course would incorporate discussion boards and forums, but only if they encourage participation and do not require a lot of monitoring (need to watch maintenance costs).
• Bookmarking is a must.
• The course must include a downloadable workbook/study guide.
• Under the Phase B additional scope, the research team will try to develop an app that could be used as a stand-alone resource. [The University of Maryland (UMD) did not develop an app; it developed other types of resources instead.]
• An executive summary version is wanted for senior management/executives (this was not possible).
• The course should incorporate feedback from the L32C project as it becomes available.
• Student tracking:
  o Need to be able to determine who has taken the course from particular agencies.

Develop e-Learning System Functional Requirements and Architecture

The needs assessment and analysis was completed during the project workshop, and the results were discussed above. This section discusses the results of the in-depth analysis of learning management systems (LMSs), along with the technical architecture, system software requirements, and modules. The research team identified and analyzed a series of candidate LMSs based on the functional requirements identified by the needs assessment and the technical requirements for software provided by the SHRP 2 project manager and FHWA. An examination of over 70 LMSs produced no candidates in full compliance. Many of the LMSs examined met the functional requirements from the TETG (the results of the needs assessment). However, the research team determined that the full range of technical requirements for software could not be met among the pool of potential professional-grade open-source LMS applications. In addition, all hosted LMS solutions were rejected because they use proprietary codebases and do not provide source code. A few select LMSs nevertheless stood out in their ability to meet the functional requirements determined from the needs assessment. The detailed comparison matrix that was developed is presented in Appendix D.
The research team selected the top three LMSs in terms of the following features and criteria:

• Course authoring;
• Assessment tools;
• Learning environment;
• Mobile app availability;
• Social learning;
• Registration process;
• Security (login);
• Customer support and training;
• Sharable Content Object Reference Model (SCORM) compliance; and
• Accessibility.
The top three LMS choices based on the above-listed criteria were Moodle, Canvas, and Adobe Connect. Based on its research into LMSs and its experience with LMS tools, the research team recommended the use of Moodle. Although the research team performed an in-depth evaluation of LMSs and recommended an open-source LMS (Moodle), the decision was made to host the TIM training course on the LMS used by NHI (Adobe Connect), since the final e-learning course would be delivered and maintained by NHI.

Develop Test Plan for e-Learning System

Since the research team did not have to set up an LMS to deliver the course, the test plan for the e-learning system was limited to testing the modules developed. The original classroom course was broken down into nine modules consisting of two to three lessons each. Each lesson was intended to take 10 to 15 minutes to complete. Each module was reviewed at several stages of the development process by various stakeholders. Table 3.1 shows the development stages and review responsibilities.
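The criteria-based screening described above can be illustrated with a small comparison-matrix sketch. The criterion names come from the list above and the candidate names are the three LMSs the report shortlisted, but the numeric scores and equal weighting are hypothetical, chosen only to show the mechanics of ranking candidates against a criteria set; they are not the actual SHRP 2 evaluation data.

```python
# Hypothetical comparison-matrix scoring for LMS candidates.
# Criteria are taken from the report's list; the 0-2 scores and the
# equal weighting are illustrative assumptions, not SHRP 2 results.
CRITERIA = [
    "course_authoring", "assessment_tools", "learning_environment",
    "mobile_app", "social_learning", "registration", "security",
    "support_training", "scorm_compliance", "accessibility",
]

# Score per criterion: 0 = not met, 1 = partially met, 2 = fully met.
candidates = {
    "Moodle":        [2, 2, 2, 1, 2, 2, 2, 2, 2, 2],
    "Canvas":        [2, 2, 2, 2, 1, 2, 2, 2, 2, 1],
    "Adobe Connect": [1, 2, 2, 1, 1, 2, 2, 2, 2, 1],
}

def rank(cands, weights=None):
    """Return (name, weighted total) pairs sorted best-first."""
    weights = weights or [1] * len(CRITERIA)
    totals = {
        name: sum(w * s for w, s in zip(weights, scores))
        for name, scores in cands.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for name, total in rank(candidates):
    print(f"{name}: {total}")
```

With these illustrative scores the ranking matches the report's recommendation of Moodle; changing the weights (for example, weighting SCORM compliance or accessibility more heavily) would let the same matrix reflect different stakeholder priorities.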
Table 3.1. Development Review Stages

Development stage: The research team breaks down the original classroom course into modules and lessons and develops storyboards detailing learning objectives, content per page (including any narration), and knowledge checks.
Reviewers: FHWA, the L32A project team, and NHI instructional designers.

Development stage: The research team puts the course material into Adobe Captivate.
Review process:
1. The research team reviews first and makes edits.
2. The TETG and FHWA review and provide feedback.
3. The research team edits the files.
4. NHI reviews and provides edits.
5. The research team finalizes the course files.

Development stage: Functional testing.
Reviewer: Completed by NHI when the modules were published into Adobe Connect.

Development stage: Course pilot.
Reviewers: Selected reviewers take the entire course and provide feedback to NHI; any major items are edited by the research team.

Below is a list of the major items reviewed at each step in the development process:

• Verify that course/module content meets the needs of the intended audience;
• Check the logical flow of course/module content;
• Ensure all links, videos, and file downloads work properly;
• Spell check and grammar check all pages;
• Verify that multimedia content, including images, is properly sized for web viewing;
• Verify that knowledge checks are graded properly and include the correct review feedback;
• Verify that the closed captioning works properly;
• Check adherence to NHI style guidance; and
• Ensure the modules meet Section 508 compliance standards.

Build and Test the e-Learning System

As stated above, the original classroom course was broken down into nine modules consisting of two to three lessons each. Each lesson was intended to take 10 to 15 minutes to complete. The learning outcomes are clearly and precisely outlined at the beginning of every lesson. Interactive summaries tying these learning outcomes back to the material covered are provided at the end of each lesson.
Attainment of the learning outcomes is evaluated through a combination of knowledge checks, thought-provoking exercises, and student assessments that ascertain the students' understanding of the material presented in the lesson.
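A graded knowledge check with review feedback, of the kind the review checklist verifies, can be sketched as follows. The question text, answer options, and feedback strings below are invented for illustration; the actual course questions and the Captivate/Adobe Connect grading implementation are not reproduced here.

```python
# Minimal sketch of a graded knowledge check with review feedback.
# The prompt, options, and feedback text are hypothetical examples,
# not taken from the actual course modules.
from dataclasses import dataclass

@dataclass
class KnowledgeCheck:
    prompt: str
    options: list          # answer choices shown to the student
    correct: int           # index of the correct option
    feedback_right: str    # review feedback for a correct answer
    feedback_wrong: str    # review feedback for an incorrect answer

    def grade(self, choice: int):
        """Return (is_correct, feedback) for the student's choice."""
        if choice == self.correct:
            return True, self.feedback_right
        return False, self.feedback_wrong

check = KnowledgeCheck(
    prompt="Which positioning blocks the incident lane plus one adjacent lane?",
    options=["Linear", "Lane plus 1", "Opposite direction"],
    correct=1,
    feedback_right="Correct: lane plus 1 adds a buffer lane for responder safety.",
    feedback_wrong="Review the vehicle positioning lesson before continuing.",
)

ok, feedback = check.grade(1)
print(ok, feedback)
```

Pairing each question with its own right/wrong feedback, as here, is what allows the course to "keep track of questions missed" and return targeted review guidance rather than a bare score.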
Review of the material by the stakeholders, including the pilot test, was described above. Figure 3.1 shows a screenshot of one of the pages in the course.

Figure 3.1. Sample course page.

NHI piloted the completed e-learning course during the week of May 19–23, 2014. A total of 42 individuals, representing all target audiences, were sent information inviting them to review the course. Of that number, 19 registered for the course, and 12 submitted feedback. The research team edited the course based on the feedback received. Following the pilot, the course was made available free of charge through NHI's web-based training program. NHI will continue to market, deliver, and maintain the course in cooperation with FHWA's Office of Operations.

Revise, Operate, and Maintain

This task was originally set up to have the research team provide a variety of services once the course was completed and being offered to the target audience. These activities included:

• Making any corrections or revisions to the system based on information provided by the testing team;
• Providing student and instructor support for the LMS as needed;
• Making editorial and functional content changes as necessary; and
• Assisting the L32C project team with assessment tool evaluation and implementation as necessary.

Since NHI will deliver and maintain the completed course instead of the research team, the only item under this task conducted by the research team was the first bullet.
Final Report

The final report was submitted to the TETG as required.

Phase B: Development of Performance Support Tools

UMD developed performance support tools to enhance the learning experience for the target audience. This type of segmented, targeted learning provides a flexible and efficient tool for in-service or on-the-job training. The performance support tools developed are described in Table 3.2.

Table 3.2. Performance Support Tools

Small Videos on Vehicle Positioning: Short videos were created and embedded into the TIM e-learning course to enhance the learning experience by using animation to depict various vehicle positioning maneuvers. The short clips demonstrate how a first responder would initiate linear, angle, lane plus 1, opposite direction, and ambulance vehicle positioning to provide a level of safety from passing motorists for responders and others at the scene of an incident.

Large Video on Vehicle Positioning: This video was assembled from the short vehicle positioning animated clips and is narrated by a fire department official. Its intent is to stress the importance of safety, which is achieved when proper positioning by the first arriving responders prevents errant vehicles from entering an incident scene. The video is made short enough to be delivered at safety briefings or roll calls for all agencies, both public and private, that respond to roadway incidents.

D Driver Scribble Video: "D drivers" is a term coined for drivers who are drunk, drugged, drowsy, distracted, or just plain dangerous. These drivers are a hazard on the roadways and are among the worst dangers to responders working an incident along a roadway. This video is meant to be delivered to responders from all disciplines as a reminder to always keep an eye on traffic and use caution when working in or around moving vehicles.
Taper Video: This video demonstrates to responders the proper, safe way to deploy traffic cones when setting up a taper to further protect the incident scene. The video has been inserted into the training curriculum and is also constructed as a stand-alone video for display to all responder disciplines as a training and awareness tool.

Converted Tabletop Exercise: In the classroom version of the training, tabletop exercises are conducted using matchbox-scale vehicles to demonstrate how various disciplines position their vehicles at the scene of a roadway incident. Because this could not be done in the e-learning course, the research team instead developed an animated tabletop exercise as an addition to the training course. It allows students to experience a hands-on exercise in positioning response vehicles at the scene of an incident, demonstrating what they have learned in the e-learning course. The tool begins with a scenario describing the incident scene and the order in which the response vehicles will arrive. The student moves the response vehicles one by one to the appropriate locations and then places them in the proper position (angled versus linear). The student is allowed two tries with each vehicle before it is automatically placed.

Phase C: Development of an Additional e-Learning Module for Dispatchers

The FHWA Office of Operations provided content for an additional module for dispatchers. The module was developed into three lessons and added as Module 10 to the e-learning course. Unlike the other nine modules of the course, the research team developed additional content to supplement the material provided. The same stakeholders reviewed the dispatcher module as reviewed the other modules.
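The two-try placement rule of the converted tabletop exercise described in Table 3.2 can be sketched as follows. The vehicle names, target locations, and student attempts below are hypothetical illustrations; the actual exercise is an animated interaction, and its internal logic is not reproduced here.

```python
# Sketch of the tabletop exercise's two-try rule: the student may
# attempt each vehicle twice; after a second miss the vehicle is
# placed automatically. All scene data here is hypothetical.
def place_vehicles(arrival_order, targets, attempts):
    """Place each vehicle in arrival order, auto-placing after two misses.

    arrival_order: vehicles in the order they arrive on scene
    targets: vehicle -> correct (location, position) pair
    attempts: vehicle -> list of the student's (location, position) guesses
    Returns vehicle -> ("student" | "auto", final placement).
    """
    results = {}
    for vehicle in arrival_order:
        correct = targets[vehicle]
        placed = None
        for guess in attempts.get(vehicle, [])[:2]:  # at most two tries count
            if guess == correct:
                placed = ("student", correct)
                break
        if placed is None:                           # auto-place on failure
            placed = ("auto", correct)
        results[vehicle] = placed
    return results

targets = {
    "fire engine": ("lane 2", "angled"),
    "police car": ("shoulder", "linear"),
}
attempts = {
    "fire engine": [("lane 1", "linear"), ("lane 2", "angled")],  # 2nd try correct
    "police car": [("lane 2", "angled"), ("lane 1", "angled")],   # both tries wrong
}
print(place_vehicles(["fire engine", "police car"], targets, attempts))
```

The auto-placement fallback mirrors the exercise's design choice: the student always ends the scenario seeing a correctly staged scene, even after failed attempts, so the teaching point about vehicle positioning is never lost.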