
Leveraging Big Data to Improve Traffic Incident Management (2019)

Chapter 4 - Big Data and TIM

As presented in Chapter 2, the state of the practice of TIM has advanced over the past decade through multiple approaches, including the development and implementation of the National TIM Responder Training Program, legislation, quick-clearance policies, TIM committees, and multi-agency operating agreements. The resulting improvements in responder safety and effectiveness, combined with the use of TIM-related data, have positioned TIM to make another step forward. Ongoing efforts, such as the FHWA's EDC ("Every Day Counts") TIM data innovation, are accelerating and advancing the implementation of TIM data collection and use among regional and state entities nationwide. Nonetheless, the current state of the practice in using data for TIM is limited. Moreover, these practices draw on traditional approaches to data collection and use, relying on engineering and decision-maker judgment, augmented by quantitative analysis of limited data samples and often using subjective, manual, and resource-intensive strategies.

With the increased quantity and improved quality of TIM data, there is promise that the application of the Big Data technologies and analytics described in Chapter 3 can further advance the state of the practice in strategic, tactical, and support TIM activities. The ability to merge multiple, diverse, and comprehensive datasets and then to mine the data has the potential to improve TIM programs. The use of Big Data might afford opportunities to:

• Develop, evaluate, and refine TIM policies;
• Improve scene management practices;
• Improve resource utilization and management;
• Gain efficiencies with respect to the TIM timeline;
• Improve responder and public safety;
• Access and query data in real time to augment incident response actions;
• Enable predictive TIM;
• Support performance measurement and management; and
• Support TIM justification and funding.

At this point, three questions arise: How might Big Data be applied to TIM? What potential opportunities exist to leverage Big Data to improve TIM? What are the potential benefits of doing so?

This chapter explores Big Data opportunities for TIM by presenting specific examples that stem from applications that represent the current state of the practice in TIM data collection and analysis. For each example, a summary of the traditional data collection and analysis approach is given. Then, a potential Big Data approach/opportunity to address the same problem or research question is presented, along with the benefits of the Big Data approach. The purpose of this discussion is to contrast the traditional approach with the Big Data approach, identify the differing data needs and analytical approaches, and discuss the possibilities and benefits afforded by Big Data.

4.1 Improve On-Scene Management Practices

Big Data provides opportunities to examine existing TIM strategies and practices, and to consider how factors such as training, responder experience, and response discipline affect response efficiency (e.g., data could point to superior procedures among responder disciplines). Every traffic incident is distinct, being based on a unique combination of factors that include incident type and severity, location, the combination of individual responders on scene, weather conditions, and special events. A wide array of naturally occurring scenarios (i.e., variety), made available from multiple sources across the country, would enable a more robust exploration of the impacts of differing on-scene management strategies.

The National TIM Responder Training Program is the standard by which responders act. The 33 learning objectives of this program offer potential opportunities for improvement wherever data can inform or reinforce one or more of the learning objectives. For example, on-scene, real-time adjustments to responder actions (e.g., adjusting vehicle positions, scene lighting, temporary traffic control devices, and end-of-queue signage) could benefit traffic, safety, and travel time reliability. Protocols for various types of special circumstances like vehicle fires, HAZMAT, or hybrid and electric vehicles could aid responder safety. Development of prediction-assisted protocols or procedures, such as estimating the length of a queue during an incident, could enable responders to adjust traveler information and offer localized advance warning to prevent secondary crashes.

Example: Assessment of the National TIM Responder Training Program

In summer 2012, FHWA rolled out the National TIM Responder Training Program. As of July 2018, more than 344,000 responders had participated in the training nationwide (Figure 4-1).

Figure 4-1. National TIM Responder Training Program implementation progress: percent trained as of July 2018.

FHWA wanted to assess how effective the training had been in reducing roadway and incident clearance times and secondary crashes.

Question

How effective has the National TIM Responder Training Program been in reducing roadway and incident clearance times and secondary crashes?

Traditional Approach

The U.S. DOT conducted a study to assess the effectiveness of the National TIM Responder Training Program (Einstein and Luna 2018). The evaluation focused on the effectiveness of the TIM training in three areas: (1) disseminating TIM concepts to a wide incident responder community, (2) changing/enhancing agency practices, and (3) improving TIM performance. The first two areas were evaluated using quantitative and qualitative measures of effectiveness such as the number of attendees and the number and proportion of disciplines at trainings, attendees' self-assessments of the value of training (through a post-course assessment), and changes in responder and agency practices with respect to on-scene traffic-incident practices and management (through interviews with responders).

The third area (improving performance) was assessed using quantitative TIM performance measures calculated using crash report data collected from two areas: greater Phoenix, Arizona, and eastern Tennessee (Tennessee DOT Region 1). The analysis included 22,000 crashes from Phoenix and 6,400 crashes from eastern Tennessee, a relatively small number of crashes for a 4-year period (2012–2015). Aggregate performance measures (i.e., annual average clearance times) were used to show a decreasing trend in clearance times because disaggregate measures (i.e., clearance times by crash severity or number of vehicles involved in a crash) could not detect clear trends associated with the TIM training. The evaluation team noted concerns about missing data, erroneous data, and an inability to link TIM-trained responders to specific incidents.

Big Data Approach/Opportunity

A Big Data approach to assessing the effectiveness of the National TIM Responder Training Program would collect and analyze data from the entire country. Data would be analyzed at the incident level (as opposed to the aggregate level) to identify:

• Quantitative trends between training and shifts in incident clearance duration, secondary crash frequency, and responder struck-by events;
• Areas in which significant improvements were absent;
• Successes; and
• What training or external factors contributed to the successes.

Additionally, the analysis of historic and current data could explore whether characteristics of the training and/or the percentage of responders trained affected responder on-scene behaviors in ways that improved or degraded incident clearance and scene safety. Data of interest for the analysis would include crash data, CAD data, weather conditions, TIM programs and policies in place, and responder training data—responders trained and not trained, with associated information such as discipline, jurisdiction, age, and years on the job—as well as training dates, training locations, training types, and trainers and associated information.

Time would be an important aspect of this analysis. Daily, responders receive training and new incidents occur across the United States. It takes time for the knowledge gained through training to translate to measurable benefits in the field (e.g., reductions in clearance times).
Big Data moves from analyses that are based on single snapshots in time to the ability to provide a continuous feedback loop as changes occur and new data is generated.

Therefore, rather than conduct a single geographically and temporally bound study that represents a single snapshot in time, the Big Data approach would be to perform the analysis regularly (e.g., weekly or monthly) to account for the arrival of new data and to identify trends and highlight outliers for further inspection. This is an important distinction between the data-weak traditional approach and the data-hungry Big Data approach. Rather than making decisions based on a few performance measures that are calculated once a year, analyses can be conducted continuously to monitor responder practices and adjust accordingly (e.g., through funding for training, training content, and training locations). Should one or more of the trainers, one or more of the learning objectives, or one or more of the training types be determined to be ineffective—either completely or under certain circumstances—the traditional approach will likely be too high level to detect these shortcomings, or it may take years to uncover the issues. For example, the U.S. DOT evaluation identified big differences in the numbers of responders trained across disciplines in Tennessee and Arizona (e.g., more fire and towing in Tennessee than in Arizona) and noted that average class size and mix of disciplines also may impact training effectiveness, but these differences could not be evaluated with the data at hand. With Big Data analytics, these differences, as well as negative trends and outliers, could be quickly detected and analyzed to identify and remedy training weaknesses or to expand on training strengths.

A retail analogy is provided by Walmart. A grocery team could not understand why sales had suddenly declined in a particular product category. Walmart's data scientists drilled into the data and quickly determined that pricing miscalculations had been made, leading to the products being listed at a higher price than they should have been in some regions. Big Data enables much faster and more accurate pinpointing and verification of problems caused by human error or miscalculation at the planning or execution stage of a particular business activity. If an organization cannot get insights until it has analyzed data for a month, a quarter, or even a year, it has lost sales, productivity, and efficiency within that time (Marr 2017).

Through a low level of data granularity and low-cost, frequent analyses, Big Data can expose a more detailed and evolving picture of the incident response and training reality, helping to change policy and procedures as well as the mindset of "set and forget." Continuous feedback, with data feeding the decision-making process, is necessary to remain efficient and effective.
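To make the recurring, incident-level analysis more concrete, the sketch below joins hypothetical incident records with an agency-level training roster and tracks clearance-time trends by training status. It is illustrative only: the column names, the 30-day lag, and the agency-level join are assumptions made for this sketch, not elements of the U.S. DOT evaluation; a production pipeline would operate on statewide crash, CAD, and training rosters.

```python
"""
Recurring, incident-level look at training effects on clearance time.
All column names and sample records are hypothetical placeholders.
"""
import pandas as pd

# Incident-level records (e.g., from merged crash + CAD data).
incidents = pd.DataFrame({
    "incident_id": [1, 2, 3, 4],
    "agency": ["FireA", "FireA", "TowB", "TowB"],
    "date": pd.to_datetime(["2017-01-10", "2017-06-12", "2017-02-03", "2017-07-21"]),
    "clearance_min": [95, 62, 120, 70],
})

# Training roster: when each agency's responders completed the course.
training = pd.DataFrame({
    "agency": ["FireA", "TowB"],
    "trained_date": pd.to_datetime(["2017-03-01", "2017-05-15"]),
})

# Flag each incident by whether the responding agency had been trained
# at least 30 days earlier (an assumed lag for training to take effect).
df = incidents.merge(training, on="agency", how="left")
df["trained"] = (df["date"] - df["trained_date"]).dt.days >= 30

# Monthly trend of median clearance time, split by training status.
trend = (df.assign(month=df["date"].dt.to_period("M"))
           .groupby(["month", "trained"])["clearance_min"]
           .median()
           .unstack("trained"))
print(trend)
```

Because such a job is cheap to rerun, it could be scheduled weekly or monthly as new crash and training records arrive, flagging disciplines or regions whose trends diverge for further inspection.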
4.2 Improve Resource Utilization and Management

Historical analysis of incident response could support huge advances in the deployment of TIM resources, including both personnel and equipment. Big Data analytics could help to optimize SSP routes by identifying the best days and hours of service based on weather, seasonality, and other factors to ensure that the appropriate number and type of resources are scheduled for a geographic expanse. Desired response time, circulation time, vehicle cost per mile, weather, special events, and estimated obligated/unobligated patrol time are factored in to determine staffing needs. Big Data analytics could assist in staging human and equipment resources for quicker and less costly responses.

Example: City of Berkeley, California, Police Patrol Beat Evaluation Study

The city of Berkeley, California, conducted a study to assess the existing beat structure and allocation of patrol staffing and to evaluate opportunities to improve the deployment of resources. Depicted in Figure 4-2, the city's existing system of 18 beats was based on 20-year-old crime trends, calls for service, and staffing levels, and needed to be updated to reflect existing conditions (Matrix 2014).

Question

What beat structure and allocation of patrol staffing will best improve the deployment of resources?

Traditional Approach

The approach to updating and improving the patrol beat structure was based on a qualitative and subjective assessment of data, as well as a quantitative assessment of four possible beat structures (16, 14, 11, and 4 beats) on calls for service, major crime, workload, geographical accountability, neighborhood integrity, and efficient travel. The methodology included interviews with the police chief and patrol staff; the collection of data to document workloads, costs, service levels, and operating practices; and town-hall style meetings. Analyses included a statistical analysis of call-for-service workloads and major crime throughout the city; a GIS assessment of the equity of various beat boundaries on call workload and crime; and the analysis of interview, survey, and town hall data (Matrix 2014).

Figure 4-2. Existing beat structure and total calls for service. Source: City of Berkeley, California (Matrix 2014).

Big Data Approach/Opportunity

The traditional approach applied to update the patrol beat structure was resource intensive (town hall meetings, interviews, surveys) and resulted in a limited (manageable) number of distinct options that were then assessed on seven criteria. The quantitative analysis and optimization resulted in one recommended patrol beat structure.

The Big Data approach to this problem would offer advancements in multiple ways. The Big Data approach would leverage more advanced optimization methods, such as genetic and evolutionary algorithms; would add a wider set of data sources (e.g., weather, census, social media) to those used in the traditional approach; and would be applied at a more granular level (e.g., time of day, day of week, week of year, local events), going beyond the criteria that were capable of being optimized with the computing limitations of the traditional approach. Big Data analytics could also offer an approach that automatically optimizes the number of beats and boundaries across the criteria. Furthermore, capitalizing on the computing power cost efficiencies available through Big Data analytics could allow for the simultaneous analysis of thousands of patrol beat structures as opposed to only four. Finally, Big Data could address issues of flexibility to better address future changes, and questions like the following:

• What comes after this study?
• How long will this patrol beat structure remain the "most" efficient?
• When will the resources be available to repeat the traditional study to represent the changing times?

Traditionally, with little data available at high resolution, changes were hard to detect and the need to re-optimize was difficult to justify. The increases in data volume and data resolution have brought more light to the constantly changing and evolving world, and in many industries have exposed inefficiencies and gaps that can be corrected. The efficiencies of Big Data allow the analysis to be set up once and repeated over and over as new data becomes available. Big Data allows system changes to be identified quickly, enabling adjustments to be made far more frequently to maintain efficient beat structuring. In fact, Big Data analytics has the potential to take patrol beat deployment decisions to the next level, moving from static beats to real-time dynamic beats that concentrate patrols in the areas with the greatest likelihood of need each day by factoring in variables such as weather, special events, and the mood of the population as expressed on social media.
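As a toy illustration of what an evolutionary search over beat assignments might look like, the sketch below assigns cells of a small hypothetical grid to beats and evolves the assignment toward balanced workload and compact beats. The 5x5 grid, the call volumes, the fitness terms, and the GA settings are all assumptions for illustration; they are not drawn from the Berkeley study, which evaluated a handful of hand-built options.

```python
"""
Toy evolutionary (genetic algorithm) search over patrol beat assignments.
Grid, call volumes, and fitness terms are hypothetical simplifications.
"""
import random
import statistics

random.seed(42)

N_CELLS, N_BEATS = 25, 4                                   # 5x5 grid of city cells
CALLS = [random.randint(5, 60) for _ in range(N_CELLS)]    # calls for service per cell
XY = [(i % 5, i // 5) for i in range(N_CELLS)]             # cell centroids

def fitness(assign):
    """Lower is better: unbalanced workload and spread-out beats are penalized."""
    loads, spread = [], 0.0
    for b in range(N_BEATS):
        cells = [i for i, a in enumerate(assign) if a == b]
        if not cells:
            return float("inf")                            # every beat must cover something
        loads.append(sum(CALLS[i] for i in cells))
        cx = sum(XY[i][0] for i in cells) / len(cells)
        cy = sum(XY[i][1] for i in cells) / len(cells)
        spread += sum(abs(XY[i][0] - cx) + abs(XY[i][1] - cy) for i in cells)
    return statistics.pstdev(loads) + spread

def mutate(assign):
    child = assign[:]
    child[random.randrange(N_CELLS)] = random.randrange(N_BEATS)
    return child

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

# Simple generational GA: keep an elite, breed the rest.
pop = [[random.randrange(N_BEATS) for _ in range(N_CELLS)] for _ in range(60)]
for _ in range(300):
    pop.sort(key=fitness)
    elite = pop[:10]
    pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in range(50)]

best = min(pop, key=fitness)
print("best fitness:", round(fitness(best), 2))
print("beat of each cell:", best)
```

In a real application the fitness function would fold in the study's criteria (calls for service, major crime, geographical accountability, neighborhood integrity, efficient travel), the grid would come from GIS data, and thousands of candidate structures could be evaluated in parallel on cloud infrastructure.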
4.3 Improve Safety

Big Data has the potential to dramatically advance safety through a better understanding of the characteristics of traffic incidents, and through improved responder situational awareness. When the traits that are most dangerous for responders and passing motorists are known, adjustments can be made to equipment and on-scene behavior to mitigate those dangers. Early warning systems (e.g., via an audible alert or a color-coded message on a mobile device or responder vehicle computer) might be developed to make on-scene personnel aware of varying degrees of danger associated with different combinations of incident conditions. Such analytics also could guide traveler information systems and safety messages provided via 511 or VMS systems, or other means to offer driver-customized, in-vehicle alert warnings of responder activity.

Understanding emergency vehicle lighting and conspicuity through analytics has the potential to improve safety, particularly when a better understanding is gained of how approaching motorists behave given those stimuli. Big Data also can be applied to identify the most effective frequencies, geographic areas, and content for responder training.

Example: Florida DOT "Move Over" Study

To mitigate the risk to responders at incident scenes, every state has implemented a law that requires drivers to move over or slow down when approaching a patrol vehicle that has stopped at the roadside. The Florida DOT conducted a study to determine the effectiveness of the "Move Over" law in Florida.

Question

How effective has the Move Over law been in mitigating risk to incident responders in Florida?

Traditional Approach

To determine the effectiveness of Florida's Move Over law, the Florida DOT and the Florida Highway Patrol (FHP) supported a field study that involved the observation of right-lane vehicles passing staged police stops on three Florida freeways in differing parts of the state. Each staged stop involved the use of a civilian research vehicle, a marked police vehicle, video recording of passing traffic, and measurement of passing vehicle speeds using a laser speed measurement device (see Figure 4-3). Differing patrol vehicle emergency lighting configurations—blue and red versus amber only—were tested (Carrick and Washburn 2012).

Figure 4-3. One of three staged stop sites. Source: Grady Carrick (Carrick and Washburn 2012).

This traditional field study approach provided results that were helpful in understanding how a convenience sample of 9,000 drivers reacted to a limited combination of emergency lighting configurations at a limited number of locations across the state of Florida. Notable concerns with this study are the secondary crash risk to researchers and law enforcement from remaining adjacent to high-speed traffic and the potential throughput loss along roadway facilities.

Big Data Approach/Opportunity

A Big Data approach to assess the effectiveness of Move Over laws nationwide might involve a wide variety of naturally and constantly occurring data sources and the application of Big Data analytics to extract driver behaviors.

Data of most interest to this study would include police enforcement activities; vehicle telematics data (from passenger vehicles, commercial vehicles, and police fleet vehicles), including time of day, location, speed, lateral position, specification of response vehicles, active emergency lighting configurations at stops, and so forth; roadway inventory data like roadway classification, number of lanes, and horizontal and vertical curvature; and weather data. Using the data, speeds and compliance rates could be assessed for thousands of naturally occurring combinations of emergency lighting configurations, roadway types, vehicle types, locations, times of day, weather, and other factors (e.g., recent media campaigns) that influence compliance. Big Data analytics might include:

• A non-zero variance analysis to determine what factors impact behavior and compliance;
• A clustering analysis to group co-occurring factors into multiple scenarios/groups (e.g., compliance rates are high during non-peak periods on limited-access highways with more than two lanes in each direction); and
• A classification of the uncovered groups/clusters.

The results would not only provide a detailed understanding of when, where, and under what conditions drivers comply or do not comply with the Move Over law, they could also help inform outreach, public education, and policy to further improve compliance. Instead of designing an experiment meant to represent reality based on a sample of data collected from a few locations and then extrapolating the results to other locations, the Big Data approach looks at actual behaviors occurring naturally across a wide area by leveraging the large volume of highly varied data available in the real world. Further, the Big Data approach also could be rerun as new data becomes available, making it easier to identify adjustments or corrections to policies, vehicle markings, emergency lighting systems, and other factors as needed.

Big Data moves from limited data samples meant to represent reality to leveraging the wide variety of data occurring naturally in the real world.
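As a rough illustration of the clustering step, the sketch below groups hypothetical per-passage observations (speed reduction, lane change, lighting type, lane count, peak/off-peak) into scenarios with scikit-learn's KMeans. The feature names, encodings, and the number of clusters are assumptions made for the example, not elements of the Florida study or of any specific telematics feed.

```python
"""
Toy clustering of Move Over compliance observations.
All fields and values are synthetic; a real analysis would derive these
features from telematics, CAD, and roadway inventory data.
"""
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500

# One row per vehicle passing a roadside stop.
speed_drop_mph = rng.normal(8, 4, n)      # reduction versus free-flow speed
moved_over     = rng.integers(0, 2, n)    # 1 = changed lanes away from the stop
amber_only     = rng.integers(0, 2, n)    # lighting configuration
lanes          = rng.integers(2, 5, n)    # lanes in the travel direction
off_peak       = rng.integers(0, 2, n)    # 1 = non-peak period

X = np.column_stack([speed_drop_mph, moved_over, amber_only, lanes, off_peak])
X_scaled = StandardScaler().fit_transform(X)

# Group co-occurring factors into a handful of scenarios.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)

# Inspect each scenario's average behavior so it can be labeled afterward
# (e.g., "high compliance, off-peak, multi-lane").
for k in range(4):
    members = X[labels == k]
    print(f"cluster {k}: n={len(members)}, "
          f"mean speed drop={members[:, 0].mean():.1f} mph, "
          f"move-over rate={members[:, 1].mean():.2f}")
```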
4.4 Enable Predictive TIM

Big Data could be used to predict when, where, and under what conditions traffic incidents are most likely to occur so that the appropriate response can be pre-staged and/or more quickly deployed if necessary. Identifying the nature and causes of traffic crashes is fundamental to traffic safety analysis, and it precedes the implementation of countermeasures embodied in the "3Es": engineering, education, and enforcement. For example, every TMC operator knows that when it rains there will be a spike in crash activity. Improved data integration and analytics have the potential to move the TMC observation beyond intuition and into the realm of predicting when and where problems are most likely to occur under specific and detailed conditions such as planned special events, periods of holiday travel, or even the daily rush hour. Using various types of data, and in particular detailed weather data, to uncover correlations and predict when and where to put resources is foundational to improving TIM planning and operations.

Example: Tennessee Highway Safety Office Predictive Analytics

Agencies face a continual challenge in allocating resources in the most cost-efficient and effective way possible. Tennessee's Crash Reduction Analyzing Statistical History (C.R.A.S.H.) program uses software and data to perform analyses that inform the agency's decisions.

Question

How can the state more efficiently allocate limited resources, deploying troopers to locations and at times with the greatest likelihood of crashes?

Traditional Approach

The C.R.A.S.H. program developed by the Tennessee Highway Patrol (THP) leverages data from every crash report filed in the state and from traffic citations, and includes data about weather and special events to analyze and predict when and where serious or fatal traffic crashes are most likely to occur. C.R.A.S.H. breaks Tennessee into 5-mile-by-6-mile sections and predicts traffic risks for each section in 4-hour increments every day. THP uses these analytics to more efficiently allocate limited resources by deploying troopers to locations and at times with the greatest likelihood of crashes. The models also help field supervisors design shift assignments, develop enforcement plans, and determine when and where to conduct grant-funded activities (Tennessee Department of Safety and Homeland Security 2017). The model results, an example of which is shown in Figure 4-4, have proven to be accurate about 70 percent of the time (Martinelli 2017).

Figure 4-4. THP C.R.A.S.H. software program—model results. Source: Tennessee Department of Safety and Homeland Security (Freeze 2017).

Big Data Approach/Opportunity

The Big Data approach would be to move from predictive modeling of crashes using historical data to predicting crashes in real time for the purposes of reacting immediately to changes in the factors that are likely to lead to a crash. Instead of running the prediction models every day, the models might be run in parallel and continuously, using Big Data analytics in the cloud. Big Data predictive models would rely not only on historical data but also on real-time streaming data (such as speeds, volumes, occupancies, weather data, vehicle data, road weather conditions, data from social media, and events), which would be fed to the models in real time as it is generated and received to predict when and where there is a high probability for crashes. The outputs of such models could potentially feed real-time decision-support systems for active traffic management and dynamic resource allocation. One specific approach to this analysis would be the use of deep learning (a machine learning method), which allows complex relationships in large datasets to be captured efficiently.

The more granular the data, the faster it changes, and a consequence of these fast changes is that the accuracy of deep learning models will start to drop as the existing relationships between the data begin to shift. The Big Data approach remedies this drawback by treating prediction models as short-lived and disposable. Big Data prediction approaches typically monitor the accuracy and performance of their current models in real time and develop new models as new data is added. Should a model stray from the existing level of performance and become less accurate, it can be discarded immediately and replaced by a newly developed model. The C.R.A.S.H. example illustrates the concepts of disposability and replaceability, which are inherent to Big Data infrastructure from hardware to software and models.
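To illustrate the monitor-and-replace idea in code, the sketch below scores incoming batches with a simple incremental classifier, checks accuracy on each new batch, and retrains a replacement model when performance degrades. The synthetic stream, feature set, accuracy floor, and use of scikit-learn's SGDClassifier are all assumptions for illustration; C.R.A.S.H. and a production system would use richer features and models (e.g., deep learning) and real streaming infrastructure.

```python
"""
Sketch of monitor-and-replace model management for crash risk prediction.
Data, features, and thresholds are synthetic placeholders.
"""
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)

def next_batch(t, n=200):
    """Fake stream: rain and volume raise crash risk; the relationship drifts over time."""
    rain = rng.random(n)
    volume = rng.random(n)
    drift = 0.002 * t                       # the world slowly changes
    logit = 2 * rain + (1 - drift) * volume - 1.5 + drift * rain * volume
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
    return np.column_stack([rain, volume]), y

def new_model(X, y):
    m = SGDClassifier(loss="log_loss", random_state=0)
    m.partial_fit(X, y, classes=np.array([0, 1]))
    return m

X0, y0 = next_batch(0)
model = new_model(X0, y0)
ACCURACY_FLOOR = 0.70

for t in range(1, 50):
    X, y = next_batch(t)
    acc = model.score(X, y)                 # evaluate the current model on the new batch
    if acc < ACCURACY_FLOOR:                # model has strayed: discard and replace it
        print(f"batch {t}: accuracy {acc:.2f} below floor, replacing model")
        model = new_model(X, y)
    else:
        model.partial_fit(X, y)             # otherwise keep learning incrementally
```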

4.5 Support Performance Measurement and Management

Performance measurement and management are the ongoing processes undertaken in support of accomplishing the strategic objectives of a program. Performance measurement involves selecting quantitative performance measures to be tracked, setting performance targets, collecting and analyzing data in support of the performance measures, and ongoing monitoring and reporting of program accomplishments and areas that need improvement. Performance management goes further in that it involves active and continuous follow-up by program staff and managers to identify and implement specific strategies and tactics to improve efficiency and then to measure and report the outcomes of these strategies and tactics (i.e., did the strategy help to meet the performance targets?).

Performance measurement and management are data-driven processes. Without access to the appropriate data and analytics tools, performance measurement and management can be challenging, laborious, or downright impossible. The application of Big Data could support TIM agencies with their current performance measurement and management processes, and it could also expand the thinking and overall approach to the processes (e.g., through identification of additional, critical performance measures; identification of performance gaps or pitfalls; and identification of the actions necessary to improve performance).

Example: Oregon DOT Performance Management

In 2014, Oregon DOT management inquired why the mutual Oregon DOT/Oregon State Police (OSP) roadway clearance time (RCT) goal of 90 minutes was exceeded in 1,088 incidents. The experience of the Oregon DOT illustrates the need for more data and more advanced analytics for TIM performance management.

Question

What factors contributed to 1,088 incidents exceeding the mutual Oregon DOT/OSP RCT goal of 90 minutes?

Traditional Approach

Using the data available at hand, the Oregon DOT determined an answer to this question by examining the problematic incidents "one at a time." The approach to the analysis was to engage response partners to anecdotally create a list of factors known to generally contribute to longer clearance times, review each of the 1,088 incident reports, and categorize the incidents in relation to the list of factors. Following the analysis, specific actions were developed and implemented to address the most common causes of extended clearance times (Oregon DOT 2018). The Oregon DOT has since developed a process to communicate the causal factors for long clearance times directly from the field to the dispatch centers so that the data can be immediately entered into their system to drive an ongoing report (Figure 4-5). Although this approach provides an excellent example of active performance management by the Oregon DOT, the analysis is based on a list of reasons for extended delays that was created based on subjective assessment (the anecdotal factors initially suggested by the response partners) rather than on tangible data.

Figure 4-5. Incident clearance times exceeding 90 minutes. Source: Oregon DOT (2018); used by permission.

Big Data Approach/Opportunity

A Big Data approach to this question would be to leverage a variety of data sources to automatically identify the factors (and combinations of factors) that lead to extended clearance times. At a minimum, a statewide analysis would be done; however, more insights could be drawn from multi-state or national data. Many of the conditions that lead to extended clearance times in Oregon are the same conditions that lead to extended clearance times in other states (as is suggested by the anecdotal factors developed by the Oregon DOT and partners shown in Figure 4-5).

Relevant data for the analysis would include crash data, CAD data (timestamps of every notification, arrival, and departure from the incident scene), injury surveillance data, roadway data, weather data, and social media data. The analysis would be conducted at the incident level, which means that the clearance time and details of every incident would be compared against all others. A graph analysis could be conducted, yielding results like those represented in Figure 4-6. To conduct such an analysis, data relevant to each incident and its response would be plotted to create a representative graph. The structure of the graphs would be based on a semantic graph ontology (a commonly shared vision of a domain).

Big Data helps to reduce or eliminate the subjectivity, judgment, and bias often found in manual, qualitative, and human-driven analysis processes.

Figure 4-6. Representation of graph analytics for TIM performance management.

Each graph would provide a complete description of the incident/response (e.g., type, location, vehicles, injuries, responders, actions taken, timestamps). The incident graphs would be loaded into a graph database for analysis. Then, using graph analytics (e.g., graph similarity and subgraph matching algorithms), the portions of the incident graphs that are common to the incidents with long clearance times would be identified and extracted. The extracted subgraphs (consisting of branches, nodes, and values) would then be reviewed and classified to identify the various patterns (e.g., late tow truck arrival), triggers (e.g., heavy congestion), and thresholds (e.g., fewer than two responders on the scene) that are common to incident responses with extended clearance times. This approach would likely offer additional, even unexpected, insights into the causes of extended clearance times. An initial incident response and clearance ontology (IRCO) was developed as part of NCHRP Project 17-75 and is presented in Appendix B to this report. The IRCO could be used to structure the graphs with the data available for this type of analysis.
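A small sketch of the pattern-extraction idea follows, using NetworkX to represent each incident as a graph of attribute nodes and then intersecting the attributes of long-clearance incidents to surface features they share. The incident records and attribute names are made up, and the simple intersection stands in for the graph similarity and subgraph matching algorithms described above; a production analysis would structure the graphs with the IRCO and run in a graph database.

```python
"""
Toy graph-based pattern extraction for long-clearance incidents.
Incident records are hypothetical; attribute intersection stands in for
full subgraph matching.
"""
import networkx as nx

incidents = [
    {"id": "I1", "clearance_min": 140, "tow": "late",    "congestion": "heavy", "responders": 1},
    {"id": "I2", "clearance_min": 155, "tow": "late",    "congestion": "heavy", "responders": 2},
    {"id": "I3", "clearance_min": 45,  "tow": "on_time", "congestion": "light", "responders": 3},
    {"id": "I4", "clearance_min": 130, "tow": "late",    "congestion": "heavy", "responders": 1},
]

def to_graph(rec):
    """One graph per incident: the incident node linked to attribute-value nodes."""
    g = nx.Graph()
    for attr in ("tow", "congestion", "responders"):
        g.add_edge(rec["id"], f"{attr}={rec[attr]}")
    return g

LONG = 90  # minutes; the Oregon DOT/OSP roadway clearance goal

long_graphs = [to_graph(r) for r in incidents if r["clearance_min"] > LONG]

# Attribute-value nodes shared by every long-clearance incident graph.
shared = None
for g in long_graphs:
    attrs = {n for n in g.nodes if "=" in str(n)}
    shared = attrs if shared is None else shared & attrs

print("factors common to all long-clearance incidents:", sorted(shared))
# e.g., ['congestion=heavy', 'tow=late'] -> candidate patterns/triggers to review
```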
4.6 Support TIM Justification and Funding

Response agencies, particularly public agencies, often are mission driven and task oriented at the expense of meticulous documentation of activities that serve to justify continued funding. Another opportunity for the application of Big Data for TIM is to build a collection of information that documents activities, program costs, and program outcomes to make an accurate and compelling business case for TIM. One of the most fundamental ways to improve the effectiveness of TIM is to ensure a dedicated and right-size funding stream. Historically, agencies have relied on TIM conventions, or "rules of thumb" (e.g., that 20 percent of incidents are secondary in nature, that each minute of blockage requires 4 minutes to recover), because this was the best or the only data available. Having data that helps make the business case for TIM increases the potential for securing TIM funding. Being able to demonstrate quantitatively the impacts of incidents on safety (e.g., secondary crashes), mobility (e.g., number of people stuck in incident-related congestion), the environment (e.g., air quality and fuel waste), and the economy (e.g., freight movement) will help to promote continued or increased funding for TIM programs.

Example: FHWA TIM Benefit-Cost (TIM-BC) Tool

The FHWA has developed a web-based TIM Benefit-Cost (TIM-BC) tool that assists TIM programs in determining the benefit-cost ratio for certain TIM activities (FHWA 2017b). The tool evaluates SSP, TIM laws, towing arrangements, training, dispatch colocation, and the establishment of TIM taskforces to quantify their benefits. The TIM-BC tool relies significantly on user inputs and/or default values for factors like average incident duration, average incident delay savings, and compliance rates, and is based on regression analyses from samples of data that are used to estimate the benefits of the TIM strategies and extrapolate them to other areas (Figure 4-7). The use of a Big Data approach and Big Data infrastructure could enhance the TIM-BC tool.

Question

How could a Big Data approach enhance the TIM-BC tool?

Traditional Approach

Following a traditional approach, development of the TIM-BC tool used simulations to generate data, which was subsequently used to develop and calibrate regression models. The simulations were needed because the necessary data was not available to develop the models directly.

Figure 4-7. TIM-BC tool SSP program inputs. Source: FHWA (Ma and Lochrane 2015).

Moreover, as was stated in the January 2016 report, not all possible incident simulation combinations (of number of lanes, grade, free-flow speed, traffic volume and composition, number of lanes blocked, and ranges of incident duration) were replicated (Ma et al. 2016). With the professional workstation used for the analysis, it would have required 16 years to conduct the 740,880 possible runs (three runs of each of 246,960 simulation combinations), not including the time to process the output. Consequently, only about 1,319 "representative" simulation combinations were replicated and used to develop and calibrate the regression models (Ma et al. 2016).

Big Data Approach/Opportunity

In only a bit more time than it would take to run a single simulation on a professional workstation, cloud/Big Data infrastructure would allow each of the 740,880 runs to be run in parallel, and the resulting models could be collated into a single Big Data database. Furthermore, in less than 1 second, Big Data querying/matching engines could be leveraged to efficiently match one of the hundreds of thousands of models in that database to user inputs from web interface tools. In other words, the use of Big Data infrastructure would require fewer assumptions and would result in a much more complete tool. Given the computational time needed to run the models, the application of a Big Data computing platform could offer efficiencies as compared to the traditional simulation and modeling approach, even if the data for the development of the regression models had to rely on simulations.

However, a true Big Data approach would leverage actual data to develop the regression models, rather than running hundreds of thousands of simulations to generate the necessary data. Multi-state or nationwide crash, CAD, roadway, and traffic data, as well as information on SSP programs, laws, levels of TIM training, towing arrangements, dispatch colocation, and TIM taskforces could be leveraged to determine if, where, and when these TIM strategies are effective; what factors impact success (e.g., geographic factors, implementation methods, socio-demographic factors); and what strategic, tactical, and support activities might be employed to improve the probability of success. With this Big Data approach, analysts could reduce reliance on models that typically include expert assumptions and theoretical relationships and, instead, shift to empirical evidence and analytics derived from the entire population of incidents.
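As a sketch of the parallel-run-and-match idea, the code below evaluates a stub delay "simulation" across a small parameter grid with a process pool, keeps the results in a lookup table, and matches a user's inputs to the nearest precomputed scenario. The delay formula, the parameter grid, and the nearest-neighbor match are placeholders invented for the sketch; the actual TIM-BC models and the 246,960 simulation combinations are far richer.

```python
"""
Sketch: run many scenario "simulations" in parallel, then match user inputs
to the nearest precomputed scenario. The delay formula and grid are stand-ins.
"""
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def run_scenario(params):
    """Stub simulation: vehicle-hours of delay for one incident scenario."""
    lanes, lanes_blocked, volume_vph, duration_min = params
    capacity = 1900 * (lanes - lanes_blocked)
    excess = max(volume_vph - capacity, 0)
    delay_veh_hr = excess * (duration_min / 60.0) ** 2 / 2.0   # crude queueing proxy
    return params, delay_veh_hr

GRID = list(product([2, 3, 4],            # lanes
                    [1, 2],               # lanes blocked
                    [2000, 4000, 6000],   # volume (veh/h)
                    [30, 60, 90]))        # incident duration (min)

def nearest(results, query):
    """Match user inputs to the closest precomputed scenario (normalized distance)."""
    spans = [max(c) - min(c) or 1 for c in zip(*GRID)]
    def dist(p):
        return sum(((a - b) / s) ** 2 for a, b, s in zip(p, query, spans))
    return min(results, key=lambda kv: dist(kv[0]))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:           # each scenario runs independently
        results = list(pool.map(run_scenario, GRID))
    params, delay = nearest(results, query=(3, 1, 4500, 75))
    print(f"closest scenario {params}: estimated delay {delay:.0f} veh-h")
```

On cloud infrastructure the same pattern scales to hundreds of thousands of runs, with the lookup served by a query/matching engine rather than an in-memory list.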
4.7 Summary

This chapter has presented a range of example Big Data opportunities for TIM. The examples presented are by no means exhaustive; rather, they provide a glimpse into potential opportunities to improve TIM using Big Data approaches. Although the example Big Data opportunities are presented in contrast to the more traditional approaches to data collection and analysis, it is important to note that there is nothing inherently wrong with the traditional approaches. Rather, the examples illustrate that the Big Data approach is not simply an improvement on current practices, but instead a radical change from traditional approaches. Big Data represents a paradigm shift that goes beyond data collection and analysis practices to include different data storage, management, and security approaches; different approaches to financing and procuring IT services; and different approaches to the development of skills among employees.

The shift to Big Data will directly affect the fashion, speed, and frequency with which all businesses, including TIM, are conducted. When experiments are designed, conscious or unconscious biases can be introduced. When a model is built, assumptions and simplifications typically are made, and when data samples are collected and analyzed, results can be extrapolated to areas where they might not apply.

Big Data computing power applies brute force analysis that allows for hundreds of thousands of parallel analyses in seconds.

Experiments, models, and data collection and analysis methods often are driven by budget limitations or by the limitations of the data, software, or computing capabilities at hand. Many of these limitations can be overcome through the application of Big Data approaches. Assuming the necessary volume of data is available, Big Data computing power and techniques can allow the data to be leveraged without overwhelming the data analyst. Big Data also allows for a level of granularity (e.g., in data, time, combinations of factors, number of simulations) that traditional approaches cannot come close to meeting.

As evidenced by the examples, potential Big Data applications for TIM range far and wide, particularly compared to what can be done using traditional analytics. Yet, in most circumstances, significant volume, variety, velocity, and veracity of data is needed to support Big Data analytics, and much of this data is not currently readily available. Moreover, given that incidents are infrequent events (and desirably so), TIM is at a disadvantage from a volume perspective. In counterpoint to limited volume, however, the multi-disciplinary aspect of TIM leads to a variety of data associated with incidents that could benefit from the application of Big Data analytics.

The next chapter presents a comprehensive assessment of selected datasets relevant to TIM. This assessment will help to better understand the maturity and readiness of these datasets to support Big Data analytics to improve TIM.
