CHAPTER 5

TIM Analysis Step 3—Perform the TIM Analysis

With leadership buy-in for the analysis plan, the next step is to execute per the plan. When conducting the TIM analysis, be sure to keep step with the schedule, and communicate when and if the need arises to deviate from the plan or to refine an aspect of the plan. Along the way, be sure to consider the following:

• Keep management engaged and informed.
• Exercise and document engineering judgment.
• Actively manage deviations in cost, schedule, and analysis.
• Conduct "sanity checks" at each step of the analysis.
• Leverage opportunities for data processing automation.
• Review both aggregate and disaggregate results.
• Capture lessons learned.
• Create a comprehensive archive.

Following these practices while conducting the TIM analysis will provide confidence in the results, strengthen stakeholder buy-in, and ensure a straightforward path to developing products that effectively communicate results. Each of these practices is discussed in detail below.

Keep Management Engaged and Informed

Routinely engage management and stakeholders during this step by sharing challenges and progress in the analysis and by including their inputs in key decisions, such as applying assumptions based on documented peer or national findings. Be sure to gauge the right level and frequency of information exchange to avoid information overload or undue surprise. Keeping management informed reduces the magnitude and frequency of course corrections and promotes ownership of the process and results within the TIM program and across stakeholders.

Exercise Judgment and Document Assumptions

In the conduct of this study, to overcome the missing lane closure information, a duration-based rule was derived from historical data. Specifically, incidents with longer durations are assumed to have a greater number of lanes closed.
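A rule of this kind can be sketched as a simple duration-to-lanes lookup. The thresholds below are hypothetical placeholders for illustration, not the breakpoints derived in the study; an agency would calibrate its own cut points from historical records that do contain lane-closure data:

```python
def assumed_lanes_closed(duration_min: float) -> int:
    """Assign an assumed lane-closure count from incident duration.

    The duration thresholds here are illustrative placeholders only;
    actual breakpoints should be derived from an agency's historical
    incident records.
    """
    if duration_min < 30:
        return 1        # short incidents: assume a single lane closed
    elif duration_min < 60:
        return 2        # moderate durations: assume two lanes closed
    else:
        return 3        # long durations: assume a multi-lane closure

# Example: under these placeholder thresholds, a 45-minute incident
# is assumed to have closed two lanes.
print(assumed_lanes_closed(45))  # 2
```

Documenting such a rule in code form makes the assumption explicit and easy to revisit when better closure data become available.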
Though this strategy allowed a more inclusive analysis of the reported incidents, it had limitations. For instance, long-duration incidents were assumed to be the result of multiple lane closures. In doing so, the assumed number of closed lanes was at times beyond the range of lane closures considered in several of the TIM benefit methods. Consequently, the value of TIM may be understated for these types of incidents. The studies also applied a rule of thumb of a 25% reduction in incident duration based on an empirical comparison of data from the Maryland CHART program. Often in the conduct of the TIM analysis, judgment must be exercised and, to the extent possible, justified. Be sure to document the basis of these decisions, whether they rest on a conservative or optimistic outlook for a parameter, experiences from a peer organization, or the collective decision of stakeholders.

Actively Manage Deviations in Cost, Schedule, and Analysis

In the conduct of this task, deviations may present as significant schedule slips due to delays in the procurement of data, the cleaning and analysis of data, unforeseen complexities in the calibration of models, and the preparation of findings for the intended audience. When deviating from the plan, be sure to document and communicate the basis for the deviation, more broadly so if the deviation affects the realization of the goals for the analysis. When a schedule deviation occurs, also make an effort to negotiate whether the deviation is acceptable or whether the timeline, scope, or resources will need to be adjusted. This will help inform internal leadership of the basis for deviations and provide confidence in the final outcomes.

Conduct "Sanity Checks" at Each Step of Analysis

Key in data processing is confirming that metadata, data observations, and intermediate processed data are correct and reasonable. In the conduct of this study, discrepancies in the naming and location of loop detectors initially resulted in the incorrect pairing of demand data with incident data. This was revealed when spot-checking demand prior to, during, and after incidents. Exploration revealed the issue to be a gap in record keeping when detectors were physically moved during paving operations. Other sanity checks include sensitivity analyses that apply variations in input parameters to confirm that the change in outputs is commensurate with expectation.
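A sensitivity check of this kind can be sketched as follows. The delay model here is a deliberately simple, hypothetical stand-in (a toy deterministic-queue estimate), not one of the TIM benefit methods; the point is the pattern of varying an input and confirming the output responds as expected:

```python
def delay_veh_hours(duration_min: float, demand_vph: float,
                    capacity_loss: float) -> float:
    """Toy deterministic-queue delay estimate (illustrative only).

    Vehicles queue at a rate of demand * capacity_loss for the
    incident duration; the average wait is half the duration.
    """
    hours = duration_min / 60.0
    queued = demand_vph * capacity_loss * hours
    return queued * hours / 2.0

baseline = delay_veh_hours(60, 4000, 0.5)   # 60-min incident
longer   = delay_veh_hours(90, 4000, 0.5)   # 50% longer incident

# Sanity check: a longer incident must produce more delay, and for a
# queued bottleneck the increase should be roughly quadratic in
# duration -- a change far outside that expectation would flag a
# data or computation problem worth investigating.
assert longer > baseline
print(baseline, longer)
```

Running the same comparison across a range of demand and capacity-loss values helps confirm the model behaves sensibly over the whole input space, not just at one point.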
Leverage Opportunities for Data Processing Automation

More likely than not, the evaluation processes can be reapplied or repeated in future years. In the conduct of the analysis, the team may identify areas where the processes applied could be streamlined, enhanced, or automated. Moving in this direction will benefit not only the existing analysis but also subsequent analyses, potentially annual in nature. Another benefit of automation is the reduction of human error in processing data. Often in the process of cleaning and combining data, programming code or Excel macros may be developed. Be sure these data cleaning codes are commented and archived for potential future use.

Review Both Aggregate and Disaggregate Results

While the final number is what the analyst is working toward, be sure to review corridor- or segment-specific results to confirm the absence of anomalies that may arise from either the data or the computational processes. Ask with a critical eye whether disaggregate findings make sense in the context of geographic and temporal proximity and in the judgment of TIM operations experts.

Capture Lessons Learned

A number of lessons were learned in the application of TIM methods as a part of this study. First is that nearly 80% of the time and resources were allocated to collecting, quality checking, cleaning, and merging volume and incident data. The most prominent incident data fields that limited the TIM analysis were the number of lanes closed and injury type. As agencies conduct their benefits analyses, analysts may find potential efficiencies or pitfalls related to the analysis process as well as the broader TIM program practices. Be sure to channel these findings both through documentation and through communication with appropriate TIM stakeholders.

Create a Comprehensive Archive

Be sure that data and processes have been documented to the extent that the analysis may be repeated from a single archive of materials. This archive should include metadata identifying the personnel, agencies, and databases that provided raw data. It should include the literature and references that served as the basis for adopting specific parameters in the conduct of the analysis. It should also include any spreadsheets or code developed in the conduct of the analysis and the post-processing of results. Depending on cost and resources, the archive may also include the raw data applied in the analysis. This documentation will be instrumental in addressing questions that may arise from management, upper management, and external stakeholders, and it will be equally instrumental the next time a benefits assessment is conducted. In addition, the archive should feed or inform the overall performance measure tracking activity that has been mandated by recent surface transportation legislation within the context of performance-based management. Examples of the types of data that may be tracked over time include incident duration, frequency of incidents and secondary crashes, and delay savings due to TIM activities.
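As a minimal illustration of the kind of commented, reusable script worth archiving for such tracking, the sketch below summarizes hypothetical incident records into per-year measures. The field names and values are assumptions for illustration, not a prescribed schema:

```python
from statistics import mean

# Hypothetical incident records; field names are illustrative only.
incidents = [
    {"year": 2019, "duration_min": 42, "secondary_crash": False},
    {"year": 2019, "duration_min": 68, "secondary_crash": True},
    {"year": 2020, "duration_min": 35, "secondary_crash": False},
    {"year": 2020, "duration_min": 50, "secondary_crash": False},
]

def yearly_measures(records):
    """Summarize per-year performance measures suitable for
    year-over-year trend tracking (counts, mean duration,
    secondary-crash totals)."""
    out = {}
    for year in sorted({r["year"] for r in records}):
        subset = [r for r in records if r["year"] == year]
        out[year] = {
            "incidents": len(subset),
            "mean_duration_min": mean(r["duration_min"] for r in subset),
            "secondary_crashes": sum(r["secondary_crash"] for r in subset),
        }
    return out

print(yearly_measures(incidents))
```

Because the script carries its own comments and sample structure, a future analyst can rerun or adapt it directly from the archive without reconstructing the logic.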