
Visualization of Highway Performance Measures (2022)

Chapter: Chapter 3 - Survey Results

Suggested Citation:"Chapter 3 - Survey Results." National Academies of Sciences, Engineering, and Medicine. 2022. Visualization of Highway Performance Measures. Washington, DC: The National Academies Press. doi: 10.17226/26651.


The survey (Appendix A) was sent to 50 state DOTs and the District of Columbia. Forty-four DOTs responded, an 86% response rate. However, not every DOT answered every survey question, so the statistics shown throughout this report reflect the percentage of the DOTs answering each question. For example, if 40 DOTs answered a particular question, then 75% of responses would signify 30 DOTs (30 out of 40). The survey included 15 questions; five of them used a matrix format like that shown in Figure 10. The first two questions asked about the DOTs' use of visualization and the intended audience.

Audience

Responding DOTs identified the distinction between internal and external audiences as significant. During the interviews, the responding DOTs described the primary audience they target when creating a visualization, while noting that the same data are often reused to tell the same performance story to broader audiences. The survey asked about four types of potential audiences:

1. Internal DOT analysts,
2. Internal leadership,
3. Elected public officials, and
4. External audiences (the public).

An audience can fall anywhere on the spectrum of these four types. The survey responses shown below and in Appendix B reflect the perceived audience by either topic or visualization type. The responses show that internal DOT analysts are recognized by most DOTs as the audience that uses the performance measure information the most.

The DOTs were asked the following questions from three different perspectives:

• What performance measures are visualized for your agency and who are the intended audiences?
• What visualizations have you created for performance measures and who are the intended audiences?
• Please provide a specific example of a performance measure visualization from your agency. Describe the purpose.
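The per-question percentage convention described above can be sketched as a small calculation. This is an illustrative snippet, not part of the survey methodology; the 30-of-40 figures are the hypothetical example from the text.

```python
def pct_of_respondents(count: int, respondents: int) -> float:
    """Percentage relative to the DOTs that answered a given question,
    not the full set of DOTs that returned the survey."""
    return 100.0 * count / respondents

# Hypothetical example from the text: 30 of 40 answering DOTs is 75%.
print(pct_of_respondents(30, 40))   # 75.0

# Overall response rate: 44 of 51 recipients (50 states plus DC), about 86%.
print(round(pct_of_respondents(44, 51)))   # 86
```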

Performance Measures and Intended Audiences

Question 2 of the survey was "What performance measures are visualized for your agency and who are the intended audiences?" The number of DOT respondents is shown in Table 2.

PM1 (Safety Condition), PM2 (Infrastructure Condition), and PM3 (System Performance) measures were included in the survey because every state is required to report these three performance measures. The survey also asked about additional measures to learn what else DOTs visualize. As shown in Table 2, more than 90% of the DOTs responding to Question 2 identified internal DOT analysts as the intended audience. Internal leadership followed closely, with external audiences (the public) and elected public officials each reported as the intended audience by nearly 60% of responding DOTs. The additional highway performance measure audiences show very similar patterns. Internal leadership (decision makers) was identified by more DOTs as the primary audience for DOT highway performance measures. Environmental, resilience, and economic performance measures were reported by fewer DOTs, but approximately 27% of the DOTs report these data internally.

Highlighting two performance measures from Table 2 shows some similarities in responses. Figure 11 shows the results for PM1 (Safety) and the economic performance measures in a bar chart. Internal DOT analysts and internal leadership are the primary audiences. External audiences are somewhat less frequent, and elected public officials are reported as the audience approximately 65% as often as internal analysts. The economic measures are identified as applicable by about 33% of the DOTs, in similar proportions.

Figure 10. Survey Question 2.

Table 2. Results of Question 2.

Figure 11. Specific examples from Question 2.

In addition to the measures asked about in the survey, DOTs also reported on multimodal performance measures, including

• Vehicle Miles Traveled (VMT),
• Travel Time Reliability,
• Vehicle Hours of Delay,
• Average Travel Speed,
• Combination Truck Hours of Delay,
• Transit Passenger Trips,
• Number of Riders on Park and Ride,
• Percent of Airport Runways in Satisfactory or Better Condition,
• Transit Ridership,
• Rail Ridership,
• Bike and Pedestrian/Bikeways,
• Aviation Passenger Boardings, and
• Rail Passengers.

Additional safety measures include

• Number of Alcohol-Related Traffic Fatalities,
• Number of Non-Alcohol-Related Traffic Fatalities,
• Number of Occupants Not Wearing Seatbelts in Motor Vehicle Fatalities, and
• Number of Pedestrian Fatalities.

Beyond performance measures related to mobility, additional performance measures include topics such as

• Project Design and Construction,
• Highway Operations,
• Program Support,
• Construction Projects on Time and on Budget,
• Disadvantaged Business Enterprises Utilization,
• DMV Wait Times, and
• Customer Service.

Hundreds of other internal performance measures drive internal day-to-day operations and inform decisions about resource allocation, process improvement initiatives, and priorities. The measures range from running security updates on IT equipment to delivering projects on time, on budget, and within specified quality standards. DOTs are also assessing response times to their customers as well as training classes delivered to staff.
Observations about Intended Audiences from the Interviews

As noted in the interviews, Florida DOT has a clear understanding of which audiences it is targeting with its visualization tools and how to prioritize them. Planners are the first priority, with significant consideration given to the data needs of the larger transportation community, and then the general public. Partnerships with districts and other localities within the state also allow specific audiences to be considered at the local level, such as metropolitan planning organizations (MPOs) and other transportation partners.

Utah DOT has clearly defined its audiences and how they are prioritized. The Governor's Office and the legislature are the designated primary audience, while the general public (as investors in the system) and internal users are defined as ancillary. Therefore, it was established early that all visualization tools must present relevant data in a form that clearly and simply communicates the most significant point, or performance story, as directly as possible.

Arkansas DOT has identified two major types of intended audience. The primary audience is decision makers, to inform them about the current performance of the department and to communicate any gap between current performance and targets. The second is the public, to improve the transparency of the DOT and to be accountable for how it spends taxpayers' money. Arkansas DOT's current internal tools contain data and visualizations that require a certain familiarity with the subject matter to be understood, while its public tools reflect an increased attention to clarity and simplicity.

Visualization Types and Intended Audiences

Question 3 was "What visualizations have you created for performance measures and who are the intended audiences?" The results are represented in Table 3, which shows the number of DOTs responding and the percentage of total respondents answering that question.

Table 3. Results of Question 3.

Observations about the Types of Visualizations Created

Table 3 shows that the visualization types most reported by DOTs (90% or more) include simple charts, complex charts, dashboards, maps, and internal interactive visualizations. Different visualizations are targeted to different audiences. For example, every responding DOT reported creating simple charts for internal leadership. Although fewer DOTs report using video, 91% of the DOTs who answered the question identified external audiences as its intended audience.

This question explored the types of visualizations created for the different audiences. The Washington State DOT case example shows many instances of simple line, bar, and pie charts and tables, and explains the thinking behind creating them for their audience. The emphasis on simple tables is consistent with the survey responses: simple charts account for over 80% of the graphics provided as examples. The simplicity of these visualizations was discussed by all the DOTs interviewed. Simple bar, line, and pie charts and tables are used extensively as communication tools that let decision makers quickly understand challenges and that provide information to inform their decisions. Figure 12 shows that DOTs reported using simple charts for internal leadership slightly more than for any other audience. The case examples demonstrate how providing context with simple charts can help tell the story, and how the visualizations can become more complex the deeper the audience drills down for specific details.

Complex charts are reportedly used more by analysts to understand large data sets and to explore what factors may be affecting mobility, safety, maintenance, environmental, and economic issues. Very few complex charts were submitted in response to this survey, except at the deepest levels of some dashboards. Figure 13 shows a few complex chart examples shared by the DOTs in this survey, as well as some examples from the Vizguide website referenced in the literature review. These types of charts are often used to provide detailed visualization of performance measures at the deepest level of dashboards. For analysts with a deep understanding of the data, many complex visualization types are covered in the Vizguide.

More than 80 videos shared on the TRBVIZ website explore this topic in depth; however, an in-depth discussion of this specific topic is beyond the scope of the current synthesis (National Transportation Research Board Committee on Visualization in Transportation 2020).

Figure 12. Highlights of survey results from Question 3.

Figure 13. Examples of complex charts.

Performance Measure Visualization Examples and Purpose

Questions 7–12 asked the DOTs to share examples of performance measure visualizations along with their purpose. The categories progressed from simply documenting the current status, to showing a trend over time, to communicating progress toward a target, to helping tell a performance story. The survey also asked whether the visualization was intended to inform day-to-day operations, inform medium-term planning, inform long-term planning, or support a policy, as shown in Figure 14. The survey responses included specific images and the intended purpose of each image. All the visualizations can be seen in Appendix C (online), but representative examples are shown here to demonstrate the variety of approaches to visualizing performance measures among DOTs. The icons depicted in Figure 15 are used to label the images. Most visualizations were identified with more than one purpose, so each applicable icon is highlighted; if a purpose does not apply to a given image, its icon is greyed out. The examples in Figures 16–23 show the variety of responses regarding the purpose of the visualization. These purposes are not mutually exclusive; visualizations often inform multiple categories and are often used together to address a wide range of performance stories.

Is a Dashboard Used by Analysts Only Available Inside Your Agency?

Figure 24 shows the number of DOTs that responded that an internal dashboard was available to DOT staff inside the agency. Of the 35 DOTs that responded to this question, 20 reported using a dashboard to show a trend over time, and 13 reported using an internal dashboard for medium-term programming.

Figure 14. Question 8, "Please provide a specific example of a performance measure visualization from your agency."

Figure 15. Purposes of visualizations (document current status; show trend over time; communicate progress toward a target; help tell a performance story; support a policy; inform long-term planning; inform medium-term planning; inform day-to-day operations).

Figure 16. Document current status (examples: infographic; photograph; map and story map; map, chart, and infographic).

Figure 17. Show trend over time (examples: line chart and heat map; bar chart and pie chart; line chart; bar chart and line chart).

Figure 18. Communicate progress toward a target (examples: line chart; pie chart, map, and infographic; tables and bar charts; bar chart, line chart, pie chart, and interactive).

Figure 19. Help tell a performance story (examples: pie chart, bar chart, and line chart; table, spark line, and icons; map and story map; pie chart, map, interactive, layout).

Figure 20. Support a policy (examples: bar chart and photo; bar chart and line chart; infographics; pie charts and photo).

Figure 21. Inform long-term planning (examples: bar charts, line charts, and photos; pie charts; pie chart and dials; map and infographic).

Figure 22. Inform medium-term planning (examples: pie chart and infographic; map and photos; tree map, map, table, and interactive; line chart).

Figure 23. Inform day-to-day operations (examples: bar charts; map and interactive; map, interactive, photograph, and live video; map and pie charts).

Figure 24. Survey results for Question 12: Is a dashboard used by analysts only available inside your agency?

Software Tools Used

So far, this chapter has shown examples from DOTs ranging from the simplest charts to sophisticated dashboards. This section examines the tools DOTs use to create these visualizations. As shown by responses to the survey question, "What software tools does your agency use, and approximately how many users in your office create visualizations of performance measures that use that software?," DOTs produce a wide range of visualizations designed for specific audiences and purposes. No DOT reported using just one tool; every DOT assembles a toolkit of many different software tools to meet its needs. Different tools are combined depending on the DOT's selected audience, what performance measure is being visualized, and whether the data are visualized simply to "document current status" or presented as a performance story that shows "progress toward a target." Many DOTs use both Tableau and Power BI, and the choice of tool may depend on whether the message is delivered to external audiences or internal analysts. It may also depend on the skill set of the analyst tasked with creation. The broad categories of tools used are shown in Figure 25.

This synthesis does not attempt to quantify or qualify the data sources for the data being visualized. Having a "central warehouse" of data is a foundational requirement for creating dashboards and finding the "data point of view." How those data are gathered, stored, vetted, cleaned, and quality checked is beyond the scope of this synthesis.

Most Used Tools

Excel is the most widely reported tool, used by 41 DOTs; it is well suited to creating simple charts. Mapping software from the Environmental Systems Research Institute (ESRI) is the second most reported tool, used by 37 DOTs. Acrobat, reported by 30 DOTs, is a common distribution method for documents on the internet: no matter what software is used for creation, Acrobat provides the PDF format for viewing the information on almost all digital devices.

Transportation Visualization Tools

The Regional Integrated Transportation Information System (RITIS) is actively used by 26 DOTs to view performance data collected by the FHWA for all DOTs. RITIS is a data-driven platform for transportation analysis, monitoring, and data visualization. It allows DOTs to visualize PM3 data among a set of other data; they can select specific topics and customize the information to be visualized.

iPeMS, a tool created by Iteris, is used by five DOTs. It is part of a suite of software that includes platforms to monitor the performance of arterials, freeways, and intersections, with visualization data that can predict mobility and in turn be used to improve it.

Dashboards

The three most-used dashboard tools reported by survey respondents are Microsoft Power BI, Tableau, and ESRI StoryMaps; many DOTs use all three. The ability of this software to read multiple sources of data, from databases and Excel spreadsheets, at the same time has transformed how data are communicated as information. These tools provide a framework to group information together in logical paths that can guide the audience through a performance story or allow them to choose their own path to the data that answers their question.

Eighteen DOTs reported using Oracle. One DOT reported using Primavera P6 Site Manager (a project, program, and portfolio management tool used for planning, managing, and executing projects) as well as Agile Assets Maintenance and Pavement Management System [which has comprehensive geographic information system (GIS) functionality to display any asset, event location, work history, and current work orders on a map]. IBM® SPSS® Statistics software is used by at least three DOTs to deliver a set of statistical features for analyzing data. Other dashboard tools reported by survey respondents include SAS, SAP, Workday, Qlik, Lookr, Dundas, and ThoughtSpot.

Figure 25. Tools used by DOTs.

Graphics Tools

Nineteen DOTs reported using Adobe Creative Suite, which provides a number of tools for creating graphics that help tell performance stories. The suite is designed to create bitmap images made of pixels (photos), vector-based images that draw graphics with lines, reports that can be shared as Adobe PDF files, and videos that tell stories by combining sound and moving images.

Programming Tools

R is reportedly used by 12 responding DOTs to create visualizations from large amounts of data. R is a language and environment for statistical computing and graphics: a free, open-source platform providing an integrated suite of software facilities for data manipulation, calculation, and graphical display. Nine responding DOTs report using Java, which is often used to build a small application module or applet (a simply designed, small application) for use as part of a web page. DOTs also reported using APL, .NET, and Python to program visualizations.

Evaluating the Effectiveness of Performance Measure Visualizations

To ascertain whether DOTs evaluate the effectiveness of their visualizations, Question 4 of the survey asked, "Does your agency have processes for developing visualizations that ensure consistent and repeatable guidance for communication? (Either internal to your agency, or external outside your agency)." The results are shown in Figure 26. The first part of the question asked about data management and the second about internal review of the visualizations; the responses were very close between the two.

For data management, 12 DOTs reported no review process, and 27 reported having either a formal or informal review process: eight reported a formal written process, and 19 reported an informal process of peer review. For the visualizations that are created, 12 DOTs reported no review process, and 27 reported having either a formal or informal review process: seven reported a formal written process of peer review, and 17 reported an informal peer-review process.

Figure 26. Question 4, "Does your agency have processes that ensure consistent and repeatable guidance for communication?"

Question 6 of the survey asked agencies whether, once a visualization has been created, they "have a method for evaluating the effectiveness of your performance measure visualizations?" As shown in Figure 27, seven DOTs responded that they had such a method, and 38 DOTs responded that they did not have a formal method for evaluating the effectiveness of their performance measure visualizations. Two kinds of evaluation were typically reported: an internal review process and external feedback.

Washington State DOT has developed standards for consistent evaluation, following Tufte-style principles. Nebraska DOT formed an internal performance management group that works with the various owners on the effectiveness of dashboards. Georgia DOT has a team of internal reviewers responsible for maintaining and editing internal visuals; the process varies slightly depending on visual needs but follows consistent branding and formatting for a smooth user experience. In a few DOTs, the Communication Office reviews the visualizations before publication.

DOTs reported that they monitor web analytics to see the number of views and how deeply the audience "explores the story." DOTs also reported monitoring public response and reactions (media, press, public officials, number of web hits, and printed distributions). One DOT reported that it intends to survey internal customers to evaluate the effectiveness of its dashboard. The primary evaluation tool used by DOTs is seeking feedback from data consumers and incorporating that information to improve understanding. DOTs reported that they gather feedback from internal and external stakeholders, as well as from leadership.

Responding DOTs expressed that the more involved executive management is in identifying the audiences and defining the broad stories communicated through the visualizations, the more effective the performance management is at improving the transportation systems.

Does Your Agency Have a Compelling Story Featuring Visualizations?

In response to Question 14, "Does your agency have a compelling story featuring visualizations?," DOTs showcased examples of perceived successes and lessons learned from visualizing performance measures. The visualizations are shown in Figures 28–34, with performance highlights that briefly describe the DOTs' experiences.

Figure 27. Question 6, "Do you have a method for evaluating the effectiveness of your performance measure visualizations?"

Figure 28. Performance highlight from Pennsylvania.

Performance Highlight: Pennsylvania Performance Report Data Show Gap Between Goal and Operations

In our first performance report, there was a graph showing the number of heavy-congestion crashes by time of day in the coverage area for each of our traffic management centers. The graph demonstrated that one District TMC's operating hours meant that it was standing down prior to the heaviest period for crashes in the PM peak. The TMC's operating hours were expanded to cover that time period.

Pennsylvania Department of Transportation

Figure 29. Performance highlight from Washington, DC.

Performance Highlight: PaveDC Helps Teams Complete Record-High Number of Repairs and Pleases Public

Our PaveDC site allows residents to track the construction progress of the annual paving plan, and it was well accepted by residents, community organizations, and utility companies. Internally, it serves as a performance management dashboard and pushes teams to collaborate and complete a record-high number of miles of asset repairs. The PaveDC site has a very high visitor volume.

District Department of Transportation

Figure 30. Performance highlight from Idaho.

Performance Highlight: Idaho Learns to Manage Expectations from Well-Received Corridor Plan

One of our district offices created an ArcGIS Story Map to "tell a story" about a corridor plan that was well received. It illustrated a multidisciplinary swath of data that showed stakeholders and the public the desirability of the planned improvements. However, it created a false sense of hope that our agency would actually be in a position to carry through with the proposed improvements. So in the end, we learned how to better cast the message to manage expectations.

Idaho Transportation Department

Figure 31. Performance highlight from Rhode Island.

Performance Highlight: Rhode Island Visualizes Risk Data Using Maps and Tables to Impact Spending Priorities

Sea level rise and other climate considerations have been included in transportation planning project selection criteria and incorporated into local Capital Improvement Programs in Rhode Island. By setting spending priorities to include awareness of sea level rise and storm surge issues, transportation decision makers took action to better plan for the future in the face of sea level rise and storm surge events. (Rhode Island Statewide Planning Program 2016)

Rhode Island Department of Transportation

Figure 32. Performance highlight from Virginia.

Performance Highlight: Dashboard Provides Transparency and Builds Confidence in Agency

VDOT's credibility was rebuilt around our ability to develop and deliver projects on time and on budget. In 2002, only 20% of projects were completed on time. VDOT has consistently met its performance target of 77% for most of the last ten years. VDOT's performance has been critical to instilling confidence in the agency and to the success of several funding initiatives. Because of the transparency that the Dashboard provides, VDOT's ability to deliver has not been questioned in almost 20 years.

Virginia Department of Transportation

Figure 33. Performance highlight from Ohio.

Performance Highlight: Ohio's Successful Crash Visualization Improved Maintenance Project Traffic Planning

A crash a few years back closed the Brent Spence Bridge. We were able to use this visualization to determine where vehicles rerouted during the crash closure. A medium- to long-term maintenance project with expected lane restrictions and closures was planned for a few months later. We were able to use the trend map (a moving visual of traffic congestion in a map video format) to see how traffic was impacted and to build these detour routes into our plans. We were also able to complete some needed widening and repair projects on the routes that were going to see an increase in traffic.

Ohio Department of Transportation

Figure 34. Performance highlight from Iowa.

Performance Highlight: Useful Information on Iowa "Track a Plow" Map

Iowa's "Track a Plow" map was one of the first to show real-time snowplow locations. "Plow cam" imagery allowed people to see the road conditions that plow drivers were seeing. This hugely popular app is frequently used by local news stations during their weather forecasts. I would say it improved our reputation.

Iowa Department of Transportation

Survey Summary

Survey responses indicate that visualization is used for the following purposes:

To tell a DOT's performance story by
• Documenting current status,
• Showing trends over time,
• Showing progress toward a target,
• Helping to tell a performance story,
• Supporting a policy,
• Informing long-term planning,
• Informing medium-term planning, and
• Informing day-to-day operations.

To communicate the message of the story, using
• Simple charts,
• Infographics,
• Maps, and
• Dashboards.

To ensure the story resonates with the intended audiences, including
• Internal analysts,
• Internal decision makers,
• External stakeholders, and
• External audiences.

Figure 35 summarizes the results of the survey in a simple infographic.

Figure 35. Telling the performance story with visualizations that resonate with the intended audiences.


Visualizations are tools for analyzing, reporting, and communicating the complexities of a transportation system and for synthesizing these intricacies into presentations that can be easily understood.

The TRB National Cooperative Highway Research Program's NCHRP Synthesis 584: Visualization of Highway Performance Measures documents current practices and methods used by state departments of transportation (DOTs) for visualizing highway performance measures and their use of visualization techniques for communication and decision support.

Supplemental to the publication is a Presentation of Visualization Examples.

