CHAPTER 4

Case Examples

On the basis of the results of the agency questionnaire, agencies that indicated a willingness to participate in the case examples were given follow-up questions requesting more specific details related to challenges and changes with implementing APCS, meeting the PM2 reporting requirements, and reporting APCS results to other agency offices. Follow-up requests included the following:

• Further describe required changes to meet PM2 requirements.
• Expand on challenges with PM2 reporting requirements and how they were resolved (or are being addressed).
• For each office that requests pavement condition survey data, indicate whether the data are used for information only or how the data are used.
• Further describe required changes for implementing automated pavement surveys.
• Expand on the challenge of transitioning to automated surveys and how it was resolved (or is being addressed).

The remainder of this chapter summarizes the responses received for transitioning to APCS, PM2 reporting, agency reports on pavement condition, and agency use of the results of pavement condition surveys.

Transitioning to Automated Pavement Condition Surveys

This section summarizes agency responses regarding the changes and challenges in implementing an APCS. In total, 10 agencies provided additional details related to APCS implementation.

Florida DOT

The Florida DOT noted a number of changes related to the implementation of an APCS. The DOT modified definitions of pavement condition distress to concurrently calculate HPMS distress and catalog both type and severity of wheel path and non-wheel-path cracking to address DOT business needs. Calculating the extent of cracking also required a change from ranges in percent to the percentage of affected area. For asphalt pavements, the DOT currently assesses cracking, rutting, and ride deficiencies and is planning to add raveling (index and mean profile depth) as an independent deficiency. The DOT noted the need to develop correlations with manual surveys and new thresholds for cracking and raveling. The automated system seems to match the windshield survey fairly well for cracking; however, the APCS struggles to match manual ratings of raveling. The DOT noted that federal and state testing requirements can be tedious from a data collection standpoint. To meet the PM2 0.10-mile reporting requirement, the DOT had to procure a statewide vendor data collection contract; however, the DOT noted the APCS will be conducted with agency-purchased equipment and staff in 2022.
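Florida's shift from coded ranges to a percentage of affected area, reported on 0.10-mile segments, can be illustrated with a short sketch. The segment length, assumed lane width, and field layout below are illustrative assumptions, not the DOT's actual procedure or schema.

```python
# Minimal sketch: express cracking as a percentage of affected segment area
# per 0.1-mile report segment, rather than as a coded range.
# Lane width and field layout are assumptions for illustration only.

SEGMENT_LENGTH_FT = 528.0   # 0.1 mile
LANE_WIDTH_FT = 12.0        # assumed lane width

def percent_cracking(cracked_areas_sqft):
    """Percent of the segment area affected by cracking, capped at 100."""
    segment_area = SEGMENT_LENGTH_FT * LANE_WIDTH_FT
    cracked = min(sum(cracked_areas_sqft), segment_area)
    return 100.0 * cracked / segment_area

# Example: 230 sq ft of mapped cracking in a 6,336 sq ft segment -> about 3.6%
print(round(percent_cracking([150.0, 80.0]), 1))
```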

The DOT has implemented a DQMP to address data quality; however, it is in the process of implementing best practices on how to handle APCS results and quality control requirements. The DOT has collected and analyzed 3 years of laser crack measurement system data on the Interstate highway system, and the results have agreed well from year to year. Further analysis will be required to validate the APCS results on the arterial network. The DOT noted that monthly verification of the APCS (collection and processing) is more time-consuming than the current windshield survey method, including verification of images for quality control. The DOT has provided significant staff training on the APCS vendor software and noted that operation of the APCS data collection vehicle is more complicated than manual condition surveys.

Illinois DOT

The Illinois DOT had a difficult time receiving approval of the DQMP due to the type of equipment used by the data collection vendor. Notification of compliance was received in October 2018; however, documentation of equipment certification was required for the 2019 collection cycle.

It was discovered that one of the DOT's preservation treatments caused rutting and cracking data to be falsely reported (values too high). Specifically, the DOT applies microsurfacing (18–20 inches wide) to address distress at the longitudinal joints. Depending on the width of the road, the treatment can encroach into the wheel path, falsely raising the rutting and cracking percentages. After finding several of these affected areas, the collection vendor worked with the DOT and the equipment manufacturer to identify when this occurred and adjust the automated distress classification.
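One way to screen for a longitudinal-joint microsurfacing effect of this kind is to discard rut and crack detections whose lateral position falls inside the treated band. The band width, offsets, and record layout below are assumptions for illustration; they are not the vendor's or the DOT's actual classification logic.

```python
# Hedged sketch: suppress distress detections that fall inside an assumed
# 20-inch microsurfacing band at the longitudinal joint, so the treatment
# edge is not misread as wheel-path cracking or rutting.

MICROSURFACING_BAND_FT = 20.0 / 12.0  # treated width at the joint (assumed)

def filter_detections(detections, lane_width_ft=12.0, treated_at_joint=True):
    """Drop detections whose lateral offset (ft from lane center, positive
    toward the centerline joint) lies inside the treated band."""
    if not treated_at_joint:
        return detections
    band_start = lane_width_ft / 2.0 - MICROSURFACING_BAND_FT
    return [d for d in detections if d["offset_ft"] < band_start]

detections = [{"type": "crack", "offset_ft": 2.1},   # wheel path: keep
              {"type": "crack", "offset_ft": 5.4}]   # joint band: drop
print(filter_detections(detections))
```
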
Training staff on data analysis continues to be a challenge, since part of the process to identify distress is subjective. Annual rater training is required, and random sample review of rater results is performed throughout the pavement rating process. The DOT is working on software methodology to automate as much of the rating process as possible. The intent is to have the software identify certain distress and severity levels for human raters to verify while working toward a fully automated process.

Another continuing challenge is the implementation of an asset management system. The DOT plans to have an online asset management tool for testing by fall 2021. The DOT continues to work with the provider on integrating pavement information into the pavement management software.

Mississippi DOT

The Mississippi DOT implemented an APCS for asphalt pavements in 2010 and noted challenges with changes in results based on the adoption of new technology. However, the DOT also noted that the APCS is more consistent than manual pavement condition surveys, that data quality validation and verification are less subjective, and that an APCS is much easier to conduct with vendor contracts. The DOT has used historical empirical equations for index and rating calculations. It is in the process of developing more accurate indices based on more than 10 years of APCS results to replace the old empirical-based equations. Similarly, the DOT is working on simplifying the treatment decision trees. The DOT indicated that integrating the APCS results into the pavement management system is more difficult due to the volume of data, the data storage requirements, and data management activities.

New Jersey DOT

The New Jersey DOT divides the collection of pavement condition data into HPMS and other, non-HPMS, state highways. The following discussion is based on the other, non-HPMS, state highway system.

The DOT is currently working on revising the pavement condition rating process to account for differences in the extent and severity of distress between previous semiautomated surveys and the current APCS. The DOT noted wide variations in summaries of network pavement condition from year to year and is working to understand and resolve the root of the differences. It modified the distress identification file a number of times and had to reprocess years of data. The DOT is revisiting the weighting coefficients and equation structure for compiling distress information into a single condition index to match expected values more closely on the basis of distress data and ROW image review by pavement designers. The DOT is willing to "reset" the network condition level on the basis of APCS technology, but it wants to be sure that the level is representative of actual conditions.

The DOT has spent a considerable amount of time analyzing, verifying, and categorizing collected distresses to ensure reliability of the collected data. Criteria and processes were developed to confirm the data collection vehicle was working properly and to verify the APCS results prior to staff training, data collection, and evaluation of results. On the basis of this effort, the DOT has implemented automated and manual data review practices to identify and remedy data quality issues. The DOT was required to train staff on criteria and procedures for conducting downward image review. It called on agency pavement design staff to help verify condition index ratings while reviewing the ROW camera images. In addition, the DOT developed automated queries of processed data to identify locations requiring manual review.
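An automated review query of this kind can be as simple as flagging 0.1-mile segments whose processed results look suspect so staff can pull the ROW and downward images. The thresholds and column names below are illustrative assumptions, not the DOT's actual rules.

```python
# Hedged sketch of an automated data review query: flag segments with a large
# year-over-year index swing or implausible values for manual image review.
import pandas as pd

def flag_for_review(current: pd.DataFrame, prior: pd.DataFrame) -> pd.DataFrame:
    """Return segments needing manual review (thresholds are assumptions)."""
    merged = current.merge(prior, on="segment_id", suffixes=("", "_prior"))
    suspect = (
        ((merged["condition_index"] - merged["condition_index_prior"]).abs() > 15)
        | (merged["rut_in"] > 1.0)                 # implausible rut depth
        | ~merged["pct_cracking"].between(0, 100)  # out-of-range percentage
    )
    return merged.loc[suspect, ["segment_id", "condition_index", "condition_index_prior"]]

current = pd.DataFrame({"segment_id": [1, 2], "condition_index": [62, 88],
                        "rut_in": [0.3, 0.2], "pct_cracking": [12, 4]})
prior = pd.DataFrame({"segment_id": [1, 2], "condition_index": [85, 90]})
print(flag_for_review(current, prior))   # segment 1 is flagged for review
```
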
North Dakota DOT

The North Dakota DOT indicated several challenges with APCS implementation. The DOT assigns an overall distress score based on different distress types and severities. To accommodate the APCS, the overall distress score was modified to account for distress over the entire 1-mile segment, rather than just the first 0.1 mile of every 1-mile segment. This adjustment also resulted in modification of distress triggers in the pavement treatment decision trees.

Oregon DOT

The Oregon DOT switched from manual condition surveys to an APCS in 2008. The DOT found that the APCS identified more cracking than was visible from a roadside manual (windshield) survey and that the technology has improved such that rating finer cracking is now possible. The DOT noted the ability to measure finer cracks is beneficial; however, it had to reduce the sensitivity of the cracking index calculation, in particular for lower-severity cracks. Similarly, the DOT noted the APCS capabilities (e.g., 3D equipment) are overly sensitive to fine map cracking on concrete pavement surfaces. The fine cracks tend to be indicative of wet concrete floating and surfacing issues and are not indicative of structural defects. Therefore, the DOT modified the concrete cracking protocol to make sure map cracks are not counted as longitudinal cracks, which typically must be manually rated to ensure accurate assessment.

For asphalt pavements, the DOT changed distress definitions to separate cracking in the wheel path from non-wheel-path cracking. However, the APCS actually helped make this possible and allowed the DOT to make improvements in the distress survey protocol. One noted weakness of switching from manual surveys to an APCS was the identification of raveling. Rating raveling from images is difficult and highly subjective, which has led to inconsistent results. In addition, the DOT decided to remove block cracking as a distress type and capture it as longitudinal and transverse cracking instead. When switching from a laser rut measurement system to a 3D system, the DOT found the rut measurements were different and had to change the low-, moderate-, and high-severity rutting thresholds to maintain consistency with previous measures.

The DOT noted that validation and verification of IRI data is relatively straightforward. However, validation and verification of transverse profiles for rut depth and 3D images for cracking is very intensive and time-consuming and requires a higher caliber of technician than was required for previous survey methods. The transition to an APCS has resulted in higher expectations: "100% coverage means the data will be accurate 100% of the time."

Tennessee DOT

The Tennessee DOT's advanced automated distress survey methods utilize machine learning algorithms to identify different types of distresses; this process relies on standard distress images to train the classification models. The DOT is working on establishing a standard distress library to train and evaluate different classification models. When completed, this will significantly improve data quality.
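Training classification models from a standard distress image library, as just described, typically amounts to fine-tuning an off-the-shelf image classifier. The directory layout, class names, and hyperparameters below are illustrative assumptions; the DOT's actual models and library are not detailed in this synthesis.

```python
# Hedged sketch: fine-tune a small image classifier on a labeled distress
# image library (assumed layout: distress_library/train/<class>/<image>.jpg).
import torch
from torch import nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("distress_library/train", transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

# Pretrained backbone with a new output layer, one unit per distress class
# (e.g., alligator, longitudinal, transverse, raveling).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:   # one pass shown; real training runs many epochs
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```
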
In transitioning from semiautomated to fully automated surveys, the DOT noted an increase in the extent of distresses due to the high-resolution APCS technology. To address this challenge, the DOT conducted a pilot effort in APCS data collection in one of the four regions in Tennessee. After the collection was completed, the results were compared with results from previous years in the same region as well as with the current year's collection from the other regions. The results were determined to be comparable in terms of overall indices, which are based on individual distresses collected by different collection systems (semiautomated and automated technologies).

Although the APCS is used for network-level data collection, the DOT also realized there are some potential factors causing data variability. As a part of the DOT's efforts to improve data quality, the current APCS contract requires network-level parallel testing, which aims to understand and quantify the data variability for network collection. The Materials and Tests Division purchased the same data collection van that is used by the current data collection vendor. The results and findings from parallel testing will be used for improving data collection procedures.

Texas DOT

The Texas DOT noted the APCS is much more efficient, particularly for more granular-level data collection. In addition, it took the DOT several years to transition to an APCS to ensure consistency of results through the development of a sound DQMP and to develop and provide rater training classes for agency staff, vendors, and the audit group to make sure all are on the same page. The data collection vendor is required to train its own staff on data collection. The DOT also has experienced staff who conduct the data verification process. In addition, the DOT noted the need to have validation and verification sites for conducting data quality checks; however, the agency noted that this is a time-consuming process, considering the size of the state highway network.

Utah DOT

The Utah DOT indicated having to adjust (recalibrate) model parameters upon implementing an APCS and to verify results based on historical information. The results of the APCS were not always more accurate than the previous manual pavement condition survey but were always more repeatable and, therefore, easier to use for forecasting. It became a matter of correctly calibrating the pavement model. However, rather than correlate the APCS results to the results of previous manual surveys, the DOT decided to recalibrate everything to the APCS and only use the historical data for system-level charting.

Data validation and verification required several discussions with the data collection vendor about distress classification to ensure consistency, particularly with cracking data. The DOT coordinated APCS activities through trial and error, trying to align outputs and expectations. The DOT determined optimal results by conducting data collection during the fall and reporting results the following spring.

Wyoming DOT

In order to implement the APCS, the Wyoming DOT was required to modify the cracking index calculation (based on percent cracking), since the "home-grown" cracking index was no longer applicable; the agency also revised its treatment decision trees to incorporate the new cracking index. The DOT also reported having issues with data quality related to percent cracking of concrete pavement. The DOT conducted a comparison of the APCS results with the historical manual surveys. This evaluation found a 60% correlation between automated and manual pavement condition surveys. The comparison also noted a 5% year-to-year variation in APCS results.

PM2 Reporting

Agencies were asked to expand on needed process changes and challenges with meeting PM2 reporting requirements. In total, 12 agencies provided additional information related to PM2 reporting.

Connecticut DOT

The Connecticut DOT, in general, is meeting the PM2 reporting requirements. The DOT is in the process of fully implementing a DQMP and is working with other New England SHAs to establish regional control sites for validating data collection equipment and operators. The DOT has encountered challenges in finding good control sites that meet criteria for all of the metrics. For example, a site that might serve as a good control site for IRI may not be suitable for cracking and rutting. In addition, the logistics of establishing sites have been challenging.

In relation to establishing (and adjusting) performance targets, the DOT relies heavily on pavement performance models to determine performance trends based on various budget scenarios. The Pavement Management Group is working with the University of Connecticut to improve the existing pavement performance models. At this time, MPOs have adopted the DOT targets.

Florida DOT

To meet the PM2 reporting requirements, the Florida DOT implemented an APCS for data collection on the Interstate system. In addition, the DOT established the "good," "fair," and "poor" condition categories, summarized condition over the required 0.1-mile length, and calculated and reported percent cracking on the basis of the FHWA HPMS definition.
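The good/fair/poor roll-up that Florida (and the other agencies in this section) had to produce follows fixed PM2 thresholds on each metric. The sketch below encodes the commonly cited asphalt thresholds from the FHWA PM2 rule; it is an illustration only, and the thresholds should be confirmed against 23 CFR 490 before any reporting use.

```python
# Hedged sketch of the PM2 good/fair/poor roll-up for an asphalt 0.1-mile
# segment. Thresholds (IRI in in/mi, rutting in in., cracking in %) follow the
# commonly cited FHWA PM2 rule; confirm against 23 CFR 490 before use.

def rate(value, good_below, poor_above):
    if value < good_below:
        return "good"
    if value > poor_above:
        return "poor"
    return "fair"

def pm2_overall(iri, rut_in, pct_cracking):
    metrics = [
        rate(iri, 95, 170),          # IRI
        rate(rut_in, 0.20, 0.40),    # rut depth
        rate(pct_cracking, 5, 20),   # percent cracking
    ]
    if all(m == "good" for m in metrics):
        return "good"
    if sum(m == "poor" for m in metrics) >= 2:
        return "poor"
    return "fair"

print(pm2_overall(iri=88, rut_in=0.15, pct_cracking=3))    # good
print(pm2_overall(iri=180, rut_in=0.45, pct_cracking=12))  # poor
```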

Illinois DOT

The Illinois DOT's APCS vendor provided marked-up downward images of cracking to categorize the cracking width. The DOT worked with the vendor to modify the wheel path widths and section lengths to meet the PM2 criteria for reporting percent cracking. Around 2013, the DOT conducted a statewide review of the non-Interstate NHS and worked with the local agencies and FHWA on needed modifications to meet the PM2 requirements. At this time, the DOT also ensured all local agency-owned non-Interstate NHS routes would be included in the DOT's APCS.

In past condition surveys, the DOT did not capture or report percent cracking according to the PM2 definitions. Therefore, the DOT worked with the data collection vendor to identify percent cracking during data analysis and report the results at 0.10-mile intervals.

Ohio DOT

The Ohio DOT noted the need to establish and update the components of the DQMP. In addition, the PM2 cracking definitions are different from those used previously by the DOT. To accommodate the changes in the cracking reporting requirements, the DOT spent significant time reprocessing distress data to conform to the PM2 definitions (e.g., identifying cracking within the wheel path, defining the width of the wheel path). Through this evaluation, the DOT also determined that fatigue cracking in the asphalt layer of composite pavements is minimal due to the support of the underlying concrete layer. Therefore, the DOT identified a need for a cracking index specific to composite pavements.

The DOT's processing, editing, and scrubbing of rutting, cracking, and faulting data has evolved over time. The DOT tried to use data checks and edits similar to those conducted for the IRI data; however, this proved to be challenging. While the DOT has automated the process as much as possible, manual review of the ROW and downward imagery is required to ensure high-quality rutting, cracking, and faulting data. The DOT also reported problems segmenting the IRI data to match the segmentation of the rutting, cracking, and faulting data exactly. Syncing the data sets required significant changes to the process.

Oregon DOT

The Oregon DOT was required to change the distress definition for fatigue cracking for asphalt pavements. Previously, fatigue cracking was included regardless of where it occurred (inside or outside of the wheel paths). Now, fatigue cracking is considered only as alligator and longitudinal cracking within the wheel paths. For JPCP and CRCP, prior to PM2, the DOT included longitudinal cracking regardless of where it occurred across the lane. Now, longitudinal cracking is counted separately for inside and outside the wheel path.

To meet PM2 requirements, the DOT also needed to adjust the lane of data collection for highways with six or more lanes. Previously, the DOT designated the center lane for data collection, but this was revised to the rightmost through lane.

Post-processing of IRI and rutting measurements was also modified to meet PM2 requirements. The DOT now post-processes the IRI and rutting measurement data in two ways:

• For the Interstate highway system and non-Interstate NHS, the DOT follows the HPMS protocol.
• For all other state routes, the data are processed to filter out bridges from the IRI and rut measurements, and the rutting depth measurement from whichever wheel path is greater is used instead of the average of both wheel paths (a sketch of both paths follows this list).
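The two post-processing paths in the list above differ only in how bridges and the two wheel-path rut depths are handled; the sketch below illustrates that difference. The record layout is an assumption for illustration, not Oregon's processing code.

```python
# Hedged sketch of Oregon's two rut post-processing paths: HPMS-style
# averaging of the two wheel paths versus, for other state routes, dropping
# bridge segments and keeping the deeper wheel path.

def rut_hpms(left_in, right_in):
    """HPMS-style value: average of the two wheel-path rut depths."""
    return (left_in + right_in) / 2.0

def rut_state_routes(segments):
    """Other state routes: exclude bridges, report the deeper wheel path."""
    return [
        {"segment_id": s["segment_id"], "rut_in": max(s["left_in"], s["right_in"])}
        for s in segments if not s["is_bridge"]
    ]

segments = [
    {"segment_id": 101, "left_in": 0.22, "right_in": 0.31, "is_bridge": False},
    {"segment_id": 102, "left_in": 0.10, "right_in": 0.12, "is_bridge": True},
]
print(rut_hpms(0.22, 0.31))        # 0.265
print(rut_state_routes(segments))  # bridge segment dropped, deeper wheel path kept
```
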
The PM2 reporting requirement at the 0.1-mile level has caused the DOT significant validation errors, especially at bridge transitions or larger bridges. The DOT is also required to validate errors related to short changes in pavement type (e.g., concrete weigh-in-motion slabs, intersection approach slabs on asphalt pavements). The DOT captures these changes at the 0.1-mile level; however, the pavement type codes in HPMS are not at the 0.1-mile level. When aggregating to larger segments, the DOT noted these issues were less significant. However, resolving these issues takes time, and the DOT questions whether this level of detail is necessary for reporting at the national level.

In addition, prior to PM2 legislation, the DOT conducted the APCS on the Interstate system every 2 years. To meet the PM2 requirement for annual data collection, the DOT has assumed an additional financial burden of $150,000 per biennium. The additional data collection is conducted only to satisfy PM2 requirements and has little value to the DOT, which is able to track, manage, and make decisions using a 2-year data collection cycle.

Another impact was noted as a result of the accelerated HPMS submittal date for the Interstate system from June to April of each year. This required the DOT to make changes to its internal data processing procedures and timelines. The multiple submittal process requires the DOT to "freeze" part of the data and network to any changes until the data for the rest of the system are collected. It also forces the DOT to process the data, conduct error checks, and extract and submit the work twice, resulting in higher cost due to additional employee time and effort.

New Jersey DOT

To meet the PM2 requirements, the New Jersey DOT revised the percent cracking metric to meet the HPMS definition of wheel path size and types of cracking to include in the calculation. In addition, the DOT needed to replicate all of the PM2 metric calculations for the percent of pavements in good and poor condition and differentiate between the Interstate highway system and non-Interstate HPMS. The DOT retrieved HPMS submission data from another division and developed a crosswalk by modeling the DOT network performance to the NHS network performance. It should be noted that the DOT maintains only 62% of the network; the remainder is shared by 82 other owners.

When the baseline targets were due, the most recent HPMS report card was missing data required to calculate network performance by using the new metrics on 85% of the network. The DOT provided supplemental data as an approximation of the network condition but did not have a high level of confidence in the reliability of the baseline targets. Additionally, the DOT had no experience or historical trend information to verify the reliability of the baseline. The DOT does not have investment, treatment, cost, trigger, reset, or deterioration information from the other 82 owners. Therefore, the DOT modeled and made broad assumptions about the performance of the remainder of the network on the basis of responses to surveys conducted with some of the larger owners.

In relation to meeting targets, the DOT "missed the memo" regarding the use of IRI-only information for the non-Interstate NHS targets for the first performance period. By the time this was pointed out in mid-September, prior to the October 1 submission deadline, it was too late to redo the modeling and coordinate with the other 82 NHS owners. Additionally, the DOT had no confidence in the approximated baseline, no experience with the new metrics, and no historic trend information. Therefore, the DOT significantly missed the non-Interstate NHS "poor" target strictly because of the unreliability of the baseline estimate and unfamiliarity with pavement performance based on the NHS's use of the new PM2 metrics.

Tennessee DOT

The Tennessee DOT started collecting cracking and faulting data per PM2 reporting requirements in 2014.
However, the DOT did not have sufficient historical data to establish prediction models for setting targets for the first 4-year performance period. To address this, the DOT tried to build relationships between PM2 performance measures and DOT measures. In relation to establishing SHA performance targets, the DOT noted the PM2 performance measure definition (good, fair, and poor) makes it a challenge to establish performance models.

To address this, the DOT built probabilistic relationships between the PM2 performance measures and the DOT measures. The results of this analysis indicated that IRI generally increased as the standard deviation of the rut depth increased. A single performance index did not correlate well with PM2 poor condition; however, the pavement quality index correlated well with PM2 good condition (FHWA 2018b). The developed probabilistic curves can be used in determining the performance targets.
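A probabilistic relationship of this kind can be built by fitting the probability that a segment is rated in a PM2 category as a function of an agency index such as the pavement quality index. The data, model choice (logistic regression), and values below are synthetic illustrations of the idea, not the DOT's analysis.

```python
# Hedged sketch: fit P(segment rated PM2 "good") as a function of an agency
# pavement quality index (PQI). Data and resulting coefficients are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

pqi = np.array([[55], [60], [68], [75], [82], [88], [93], [97]])  # agency index
is_good = np.array([0, 0, 0, 1, 1, 1, 1, 1])                      # PM2 "good"?

model = LogisticRegression().fit(pqi, is_good)

# Probabilistic curve: P(good) at selected PQI values, usable when translating
# agency condition targets into an expected PM2 percent-good.
for x in (65, 75, 85):
    print(x, round(model.predict_proba([[x]])[0, 1], 2))
```
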
The DOT conducts performance prediction and funding analysis on project segments. Therefore, reporting performance measures on the basis of 0.1-mile segments is a challenge. Segment aggregation influences the predicted percent good and poor; however, the DOT does not currently have a good way to address this challenge. The DOT may consider using probabilistic or machine learning methods to predict the percent good and poor for the 0.1-mile segments.

Finally, per PM2 requirements, DOTs can adjust performance targets (4-year targets) on the basis of midterm numbers (2-year targets). However, since the DOT does not have reliable models for performance measure prediction, it will be difficult to adjust the performance targets.

Texas DOT

The Texas DOT has distress definitions that differ from the PM2 distresses. To meet both needs, the DOT's current data collection method includes the distresses defined by PM2 (reported at each 0.1 mile) and DOT-specific distresses. For example, for asphalt pavements, PM2 uses fatigue cracking, rutting, and IRI to determine good, fair, and poor condition, while the DOT uses a condition score (ranging from 100 to 1) to determine five condition categories (very good, good, fair, poor, and very poor). The DOT noted challenges in using analytical tools to establish performance targets based on the PM2 metrics and had insufficient data for developing deterioration models for some of the PM2 metrics. Due to the uncertainty in the performance prediction models and funding levels, there will be a need to adjust the targets in the future.

Utah DOT

To meet PM2 reporting requirements, the Utah DOT added PM2 fields and distress definitions for PM2 and HPMS reporting (e.g., percent cracking). Noted challenges included establishing agency performance targets, since the DOT did not have extensive data to support the HPMS indices and calculations. Therefore, the DOT used approximations to establish the agency performance targets. Similarly, the DOT had no performance data for MPOs and had to collect the data and assist them with setting the performance targets.

Vermont Agency of Transportation

The Vermont Agency of Transportation made several changes to accommodate the PM2 requirements, including changing the wheel path width from 2.46 feet to 3.28 feet and adding the PM2 percent cracking calculation, which is different from what is used in the agency's pavement management system. The agency also reported challenges in establishing agency performance targets due to a lack of historical cracking data for developing the deterioration models necessary for confident performance prediction. Also, due to the lack of historical cracking data and deterioration models, it was difficult to establish the MPO performance targets. In addition, the agency's cracking index calculation uses all cracking types, severities, and extents. This differs from the PM2 cracking definition and required the agency to create a correction calculation based only on cracking within the wheel path.

Virginia DOT

To meet PM2 requirements, the Virginia DOT added the summary data needed for defining good and poor pavement sections. Because the DOT had conducted an APCS for more than a decade, the required data were available and only required calculations in support of the metric.

Wyoming DOT

In order to calculate the PM2 indices, the Wyoming DOT redesigned the process for incorporating new variables along with changing the treatment decision trees. The DOT was able to establish the PM2 baseline condition and performance targets using the historical composite index. The performance targets have not yet been regenerated with the redesigned process. Similarly, the MPO performance targets are also based on the historical composite index and have not yet been regenerated with the redesigned process.

Agency Reports on Pavement Condition

Agencies report pavement condition in relation to both HPMS requirements and agency-specific condition indicators. This section summarizes and illustrates pavement condition reports from four responding agencies.

Arizona DOT

The Arizona DOT Data Analytics Group conducts data collection, analysis, maintenance, and reporting for the HPMS, the linear referencing system, and the State Highway Log and provides data for the model inventory roadway elements (safety program) report. The processed HPMS data include a summary of miles in good and poor condition (Figure 36).

[Figure 36. 2019 Arizona DOT bridge and pavement condition. Source: Arizona DOT.]

Idaho Transportation Department

The Idaho Transportation Department (ITD) provides access to assessments of pavement condition via a performance measure dashboard and in the ITD's TAMP. The performance measure dashboard provides a "condition" summary number for a number of activities (e.g., bridges and pavements, construction projects completed on time, days to process vehicle titles). An example of the ITD performance measure for pavement condition on all state highways is shown in Figure 37. The ITD TAMP provides details related to asset measures, targets, and performance (ITD 2019).

[Figure 37. Idaho Transportation Department pavement performance measure dashboard (yellow highlighting added by ITD). Source: Idaho Transportation Department.]

Maine DOT

The Maine DOT provides public access to an asset management interactive map to showcase information related to the Interstate highway system, non-Interstate NHS, collectors, and local roads and streets. The interactive map allows the public to view information related to safety, bridge and pavement condition, and service (i.e., load restrictions, congestion). The pavement condition assessment includes IRI, rut depth, and PCR. An example of the Maine DOT's interactive map is shown in Figure 38. Condition categories are arranged by highway priority and unit of measure (i.e., inches for rutting, inches/mile for IRI). Selecting a roadway on the interactive map shows information related to condition data (see inset on Figure 38). In this case, the highway priority is 2 (high-priority, non-NHS arterials), the pavement condition is A (PSR between 3.3 and 4.0), and ride quality is A (<170 inches/mile).

[Figure 38. Example screenshot from the Maine DOT public map viewer. Source: Maine DOT.]

Utah DOT

The Utah DOT condition data map provides information related to roadway location (e.g., state route, milepost), pavement surface type, and pavement condition. The condition data map (see inset) provides the overall condition index, ride (IRI), rutting, faulting, and cracking information for the selected roadway (Figure 39).

Agency Use of Results of Pavement Condition Surveys

Tables 28 through 36 provide a summary of how the various agency offices utilize the pavement condition survey results.

[Figure 39. Example screenshot of Utah DOT statewide condition data map. Source: Utah DOT.]

Table 28. Use of results of pavement condition surveys: asset management.
  Connecticut: For submission to FHWA annually.
  Illinois: To quantify the state of acceptable conditions.
  North Dakota: For general information.
  Ohio: For asset management, TAMP, multiasset decisions, trade-offs, and multiasset optimization.
  Oregon: As a primary factor for overall pavement program strategy and project selection.
  Tennessee: To conduct analysis of funding needs on the basis of the pavement condition data; the asset management team uses results for TAMP reporting.
  Texas: For TAMP reporting.
  Utah: To develop performance charts, targets, and budgets; also used for the initial work plan.
  Virginia: In analysis for overall decisions and recommendations on maintenance of all assets.
  Wyoming: To manage pavement condition.

Table 29. Use of results of pavement condition surveys: budget.
  Illinois: To allocate district budgets for the next programming cycle.
  New Jersey (non-HPMS): For information only.
  Ohio: For work plan development along with planning, budgeting, and forecasting.
  Oregon: To inform the budgeting for pavement program funds, both in the State Transportation Improvement Program (STIP) and in the maintenance budget.
  Texas: In budget allocation.
  Utah: To set overall pavement and regional budgets.
  Virginia: To determine maintenance and rehabilitation needs, on a short- and long-term basis, that help in the determination of budget (based on agency condition index and detailed distresses).
  Wyoming: To help produce projected condition.

Table 30. Use of results of pavement condition surveys: construction.
  Florida: For general information.
  Ohio: As a source for existing conditions prior to construction if there are problems or issues on particular construction projects.
  Texas: To select construction projects.
  Wyoming: To develop a "pick list" from which the districts can choose construction projects.

Table 31. Use of results of pavement condition surveys: districts.
  Connecticut: To help identify roadways needing resurfacing (but also rely on roadway knowledge to determine where resurfacing work should be performed).
  Florida: For district offices to review and validate APCS results.
  Illinois: To make programming decisions (also used by local agencies).
  Mississippi: In various reports on pavement condition and recommendations.
  North Dakota: To help set priorities on project selection.
  Ohio: For routine use (particularly ROW and surface imagery) and, increasingly, for spot location analysis using collected inertial profiles.
  Oregon: To plan and select paving and chip seal projects.
  Tennessee: For the pavement management annual report and sometimes for project selection.
  Texas: For pavement management operations.
  Utah: To set overall pavement and regional budgets.
  Virginia: To make maintenance decisions and monitor the performance of the network.
  Wyoming: To develop a "pick list" from which the districts can choose construction projects.

Table 32. Use of results of pavement condition surveys: maintenance.
  Connecticut: To generate preliminary pavement resurfacing lists, perform field reviews, and recommend projects for the Maintenance Resurfacing Program using maintenance history data; to compare recommendations from the pavement management system with the Maintenance Resurfacing Program.
  Florida: To review and validate APCS results.
  Mississippi: To cross-reference with the maintenance program that keeps track of all maintenance orders, assets, and other information.
  North Dakota: For general information.
  Ohio: As supportive information and to help determine rideability corrections in a few districts that use IRI data.
  Oregon: As part of patching budget allocation.
  Tennessee: To conduct maintenance and rehabilitation analysis based on the historical pavement condition survey; to recommend the short-term (1–3 years) resurfacing list.
  Texas: In selection of maintenance projects.
  Utah: For informational use and to help with the work plan.
  Virginia: As an initial filter for choosing sections considered for various types of maintenance and rehabilitation on the basis of the DOT's condition index and detailed distresses rather than the PM2 "good," "fair," and "poor" condition categories.
  Wyoming: By maintenance crews performing chip seals and patching.

Table 33. Use of results of pavement condition surveys: materials.
  Florida: To conduct the APCS and publish the annual results.
  North Dakota: For general information.
  Ohio: Distress data, in particular, are helpful in solving some material-related issues; APCS downward imagery is very infrequently helpful in solving suspected material-related issues.
  Oregon: To validate material and construction-specification changes on the basis of condition data (primarily cracking and rutting).
  Tennessee: For forensic or postconstruction investigations.
  Texas: For in-field checking of material performance.
  Wyoming: To analyze specifics on projects for benefit.

Table 34. Use of results of pavement condition surveys: pavement design.
  Connecticut: As part of an iterative process of generating optimized construction programs; providing the pavement management list of pavement treatments; in conjunction with condition data in performing field evaluations to determine whether the recommended treatments are viable.
  New Jersey (non-HPMS): To assist in selection of the appropriate treatment on the basis of the extent and severity of distresses and to assess performance of previous treatments to help determine viability of future treatments.
  North Dakota: In concrete pavement design for calibrating the AASHTOWare Pavement Mechanistic-Empirical Design software.
  Ohio: In the decision-tree process as a first cut for treatment selection.
  Oregon: For project-level pavement design and analysis.
  Tennessee: For calibrating the rutting and cracking models in design software.
  Texas: In the pavement design process.
  Utah: For starting concept work and initial design.
  Wyoming: In project-level analysis.

Table 35. Use of results of pavement condition surveys: transportation planning.
  Illinois: For uploading to the DOT's website so local agencies can use the data in their programming efforts; certain identified distresses trigger allowed treatments.
  North Dakota: To help prioritize projects in the STIP.
  Ohio: In the decision-tree process as a first cut in recommending decision tree outcomes and as supportive information (data and imagery).
  Oregon: As part of the transportation planning process.
  Texas: In 4- and 10-year planning.
  Utah: For general information and for help with the TAMP, but mostly informational.
  Wyoming: To develop targets and support funding.

Table 36. Use of results of pavement condition surveys: upper management.
  Connecticut: To track how the DOT is doing relative to pavement performance and determine needed funding levels; to respond to questions from politicians, the press, and citizens about the condition of the pavement network.
  Florida: For distribution of targeted lane miles to district management for review and of pavement selections for arterial replacement; Interstate pavements are reviewed by both district and central office management.
  Illinois: For inclusion in the annual TAMP implementation documentation provided to FHWA.
  Mississippi: To update the DOT website used by upper management and the public and to visualize pavement condition and changes through the years (https://path.mdot.ms.gov/pavement_condition). The same data are available for internal use and include more granular data used by the districts. The Interstate ride rating is used to create a 3-year maintenance plan; the ride involves engineers from maintenance, research, districts, FHWA, and construction, and the plan is submitted to upper management for revision and approval. Upon request, the data are used, for example, for perpetual pavement, research studies, and other comparisons.
  New Jersey (non-HPMS): For information only.
  North Dakota: To help show the need for additional funding and tell the DOT's story about the overall condition of the pavement network.
  Ohio: To develop several key performance indicators for internal use (these are not directly comparable with national performance measures and do include IRI or manual PCR data).
  Oregon: By rolling up results into performance measures that are provided to upper management for decision support.
  Tennessee: To provide the annual pavement management report.
  Texas: To set statewide goals and funding allocation.
  Utah: For general information.
  Virginia: To inform strategic and long-term investment decisions.
  Wyoming: For overall condition and legislative funding.

Summary of Chapter 4

In response to follow-up questions, the agencies noted a number of challenges, modifications, and benefits of transitioning to an APCS. Challenges included the need to revise distress definitions (13 agencies), to change the method for calculating percent cracking (6 agencies), to make the percent cracking calculation for asphalt pavement specific to alligator and longitudinal cracking within the wheel path (6 agencies), and to develop correlations with manual surveys (4 agencies). While the consistency and repeatability of crack identification has improved with APCS, identifying raveling remains a challenge (2 agencies).

The time needed to conduct data quality management activities for the APCS is significantly longer than for manual surveys (6 agencies); however, APCS data are more consistent, and data quality validation and verification is less subjective (6 agencies). Three agencies reported implementing or developing automated processes to assist with data quality management activities. In addition, staff training for both data quality activities and data collection and analysis was required by 5 of the responding agencies.

With regard to the higher-resolution capabilities of the APCS, 5 agencies noted the need to adjust decision trees to account for the higher occurrence of low-severity cracking as compared with a manual survey. In relation to PM2 reporting, 5 agencies noted challenges in establishing performance prediction targets without having sufficient years of data based on the HPMS measures. Finally, agencies reported how various offices utilize the APCS results, including upper management (12 agencies), district and maintenance offices (11 agencies), the asset management office (9 agencies), and the budget office (8 agencies).
