Bridge Element Data Collection and Use (2022)

Chapter 4 - Case Examples

Responses from the questionnaire and the literature search were employed to identify state DOTs that reported success in collecting bridge element data, ensuring the quality of the data, and integrating the data into asset management decision-making. These state DOTs were contacted to further discuss their practices and the rationale for developing them. Considerations in selecting interview candidates included the presence of the following:

• Practices for measuring preservation benefits based on element data or models.
• Existence and use of performance measures based on element data for decision-making.
• In-place procedures for ensuring element data quality.
• Experience in using element models or modeling frameworks for bridge- and network-level decision-making.
• Success in using and communicating element data analysis in agency reports, internal and external communications, and TAMPs.

Case examples from six state DOTs (Florida, Kentucky, Michigan, Minnesota, Rhode Island, and Wisconsin) are presented in this chapter. State DOTs that participated in the synthesis with a case example, a survey response, or both are depicted in Figure 4-1. Table 4-1 lists the agency practices highlighted in the examples, each of which focuses on a particular use of bridge element data. It should be noted that each case example illustrates only the highlighted practice from the selected agency; that agency's practices in the other areas are not presented here.

Florida DOT: An Element-Based Decision-Making Framework Using a Custom BMS and Bridge Element Performance Measures

The Florida DOT has built a custom BMS over the years and evidences a long-standing tradition of implementing research into practice (Sobanjo and Thompson 2001; Thompson, Sobanjo, and Kerr 2003; Sobanjo and Thompson 2007; Sobanjo 2011; Sobanjo and Thompson 2013). The backbones of the project selection process are two tools developed in house: the Project Level Analysis Tool (PLAT) and the Network Analysis Tool (NAT). PLAT and NAT work together and are tied to the AASHTOWare BrM software as the main source of bridge data. PLAT is entirely element driven and is supported by continual research on deterioration, unit cost, and risk models for Florida bridge elements. The Florida element deterioration models are Markovian models that are modified from the initial Pontis models (Sobanjo and Thompson 2001). The state first developed its deterioration models based on AASHTO's CoRe Structural Elements (AASHTO 1997) and then updated them in 2016 to address changes in the elements.

Table 4-1. Highlighted practices in the case examples. (The table cross-references five practices against the six case-example state DOTs: preservation benefits based on element data, performance measures based on element data, ensuring of element data quality, decision-making based on element data, and communication based on element data.)

Figure 4-1. State DOT participation in the synthesis.

PLAT develops three alternatives for each bridge: no action; management, repair, rehabilitation, and improvement (MRR&I); and bridge replacement (Figure 4-2). PLAT also serves as a project scoping tool, both for the central office and the districts. Element and bridge health indexes, life-cycle cost, and incremental utility-cost ratio represent the main drivers of PLAT and NAT. For candidate bridge replacement projects, PLAT performs life-cycle cost analysis to confirm that rehabilitation is not more cost-effective.

NAT compiles a bridge program (using PLAT outputs) and prioritizes for network-level performance outcomes (Figure 4-3 and Figure 4-4). NAT determines the priority order of PLAT-generated projects and the outcomes of different funding levels and projects.

Figure 4-2. Florida DOT Project Level Analysis Tool (PLAT) screenshots.
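At its core, ranking candidate projects by incremental utility-cost ratio is a budget-constrained prioritization. The sketch below is a generic illustration of that idea only, not the actual NAT algorithm; the candidate records, field names, and budget are hypothetical.

```python
# Generic sketch of incremental utility-cost prioritization (not the actual NAT algorithm).
# Each candidate is the preferred PLAT alternative for one bridge, carrying a utility gain
# over "no action" and an estimated cost (all values hypothetical).
candidates = [
    {"bridge": "B101", "utility_gain": 42.0, "cost": 350_000},
    {"bridge": "B205", "utility_gain": 18.5, "cost": 90_000},
    {"bridge": "B317", "utility_gain": 60.0, "cost": 1_200_000},
]

def prioritize(candidates, budget):
    """Rank candidates by utility gained per dollar and fund them until the budget runs out."""
    ranked = sorted(candidates, key=lambda c: c["utility_gain"] / c["cost"], reverse=True)
    program, remaining = [], budget
    for c in ranked:
        if c["cost"] <= remaining:
            program.append(c["bridge"])
            remaining -= c["cost"]
    return program

print(prioritize(candidates, budget=500_000))  # ['B205', 'B101']
```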

Figure 4-3. Florida Network Analysis Tool (NAT) sample dashboards.

Figure 4-4. Florida DOT NAT algorithm.

The Florida DOT customized the NBI Translator and applies it to map element condition data to NBI GCRs for state-owned structures. The Florida DOT is comfortable using the conversion tool for state-owned bridges, which are in good to excellent condition. The DOT does not use the conversion for locally owned bridges because some are in fair or poor condition, and the DOT does not find the conversion to be as accurate for such structures. Translated NBI GCRs are employed for predicting network-level NBI conditions in NAT outputs. In addition, NAT features built-in dashboards and creates reports and graphs of program outcomes for performance measures (e.g., percent good or excellent, health index, paint health index, life-cycle benefits).

The Florida DOT operates a performance-based maintenance and preservation program supported by bridge element performance measures for bridge decks, deck joints, steel protective coating systems, and concrete slope protective systems. Overall, the Florida DOT has committed to a high condition standard that may not apply to all state DOTs. The Florida DOT also notes that these measures and targets were developed by its Turnpike District, whose bridges are in better condition than those of other districts. Bridge preservation strategy in Florida aims to ensure the functionality of "protection elements" over the long term and the preservation of "core and structural" elements throughout the entire life of the bridge. The following Florida bridge element performance measures are based on both element and defect condition and incorporate network targets for each measure:

Bridge Deck
• Reinforced Concrete. At least 95% of deck area is in CS1/CS2, with less than 1% of area including Exposed Rebar (1090) and Cracking (1130) defects.
• Prestressed Concrete. At least 95% of deck area is in CS1/CS2, with less than 1% of area including Exposed Rebar (1090), Exposed Prestressing (1100), and Cracking (1110) defects.
• Steel. At least 95% of deck area is in CS1/CS2, with less than 1% of area including Corrosion (1000) and Connection (1020) defects.

Deck Joints
• Deck Joints (Steel Bridges). At least 95% of deck joint length is in CS1/CS2, with less than 1% of length including conditions allowing leakage (multiple defect types).
• Deck Joints (All Bridges). At least 90% of deck joint length is in CS1/CS2, with less than 5% of length including Metal Damage (2370) defects.

Steel Protective Coating Systems
• Steel Protective System. At least 95% of protective system area is in CS1/CS2, with less than 1% of area including Peeling/Bubbling/Cracking (3420) and significant Effectiveness (3440) defects.

Concrete Slope Protective Systems
• Concrete Slope Protection. At least 95% of slope protection area is in CS1/CS2, with less than 1% of area including Exposed Rebar (1090), Cracking (1130), and significant Seal (2330) and Settlement (4000) defects.

These performance measures are reported in, and tracked by, the Florida DOT Bridge Performance Report, exemplified in Figure 4-5. The performance measures are also used for managing turnpike maintenance in five maintenance zones (Figure 4-6). Zone 1 maintenance is performed under work order–driven regional repair contracts, while Zones 2 through 5 are managed through asset maintenance contracts.
Bridge performance measures are applied to enhance reporting and coordination between the Florida DOT and the asset management contractors by publishing data monthly, tracking condition trends, and producing detailed reports to support repair programs.
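As an illustration of how such a measure can be checked from inspection quantities, the sketch below evaluates the reinforced concrete deck measure listed above. It is not Florida DOT code; the function name and data structures are assumptions for the example.

```python
def meets_rc_deck_measure(deck_area, cs_area, defect_area):
    """Check the reinforced concrete deck measure: at least 95% of deck area in CS1/CS2
    and less than 1% of area with Exposed Rebar (1090) and Cracking (1130) defects.
    cs_area maps condition state -> area; defect_area maps defect code -> affected area."""
    good_share = (cs_area.get(1, 0) + cs_area.get(2, 0)) / deck_area
    defect_share = (defect_area.get(1090, 0) + defect_area.get(1130, 0)) / deck_area
    return good_share >= 0.95 and defect_share < 0.01

# Hypothetical inspection quantities (square feet).
print(meets_rc_deck_measure(
    deck_area=12_000,
    cs_area={1: 10_800, 2: 900, 3: 300},
    defect_area={1090: 40, 1130: 60},
))  # True: 97.5% of the deck is in CS1/CS2 and about 0.8% carries the listed defects
```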

Figure 4-5. Sample of the Florida DOT bridge performance report.

Figure 4-6. Samples of reporting and coordination using Florida bridge element performance measures.

Kentucky Transportation Cabinet: Ensuring Element Data Quality

The Kentucky Transportation Cabinet conducts a file review at its central office of 10% of all inspections each month to underpin QA (Kentucky Transportation Cabinet 2020). During these reviews, each element is reviewed for data quality, using inspection notes and photographs for each defect (Figure 4-7). The photographs are taken and presented to draw attention to defects, so the reviewer can easily check and confirm the accuracy of the element quantities. The Kentucky Transportation Cabinet documentation includes the policies and procedures followed during this review. If an inspection indicates an NBI GCR score of 6 or lower, inspectors provide additional notes. Two engineers in the central bridge office perform the QA reviews. Central office reviews are preceded by district reviews for every inspection.

In addition to the reviews addressing defects, element condition data are compared with NBI GCRs during central office reviews to verify that the element condition data are consistent with the NBI GCRs, conforming to the comparison guidelines in Figure 4-8.

The Kentucky Transportation Cabinet also utilizes custom conversion profiles to ensure consistency between element condition data and NBI GCRs (Figure 4-9). For example, if more than 2% and less than 12% of deck elements are in CS2, the deck NBI GCR is mapped to a 7. These conversion profiles are also applied to develop a bridge program prioritization list. The Kentucky Transportation Cabinet developed these custom profiles for more comprehensive use in future resource allocation and programming in the AASHTOWare BrM software.

As an additional QC step, the Kentucky Transportation Cabinet conducts a follow-up field inspection of one bridge for each inspector every year. During these reviews, the bridge is subjected to an independent inspection to confirm that all defects were measured and that element CSs were quantified accurately. Since streamlining the process, the Kentucky Transportation Cabinet has recorded fewer improvement plans and fewer plans of corrective action each year in the NBIS Metric Review, with 2020 marking the first year with no new plans of either type.

Figure 4-7. Example of Kentucky Transportation Cabinet defect photograph and inspection notes (vertical cracking on abutment).
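A minimal sketch of how a conversion profile like the ones described above could be expressed in code is shown below. Only the band mapping 2% to 12% of deck area in CS2 to a GCR of 7 comes from the text; the other bands, the thresholds, and the function itself are illustrative assumptions, not the Cabinet's actual profiles.

```python
def deck_gcr_from_profile(cs_share):
    """Map deck element condition-state shares (fractions of total quantity) to an NBI GCR
    using threshold bands. Only the 2%-12% CS2 -> GCR 7 band is taken from the Kentucky
    example; the other bands below are placeholders showing the structure of a profile."""
    cs2, cs3, cs4 = cs_share.get(2, 0.0), cs_share.get(3, 0.0), cs_share.get(4, 0.0)
    if cs4 > 0.0 or cs3 > 0.10:      # placeholder band
        return 4
    if cs3 > 0.0 or cs2 >= 0.12:     # placeholder band
        return 6
    if 0.02 < cs2 < 0.12:            # band cited in the text
        return 7
    if cs2 > 0.0:                    # placeholder band
        return 8
    return 9                         # essentially all deck area in CS1

# Hypothetical deck with 8% of its area in CS2 and the rest in CS1.
print(deck_gcr_from_profile({1: 0.92, 2: 0.08}))  # 7
```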

Michigan DOT: Automated and Scheduled Data Queries for Quality Assurance

The Michigan DOT has developed a series of queries that check data quality and consistency for various NBI items and bridge element condition data. Approximately 60 queries are run once a month to identify potential data quality issues. The first set of queries verifies that the elements are correct based on the structure type and structure material (e.g., the existence of slab elements in a non-slab bridge or the lack of a timber deck element for a bridge coded with a timber deck in NBI). The Michigan DOT also cross-references the deck material, deck protection, and deck wearing surface type to make sure that the element data are tracking with the associated NBI data items.

Figure 4-8. Map of Kentucky element conditions to NBI GCRs (CS1 → Good, NBI 9-7; CS2 → Fair, NBI 6-5; CS3 → Poor, NBI 4-3; CS4 → Severe, NBI 2-0).

Figure 4-9. Samples of Kentucky Transportation Cabinet custom conversion profiles for mapping element condition data to NBI GCRs.

For example, the queries verify that no black bar elements (BME for deck protection system, coated reinforcing steel) are utilized when a bridge was constructed after a certain year, or that asphalt overlay with membrane (Element 817) does not exist for a bridge whose NBI type of wearing surface (108A) is not coded as bituminous. Several queries check whether correct elements were inspected for specific material and design types, such as steel superstructure elements for a structure that is coded as such for NBI items (43 and 44). Examples of queries in this category are as follows:

• Not a slab bridge but slab element present.
• NBI coded timber deck, no timber deck NBE.
• Item 107 not coded as concrete deck but concrete deck elements present.
• Item 108C coded as 1, 2, or 3 or built before 1979 but black bar deck elements (BME for deck protection system, coated reinforcing steel) present.
• Item 108C coded as 0 but coated, stainless, nonmetal bar deck elements present.
• Item 107 coded as 1 or 2 but nonconcrete deck elements present.
• PS [prestressed] box beams with no deck or top flange NBE.
• Item 108A showing overlay, Element 810 concrete deck surface present.
• Item 108A incompatible with Element 815.
• Item 108A incompatible with Element 816.
• Item 108A coded wrong, bituminous wearing surface element present.
• Item 108B coded as 0, no membrane but Element 817 present.
• Item 43/44 not coded truss but truss elements present.
• Item 43/44 not coded arch but arch elements present.
• Item 43/44 not coded steel but steel elements present.
• Item 43/44 not coded reinforced concrete but reinforced concrete elements present.
• Steel stringers present but no steel girders, trusses, arches.
• Reinforced concrete stringers present but no reinforced concrete girders, arches.
• Prestressed concrete stringers present but no prestressed concrete girders, arches.
• Design main compared against existence of proper elements (e.g., piers, abutments).

The Michigan DOT also runs queries that check not only NBI deck area against deck element area but also roadway area against wearing surface area to verify that the sum of the deck elements falls within the tolerance of the inventoried deck area. The final set of queries compares element condition data with NBI GCRs. Deck, superstructure, and substructure GCRs are compared with CS thresholds to identify inspections that are not compatible. The Michigan DOT also maintains a wearing surface rating that is compared against wearing surface element conditions. These high-level thresholds are checked for each new inspection, and inspectors are alerted if the difference is at an out-of-tolerance level. Examples of queries that check deck area quantities and compare NBI GCRs with element condition data are as follows:

• Deck area >10% more than sum of deck element area.
• Roadway area >10% more than sum of WS [wearing surface] element area.
• Deck rating = 3 and CS3<20% and CS4<10%.
• Deck rating = 3 and CS3<10% and CS4<1%.
• Deck rating = 4 and CS2<20% and CS3<1% and CS4<1%.
• Deck rating = 5 and CS2<1% and CS3<1% and CS4<1%.
• Deck rating = 6 and CS4>10%.
• Deck rating = 7 and CS3>10% or CS4>5%.
• Deck rating = 8 and CS3>5% or CS4>1%.
• Deck rating <8 and 100% in state 1.
• Deck rating = 9 and CS2>5% and CS3>1% and CS4>1%.
• WS rating = 3 and CS3<20% and CS4<10%.
• WS rating = 4 and CS2<20% and CS3<1% and CS4<1%.
• WS rating = 5 and CS2<1% and CS3<1% and CS4<1%.
• WS rating = 6 and CS4>10%.
• WS rating = 7 and CS3>10% or CS4>5%.
• WS rating = 8 and CS3>5% or CS4>1%.
• WS rating = 9 and CS2>5% or CS3>1% or CS4>1%.
• WS rating <8 and 100% in state 1.
• Superstructure rating = 3 and CS3<20% and CS4<10%.
• Superstructure rating = 4 and CS2<20% and CS3<1% and CS4<1%.
• Superstructure rating = 5 and CS2<1% and CS3<1% and CS4<1%.
• Superstructure rating = 7 and CS3>10% or CS4>5%.
• Superstructure rating = 8 and CS3>5% or CS4>1%.
• Superstructure rating = 9 and CS2>5% or CS3>1% or CS4>1%.
• Superstructure rating <8 and 100% in state 1.
• Substructure rating = 3 and CS3<20% and CS4<10%.
• Substructure rating = 3 and CS3<10% and CS4<1%.
• Substructure rating = 4 and CS2<20% and CS3<1% and CS4<1%.
• Substructure rating = 5 and CS2<1% and CS3<1% and CS4<1%.
• Substructure rating = 6 and CS4>10%.
• Substructure rating = 7 and CS3>10% or CS4>5%.
• Substructure rating = 8 and CS3>5% or CS4>1%.
• Substructure rating <8 and 100% in CS1.
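As an illustration only (not the Michigan DOT's actual queries or MiBridge logic), one of the threshold checks listed above, "Deck rating = 7 and CS3>10% or CS4>5%," could be coded as a simple flagging rule. The inspection record layout below is assumed for the example.

```python
def flag_deck_rating_7(record):
    """Flag an inspection where the deck GCR is 7 but more than 10% of the deck element
    quantity is in CS3 or more than 5% is in CS4 (an out-of-tolerance combination)."""
    total = sum(record["deck_cs"].values())
    cs3_share = record["deck_cs"].get(3, 0) / total
    cs4_share = record["deck_cs"].get(4, 0) / total
    return record["deck_gcr"] == 7 and (cs3_share > 0.10 or cs4_share > 0.05)

# Hypothetical inspection: deck rated 7 but 15% of the deck element quantity is in CS3.
inspection = {"deck_gcr": 7, "deck_cs": {1: 7_000, 2: 1_500, 3: 1_500, 4: 0}}
print(flag_deck_rating_7(inspection))  # True -> the inspector would be contacted
```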

Every month, inspectors are contacted about the query results if any of their inspections are filtered by these queries. Although some queries trigger data changes, others simply alert the Michigan DOT to a unique scenario that needs attention. The Michigan DOT has been running these data quality checks monthly since May 2017 and is seeing improvement and increased consistency in the element data. The Michigan DOT went from hundreds to tens of errors in the annual NBI file. It attributes the improvement in part to these checks but also to the validations in MiBridge, its BMS. MiBridge confirms data entries against other data points on the screen or requires additional information based on other items on the screen during data entry. The Michigan DOT is constantly evaluating new techniques for checking its data to improve both the NBI data and the agency's confidence in all facets of the bridge data. The Michigan DOT is also working on a more formalized feedback loop with its field staff to provide comments on the checks and to document why a change cannot be implemented, or to document when a change is completed.

Minnesota DOT: Efforts to Increase Element Data Quality

The Minnesota DOT has been working on multiple facets of its QC and QA processes to increase both the quality of bridge element condition data and the state's overall bridge inspection program.

Proficiency Exam. In addition to the federal minimum requirements for certification as a team leader, all applicants must also pass a Minnesota-specific standardized test. That test requires the applicant to identify element (and NBI) ratings for a test bridge. The applicant's results are scored against the standard and dictate whether the applicant's knowledge is sufficient to conduct inspections on Minnesota's inventory of bridges.

QC. Each inspection agency in Minnesota must designate a registered professional engineer to provide oversight to its inspection program. This engineer, or program administrator, is then responsible for successfully executing accurate, timely, and compliant inspections for that inspection agency. The program administrator must review and approve all inspection results (including element ratings).

QA. The Minnesota DOT has a dedicated staff member who annually conducts compliance reviews on about 20% of the local agencies statewide. This effort amounts to about 43 reviews per year, with the goal of each agency undergoing an in-depth review over the course of every 5-year period. As part of the in-depth compliance review, the Minnesota DOT selects a minimum of four bridges from a local agency's inventory, and an independent inspection (including element ratings) of those bridges is performed. Any differences between the auditor's results and the conditions reported by the team leader are discussed. Grossly inaccurate ratings are subject to plans of corrective action, revocation of program administrator status, or withholding of state aid funds for bridge construction projects.

Refresher Training. Minnesota conducts annual training seminars at seven different locations statewide. Team leaders and program administrators are required to attend two of these training sessions every 4 years, but most attend every year. Part of this training includes interactive calibration exercises. In these exercises, team leaders and program administrators assess provided photos and assign condition ratings. After the participants individually rate a bridge that was inspected, the correct rating is revealed and discussed. These training sessions also often include presentations on any changes to element definitions and common miscoding errors found during compliance reviews.

Central Management. The Minnesota DOT central office staff initially processes all inventoried bridges. The list of elements for each bridge is initially populated from construction plans by two to three dedicated staff members, ensuring consistency across the state. The central office also implements statewide changes if new elements are required or if a definition is modified (requiring data migration).

Inspection Field Manual. The Minnesota DOT developed and authored its Inspection Field Manual (Minnesota DOT 2020) to detail its specific adoption of NBEs. The manual goes into greater detail than the AASHTO manual, helping guide inspectors to produce better consistency in element ratings. The manual also provides case photos to assist inspectors in visualization and comparison against conditions encountered in the field.

Rhode Island DOT: Automated and Scheduled Reports for Agency-Defined Steel Beam End Element

The Rhode Island DOT has a custom beam end element, Element 8107 (Figure 4-10). The agency uses this element to track the condition of steel beam ends, which are more susceptible to deterioration and may dictate the bridge load capacity as deterioration progresses. This element is defined as the last 5 feet of steel beam ends, which see the most corrosion because of joint deterioration (Figure 4-11). The quantity in linear feet of Element 8107 is added to that of Element 107 (steel open girder/beam NBE) for the federal data submissions. This breakout allows the Rhode Island DOT to track changes in that section of the beam, which may affect the bridge's load rating.
One of the applications of this element is creating a report that advises the load rating team to review the deterioration in CS3 and CS4 and determine whether it warrants a new load rating of the bridge (Figure 4-12). The report does not include pedestrian or bike path bridges, and it includes only those bridges with more than 10% of Element 8107 in CS3 or any quantity of that same element in CS4.
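A minimal sketch of the report's selection rule is shown below, assuming a hypothetical list of bridge records; it keeps non-pedestrian, non-bike-path bridges with more than 10% of Element 8107 in CS3 or any quantity in CS4.

```python
def beam_end_report_candidates(bridges):
    """Select bridges for the load rating report: skip pedestrian/bike path bridges and keep
    those with >10% of Element 8107 (steel beam end) in CS3 or any quantity in CS4."""
    selected = []
    for b in bridges:
        if b["pedestrian_or_bike"]:
            continue
        total = sum(b["el8107_cs"].values())
        if total == 0:
            continue
        cs3_share = b["el8107_cs"].get(3, 0) / total
        if cs3_share > 0.10 or b["el8107_cs"].get(4, 0) > 0:
            selected.append(b["bridge_id"])
    return selected

# Hypothetical records: quantities of Element 8107 in linear feet by condition state.
bridges = [
    {"bridge_id": "RI-001", "pedestrian_or_bike": False, "el8107_cs": {1: 80, 2: 10, 3: 15, 4: 0}},
    {"bridge_id": "RI-002", "pedestrian_or_bike": False, "el8107_cs": {1: 100, 2: 5, 3: 0, 4: 0}},
    {"bridge_id": "RI-003", "pedestrian_or_bike": True, "el8107_cs": {1: 40, 2: 0, 3: 0, 4: 20}},
]
print(beam_end_report_candidates(bridges))  # ['RI-001']
```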

Figure 4-10. Description of Rhode Island steel beam end element.

Figure 4-11. Examples of Rhode Island inspection photos for steel beam end element.

Figure 4-12. Example of the Rhode Island DOT steel beam end element report.

The report for the load rating team gives the percent and quantity of the steel beam ends in CS3 and CS4 and also notes the timing of both the last load rating of the bridge and the latest bridge inspection. The report also informs the load rater of the relevant RhodeWorks group for the bridge. (RhodeWorks is Rhode Island's 10-year transportation improvement plan.) This report is a monthly scheduled task in the AASHTOWare BrM software. When the quantities in CS3 and CS4 reach a certain limit, the finding is reported to the load rating department, which then determines whether it is severe enough that plating is needed at the beam ends. If the decision is to add plating, a new load rating would be conducted because of the advanced deterioration in CS3 and CS4, and a load rating also would need to be performed on the new plated beam end. The posting status may also be changed by the condition finding. If the bridge is on a critical transit route, steel repairs may be made until work can be performed within the 10-year plan.

Wisconsin DOT: Use of Element Deterioration Models and Preservation Performance Measures

The Wisconsin DOT uses element deterioration models for decks and wearing surfaces to forecast future conditions and to trigger recommended work actions in its BMS. In 2018, the Wisconsin DOT conducted a preliminary investigation of element deterioration based on inspection reports (2014–2018) and developed Markov-Weibull deterioration curves for the following elements and defects:

• Bare wearing surface (Element 8000).
• Asphaltic concrete overlay (Element 8511).
• Asphaltic concrete overlay with membrane (Element 8512).
• Thin polymer overlay (Element 8513).
• Concrete overlay (Element 8514).
• Delamination, spalls, patched area, pothole (Defect 3210).
• Cracking (Defect 3220).
• Reinforced concrete deck (Element 12).
• Delamination, spalls, patch areas, exposed rebar (Defect 1080).

This state investigation was based on limited data. When available, other nondestructive testing (NDT) results were compared with the age-based data. The Wisconsin DOT considers these initial models an age-based snapshot of the inventory data rather than models based on transition probabilities. The agency followed a simple approach and averaged the deterioration for each age group for each wearing surface. It did not allow a starting condition to influence the future condition but took each data point as is. The Wisconsin DOT then manipulated the shape of the Markov-Weibull curves (using median years for CSs and the Weibull shaping parameter) to match the age-based curves as closely as possible, focusing mostly on the condition thresholds that would trigger the BMS to recommend treatments. Because the Weibull shaping factor is intended to adjust the CS1-to-CS2 transition rate to better match age-based results, the agency was comfortable using a similar approach for all other transition rates.

In the future, the Wisconsin DOT intends to refine the data and develop more accurate do-nothing deterioration curves, given that the current models assume no bridge improvement work was done on the data set. For the data set utilized to develop the models, no major work was performed on the overlay elements, and even the analysis of Element 12 for the deck employed only inspections in which no work was performed on the deck.
Crack and concrete sealing may have occurred on concrete wearing surfaces; however, the Wisconsin DOT is actively enlarging the sealing program and thus increasing the service life of its wearing surfaces. The agency anticipates that most structures will outperform the current deterioration curves and that the BMS thus will produce conservative recommendations.
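As a generic sketch of the Markov-Weibull idea described above (not the Wisconsin DOT's fitted models), the probability that an element remains in CS1 can be written as a Weibull survival function parameterized by a median time in CS1 and a shaping parameter, with later transitions staying Markovian. The parameter values below are hypothetical.

```python
import math

def cs1_survival(age_years, median_years, shape):
    """Weibull probability that an element placed new is still in CS1 at a given age.
    median_years is the median time spent in CS1; shape is the Weibull shaping parameter."""
    scale = median_years / (math.log(2) ** (1.0 / shape))
    return math.exp(-((age_years / scale) ** shape))

# Hypothetical parameters: a 12-year median time in CS1 with a shaping parameter of 1.8.
for age in (0, 6, 12, 24):
    print(age, round(cs1_survival(age, median_years=12.0, shape=1.8), 3))
# At age 12 the survival probability is 0.5 by construction; it declines with age.
```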

The Wisconsin DOT also contends that data refinement to further develop the models would be better served by updated and expanded inspection data (up to 2021). The agency may also create new curves, using only the IRT and chain drag data and not the defect data in the inspection. The Wisconsin DOT also recently implemented a deck scanning policy that drastically increased the NDT performed on bridge decks statewide. These deck scans are being entered in the Wisconsin Highway Structures Information System and will influence inspection condition quantities and thus the deterioration curves. The Wisconsin DOT internal investigation of bridge element deterioration models served as the impetus for the ongoing Transportation Pooled Fund Program Research Project on Bridge Element Deterioration for Midwest States, led by the Wisconsin DOT.

The wearing surface deterioration models are currently applied in the Wisconsin Structures Asset Management System (WiSAMS) (Figure 4-13). WiSAMS deteriorates deck overlays and defects during a specified analysis period to identify eligible work actions. WiSAMS also incorporates triggers or eligibility rules that are based on element condition data and other variables. For example, one of the condition criteria rules addresses recommending a reapplication of a thin polymer overlay (TPO), which references the deteriorated wearing surface condition, expressed as

NUMOVERLAY = 0 AND NUMTHINPOLYMEROVERLAYS < 4 AND NDEC ≥ 5 AND (((Q2OF1080 + Q3OF1080 + Q4OF1080)/QTOF1080PARENT) < 0.01) AND (((Q3OF8513 + Q4OF8513)/QTOF8513) > 0.15)

Figure 4-13. Wisconsin DOT's WiSAMS GUI.

where

• NUMOVERLAY is an inventory reference to the number of thick overlays. For this rule, the intent is to exclude TPOs applied on top of concrete overlays.
• NUMTHINPOLYMEROVERLAYS is an inventory reference to the number of TPOs applied. Currently, WiSAMS assumes a maximum of four reapplications based on milling practices.
• NDEC is the deck NBI component rating. Because of the variability inherent in the NBI rating system, this condition is flexible. However, this rule does not necessarily support placing a TPO on a deck with an NBI rating of 5.
• Q2OF1080 is the quantity of Defect 1080 (delaminations, spalls, patch areas, and exposed rebar) in CS2. This defect is associated with the underside of the deck or slab element. Because of the very slow development of this defect, it is not deteriorated in WiSAMS unless the last inspection recorded more than 5% of deck or slab area in CS2 or worse.
• QTOF1080PARENT is the total quantity of the parent element (deck or slab) of Defect 1080.
• Q3OF8513 is the quantity of Element 8513 (i.e., TPO) in CS3.
• QTOF8513 is the total quantity of Element 8513.
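Read with the definitions above, the TPO reapplication rule could be evaluated roughly as follows. This is an illustrative reading of the published rule, not WiSAMS code, and the inspection record layout is assumed.

```python
def tpo_reapplication_eligible(r):
    """Evaluate the thin polymer overlay (TPO) reapplication rule from the quantities in an
    inspection record r (keys mirror the rule's variable names)."""
    deck_soffit_defect_share = (r["Q2OF1080"] + r["Q3OF1080"] + r["Q4OF1080"]) / r["QTOF1080PARENT"]
    tpo_poor_share = (r["Q3OF8513"] + r["Q4OF8513"]) / r["QTOF8513"]
    return (
        r["NUMOVERLAY"] == 0
        and r["NUMTHINPOLYMEROVERLAYS"] < 4
        and r["NDEC"] >= 5
        and deck_soffit_defect_share < 0.01
        and tpo_poor_share > 0.15
    )

# Hypothetical record: no thick overlays, one prior TPO, deck GCR of 6, little soffit
# distress, and 20% of the TPO quantity in CS3/CS4.
record = {
    "NUMOVERLAY": 0, "NUMTHINPOLYMEROVERLAYS": 1, "NDEC": 6,
    "Q2OF1080": 10, "Q3OF1080": 0, "Q4OF1080": 0, "QTOF1080PARENT": 5000,
    "Q3OF8513": 900, "Q4OF8513": 100, "QTOF8513": 5000,
}
print(tpo_reapplication_eligible(record))  # True -> the deck would be flagged for a TPO reapplication
```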

Examples of deterioration curves are presented in a number of figures. The first sample curve, for Defect 3210 (delamination, spalls, patched areas, potholes), is an age-based curve (Figure 4-14). While plotting this curve, thermographic inspections were also plotted for comparison (Figure 4-15). The Wisconsin DOT expects to rely increasingly on NDE in the future and plans to utilize the data from NDE inspections to improve deterioration curves. The second sample curve addresses TPOs (Element 8513). The agency plotted two curves for this element: a Markov-Weibull deterioration curve (Figure 4-16) and an age-based deterioration curve (Figure 4-17). The two curves do differ; the overall Markov-Weibull deterioration curve shows faster deterioration. The Wisconsin DOT plans to refine these curves by adding more data points from future inspections.

Figure 4-14. Wisconsin DOT deterioration curve for delamination/spalls/patched areas in bare wearing surfaces.

Figure 4-15. Wisconsin DOT percent delamination by wearing surface age based on thermographic inspections.

Figure 4-16. Wisconsin DOT Markov-Weibull deterioration curve for thin polymer overlay.

The Wisconsin DOT also maintains preservation work eligibility rules that are based on element condition data as well as NBI data. The eligibility matrixes, such as the concrete deck and other bridge element preservation eligibility matrixes, are available in the Wisconsin DOT Bridge Manual chapter on bridge preservation (Wisconsin DOT 2020). These rules are designed to guide the decision-makers in selecting a preservation activity and to demonstrate the potential enhancement to NBI values, and the anticipated service life increase, that would result from that activity.

The Wisconsin DOT also utilizes bridge element condition data for internal communication. Each year, the agency generates an internal report (the annual bridge report), which includes element-based preservation performance measures (Table 4-2) and condition targets for strip seal joints, coated steel surfaces, and bearings. Element-specific condition targets are tracked and also are listed in the Wisconsin DOT Bridge Manual chapter on bridge preservation (Wisconsin DOT 2020). Agency targets include maintaining at least 90% of coated steel surfaces in CS2 or better, 95% of bearings in CS2 or better, and 85% of bridges with strip seal joints that are effective in stopping leakage.

Figure 4-17. Wisconsin DOT age-based deterioration curve for thin polymer overlay.

Table 4-2. Wisconsin DOT element-based bridge preservation performance measures.
• Objective: Maintain bridges in good or fair condition. Target/goal: 95% of bridges. Performance measure: percentage of bridges in good or fair condition (NBI rating 5 or higher).
• Objective: Maintain bridge decks in good or fair condition. Target/goal: 95% of bridge decks. Performance measure: percentage of bridge decks in good or fair condition (NBI rating 5 or higher).
• Objective: Maintain effective expansion joints that do not leak. Target/goal: 85% of bridges with strip seal joints that are effective in stopping leakage. Performance measure: percentage of bridges with 90% of their strip seal expansion joints in CS2 or better (effective joint).
• Objective: Maintain coated steel surfaces in CS2 or better. Target/goal: 90% of coated steel surfaces. Performance measure: percentage of coated steel surfaces in CS2 or better (effective).
• Objective: Maintain bearings in CS2 or better. Target/goal: 95% of bearings in CS2 or better. Performance measure: percentage of bearings in CS2 or better.
• Objective: Seal eligible concrete decks (NBI rating 6 or higher) with sealant every 3–5 years. Target/goal: 20% of eligible concrete decks sealed. Performance measure: number of decks sealed (square feet of deck area) each year during a 5-year period.

Using element deterioration models has enabled the Wisconsin DOT to shift away from relying solely on NBI GCRs. The NBI GCRs are still part of the WiSAMS rules, but where the rules have element-based triggers, the NBI GCRs act more like safety nets than the actual triggers. As an example, assume that a TPO is eligible for an NBI GCR that is greater than or equal to 5. The element condition should trigger the TPO at a rating much higher than 5, but the NBI GCR value provides a boundary in case an inspector incorrectly coded (or neglected to update) element data. Another benefit of element deterioration models is their capability to set different thresholds for different defects and CSs. For example, such models can treat delamination and spalls differently than abrasions and can establish the work action triggers accordingly to capture the risk associated with each defect.
