CONTENTS

SUMMARY

CHAPTER ONE  INTRODUCTION
Background
Synthesis Objectives
Synthesis Scope and Approach
Terminology
Report Organization

CHAPTER TWO  LITERATURE REVIEW
Overview
Maintenance Quality Assurance Condition Assessment Approaches
Data Collection Activities to Support Maintenance Quality Assurance Programs
Maintenance Levels of Service
Using Maintenance Quality Assurance Results

CHAPTER THREE  STATE OF THE PRACTICE
Overview
Survey Content
Maintenance Quality Assurance Program Status
Data Collection and Quality Assurance Procedures
Use of Maintenance Quality Assurance Data
Innovations, Improvements, and Enhancements

CHAPTER FOUR  CASE EXAMPLES
Approach
Rationale and Motivation for Getting Started
Methods to Ensure Quality and Consistency
Impact of Selected Approach on Capabilities

CHAPTER FIVE  CONCLUSIONS
Overall Findings
Further Research

REFERENCES

ABBREVIATIONS, ACRONYMS, INITIALISMS, AND SYMBOLS

APPENDIX A: SURVEY QUESTIONNAIRE (WEB VERSION ONLY)

APPENDIX B: SURVEY RESPONSES

Appendices A and B are web-only and can be found at www.trb.org; search on NCHRP Synthesis 470.

Note: Many of the photographs, figures, and tables in this report have been converted from color to grayscale for printing. The electronic version of the report (posted on the web at www.trb.org) retains the color versions.

SUMMARY

MAINTENANCE QUALITY ASSURANCE FIELD INSPECTION PRACTICES

In the mid-1990s, Maintenance Quality Assurance (MQA) programs emerged as a method to estimate the maintenance funding needed to achieve a given level of service. These programs required agencies to adopt a method of documenting work accomplishments and productivity, reliable cost data, and an inventory of highway maintenance features. Since that time, the capabilities of MQA programs have evolved as data collection and analysis technology has improved and transportation agencies have become more customer focused. Today, agencies are using the results of MQA programs to estimate the cost of providing different maintenance service levels to the traveling public, essentially enabling maintenance personnel to defend budget requests and to establish reasonable performance targets under constrained conditions.

A number of initiatives have been undertaken to share MQA practices among practitioners. For instance, major national peer exchanges were conducted in 2004 and 2008, and a recent U.S. Domestic Scan (Best Practices in Performance Measurement for Highway Maintenance and Preservation) focused on the degree to which state MQA programs are linked to agency business and strategic plans. In addition, the University of Wisconsin at Madison established an MQA website where state transportation agencies can post MQA resources. In 2012, TRB published an NCHRP synthesis titled Performance-Based Highway Maintenance and Operations Management that summarized the role of MQA programs within the broader context of performance-based management, which is supported under the recent highway legislation commonly known as Moving Ahead for Progress in the 21st Century (or MAP-21). An objective of these initiatives has been to expand the use of MQA programs among state highway practitioners and to share experiences so that the practice continues to advance. This synthesis, which documents the current field inspection procedures being used to support a state MQA program, builds on previous efforts to document MQA practices and brings the documentation up to date.

The information contained in this synthesis was obtained using three different sources. First, a literature review was conducted to provide background information about the state of the practice and recent developments that have taken place in the implementation and use of MQA programs. Second, a survey was distributed to voting members of the AASHTO Subcommittee on Maintenance for each of the 50 states and the District of Columbia, asking for information on their MQA field inspection practices. A total of 40 agencies responded to the survey. Finally, follow-up interviews with representatives from eight departments of transportation (DOTs) were conducted to expand on the following three aspects of their programs:

• The rationale and motivation for initiating their MQA program.
• The procedures used to ensure the quality and consistency of the MQA data and results.
• The impact their methodology has had on how the MQA results can be used to support agency decisions.

The study found that most state DOTs have an MQA program in place or intend to implement a program within the next 5 years. Among the states that have an MQA program in place, most have had their program for more than 10 years and have made substantial changes to the program within the past 5 years. A number of factors have driven the interest and activity in the MQA area, with most state DOTs indicating that their program was initiated so agencies could

• Improve accountability,
• Estimate maintenance needs,
• Develop performance-based budgets,
• Monitor asset performance,
• Make good use of available funding, and
• Track and report maintenance activities.

The survey of state practice investigated data collection practices in six asset categories: drainage, roadside assets, pavements, bridges, traffic, and special facilities. Of these asset categories, the most complete inventories were established for pavements and bridges. Several assets within the traffic and special facilities categories also had complete inventories established in more than half of the state DOTs with MQA programs. The survey of state practice focused only on data being collected to establish inventories and to assess the condition of the various assets. It did not include questions about operational maintenance activities such as snow removal or mowing.

Most state DOTs report conducting manual surveys to collect the condition information, with annual surveys being most common. Bridges are the lone exception, as they are typically inspected every other year. Automated equipment is most commonly used for paved roadways, but the equipment is also used to some degree for other assets found along the road edge. The survey found that condition surveys for pavements and bridges are conducted outside of the MQA program in some agencies because information is already being collected as part of a pavement management condition survey or an annual bridge inspection.

Half of the state DOTs use the survey results in a hybrid model that combines features of both the pass/fail and graded condition assessment methods. Surveys are typically conducted by district or regional personnel, and central office personnel are responsible for conducting random checks of data quality. Manual survey methods are most commonly used, and nearly half of the state DOTs report using handheld computers to record information; pencil and paper remain common tools during MQA surveys.

Most state DOTs with MQA programs survey representative samples of the network to estimate statewide conditions. The samples are typically 0.10 mi long, and between 10% and 20% of the total samples are inspected. State DOTs may use statistical methods to estimate the number of samples to inspect, or they may simply set a number based on experience, but most agencies report that they strive for a 95% confidence level in the data (a minimal sample-size sketch follows this summary). The resources required to conduct the surveys vary among the states, with most reporting that they spend more than 6 person-months conducting surveys. The level of resources required depends on the sampling rate and the size of the network. In a typical MQA program, these efforts are spread out among several raters.

With one exception, the state DOTs with MQA programs are actively taking steps to manage data quality, making use of rating manuals, training programs, independent assurance checks, and data reasonableness checks to support their efforts. To help reduce bias, most states use a team of two raters to conduct surveys. A number of states certify their raters, and at least one state has posted the qualifications for raters on its website. Several states have initiated studies to statistically evaluate the number of samples that need to be inspected to provide a reasonable level of confidence in the data.

MQA data are used in a variety of ways to support agency decisions. Most states use the survey results to establish a level of service (LOS), with letter grades (A to F) being most commonly used. The results have been used to establish performance targets, and some states have established (or are establishing) links between their performance targets and resource requirements. Most of the state DOTs with MQA programs have a computerized maintenance management system (MMS) in place, yet fewer than half of the states use the MMS to estimate budget needs or to schedule work activities. In addition, few state DOTs report that their MMS is integrated with their pavement or bridge management systems.

The survey results are reported to maintenance and field personnel in virtually all of the state DOTs. Some agencies provide the information to other agency personnel, but few provide the information to elected officials or the public. Reports are the most common method of presenting information, but agencies also use websites and dashboards to communicate with stakeholders.

Most state DOTs report that their MQA program has helped their agency achieve more consistent conditions on a statewide basis and that the information has helped them establish maintenance priorities. The following factors have contributed most to the success of these programs:

• Support of upper management,
• Training,
• Simplicity of the program,
• Ease of use,
• Confidence in the data, and
• Buy-in from field personnel.

Planned enhancements will occur in the following areas:

• Implementation of new software,
• Development of handheld computer applications for recording field data,
• Adding GPS characteristics to the data, and
• Exploring the use of automated surveys.

Further research is needed in the following areas:

• Establishing more consistency in performance measures, to help state DOTs better communicate on an equal basis and to facilitate national reporting of maintenance needs.
• Monitoring progress made over the next several years by repeating in 3 to 5 years the survey conducted under this project.
• Improving the efficiency of data collection activities, by taking advantage of statistical analyses to determine the sample sizes needed to achieve a reasonable level of confidence in the data and by exploring the use of ongoing automated data collection activities being used for pavement management to support MQA efforts.
• Increasing the use of MQA results for planning and budgeting activities through the development of implementation guidance, peer exchanges, domestic scans, and workshops.
• Establishing LOS-cost relationships that allow states to better communicate with stakeholders the maintenance funding needed to achieve various levels of service.
• Improving the integration of capital and maintenance expenditures for whole-life costing, so the future impacts on maintenance from capital expansion projects can be better understood.
• Demonstrating the benefits of maintenance investments to improve communication with stakeholders and to help agencies justify expenditures on MQA programs.
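The summary notes that agencies typically rate 0.10-mi samples and that most strive for a 95% confidence level when deciding how many samples to inspect. As a rough illustration of that sample-size reasoning only (a sketch, not the procedure of any particular state DOT), the snippet below applies the textbook sample-size formula for estimating a proportion, with a finite population correction. The 12,000-mile network, the ±5 percentage-point margin of error, and the worst-case proportion p = 0.5 are assumed values chosen for the example.

```python
import math

def samples_needed(total_segments: int, z: float = 1.96,
                   margin: float = 0.05, p: float = 0.5) -> int:
    """Number of segments to inspect so a statewide pass/fail percentage is
    estimated within +/- margin at roughly 95% confidence (z = 1.96).
    Uses n0 = z^2 * p * (1 - p) / margin^2 plus a finite population correction;
    p = 0.5 is the conservative worst case when the true rate is unknown."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2    # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / total_segments)     # finite population correction
    return math.ceil(n)

# Hypothetical network: 12,000 centerline miles split into 0.10-mi samples.
population = 12_000 * 10
print(samples_needed(population))   # about 383 samples for +/-5% at 95% confidence
```

A single statewide estimate of this kind requires far fewer samples than the 10% to 20% sampling rates reported above; in practice the required counts grow when separate estimates are needed for each district, maintenance unit, or asset characteristic, which is consistent with the statistical sample-size studies several states have initiated.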

CHAPTER ONE

INTRODUCTION

BACKGROUND

In the mid-1990s, Maintenance Quality Assurance (MQA) programs emerged as a method to estimate the maintenance funding needed to achieve a given level of service. These programs required agencies to adopt a method of documenting work accomplishments and productivity, reliable cost data, and an inventory of highway maintenance features. Since that time, the capabilities of MQA programs have evolved as data collection and analysis technology has improved and transportation agencies have become more customer focused. Today, agencies are using the results of MQA programs, often in conjunction with computerized maintenance management systems (MMS), to estimate the cost of providing different maintenance service levels to the traveling public, essentially enabling maintenance personnel to defend budget requests and to establish reasonable performance targets under constrained conditions.

A number of initiatives have been undertaken to share MQA practices among practitioners since NCHRP published its Maintenance QA Program Implementation Manual in 1999 (Stivers et al. 1999). For instance, major national peer exchanges were conducted in 2004 and 2008, and a recent U.S. Domestic Scan (Best Practices in Performance Measurement for Highway Maintenance and Preservation) focused on the degree to which state MQA programs are linked to agency business and strategic plans (Markow 2012). In addition, the University of Wisconsin at Madison established an MQA website where state transportation agencies can post MQA resources and where materials from the national peer exchanges can be found (http://www.wistrans.org/mrutc/training-libraries/mqa/). In 2012, NCHRP published a synthesis titled Performance-Based Highway Maintenance and Operations Management that summarized the role of MQA programs within the broader context of performance-based management, which is supported under the recent highway legislation commonly known as Moving Ahead for Progress in the 21st Century (or MAP-21). One objective of these initiatives has been to expand the use of MQA programs among state highway practitioners and to share experiences so that the state of the practice continues to advance.

This synthesis, which documents the current field inspection procedures being used to support a state MQA program, builds on previous efforts to document MQA practices and brings the documentation up to date.

SYNTHESIS OBJECTIVES

The objective of this synthesis is to document current MQA field inspection practices administered within state DOT maintenance offices. The information contained in this document is intended to benefit transportation agencies that are building asset inventories and acquiring performance-based data on highway assets, such as roadside and drainage features, as part of an asset management program.

SYNTHESIS SCOPE AND APPROACH

The synthesis addresses all aspects of MQA field inspection practices used to manage physical assets, including the type of data collected, the methodology used to assess condition, and the processes in place to ensure the quality of the data. Performance metrics for operational factors (e.g., snow and ice removal, mowing, and accident response), which are often a large part of a maintenance budget, are not included in the scope of this synthesis.

In addition to summarizing the types of data collected and the methodologies used, the synthesis presents information on how the field inspection data are used to report highway conditions, to estimate budget needs, and to establish targeted levels of service. The rationale and motivation behind the adoption of an MQA program are also explored.

Overall, the information documented in this synthesis presents current practices in the following areas:

• The scope of the agency's MQA program, including program objectives, assets assessed, and assessment criteria used.
• The inspection processes used, including information on the frequency of inspection, the methodology used, the methods used to train inspectors, and the reliance on in-house versus contract personnel to conduct the surveys.
• The methods used to ensure the quality and consistency of the data collection processes and outcomes.
• The use of MQA data to support agency business processes and outreach activities with both internal and external stakeholders.
• New initiatives and technologies that are being considered to improve existing MQA programs.

The information contained in this synthesis was obtained using three sources. First, a literature review was conducted to provide background information about the state of the practice and recent developments that have taken place in the implementation and use of MQA programs. Second, a survey was distributed to voting members of the AASHTO Subcommittee on Maintenance (SCOM) for each of the 50 states and the District of Columbia, asking for information on their MQA field inspection practices. A total of 40 states (80% of the 50 states) responded to the survey. Finally, phone or face-to-face interviews with representatives from eight DOTs were conducted to expand on the following three aspects of their programs:

• The rationale and motivation for initiating their MQA program.
• The procedures used to ensure the quality and consistency of the MQA data and results.
• The impact their methodology has had on how the MQA results can be used to support agency decisions.

The eight state transportation agencies selected to participate in the interviews were chosen based on several factors, including their expressed willingness to provide additional information. To ensure that a range of approaches was represented in the case examples presented in the document, selection factors also included the age of the program, the use of automated or manual approaches to collect data, the degree of detail in the survey approach, and the use of in-house versus contract personnel to collect the data. The information obtained from these three sources was used to develop the findings presented in this synthesis.

TERMINOLOGY

Several terms used throughout the synthesis were defined in the survey of state practice for use by the practitioners in preparing their responses. These terms, and the definitions that were provided, are listed here. These same definitions were used in presenting the survey results in this document.

• Agency district/region—Different geographic areas of responsibility within a given agency.
• Agency division/section—Various areas within a given agency; includes such divisions/sections as materials, construction, roadway design, planning, maintenance, and so on.
• Asset—A physical item of roadway infrastructure that has value. Assets are sometimes referred to as roadway "furniture" or "features." An asset may be a single item, such as a sign, or a linear item, such as a road or guardrail section. An asset may also be a spatial item, such as a rest area or mowable acreage.
• Asset inventory—A physical count of assets. The count may be by coordinates, milepoints, road section, geographical area, road network, maintenance section, or other convenient method of sorting and reporting the amount of assets in the road system.
• Category—A logical group of maintained assets that are combined because of their common function or location on the highway, such as pavements and drainage structures.
• Characteristic—A specific performance measure that is rated for each feature.
• Condition assessment—A physical inspection and rating of roadway assets to determine the condition of individual assets, roadway sections, or overall road networks.
• Feature—An asset contained in a category. For instance, the traffic category might include guardrails, impact attenuators, and barriers.
• Independent assurance (IA)—An assessment of the reliability of test results that is performed by a third party not directly responsible for process control or acceptance testing. The survey found that other terms may be used by some agencies for this activity.
• Level of service (LOS)—A measure of the condition of individual assets as well as the overall condition of the roadway. LOS measures are generally specified in customer service terms related to safety, preservation, convenience, aesthetics, comfort, and mobility. Some agencies also measure LOS in terms of environmental impacts or legislative mandates.
• Maintenance management system (MMS)—A modern MMS at a high level of maturity integrates organization structure, business processes, and technology to provide a systematic approach for planning and executing an efficient, customer-oriented, and performance-based maintenance program. At the most basic level, an MMS tracks maintenance activities, costs, and resources.
• Maintenance Quality Assurance (MQA)—A process of physically inspecting and rating the condition of roadway assets and maintenance services. The quality assessment employs the same measures used to set performance targets. The data from the maintenance quality assessment are used to assess outcomes, actual performance, and maintenance LOS.
• Performance measure—A quantifiable measure of performance used to determine progress toward specific, defined organization objectives based on statistical evidence. Sample measures include height of grass, number of potholes per lane mile, and percent of signs below standard.
• Performance target—A targeted level of an activity or performance expressed as a tangible, measurable goal against which achievement can be compared. A performance target is usually a numerical rating, such as "pavement drop-off less than x inches," but it could also be an overall rating, such as a targeted LOS equal to "A" on an A to F rating scale.
• Quality assurance (QA)—All planned and systematic actions necessary to provide confidence that a product or facility will perform satisfactorily in service. The survey found that other terms may be used by some agencies for these activities.
• Quality control (QC)—Actions and considerations necessary to adjust a process to ensure that the process produces reliable results. The survey found that other terms may be used by some agencies for these activities.
• Sampling—A small group of sections selected from the entire population (usually statistically) that is used to represent the condition of the entire population.

Other terminology in this synthesis and in the literature review should be interpreted in context. The meanings will generally be clear from the definitions provided, the discussions presented, or the examples provided by the source. A brief sketch at the end of this chapter illustrates how categories, features, characteristics, and LOS grades fit together.

REPORT ORGANIZATION

This synthesis of practice is organized into the five chapters described here.

• Chapter one—Introduction. This chapter introduces the synthesis, providing background information and summarizing the scope and organization of the document.
• Chapter two—Literature Review. The findings from the literature are summarized and presented in this chapter. Relevant topics covered in the literature review include the MQA approaches that are generally used, the methodologies used to collect inventory and condition data, and the use of MQA results.
• Chapter three—State of the Practice. The results of the survey of state practice are presented in this chapter by topic area. These include the following:
  – Survey content;
  – MQA program status;
  – Data collection and quality assurance procedures;
  – Use of MQA data; and
  – Innovations, improvements, and enhancements.
• Chapter four—Case Examples. This chapter summarizes the information provided by the eight agencies that were interviewed, in terms of the three topic areas that were explored in more detail: the rationale and motivation for their program, the procedures used to ensure quality, and the impact the methodology has had on the use of their MQA data.
• Chapter five—Conclusions. The synthesis concludes with a summary of key observations from the findings and suggestions for further research and outreach in the MQA area.
• Appendices—Two appendices are included with the synthesis. The first appendix (Appendix A) provides a copy of the questionnaire that was distributed electronically to the state participants. The second appendix (Appendix B) presents the responses by state for each of the questions posed to the survey participants. Both appendices are available in the online version of the report.
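The terminology defined in this chapter describes a simple hierarchy: categories group features, features are rated on characteristics (performance measures) against thresholds, and sampled pass/fail ratings are rolled up into a letter-grade LOS that can be compared with a performance target. The sketch below illustrates that rollup. All of the names, thresholds, and grade breakpoints are hypothetical values chosen for illustration; they are not drawn from the survey responses or from any agency's MQA manual, and graded (non-pass/fail) or hybrid assessment methods would differ in the details.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Characteristic:
    """A performance measure with a pass/fail threshold (hypothetical example)."""
    name: str
    threshold: float                      # maximum value allowed for a "pass"

    def passes(self, measured: float) -> bool:
        return measured <= self.threshold

@dataclass
class Feature:
    """An asset type within a category, rated on one or more characteristics."""
    name: str
    characteristics: List[Characteristic] = field(default_factory=list)

# Hypothetical performance measure: potholes per lane-mile on 0.1-mi samples.
potholes = Characteristic("potholes per lane-mile", threshold=2.0)
pavement = Feature("flexible pavement", [potholes])

# Field ratings recorded for five sampled segments (invented numbers).
measured_values = [0.0, 1.0, 3.0, 0.0, 4.0]
percent_passing = 100 * sum(potholes.passes(v) for v in measured_values) / len(measured_values)

def los_grade(percent: float) -> str:
    """Map percent of samples passing to an A-F grade (hypothetical breakpoints)."""
    for grade, floor in (("A", 90), ("B", 80), ("C", 70), ("D", 60)):
        if percent >= floor:
            return grade
    return "F"

print(f"{pavement.name}: {percent_passing:.0f}% of samples passing -> LOS {los_grade(percent_passing)}")
# Output: flexible pavement: 60% of samples passing -> LOS D
# A performance target could then be stated as, for example, "LOS B or better statewide."
```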
