Data to Support Transportation Agency Business Needs: A Self-Assessment Guide (2015)

Chapter 4 - Phase 2: Conducting the Assessment

This chapter provides specific guidance for the champion and the facilitators on conducting data value and data management assessments. It assumes that agreement has been reached on a set of assessments to conduct and that participants have been identified for each assessment team based on the guidance in Chapter 3. Following a summary of each assessment tool, step-by-step guidance is provided for conducting assessments and summarizing the results that will be required for Phase 3.

Data Value Assessment Tool Overview

The Data Value Assessment Tool takes the assessment team through a process of rating the availability, quality, and usability of data required to meet a defined set of business functions. As described in Chapter 2:

• Data Availability addresses whether the agency has the right kinds of data in place, at the right level of detail, with sufficient coverage to meet its information needs. Example: if a project manager needs to understand how much of their budget has been expended, but there are no tracking systems in place for this, one would say that expenditure data is not available.
• Data Quality addresses whether the data available are good enough to meet the agency's information needs. The data value assessment looks at three aspects of data quality of particular concern to data users—currency, accuracy, and completeness. Example: if a project manager gets budget status reports, but they are 1 month old, they may not be sufficiently current or timely. If reports include only internal staff charges but not contractor costs, one might say that the expenditure data is not sufficiently complete. If there are known errors or inconsistencies between reports, the data might not be sufficiently accurate or reliable to meet agency needs. Additional aspects of data quality are considered under the data management assessment.
• Data Usability addresses whether the agency's data is being provided in a convenient form for analysis and interpretation. Data usability includes consideration of how easily data can be accessed and how well it is integrated, analyzed, and presented in a convenient form for users and customers. Example: if a project manager gets two sets of monthly reports—one for internal charges and one for contractor charges—and the manager must combine them manually to get the full picture, one would say that the expenditure data has poor usability.

Availability is assessed with respect to specific business activities. Quality and Usability are assessed for each of the major data sources used for performing the selected business activities. The rating categories are Excellent, Good, Fair, and Poor. To calculate weighted ratings across various data sources, the assessment also asks team members to rate the importance of each data source to each business activity. Figure 6 summarizes the different elements and ratings of the data value assessment.

Figure 6. Data value assessment elements and ratings.

Data availability—is data available at the right level of detail, with sufficient coverage?
  Poor: Little or no data available to support this activity
  Fair: Limited data available—large gaps remain
  Good: Basic data is available—some gaps remain
  Excellent: Sufficient data is available to meet needs

Data quality—are data sufficiently accurate, credible, complete, and current to support decision making?
  Poor: Quality not sufficient—data not useful
  Fair: Lack of currency, accuracy, or completeness limits value
  Good: Acceptable but needs improvement
  Excellent: Sufficient to meet needs

Usability—can data be easily accessed, integrated, analyzed, and presented as needed to support decision making?
  Poor: Requires substantial effort to get data into usable form
  Fair: Requires moderate effort to get data into usable form
  Good: In usable form but reporting improvements helpful
  Excellent: In usable form and no improvement is needed

Once an agency has applied the Data Value Assessment Tool for all of the key major functional areas, a high-level view, such as that shown in Figure 7, can be developed to highlight variations in how well data is working to meet agency needs. Each data value assessment provides one row of this chart.

Figure 7. Sample data value assessment summary.

Business Area             Availability   Quality   Usability
Maintenance Management    Good           Fair      Fair
Pavement Management       Excellent      Good      Good
Safety Planning           Excellent      Good      Fair
Performance Management    Fair           Fair      Good
Project Scoping           Good           Fair      Good
Construction Management   Good           Good      Fair
Corridor Planning         Good           Good      Poor

Data Management Assessment Tool Overview

Overview of the Tools

Two data management tools are available—one for an agency-wide assessment and one for an assessment of a specific data program or data category. These tools are similar in structure but vary in the assessment elements and criteria. The data management tools take the agency-wide or data program management assessment team through a process of rating current data management processes. As described in Chapter 2, the following elements are considered:

• Data Strategy and Governance is concerned with how the agency and individual business units decide what data to collect and how best to manage and deliver it. This area of concern

includes establishing, enforcing, and sustaining data management strategies, roles, accountability, policies, and processes.
• Data Architecture and Integration is concerned with practices to standardize and integrate data and includes standardizing spatial referencing and other key linkages across data sets and minimizing data duplication and inconsistencies.
• Life Cycle Data Management is concerned with the operational aspects of managing data to ensure that it is adequately maintained, preserved, protected, documented, and delivered to users.
• Data Collaboration is concerned with achieving efficiencies through processes to coordinate data collection and management within the agency and to partner with external organizations to share data.
• Data Quality Management is concerned with practices to define required levels of quality, measure and report data quality, ensure quality as new data is acquired, and improve the quality of existing data.

Each of these areas is broken into a set of assessment sub-elements. For each sub-element, different maturity levels are defined that characterize a progression from an ad hoc approach to data management to an approach that is well-defined, documented, and institutionalized within the agency or data program. The complete set of assessment elements and sub-elements is provided in Appendix C, along with the criteria for each maturity level, a mapping of each sub-element to the AASHTO data principles, a discussion of the benefits of moving up the maturity scale, and a listing of relevant improvement actions that can be considered. Table 2 summarizes the different data management assessment elements, sub-elements, and maturity levels. Two sub-elements are only applicable for the agency-wide assessment; one is only applicable for data-program-specific assessments.

Once an agency has completed a data management assessment for agency-wide functions and/or specific data programs or categories, a high-level view such as that shown in Figure 8 can be developed to highlight variations in how well data is working to meet agency needs. Each data management assessment provides one row of this chart.

Table 2. Data management assessment elements and maturity levels.

Maturity levels (applicable to each element):
  1: Initial—Ad hoc and event driven; success due to heroic efforts of individuals
  2: Developing—Recognized need for improvement; pilot initiatives under way
  3: Defined—Defined and documented processes not yet stabilized or widely socialized
  4: Functioning—Implemented processes, operating and adding value
  5: Sustained—Evaluated and improved processes, sustained over time

1. Data Strategy and Governance: Leadership and management practices to manage data as a strategic agency asset
  1.1 Strategy and Direction: Leadership commitment and strategic planning to maximize value of data to meet agency goals
  1.2 Roles and Accountability: Clear roles, accountability, and decision-making authority for data quality, value, and appropriate use
  1.3 Policies and Processes (Agency-wide Only): Adoption of principles, policies, and business processes for managing data as a strategic agency asset
  1.4 Data Asset Inventory and Value: Tracking of agency data assets and their value added
  1.5 Relationships with Data Customers (Program Specific Only): Connections between data producers and users

Table 2. (Continued). The maturity levels listed above apply to each element.

  1.6 Data Management Sustainability: Continuity of data management knowledge and expertise through staff transitions

2. Data Life Cycle Management: Practices for managing data throughout its life cycle from collection to archiving or deletion
  2.1 Data Updating: Well-defined and coordinated data update cycles
  2.2 Data Access Control: Well-defined policies and guidelines for managing access to data sets
  2.3 Data Findability and Documentation: Availability of data catalogs and dictionaries that enable discovery and understanding of available agency data assets
  2.4 Data Backups and Archiving: Guidelines and procedures for protection and long-term preservation of data assets
  2.5 Data Change Management: Processes to minimize unanticipated downstream impacts of data changes
  2.6 Data Delivery: Delivery of data to users in various convenient, useful, and usable forms

3. Data Architecture and Integration: Technical standards, processes, tools, and coordination mechanisms to maximize data integration and minimize duplication
  3.1 Location Referencing: Common location referencing methods across agency data sets
  3.2 Geospatial Data Management (Agency-wide Assessment Only): Standardized approach to managing geospatial data
  3.3 Data Consistency and Integration: Standards and practices to ensure use of consistent coding and common linkages so that different data sets can be combined to meet business information needs
  3.4 Temporal Data Management: Standardization of date-time data elements to enable trend analysis and integration across data sets collected and updated on varying cycles

4. Data Collaboration: Internal and external collaboration to maximize data sharing and avoid duplication of effort
  4.1 Internal Agency Collaboration: Collaboration across agency business units to leverage opportunities for efficiencies in data collection and management
  4.2 External Agency Collaboration: Partnerships with external entities to share data and avoid duplication

5. Data Quality: Standards and practices to ensure that data is of sufficient quality to meet user needs
  5.1 Data Quality Measurement and Reporting: Metrics and reporting to ensure user understanding of current data quality
  5.2 Data Quality Assurance and Improvement: Practices for improving the quality of existing data and ensuring the quality of newly acquired data

Figure 8. Sample data management assessment summary.

Data Program           Strategy & Governance   Life Cycle Mgt.   Arch. & Integration   Collaboration   Quality         Overall Level
Agency-wide            2-Developing            3-Defined         2-Developing          2-Developing    Not Assessed    2-Developing
Traffic Monitoring     3-Defined               4-Functioning     3-Defined             5-Sustained     4-Functioning   4-Functioning
Crash Data             5-Sustained             4-Functioning     3-Defined             5-Sustained     4-Functioning   4-Functioning
Pavement Inspection    1-Initial               4-Functioning     3-Defined             1-Initial       5-Sustained     3-Defined
STIP/Capital Projects  3-Defined               5-Sustained       1-Initial             2-Developing    2-Developing    3-Defined
Financial              5-Sustained             5-Sustained       4-Functioning         Not Assessed    4-Functioning   4-Functioning

Assessment Phase Guidance

Purpose of the Assessment

The purpose of the assessment is to identify opportunities to improve decision making and make more effective use of existing data. Agencies do not need to strive for data perfection across the board. Such data perfection would be neither feasible, given resource constraints, nor necessarily desirable from a benefit-cost perspective. The assessment tools provide a framework within which agencies can identify the current state of data and the current state of data management practices. This provides a baseline for discussion about potential improvements. Although application of the assessment tools will suggest potential improvements, the agency assessment teams need to evaluate whether or not each type of improvement makes business sense. For example, higher levels of data management maturity are typically characterized by formal documented processes and procedures. These can require considerable investments in staff time to create, maintain, and operationalize within the agency. In some cases, these investments are worth it (e.g., where an undocumented, chaotic process creates unacceptable risks of providing inaccurate performance data to the state legislature). In other cases, formalizing processes may not be appropriate (e.g., where experimentation is being encouraged for a new type of data).

In deciding whether to maintain the status quo or take steps to improve, agencies can weigh the risks of doing nothing and the likely returns from moving forward. In developing strategies for improvement, the concept of diminishing returns is useful: agencies can strive to invest in data improvements until the marginal cost of making (and sustaining) the improvement equals the marginal benefit gained.

Overview of Activities

Both the data value and the data management assessments feature two workshops during the Assessment Phase. Assessment Phase activities are summarized in Worksheet 8. The worksheet identifies participants, inputs, and outputs for each activity. Activities 1 through 4 are conducted separately for each assessment. Activity 5 is conducted at the end of Phase 2 to summarize the results from all of the individual assessments.

Worksheet 8. Assessment Phase activity checklist.

1. Conduct Assessment Preparation Meeting (one per assessment team)
   Participants and Inputs: Champion, Facilitator, Staff, Managers of selected business units; Assessment Rosters
   Outputs (Results): Tool Configuration; Workshop Agenda; Workshop Invitations—from executive sponsor, champion, and/or group managers
2. Conduct Assessment Workshop (one per assessment team)
   Participants and Inputs: Facilitator, Champion (optional), Assessment Team—bring laptops with the assessment tool plus a projector for the group assessment exercise
   Outputs (Results): Consensus ratings and participant comments on key gaps
3. Preparation Meeting for Gaps and Candidate Actions Workshop (one per assessment team)
   Participants and Inputs: Champion, Facilitator, Staff, Managers of selected business units
   Outputs (Results): Summary assessment results; Handouts for Gaps Workshop
4. Gaps and Candidate Actions Workshop (one per assessment team)
   Participants and Inputs: Facilitator, Champion (optional), Assessment Team—projector for group gap validation and action identification exercise
   Outputs (Results): Validated gaps with business impacts; List of candidate actions to close the gaps
5. Assessment Results Analysis and Summary (a single meeting for all assessments combined)
   Participants and Inputs: Champion, Facilitator, Staff
   Outputs (Results): Summary workshop results for presentation and synthesis with other assessment areas

Step 1: Conduct Assessment Preparation Meeting

For Data Value Teams

Tool Configuration

The Data Value Assessment Tool is generic because it is intended to be applicable for any DOT business function that depends on data. Therefore, the tool must be configured for each business area in which it is to be applied. Configuration consists of three steps—each of which can be accomplished at the Assessment Workshop Planning Meeting:

1. Specify the business area to be assessed by recording the selected area in the space provided on the tool's Configuration tab
2. Break your selected business area into specific activities
3. Identify types of data needed to perform these activities

Selecting Business Activities. Breaking business areas into activities enables the assessment team to focus on specific ways that data is used (or could be used). A comprehensive breakdown of all activities for the business area is not necessary. Criteria for identifying (and describing) activities are as follows:

• Each activity should be important to the success of the overall business area.
• Each activity should be clearly and consistently understood by different members of the assessment team.

• Activities should not be fundamentally about data collection or processing—they should be activities where data is or could be used to make better decisions or respond to information needs.
• Activities can include areas that could benefit from greater data availability, not just areas where data needs are met.
• Activities can be those that the agency performs or those that the agency would like to perform in the future.

The following generic set of activities that require data can be used as a starting point for any DOT business area—these can be tailored as needed to represent activities specific to the agency:

• Monitor results or performance against established objectives
• Track expenditures, resources used, and accomplishments
• Assess future needs for budgeting or lining up new/different resources
• Diagnose root causes for limited performance or inefficiencies
• Plan, prioritize, or schedule actions to be taken

In the Data Value Assessment Tool, sample activities for different business areas are included on the Example Lists tab. Figure 9 shows examples of sample business areas and associated activities included in the Example Lists tab of the tool.

Figure 9. Data value tool—Example Lists tab sample business areas and associated activities.

Asset Management: Pavement Needs and Risk Assessment; Pavement Resource Allocation and Treatment Selection; Bridge Needs and Risk Assessment; Bridge Resource Allocation and Treatment Selection; Other Asset Needs Assessment and Budgeting; Cross-Asset Tradeoffs
Maintenance Management: Maintenance Budgeting; Maintenance Activity Tracking; Equipment Management; Materials Management; Winter Maintenance—Snow Route Planning, Snow Plow Tracking
Project Scoping: Current Conditions Assessment; Scope Development; Schedule and Budget Development; Project Prioritization
Traffic Operations Management: Incident Management; Traveler Information; Signal Timing and Coordination; Active Traffic Management
Safety Planning: Network Screening; Countermeasure Analysis; Countermeasure Design; Countermeasure Evaluation
Corridor Planning: Current Conditions Assessment; Future Demand Analysis; Alternatives Evaluation; Strategy Prioritization
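For teams that want to work with a configuration programmatically (for example, when later combining results across business areas), it can be captured in a simple data structure. The sketch below is illustrative only: the Data Value Assessment Tool itself stores this information on its Configuration tab, and the field names here are hypothetical. The example values mirror the Maintenance Management configuration shown in Table 3 later in this chapter.

```python
# Hypothetical representation of a Data Value Assessment Tool configuration.
# The structure and field names are illustrative, not part of the actual tool.

config = {
    "business_area": "Maintenance Management",
    "activities": [
        "Track maintenance level of service",
        "Track maintenance expenditures, resources used, and accomplishments",
        "Develop future-year maintenance budget requests",
        "Identify opportunities for improved efficiency",
        "Plan, prioritize, and schedule work",
    ],
    # Guidance below recommends selecting between 5 and 10 data types (max 20).
    "data_types": [
        "Road Inventory",
        "Maintenance Feature Inventory",
        "Maintenance Feature Condition/Performance",
        "Maintenance Work Orders",
        "Budget Allocations",
    ],
}

assert len(config["data_types"]) <= 20, "The tool supports up to 20 data types"
```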

Figure 10 shows the Configuration tab. To use one of the standard business areas and associated activities from the Example Lists tab, copy them from the Example Lists and paste them into the Configuration tab. Otherwise, enter the business area and activities in the spaces provided.

Figure 10. Data value tool—configuring business areas, activities, and data sources.

Identifying Data Types. The third configuration step is to identify the major types of data applicable to the collection of activities that you have defined. The idea here is not to identify all data types used for the activities comprehensively, but rather the ones that either add the most value to the results or that could (with improvement) add value to the results.

A sample list of data types is included on the Example Lists tab. You can copy and paste from this list or enter your own. Criteria for identifying data types to include are as follows:

• Each type of data should be important to at least one of the defined business activities, but not necessarily to all of them.
• Each type of data should be sufficiently well-defined that members of the assessment team have a common understanding of what is included. (For example, using the term "traffic data" could lead to confusion about whether this includes both real-time data and count data.)
• Each type of data should be specific enough that assessment team members can rate its quality and usability. Although it is acceptable to include multiple data sets under a single "data type" (e.g., "environmental data" encompasses data from multiple sources), this may make it more challenging to assign meaningful and consistent ratings if there is wide variation in data availability, quality, or usability across the data sets included in the single data type.
• At least some members of the assessment team should have sufficient familiarity with each type of data to be able to rate its quality and usability.
• Up to 20 data types can be selected, but the amount of time needed to conduct the assessment process will be directly affected by the number of data types, because each data type is assessed individually. In general, selecting between 5 and 10 data types is recommended.

Once the business areas and data types have been configured, click the Apply button at the top of the Configuration tab. This updates the other tabs of the worksheet. See Table 3 for examples of Data Value Assessment Tool configurations.

Table 3. Example Data Value Assessment Tool configurations.

Business Area: Maintenance Management
  Business Activities: Track maintenance level of service; Track maintenance expenditures, resources used, and accomplishments (outputs); Develop future-year maintenance budget requests; Identify opportunities for improved efficiency; Plan, prioritize, and schedule work
  Data Types: Road Inventory; Maintenance Feature Inventory; Maintenance Feature Condition/Performance; Maintenance Work Orders; Budget Allocations

Business Area: Project Scoping and Design
  Business Activities: Project Management/Project Control; Prepare Design Plans; Environmental Review; Review Existing Conditions/Identify Needs; Create Concept Reports
  Data Types: Road Geometry; Traffic Counts, AADT, Classification; Asset Inventory; Environmental Data; Crash and Fatality Data; Construction Project Status Data; Right-of-Way Data

Business Area: Facilities Management
  Business Activities: Track facility inventory and condition (includes both buildings and systems/components); Track facility capital and maintenance expenditures and work accomplishment; Identify candidate projects for rehabilitation, replacement, and expansion/addition; Diagnose causes of high maintenance costs or inefficiencies; Prioritize candidate projects and develop resource-constrained improvement program
  Data Types: Facility Inventory; Facility Inspection/Condition; Maintenance Records; Facility Improvement Program; Budgets and Expenditures

Business Area: Congestion/Mobility Improvement
  Business Activities: Transportation system performance monitoring; Scoping and design of candidate projects; Corridor and long-range planning, multi-modal planning; Real-time traffic and incident management; Improvement program development/prioritization
  Data Types: Road Inventory; Traffic Counts, AADT, Classification; Bike Routes and Paths, Non-Motorized Travel Counts; Real-Time Traffic Volume/Occupancy, Travel Time; Construction Project Scope and Status

Agenda Development

A full day should be allocated for the data value assessment workshop, although it may take less time when the assessment team is smaller and when the tool is configured to include relatively few business activities and data types. A sample agenda is shown in Figure 11.

Figure 11. Data value assessment workshop sample agenda.

9:00 AM   Background: Introductions; Assessment Purpose; Assessment Steps and Schedule
9:45 AM   Assessment Content: Business Activities and Data Types; Assessment Results and Definitions
10:30 AM  Importance Ratings: Instructions; Individual Ratings; Consensus Ratings
11:30 AM  Availability Ratings: Instructions; Individual Ratings; Consensus Ratings
12:30 PM  Lunch
1:30 PM   Quality Ratings: Instructions; Individual Ratings; Consensus Ratings
2:30 PM   Usability Ratings: Instructions; Individual Ratings; Consensus Ratings
3:30 PM   Results Review: Results and their Derivations; Discussion and Adjustment
4:00 PM   Wrap-Up: Feedback; Next Steps

For Data Management Teams

The data management tools allow for configuration of parameters affecting how maturity levels for each element are calculated. Figure 12 shows the Configuration tab for the Agency-wide tool—the Configuration tab for the Program-specific tool is similar but shows a slightly different set of sub-elements.

Figure 12. Data management tool configuration tab.

Agencies can tailor the data management elements to meet their own needs, priorities, or areas of focus. If a particular data management element is not applicable or relevant to the assessment discussion, its weight can be set to 0%. Any elements weighted at 0% will not affect overall assessment results. There are two types of configuration for this tool: (1) weights on elements and sub-elements and (2) selection of a threshold value.

Adjusting weights. Sub-element weights are used in the tool to calculate maturity levels for elements based on sub-element maturity levels. Element weights are used to calculate overall data management maturity levels based on element maturity levels. By default, all sub-elements in an element are weighted equally and each of the five elements has an equal weight in calculating the overall maturity level. Weights can be adjusted as desired, but all of the element weights need to sum to 100%, and the sub-element weights for each element also need to sum to 100%.
One reason to adjust weights is if a particular element or sub-element is not applicable for your agency or the particular data program you are assessing. In that case, set the element or sub-element weight to 0 and adjust the other weights so that they sum to 100. A second reason to adjust weights is to give priority to certain elements. For example, if you think that, under Data Collaboration, internal collaboration is more important than external collaboration, you might assign Internal Agency Collaboration a weight of 80% and External Agency Collaboration a weight of 20%.
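The guide describes this weighting scheme but not the spreadsheet's exact arithmetic. The sketch below illustrates one plausible reading of it: element and overall maturity levels computed as weighted averages of the level numbers, rounded to the nearest whole level. The function name and the rounding rule are assumptions, not the tool's documented behavior.

```python
# Minimal sketch of the weighted maturity roll-up (illustrative only).
# Weights are percentages that must sum to 100 within each grouping.

def weighted_level(levels_and_weights):
    """Roll maturity levels (1-5) up to a single level using percentage weights."""
    total_weight = sum(w for _, w in levels_and_weights)
    assert total_weight == 100, "weights must sum to 100%"
    score = sum(level * w / 100 for level, w in levels_and_weights)
    return round(score)  # assumption: round to the nearest whole maturity level

# Example: Data Collaboration with internal weighted 80%, external 20%.
internal, external = 4, 2  # sub-element maturity levels
print(weighted_level([(internal, 80), (external, 20)]))  # -> 4 (3.6 rounds up)
print(weighted_level([(internal, 50), (external, 50)]))  # -> 3 under default weights
```

The same roll-up would then be applied a second time, using the element weights, to produce the overall data management maturity level.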

Adjusting the threshold value. The threshold value affects how maturity levels are assigned to each sub-element. To assess the maturity level for each sub-element, the data management tools present five descriptions of current practice—corresponding to the five maturity levels. Because there may be situations where it is difficult to place agency practice in a single maturity level, members of the assessment team are asked to indicate the extent to which they think that each description reflects the current state of agency practice. Options are as follows:

• 1-Totally Disagree
• 2-Somewhat Disagree
• 3-Somewhat Agree
• 4-Totally Agree

The tool assigns a maturity level corresponding to the practice description representing the highest maturity level that receives a rating equal to or higher than a configurable threshold value of either 3-Somewhat Agree or 4-Totally Agree. The default threshold value is 3-Somewhat Agree. However, this can be changed to 4-Totally Agree if you only want to assign a maturity level when the assessment team indicates this strongest level of concurrence with the description of a particular level. Figure 13 shows an example of a sub-element rating.

Figure 13. Example of data management tool rating method.

2.1 Data Updating: well-defined and coordinated data update cycles.
  2.1.1 Data updating cycles and business rules for data updates have not been defined. Rating: 1-Totally Disagree
  2.1.2 Updating cycles have been established but have not been documented. Rating: 2-Somewhat Disagree
  2.1.3 Updating cycles have been documented; business rules have been defined for how key data entities are added, updated, and deleted. Rating: 4-Totally Agree
  2.1.4 Business rules for data updating are embedded in and enforced by applications. Rating: 3-Somewhat Agree
  2.1.5 Data updating methods are periodically reviewed to identify opportunities for improved efficiencies. Rating: 2-Somewhat Disagree

In this example, by default, a maturity level of 4-Functioning would be assigned for the Data Updating sub-element because this is the highest maturity level description that received a rating of at least 3-Somewhat Agree. Responses in the example indicate that the agency has defined updating cycles and business rules for data updates and has documented these updating cycles. They have to some extent embedded business rules for data updating within applications. If the tool were configured to change the threshold from 3-Somewhat Agree to 4-Totally Agree, a maturity level of 3-Defined would be assigned—corresponding to the practice description in 2.1.3.
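The assignment rule just illustrated can be stated compactly in code. The following is a minimal sketch that reproduces the Figure 13 example; it is not the tool's actual implementation, and the names are hypothetical. Agreement ratings are encoded as the integers 1 (Totally Disagree) through 4 (Totally Agree).

```python
# Minimal sketch of the maturity-level assignment rule (illustrative only).
# ratings[i] is the team's agreement (1=Totally Disagree ... 4=Totally Agree)
# with the practice description for maturity level i+1.

def assign_maturity(ratings, threshold=3):
    """Return the highest maturity level whose description was rated at or
    above the threshold (default 3-Somewhat Agree); 0 if none qualifies."""
    level = 0
    for i, rating in enumerate(ratings, start=1):
        if rating >= threshold:
            level = i
    return level

# Figure 13 example: agreement ratings for descriptions 2.1.1 through 2.1.5.
figure_13 = [1, 2, 4, 3, 2]
print(assign_maturity(figure_13))               # -> 4 (4-Functioning)
print(assign_maturity(figure_13, threshold=4))  # -> 3 (3-Defined)
```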

Assessment Workshop Planning Meeting: Agenda Development

A full day should be allocated for the data management assessment workshop, although it may take less time when the assessment team is smaller and when the tool is configured to exclude multiple sub-elements. A sample agenda is shown in Figure 14.

Figure 14. Data management assessment workshop sample agenda.

9:00 AM   Background: Introductions; Assessment Purpose; Assessment Steps and Schedule
9:45 AM   Assessment Content: Element Overview; Maturity Level Descriptions; Rating Procedure
10:30 AM  Strategy & Governance Ratings: Sub-Element Descriptions; Individual Ratings; Consensus Ratings
11:15 AM  Life Cycle Management Ratings: Instructions; Individual Ratings; Consensus Ratings
12:00 PM  Lunch
1:00 PM   Architecture and Integration Ratings: Instructions; Individual Ratings; Consensus Ratings
1:30 PM   Collaboration Ratings: Instructions; Individual Ratings; Consensus Ratings
2:00 PM   Data Quality Ratings: Instructions; Individual Ratings; Consensus Ratings
2:30 PM   Results Review: Results and their Derivations; Discussion and Adjustment
3:15 PM   Wrap-Up: Feedback; Next Steps

Step 2: Assessment Workshop

Separate guidance is provided for data value and data management assessment teams. However, the facilitator should be prepared for both types of teams to address the following sets of assessment challenges:

• Understanding the context for the assessment. Ensure that team participants understand that the results will not be used to judge individuals or business units, but to identify improvements to benefit the agency.
• Understanding terminology. Before asking participants to select ratings for any element, be sure that they are clear on both the assessment elements and the definitions of the ratings.
• Varying ratings within an assessment element. Variation within a given element can make it difficult to assign a single rating. For example, for a data value assessment that identifies "travel data" as a source, participants may state that data for vehicular travel has excellent availability but data for pedestrian travel has poor availability. In this situation, an overall rating should be assigned that reflects the relative importance of these two varieties of travel data for the business activities included. However, the comments area should be used to identify areas of specific weakness so that data gaps can be identified and addressed. For the data management assessment, there will likely be situations—particularly with the agency-wide assessment—where practices for some types of data are more mature than for others. Again, ratings should be assigned reflecting the predominant situation, but particular areas of weakness should be noted as gaps.

For Data Value Teams

At this workshop, the facilitator will lead members of the assessment team in completing the Data Value Assessment Tool. The recommended approach at this workshop follows.

Background: Why Are We Here?

• Describe why the agency is conducting the data self-assessment and how it plans to use the results. Identify the executive sponsor.
• Describe why this business area was selected.
• Summarize the schedule of meetings—for both the Assessment and (if available) Implementation and Monitoring phases.
• Provide an opportunity for questions.


Activities and Data Types

• Describe how the activities for the business area were selected.
• Ask assessment team members to describe each activity; provide clarification if there are differences in how the different activities are understood.
• Describe how the data types were selected and clarify what they include.
• Ask for a show of hands from the assessment team members about who has used each type of data and feels qualified to rate its quality and usability.

Assessment Results and Definitions

• Talk about the result of the data value assessment—show the sample in Figure 7. Explain that this process will allow the agency to take a broad look at data needs across different business areas—integrating the perspectives of people who work with data on a daily basis as well as people making decisions based on data.
• Present the definitions of data availability, quality, and usability.
• Ask members of the assessment team to provide their own examples for each—to make sure that the group understands these concepts.

Importance Tab

• Describe the first activity: to identify the importance of each data type for each of the specific activities.
• Explain that this information will be used in calculating final ratings. Provide an example: let's say that both traffic data and pedestrian data are used for project scoping, and the quality of traffic data is very good but the quality of pedestrian data is low. If traffic data is rated as having "High Importance" and pedestrian data is rated as having "Low Importance," the overall data quality score for project scoping will be higher than if both of these data types were rated as having "High Importance."
• Discuss the different importance ratings:
  – High Importance—Essential; can't perform this activity without it
  – Medium Importance—Valuable; could do without it, but it would affect the value or credibility of results
  – Low Importance—Helpful, but could do without it
  – NA—Not helpful or relevant for this activity
• Discuss how there may be some data types that aren't being used for a given activity but still may be important for that activity. For example, it may be that information on maintenance activity costs is not available or not reliable, and therefore it is not used for budgeting. However, if high-quality cost information were available, it would be important for the budgeting activity. Therefore, this type of data should be rated as being of Medium Importance—because budgeting is happening without it, but the credibility of budgeting results is suffering from the lack of good cost data.
• Ask each team member to individually complete ratings for the first activity. When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Ask each team member to complete the remaining ratings on the Importance Tab. Ask them to provide a brief 1- to 2-sentence comment on why they selected the ratings they did.
• When everyone has completed the exercise, open a copy of the Data Value Assessment Tool and project it on a screen.
• Select a member of the assessment team and ask them to state how they rated each of the data types for the first activity. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to

achieve consensus in a reasonable amount of time, use either a "majority rules" approach or appoint one member of the team to have the final say.
• Continue through the other activities using the same process.
• Make sure to stay within the allotted time. The objective is not perfection, but to obtain the general sense of the group on the relative importance of each data source for the different activities.

Availability Tab

• Describe the second activity: to identify the availability of data for each of the activities. The purpose of this activity is to provide a general assessment of whether or not the agency has sufficient data to support the business area (i.e., answer the question: "Do we have the right data?").
• Explain that this activity is being done only for the activities (not for the data types) to highlight cases where there is an unmet need for a type of data that the agency doesn't collect—and that therefore might not be included on the selected list of data types.
• Present the different availability ratings:
  – Poor—Little or no data to support this activity
  – Fair—Limited data and large gaps remain
  – Good—Basic data available, but some gaps remain
  – Excellent—Sufficient data is available to meet needs and there are no gaps
• Provide examples for each rating (tailored to the agency if possible):
  – Poor Availability: Developing a bike/pedestrian plan—with little or no information on pedestrian/bike facilities or current travel patterns.
  – Fair Availability: Conducting network screening to identify candidate locations for countermeasures—using crash data but very limited road inventory data.
  – Good Availability: Projecting future pavement condition—based on 5 years of trend information on pavement deterioration—but with some gaps in understanding of how deterioration rates vary by pavement type and traffic level.
  – Excellent Availability: Developing a maintenance budget—based on regularly updated unit costs for labor, equipment, and materials.
• Ask each team member to individually complete the Availability Ratings. When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Select a member of the assessment team and ask them to state how they rated the first activity. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a "majority rules" approach or appoint one member of the team to have the final say.
• Continue through the other activities using the same process. For ratings other than "Excellent Availability," record a comment on the master assessment tool that documents the gaps in data availability.

Quality Tab

• Describe the third activity: to rate the quality of each data type. The purpose of this activity is to provide a general assessment of whether or not the agency's data is of sufficient quality to support the business area (i.e., answer the question: "Is our data good enough?").
• Define the three dimensions of data quality:
  – Currency—the extent to which the data represents current conditions.
  – Accuracy—the degree to which the data represents actual conditions as they existed at the time of measurement.

  – Completeness—the degree to which the data provides sufficient coverage and includes values for all required data elements. For example, a data set may be considered incomplete because it is missing coverage of some portion of the road network, some time periods, or some classes of travelers.
• Stress that data quality should be rated relative to what the needs are. For example, data for planning purposes can be less accurate than data for design purposes. Data for traveler information needs to be more current (i.e., real time) than data for monthly or annual performance reporting.
• Explain that quality ratings are to be assigned for each data type—considering the most demanding needs across the identified activities.
• Present the different quality ratings:
  – Poor: Data is not current, accurate, or complete enough to be useful
  – Fair: Data is useful, but lack of currency, accuracy, or completeness limits value
  – Good: Data quality is acceptable, but should be improved
  – Excellent: Data quality is sufficient for this activity—no improvements are needed
  – NA: Don't know—not enough information
• Ask each team member to individually complete the three sets of Quality Ratings for each data type and provide a comment about why they assigned the ratings they did (i.e., what quality issues exist with each data source). When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Select a member of the assessment team and ask them to state how they rated the first data type. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a "majority rules" approach or appoint one member of the team to have the final say.
• Continue through the other data types using the same process. For ratings other than "Excellent," record a comment on the master assessment tool that documents the gaps in data quality.

Usability Tab

• Describe the final activity: to rate the usability of each data type. The purpose of this activity is to provide a general assessment of whether or not the agency's data is provided in a convenient form to support the business area.
• Provide examples of what makes data more (and less) usable:
  – Effort required to find the appropriate data (e.g., a web page or catalog)
  – Effort required to access the data (e.g., self-service versus special request)
  – Available tools and effort required to report and visualize data
  – Effort required to combine different data tables or data sets as needed (e.g., integrating separate data tables for rigid and flexible pavements or integrating data tables that use different spatial referencing methods)
  – Effort required to understand what the data means and how it was collected (e.g., availability of a data dictionary and metadata about the data set itself)
• Stress that data usability should be rated in the context of the identified business activities that involve use of data.
• Explain that ratings are to be assigned for each data type—considering the most demanding needs across the identified activities.
• Present the different usability ratings:
  – Poor: Data is available, but requires substantial effort to translate into usable form
  – Fair: Data is available, but requires moderate effort to translate into usable form
  – Good: Data is available in a usable form, but improvements to reporting capabilities would be helpful

  – Excellent: Data is available in a usable form—no improvements are needed
  – NA: Don't know—not enough information
• Ask each team member to individually complete the usability ratings and provide a comment about why they assigned the ratings they did (i.e., what usability issues exist with each data source). When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Select a member of the assessment team and ask them to state how they rated the first data type. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a "majority rules" approach or appoint one member of the team to have the final say.
• Continue through the other data types using the same process. For ratings other than "Excellent," record a comment on the master assessment tool that documents the gaps in data usability.

Results Tab

• Discuss the results and describe how they are derived from the assessment exercise (the sketch following this list illustrates the weighting calculation):
  – Results by Data Source—Importance: This is based on the highest importance rating assigned for the data source across all of the activities on the Importance tab.
  – Results by Data Source—Quality and Usability: These are taken directly from the responses on the Quality and Usability tabs.
  – Results by Business Activity—Availability: These are taken directly from the responses on the Availability tab.
  – Results by Business Activity—Quality: First, the overall quality rating is considered to be the limiting (lowest) quality rating among currency, accuracy, and completeness. Then, the overall quality ratings of each data source are weighted by activity-level importance ratings to determine activity-level quality ratings. This weighting is accomplished by first converting the quality and importance ratings from qualitative to quantitative values. Importance rating values are 0-NA, 1-Low, 2-Medium, 3-High. Quality rating values are 0-NA, 1-Poor, 2-Fair, 3-Good, 4-Excellent. Then, the quantitative quality ratings are multiplied by a weighting factor (the ratio of the data source importance score to the sum of the importance scores for all data sources). The resulting weighted numeric score is then converted back to a quality rating.
  – Results by Business Activity—Usability: The usability ratings from the Usability tab are weighted by importance using a method analogous to the one described for Quality.
  – Overall Business Area Results: The business activity results are aggregated to the overall business area level using two alternative methods. The "conservative" (or limiting activity) method takes the lowest rating across the different business activities. The "optimistic" method takes the average rating across the different business activities. Agencies can select one of these for inclusion in their summary presentation about the assessment results.
• Ask team members if the results appear to be an accurate reflection of data availability, quality, and usability. Provide an opportunity to go back and adjust the consensus ratings, making sure that any changes are adequately supported with reasons and documented in the comments blocks.
• Save the assessment results.
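To make the quality weighting concrete, the sketch below reproduces the derivation described above for a hypothetical project scoping activity. The rating-to-number mappings come from the text; the function name, data layout, and the "round half up" conversion back to a qualitative rating are assumptions that may differ in detail from the spreadsheet's actual behavior.

```python
# Minimal sketch of the activity-level quality roll-up (illustrative only;
# names, structure, and rounding rule are assumptions, not the tool's code).
IMPORTANCE = {"NA": 0, "Low": 1, "Medium": 2, "High": 3}
QUALITY = {"NA": 0, "Poor": 1, "Fair": 2, "Good": 3, "Excellent": 4}
QUALITY_NAMES = {v: k for k, v in QUALITY.items()}

def activity_quality(sources):
    """sources: list of (importance, currency, accuracy, completeness) ratings
    for each data source used by one business activity."""
    weights = [IMPORTANCE[imp] for imp, _, _, _ in sources]
    total = sum(weights)
    # Overall quality per source is the limiting (lowest) of its three
    # dimensions; the activity score is the importance-weighted average.
    score = sum(
        w * min(QUALITY[c], QUALITY[a], QUALITY[m])
        for w, (_, c, a, m) in zip(weights, sources)
    ) / total
    return QUALITY_NAMES[int(score + 0.5)]  # assumed: round half up

# Traffic data (High importance, limited by Good) plus pedestrian data rated
# Low importance (limited by Poor) still yields a Good activity rating ...
print(activity_quality([("High", "Good", "Good", "Excellent"),
                        ("Low", "Poor", "Fair", "Fair")]))   # -> Good
# ... but rating pedestrian data High importance pulls the activity down.
print(activity_quality([("High", "Good", "Good", "Excellent"),
                        ("High", "Poor", "Fair", "Fair")]))  # -> Fair
```

This mirrors the example given earlier for the Importance tab: raising the importance of a low-quality data source lowers the activity-level quality rating.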
Wrap-up

• Thank the participants for their time and effort.
• Ask for feedback to be used for improving future assessment workshops.
• Remind participants of the timing and scope for the second workshop.

For Data Management Teams

The recommended approach for the data management assessment workshop is shown below.

Background: Why Are We Here?

• Describe why the agency is conducting the data self-assessment and how it plans to use the results. Identify the executive sponsor.
• Describe how this team was selected to participate.
• Provide an overview of the schedule of meetings—for both the Assessment and (if available) Implementation and Monitoring phases.
• Provide an opportunity for questions.

Assessment Content

• Describe each of the assessment elements—using the definitions in Table 2.
• Describe the maturity levels—using the definitions in Table 2.
• Talk about the result of the data management assessment—show the sample results in Figure 8. Indicate that this process will allow the agency to look at how different types of data are managed, and that this will lead to identifying what should be done to get more value from data investments.
• Describe the process that will be followed to complete the rating tool.

Data Strategy and Governance Tab

• Present an example of how to make a selection for each practice description for the first sub-element:
  – If the description does not reflect current practice in the agency, select 1-Totally Disagree
  – If the description partially reflects current practice in the agency, but is not the predominant way things are done, select 2-Somewhat Disagree
  – If the description is the predominant way things are done, but elements of it are not fully in place, select 3-Somewhat Agree
  – If the description accurately describes current agency practice, select 4-Totally Agree
  – In a case where there are substantially different practices in place within the agency or data program, the group can choose to qualify their rating in the comments section. For example: "We have a rigorous data quality assurance program for data set X, but not for data sets Y and Z—our answers represent practice for data sets Y and Z only."
• Ask each member of the assessment team to complete the entries for each of the elements on the tab and provide a comment about why they assigned the ratings they did. When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Select a member of the assessment team and ask them to state how they rated the first sub-element. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a "majority rules" approach or appoint one member of the team to have the final say.
• Continue through the other sub-elements using the same process. For maturity levels lower than "Functioning," record a comment on the master assessment tool that documents comments from the group about why the level was selected.

Data Life Cycle Management Tab

• Ask each member of the assessment team to complete the entries for each of the elements on the tab and provide a comment about why they assigned the ratings they did. When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.

• Select a member of the assessment team and ask them to state how they rated the first sub-element. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a “majority rules” approach or appoint one member of the team to have the final say.
• Continue through the other elements using the same process. For maturity levels lower than “Functioning,” record a comment on the master assessment tool that documents comments from the group about why the level was selected.

Data Architecture and Integration Tab
• Ask each member of the assessment team to complete the entries for each of the elements on the tab, and provide a comment about why they assigned the ratings they did. When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Select a member of the assessment team and ask them to state how they rated the first sub-element. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a “majority rules” approach or appoint one member of the team to have the final say.
• Continue through the other elements using the same process. For maturity levels lower than “Functioning,” record a comment on the master assessment tool that documents comments from the group about why the level was selected.

Data Collaboration Tab
• Ask each member of the assessment team to complete the entries for each of the elements on the tab, and provide a comment about why they assigned the ratings they did. When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Select a member of the assessment team and ask them to state how they rated the first sub-element. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a “majority rules” approach or appoint one member of the team to have the final say.
• Continue through the other elements using the same process. For maturity levels lower than “Functioning,” record a comment on the master assessment tool that documents comments from the group about why the level was selected.

Data Quality Tab
• Ask each member of the assessment team to complete the entries for each of the elements on the tab, and provide a comment about why they assigned the ratings they did. When they are finished, ask if they had difficulty assigning ratings. Provide clarification as needed to address their difficulties.
• Select a member of the assessment team and ask them to state how they rated the first sub-element. Ask the group if anyone selected anything different. Discuss reasons for variation in each rating and enter the consensus rating for the group. If it is difficult to achieve consensus in a reasonable amount of time, use either a “majority rules” approach or appoint one member of the team to have the final say.
• Continue through the other elements using the same process. For maturity levels lower than “Functioning,” record a comment on the master assessment tool that documents comments from the group about why the level was selected.

Results Tab
• Select the Results Tab and review the maturity levels assigned for each element and sub-element.
• Ask team members if the results appear to be an accurate reflection of agency practice. Provide an opportunity to go back and adjust the consensus ratings, making sure that any changes are adequately supported with reasons and documented in the comments blocks.
• Save the assessment results.

Wrap-up
• Thank the participants for their time and effort.
• Ask for feedback to be used for improving future assessment workshops.
• Remind participants of the timing and scope for the second workshop.

Step 3: Gaps and Candidate Actions Workshop Preparation Meeting

For Data Value Teams
In the Data Value Gaps and Candidate Actions Workshop Preparation Meeting, participants review the results of the assessment and prepare a list of gaps to be used as a starting point for the workshop. This ensures that there is continuity between the two workshops and enables the second workshop to proceed efficiently.
Worksheet 9 provides a format for listing the gaps. At the Preparation Meeting, use the comments from the Data Value Assessment Tool to partially complete this worksheet (leave the Business Impacts column blank). A sample is included to provide examples of the types of gaps that should be recorded.

(Worksheet 9 columns: ID | Type of Data | Gap Description | Business Impacts)

Sample:
• Maintenance Costs — Availability: We need historical cost information for maintenance budgeting, but we only have aggregate expenditures, not costs by activity. Business impacts: Better data would improve our ability to link budget estimates with expected outputs. This would reduce the need for budget adjustments.
• Freight — Availability: We would like to prioritize projects based on economic benefits, but lack good data on freight flows. Business impacts: We are unable to understand and communicate the economic impacts of potential investments.
• Right-of-Way — Availability: We need data on right-of-way limits, but this isn’t systematically maintained. Business impacts: Better data would reduce the time needed to research right-of-way limits where questions or problems arise.
Worksheet 9. Data value gaps.

Worksheet 9. (Continued).
• Road Inventory — Availability: We need inventory data for local, non-state-maintained roads. Business impacts: Better data is needed to meet federal reporting requirements and perform consistent safety analysis for both on- and off-system roads.
• Underground Infrastructure — Availability: We only have underground infrastructure information where recent projects have been done. Business impacts: Better data would reduce the need for special discovery efforts for project scoping.
• Pavement Condition — Availability: We need trend data to estimate a pavement deterioration model, but only have 1 year of data. Business impacts: Better data would allow us to produce more accurate and credible estimates of future pavement needs and set realistic performance targets.
• Traffic — Availability: Traffic data for weekends and special events is very sparse. Business impacts: There is a risk of over- or under-designing facilities based on faulty traffic assumptions.
• Sign Inventory — Quality: Sign inventory is 3 years old and doesn’t reflect recent work. Business impacts: Districts won’t use the inventory because they don’t trust it is correct—they instead spend time re-collecting information in the field.
• Roadside Assets — Quality: Roadside asset data has horizontal accuracy to the nearest 3 meters, but sub-meter accuracy is needed. Business impacts: Data can be used for planning, but additional field collection will be required for project scoping and design.
• Crashes — Quality: Reported crash locations don’t match where crashes actually occurred. Business impacts: We lack critical information needed to identify and correct safety issues. Considerable staff time is required to review each crash record and assign proper locations manually.
• Traffic — Quality: We don’t trust summarized traffic data because it doesn’t adequately account for detector failures. Business impacts: Our mobility performance measures lack credibility.
• Pavement Condition — Quality: We don’t trust that the pavement roughness data were measured accurately because equipment wasn’t calibrated properly. Business impacts: We are not able to reliably track trends in pavement condition and understand how investments in pavement are affecting condition and performance.
• Bridge Inspections — Quality: We think that bridge condition data is biased because there wasn’t sufficient independent verification of inspection results. Business impacts: Priorities and needs for certain bridge projects may be overstated, resulting in suboptimal investment decisions.
• As-Built Plans — Quality: We can’t be sure that as-built plans are complete because there wasn’t sufficient quality assurance. Business impacts: We can’t rely on as-builts as a data source for updating asset inventories and providing information on underground assets for project scoping. This means that we must pay to gather this information.
• Incidents — Quality: Incident location information was entered as free-form text, so we can’t use it for mapping. Business impacts: Manual coding of locations is needed, which takes valuable staff time away from more productive activities.
• Project Delivery — Quality: Project completion date wasn’t defined in a consistent way—sometimes the physical completion date was used; other times the financial close-out date was used. Business impacts: The data can’t be used to compute statistics on project delivery performance and can’t be used to help interpret historical crash, incident, and traffic data.
• Traffic — Usability: We must submit a request to IT to get the traffic data we need. Business impacts: Valuable IT resources are strained and the business value of the data is diminished.
• Traffic Signal Inventory — Usability: Our central traffic engineering unit maintains a signal inventory, but districts weren’t aware of that—so some started collecting their own data. Business impacts: Effort and resources were wasted that could have been better used elsewhere.
• Facility Inventory — Usability: We have a facility inspection database but no standard reports to summarize overall results. Business impacts: Special staff effort is required to prepare one-off summaries of the data. The data are not used as much as they could be given the effort required for summarization and analysis.
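
Teams that keep these worksheets electronically may find it convenient to record the entries in a machine-readable form so they can be merged during Step 5. Below is a minimal sketch of that idea; the CSV layout mirrors the worksheet columns, and the file name and sample row are illustrative assumptions, not a prescribed format.

```python
# Minimal sketch: recording Worksheet 9 entries as a CSV file whose columns
# mirror the worksheet, so team lists can be merged later in Step 5.
# The file name and the sample row are illustrative, not prescribed.
import csv

FIELDS = ["ID", "Type of Data", "Gap Description", "Business Impacts"]

rows = [
    {
        "ID": "DV-01",
        "Type of Data": "Sign Inventory",
        "Gap Description": "Quality: Sign inventory is 3 years old and "
                           "doesn't reflect recent work.",
        "Business Impacts": "Districts don't trust the inventory and "
                            "re-collect information in the field.",
    },
]

with open("data_value_gaps.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```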

For Data Management Teams
In the Data Management Gaps and Candidate Actions Workshop Preparation Meeting, participants review the results of the assessment and prepare a list of gaps to be used as a starting point for the workshop. This ensures that there is continuity between the two workshops and enables the second workshop to proceed efficiently.
Worksheet 10 provides a format for listing the gaps. At the Preparation Meeting, use the comments from the Data Management Assessment Tool to partially fill out this worksheet (leave the Business Impacts column blank). A sample is included to provide examples of the types of gaps that should be recorded.

(Worksheet 10 columns: ID | Assessment Element | Gap Description | Business Impacts)

Sample:
• Data Strategy and Governance — Gap: Lack of a formal (role-based) definition of accountability and responsibility for data quality for each type of data. Business impacts: When existing key staff leave the agency or are re-assigned, there are risks that data quality will decline due to lack of formal accountability for specific job positions.
• Data Life Cycle Management — Gap: No formal change notification process when data coding changes occur in one system that may affect another. Business impacts: The problem won’t be noticed until managers request reports that rely on integration across systems. May result in delays in providing information required for management and create the need for emergency repairs to fix coding inconsistencies.
• Data Architecture and Integration — Gap: Lack of standardization in data definitions—different districts maintain spreadsheets of information in various formats. Business impacts: Duplication of effort and lack of ability to aggregate data to produce information needed for management decisions.
• Data Collaboration — Gap: Data is gathered from local jurisdictions ad hoc; formal data-sharing agreements do not exist. Business impacts: Current methods strain agency staff resources and do not reliably result in complete or current data.
• Data Quality — Gap: No formal quality assurance process is in place. Business impacts: Lack of trust in the data; lack of ability for data managers to provide information on the current level of quality—results in underused data and loss of potential value.
Worksheet 10. Data management gaps.

Step 4: Gaps and Candidate Actions Workshop
For both the data management and the data value assessments, a Gaps and Candidate Actions Workshop is conducted—the second of the two workshops held for each assessment team in Phase 2. Instructions for this workshop are the same for both assessment types. In this workshop, the assessment team is brought back together and asked to validate the draft list of data gaps and produce a list of candidate actions to address these gaps. The workshop consists of two exercises. A sample agenda for this workshop is provided in Figure 15.

Exercise 1: Gap Validation
The goal of the first exercise is to validate and complete the draft gaps produced at the Gaps and Candidate Actions Workshop Preparation Meeting—using Worksheet 9 or Worksheet 10, as applicable. The facilitator can use the following steps to review each gap:
• Read the draft gap aloud.
• Ask participants to confirm that this issue merits consideration in the data improvement action plan. This should screen out any gaps that the group thinks are not that important, as well as gaps for which realistic solutions aren’t likely to be identified.
• If participants don’t agree that the gap is significant, delete it from the list and move to the next gap.
• Ask participants if the gap can be described more precisely; re-word as needed.
• Ask participants to describe the business impacts of the gap: How is the gap creating risks, causing inefficiency, limiting value derived from available data, or affecting decision making? Why should agency management care about this gap?
After all draft gaps have been reviewed, the facilitator should ask the group if any gaps are missing and provide an opportunity to add entries to the list. At the end of this first exercise, the group should have produced a complete set of gaps that the assessment team thinks are worth addressing in the data improvement action plan.

Exercise 2: Candidate Improvements
The second half of the workshop should be spent identifying candidate improvements that will be fed into the coordinated data improvement action planning process in Phase 3: Implement and Monitor. Worksheet 11 provides a format for recording the results of this exercise. Worksheet 12 provides a format for recording more detailed information about each candidate improvement that can be used to facilitate prioritization in Phase 3.

Gaps and Candidate Actions Workshop Agenda
9:00 AM Background
• Review of Workshop Purpose and Agenda
9:15 AM Exercise 1: Validation of Data Gaps
• Review Process Used to Develop Draft Gaps
• Gap Screening and Validation
• Business Impacts
10:15 AM Exercise 2: Candidate Actions
• Identify Current Initiatives That Will Address Gaps
• Recommend New Candidate Actions
• Complete Action Evaluation Forms
12:00 PM Wrap-Up
• Feedback
• Next Steps
Figure 15. Gaps and Candidate Actions Workshop sample agenda.

(Worksheet 11 columns: ID | Action Description | Lead Responsibility | Current or New?)
Worksheet 11. Candidate actions to address gaps.

Data Improvement Evaluation Form
Candidate Improvement: _______________  Recommended By: _______________  Date: ___________

Implementation
Proposed Lead for Development/Implementation: _____________________________________________
Estimated Time for Development/Implementation:
[ ] < 3 months   [ ] 3-6 months   [ ] 6-12 months   [ ] > 12 months
Resource Requirements:
[ ] Can be covered under existing project or initiative
[ ] Requires new effort by in-house staff only
[ ] Requires new effort involving external contractor support

Ongoing Maintenance & Support
Proposed Business Owner: _____________________________________________
Other Staff Support Needs: ___________________________________________
Resource Requirements:
[ ] Can be maintained with existing staff
[ ] Requires additional staff <= 1 FTE
[ ] Requires additional staff > 1 FTE

Business Case (please describe for each applicable item)
External or Internal Agency Mandates: _________________________________________________
Staff Time Savings: __________________________________________________________________
Other Agency Cost Reduction: _________________________________________________________
Risk Mitigation: ____________________________________________________________________
Improved Business Decisions: _________________________________________________________
Comments:

Worksheet 12. Data improvement evaluation form.

A recommended process for completing these worksheets is as follows:
• Distribute copies of the data improvement idea checklist included in Appendix D. This is a master list of the types of data improvements that team members can consider to address the identified gaps.
• Identify Current Initiatives. Ask the assessment team members to list current agency initiatives—planned, funded, or in progress—that are expected to address one or more of the identified gaps. Record these in Worksheet 11, identifying the lead business unit and manager of the effort and noting it as a “current” action.
• Ask each assessment team member to take 5 to 10 minutes and identify the two most important actions for closing what they consider to be the remaining highest-priority gaps. Participants should select actions that they think will provide the greatest business value, regardless of cost or level of effort. Ask each participant to describe their actions and identify the most likely lead business unit (and manager, if possible). If a participant proposes an action similar to one already on the list, provide an opportunity to enhance the wording of the original action.
• When all participant actions have been recorded, ask the group if they think anything is missing from the list and add new entries as appropriate.
• Complete Improvement Evaluation Forms (see Worksheet 12) for each entry on Worksheet 11. Depending on the number of candidate improvements that the group has come up with, this can be done either as a group exercise or through a “divide and conquer” strategy—with each team member completing forms for a portion of the candidate improvements.

Wrap-up
• Thank the participants for their time and effort.
• Ask for feedback to improve future workshops.
• Describe plans for the Implementation and Monitoring phase.

Step 5: Assessment Results Analysis and Summary
Following completion of all of the assessment workshops, the facilitators and staff supporting the assessment effort should get together and review Worksheets 9, 11, and 12 for data value teams, and Worksheets 10, 11, and 12 for data management teams. Staff should
• Edit the worksheets as needed to ensure that they are complete and consistent. This may require consultation with assessment team members to provide clarification or to fill in missing information. (A small sketch illustrating how this consolidation can be partially automated follows this list.)
• Prepare a presentation or briefing for the planning team to launch Phase 3 that includes
– A description of each assessment team—including their topic and a list of their members
– A data value assessment summary table (modeled after Figure 7) that summarizes ratings for data availability, quality, and usability for each of the business areas performing the data value assessment
– A data management assessment summary table (modeled after Figure 8) that summarizes maturity levels for each of the groups performing the data management assessment
– The consolidated list of gaps
– Comments on themes common to the different groups
– The consolidated list of current initiatives and candidate actions
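
Where teams have captured their gap worksheets in the CSV form sketched earlier, consolidation across teams can be partially automated. The sketch below merges gap files from multiple teams and tallies recurring gap categories (the “Availability:”, “Quality:”, and “Usability:” prefixes used in the Worksheet 9 samples). The file names are illustrative assumptions, and the output is only a starting point for the staff review described above.

```python
# Illustrative sketch of Step 5 consolidation: merge the gap worksheets
# from all assessment teams and tally recurring gap categories to help
# identify common themes. File names are assumptions; the category-prefix
# convention follows the Worksheet 9 samples shown earlier.
import csv
from collections import Counter
from pathlib import Path

def load_gaps(paths):
    """Read each team's gap CSV and tag every row with its team name."""
    gaps = []
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                row["Team"] = Path(path).stem
                gaps.append(row)
    return gaps

def common_themes(gaps):
    """Count gap categories (the text before the first ':'), most common first."""
    return Counter(
        row["Gap Description"].split(":", 1)[0].strip() for row in gaps
    ).most_common()

gaps = load_gaps(["planning_gaps.csv", "maintenance_gaps.csv"])
for category, count in common_themes(gaps):
    print(f"{category}: {count} gap(s) across teams")
```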
