CHAPTER 2 RESEARCH APPROACH
This chapter of the conduct of research report presents a task-by-task breakdown of the approach and results of the work performed throughout NCHRP Project 10-113. Inclusive of reporting and administrative tasks, a total of fourteen (14) tasks were completed across four (4) distinct phases of work (Figure 1). Each task section opens with a brief description of the task goals as documented in the work plan, followed by the work performed and the results obtained by the research team.
Phase I. Planning
Phase I includes six (6) tasks as outlined in Figure 2.
Task 1. Literature review
Task Overview
This review focuses on previously published literature and ongoing research on the state of practice of quality management processes and quality checks of 3D models. However, little work has been conducted in this domain, particularly with regard to highway construction. Thus, the review begins with an overview of the use of 3D models in highway construction projects, the benefits and drivers of their use, the challenges and barriers to their implementation, and a review of current quality management practices nationally and internationally.
Task Outcomes
3D Models Overview
According to the Federal Highway Administration (FHWA), three-dimensional (3D) modeling in the transportation construction sector is a mature technology that is the cornerstone of the modern-day digital jobsite. As the benefits of 3D modeling are becoming more widely recognized, the US highway industry is transitioning from the traditional 2D design process to 3D modeling (FHWA, 2017). This transition is evident in the work of Bradley et al. (2016), who noted an increase in the adoption of 3D modeling and Building Information Modeling (BIM). The transition from 2D plans to 3D modeling is primarily driven by roadway contractors using Automated Machine Guidance (AMG), as the process of reengineering 2D plans is burdensome and time consuming (Catchings et al., 2020; Dadi et al., 2021).
In highway construction, 3D modeling has several versatile applications, including risk management; safety control; and integration with other technologies such as cloud computing and mobile services, sensing technologies and sensors, laser scanning and photogrammetry, augmented and virtual reality, unmanned aerial systems and robotics, virtual design and construction, Global Positioning Systems (GPS) and Geographic Information Systems (GIS), and life cycle analysis from the planning and design stages to the maintenance, structural monitoring, and renovation stages (Costin et al., 2018). Additionally, 3D models
can be used as contract documents during the bidding process, while as-built 3D models can support maintenance and operations in the post-construction phase, inform the development of future projects, and minimize exposure to tort litigation (Jung et al., 2014).
In NCHRP Synthesis 593: 3D Digital Models as Highway Construction Contract Documents, a survey of 41 state DOTs found that 15 of the respondent DOTs (37%) do not currently use 3D digital models in construction; those states are Arizona, Texas, New Mexico, Montana, Indiana, Kentucky, Pennsylvania, Virginia, Tennessee, Georgia, North Carolina, New Jersey, Rhode Island, Delaware, and Hawaii. However, 21 of the respondent DOTs (51%) use 3D models for information purposes only: California, Oregon, Washington, Idaho, Wyoming, Colorado, North Dakota, South Dakota, Nebraska, Kansas, Wisconsin, Missouri, Arkansas, Michigan, Ohio, West Virginia, South Carolina, Vermont, New Hampshire, Massachusetts, and Connecticut. Additionally, 5 of the respondent DOTs (12%) use 3D models both for information and as contract documents: Florida, Alabama, Oklahoma, Utah, and New York (Nassereddine et al., 2022).
3D Models Drivers and Benefits
State DOTs are increasingly recognizing the importance of 3D models in all phases of a highway project, including planning, design, construction, maintenance, and operations (FHWA, 2018). According to FHWA, 3D modeling allows for faster, more efficient, and more accurate planning of construction. 3D modeling software facilitates the integration of the design and construction teams under alternative project delivery methods, including design-build, construction manager/general contractor, and integrated project delivery. It allows the teams to connect virtually, enabling them to test, develop, and make changes to construction projects throughout the project life cycle. Additionally, 3D modeling technology allows intricate and detailed design features to be viewed geospatially (in 3D) from multiple perspectives, which can greatly help in the interpretation of design plans (FHWA, 2017). Torres et al. (2018) found that the use of 3D modeling contributed to cost and time savings, as well as an increase in productivity.
Eadie and Johnston (2020) identified the following major drivers for the adoption of BIM and 3D modeling: clash detection; visualization and improved communication to operatives and construction sequencing; competitive pressure and/or ambition; cost savings through reduced rework; whole life costing benefits for the project and optimization of life cycle costs; time savings and reduced delays; added value to client; reduced client pressure; health and safety improvements; construction quality enhancements; and innovation process and fabrication, including automation of schedules, prefabrication, and asset management benefits.
A SmartMarket report by McGraw-Hill Construction (2012) identified both short- and long-term benefits experienced by companies through the adoption of BIM and 3D modeling. According to their analysis, short-term benefits include reduced document errors and omissions; reduced rework; reduced cycle time of specific workflows; marketing new business; offering new services; and staff recruitment and retention. Looking at longer time frames, benefits include increased profits; reduced construction costs; reduced project duration; maintaining repeated business; and reduced claims and litigation expenses.
In a study by Guo et al. (2017), data collected from site visits to seven transportation agencies showed that the use of 3D modeling reduces cost and rework; improves productivity; enhances communication, visualization, error detection, and 3D design verification; and provides greater clarity of design intent. Moreover, the use of 3D models as part of the contract documents can greatly facilitate the bidding process. Furthermore, producing as-built 3D models in the post-construction phase has the potential to greatly improve maintenance and operations efficiency, as well as future retrofitting operations (Jung et al., 2014; Woo et al., 2010).
In NCHRP Synthesis 560: Practices for Construction-Ready Digital Terrain Models, the current state of practice of DTMs (3D models of the bare ground surface with natural features such as ridges and break lines, often used for highway construction purposes) was investigated. State DOTs were asked about their perception of the estimated benefits from the use of DTMs. Overall, among 37 responding state DOTs, the following benefits emerged: easier calculation of earthwork quantities; earlier identification of plan discrepancies and conflicts; reduced risk during bidding for contractors and/or DOTs; improved communication on the project; fewer change orders and/or construction revisions; and fewer project delays. Additionally, when asked to rate their agreement with perceived long-term benefits, respondent state DOTs, on average, strongly agreed that long-term benefits include cost savings; improved accuracy of plans; improved documentation of measurements in a database for future reference; improved communication; and improved efficiency of project construction. Respondent state DOTs also agreed, on average, that fewer claims and litigations constitute a long-term benefit (Dadi et al., 2021).
3D Models Challenges and Barriers
Despite the numerous benefits and added value 3D modeling offers, particularly in the highway construction industry, its use faces major challenges. Such challenges include an increase in the cost and time to produce the 3D models; issues and concerns related to data management, privacy, and errors; the need for software training and expertise; issues related to model validation; and a lack of standards and guidelines, which can lead to incompatibility errors (Torres et al., 2018). Costin et al. (2018) identified technical, legal, and other challenges, as summarized in Table 1. Moreover, as-built digital 3D models are not easy to create in the post-construction phase, and require sophisticated equipment and expertise (Jung et al., 2014; Woo et al., 2010).
Table 1: Challenges of 3D Modeling, adapted from (Costin et al., 2018)
| Category | Challenge |
|---|---|
| Technical Challenges | Lack of interoperability and information sharing across different software |
| | Lack of knowledge |
| | The need for hardware that can handle the processing of large volumes of data |
| | Lack of definitions of data requirements as related to identifying to whom and when the data should be shared for the duration of the project |
| Legal Challenges | Lack of agreement on legal clauses about the use of digital signatures, stamps, and deliverables |
| | Integrity of data during transmission and information confidentiality |
| | Difficulties of updating insurance policies to cover the responsibility of stakeholders |
| | Lack of standards, methods, and contractual language, particularly when models are used as contract documents in the bidding process |
| | Disputes on data ownership and liability |
| Other Challenges | Institutional resistance to change and adopting new technologies |
| | Lack of sufficient funding to make the large initial investment, which includes updating software, hardware, and IT systems, training workers and engineers, and changing the project delivery method |
| | The need for education and training about the technologies and methods |
| | Change of role and responsibility of infrastructure project stakeholders |
Moreover, software implementation and operation remain a major challenge for 3D modeling. 3D modeling software requires a high initial capital investment and takes a long time to learn, and implementation is often compounded by a lack of support from, and resistance to change among, senior managers (Migilinskas et al., 2013). Additionally, software compatibility issues create serious problems for the successful implementation of models (Hasan & Rasheed, 2019). For instance, contractors, sub-contractors, and specialty contractors often use different software tools for their designs, many of which cannot be integrated, limiting the potential for design review and error/clash detection (Staub-French & Khanzode, 2006).
Contractors and sub-contractors face major challenges in the adoption of BIM and 3D modeling. Among the major barriers they face are the lack of skilled professionals with BIM expertise; cost and time constraints, including the time needed to hire people proficient in BIM, the cost of training such individuals, and the cost of software implementation; and the lack of official standards and processes for BIM use across the industry. Consequently, many contractors and sub-contractors have opted to outsource their BIM and 3D modeling responsibilities to third parties (Fountain & Langar, 2018).
Small and Medium-sized Enterprises (SMEs) in particular struggle with the implementation of BIM and 3D models. Major factors include the undue burden of training, the costs of implementation, and the lack of a clear return on investment (ROI). Additionally, SMEs struggle with technical, organizational, and process-related changes due to a lack of relevant expertise. Finally, SMEs are often unfamiliar with the significant potential and benefits of successful implementation of 3D modeling (Kouch, 2018).
Eadie and Johnston (2020) identified the following barriers for the adoption of BIM and 3D modeling: cost of implementation including software, technology, and training; lack of senior management support and staff resistance to change; lack of skilled staff and technical expertise; legal uncertainties and concerns; lack of client demand; impact on sociotechnical culture and lack of flexibility; software interoperability issues; doubts over return on investment (ROI) and lack of vision for long-term benefits; lack of supply chain buy-in; long learning curves and implementation time; and the suitability for ongoing projects.
In NCHRP Synthesis 593: 3D Digital Models as Highway Construction Contract Documents, the 15 state DOTs that do not currently use 3D models in construction were asked to select the reasons they had not yet adopted them. Among the 15 DOTs, 11 (73%) indicated the need for more education or training for the office staff; 10 (67%) reported the absence of processes to ensure that the detailing of 3D digital models is acceptable for use in construction; 10 (67%) noted the need for education and training for the field survey and inspection staff; 8 (53%) identified the need for education and training for equipment operators; 7 (47%) indicated that the modeling software is currently problematic and that the handoff of the model file to construction is yet to be established; and 7 (47%) had concerns related to stamping and sealing 3D digital models. On the other hand, the least common reasons for not adopting 3D digital models included financial reasons such as unproven ROI (1 DOT, 7%) and prior unsuccessful attempts at implementing 3D models. Notably, none of the state DOTs indicated that the high cost of development was a barrier to the use of 3D digital models (Nassereddine et al., 2022).
Quality Management Current Practices Overview
According to the FHWA, Quality Assurance (QA) refers to all the planned and systematic actions necessary to provide confidence that a product or facility will perform satisfactorily in service, while Quality Control (QC) comprises all those QA actions and considerations necessary to assess and adjust production and construction processes so as to control the level of quality being produced in the end product. QA/QC procedures on construction projects are conducted to assess whether the quality of the project complies with specified standards (Tang et al., 2022). The “Highway Design” chapter of the “Project Development and Design Guide” describes some data quality management procedures for highway construction. The chapter indicates that quality checks should cover both internally and externally produced work by following QA/QC protocols, including verifying all files and ensuring they are correctly organized, verifying all design calculations, ensuring all features are consistent with the project scope and do not conflict with each other, and complying with all legal, regulatory, and contractual requirements (FHWA, 2012).
The “FHWA Guidance on QA/QC in Bridge Design” details the requirements of design quality management for bridges. Such practices include confirming plan sheets are properly named and organized, and design calculations are independently verified and double checked. However, this guidance does not discuss procedures for 3D models (FHWA, 2011).
The “Design Review: Principles and Practice” handbook highlights the importance of quality control and offers several principles of design review. To improve the outcomes of the quality management of the design, the design review process should be conducted by people who are independent of the design creator and decision makers to avoid conflicts of interest. It should be carried out by knowledgeable experts who can offer constructive critiques. It should be carried out in an objective methodical manner rather than depending on the individual reviewer’s tastes. The review process should be completed in a timely fashion.
The review should be multidisciplinary to combine perspectives of different engineering disciplines for a more rounded assessment. Finally, the findings of the review should be accessible, transparent, and clear to the design team. However, this handbook does not specifically offer any insight on quality management practices of 3D models (Design Council, 2019).
NCHRP Project 20-07/Task 172 describes some design and data quality management procedures and practices. Those include making a formal statement of all the design submission requirements; making a clear description of the review personnel and process; checking if the design conforms with the QA/QC plan and the project requirements; reviewing relevant documents including plans and drawings, and verifying design calculations, specifications, or any other pertinent documents. Still, this report does not address any such practices for BIM or 3D models (Molenaar et al., 2005).
The Florida Department of Transportation (FDOT) expects the Design-Build contractor to be responsible for ensuring the quality and accuracy of the design, plans, specifications, drawings, or any other documents. They request that the contractor describe the checking and review process for the specifically designed project. Then, the quality control plan should be submitted to FDOT and signed by the responsible professional engineer to confirm that the review has been conducted. The contractor is fully expected to correct all errors in the documents or design uncovered during the review process without any additional compensation (FDOT, 2000).
The Washington Department of Transportation (WSDOT) QA/QC procedure of design and construction documents is organized by engineering discipline. The QA/QC process includes independently checking and back-checking, by experienced architects and engineers, all plans, calculations, drawings, and other submittals to ensure that they are in accordance with generally accepted architectural and engineering practices. The checker and back-checker should be clearly identified on the face of all submittals. All plans and documents should be stamped, dated and signed by the responsible Washington registered architect or engineer (Washington Department of Transportation, 2000).
The South Carolina Department of Transportation (SCDOT) quality control plan for roadway design includes a comprehensive checklist for preliminary plans, right of way plans, and construction plans.
The inspector has to verify all elements in the checklists are in compliance with project requirements. The checklist includes general project/roadway information, roadway cross section details, plan sheets, section profiles, and other reference information (South Carolina Department of Transportation, 2022).
The Virginia Department of Transportation (VDOT) organizes its design QA/QC by engineering discipline. At VDOT, the design QA should be performed by one or more members of the lead designer team independent of the QC team. The design QC can be performed at the office where the work was conducted. The design QC has to include a review of math and engineering computations, technical accuracy, and conformance to contract requirements; a review of form, content, and spelling; and coordination with other engineering disciplines (Virginia Department of Transportation, 2018).
A guide for the review of AASHTO Controlling Design Criteria offers a systematic approach to the review of existing roadways prior to implementing improvements to those roadways; however, the guide does not offer insight on quality management practices for data in general or 3D models specifically (Viparina, 2009). Additionally, while the AASHTO “Constructability Review Best Practices” describes elements that are part of successful constructability practices employed by state transportation agencies, it does not discuss quality control measures for design information, including 3D models (AASHTO Subcommittee on Construction, 2000).
Quality Management of Contract Documents
The literature on quality management practices for contract documents is limited as only one resource on the matter was found. A report on the organizational analysis of the United States Army contracting command in Kuwait briefly discusses the review process of contract documents. The report states that for contracts with values exceeding $100,000, all relevant and applicable files must be verified and checked for accuracy by the contracting officer, the procurement analyst, the attorney-adviser, and chief of contract operations. For contracts exceeding $500,000, in addition to the aforementioned individuals, all relevant documents have to be further reviewed and checked by the Commander of the US Army Contracting Command - Kuwait (USACC-KU) and the Deputy Principal Assistant Responsible for Contracting (PARC).
This process was implemented in March 2008 via email by the commander of the USACC-KU and the Deputy PARC (Orr, 2008).
Quality Management Practices of 3D Models
QA/QC procedures on construction projects are conducted to confirm that the quality of the project complies with specified standards (Tang et al., 2022). Most state DOTs verify the accuracy of the constructed work either by performing independent verification of survey points or by running model checks comparing the design model against the design used by the contractor. Only a few state DOTs conduct both model checks and field verification of survey points (Dadi et al., 2021).
In NCHRP Synthesis 593: 3D Digital Models as Highway Construction Contract Documents, among the five state DOTs that use 3D models for information purposes and as contract documents, only two state DOTs (Florida and New York) indicated that they use QC checklists for 3D digital models. However, four state DOTs (Florida, New York, Oklahoma, and Utah) indicated that they perform internal checks of the quality of the model of record, and that they partner with consultants to preview the 3D digital model in advance of the letting (Nassereddine et al., 2022).
In NCHRP Synthesis 560: Practices for Construction-Ready Digital Terrain Models, 14% of responding state DOTs indicated that they do not verify the accuracy of their DTM models used in construction. Among the 32 state DOTs using DTMs who perform model verification, 78% indicated that they conduct field verification surveys to verify model accuracy, 31% indicated that they run model checks with the DTM used by the contractor to verify model accuracy, and 3% stated that they compared point cloud data (terrestrial, drone photogrammetry, Light Detection and Ranging (LiDAR)) to the DTM to verify model accuracy (Dadi et al., 2021).
Michigan Department of Transportation (MDOT) has used a formal process to review the digital data, including 3D models, that they provide to contractors as reference information since 2015. However, most of their consultant designers were not aware of the internal quality assurance review process of design data models (Mitchell et al., 2019). This review process uses a spreadsheet that is maintained by the design
manager throughout the project design development phase, which tracks all quality control checks and the quality assurance reviewer’s comments at each milestone. The review process includes a check on the presence of all relevant files with the correct file names, conventions and referencing; the visual review of the alignment and profile files in native and open formats; tie-in geometry for alignments, profiles, and side slopes; visual review of various discipline design files (e.g., signing, lighting); visual review of 3D model files (both line strings and surfaces); concurrence between digital files and contract PDF plans; visual review of surfaces and drainage files; geospatial metadata (i.e., units, coordinate zones, vertical datum); and additional criteria for construction layout data (MDOT, 2015).
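As an illustration only, a milestone-based QC tracking record of the kind MDOT maintains in its spreadsheet might be sketched as follows; the field names, milestone names, and check items below are hypothetical, not taken from the actual MDOT spreadsheet:

```python
from dataclasses import dataclass, field

@dataclass
class QCCheck:
    item: str                 # e.g., a file-naming or visual-review check
    passed: bool = False
    qa_comment: str = ""      # QA reviewer's comment at this milestone

@dataclass
class MilestoneReview:
    milestone: str            # e.g., "Preliminary Plans"
    checks: list[QCCheck] = field(default_factory=list)

    def open_items(self) -> list[str]:
        """Return the checks still failing at this milestone."""
        return [c.item for c in self.checks if not c.passed]

# Hypothetical example of one milestone's review record
review = MilestoneReview("Preliminary Plans", [
    QCCheck("All files present with correct names and referencing", passed=True),
    QCCheck("Alignment/profile visual review (native and open formats)"),
    QCCheck("Digital files concur with contract PDF plans", passed=True),
])
print(review.open_items())  # lists the one check still open
```

A structure like this captures the essential value of the MDOT approach: each milestone carries its own list of checks and reviewer comments, so unresolved items are visible at every design stage.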
FDOT modeling software has been set up to include QA/QC tools that check the 2D and 3D design files for Computer Aided Design and Drafting (CADD) compliance. The model is reviewed by the consultant designers, engineers, project managers, the FDOT District Project Reviewer, construction contract estimators, contractors, and other potential parties. The model quality control process includes visual checks for gaps, spikes, overlays, and transitions, as well as a formal checklist of items including pavement lanes, shoulders, sidewalks, slopes, superelevation, slope breaks, clearances, conflicts with utilities, drainage, signs, and depths, among others (FDOT & Sexton, 2018).
Table 2 is a sample checklist used by FDOT in their quality control process of their 3D models.
Table 2: Sample 3D Engineered Model QC Checklist, adapted from (FDOT & Sexton, 2018)
| Implementation Items | Originator Initials | Reviewer Initials | Comments |
|---|---|---|---|
| Geographical Coordinate System has been defined in the model(s)/design file | | | |
| 3D Baseline/Centerline has been displayed in the model(s) | | | |
| Referenced 3D model break lines match the 2D planimetric lines | | | |
| Review of model(s) for completeness, visually | | | |
| Component Depths match the Typical Section | | | |
| Verify Station Offset Elevation at Critical Location | | | |
| Verify Cross Slopes | | | |
| Vertical Clearance | | | |
| Clash Detection – Interference Checking | | | |
| 3D Deliverable Created | | | |
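FDOT's visual checks for gaps and spikes suggest one surface check that lends itself to automation. The sketch below is a hypothetical illustration, not FDOT's actual tooling (which runs inside CADD software): it flags DTM grid cells whose elevation deviates sharply from the median of their neighbors, the signature of a surveying blunder or modeling "spike."

```python
import statistics

def find_spikes(grid, tol):
    """Flag (row, col) cells whose elevation deviates from the
    median of their 8-neighborhood by more than tol."""
    rows, cols = len(grid), len(grid[0])
    spikes = []
    for r in range(rows):
        for c in range(cols):
            nbrs = [grid[i][j]
                    for i in range(max(0, r - 1), min(rows, r + 2))
                    for j in range(max(0, c - 1), min(cols, c + 2))
                    if (i, j) != (r, c)]
            if abs(grid[r][c] - statistics.median(nbrs)) > tol:
                spikes.append((r, c))
    return spikes

# Hypothetical 3x3 DTM elevation grid (units arbitrary)
dtm = [
    [100.0, 100.1, 100.2],
    [100.1, 130.0, 100.3],   # 130.0 is an elevation spike
    [100.2, 100.3, 100.4],
]
print(find_spikes(dtm, tol=5.0))  # → [(1, 1)]
```

The median is used rather than the mean so that the spike itself does not contaminate its neighbors' reference value; production checks would of course operate on triangulated surfaces rather than a regular grid.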
At the Iowa Department of Transportation (Iowa DOT), before 3D engineered models are sent out for the bidding process, the final design goes through an extensive quality assurance process. In Iowa DOT's experience, a major benefit of using 3D models is the level of quality assurance and control that can be applied throughout the design phase. While traditional 2D quality control processes are limited by the reviewer's ability to mentally visualize 2D plans in a 3D workspace, 3D engineered models allow the designer to look at the project in isometric views from different angles to detect any irregularities that ought to be corrected. Iowa DOT uses a drainage analysis tool to assist the designer in determining the direction of flow on an existing surface by delineating drainage on the proposed 3D surfaces of the model. Such tools offer an additional level of quality control. Additionally, 3D models can be exaggerated horizontally and vertically to make errors easier to detect. Most errors detected throughout the QA/QC process are easily and quickly corrected in the 3D engineered model prior to supplying it to the client, owner, or contractor (Reeder & Nelson, 2015).
The purpose of the 3D Roadway Design Center (3D RDC) at the Oregon Department of Transportation (ODOT) is to create the processes, standards, and practices that allow the delivery of consistent, quality roadway digital design documents using accurate and detailed ground surveys. The 3D RDC has added specific guidance to its best practices to address roadway digital design data quality control. The guidance states that the Region Roadway Manager is responsible for incorporating the review of roadway digital design data into the Region Roadway Quality Control process. Additionally, the manager is to perform a QC review of the roadway digital design data included in the eBIDS Handoff Package and the Construction Survey Handoff Package, and to ensure the uploaded eBIDS Handoff Package includes the data intended by the designer. Finally, the manager has to review and process exception requests to the eBIDS Handoff Package requirement in accordance with the Local Agency Guidelines Manual (ODOT, 2015).
The Georgia Department of Transportation (GDOT) has issued guidance on its QA/QC process. Accordingly, the reviewing engineer on the design team is expected to conduct a thorough QA of the 3D model once in the preliminary design and once in the final design to verify all the details included in the design model. The reviewer should assess whether the 3D model is reasonable and raises no major concerns, particularly in areas that are not normally shown in design plans. GDOT advises the reviewer to electronically view DTM surfaces in 3D and rotate the surfaces through different angles and views to visually inspect them for problematic areas. Techniques the reviewer can perform in the QA process include the following: ensuring all files exist with the correct naming and referencing; overlaying 3D models with the 2D design to check for consistency; viewing the proposed contours of the 3D surface at a small interval and checking for continuity of the surface; checking for good triangulation and elevations using triangulation and elevation tools; and using trickle tools to verify flow paths from any surface point (GDOT, 2018).
VDOT has developed a 3D Model Development Guidance Manual to address 3D modeling requirements, best practices, and frequently asked questions, with the expectation that it will provide a direct benefit during the project delivery phase. However, VDOT has yet to publish details on QA/QC methods for 3D engineered models; the agency has indicated that future versions of the manual will include additional information about QA processes and constructability review efforts (Virginia Department of Transportation, 2020).
The United States Army Corps of Engineers (USACE) has incorporated a section on quality control into its BIM execution plan template. Here, the plan has to concisely describe the quality control strategy for each of the model files in the project. The template also includes a table where the plan must identify the responsible party for several quality checks, provide the reference documents necessary to execute those checks, and specify the frequency with which each of the checks must occur (United States Army Corps of Engineers, 2020).
International Standards
Outside of the United States, the European community has united behind the ISO 19650 approach to construction information management (Maier, 2020). The ISO 19650 standard is an international standard from the International Organization for Standardization (ISO). It is used for managing information over the whole life cycle of a built asset using BIM. This standard, published in 2019, contains the same concepts, principles, and high-level requirements as the UK BIM Framework, and is closely aligned with the current UK 1192 standards (British Standards Institution, 2022). In the United Kingdom, ISO 19650 supersedes the British Standard BS 1192:2007+A2:2016, and the Publicly Available Specification (PAS) 1192-2:2013 (Cohesive BIM Wiki, 2021).
Currently, the ISO 19650 series of standards are the most comprehensive BIM/information management standards available in the world (Plannerly, 2022). The ISO 19650 has two parts related to information management which are (Mirniazmandan, 2021):
- ISO 19650-1: Organization and digitization of information about buildings and civil engineering works, including BIM – Information management using BIM: Concepts and principles.
- ISO 19650-2: Organization and digitization of information about buildings and civil engineering works, including BIM – Information management using BIM: Delivery phase of the assets.
In the ISO 19650 standard, part 1 outlines the concepts and principles and provides recommendations on how to manage building information, while part 2 supplies information management requirements for the delivery phase of assets (Cohesive BIM Wiki, 2021). The intent of ISO 19650-2 is to offer a roadmap that facilitates the standardization of BIM processes in a uniform fashion (Peters & Mathews, 2019). ISO 19650 applies to the whole life cycle of a built asset, including strategic planning, initial design, engineering, development, documentation and construction, day-to-day operation, maintenance, refurbishment, repair, and end-of-life. Use of this standard can potentially help remove barriers to collaborative working across international borders (Cohesive BIM Wiki, 2021).
The ISO 19650 series has major benefits for large multinational organizations, which have struggled for years to accommodate the varying requirements of their partners and stakeholders across international regions. The series helps these organizations create a unified approach across each of their regions (Dadmehr & Coates, 2019).
ISO 19650-1 establishes a framework for the business processes that support creating and managing BIM information, including the following (International Standard for Organization, 2018a; Maier, 2020):
- Perspectives of project and asset information management.
- Definition of information requirements and the resulting asset and project information models.
- The information delivery cycle and its alignment to the asset life cycle.
- Project and asset information management functions.
- Delivery team capability and capacity.
- Information container-based collaboration.
- Information delivery planning, including timing, responsibility, and defining a federation strategy and breakdown structure for information containers.
- Managing collaborative information production, including level of information and information quality.
- The Common Data Environment solution and workflow.
ISO 19650-2 describes the quality management practices for data and information. Each task team should undertake a quality assurance check of its information in accordance with the project's procedures and methods. If the check is successful, the outcome of the check is recorded. If the check is unsuccessful, the information should be rejected, and the author of the information should be notified and given comments on the required corrective action. After the quality check is complete, and before the information is shared in a common data environment for collaboration, the information must be independently reviewed to ensure it includes all data requirements, the level of information need, and the information necessary for coordination with other task teams. ISO 19650 recommends the use of an “Information management matrix” to organize the quality checks and reviews for each task and to confirm this process is successfully completed (International Standard for Organization, 2018b).
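The check-then-review flow described above can be illustrated with a minimal sketch. The function and field names below are hypothetical, not taken from ISO 19650; the sketch only shows the sequencing of the quality check, rejection with comments, and independent review before sharing to a common data environment.

```python
from dataclasses import dataclass, field

@dataclass
class CheckResult:
    """Outcome of a quality assurance check on an information container."""
    passed: bool
    comments: list[str] = field(default_factory=list)

def quality_check(info_container: dict, project_checks: list) -> CheckResult:
    """Run each project-defined check; a check returns None on success
    or a comment string describing the required corrective action."""
    comments = [msg for check in project_checks
                if (msg := check(info_container)) is not None]
    return CheckResult(passed=not comments, comments=comments)

def submit_to_cde(info_container, project_checks, independent_review):
    """Sketch of the ISO 19650-2 sequence: quality check, then independent
    review, before the information is shared to the common data environment."""
    result = quality_check(info_container, project_checks)
    if not result.passed:
        # Reject: the author is notified with comments for corrective action.
        return ("rejected", result.comments)
    if not independent_review(info_container):
        return ("review-failed", [])
    return ("shared", [])
```

A project check might be as simple as `lambda c: None if "author" in c else "missing author"`; the point of the sketch is that rejection carries the comments back to the author, mirroring the standard's corrective-action loop.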
Aside from ISO 19650, numerous other international standards address construction-related guidelines and practices. However, they fall short of addressing quality management practices for 3D models and 3D model–based project development, which further highlights the need for this research.
ISO 6707 is an international standard that offers a full set of vocabulary, terms, and concepts commonly used in the documentation governing construction work and construction products, with the aim of improving communication in the design, execution, and maintenance of construction work. The standard consists of four parts. Part 1 defines general terms related to buildings, civil engineering, and construction work (International Standard for Organization, 2020b). Part 2 defines terms related to buildings, civil engineering, and construction work in the specific areas of contracts, communication systems, methods, and documentation (International Standard for Organization, 2017a). Part 3 establishes preferred terms and concepts related to sustainability for buildings and other types of construction works (International Standard for Organization, 2017b). Part 4 establishes preferred terms and concepts related to facility management for buildings and other types of construction works (International Standard for Organization, 2021). However, these standards do not address any data quality management terms, concepts, or vocabulary.
ISO/TS 12911 is an international standard that sets forth a framework of specifications for the commissioning of BIM, applicable to any range of modeled buildings and facilities. The framework assists an information manager in structuring an international-, national-, project-, or facility-level BIM guidance document. Its main objectives are to create common guidance for the application of BIM, make BIM guidance more manageable, and enable the testing of BIM-related guidance. The “controls” section of the framework aims to guide the specification of managerial processes and quality assessment associated with BIM processes to ensure data quality and business compliance (International Standard for Organization, 2012b). However, this document does not detail guidance for the practices and processes of quality management of BIM or 3D models.
DIN EN 17412-1 is a German standard that discusses the concepts and principles for defining the level of information need for BIM. The document specifies the characteristics of the different levels used to define the detail and extent of information required to be exchanged and delivered throughout the life cycle of built assets, and it offers guidelines for the principles required to specify information needs. The concepts and principles in this standard can be applied to general information exchange, to a generally agreed way of exchanging information between parties in a collaborative work process, and to an appointment with specified information delivery (Deutsches Institut für Normung, 2021). However, this document does not discuss any quality management practices for BIM or 3D models.
ISO 22263 is an international standard that sets out a framework for the management and organization of project information about construction works. Its purpose is to facilitate the control, exchange, retrieval, and use of relevant information about the project and the construction entity. The standard is targeted toward all agents in the project organization involved in managing the construction process as a whole and in coordinating its sub-processes and activities. However, it does not discuss information organization for 3D or BIM models, nor does it detail specific guidelines for practices relating to the quality management of information (International Standard for Organization, 2008).
The ISO 29481 international standard is an information delivery manual for building information models. It consists of three parts. Part 1 specifies a methodology that links the business processes undertaken during the construction of built facilities with the specification of the information required by those processes, along with a way to map and describe the information processes across the life cycle of construction works. This is intended to facilitate interoperability between different systems and promote collaboration during all stages of a project life cycle (International Standard for Organization, 2016). Part 2 specifies a methodology and format for an interaction framework describing ‘coordination acts’ between actors in a building construction project during all life cycle stages (International Standard for Organization, 2012a). Part 3 defines the data schema and classification for the efficient identification, management, and reuse of the specifications in this manual (International Standard for Organization, 2020a). Still, ISO 29481 does not describe or offer any practices for quality management procedures for information related to 3D modeling.
Task 2. State Policy and Statute Analysis
Task Overview
The purpose of Task 2 was to identify the required data to be collected in Phase II, to achieve the research objective. The task involved conducting a detailed examination of statutes, standards, policies, and practices (incl. checklists) for review (including QC and QA) and documentation procedures from model
development through delivering contract documents. The selected states were Pennsylvania, Montana, Florida, and Minnesota.
The task looked at how DOTs document current quality management procedures regardless of whether the project is delivered digitally or not. All disciplines were considered. The review explored any documentation, such as checklists, quality management procedures and manuals that currently describe the requirements for performing quality reviews for contract plans. The review also looked at documentation being developed for conducting reviews or updating current procedures to work in a digital environment.
The outcome of the task is a set of guidelines for what data to collect in order to develop the research methodology and create the guide, which will in turn support development of a quality management process that is fully compatible with digital delivery. A secondary consideration is identifying what types of review tools software providers could develop to partially automate the design review process.
Task Outcomes
Statutes and rules regulate the practice of engineering. Typically, they establish a board that applies the regulations to licensed engineers and define the qualifications and procedures for licensure, disciplinary procedures, and so on. Statutes and rules contain many definitions, including those of a professional engineer, responsible charge, and other terms regarding the responsibilities that engineers shoulder.
Typically, statutes and rules also define the requirements for applying a seal, including applying a signature alongside the seal, the medium of the seal (physical and digital), which types of documents require seals, and so on. Some states require that every plan sheet be signed and sealed while others only require the cover sheet to bear the signature.
For some states, this is all that is contained in the statutes and rules, but other states go further. For example, Pennsylvania has a section that includes the type of information found in the MUTCD to regulate the placement of signs in designated Highway Safety Corridors. Pennsylvania also has specific requirements that jurisdictions need to meet (including having a designated Municipal Traffic Engineer) to be certified to place traffic signals on the roads.
Florida’s regulations go into significantly more detail regarding an engineer’s responsibilities and include an expanded section on the responsibilities of structural engineers. That section defines terms such as Structure, Structural Component, Structural System, Structural Engineering Documents, and Structural Submittals. Florida also has the most detailed requirements for signing and sealing digital files, including 3D models.
Policy and Practice Findings
Agencies use a patchwork of manuals, guidelines, bulletins/memoranda, and job aids to standardize and manage the delivery of their design programs. Identifying the full spectrum of design references involves navigating a complex web or having prior institutional knowledge of the agency’s policy framework. Minnesota groups information by functional area, providing a more seamless approach to locating the agency’s design policy and job aids. Other agencies group information by document type; for example, Montana has a single page with links to all the manuals. This makes it convenient to locate the manuals but harder to be certain that all the governing documents have been identified.
In general, agencies say very little about design reviews. The review milestones are easy to locate in the design manuals and/or process maps, but the procedures for doing the reviews are not well documented outside of isolated examples. Minnesota has a very detailed list of checks for bridge final plans and Montana has detailed instructions for manually identifying utility clashes. Pennsylvania has similar checklists for some milestone reviews. Florida has a brief section in the design manual that identifies clear responsibility for Category 1 and Category 2 structure design reviews but does not provide information on what the design reviews should cover. Generally, design reviews are performed using 2D plans, which in the absence of clear design review requirements can lead to an emphasis on the quality and consistency of the plan drafting, rather than the design itself.
Some agencies address quality directly, while others are silent on the topic as it relates to design. Florida’s Design Manual has two quality-related chapters; the first describes a required QA/QC Management Plan and the second the department’s role in quality assurance. The CADD manual has a chapter devoted to quality that references the Design Manual, describes associated job aids, and provides detailed procedures for Quality Control of Corridor 3D Models and Extracted Surfaces. Minnesota’s LRFD Bridge Design Manual has a section describing process and product control procedures for the use of COTS software, bespoke spreadsheets, and other models. An internal committee evaluates software before it is placed into production and maintains an approved list of spreadsheets. Nevertheless, the manual points out, this does not absolve the designer of responsibility for the output of the programs used in bridge design. All calculations, whether manual or computer-generated, must be checked by a second engineer using a three-tiered approach that depends on the complexity of the analysis required for the component type.
Florida requires a Quality Control Plan that identifies the staff performing the roles of Engineer of Record (EOR) (i.e., the professional that will sign and seal the document), Lead Technical Professional, QC Reviewer, QA Manager, and BIM Manager. The staffing plan must list the functional areas, milestones, lead technical professional, and quality control reviewer. The plan must also identify the software that will be used for conducting 3D model reviews. There are requirements for formal review documentation, but no guidance on how to conduct those reviews. There are instructions for a five-step process to execute and document paper-based reviews using stamps and colors for each responsible party. The instructions for electronic reviews simply say to ensure that the five-step process can be replicated. There are detailed procedures for BIM reviews that address development, design analysis, and interdisciplinary coordination. Developmental reviews address conformance (adherence to CADD standards), completeness, and internal consistency. Again, the manual identifies who is responsible, the guiding standard, and what type of check to conduct, but does not describe how to perform the review.
Both Florida and Montana have incorporated the concept of Level of Development (LOD). Florida’s CADD manual defines LOD for proposed model elements and Level of Accuracy (LOA) for existing conditions. The CADD Manual defines minimum modeling requirements by discipline and provides a Model Element Breakdown Workbook (MEB) to document the LOD/LOA of the model elements. Montana has an LOD guide for designing bridges in 3D; it essentially consists of foundational LOD definitions and an MEB with minimum modeling requirements.
Guidelines
Providing guidance for how to execute and document model-based quality management involves first defining the boundaries of a review (who, when, why, what). For example, there would be a different responsible party, different design criteria, a different methodology and a different outcome (documentation and actions) for a roadway preliminary design review versus a bridge hydraulics final design review. Figure 3 shows how these boundaries lead to defining the review methodology and outcome. Our methodology begins with building taxonomies for these review boundaries.
The purpose of the guide developed under this study is to provide guidance for executing design and 3D model checks and then documenting those checks. For example, if part of a roadway design check is to verify the lane width, then the guide should provide a clear methodology for how to measure a distance and check that lines are parallel. The guide will also provide guidance on creating job aid checklists, along with sample checklists for various disciplines.
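The lane-width example above can be sketched in code. This is a minimal illustration, not a methodology from the guide: it assumes each lane edge is represented as a point and a 2D direction vector, checks parallelism within an angular tolerance, and measures the perpendicular offset between the edges.

```python
import math

def is_parallel(d1, d2, tol_deg=0.1):
    """Check that two 2D direction vectors are parallel within an angular tolerance."""
    cross = d1[0] * d2[1] - d1[1] * d2[0]
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    angle = math.degrees(math.atan2(abs(cross), abs(dot)))
    return angle <= tol_deg

def offset_between(p_on_line1, d1, p_on_line2):
    """Perpendicular distance from a point on line 2 to line 1 (lines assumed parallel)."""
    n = (-d1[1], d1[0])                     # normal to line 1
    length = math.hypot(*n)
    v = (p_on_line2[0] - p_on_line1[0], p_on_line2[1] - p_on_line1[1])
    return abs(v[0] * n[0] + v[1] * n[1]) / length

def check_lane_width(edge1, edge2, required_width, tol=0.01):
    """edge1/edge2: (point, direction) tuples for the two lane edge lines."""
    (p1, d1), (p2, d2) = edge1, edge2
    if not is_parallel(d1, d2):
        return False, "edges not parallel"
    width = offset_between(p1, d1, p2)
    return abs(width - required_width) <= tol, f"measured width {width:.3f}"
```

For instance, two parallel edges 3.6 m apart pass a 3.6 m lane-width check, while skewed edges fail the parallelism test before any width is measured.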
The data collection in Phase II of the study should focus on the following four areas:
- Information needed to build out the taxonomies.
- Job aids for what to check (e.g., existing checklists).
- Job aids for how to conduct design and model reviews.
- Definitions to support a set of properties for quality management metadata (referred to in this conduct of research report as a data dictionary) that can be used by vendors to build software tools to execute and/or document model-based reviews.
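To make the data dictionary idea above concrete, the sketch below shows one possible shape for a quality management metadata record. All field names and values are illustrative assumptions, not drawn from any published data dictionary; the point is that a vendor tool could serialize such a record and attach it to review documentation.

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class ReviewRecord:
    """Hypothetical quality-management metadata for one model-based review."""
    functional_area: str   # taxonomy term, e.g. "Roadway"
    milestone: str         # e.g. "Final Design"
    review_type: str       # e.g. "3D model"
    review_criteria: str   # governing standard or manual section
    reviewer: str          # responsible party performing the QC review
    review_date: date
    status: str            # e.g. "passed" or "comments issued"

record = ReviewRecord("Roadway", "Final Design", "3D model",
                      "Agency Design Manual Ch. 2", "J. Doe",
                      date(2023, 3, 1), "passed")
metadata = asdict(record)  # plain dict a vendor tool could serialize and attach
```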
Information to Build Out the Taxonomies
The following types of information will support building out the taxonomies:
- Types of manuals and design guidelines.
- Process maps (to identify milestones and review types).
- Organization charts (to identify functional areas and responsible parties).
- Model element breakdowns.
Job Aids for Review Methodology Development
The following types of information will support developing review methodology guidance:
- Design manuals and CADD manuals.
- Job aids like review checklists.
- Software vendor materials that describe software-based reviews.
- Training materials and recordings.
The purpose of collecting design review checklists is to inventory the types of actions performed as part of those design checks. These may include measuring lengths and areas, counting quantities, checking for offsets, checking for coincidence (e.g., a dimension starting from a work point), locating content, isolating content, hiding and revealing isolated content, creating reports, and so on.
Definitions
The following types of information will support capturing definitions to create a data dictionary:
- Design manuals and CADD manuals.
- Statutes and administrative rules.
- Glossaries of terms.
Task 3. Task 1 & Task 2 Synthesis
Task Overview
The purpose of this task was to summarize the results of Tasks 1 and 2. The main objective of those tasks was to review published information regarding current quality management processes, design review protocols for 2D contract plans and 3D model–based delivery, State DOT policies, statutes, and the overall state of the practice, to help the research team finalize the research methodology and determine the data to be collected during Phase II.
The research team also reviewed quality management practices and procedures for all disciplines (regardless of whether the project is delivered digitally or not). Specifically, the research team reviewed in detail the policies, design manuals and job aids for Pennsylvania, Montana, Florida, and Minnesota.
Task Outcomes
This task synthesized the results of Tasks 1 and 2. The synthesis combined the gaps and guidelines and provided a cohesive summary of the two areas of exploration.
Task 4. Methodology
Task Overview
The purpose of Task 4 was to propose a methodology, to be fully developed in Phase II, for achieving the research objective. At a minimum, the methodology needed to include the following:
- Define a list of elements to be considered in the review of 3D models and their respective standards. Identify which of these elements are contractual representations of the design intent or supplemental engineering information. Elements may include existing and proposed features (e.g., utilities, drainage, structures).
- Develop inter-discipline and multi-discipline review (including QC and QA) and documentation procedures.
- Develop a procedure for version control from model development through delivering contract documents.
- Develop resources (including workflow, data dictionary, etc.) to automate the review and documentation procedures. These resources will be available as a stand-alone document for software developers.
- Define a list of standard terminologies used in 3D modeling.
- Identify which parts of the review and documentation procedures are essential versus general practice.
The task drew on the results of Tasks 1 and 2, which reviewed the literature and took a deep dive into the policies, practices, job aids, and other resources provided by four State DOTs. The task also considered the needs of the guide, which is outlined in Task 5.
The outcome of the task was the framework for executing the research to develop the Guide, governing how the materials would be collected, organized, analyzed, developed, and ultimately presented in the Guide.
Task Outcomes
The proposed methodology had four basic steps:
- Collect and organize information.
- Identify and close gaps in collected information.
- Document and develop processes.
- Create guidance.
The proposed methodology began with developing taxonomies for different categories of information to organize both the materials that are collected in Task 7 and the information extracted from those resources. By using taxonomies, we could ensure that the guidance provided is comprehensive. The taxonomies also helped to define the boundaries of a review. For example, there would be a different responsible party, different design criteria, a different methodology and a different outcome (documentation and
actions) for a roadway preliminary design review versus a bridge hydraulics final design review. Figure 4 shows how these boundaries lead to defining the review procedures and documenting the outcome.
We addressed the first methodology requirement by proposing the following taxonomies:
- Functional Areas: Major disciplines and sub-disciplines
- Model Elements: Major systems, sub-systems, major elements, and sub-elements
- Milestones: Points in time at which reviews occur
- Review Types: Categories of reviews, e.g., external, design, 3D model
- Review Criteria: Standards and other criteria that are the reference for the review
- Responsible Parties: Roles involved in the design and review processes
The methodology proposed collecting a suite of information for three purposes: to build out the taxonomies, to develop review methodology guidance, and to capture definitions to create a property set. The types of information were:
- Manuals and design guidelines.
- Process maps (which would identify milestones and review types).
- Organization charts (which would identify functional areas and responsible parties).
- Model element breakdowns.
- Quality management plans and BIM execution plans (which would identify terminology, responsible parties, review guidelines and procedures, and model element breakdowns).
- Glossaries of terms.
- Job aids like review checklists.
- Software vendor materials that describe software-based reviews.
- Training materials and recordings.
- Statutes and administrative rules.
- Metadata schemas.
The taxonomies would enable building out a library of reviews defined by functional area, milestone, model elements, review type, review criteria, and responsible party. Using the taxonomies in this way would identify gaps. We would assess whether a gap could be closed by collecting additional information or whether the information did not yet exist and needed to be developed. One example could be review criteria for 3D coordination, such as a set of rules for the acceptable clearances between different element types for automated clash detection.
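A clearance rule set of the kind mentioned above could be sketched as follows. The element types, clearance values, and point-based geometry are invented placeholders for illustration, not agency requirements; a real tool would operate on full 3D solids rather than points.

```python
# Hypothetical minimum-clearance rules (meters) between element-type pairs.
CLEARANCE_RULES = {
    frozenset({"storm drain", "water main"}): 0.45,
    frozenset({"gas line", "foundation"}): 1.0,
}

def clearance_violations(elements, rules):
    """elements: list of (element_type, (x, y, z)) reference points.
    Flags any pair closer than the rule defined for that pair of types."""
    violations = []
    for i, (type_a, pa) in enumerate(elements):
        for type_b, pb in elements[i + 1:]:
            required = rules.get(frozenset({type_a, type_b}))
            if required is None:
                continue  # no rule governs this pair of element types
            dist = sum((a - b) ** 2 for a, b in zip(pa, pb)) ** 0.5
            if dist < required:
                violations.append((type_a, type_b, round(dist, 3)))
    return violations
```

Keying the rules on an unordered pair (`frozenset`) means the rule applies regardless of which element is listed first, which is the natural shape for pairwise clearance criteria.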
Next, we would define the methodologies and documentation requirements for each review. This step addresses the second, third, and sixth requirements for the methodology, as well as part of the fourth. Based on the review of statutes in Task 2, legal requirements are generally limited to which documents require a seal and how the seal must be affixed. The quality management protocols identified in Task 2 were general professional practices contained in policy documents (e.g., manuals), guidelines, training materials, and job aids. We would address the sixth methodology requirement by identifying which current practices are no longer essential based on new, digital practices.
Review Procedures
With the review library mapped, we would begin to address the second and the first part of the fourth methodology requirements by developing procedures for using a 3D model to conduct the library of reviews for final deliverables. We would use the collected job aids to document procedures and augment the collected job aids with the procedures developed based on the gap analysis. The result would be an initial set of review procedures for each type of final deliverable review contained in the library of reviews.
We would then conduct interviews with our technical advisers (discipline engineers), the project panel, and up to four state DOTs, and then test the refined review procedures on projects, provided the timelines of the projects fit within the NCHRP Project 10-113 schedule.
Review Documentation
During Task 2, the research team identified a gap in procedures and documentation for reviews that are conducted digitally. To address the third methodology requirement, we would capture examples from Europe, in particular, the United Kingdom, where there has been substantial work toward developing a digital “golden thread” to trace accountability over the asset life cycle. We would develop a property set for the documentation that needs to accompany digital files to demonstrate the review status and responsible individuals. This property set could be applied as metadata, in an accompanying “Read Me” file, or as a digitally signed and sealed PDF document that accompanies the 3D models, depending on the implementing agency’s needs.
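One of the delivery options mentioned above, a sidecar file accompanying the model, can be sketched briefly. The file naming convention and property names below are illustrative assumptions, not part of the developed property set.

```python
import json
from pathlib import Path

def write_review_sidecar(model_path: str, properties: dict) -> Path:
    """Write the review property set as a JSON sidecar next to a model file,
    e.g. bridge.ifc -> bridge.review.json (a hypothetical convention)."""
    sidecar = Path(model_path).with_suffix(".review.json")
    sidecar.write_text(json.dumps(properties, indent=2))
    return sidecar
```

The same property set could equally be embedded as metadata inside the model file or rendered into a signed PDF; a sidecar is simply the most format-agnostic of the three options.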
The methodology considered reviews involving mathematical relationships between vector geometry to be the best initial candidates for automated reviews. Clash detection algorithms and tools that check geometric compliance with codes, such as sight distances, curve lengths, and structural equations, are examples of reviews that can currently be automated. We would develop a data dictionary of fields that software vendors can add to their review documentation to meet the needs of agencies, and provide guidance on the steps for the accountable individuals to validate the automated review outcomes.
Finally, the methodology involved documenting the practices captured and/or developed using the logical organization of the Guide developed in Task 5. We would address the fifth methodology requirement by developing a glossary of terms for the guide. We would address the fourth methodology requirement by creating a comprehensive guide that addresses process control procedures and a data dictionary. The review property set would be provided as part of the appendix to the guide. These products would be shared with State DOTs to assist them in requesting automation tools from their software developers.
The guidance would be documented with a focus on the easy implementation of the guide and incorporation into agency policy documents and digital systems that are organized in disparate ways. There would be a chapter devoted to assisting agencies with the implementation of the guide. We would work with the panel to select the first chapter to be developed in Phase II of the research. While developing the guide, we proposed to conduct interviews to capture feedback, refine procedures, and, if timing allows, test the procedures on actual design projects.
Task 5. Preliminary Guide Outline
Task Overview
The purpose of Task 5 was to develop an annotated outline for the guide to be fully developed in Phase III, Task 11 (Guidebook Development). The preliminary outline presented herein is based on the information collected during Tasks 1 and 2. The preliminary outline will be used to draft the proposed methodology, which will guide the data collection and execution methodology development in Phase II.
Task Outcome
The result of the task was an annotated outline listing each chapter, with bullet points outlining the content of each chapter.
Task 6. Prepare Interim Report 1
Task Overview
The work performed through Phase I was summarized in Interim Report 1, completing Task 6. The report was submitted to the NCHRP project panel on January 6, 2023. Revisions were requested, and a revised version was presented to the NCHRP project panel on February 16, 2023. The revised Interim Report 1 was submitted on March 3, 2023. The research team then waited for approval of the Interim Report before advancing to Phase II of the research project; approval to continue was received on March 4, 2023.
Phase II. Data Collection and Methodology Development
Phase II includes four tasks as outlined in Figure 5.
Task 7. Data Collection and Analysis Synthesis
Task Overview
The objective of Task 7 was to collect and analyze resources to support the methodology to be developed in Task 8. The objective for Phase II was to develop a library of reviews for which procedures could be developed and documented in the guide. Another core objective for Phase II was to develop a property set of information needed to document the review process, along with a data dictionary for quality management metadata, so that software vendors could implement that property set in the report documentation for the review tools they develop in their software.
The research team focused on collecting data in four areas:
- Information needed to build out the taxonomies.
- Job aids for what to check (e.g., existing checklists).
- Job aids for how to conduct design and model reviews.
- Definitions to support a data dictionary for quality management metadata.
The research team collected examples of manuals, checklists, guidelines, process maps, glossaries of terms, statutes and administrative rules (which document definitions), organization charts, model element tables, job aids (including checklists), training materials, and software product descriptions. Some of these materials related to how an agency implements the quality process, while others related to how an agency communicates the 3D model requirements. The research team also met with 3D modeling experts and quality management experts to discuss practical approaches to executing the process and documenting the quality process digitally.
The core activities in Task 7 were:
- Collect information.
- Create taxonomies.
- Organize information.
- Create a review library.
- Collect a library of procedures.
- Collect a set of terms relating to 3D modeling and quality documentation.
- Identify and close gaps in collected information.
Outcomes
The outcomes of Task 7 included an extensive catalog of collected materials, a set of taxonomies, a library of reviews, a library of procedures performed in executing the reviews, and a collection of terminology.
Organized Collection of Materials
The materials were organized in a manner that would enable them to be quickly recalled when needed in Task 8. This involved tagging materials with metadata that could be used to locate the information. The metadata was derived from the taxonomies that were developed to create a review library. Taxonomies were developed for functional areas, model elements, review types, and responsible parties. The taxonomies are included in this conduct of research report as Appendices A-D.
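The tagging scheme described above can be illustrated with a minimal sketch. The material titles and tag values below are invented examples; the point is that taxonomy terms used as metadata allow materials to be filtered along any combination of taxonomy axes.

```python
# Example catalog entries tagged with taxonomy-derived metadata (illustrative only).
MATERIALS = [
    {"title": "Bridge final plan checklist", "functional_area": "Structures",
     "review_type": "design", "responsible_party": "QC Reviewer"},
    {"title": "Utility clash instructions", "functional_area": "Utilities",
     "review_type": "3D model", "responsible_party": "Designer"},
]

def recall(materials, **tags):
    """Return the materials whose metadata matches every supplied taxonomy tag."""
    return [m for m in materials
            if all(m.get(k) == v for k, v in tags.items())]
```

For example, `recall(MATERIALS, functional_area="Structures", review_type="design")` retrieves only the bridge checklist, which is how tagged materials could be quickly recalled during Task 8.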
Taxonomies
Taxonomies also make it easier for agencies to implement guidance, whether or not they organize their design standards, policies, and guidelines by functional area (e.g., discipline). The research team determined that it was not necessary to create taxonomies for review milestones (as these are linear) or review criteria (as these follow the functional areas). Taxonomies were developed for Functional Areas, Model Elements, Review Types, and Responsible Parties. This section describes each of the taxonomies developed by the research team. The taxonomies are included as appendices in the Task 9 Sample Guidebook, delivered as a separate document.
Functional Area Taxonomy (Appendix A)
The top level of the functional areas taxonomy is the discipline. The following disciplines were included: Construction, Drainage, Environmental, Geotech, Roadway, Rail, Structures, Survey, Traffic, and Utilities. Each discipline was broken down into a second level (i.e., the sub-discipline). Two sub-disciplines, Traffic – ITS and Structures – Bridges, were broken down into a third level.
Model Elements Taxonomy (Appendix B)
The model elements taxonomy closely follows the functional area taxonomy. This was a deliberate decision to align the model elements to the review criteria (which also follow the functional areas taxonomy). Table 3 shows the organization of the model element taxonomy on the left with an example from the Bridge Structures model element taxonomy on the right. Note this is the only taxonomy that was included in the Task 9 Sample Guidebook document.
The top level of the model elements taxonomy is the discipline. The following disciplines were included: Drainage, Geotech, Roadway, Rail, Structures, Survey, Traffic, and Utilities. The Construction and Environmental functional areas do not have corresponding model elements; this is because the indicated sub-disciplines (i.e., cost estimating, demolition, scheduling, value engineering, NEPA, and permitting) concern the project as a whole.
At the second level, the model elements taxonomy deviates from the functional areas taxonomy. In some cases, a sub-discipline has a corresponding group of model elements. In other cases (e.g., hydrology and hydraulics), model element groupings apply to more than one functional area. The primary consideration when creating model element groupings was to follow an asset class breakdown. The Structures model
element groupings followed the breakdown published in the Information Delivery Manual for the Design to Construction Data Exchange for Highway Bridges. For simplicity, the Structures discipline was split into two groups at the discipline level, i.e., Bridge Structures and Structural Components.
Table 3: The Format of the Model Element Taxonomy with an Example for Bridge Structures
| Discipline | Bridge Structures |
|---|---|
| Model Element Grouping 1 | Approach Structure |
| Model Element Type 1 | Approach Slabs |
| Model Element Type 2 | Sleeper Slabs |
| Model Element Grouping 2 | Bearings |
| Model Element Type 1 | Curb and Gutter |
| Model Element Type 2 | Deck |
| Model Element Grouping 3 | Substructure |
| Model Element Type 1 | Abutment/End Bent |
| Model Element Type 2 | Architectural Feature |
The model elements taxonomy was used to develop a Model Element Table (MET), which is a listing of model elements within each group and sub-group. The MET is a key tool to support BIM management during model development as well as a tool to document the quality process. “Existing” elements are not included as a model element grouping, even though many example METs do so. The researchers felt that existing elements follow the same asset-focused breakdown. In practice, existing elements are stored in a separate file (e.g., the “survey” file) and can be inventoried using their own tab in the MET file. The asset-focused breakdown in the MET is intended to better support identifying existing assets that are removed or rehabilitated through the construction process.
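As a sketch of how a MET follows the taxonomy, the snippet below flattens a discipline's groupings into spreadsheet-style rows. The taxonomy excerpt mirrors Table 3; the `met_rows` helper and row layout are hypothetical, not the report's actual MET format.

```python
# Illustrative sketch of deriving Model Element Table (MET) rows from the
# model elements taxonomy. The excerpt mirrors Table 3 for Bridge Structures.
bridge_structures = {
    "Approach Structure": ["Approach Slabs", "Sleeper Slabs"],
    "Substructure": ["Abutment/End Bent", "Architectural Feature"],
}

def met_rows(discipline, taxonomy):
    """Flatten a discipline's taxonomy into (discipline, grouping, element) rows."""
    return [(discipline, grouping, element)
            for grouping, elements in taxonomy.items()
            for element in elements]

rows = met_rows("Bridge Structures", bridge_structures)
for row in rows:
    print(row)
```

Each row then becomes a line in the MET spreadsheet, where review status for that element can be recorded.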
Review Types Taxonomy (Appendix C)
The review types taxonomy has five top-level categories: 3D model, Design, Quality Assurance, Survey, and External. Each of these categories breaks down to a second level. The 3D model category also breaks down to a third level, though on a project it may not be necessary to differentiate the third-level reviews: while each involves distinct procedures, they are often conducted by the same responsible party using the whole project or discipline dataset.
Responsibilities Taxonomy (Appendix D)
The responsible parties taxonomy has three top-level categories: Owner, Designer, and External Agency; each breaks down to a second level. While many DOTs self-perform design services, the intention of differentiating between the “owner” and the “designer” was to distinguish the project participants who perform Quality Assurance reviews (i.e., the “owner”) from those who perform Quality Control reviews (i.e., the “designer”).
The responsible parties taxonomy was developed to provide examples in the review library and to align to common roles on a project. The intention was to align the core competencies needed to execute review procedures to the roles in the taxonomy. In practice, individuals’ names would be entered in the review documentation; however, the guide could help a project team identify an individual with the necessary skillset.
For external reviews, the design project team has no control over the procedures that are used or the core competencies of the reviewers. Nevertheless, these reviews are of interest because the designer needs to provide digital deliverables that would meet the external reviewers’ needs.
Review Library
The research team developed a review library to help identify the types of reviews to be considered for inclusion in the Task 9 Sample Guidebook. Providing guidance for how to execute and document model-based quality management involves first defining the parameters of a review. For example, there would be a different responsible party, different design criteria, a different methodology, and a different outcome (documentation and actions) for a roadway preliminary design review versus a bridge hydraulics final design review. Figure 6 shows how these boundaries lead to defining the review procedures and documenting the outcome. Our methodology begins with building taxonomies for these review parameters.
The research team used the taxonomies to create a long-list of review types. It quickly became evident that certain types of reviews apply to groups of elements whereas others apply to the discipline model as a whole or even to the project dataset as a whole. Specifically, design reviews (particularly calculations and code reviews) are very disaggregated because different criteria are applied for different types of elements. However, 3D model reviews and some design reviews (e.g., cost control and value engineering) apply at the discipline level or to the project dataset as a whole. External reviews and quality assurance reviews apply to the overall project dataset.
We determined that there are too many combinations of reviews to document each one, so we will provide guidance on how to determine the comprehensive list of reviews needed on a project and create a library of reviews that are representative of each type of review. The omitted reviews would differ from the sample only in terms of the review criteria that are specific to the group of model elements.
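A small calculation illustrates why the long-list grows too large to document every review. The discipline and review names below come from the report's taxonomies and review packet; crossing every combination is a simplifying assumption, since not every pairing is meaningful on a real project.

```python
# Minimal sketch of the combinatorial growth behind the long-list of reviews.
# Crossing even small taxonomies yields far more entries than could each be
# documented individually (not every combination applies in practice).
from itertools import product

functional_areas = ["Drainage", "Roadway", "Structures", "Survey", "Traffic"]
review_types = ["Survey", "3D Model Integrity", "3D Model Standards",
                "Clash Detection and Spatial Coordination", "Design Discipline"]
milestones = ["Preliminary", "Intermediate", "Final"]

long_list = [f"{area} {review} review at the {milestone} milestone"
             for area, review, milestone
             in product(functional_areas, review_types, milestones)]
print(len(long_list))  # 5 * 5 * 3 = 75 combinations from a tiny example
```

This is why the guide instead documents representative reviews and explains how to derive the full project-specific list.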
Procedures Library
Creating a procedures library helped the research team isolate procedures that were applicable to each of the processes identified for performing quality management (i.e., initiate, check, back check, update and verify). These procedures will be further developed during Phase III.
In order to provide guidance on how to execute the reviews, the researchers need to reference a number of procedures—or actions that are taken within software to examine the 3D model content. However, these procedures are common to a wide range of review types. For example, the procedure “Select an alignment and view its properties” would be used both for a roadway design review and for a roadway model standards compliance review. The researchers therefore developed a library of procedures and will provide guidance
on how to use the procedures library to determine the core competencies needed to execute reviews. This will aid project teams in identifying individuals to execute reviews or to identify software proficiency training that reviewers would need to take in order to execute their reviews.
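The mapping from shared procedures to core competencies can be sketched as follows. The procedure “Select an alignment and view its properties” appears in the report; the competency names and the `competencies_for` helper are hypothetical.

```python
# Hedged sketch: because procedures are shared across review types, the core
# competencies for a review can be derived by uniting the competencies of the
# procedures it uses. Competency names below are hypothetical examples.
procedures = {
    "Select an alignment and view its properties":
        {"Navigate the model", "Query element properties"},
    "Create a rule set for clash detection":
        {"Configure clash detection", "Interpret clash reports"},
}

def competencies_for(review_procedures, library):
    """Union the competencies of every procedure a review requires."""
    needed = set()
    for name in review_procedures:
        needed |= library[name]
    return needed

roadway_design_review = ["Select an alignment and view its properties"]
print(sorted(competencies_for(roadway_design_review, procedures)))
```

A project team could use such a roll-up to match reviewers to reviews, or to identify the software training a reviewer would need.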
Collection of Terminology
The research team collected two different types of terminology: 3D model terms and quality process terms. The 3D model terms will be used to define a list of standard terminologies used in 3D modeling as a glossary in the guide. The quality process terms will be used to develop a data dictionary for quality management metadata that can be used by vendors to build software tools to automate reviews and/or provide review documentation. This glossary of terms is included as Appendix A in the Task 9 Sample Guidebook document.
Conclusion
The procedures are identified by considering one discipline and one review type at a time. In many cases, the procedures (or actions taken within a 3D model environment) apply broadly to all disciplines and review types (e.g., isolate content, pan, zoom, or view element properties). In other cases, procedures are specific (e.g., display the properties of the geospatial coordinate system, or create a rule set for a clash detection algorithm). Developing a comprehensive library of procedures is part of the Task 8 Execute Methodology; however, developing a full set of procedures would be difficult without a specific dataset and software to test the validity of the procedures. Nevertheless, the data collected in Task 7 was analyzed to inform the activities of Task 8 Execute Methodology and the content to be considered for the Task 9 Sample Guidebook.
With regard to documenting the quality process, the researchers found a wide range of practices and approaches. The researchers created a property set and data dictionary for documentation information that included both required and optional information. Required information included items like the name of the Originator and the origination date, and the name of the Reviewer and the review date. Optional information included items like the credentials of the reviewer, which some agencies require and others do not. The review documentation needs to be auditable, which means it needs to be stored in a way that is accessible to an auditor who does not have access to 3D modeling software. There are many ways to do this. An auditor will check that each element of the design and model has been checked, so the documentation needs to incorporate a way to demonstrate that these checks have occurred. In the past, this was facilitated by highlighting each section of the plans as the check occurred. A MET is a tool to facilitate that documentation, but there are still many ways to structure it. One spreadsheet file per model with tabs for each review at each milestone? Or one spreadsheet file per milestone with tabs for each discipline, each model, and each review type? Thus, the researchers have provided building blocks for an agency to create review documentation that fits its needs rather than tools that are directly applicable.
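The required/optional split in the documentation property set might be modeled as follows. The required field names follow the report's examples (originator, origination date, reviewer, review date); the class shape and the `is_complete` check are illustrative assumptions, not the published data dictionary.

```python
# Illustrative model of a review-documentation record with the report's
# required/optional split. Field names follow the report's examples; the
# class layout and completeness check are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewRecord:
    # Required information
    originator: str
    origination_date: str      # e.g. an ISO date such as "2023-06-01"
    reviewer: str
    review_date: str
    # Optional information (some agencies require it, others do not)
    reviewer_credentials: Optional[str] = None

    def is_complete(self) -> bool:
        """An auditable record must carry every required field."""
        return all([self.originator, self.origination_date,
                    self.reviewer, self.review_date])

record = ReviewRecord("J. Doe", "2023-06-01", "A. Smith", "2023-06-15")
print(record.is_complete())
```

Storing such records in a plain spreadsheet or database keeps them accessible to an auditor who does not have 3D modeling software.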
Task 8. Execute Methodology Synthesis
Task Overview
The objective of Task 8 was to execute the proposed methodology according to the approved Interim Report No. 1. The core activities were:
- Create an execution plan to guide the team with key activities, timelines, and participants.
- Set up interviews with technical advisers or discipline engineers to obtain feedback from practitioners on the approved proposed methodology.
- Solicit participation from four State DOTs to help the research team execute the methodology.
- Host a web conference to provide instructions, roles & responsibilities, desired outcomes, and time requirements.
Outcomes
The main outcome of Task 8 was a packet of instructions for executing model-based design reviews and 3D model reviews using the proposed methodology. More detail is provided regarding the outcomes of each of the four steps.
Execution Plan
To guide the task, we divided it into seven activities. Table 4 lists the activities, participants, and the timeline during which each activity occurred.
| Activity | Participants | Timeline |
|---|---|---|
| Interview Subject Matter Experts | Alexa Mitchell, Jennifer Steen, Francesca Maier, John Reese, Kevin Martin, Colby Christiansen, Grant Schmitz, Dan Prokop, Daniel Domalik | March – May 2023 |
| Develop review libraries | Francesca Maier, Jennifer Steen, John Reese | March – May 2023 |
| Solicit partnerships from DOTs | Alexa Mitchell, Jennifer Steen, Francesca Maier | March – May 2023 |
| Host a web conference | Alexa Mitchell, Jennifer Steen, Francesca Maier, John Reese | Early June 2023 |
| Develop a packet of instructions | Jennifer Steen, John Reese, Kevin Martin, Marcia Yockey, Grant Schmitz, Julie Rivera, Francesca Maier | May – July 2023 |
| Collect feedback | Francesca Maier, Jennifer Steen | Early September 2023 |
| Create a synthesis report | Francesca Maier, Jennifer Steen, Alexa Mitchell | September 2023 |
Interviews
We interviewed HDR design SMEs and an HDR quality SME. We set up interviews with technical advisers, discipline engineers, and a quality manager with extensive experience in project delivery. Typically, these engineers oversee quality management procedures for projects at HDR, and many of them have significant experience with model-based design as well. The quality manager oversees HDR’s quality procedures for all of HDR’s transportation design projects.
The interviews served two purposes. First, they confirmed the priorities that underpinned the methodology and second, they provided feedback on the approved proposed methodology. Some specific feedback included:
- An emphasis on quality management documentation that is an auditable record of all checks that occurred. For example, the paper-based process involves highlighting plan content as it is checked if there are no markups. We discussed different approaches to documenting model-based checks that do not result in a mark-up or comment.
- A suggestion to combine review types into a smaller number of reviews organized by the typical responsible party for executing the review. Specifically, combining sub-categories of 3D Model Standards and 3D Model Integrity review types.
- The expansion of the resulting five review categories to six with the inclusion of an Estimated Quantities review category. Pay Item breakdowns generally do not correlate well with model element breakdowns, which limits the ability to extract quantities directly from the model. Therefore, quantity estimating is a complex task with limited repeatability (from project-to-project) and reproducibility (from person-to-person).
State DOT Participants
We solicited participation from eight State DOTs to help the research team execute the methodology. The solicitation included at least two states that have already started modifying their quality process to include digital delivery and at least two states that have not. The invited States were: Utah, Montana, Minnesota, California, Pennsylvania, Delaware, Tennessee, and Louisiana.
We developed a packet (Appendix E) and provided an introductory webinar to guide the participating DOTs through the key prerequisites for meeting the objectives of the project. Our aim was to get a broad cross-section of project types, but we relied upon the states to do the testing and did not have control over which projects they used. We developed a survey to receive feedback from the testing States. The survey was available for one month, and we received five responses from two States (Utah and California).
Web Conference and Test Instructions
We hosted a web conference to provide instructions, roles & responsibilities, desired outcomes, and time requirements for testing the packet of review instructions. The packet had the following information:
- Background explaining the research objectives and guiding principles,
- Request for participation and guidance on the type of feedback we seek,
- Assumptions and contact information,
- An index of five review types (i.e., Survey, 3D Model Integrity, 3D Model Standards, Clash Detection and Spatial Coordination, and Design Discipline) with a detailed description of the review,
- A library of procedures to implement a seven-step process for each review type. The seven steps were: Initiate and Prepare, Conduct, Document, Resolution, Revisions, Verification, and Audit. The reviews were defined with the following information:
  - Scope of the review
  - Review information (including documents to review and reference documents)
  - Review process (i.e., procedures to conduct each of the seven steps and core competencies for each procedure)
- Job aids to support review documentation, including checklists and a MET, and
- A glossary of 3D modeling and quality management terms.
Conclusion
The feedback received from participating states was overall positive and confirmed the direction, relevance, and usefulness of the methodology. The feedback packet is provided in Appendix F. However, we received feedback from only five individuals representing two States. The research team concluded that feedback from a larger audience was needed to inform the content to be prepared for the Task 9 Sample Guidebook. To obtain additional feedback, the research team planned and facilitated an in-person workshop to collect input from a wider audience of intended users of the guide. The team prepared the materials for a conference workshop held as part of the 2023 IHEEP Annual Conference in Salt Lake City, UT. This 90-minute workshop was completed on October 4, 2023. The materials used for this workshop were previously shared with the NCHRP project panel as part of the Task 8 Synthesis delivered in the last quarterly report (October 9, 2023) and are also included in this conduct of research report as Appendix G. The IHEEP workshop was well attended, with nearly 50 active participants, and was facilitated by Francesca Maier, Jennifer Steen, and Alexa Mitchell. Rachel Catchings was also in attendance to assist with note taking. As part of the IHEEP workshop preparation, the research team identified that the five review types previously proposed did not cover the important step of checking the quantities taken off from the design for use in cost estimating. We therefore added this to the Discipline Design Review procedures used in the IHEEP workshop.
Task 9. Sample Guidebook
Task Overview
The purpose of this task was to complete a sample guidebook that would be publication ready, including one fully developed chapter as well as detailed descriptions of the other proposed chapters in the form of an annotated outline. The research team chose to develop the content for the proposed Chapter 2 of the guide, on Quality Management Concepts. This chapter serves as an example of what the other chapters of the guide will look like. This section provides the details on how the sample guidebook was developed.
Outcomes
The single outcome of this task is a sample guidebook, which was delivered as a separate document along with Interim Report 2.
The only requirement for Task 9 was to fully develop one chapter. While the development of taxonomies was part of the work needed to develop the Task 8 methodology, they were not all included in the Task 9 Sample Guidebook. More details about what was included in the Task 9 Sample Guidebook are covered in the other chapters. For example, during the development of Chapter 2, the research team identified that briefly introducing the purpose of quality management was necessary to provide context for the detailed discussion in Chapter 2, so it was included as a section in the Introduction chapter of the guide. Another example of information developed as part of Tasks 7 and 8 that was not included in the final sample guidebook is the set of taxonomy tables (Appendices A–D in this document). The research team concluded that the only taxonomy needed as a reference in the guide was the Model Elements Taxonomy. This report includes all the taxonomies developed during Task 8, but only the Model Elements Taxonomy is in the Task 9 Sample Guidebook document.
The Task 9 Sample Guidebook is a good starting point for the work to be conducted in Phase III. A thorough review from the perspective of the intended audience will provide the research team with the opportunity to capture input for developing the full content of the guide that will be valuable to all State DOTs. The research team encourages the project panel to work with their design staff to review the content being proposed and participate in future Phase III reviews and the workshops being proposed for Phase IV. Involvement from the intended audience for this guide is a critical success factor for final development and implementation of the final products for this study.
Task 10. Prepare Interim Report 2
Task Overview
The work performed through Phase II was summarized in Interim Report 2 for completion of Task 10. The report was submitted to the NCHRP project panel on November 22, 2023. The research team awaited approval of that interim report prior to advancing to Phase III of the research project. The team received approval to continue on January 19, 2024.
Phase III. Guide Development
Phase III includes two tasks as outlined in Figure 7.
Task 11. Guide Development
Task Overview
Below is a summary outline of the contents of the guide. The complete guide was distributed to the panelists before the in-person panel meeting, and feedback was solicited. The content of the draft guide represents a culmination of the knowledge gained from this research project.
Title: Quality Management Guidelines for 3D Model–Based Project Development and Delivery
Summary
Discusses the transition to 3D model–based project development and delivery, outlining the benefits and challenges of implementing a paperless quality management system for construction and design projects.
Chapter 1 - Introduction
Provides an overview of the guide’s purpose, scope, and the significance of adopting 3D modeling in design and construction. It emphasizes quality management objectives like risk management, accountability, and aligning with information management standards (e.g., ISO 19650).
Chapter 2 - Quality Management
Covers the principles of quality management systems, including international standards (ISO 9000 series) and practices for managing the quality of software products and digital records. It also highlights project execution with frameworks for defining quality for 3D models.
Chapter 3 - Record Management
Focuses on managing digital records, including standards like ISO 15489 for records management and BIM records management. It addresses challenges such as digital preservation, versioning, accessibility, and cyber security in a 3D model environment.
Chapter 4 - Model Reviews
Details the types of model reviews, the processes involved, and the responsibilities of reviewers. It distinguishes between design intent reviews (e.g., modeling standards and model integrity) and design compliance reviews (e.g., survey review and clash detection).
Chapter 5 - Components of Review
Explores the technical aspects of model reviews, focusing on standards for model development, tools for review (e.g., checklists, automated tools), and job aids. This chapter helps ensure consistent and standardized 3D model reviews.
Chapter 6 – Implementing This Guide
Provides guidelines for integrating the quality management process into an organization, including change management, workforce development, process updates, and leveraging technology. It also suggests strategies for ongoing improvements and tool developments.
Appendices
The report appendices offer supplementary resources, including a glossary of terms, model elements taxonomy, review documentation property set, competencies required, detailed review procedures, and sample quality artifacts.
Task 12. Interim Report #3
Task Overview
The work performed through Phase III was summarized in Interim Report 3 for completion of Task 12.
Phase IV. Final Products
Phase IV includes two tasks as outlined in Figure 8.
Task 13. Present Draft Deliverables (Workshops)
Task Overview
The research team used the approved draft Guide, submitted and approved during Phase III, as the basis for the workshop materials. The team planned and facilitated three (3) virtual two-hour workshops. The content of each workshop built upon the previous one, focusing on specific areas of the guide. The research team met with the same group of people for all three virtual workshops to maintain continuity and in-depth discussion. The purpose of these workshops was to gather input on the technical content and reach consensus on the presentation and content of the final guide.
The research team invited participants from the IHEEP 2023 workshop and extended invitations to the project panel (or their designates), the Joint Technical Committee on Electronic Engineering Standards (JTCEES), and the Committee on Bridges and Structures (COBS) Technology Subcommittee. These three groups brought a diversity of perspectives from various disciplines in both the public and private sectors, as well as from the software vendor community.
The research team recorded the virtual workshops to make them available to those who were unable to attend. The dates for the workshops had been scheduled based on contractual requirements for delivering the final products on time and the availability of the research team. The workshops were held on the following dates:
| Scheduled Event | Event Date | Event Time |
|---|---|---|
| Virtual workshop #1 | 9/5/2024 | 2-4 p.m. EDT |
| Virtual workshop #2 | 9/10/2024 | 12-2 p.m. EDT |
| Virtual workshop #3 | 9/12/2024 | 12-2 p.m. EDT |
The workshops included presenting the guidance to the participants and working through a number of coordinated activities. The first workshop included a benchmarking activity followed by a risk identification activity. In the second workshop, participants rated the identified risks. In the third workshop, participants re-ranked the highest-priority risks, identified mitigation measures, considered the future software landscape, and provided input into preferred features.
Finally, participants optionally completed a feedback survey. Feedback was generally favorable, with the top three areas of need being review procedures, sample job aids, and a review documentation property set. The workshop output is presented in Appendix H.
Task 14. Final Products
Task Overview
The work under this task focused on the development of the final products, including:
Guide: This standalone document, providing suggestions and best practices for performing quality management of 3D model–based project delivery, was approved at the end of Phase III. The research team does not expect any technical modifications to this document.
Conduct of Research Report: This document will include an overview of the research project including a synthesis for each task similar to what has been provided in the interim report. The report will also include a chapter on conclusions and recommendations for future research.
Outreach Materials: Will include the PowerPoint presentation and workshop facilitation plan delivered in Task 13. During Task 14, we will create and deliver a new presentation to be used for the 90-min webinar conducted at the end of the project.
Implementation of Research Findings and Products: A technical memorandum that will include an executive summary of the project and resulting products, as well as recommendations for activities and resources for implementing the final research products. The research team discusses opportunities to use existing technology transfer programs sponsored by AASHTO and FHWA and administered by TRB to support implementation of this guide.