This chapter provides suggestions for implementing this guide and addressing gaps in the design quality–management process created by the introduction of 3D model–based design. These guidelines are organized using the people-process-technology framework.
This framework balances competing needs and identified gaps against how each guideline fits within the people, process, and technology elements, summarized as follows:
Transitioning from a 2D plan–based to a 3D model–based review environment highlights agencies' long-held ideas about the quality management mechanisms used for performing reviews.
While the roles and responsibilities of design team members are not changing dramatically, the required competencies and methods used to perform and document reviews in a model-based environment are in flux. Thus, it is important for agencies to review current job descriptions to identify gaps in required and preferred qualifications. Based on this analysis, agencies can develop and deploy an education and training plan to upskill their workforce using short- and long-term activities suggested in this section.
This section provides suggestions on the core components required to establish appropriate resources, identifies key positions and job responsibilities affected by digital design, and proposes a change management strategy to methodically and efficiently reskill agency staff and their stakeholders.
Multiple agencies have found it extremely valuable to establish an organizational structure or technical working groups guided by a steering committee when transitioning to digital project development and delivery. Dedicated staff with specialized expertise were tasked with creating and deploying action plans for establishing or updating standards and procedures, evaluating and setting up new technology, defining or updating roles and responsibilities (as described in Chapter 4), identifying skill set gaps, and creating education and training programs to fill those gaps. Deploying model-based quality management is only one of many aspects of digital project development and delivery that need to be coordinated and overseen by someone with the appropriate expertise and availability.
The research team suggests the following:
Stakeholders typically involved in recommended activities include
Digital project development and delivery are becoming more widespread, so agencies and industry (e.g., consultants and contractors) need to collaborate with each other and educational
institutions. Agencies and industry can work with trade schools, colleges, and universities to integrate digital delivery technologies and methods into their curricula as part of a long-term vision and partnership to prepare the future workforce. However, this will not fulfill immediate training needs for new quality management processes and workflows among the existing workforce. This section provides suggested short- and long-term strategies for sharing knowledge, creating trainings to fill competency gaps, and advancing the functionality of technology.
The research team suggests the following:
A program that combines short how-to videos and hands-on workshops has become the preferred style for technology training because technology advances so rapidly that traditional training materials quickly become outdated. Creating a training program for agency staff and stakeholders on model review processes and development standards is highly encouraged. Deploying training in small groups provides people with a support network to return to when troubleshooting issues. Just-in-time and on-demand trainings provide answers about specific topics when a reviewer most needs them.
Suggestions for creating training content are as follows:
Suggestions for delivering training are as follows:
This guide’s suggestions for updating quality management processes are based on the five-step process introduced in Chapter 2. The research team has identified three key areas where agencies may need to evaluate and make updates: QA documentation requirements, QC standards and procedures, and tools and job aids.
Agencies need to review their current forms to identify whether they need to be updated and, if so, how to update them. If an agency does not have any QA documentation requirements, establishing those requirements is a necessary first step. However, even if an agency has well-defined QA documentation requirements, the forms may still need to be updated. Appendix C offers a review documentation property set that can be used during this evaluation. Agencies can also consider the guidance for management of digital records provided in Chapter 3.
Agencies need to evaluate their current standards and published procedures to identify what needs to be updated. ISO 19650 provides recommendations for managing digital project files in model-based design; these recommendations should be consulted when establishing or revising standards and procedures related to the quality management of 3D models.
Items to consider during the evaluation include
Agencies need to evaluate their technology stack to determine whether they should acquire new software. Job aids, such as checklists, should be updated to incorporate model-based design concepts.
Items to consider during this evaluation include
Some agencies have been using CADD standards for many years. When transitioning to a model-based design environment, it is important to reevaluate these standards since they were produced to control the appearance of points, lines, and text on a drawing sheet. While CADD standards are still a component of model-based design, they do not provide guidance on model creation. This distinction is important because reviewing 3D model quality depends on the criteria set for how the model should be created. Guidance for establishing modeling standards is summarized as follows:
For detailed guidance, refer to Chapter 5.
As introduced in Chapter 2, quality management programs can use a mix of process and product controls to achieve quality objectives. A well-defined process control framework is likely to result in fewer product controls. This section provides guidance for leveraging modeling software configurations to control the production process, as well as onboarding trainings on the agency’s quality management procedures to control product outcomes.
Modeling software configurations can be a very powerful tool for creating a consistent and repeatable model-based design process. Since the introduction of CADD software, many agencies have standardized the look and feel of contract plan sets. Configuring plan production
tools within the software makes it easier for users to apply established standards rather than working them out on their own. An agency can work with its software vendors and CADD support team to set up the configuration and libraries to follow the defined modeling standards for each discipline. This is typically accomplished by setting up libraries of traditional CADD standards, such as layering, color, line styles, and standard 2D and 3D cells or blocks, and adding a series of assemblies and subassemblies of model-based objects.
Creating 3D object libraries that comply with agency design standards and standard drawings will likely provide at least 80 percent of all model elements necessary to assemble pavement and shoulder structures, bridges, and drainage systems. Modern modeling software has the functionality to create parametric objects, following specific standards, that users can select, drag, and drop when assembling discipline-specific models. The ability to use parametric objects, elements, or cells allows designers to quickly update or automate changes rather than recreating an entire object from scratch. When utilizing parametric objects within a defined library or configuration, agencies can provide workflows for using and updating the objects. Configuring the system to set up modeling tools and software design calculations following agency or national standards will make it easier for users to apply model development standards in a consistent and repeatable manner.
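To make the parametric concept concrete, the following minimal Python sketch shows how changing a single subassembly parameter regenerates the derived cross-section geometry rather than requiring the object to be redrawn. The class name, parameters, and default dimensions are hypothetical and merely stand in for the assemblies an agency would configure in its modeling software.

```python
from dataclasses import dataclass

@dataclass
class CurbAndGutter:
    """Hypothetical parametric subassembly: changing one input
    regenerates the cross-section points instead of redrawing them."""
    curb_height_ft: float = 0.5
    gutter_width_ft: float = 1.5
    gutter_slope: float = -0.06  # ft/ft, falling from the gutter lip toward the curb

    def section_points(self, origin_x: float = 0.0, origin_y: float = 0.0):
        """Return 2D cross-section points derived from the parameters."""
        flowline = (origin_x, origin_y)
        gutter_lip = (origin_x - self.gutter_width_ft,
                      origin_y - self.gutter_slope * self.gutter_width_ft)
        top_of_curb = (origin_x, origin_y + self.curb_height_ft)
        return [gutter_lip, flowline, top_of_curb]

# Updating a single parameter regenerates the geometry consistently.
standard = CurbAndGutter()
mountable = CurbAndGutter(curb_height_ft=0.33)
print(standard.section_points())
print(mountable.section_points())
```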
Agencies may want to consider setting up a methodology for reviewing and deploying new versions of software. The CADD support team needs to work closely with IT when deploying new software. Agencies can coordinate with IT to check for agency requirements related to operating systems, cloud-based usage policies, hardware and infrastructure, and cybersecurity.
Once a software package has been deployed, the agency may also consider establishing standard checking criteria for critical functions within the software (i.e., functions related to calculations that form the basis of the design, such as superelevation calculations). Parametric behavior of templates may be another example of a functionality in the software to spot check using standard validation criteria. Lastly, there needs to be a process for documenting the review procedure that was established for deploying these new software versions.
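As one illustration of such a spot check, the short Python sketch below compares software-reported side-friction values against the point-mass curve relationship e/100 + f = V^2/(15R) used in the AASHTO Green Book after a version upgrade. The test cases, tolerance, and function names are assumptions and would need to reflect an agency's own validation criteria.

```python
import math

def side_friction_demand(speed_mph: float, radius_ft: float, e_percent: float) -> float:
    """Side friction implied by the point-mass curve formula
    e/100 + f = V^2 / (15 R) from the AASHTO Green Book."""
    return speed_mph ** 2 / (15.0 * radius_ft) - e_percent / 100.0

def spot_check(cases, reported, tolerance=1e-3):
    """Compare software-reported friction values against the hand
    calculation after a version upgrade; return any cases that drift."""
    failures = []
    for (v, r, e), f_reported in zip(cases, reported):
        f_expected = side_friction_demand(v, r, e)
        if not math.isclose(f_expected, f_reported, abs_tol=tolerance):
            failures.append((v, r, e, f_expected, f_reported))
    return failures

# Baseline cases recorded before the upgrade (values are illustrative).
cases = [(50, 1000, 6.0), (60, 1500, 8.0)]
reported = [0.1067, 0.08]
for failure in spot_check(cases, reported):
    print("Check failed:", failure)
```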
Standardized onboarding training procedures should be employed as much as possible. Providing standardized guidance on the most effective workflows for using specific software enables users to confidently reproduce the steps that result in quality model–based deliverables. Trainings need to include repeatable and reproducible steps for performing design reviews, as well as checklists and standardized forms for documenting reviews. With the proper documentation in place, the process will enable any qualified person to track the record of decisions for all reviews that are performed.
This section summarizes current software functionality and gaps, and it offers guidance for working with software vendors and the industry to improve the functionality and automation of tools for reviewing model integrity and verifying model-based design and information requirements.
The state of the practice for modeling software is quite advanced, including software in the infrastructure domain. Many software vendors provide a variety of packages for model-based
corridor design, bridge modeling, and hydraulic and drainage modeling. While no software is perfect and the functionality of specific packages could be improved, current modeling tools provide the features that agencies need to deploy model-based project development and delivery methods. Many agencies still need to build or expand their software configurations to implement process control and standard approaches to model development, which would improve the quality, checkability, and usability of the resulting digital data.
Numerous vendors offer technical solutions for managed environments that are compatible with ISO 19650 standards for CDEs. The ISO 19650 series defines a CDE as both the technical solution for sharing data, which vendors provide, and the rules and workflows for collaboration, which an agency needs to establish. Many commercially available tools are compatible with formalized rules and workflows, offering managed access to folders and notifications that trigger when files change status.
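A minimal sketch of the kind of workflow rules an agency might formalize is shown below. The container states follow the ISO 19650 convention (work in progress, shared, published, archive), but the permitted transitions and the notification hook are illustrative assumptions rather than requirements of the standard or of any vendor's CDE product.

```python
# Illustrative container-state rules an agency might define for its CDE.
ALLOWED_TRANSITIONS = {
    "work_in_progress": {"shared"},
    "shared": {"work_in_progress", "published"},
    "published": {"archive"},
    "archive": set(),
}

def change_status(container: str, current: str, new: str, notify) -> str:
    """Apply a status change if the workflow permits it and trigger a notification."""
    if new not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"{container}: transition {current} -> {new} not permitted")
    notify(f"{container} moved from {current} to {new}")
    return new

status = "work_in_progress"
status = change_status("BR-101_superstructure.ifc", status, "shared", print)
status = change_status("BR-101_superstructure.ifc", status, "published", print)
```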
Some aspects of model review and design verification software are currently less advanced than in other fields. Market-driven innovation in automation tools may eventually make it easier for agencies and design teams to perform effective and efficient model quality reviews. But one significant barrier to bringing these tools to market is the diversity of proprietary data models that a checking tool would need to support. Once IFC 4.3 is adopted for infrastructure domains, enabled by its publication as an ISO standard in 2024, new model and design-review tools will likely come to market since vendors will only need to support one schema. Nevertheless, software vendors need to know what types of checking features to implement. Desired features fall into the following categories:
Existing automation tools can be categorized into three types of checks: CADD standards compliance, design code compliance, and 3D design review and clash detection.
CADD Standards Compliance Checks. Most modeling software already offers CADD standards checkers, though they are underutilized. Software should be configured to run a routine that checks specific drawings against a master standards template file or library. This type of checker inspects whether the proper layers, line styles, and colors have been applied to specific geometry in drawings. Third-party add-on software provides the same functionality and may offer additional checking features or customization beyond what the native software offers. One example of CADD standards compliance software is CADconform by Altiva Software; Pencil9's Harmony helps automate standards and configurations for CADD applications.
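The following Python sketch illustrates the logic behind this type of check, assuming element properties (layer, color, line style) have already been exported from the CADD file. The master standards library, layer names, and drawing data are hypothetical and would come from an agency's configuration in practice.

```python
# Minimal sketch of a CADD standards compliance check against a master library.
MASTER_STANDARDS = {
    "RD_EOP": {"color": 3, "line_style": "continuous"},
    "RD_CL":  {"color": 1, "line_style": "centerline"},
}

def check_elements(elements):
    """Return one issue string per element that violates the master library."""
    issues = []
    for element in elements:
        layer = element["layer"]
        expected = MASTER_STANDARDS.get(layer)
        if expected is None:
            issues.append(f"{element['id']}: layer '{layer}' is not in the standards library")
            continue
        for prop, value in expected.items():
            if element.get(prop) != value:
                issues.append(f"{element['id']}: {prop} is {element.get(prop)!r}, expected {value!r}")
    return issues

drawing = [
    {"id": "E1", "layer": "RD_EOP", "color": 3, "line_style": "continuous"},
    {"id": "E2", "layer": "RD_CL",  "color": 5, "line_style": "centerline"},
    {"id": "E3", "layer": "TEMP",   "color": 0, "line_style": "dashed"},
]
for issue in check_elements(drawing):
    print(issue)
```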
Design Code Compliance Checks. Nemetschek's Solibri provides model checking and clash detection capabilities and is used for commercial BIM architectural design. The software must be configured appropriately to compare BIM files against specific codes.
Current corridor modeling software—such as Bentley Systems’ OpenRoads Designer and Autodesk’s Civil 3D—provides the functionality to set up geometric design standards that are used for calculating geometric features of roadways, such as horizontal and vertical alignments and superelevation. These software packages can be configured to provide warnings when a design exceeds geometric requirements (e.g., curve radius, stopping sight distance, and K values) specified in the AASHTO Green Book or other agency standards. Current modeling software can also be used to set up and produce different report types that summarize design parameters, including alignment information, hydraulic calculations, and tables. While not completely automated, these reports provide documentation that reviewers can compare against a model; however, some knowledge of the software is required to open dialog boxes and view the parameters of various design elements.
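As an example of the kind of automated warning these packages can provide, the Python sketch below flags curves whose radii fall below the Green Book minimum-radius relationship R_min = V^2/(15(0.01 e_max + f_max)). The curve list, design speed, and side-friction value are illustrative and are not drawn from any particular software's report format.

```python
def minimum_radius_ft(design_speed_mph: float, e_max_percent: float, f_max: float) -> float:
    """AASHTO Green Book minimum curve radius:
    R_min = V^2 / (15 (0.01 e_max + f_max))."""
    return design_speed_mph ** 2 / (15.0 * (0.01 * e_max_percent + f_max))

def flag_substandard_curves(curves, design_speed_mph, e_max_percent, f_max):
    """Compare curve radii pulled from an alignment report against R_min."""
    r_min = minimum_radius_ft(design_speed_mph, e_max_percent, f_max)
    return [(name, radius, r_min) for name, radius in curves if radius < r_min]

# Curve names, radii, and the side-friction factor below are illustrative.
curves = [("C-1", 1250.0), ("C-2", 820.0)]
for name, radius, r_min in flag_substandard_curves(curves, 60, 8.0, 0.12):
    print(f"{name}: radius {radius} ft is below the {r_min:.0f} ft minimum")
```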
3D Design Review and Clash Detection Checks. Software like Autodesk Navisworks and Bentley Infrastructure Cloud can be used to combine 3D models, navigate them in real time, take measurements, and review files using tools that allow users to post redlines, post comments, and assign comments to specific users. These applications can also run clash detection algorithms (e.g., for hard and soft clashes) and produce reports of identified interferences.
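The sketch below shows a much-simplified hard-clash check based on axis-aligned bounding boxes. Commercial tools operate on exact geometry and support tolerance-based soft clashes; the element names and extents used here are hypothetical.

```python
from itertools import combinations

def boxes_overlap(a, b):
    """a and b are ((min_x, min_y, min_z), (max_x, max_y, max_z)) bounding boxes."""
    (a_min, a_max), (b_min, b_max) = a, b
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

# Illustrative element extents in project coordinates (ft).
elements = {
    "18in_storm_drain":  ((10.0, 5.0, -4.0), (60.0, 6.5, -2.5)),
    "waterline":         ((30.0, 5.5, -3.0), (80.0, 7.0, -1.5)),
    "signal_foundation": ((95.0, 2.0, -6.0), (98.0, 5.0, 0.0)),
}

for (name_a, box_a), (name_b, box_b) in combinations(elements.items(), 2):
    if boxes_overlap(box_a, box_b):
        print(f"Hard clash candidate: {name_a} vs {name_b}")
```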
There has been much development in the deployment and adoption of open data standards and services provided by buildingSMART International. These services and standards are being explored by many European countries as well as AASHTO’s BIM for Bridges Pooled Fund Studies TPF-5(372) and TPF-5(523). Software vendors are working to incorporate these new standards. Agencies may want to stay informed about the development of these standards since this is a potential solution for keeping proprietary file types accessible into the future.
Open data standards rely on the IFC schema, which is used to map model elements within a design to the IFC data structure. How this is executed depends on the proprietary software being used, but the result should be the same—the ability of the modeling software to export one or multiple IFC files. Once IFC files have been exported, a specific sequence of steps must be completed to verify their quality. First, IFC files need to be run against the buildingSMART IFC File Validation Service. This initial check verifies that a file complies with the normative rules of the IFC schema, meaning that if the file contains a bridge, that object should be referenced as IfcBridge, and all bridge subassemblies should likewise be mapped to the proper IFC entities. The second check is to run an information delivery specification (IDS) that identifies all the alphanumeric information required to be part of the IFC file.
The TPF-5(372) BIM for Bridges Pooled Fund Study produced an IDS that is intended to check an IFC file deliverable issued for bidding and construction. The IDS can only check alphanumeric information requirements, so reviewers should still verify that the geometry complies with agency-established modeling standards. The IDS standard can be used to create an IDS file that checks discipline-specific alphanumeric information, including metadata related to the review process, and that file could be used to archive review documentation. The IDS file can be opened with dedicated IDS viewers or, because it is a structured text file, read directly by reviewers comfortable with the markup. There is also IDS editing software that can be used to create individualized, project-specific specifications to check against.
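A simplified stand-in for these two checks can be scripted with the open source ifcopenshell library, as sketched below. The file name, the property set Pset_ReviewDocumentation, and its fields are assumptions used only for illustration; a production workflow would rely on the buildingSMART Validation Service and a published IDS file rather than on a script like this.

```python
# Confirms that bridge entities exist in an exported IFC 4.3 file and that
# each carries an assumed review-metadata property set. This is a highly
# simplified stand-in for a schema validation plus an IDS check.
import ifcopenshell
import ifcopenshell.util.element

REQUIRED_PSET = "Pset_ReviewDocumentation"      # hypothetical property set name
REQUIRED_FIELDS = {"ReviewMilestone", "ReviewedBy", "ReviewDate"}

model = ifcopenshell.open("bridge_model.ifc")   # placeholder file name
bridges = model.by_type("IfcBridge")
if not bridges:
    print("No IfcBridge entities found; check the export settings.")

for bridge in bridges:
    psets = ifcopenshell.util.element.get_psets(bridge)
    review = psets.get(REQUIRED_PSET, {})
    missing = REQUIRED_FIELDS - review.keys()
    if missing:
        print(f"{bridge.Name}: missing review metadata {sorted(missing)}")
```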
BCF (BIM Collaboration Format) by buildingSMART is an open standard with great potential to advance the ability of agencies to conduct and document model-based reviews. BCF is an open data technology that allows different modeling software to communicate model-based issues with each other, provided that design teams are producing IFC models. The IFC models can be shared and opened using model review software that supports BCF. BCF allows the user to create and send comments via a BCF file to others, who can view them in their own model review software. The number of commercial software products currently supporting BCF is limited, but a list is available on buildingSMART International’s website.
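For illustration, the following Python sketch lists the topics contained in a BCF file so a reviewer can see which comments were exchanged. It assumes the BCF 2.x container layout (a zip archive holding one markup.bcf per topic folder), and the file name is a placeholder.

```python
import zipfile
import xml.etree.ElementTree as ET

def list_bcf_topics(path: str):
    """Return (GUID, title) pairs for every topic found in a BCF container."""
    topics = []
    with zipfile.ZipFile(path) as archive:
        for name in archive.namelist():
            if name.endswith("markup.bcf"):
                root = ET.fromstring(archive.read(name))
                for elem in root.iter():
                    if elem.tag.split("}")[-1] == "Topic":
                        title = next((c.text for c in elem
                                      if c.tag.split("}")[-1] == "Title"), "")
                        topics.append((elem.get("Guid"), title))
    return topics

for guid, title in list_bcf_topics("review_comments.bcf"):
    print(guid, title)
```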
Human reviewers are essential for inspecting the nuances of a design, but software can potentially provide automated checks of design standards and of changes made between milestone reviews. Since not all agencies will use the same CADD or design review software, agencies need to define their standards and functional requirements based on performance outcomes rather than prescriptive methods. Current business practices for developing project deliverables for construction may not be adequate in a model-based design environment. This distinction is important because users often pass up opportunities to improve the process simply because a new methodology does not align with their preferences. Agencies must collaborate with software developers and the industry to improve the processes and products currently used for model-based development and delivery.