A successful review process requires a well-organized model development environment and workflow. This chapter elaborates on model environment components that agencies can establish to implement a robust quality management process. The quality management process can be completed within design authoring software or model review applications. Implementing recommendations from this chapter will streamline the review process. In addition to model development aspects and specific model review tools, agencies can look at the following standards for guidance:
NBIMS-US provides standards, guidelines, templates, and other resources for implementing a stable production environment. These standards are continually supported and expanded. Agencies may need to review and adopt future releases—scheduled to be published every three years—as appropriate. ISO 19650, Parts 1 and 2 are international standards that support the consistent management of information on a BIM-enabled project. Agencies can use these standards to assist in the development of a collaborative, model-based production environment.
A standards-based approach to model development provides the design team and reviewers with a structured framework for planning, creating, and verifying model-based deliverables. These standards are the basis for the modeling standards and model integrity reviews. Clearly defined naming conventions, model content, and element information yield finished products that are consistent and predictable. Suggestions in this section provide guidance for the development of foundational components to advance a stable review product.
Information modeling standards specify the LOD and information needed for a specific purpose (i.e., LOIN). (See Section 6.3.2 for suggestions on implementing information standards across an agency.) Information modeling standards are composed of the following key elements, which can be used to establish frameworks for specific types of reviews:
Specific guidance for survey reviews includes the following:
Specific guidance for modeling standards includes the following:
Chapter 3 outlines how the CDE supports processes for approval and records management. The CDE provides a space for collaborative production of federated models that bring together information containers from multiple sources and parties. A CDE workflow protects the security and quality of information throughout production, review, and delivery. Agencies may want to investigate CDE developments or expansions that enhance the use of protocols for 3D model and model-based reviews. Considerations for a well-configured CDE that fully supports review protocols include
Source: PennDOT, with permission.
Table 8. LOD/LOI specification by milestones.
| Model Element | 30%: Min. LOD | 30%: 2D/3D | 30%: Min. LOI | 60%: Min. LOD | 60%: 2D/3D | 60%: Min. LOI | 90%: Min. LOD | 90%: 2D/3D | 90%: Min. LOI | Final: Min. LOD | Final: 2D/3D | Final: Min. LOI |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Outlet structure (drainage) | 100 | 2D | Use 2D shapes to determine location and measure area. | 200 | 3D | Provide 3D solid to represent structure and location. | 300 | 3D | Refine 3D solid to demonstrate details and interaction with pipes; attach attributes. | 300 | 3D | Refine 3D solid to demonstrate details and interaction with pipes; attach attributes. |
| Inlet protection (drainage) | N/A | N/A | N/A | 100 | 2D | Use 2D shapes to determine location and measure area. | 300 | 3D | Model in 3D to output volume quantity. | 300 | 3D | Model in 3D to output volume quantity; attach attributes. |
| Pipe (utilities) | 100 | 2D | Use 2D lines to determine location and measure length. | 200 | 3D | Provide 3D solid to represent pipe. | 300 | 3D | Provide 3D solid to represent pipe; attach attributes. | 300 | 3D | Provide 3D solid to represent pipe; attach attributes. |
| Conduit (utilities) | N/A | N/A | N/A | 100 | 2D | Use 2D lines to determine location and measure length. | 300 | 3D | Provide 3D solid to represent conduit; attach attributes. | 300 | 3D | Provide 3D solid to represent conduit; attach attributes. |
Another aspect of CDEs that should be incorporated into agency processes is the management of information containers (e.g., folders and files) and associated metadata. Within a CDE, information containers can have revision states tagged to identify the current state as a container moves through the review workflow. Suggested states that are permission-controlled include work in progress, shared, published, and archived. A folder’s revision state is modified after the review processes and can be documented through transmittals. Metadata should also be associated with individual files throughout the review process, as discussed in Section 3.2.2.
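The permission-controlled revision-state workflow described above can be sketched as a simple state machine. The state names follow this section (work in progress, shared, published, archived); the role names and transition rules are illustrative assumptions, not prescribed protocol.

```python
# Sketch of a permission-controlled revision-state workflow for a CDE
# information container. State names follow the chapter; the role names
# and allowed transitions are illustrative assumptions.

ALLOWED_TRANSITIONS = {
    "work in progress": {"shared"},
    "shared": {"work in progress", "published"},  # return for rework, or approve
    "published": {"archived"},
    "archived": set(),                            # terminal state
}

# Hypothetical mapping of which role may trigger each transition.
TRANSITION_ROLES = {
    ("work in progress", "shared"): "originator",
    ("shared", "work in progress"): "reviewer",
    ("shared", "published"): "reviewer",
    ("published", "archived"): "cde_manager",
}

def advance(state: str, new_state: str, role: str) -> str:
    """Move a container to new_state, enforcing workflow and permissions."""
    if new_state not in ALLOWED_TRANSITIONS[state]:
        raise ValueError(f"Illegal transition: {state!r} -> {new_state!r}")
    if TRANSITION_ROLES[(state, new_state)] != role:
        raise PermissionError(f"Role {role!r} may not perform this transition")
    return new_state

# Example: a container moving through one review cycle.
state = "work in progress"
state = advance(state, "shared", "originator")
state = advance(state, "published", "reviewer")
state = advance(state, "archived", "cde_manager")
```

Each successful transition is a natural point to generate the transmittal record that documents the state change.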
Documentation of agency-defined CDE protocols should be described in quality management documents. Project-specific protocols should be outlined by the project team in the BEP and project quality plan.
Agencies should adopt consistent naming conventions for the folders and files that make up digital models. Logical names help users effectively and efficiently identify and retrieve data when reviewing models. Most agencies have standardized file and folder naming conventions for traditional files and deliverables. Agencies should review these conventions, paying special attention to 3D model files and delivered models; adapting them based on ISO 19650 standards and best practices will prevent confusion and duplication.
The National Annex to ISO 19650-2 advances naming conventions for information containers that can be adapted based on project scale and complexity. The standard defines a series of field codes separated by delimiters to create a unique string. These naming conventions follow a specific format that provides consistency and scalability for different types of projects. Figure 10 depicts field code types and descriptions that can be expanded through revision metadata and tracked in the CDE.
While an agency need not conform to specific ISO naming conventions, it can reorganize the field codes to accommodate its current definition and provide a standard, yet customizable, naming practice. Agencies can further customize naming conventions to make it easier to track a file’s milestone and status in the quality management workflow.
There are naming restrictions that agencies need to consider, such as string length and the use of specific characters. Certain CDEs limit the number of characters in a file path, and some platforms or operating systems disallow specific characters and words in file and folder names. For example, characters that operating systems or programming languages treat specially (e.g., slashes, colons, or asterisks) are typically not allowed in information container names. Agencies should review these limitations prior to finalizing a standard naming convention.
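A field-code naming convention of the kind described above can be enforced programmatically. The sketch below builds a container name from ISO 19650-style field codes; the specific fields, delimiter, character rule, and length limit are illustrative assumptions an agency would replace with its own standard.

```python
import re

# Sketch of an ISO 19650-style container-name builder and validator.
# The field list, 60-character cap, and alphanumeric-only rule are
# illustrative assumptions, not agency standards.

FIELDS = ["project", "originator", "volume", "level", "type", "role", "number"]
DELIMITER = "-"
MAX_NAME_LENGTH = 60                          # assumed CDE limit
VALID_CHARS = re.compile(r"^[A-Za-z0-9]+$")   # no spaces or reserved characters

def build_container_name(codes: dict) -> str:
    """Join field codes with the delimiter, enforcing basic restrictions."""
    parts = []
    for field in FIELDS:
        code = codes[field]
        if not VALID_CHARS.match(code):
            raise ValueError(f"Invalid characters in {field}: {code!r}")
        parts.append(code)
    name = DELIMITER.join(parts)
    if len(name) > MAX_NAME_LENGTH:
        raise ValueError(f"Name exceeds {MAX_NAME_LENGTH} characters: {name}")
    return name

name = build_container_name({
    "project": "PR1234", "originator": "ABC", "volume": "HW1",
    "level": "ZZ", "type": "M3", "role": "C", "number": "0001",
})
# name → "PR1234-ABC-HW1-ZZ-M3-C-0001"
```

Because the validator rejects disallowed characters and over-length strings before a file is shared, naming problems surface at authoring time rather than during review.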
Design authoring software enables agencies to create a configuration that standardizes how product resources are used to design and draw objects to align with specific modeling and design standards. Defining an agency software configuration gives users the resources to repeat the same process, providing consistency across files and projects. Utilizing a well-defined software configuration allows manual or automated review processes to check against the defined standards.
Software configurations are installed along with a software application to coordinate symbology standards (e.g., line styles, labeling), the design standards file (e.g., superelevation calculations), and resources (e.g., drawing or sheet layouts). Terms used to describe software configurations vary across software platforms and include “workspaces,” “support configurations,” “state kits,” and “country kits.”
Implementing configurations for agency software is challenging because software, platforms, and operating systems undergo continual updates. Agencies need to be aware of these updates and plan accordingly, particularly when updates require new versions of software configurations. Detailed processes to implement and manage these versions can support file compatibility, reduce the risk of file corruption or data loss, and provide consistency between projects and collaborations.
There are also numerous advantages of implementing an agency software configuration, such as
Most agencies have defined software configurations for their design authoring software and platforms. Additional information and configuration management may be required to facilitate the quality management of 3D models created in these platforms. Defining the following aspects of a software configuration will advance modeling standards reviews for 3D models:
Adoption will take several years, but the transportation industry is shifting toward wider usage of open data standards, which can be used to collaborate outside of proprietary software and sidestep versioning issues. While IFC, associated data dictionaries, and information delivery manuals are being developed for transportation projects, agencies have an opportunity to set up their software configurations for future implementation. Building out the proper software configuration and documentation is important for the quality management process, and it allows for future automation of design standards.
Model management establishes parameters for documenting how designs are modeled. When designers include documentation of model contents along with their models, reviewers can quickly understand what is included (or omitted) and what needs to be inspected. Agency-defined modeling standards give designers requirements to follow and can also be validated against in review. Organizing and documenting a model’s contents improves transparency and accessibility for information reviewers as they complete the different review types discussed in Chapter 4. Two model management tools that can help reviewers understand the content within a model include the BEP and MET.
As explained in Section 2.2.3, the BEP is provided to reviewers for their reference to help them understand the project needs and implementation process for model generation and management. Reviewers oversee the model management procedures outlined in the BEP and confirm that they are implemented correctly. Agencies can refer to several BEP guidance documents within the architecture, engineering, and construction industry when developing or expanding their own. The NBIMS-US BEP guide is one example agencies can use.
As defined in Section 3.3.1, the MET helps manage how a model develops and communicates expectations for interdisciplinary coordination at each milestone. To incorporate a MET into the formal review process, agencies can
Using a 3D model (as opposed to paper or PDF) changes the medium for sharing information, but the information requirements for assets being constructed remain the same as in a traditional plan set. Reviewers need a sound process for verifying that a model contains the design information necessary for the contractor and meets design criteria and calculations (e.g., clearance requirements, sight distances, element thicknesses and dimensions, transition rates, and material types). This section describes review tools and job aids, such as checklists, reports, automated tools that can be used as quality artifacts, and review software, that can streamline 3D model reviews.
A checklist is a foundational quality management tool that provides reviewers with clear criteria, in a specified order, and becomes a quality artifact once the review process is complete. Checklists are just one type of job aid that can be used as part of a comprehensive review process alongside other review artifacts. Agency checklists standardize centralized review processes, mitigating risks that have been identified based on the current business context. Agencies must review their checklists periodically as part of the PDCA cycle.
Most agencies have checklists for design reviews that reflect the practices and standards defined in design manuals. A challenge that many agencies are currently confronting is how to develop checklists for 3D modeling standards and model integrity reviews. These standards are based on the CADD/BIM manual, model development standards, LOIN standards, and model integrity checks that can be automated, like clash detection for overlapping elements.
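At its simplest, the clash detection mentioned above reduces to testing element bounding boxes for overlap. Real tools refine hits with exact geometry and clearance rules; in this sketch each element is reduced to an axis-aligned box (xmin, ymin, zmin, xmax, ymax, zmax), which is a simplifying assumption.

```python
# Minimal bounding-box overlap test underlying automated clash detection.
# Each element is an axis-aligned box: (xmin, ymin, zmin, xmax, ymax, zmax).

def boxes_clash(a, b, tolerance=0.0):
    """True if two boxes overlap (optionally grown by a clearance distance)."""
    return all(
        a[i] - tolerance < b[i + 3] and b[i] - tolerance < a[i + 3]
        for i in range(3)
    )

def find_clashes(elements):
    """Return pairs of element IDs whose bounding boxes overlap."""
    ids = list(elements)
    return [
        (ids[i], ids[j])
        for i in range(len(ids))
        for j in range(i + 1, len(ids))
        if boxes_clash(elements[ids[i]], elements[ids[j]])
    ]

elements = {
    "pipe-101":    (0.0, 0.0, 1.0, 10.0, 0.5, 1.5),
    "conduit-205": (5.0, 0.2, 1.2, 6.0, 0.4, 1.4),   # runs through the pipe
    "inlet-330":   (20.0, 0.0, 0.0, 21.0, 1.0, 1.0), # well clear of both
}
# find_clashes(elements) → [("pipe-101", "conduit-205")]
```

A positive `tolerance` turns the same test into a clearance check, flagging elements that come closer than a required separation even if they do not intersect.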
Creating a single quality management checklist that encompasses all disciplines and model review types would be difficult. Defining checklists for specific review types used at different points in the project review timeline provides standardization and flexibility. Agencies may already have discipline or milestone review checklists that are used for traditional quality management processes. Developing multiple checklists allows agencies to supplement or revise what already exists.
Reviewers will continue to inspect the same design content, but information is organized differently within a 3D model than it is on paper. For example, verifying station and offset information on a plan sheet requires checking annotations at points required by the checklist. To examine those same points in a 3D model, software tools are used to reveal location data of selected points in the model environment. Thus, the process of finding information differs even though the checklist content does not change. Agencies can expand their current discipline-specific checklists to include aspects unique to 3D models (e.g., element attributes) or incorporate review tools available in the 3D environment (e.g., specifying clash detection rules and routines to assess constructability and verify compliance with design standards).
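The station/offset lookup described above can be sketched for the simplest case. The snippet projects a point onto a polyline alignment and reports the distance along the alignment (station) and the signed perpendicular distance (offset, positive to the right of the direction of travel). Treating the alignment as a polyline is a simplifying assumption; real alignments include curves and spirals that review software handles natively.

```python
import math

# Station/offset of a point relative to a polyline alignment
# (a simplifying assumption standing in for a true horizontal alignment).

def station_offset(alignment, point):
    """Return (station, offset); offset is positive right of travel."""
    px, py = point
    best = None
    chainage = 0.0
    for (x1, y1), (x2, y2) in zip(alignment, alignment[1:]):
        dx, dy = x2 - x1, y2 - y1
        length = math.hypot(dx, dy)
        # Parameter of the perpendicular foot, clamped to the segment.
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / length**2))
        fx, fy = x1 + t * dx, y1 + t * dy
        dist = math.hypot(px - fx, py - fy)
        # Cross-product sign: negative means the point is right of travel.
        side = dx * (py - y1) - dy * (px - x1)
        if best is None or dist < best[0]:
            best = (dist, chainage + t * length, -1.0 if side > 0 else 1.0)
        chainage += length
    dist, station, sign = best
    return station, sign * dist

# A point 3 units right of a straight east-west alignment, 50 units along it.
station, offset = station_offset([(0.0, 0.0), (100.0, 0.0)], (50.0, -3.0))
# station → 50.0, offset → 3.0
```

This is the computation a review application performs behind the scenes when a reviewer clicks a model element to read its station and offset.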
Checklists are static documents, separate from model files. Users have to go outside a model to look at reviewer comments and see how they were addressed. When used as the quality management record, reviewers and CDE managers need to act with care to verify that checklists and models are stored appropriately in relation to each other. If a checklist contains a link or file path to a model, it needs to be updated when files are moved.
Some types of 3D model–review checklists that agencies may not have developed yet include the following:
Appendix F contains sample quality artifacts and checklists.
Traditional plan sets contain sheets with tabular data and information, such as alignment data, bridge rebar tables, bearing seat elevations, and drainage tables for pipes. In a model-based environment, this information is developed within a 3D model and can be extracted through generated reports for review and verification. These outputs can be reviewed as standalone documents or imported into analytical design tools to verify that a model matches the design.
Reports can be generated through design authoring or design review software. These software packages typically allow for limited report customization.
Additional types of reports can be generated to identify nonconforming elements based on design or modeling standards. These reports can be used to track whether changes have been addressed using attribution within property sets. Utilizing computer-generated reports is much faster than conducting a lengthy, manual, element-by-element check.
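A nonconformance report of the kind described above can be sketched by comparing each element's recorded LOD against the minimum required at the current milestone (here, the 60% values from Table 8). The element records and property names are illustrative assumptions about how attribution might be exported from a model.

```python
# Sketch of a generated nonconformance report: flag elements whose LOD
# falls below the milestone minimum. Element records are assumptions
# about how model attribution might be exported.

MIN_LOD_60 = {
    "outlet structure": 200,
    "inlet protection": 100,
    "pipe": 200,
    "conduit": 100,
}

def nonconformance_report(elements, min_lod):
    """List elements whose LOD falls below the milestone minimum."""
    report = []
    for element in elements:
        required = min_lod.get(element["type"])
        if required is not None and element["lod"] < required:
            report.append(
                f"{element['id']}: {element['type']} at LOD {element['lod']}, "
                f"requires LOD {required}"
            )
    return report

elements = [
    {"id": "OS-01", "type": "outlet structure", "lod": 200},
    {"id": "P-114", "type": "pipe", "lod": 100},  # still modeled as 2D lines
    {"id": "C-031", "type": "conduit", "lod": 100},
]
# nonconformance_report(elements, MIN_LOD_60)
# → ["P-114: pipe at LOD 100, requires LOD 200"]
```

Swapping in the 90% or final-model minimums reruns the same check at later milestones, so one script can serve the whole review timeline.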
Agencies can define the review process to require that designers provide generated reports along with 3D model deliverables or to establish workflows for reviewers to generate reports themselves. If reviewers need to generate reports using specific software, additional core competencies and training should be provided.
Like generated reports, automated software tools provide a faster way of conducting reviews than manual methods. Several tools are included in design authoring, design review, and construction software that check infrastructure models against design and CADD standards. These automated tools enhance the efficiency of design teams and reviewers. Reviewers can use software outputs to document whether a model complies with standards or use a checklist while running these automated software tools to document the model’s compliance (or lack thereof). Examples of these tools include
At the time of writing, a number of software applications have been developed for 3D model reviews, but software for reviewing 3D models of transportation projects is still in its infancy. This immaturity makes it difficult to train reviewers or generate buy-in for and acceptance of 3D models. While current technology has some limitations, the development of 3D model–review tools is rapidly advancing. It is important for agencies to identify a review application that can be integrated within the CDE and technology stack, is compatible with different file types, and performs well with datasets of varying sizes.
When selecting a 3D model–review application to conduct the review types outlined in Chapter 4, agencies should consider the following functional requirements:
3D models of transportation projects need a reference framework for geospatial coordination within the review software. Similar to requirements for a 2D plan set review, reviewers need the ability to check station and offset information tied to an alignment or another linear referencing system. Review software applications must also let users inspect project datums, project coordinates, and projections.
Review software should be capable of providing model information in 2D and 3D views. 3D models can be rotated to capture different viewing angles, but reviewers—especially those new to 3D model reviews—still depend on 2D perspectives to examine details within a 3D model. 2D views should be generated from a 3D model and may include roadway or drainage profiles, cross sections, and planar sections from any angle. Agencies can define standard 2D views that need to be provided by the designer within review software, or they may document procedures for reviewers to interrogate the 3D models to create the correct 2D views.
Using software to visualize data is a powerful tool for reviewing 3D models. Outside of BIM/CADD, data visualization implies using colorful graphs and charts to clarify patterns within datasets. 3D information models developed for transportation projects are built using complex datasets, and applying data visualizations clarifies where various types of data exist within a model. Element information like classifications, materials, pay items, build order, associated specifications, review date, or review status can be tied to display styles, indicating through color or transparency any object with the queried properties. This allows reviewers to quickly identify the specific attributes they need to check, minimizing time-consuming tasks.
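The attribute-driven styling described above amounts to mapping a queried element property to a display style. In this sketch the property is review status; the status values and RGBA colors are illustrative assumptions.

```python
# Sketch of attribute-driven display styling: elements matching the queried
# property get a highlight color, and everything else fades into context.
# Status values and RGBA colors are illustrative assumptions.

STATUS_STYLES = {
    "approved":    (0, 170, 0, 255),    # solid green
    "in progress": (255, 165, 0, 255),  # solid orange
    "rejected":    (200, 0, 0, 255),    # solid red
}
DIMMED = (128, 128, 128, 60)            # nearly transparent gray

def display_style(element, highlight_property, styles, default=DIMMED):
    """Return a color for an element; unmatched elements recede from view."""
    return styles.get(element.get(highlight_property), default)

elements = [
    {"id": "P-114", "review_status": "in progress"},
    {"id": "OS-01", "review_status": "approved"},
    {"id": "C-031"},  # never reviewed: no status attribute at all
]
styled = {e["id"]: display_style(e, "review_status", STATUS_STYLES)
          for e in elements}
# styled["C-031"] → (128, 128, 128, 60): unreviewed elements are dimmed
```

Keying the same function on classification, pay item, or build order instead of review status reproduces the other visualizations mentioned above without any change to the mechanism.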
Another necessary functionality within review software is comment resolution. Reviewers can place or assign comments and issues on any component, view, section, or dimension in either 2D or 3D. Every response to the initial comment, including how it will be resolved and the status of that resolution (e.g., in progress, rejected, or completed), is documented by the review software. Comment resolution functionality supports the PDCA method and provides greater efficiency and a superior audit trail compared to traditional paper methods because the software can maintain these comments within a model through updates and changes. However, not all current software fully supports commenting on every portion of a model, so agencies should monitor how review tools evolve to close this gap.