Quality Management for Digital Model–Based Project Development and Delivery (2025)

Chapter: APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET

Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.

The text reads: N C H R P 10 dash 113: Quality Management for 3 D Model-Based Project Development and Delivery. Page 1. Background: The objective of this research is to develop a guidebook for reviewing 3 D model-based deliverables, including data validation and documentation procedures. The desired outcome of this study is for the guidebook to serve as a national industry reference that clarifies the standard of care for 3 D model-based project delivery and provides a consistent, repeatable, reproducible, and traceable quality management process that is equal to or better than existing paper-based processes. Consistency: The level of performance in following procedures does not vary greatly over time. Repeatability: The methodology has been tested by the same person using the procedures many times to verify the review process is correct so that it can be standardized. Once the methodology has been established, users can work toward creating reproducibility. Reproducibility: The procedures capture the steps for performing each review well enough that they can be reproduced by any person to achieve the same results. Traceability: The process enables any person to track the record of decisions for all reviews performed. The work for this study is being conducted in four phases, as illustrated in Figure 1. Phase 1 was completed in January 2023. Figure 1: N C H R P 10 dash 113 Project Scope of Work is a flowchart of the four phases in the project's scope of work. An arrow points rightward from Phase 1 through Phases 2 and 3 to Phase 4. Phase 1 is Planning. Phase 2 is Data collection and methodology development. Phase 3 is Guidebook development. Phase 4 is Final products. 
(end of figure) Phase 1 included a review of current and previous research publications and reports, as well as a cursory review of state policies and statutes, to determine which data should be collected in Phase 2. The research team then proposed a methodology and an outline for the final Guidebook, which was accepted by the project panel members.


The text reads: Page 2. As part of Phase 2, the research team has: 1, Collected a variety of documents related to current quality management processes employed by State Departments of Transportation (D O T’s). 2, Organized the information that was collected. 3, Created a library of review types and needed procedures. 4, Collected a set of terms related to 3 D modeling and quality documentation to be included as a glossary in the final guidebook. In Phase 3, the N C H R P 10 dash 113 project panel will review the updated methodology and the results of the Phase 2 testing to approve the development of the guidebook. Before development of the entire guidebook begins, the N C H R P 10 dash 113 project panel will work with the research team to develop a sample chapter of the guidebook to get a feel for what the final publication may look like. Finally, Phase 4 of the project will focus on the production of the final deliverables, which will include: 1, A final guidebook that will serve as a national reference for State D O T’s for quality management of 3 D model-based project development and delivery. 2, A final research report that will include an overview of the research project, a literature review synthesis, the methodology used to create the guidebook, lessons learned during the methodology testing, and conclusions and recommendations for future research. 3, A data dictionary that can be used by software providers during development. 4, Outreach materials (e.g., webinars and presentation slides). Request for Participation: As indicated during the kickoff web conference on June 12, 2023, the research team is requesting assistance from State D O T’s to test the proposed methodology. What is the ask? We are asking each participant to select a 3 D model data set from an existing or completed project of their choice that can be used to perform one or more model-based reviews. 
It is not a requirement for the reviewer to be a project team member, but we ask that you select someone who meets the core competencies needed for the type of review. The research team has defined five types of reviews: survey, 3 D model integrity, modeling standards, clash detection and spatial coordination, and discipline design reviews. Each of these reviews is defined in Appendix A - Review Types Definitions. Each of the reviews listed in Appendix A has a list of procedures in Appendix B - Procedures Library. Appendix C - Job Aids provides a glossary of terms and a model element table (M E T) that can be used as a checklist. What will you need to test the methodology? 1, A project data set that includes roadway, bridge or structure, utilities, and drainage 3 D models is preferred, but at a minimum a roadway 3 D model is needed to test the methodology. Please note that a clash detection review can only be performed if at least two disciplines are modeled.


The text reads: Page 3. 2, Modeling software (for experienced modelers) or model review software (for non-modelers). A. If design review software is not available, a design review may still be performed as an over-the-shoulder review, in which a modeling software expert walks the non-modeler reviewer through the software to follow the procedures. 3, Design standards used for each of the discipline designs being reviewed. The specific standards needed for each type of review are outlined in the individual procedures. 4, Agency or project required C A D D and modeling standards for the model integrity and modeling standards reviews. 5, The appendices provided as part of this document. 6, Agency or project quality documentation forms (examples are provided in Appendix C). 7, Reviewers who are available in the next three weeks to perform the reviews and who meet the core competencies for the specific selected review. What feedback are we asking for? The research team has prepared general information and high-level procedures (Appendices A, B, and C) for each type of review, as well as job aids, such as checklists, glossaries, and documentation forms. We kindly request that you finish testing the methodology no later than August 31, 2023. The research team would like you to provide feedback on the clarity and usefulness of the information packet documents by answering a post-review online survey, which will be sent out prior to August 31. Below are examples of the information we may include in the survey. 1, Review types feedback. A. What software did you use to perform the review? B. What disciplines were modeled in your data set? C. What standards did you use during the reviews? Did you have all the required standards to check against? Which standards did you not have? 2, Procedures feedback. A. Can you please rate the usefulness of the procedures? B. What would you change in the procedures? C. What would have been useful to have in the procedures? 
3, Reviewer experience feedback. A. Did you have the appropriate people to perform the reviews? Did they meet the appropriate competencies? If not, which competencies should we highlight as needing to be developed? B. What documents in the information packet did you find most helpful? 4, Documentation feedback. A. What did you use to document the check that was performed? B. What type of documentation would you need in the future?


The text reads: Page 4. Assumptions: 1, Each project being reviewed has at least one discipline model. 2, The State D O T or design consultant will perform the review on a volunteer basis (the N C H R P 10 dash 113 budget does not cover the time for participation). 3, Each state lead will assign project team members with the core competencies to perform the reviews with the instructions provided. 4, Each project team member will perform the specific type of review selected using their choice of software. The research team will not provide modeling or reviewing software. 5, The research team will be available to answer questions related to the instructions provided for testing the methodology. 6, The research team will not provide technical support related to software functions. All instructions focus on performance-type specifications and thus are software agnostic. N C H R P 10 dash 113 Research Team Contact: As you start testing the methodology, please feel free to contact our team with any questions or comments. Alexa Mitchell, Project Manager and Co-Principal Investigator. Email: alexa dot mitchell at hdrinc dot com. Phone: 602.245.7056


The text reads: Page 5. Appendix A. Review Types Definitions. This appendix provides the definitions for the five types of reviews identified by the research team to be included in the final guidebook delivered for this study. Survey Review: This type of review checks for compliance with agency geomatics or survey specifications for developing and delivering existing conditions models, including geodetic control, topographic base models, 3 D surfaces, land or mineral boundary and right-of-way (R O W) information, and metadata. 1, Geodetic Control - process for verifying the location and accuracy of the primary and secondary control monuments to support construction, as well as compliance with the required accuracy and point density for the project. This review also verifies metadata about the project, such as coordinate systems, datums, geoids, projection factors, etc. 2, Topographic Features and 3 D Surfaces - process for verifying all topographical geometry has been mapped correctly and has generated an accurate 3 D surface. Specifically, check that each feature has been assigned the appropriate feature code and is represented correctly in the base map drawing. This process also checks for the proper inclusion of points, break lines, voids, and other survey features in the 3 D surface. 3, Land Boundaries and Right-of-Way (R O W) - process for checking that 2 D land and boundary features and geometry are accurately represented and meet the survey standards. 2 D land boundary features and geometry may include land corners, property lines, existing and proposed right-of-way lines, and easement boundaries. Some states also require a Mineral Boundary survey, which is similar to a land boundary survey and captures boundaries, features, and information related to mineral claims of natural resources. 
3 D Model Integrity Review: This type of review checks the overall integrity of each model element within each individual discipline model, as well as the federated model (the combined model of all the disciplines). Specifically, a model integrity review checks for: 1, Geometric Accuracy - A reviewer should check that the 2 D and 3 D linework is contiguous, lines do not cross, and the geometry represents the appropriate locations, elevations, and slopes. 2, Surface Accuracy - A reviewer should check for appropriate triangulation and smoothness of the design model. Surface triangulation boundaries and break lines should not cross. A surface also should not have spikes, sudden slope changes, or holes. This review should also check for appropriate point density and triangle sizes, and the surface should be checked for ponding. 3, Completeness of Content - A reviewer should check that the model is both complete and clean. The reviewer should verify all elements required by the model element breakdown structure (M E B S) for the project are included in the model, there are no (continued next page)


Continued from previous page, the text reads: Page 6. There is no duplication of 2 D or 3 D geometry, and the overall geometry is free of errors and omissions. 3 D Modeling Standards Review: This type of review checks compliance with model development standards that communicate the expectations for creating model deliverables. Modeling standards typically include computer-aided drafting and design (C A D D) and model development standards, such as level of development (L O D), level of information need (L O I N), and metadata associated with the data set or sets. 1, L O D Specifications - provide the minimum requirements for each model element within a discipline and or project model(s). L O D is defined by how closely a virtual element represents or resembles its real-world counterpart, including geometric dimensional accuracies and engineering confidence levels, such as physical dimensions and graphical characteristics like shape, size, and location. 2, L O I N Specifications - include L O D specifications as well as the requirements for information and documentation of all milestone deliverables for a specific project. 3, C A D D Standards - provide guidelines for consistency in the development of technical drawings, including file structure, data management, drawing templates, file naming conventions, and the use of libraries that establish layering, styles, and symbology conventions, as well as standard 2 D and 3 D objects. 4, Metadata - checks that metadata, such as geospatial information, is applied correctly to each drawing file, that project identifiers (project numbers and names) enable searching, and that compliance with the quality process is documented. Clash Detection and Spatial Coordination: This type of review verifies that two elements do not occupy the same space at the same time, checks for clearances required by the project or design intent, and supports construction and maintenance activities. 
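As an illustration only (not part of the review packet), the overlap rule just described can be sketched with axis-aligned bounding boxes. Production clash software works on exact solid geometry; the element names and coordinates below are hypothetical.

```python
# Minimal hard-clash sketch using axis-aligned bounding boxes (AABBs).
# Illustrative only: real clash detection uses exact element geometry.

def boxes_overlap(a, b):
    """a and b are ((xmin, ymin, zmin), (xmax, ymax, zmax)) tuples."""
    (ax0, ay0, az0), (ax1, ay1, az1) = a
    (bx0, by0, bz0), (bx1, by1, bz1) = b
    # Two boxes overlap only if their extents overlap on all three axes.
    return (ax0 < bx1 and bx0 < ax1 and
            ay0 < by1 and by0 < ay1 and
            az0 < bz1 and bz0 < az1)

def hard_clashes(elements):
    """elements: dict of name -> AABB. Returns clashing name pairs."""
    names = sorted(elements)
    return [(m, n) for i, m in enumerate(names) for n in names[i + 1:]
            if boxes_overlap(elements[m], elements[n])]

# Hypothetical elements from two disciplines (drainage and structures).
pipe = ((0, 0, 0), (10, 1, 1))
footing = ((8, 0, 0), (12, 4, 2))   # overlaps the pipe
barrier = ((0, 5, 0), (10, 6, 1))   # clear of both
print(hard_clashes({"pipe": pipe, "footing": footing, "barrier": barrier}))
# -> [('footing', 'pipe')]
```

A soft-clash variant would expand each box by the required buffer distance before testing overlap.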
Specifically, this review checks for: 1, Hard Clashes - verifies that two elements do not occupy the same space at the same time or physically collide with one another. 2, Soft Clashes - verifies there are appropriate buffers around elements to allow for uncertainty (e.g., existing utility locations) and that elements will not interfere with construction equipment or future maintenance activities. 3, Sight Distance - verifies the geometric design and 3 D objects provide the required horizontal or vertical sight distance, stopping sight distance, and lateral line of sight at an intersection. Discipline Design Review: This type of review checks the overall functionality of the design and compliance with project requirements for design standards, design intent, and project milestone deliverables, quantities, and cost estimates. Design reviews specifically check for: (continued next page).


Continued from previous page, the text reads: Page 7. 1, Design Codes or Standards Criteria - Checks for proper use of discipline-specific design codes and or standards criteria as required by the agency design manual or manuals, and for conformance with project milestone requirements. 2, Design Calculations - Checks that engineering calculations are correct based on the design code and or standards criteria for the design discipline. 3, Constructability - Checks the overall functionality of the design and its constructability. A constructability review can also check for design staging. 4, Quantities - Checks that the design quantities entered in the engineering cost estimate match the model quantities and that the quantity items are assigned to the correct pay item number or cost code. This process may be automated if the software provides that functionality. 5, Cost Containment - Also known as a value engineering review, checks for design value based on various design alternatives for controlling overall cost without compromising the design intent and safety.
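As a sketch of how the Quantities check might be automated where software allows it, the example below reconciles model quantities against cost estimate entries by pay item number. The pay item numbers and the one percent tolerance are illustrative assumptions, not agency values.

```python
# Illustrative reconciliation of model quantities vs. the cost estimate.
# Pay item numbers and the 1% relative tolerance are hypothetical.

def reconcile_quantities(model_qty, estimate_qty, rel_tol=0.01):
    """Both inputs: dict of pay item number -> quantity.
    Returns a list of (item, issue) findings for the review record."""
    findings = []
    for item in sorted(set(model_qty) | set(estimate_qty)):
        if item not in estimate_qty:
            findings.append((item, "missing from cost estimate"))
        elif item not in model_qty:
            findings.append((item, "missing from model"))
        else:
            m, e = model_qty[item], estimate_qty[item]
            # Flag quantities that differ by more than the tolerance.
            if abs(m - e) > rel_tol * max(abs(m), abs(e)):
                findings.append((item, f"model {m} vs estimate {e}"))
    return findings

model = {"203-0100": 1520.0, "304-0200": 880.0, "602-0300": 45.0}
estimate = {"203-0100": 1520.0, "304-0200": 910.0}
for item, issue in reconcile_quantities(model, estimate):
    print(item, "->", issue)
```

Each finding would then be logged in the review documentation and assigned for resolution, as the procedures describe.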


The text reads: Page 8. Appendix B - Procedures Library. The Procedures Library is a series of documents that describe high-level performance-based instructions that can be applied to each of the types of reviews. The procedures were created using a performance or task-based approach to give each D O T the flexibility to add its unique methods, requirements, tools, and software. Each of the procedures follows a similar process consisting of the following seven key steps: 1. Initiate and Prepare a Review. 2. Conduct Review. 3. Document Review. 4. Resolve Review Comments. 5. Make Revisions (address comments). 6. Verify Revisions (verify comments were addressed). 7. Audit. In case these procedures need to be more explicit, we would love your feedback at the end of your review on: 1, Procedures that were most useful. 2, Procedures that need more detail. 3, Procedures that should be combined. 4, Procedures for a type of review not included. Table B-1. List of Documents in Appendix B. 1. N C H R P 10 dash 113 - 3 D Modeling Integrity Review. 2. N C H R P 10 dash 113 - 3 D Modeling Standards Review. 3. N C H R P 10 dash 113 - Clash Detection and Spatial Coordination Review. 4. N C H R P 10 dash 113 - Discipline Design Review. 5. N C H R P 10 dash 113 - Survey Review. Source of documents: N C H R P 10 dash 113 research team.
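The seven key steps lend themselves to a simple traceability record of who completed each step and when. The sketch below is one possible representation under no particular agency tooling, not a prescribed implementation.

```python
# Minimal sketch: the seven key steps as an ordered workflow with a log,
# illustrating the traceability goal (who did what, in what order, when).
from datetime import datetime, timezone

STEPS = [
    "Initiate and Prepare a Review", "Conduct Review", "Document Review",
    "Resolve Review Comments", "Make Revisions", "Verify Revisions", "Audit",
]

class ReviewRecord:
    def __init__(self, review_type):
        self.review_type = review_type
        self.log = []   # (step, person, timestamp) entries, in order

    def complete_step(self, person):
        # Steps must be completed in sequence; the log length tells us
        # which step is next.
        step = STEPS[len(self.log)]
        self.log.append((step, person, datetime.now(timezone.utc)))
        return step

record = ReviewRecord("Survey Review")
record.complete_step("A. Reviewer")
print(record.log[0][0])   # -> Initiate and Prepare a Review
```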


The text reads: Page 9. Survey Review: Scope. This document provides guidance for reviewing the survey model for compliance with agency geomatics or survey specifications for developing and delivering existing condition models. Existing condition models have three major components to check: 1, Geodetic Control. 2, Topographic Features and 3 D Surfaces. 3, Land Boundaries and Right-of-Way (R O W). For each major component review, a metadata check is performed to verify that the appropriate geospatial information and attribution are established within the model files. Review Information. 1, Table B-2 provides the reference documents needed to review geodetic control data. 2, Table B-3 provides the review process specific to geodetic control data checks. 3, Table B-4 provides the reference documents needed to review topographic features and 3 D surfaces. 4, Table B-5 provides the review process specific to topographic features and 3 D surface checks. 5, Table B-6 provides the reference documents needed to review land boundary and right-of-way data. 6, Table B-7 provides the review process specific to land boundary and right-of-way checks. The core competencies listed in Tables B-3, B-5, and B-7 will help to identify the responsible individual or individuals intended to carry out the associated steps of the procedure. If the responsible individual does not meet the required competencies to review information directly in the modeling or model review software, then a two-person team will be needed: one person with expert knowledge of the modeling software to assist the reviewer with an over-the-shoulder review, and a second person who is the subject matter expert performing the review. The preparation of the documents to be reviewed can be executed by any qualified person except the individuals who will be conducting the review or auditing the review process. 


The text reads: Page 10. Deliverable: Geodetic Control. Scope of Review: This review provides a process with procedures for verifying the location and accuracy of the primary and secondary control monuments that support construction. Table B-2. Review and Reference Documents. Document: Geomatics or Survey Manual. Purpose: Provides survey control requirements, survey datums and coordinate systems, field codes, and deliverables. The manual also should include a classification of accuracy standards. Document: Survey Report. Purpose: Provides information about how the control was established and the metadata associated with the data sets being reviewed. Document: Survey Model or Field Book. Purpose: Provides location and elevation information for the control. Document: Review Checklist. Purpose: An agency-developed list to provide the reviewer with a way to maintain consistency during the review. Document: Review Documentation. Purpose: Provides a means of documenting the review process. Review Process: Table B-3 outlines the review process to be followed for the Geodetic Control portion of the review. The core competencies will help to identify the responsible individual or individuals intended to carry out the associated steps of the procedure. Table B-3. Process and Procedures. Column headers: Process, Procedures, and Core Competencies. Process: Initiate. Procedures: 1. Identify the individuals who will be acting as reviewers and verify they have sufficient access privileges to execute and document the review. Core competencies: General knowledge of review procedures. Procedures: 2. Establish a review folder: Create or use an existing review folder. The folder will hold review documentation. Copy the files to review into this folder or provide an index of files with their locations in this folder. Core competencies: General knowledge of common data environment. Procedures: 3. 
Place the survey files into the review folder and certify the folder is ready for review. Core competencies: Expert knowledge of model authoring software. Process: Conduct. Procedures: 4. Locate the individual survey files to check and open them for review in the appropriate software. Use the review checklist to maintain consistency throughout the review. Core competencies: Expert knowledge of model authoring software or design review software. Procedures: 5. Display the control monuments and site boundary to check that the primary and secondary control points are distributed around the site as described in the Survey Manual; and 6. Select each control point to verify that the horizontal and vertical control accuracy is within the tolerance established by the Survey Manual. Core competencies: Expert knowledge of survey and geodetic control and model authoring software. (continued on next page)


Continued from previous page, the text reads: Page 11. Table column headers: Process, Procedures, and Core Competencies. Process: Conduct; Procedures: 7. Verify the elevations and coordinates on the primary and secondary control match the surveyor’s report by isolating and selecting elements. Review property values to validate the control values; Expert knowledge of survey and geodetic control and model authoring software. 8. Verify file metadata matches the survey report and survey manual. Display the file geocoordinate system and working units and check that they are correct; and 9. Assemble issues and anomalies in the review documentation. Repeat Steps 4 through 8 for each survey file. Core competencies: Expert knowledge of survey and geodetic control and model authoring software. Process: Document; Procedures: 10. Ensure that all review documentation and finalized review checklists are in the review folder. Reviewers should notify the survey team that the review is complete; Core competencies: General knowledge of review procedures. Process: Resolution; Procedures: 11. The survey team views the review documentation and assigns the responsibility of the revision to the appropriate party; 12. Each responsible party addresses the assigned comment in the review documentation in preparation for a comment resolution meeting; and 13. Schedule and mediate a comment resolution meeting where clear direction for model revisions will be developed to provide guidance to model authors. Core competencies: General knowledge of review procedures. Process: Revision; Procedures: 14. Make specific survey model revisions per the review documentation comments; Core competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 15. Verify model revisions have been made in accordance with the agreed-upon resolutions; Core competencies: Expert knowledge of model authoring software. 16. 
Finalize review documentation by obtaining signoffs from each reviewer; Core competencies: General knowledge of review procedures. Process: Audit; Procedures: 17. Transmit review documentation and file location to the Q A Manager for a compliance audit; and 18. Review documentation is audited for compliance with review standards. Core competencies: General knowledge of audit process. (end of table) Deliverable: Topographic Features and 3 D Surfaces: Scope of Review. This review provides a process with procedures for verifying the location and accuracy of the existing topographical features and the triangulated surface model. Table B-4 outlines the review and reference documents required to complete this review.
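Steps 5 through 7 of Table B-3, which compare each control monument against the surveyor's report and the manual's tolerances, could be partially automated where coordinates can be exported. The sketch below is illustrative only; the 0.01-unit horizontal and 0.02-unit vertical tolerances are assumptions, not values from any actual Survey Manual.

```python
# Illustrative geodetic control check: flag monuments whose modeled
# coordinates differ from the surveyor's report beyond a tolerance.
# Tolerances here are hypothetical, not from an agency manual.
import math

def check_control(model_pts, report_pts, h_tol=0.01, v_tol=0.02):
    """Inputs: dict of monument id -> (northing, easting, elevation).
    Returns (id, issue) pairs for out-of-tolerance or missing points."""
    issues = []
    for pid in sorted(set(model_pts) | set(report_pts)):
        if pid not in model_pts or pid not in report_pts:
            issues.append((pid, "missing"))
            continue
        n1, e1, z1 = model_pts[pid]
        n2, e2, z2 = report_pts[pid]
        # Horizontal distance first, then vertical difference.
        if math.hypot(n1 - n2, e1 - e2) > h_tol:
            issues.append((pid, "horizontal out of tolerance"))
        elif abs(z1 - z2) > v_tol:
            issues.append((pid, "vertical out of tolerance"))
    return issues

model = {"CP-1": (5000.000, 2000.000, 101.500),
         "CP-2": (5100.020, 2050.000, 102.000)}
report = {"CP-1": (5000.000, 2000.005, 101.500),
          "CP-2": (5100.000, 2050.000, 102.000)}
print(check_control(model, report))
# -> [('CP-2', 'horizontal out of tolerance')]
```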


The text reads: Page 12. Table B-4. Review and Reference Documents. Document: Geomatics or Survey Manual; Purpose: Provides survey control requirements, survey datums and coordinate systems, field codes, and deliverables. The manual also should include a classification of accuracy standards. Document: Survey Report; Purpose: Provides information about how the control was established and the metadata associated with the data sets being reviewed. Document: Survey Model or Field Book; Purpose: Provides location and elevation information for all elements. Document: Agency C A D D Standards; Purpose: Check compliance with Agency C A D D Standards. C A D D standards provide guidelines for consistency in the development of technical drawings, including file structure, data management, drawing templates, file naming conventions, and the use of libraries that establish layering, styles, and symbology conventions, as well as 2 D and 3 D standards. Document: Review Checklist; Purpose: An agency-developed list to provide the reviewer with a way to maintain consistency during the review. Document: Review Documentation; Purpose: Provides a means of documenting the review process. (end of table) Review Process: Table B-5 outlines the review process to be followed for the Topographic Features and 3 D Surfaces review. Table B-5. Process and Procedures. Column headers: Process, Procedures, and Core Competencies. Process: Initiate; Procedures: 1. Identify the individuals who will be acting as reviewers and verify they have sufficient access privileges to execute and document the review; Core Competencies: General knowledge of review procedures. Procedures: 2. Establish a review folder: Create or use an existing review folder. The folder will hold review documentation. Copy the files to review into this folder or provide an index of files with their locations in this folder; Core Competencies: General knowledge of common data environment. Procedures: 3. 
Place the survey files into the review folder and certify the folder is ready for review; Core Competencies: Expert knowledge of model authoring software. Process: Conduct; Procedures: 4. Locate the individual survey files to check and open them for review in the appropriate software. Use the review checklist to maintain consistency throughout the review; and 5. Verify the file metadata matches the survey report and survey manual. Display the file geocoordinate system and working units and check that they are correct. Core Competencies: Expert knowledge of model authoring software or design review software. Procedures: 6. Verify the existing topographical features appear to follow the appropriate feature codes listed in the Survey Manual and meet agency C A D D standards. Isolate and select topographical features to view property information and validate the information; Core Competencies: Expert knowledge of survey standards, C A D D standards, and model authoring software. Procedures: 7. Verify the triangulated surface for accuracy. Display the surface with triangles and contours. (continued next page)


Continued from previous page, the text reads: Page 13. Column headers: Process, Procedures, and Core Competencies. Process: Conduct; Procedures: 7, continued. Review visually for spikes, abnormalities, or obscured areas by navigating and rotating the surface; 8. Compare known observations of field conditions from the Field Book to topographical features. Isolate individual elements; 9. Verify that topographic features are set to triangulate or not triangulate appropriately according to the Survey Manual. Select elements to review property information; and 10. Assemble issues and anomalies in the review documentation. Repeat Steps 4 through 9 for each survey file. Core competencies: Expert knowledge of model authoring software and survey data processing procedures. Process: Document; Procedures: 11. Ensure that all review documentation and finalized review checklists are in the review folder. Reviewers should notify the survey team that the review is complete. Core competencies: General knowledge of review procedures. Process: Resolution; Procedures: 12. The survey team views the review documentation and assigns the responsibility of the revision to the appropriate party; 13. Each responsible party addresses the assigned comment in the review documentation in preparation for a comment resolution meeting; and 14. Schedule and mediate a comment resolution meeting where clear direction for model revisions will be developed to provide guidance to model authors. Core competencies: General knowledge of review procedures. Process: Revision; Procedures: 15. Make specific survey model revisions per the review documentation comments; Core competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 16. Verify model revisions have been made in accordance with the agreed-upon resolutions; Core competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 17. 
Finalize review documentation by obtaining signoffs from each reviewer; Core competencies: General knowledge of review procedures. Process: Audit; 18. Transmit review documentation and file location to the Q A Manager for a compliance audit; and 19. Review documentation audited for compliance with review standards. Core competencies: General knowledge of audit process. (end of table) Deliverable: Land Boundaries and Right-of-Way. Scope of Review: This review provides a process with procedures to verify the existing and proposed project boundaries and existing right-of-way have been established according to the State and local agency standards and fulfill the requirements of the project. Table B-6 outlines the necessary review and reference documents needed to complete this review.
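Step 7 of Table B-5 above directs the reviewer to inspect the triangulated surface visually for spikes and abnormalities. Where an agency permits scripted pre-checks ahead of the visual review, the screening can be sketched as below. This is a minimal illustration, not an agency procedure; the triangle data structure and the 100 percent grade threshold are assumptions chosen for the example.

```python
import math

def triangle_slope_pct(p1, p2, p3):
    """Maximum grade (%) across a triangle's edges, from (x, y, z) vertices."""
    def edge_slope(a, b):
        run = math.hypot(b[0] - a[0], b[1] - a[1])
        rise = abs(b[2] - a[2])
        return float("inf") if run == 0 else 100.0 * rise / run
    return max(edge_slope(p1, p2), edge_slope(p2, p3), edge_slope(p3, p1))

def find_spikes(triangles, max_slope_pct=100.0):
    """Return indices of triangles whose steepest edge exceeds the threshold."""
    return [i for i, tri in enumerate(triangles)
            if triangle_slope_pct(*tri) > max_slope_pct]

# Two nearly flat triangles and one with a 10-unit vertical spike over a 1-unit run.
surface = [
    ((0, 0, 100.0), (10, 0, 100.1), (0, 10, 100.2)),
    ((10, 0, 100.1), (10, 10, 100.3), (0, 10, 100.2)),
    ((10, 10, 100.3), (11, 10, 110.3), (20, 10, 100.4)),
]
print(find_spikes(surface))  # → [2]: only the spiked triangle is flagged
```

A script like this narrows the reviewer's attention to candidate anomalies; the visual navigation and rotation of the surface described in Step 7 remains the authoritative check.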

The text reads: Page 14. Table B-6. Review and Reference Documents. Document: Existing property and ownership data; Purpose: Provides official parcel data used to create the property lines. Existing data may include map computations and property deeds. Document: Existing right-of-way data; Purpose: Provides existing parcel and boundary information for publicly owned rights-of-way. Reference data material may be deeds, survey maps, tax maps, or survey notes. Document: Project design layout; Purpose: Provides information for the spatial relationship of design features. Document: Review Checklist; Purpose: An agency-developed list to provide the reviewer with a way to maintain consistency during the review. Document: Review Documentation; Purpose: Provides a means of documenting the review process. (end of table) Review Process: Table B-7 outlines the review process to be followed for a Land Boundaries and Right-of-Way review. The core competencies will help to identify the responsible individual or individuals intended to carry out the associated steps of the procedure. Table B-7. Process and Procedures. Column headers: Process, Procedures, and Core Competencies. Process: Initiate; Procedures: 1. Identify the individuals who will be acting as reviewers and verify they have sufficient access privileges to execute and document the review; Core Competencies: General knowledge of review procedures. Procedures: 2. Establish a review folder: Create or use an existing review folder. The folder will hold review documentation. Copy files to review into this folder or provide an index of files with their locations in this folder; Core Competencies: General knowledge of common data environment. Procedures: 3. Place individual boundaries (i.e., existing R O W, property lines, and land corners) and proposed R O W files into the review folder and certify it is ready for review; Core Competencies: Expert knowledge of model authoring software. 
Process: Conduct; Procedures: 4. Locate individual boundary files to check and open them to review in appropriate software. Utilize the review checklist to maintain consistency throughout the review; and 5. Verify file metadata matches the survey report and survey manual. Display the file geocoordinate system and working units and check that they are correct. Core Competencies: Expert knowledge of model authoring software or design review software. Procedures: 6. Verify individual boundary or right-of-way features within the file. Select the feature and view property information to validate the coordinates of the line work. Confirm that the geometry in the model matches the deeds; Core Competencies: Expert knowledge of R O W standards, county deed requirements, and model authoring or review software. Procedures: 7. Assemble issues and anomalies in the review documentation. Repeat Steps 4 through 6 for each individual boundary file; Core Competencies: General knowledge of review procedures. Procedures: 8. Locate proposed R O W file and open it to review in appropriate software; Core Competencies: Expert knowledge of model authoring software or design review software. (continued next page)

Continued from previous page, the text reads: Page 15. Column headers: Process, Procedures, and Core Competencies. Process: Conduct; Procedures: 9. Verify file metadata matches the survey report and survey manual. Display the file geocoordinate system and working units and check that they are correct; Core Competencies: Expert knowledge of model authoring software or design review software. Procedures: 10. Confirm proposed R O W or easements are displayed properly and encompass the necessary areas to construct and maintain the project; and 11. Confirm that the proposed geometry in the model is consistent with the proposed plans or legal descriptions for property acquisition. Core Competencies: Expert knowledge of requirements for county deed recording procedures. Process: Document; Procedures: 12. Ensure that all review documentation and finalized review checklists are in the review folder. Reviewers should notify the survey team that the review is complete; Core Competencies: General knowledge of review procedures. Process: Resolution; Procedures: 13. The survey team views the review documentation and assigns the responsibility of the revision to the appropriate party; 14. Each responsible party addresses the assigned comment in the review documentation in preparation for a comment resolution meeting; and 15. Schedule and mediate a comment resolution meeting where clear direction for model revisions will be developed to provide guidance to model authors. Core Competencies: General knowledge of review procedures. Process: Revision; Procedures: 16. Make specific survey model revisions per the review documentation comments; Core Competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 17. Verify model revisions have been made in accordance with the agreed-upon resolutions; Core Competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 18. 
Finalize review documentation by obtaining signoffs from each reviewer; Core Competencies: General knowledge of review procedures. Process: Audit; Procedures: 19. Transmit review documentation and file location to the Q A Manager for a compliance audit; and 20. Review documentation audited for compliance with review standards. Core Competencies: General knowledge of audit process. (end of table) 3 D Model Integrity Review: Scope of Review. The Model Integrity Review encompasses a methodical review of each discipline model individually as well as a review of the federation of all those models into one complete project representation. This review verifies the fidelity of the model contents for each discipline or federated project model for project compliance with geometric and surface accuracy and completeness of the content. Review Information: Table B-8 outlines the necessary review and reference documents required to complete this review.
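Step 6 of Table B-7 above asks the reviewer to confirm that the boundary geometry in the model matches the deeds. One piece of that check, verifying that a parcel's deed courses close back to the point of beginning, lends itself to a scripted pre-screen. The sketch below is illustrative only; the bearing-and-distance input format and the 0.01-unit closure tolerance are assumptions, not agency requirements.

```python
import math

def misclosure(courses):
    """Sum a parcel's bearing/distance courses and return the closing-error
    distance. courses: list of (bearing_degrees_from_north, distance) tuples."""
    dx = sum(d * math.sin(math.radians(b)) for b, d in courses)
    dy = sum(d * math.cos(math.radians(b)) for b, d in courses)
    return math.hypot(dx, dy)

def closes(courses, tolerance=0.01):
    """True if the traverse returns to its point of beginning within tolerance."""
    return misclosure(courses) <= tolerance

# A square parcel described by four deed courses (bearing, distance).
deed = [(0, 100.0), (90, 100.0), (180, 100.0), (270, 100.0)]
print(closes(deed))  # → True: the traverse closes
```

A traverse that fails this test flags a transcription or computation error to be recorded in the review documentation before the deed-by-deed comparison proceeds.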

The text reads: Page 16. Table B-8. Review and Reference Documents. Document: BIM or Model Execution Plan; Purpose: Check compliance with the project modeling requirements. This document should contain a list of required model elements to be developed within the project. This may be documented within a model element breakdown structure (M E B S) or table. Document: Review Checklist; Purpose: An agency-developed list to provide the reviewer with a way to maintain consistency during the review. Document: Review Documentation; Purpose: Provides a means of documenting the review process. (end of table) Review Process: Table B-9 outlines the review process to be followed for the 3 D Model Integrity review. The core competencies will help to identify the responsible individual or individuals intended to carry out the associated steps of the procedure. If the responsible individual does not have competency with modeling or model review software, then a two-person team would be needed. One person should have expert knowledge of the modeling software to assist the reviewer with over-the-shoulder review. The second person is the subject matter expert performing the review. The preparation of the documents to be reviewed can be executed by any qualified person except the individuals who will be conducting the review or auditing the review process. Table B-9. Process and Procedures. Column headers: Process, Procedures, and Core Competencies. Process: Initiate; Procedures: 1. Identify the individuals who will be acting as reviewers and verify they have sufficient access privileges to execute and document the review; Core Competencies: General knowledge of review procedures. Process: Initiate; Procedures: 2. Establish a review folder: Create or use an existing review folder. The folder will hold review documentation. Copy files to review into this folder or provide an index of files with their locations in this folder; Core Competencies: General knowledge of common data environment. 
Process: Initiate; Procedures: 3. Create federated discipline models in the review folder: combine all discipline models into a federated model file and also create a proposed surface. Certify that the files are ready for review; Core Competencies: Expert knowledge of model authoring software. Process: Conduct; Procedures: 4. Locate the federated discipline model and open it to review in the appropriate software. Utilize the review checklist to maintain consistency throughout the review; 5. Check the geometric accuracy of each element or line. Isolate model elements to verify line work is contiguous and the geometry represents appropriate location, elevation, and slope. Record any anomalies in the review documentation; and 6. Check the completeness of the content of each element. Core Competencies: Expert knowledge of model authoring software or model review software. (continued next page)

Continued from previous page, the text reads: Page 17. Column headers: Process, Procedures, and Core Competencies. Process: Conduct. Procedures: 6 (continued). Utilize the BIM Execution Plan and review checklist to verify all elements required for the project are included in the model. Isolate model elements to verify there are no duplications of 2 D or 3 D geometry or elements. Record any anomalies in the review documentation; 7. Locate the federated model file and proposed surface file and open it to review in the appropriate software; 8. In the federated model file, verify the spatial relationship of neighboring elements, such as how model elements fit together and whether gaps exist between elements; interrogate the model and use measurement tools to identify and record any anomalies; and 9. In the federated proposed surface file, check the proposed surface accuracy. View surface contours, breaklines, and triangulation of the proposed federated model. In addition, verify that surface details do not lie beyond the R O W or easement lines, and verify the surface does not have spikes, sudden slope changes, or holes. Record any anomalies in the review documentation. Core competencies for Procedures 6 through 9: Expert knowledge of model authoring software or model review software. Process: Document; 10. Ensure that all review documentation and finalized review checklists are in the review folder. Reviewers should notify the design team that the review is complete. Core competencies: General knowledge of review procedures. Process: Resolution; Procedures: 11. Design team views the review documentation and assigns the responsibility of the revision to the appropriate party; 12. Each responsible party addresses the assigned comment in the review documentation in preparation for a comment resolution meeting; and 13. 
Schedule and mediate a comment resolution meeting where clear direction for model revisions will be developed to provide guidance to model authors. Core competencies: General knowledge of review procedures. Process: Revision; Procedures: 14. Make specific model revisions per the review documentation comments; Core competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 15. Verify model revisions have been made in accordance with the agreed-upon resolutions; Core competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 16. Finalize review documentation by obtaining signoffs from each reviewer; Core competencies: General knowledge of review procedures. Process: Audit; Procedures: 17. Transmit review documentation and file location to Q A Manager for a compliance audit; and 18. Review documentation audited for compliance with review standards. Core competencies: General knowledge of audit process. (end of table)
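Step 6 of Table B-9 checks the completeness of model content against the required-element list in the BIM or Model Execution Plan. Where element inventories can be exported from the authoring software, that comparison reduces to a set difference. The sketch below uses hypothetical element names and is not tied to any particular authoring tool.

```python
def check_completeness(required_elements, model_elements):
    """Compare a required-element list (e.g., from a M E B S table) against the
    element types found in a federated model; return (missing, unexpected)."""
    required, present = set(required_elements), set(model_elements)
    return sorted(required - present), sorted(present - required)

# Illustrative inputs: a required list from the execution plan and an export
# of element types actually present in the federated model.
mebs = ["Pavement", "Curb", "StormPipe", "Guardrail"]
found = ["Pavement", "Curb", "StormPipe", "Fence"]
missing, extra = check_completeness(mebs, found)
print(missing, extra)  # → ['Guardrail'] ['Fence']
```

Missing entries become review comments for the responsible discipline; unexpected entries prompt a check for duplicated or misclassified geometry, consistent with the duplication check in Step 6.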

The text reads: Page 18. 3 D Modeling Standards Review: Scope of Review. This review checks compliance with model development standards and C A D D standards that communicate the expectations for creating model deliverables. The modeling standards review is performed on each individual discipline model. Specifically, this review checks for verification of the Level of Development (L O D), Level of Information Need (L O I N), C A D D standards, and metadata. Review Information: Table B-10 outlines the necessary review and reference documents required to complete this review. Table B-10. Review and Reference Documents. Document: Agency Model Development Standards; Purpose: Check compliance with Agency modeling standards for L O D and L O I N specifications. Document: Agency C A D D Standards; Purpose: Check compliance with Agency C A D D Standards. C A D D standards provide guidelines for consistency in the development of technical drawings, including file structure, data management, drawing templates, file naming conventions, and the use of libraries that establish layering, styles, and symbology conventions as well as 2 D and 3 D standards. Document: BIM or Model Execution Plan; Purpose: Check compliance with the project modeling requirements. This document should contain a list of required model elements to be developed along with each element’s level of detail and information for the project. This may be documented within a model element breakdown structure (M E B S) or table. Document: Review Checklist; Purpose: An agency-developed list to provide the reviewer with a way to maintain consistency during the review. Document: Review Documentation; Purpose: Provides a means of documenting the review process. (end of table) Review Procedure: Table B-11 outlines the review process to be followed for the 3 D Modeling Standards review. 
The core competencies will help to identify the responsible individual or individuals intended to carry out the associated steps of the procedure. If the responsible individual does not have competency with modeling or model review software, then a two-person team would be needed. One person should have expert knowledge of the modeling software to assist the reviewer with over-the-shoulder review. The second person is the subject matter expert performing the review. The preparation of the documents to be reviewed can be executed by any qualified person except the individuals who will be conducting the review or auditing the review process.
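The C A D D standards portion of this review (layering, file naming, and symbology conventions) is well suited to scripted pre-screening when the agency convention can be expressed as a pattern. The sketch below assumes a hypothetical DISCIPLINE-GROUP-ITEM layer-naming convention chosen for illustration; a real check would encode the agency's actual C A D D standard.

```python
import re

# Hypothetical agency layer-name pattern: DISCIPLINE-GROUP-ITEM, e.g. RD-PAV-EDGE.
LAYER_PATTERN = re.compile(r"^[A-Z]{2}-[A-Z]{3}-[A-Z]+$")

def noncompliant_layers(layer_names):
    """Flag layer names that do not match the assumed C A D D naming convention."""
    return [name for name in layer_names if not LAYER_PATTERN.match(name)]

# An exported layer list; the last two violate the assumed convention.
layers = ["RD-PAV-EDGE", "DR-PIP-STORM", "misc_stuff", "RD-SIG"]
print(noncompliant_layers(layers))  # → ['misc_stuff', 'RD-SIG']
```

The flagged names feed the review documentation; element-by-element property inspection in the authoring software remains necessary for symbology and style checks that a name pattern cannot capture.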

The text reads: Page 19. Table B-11. Process and Procedures. Column headers: Process, Procedures, and Core Competencies. Process: Initiate; Procedures: 1. Identify the individuals who will be acting as reviewers and verify they have sufficient access privileges to execute and document the review; Core Competencies: General knowledge of review procedures. Procedures: 2. Establish a review folder: Create or use an existing review folder. The folder will hold review documentation. Copy files to review into this folder or provide an index of files with their locations in this folder. Core Competencies: General knowledge of common data environment. Procedures: 3. Copy discipline model files into the review folder and certify it is ready for review; Core Competencies: Expert knowledge of model authoring software. Process: Conduct; Procedures: 4. Locate individual discipline model files to check and open them to review in the appropriate software. Utilize the review checklist to maintain consistency throughout the review. Core Competencies: Expert knowledge of model authoring software or design review software. Procedures: 5. Conduct C A D D standards check on elements within each discipline model utilizing the agency C A D D standards review checklist. Select individual model elements to view the properties and verify if the element meets the C A D D standards. Core Competencies: Expert knowledge of C A D D standards; Expert knowledge of model authoring software. Procedures: 6. Assemble issues and or comments in the review documentation; Core Competencies: General knowledge of review procedures. Procedures: 7. Conduct model development standards check on elements within each discipline model. Select individual model elements to view the properties and verify if the element meets the associated L O D and or L O I N outlined in the BIM or Model Execution Plan. 
Core Competencies: Expert knowledge of model development standards; Expert knowledge of model authoring software or design review software. Procedures: 8. Assemble issues and or comments in the review documentation; Core Competencies: General knowledge of review procedures. Procedures: 9. Conduct a geospatial metadata check on each file. Within each model file, check that the correct geospatial data is applied based on the BIM or Model Execution Plan; and 10. Conduct a project metadata check on each file. Within each model file, check that the correct project identifiers (project number and names) are applied based on the BIM or Model Execution Plan. Core Competencies: Expert knowledge of model authoring software. Process: Document; Procedures: 11. Ensure that all review documentation and finalized review checklists are in the review folder. Reviewers should notify the design team that the review is complete; Core Competencies: General knowledge of review procedures. Process: Resolution; Procedures: 12. Design team views the review documentation and assigns the responsibility of the revision to the appropriate party; 13. Each responsible party addresses the assigned comment in the review documentation in preparation for a comment resolution meeting; and 14. Schedule and mediate a comment resolution meeting where clear direction for model revisions will be developed to provide guidance to model authors. Core Competencies: General knowledge of review procedures. (continued next page)

The text reads: Page 20. Column headers: Process, Procedures, and Core Competencies. Process: Revision; Procedures: 15. Make specific model revisions per the review documentation comments; Core Competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 16. Verify model revisions have been made in accordance with the agreed-upon resolutions; Core Competencies: Expert knowledge of model authoring software. Procedures: 17. Finalize review documentation by obtaining sign-offs from each reviewer; Core Competencies: General knowledge of review procedures. Process: Audit; Procedures: 18. Transmit review documentation and file location to Q A Manager for a compliance audit; and 19. Review documentation audited for compliance with review standards. Core Competencies: General knowledge of audit process. (end of table) Clash Detection and Spatial Coordination: Scope of Review. This review involves the analysis of each discipline model to evaluate the position of discipline-specific model elements in relationship to each other as well as in relation to model elements from other disciplines. Specifically, this review checks for hard clashes, soft clashes, and sight distance. A 3 D element interference report (clash detection report) will be created during the process of this review. Review Information: Table B-12 outlines the necessary review and reference documents required to complete this review. Table B-12: Review and Reference Documents. Document: National Design Standards; Purpose: Provides an understanding of required National standards (if applicable). Document: Agency Design Standards; Purpose: Provides understanding of required Agency standards. Document: Project Design Manual; Purpose: Provides understanding of project intent and special design requirements. 
Document: Clash Detection Reports; Purpose: Provides a listing of model elements that interfere with the location of an adjacent element or do not achieve a specified clearance from said adjacent element. Document: Review Checklist; Purpose: An agency-developed list to provide the reviewer with a way to maintain consistency during the review. Document: Review Documentation; Purpose: Provides a means of documenting the review process. (end of table) Review Process: Table B-13 outlines the review process to be followed for a Clash Detection and Spatial Coordination review. The core competencies will help to identify the responsible individual or individuals intended to carry out the associated steps of the procedure.
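As the scope above notes, clash detection distinguishes hard clashes (overlapping geometry) from soft clashes (elements closer than a specified clearance). The core test on axis-aligned bounding boxes can be sketched as below; the pipe and footing coordinates are invented for illustration, and production clash detection tools operate on the actual element geometry rather than bounding boxes alone.

```python
def boxes_clash(a, b, clearance=0.0):
    """True if two axis-aligned boxes overlap (hard clash) or come within
    `clearance` of each other (soft clash). Each box: (min_xyz, max_xyz)."""
    (amin, amax), (bmin, bmax) = a, b
    # The boxes interfere if their extents (grown by the clearance) overlap
    # on every one of the three axes.
    return all(amin[i] - clearance <= bmax[i] and bmin[i] - clearance <= amax[i]
               for i in range(3))

# Illustrative elements: a storm pipe run and a footing that intrudes on it.
pipe = ((0, 0, -2.0), (10, 0.5, -1.5))
footing = ((4, -1, -3.0), (6, 1, -1.8))
print(boxes_clash(pipe, footing))  # → True: hard clash, the boxes overlap
print(boxes_clash(pipe, ((4, 5, -3.0), (6, 7, -1.8)), clearance=0.3))  # → False
```

Each True result would become a line item in the clash detection report described in Table B-12, to be dispositioned during comment resolution.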

The text reads: Page 21. If the responsible individual does not have competency with modeling or model review software, then a two-person team would be needed. One person should have expert knowledge of the modeling software to assist the reviewer with over-the-shoulder review. The second person is the subject matter expert performing the review. The preparation of the documents to be reviewed can be executed by any qualified person except the individuals who will be conducting the review or auditing the review process. Table B-13. Process and Procedures. Column headers: Process, Procedures, and Core Competencies. Process: Initiate; Procedures: 1. Identify the individuals who will be acting as reviewers and verify they have sufficient access privileges to execute and document the review; Core Competencies: General knowledge of review procedures. Procedures: 2. Establish a review folder: Create or use an existing review folder. The folder will hold review documentation. Copy files to review into this folder or provide an index of files with their locations in this folder; Core Competencies: General knowledge of common data environment. Procedures: 3. Create federated discipline models in the review folder and certify it is ready for review. Create a federated model file by combining all discipline models and certify it is ready for review; Core Competencies: Expert knowledge of model authoring software. Process: Conduct; Procedures: 4. Locate the federated discipline model and open it to review in the appropriate software; Core Competencies: Expert knowledge of model authoring software. Procedures: 5. Prepare clash rules to be run on the review model in the specified clash detection software based on the review checklist; and 6. Run the clash detection routine on the review model and prepare a clash detection report to be included with the review documentation. Core Competencies: Expert knowledge of clash detection procedures and software. 
Procedures: 7. Locate the federated model file and open it to review in the appropriate software; Core Competencies: Expert knowledge of model authoring software. Procedures: 8. Repeat Steps 5 and 6 for the federated model file, which contains all disciplines; Core Competencies: Expert knowledge of clash detection procedures and software. Procedures: 9. Conduct sight distance check on the federated model. If using automated tools, save the report to the review folder; and 10. Run any additional spatial coordination checks required by the agency standards. Core Competencies: Expert knowledge of spatial coordination checks and model authoring software. Process: Document; Procedures: 11. Confirm all review documentation and finalized review checklists are in the review folder. Reviewers should notify the design team that the review is complete; Core Competencies: General knowledge of review procedures. Process: Resolution; Procedures: 12. Design team views the review documentation and assigns the responsibility of the revision to the appropriate party; and 13. Each responsible party addresses the assigned comment in the review documentation in preparation for a comment resolution meeting; Core Competencies: General knowledge of review procedures. (continued next page)

Continued from previous page, the text reads: Page 22. Column headers: Process, Procedures, and Core Competencies. Process: Resolution; Procedures: 14. Schedule and mediate a comment resolution meeting where clear direction for model revisions will be developed to provide guidance to model authors; Core competencies: General knowledge of review procedures. Process: Revision; Procedures: 15. Make specific model revisions per the review documentation comments; Core competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 16. Verify model revisions have been made in accordance with the agreed-upon resolutions; Core competencies: Expert knowledge of model authoring software. Procedures: 17. Finalize review documentation by obtaining signoffs from each reviewer; Core competencies: General knowledge of review procedures. Process: Audit; Procedures: 18. Transmit review documentation and file location to Q A Manager for a compliance audit; and 19. Review documentation audited for compliance with review standards. Core competencies: General knowledge of audit process. (end of table) Discipline Design Review: Scope of Review. The Discipline Design Review is a comprehensive review of all elements pertaining to engineering standards and design. This review checks the model for conformance to design codes and calculations, project requirements, and design intent. This review type will follow a series of procedures that confirm: 1. Model objects were developed using the correct design code or criteria (as required by design standards and or manuals); 2. Design calculations inside and outside of the modeling software are correct (any automated calculations done inside the software can be exported as reports to be verified using the preferred method of the reviewer); 3. Modeled elements are consistent with design calculations; 4. Design satisfies project scope and requirements; and 5. Proper documentation is maintained through all stages of review. 
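Confirmation item 3 above, that modeled elements are consistent with design calculations, can be partially automated when both the model properties and the calculation results are available as data. The sketch below uses hypothetical element identifiers and property names, with a 0.01 tolerance chosen purely for illustration.

```python
def check_against_calcs(model_elems, design_calcs, tol=0.01):
    """Compare modeled element property values to design-calculation values;
    return a list of (element_id, property, modeled, calculated) discrepancies."""
    issues = []
    for elem_id, props in model_elems.items():
        for name, calc_value in design_calcs.get(elem_id, {}).items():
            modeled = props.get(name)
            # Flag a property that is absent from the model or differs from
            # the calculated value by more than the tolerance.
            if modeled is None or abs(modeled - calc_value) > tol:
                issues.append((elem_id, name, modeled, calc_value))
    return issues

# Illustrative data: the modeled pipe slope disagrees with the calculation.
model = {"PIPE-01": {"diameter": 0.60, "slope": 0.005}}
calcs = {"PIPE-01": {"diameter": 0.60, "slope": 0.02}}
print(check_against_calcs(model, calcs))  # → [('PIPE-01', 'slope', 0.005, 0.02)]
```

Each discrepancy becomes a review comment to be assigned and resolved through the same resolution, revision, and verification steps used in the other reviews.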
Review Information: Table B-14 outlines the necessary review and reference documents required to complete this review.

The text reads: Page 23. Table B-14. Review and Reference Documents. Column headers: Document and Purpose. Document: National Design Standards; Purpose: Provides an understanding of required National standards. Document: Agency Design Standards; Purpose: Provides an understanding of required Agency standards. Document: Project Scope or Information Requirements; Purpose: Verifies the design satisfies the needs of the project requirements in terms of scope, functionality, constructability, and design intent. Document: Design Calculations; Purpose: Provides the design reference data to be confirmed in the model or models. Document: B I M or Model Execution Plan; Purpose: Provides information about the project’s common data environment (C D E), file folder structure, and location of specific files. In the event a BIM execution plan is not available, a document describing the file management structure and location of files to be reviewed should be provided to the reviewer. Document: Review Checklist; Purpose: An agency- or designer-developed list to provide the reviewer with a method to maintain consistency during the review. Document: Review Documentation; Purpose: Provides a means of documenting the review process. Review Process: Table B-15 outlines the review process to be followed for a Discipline Design review. The core competencies will help to identify the responsible individual or individuals intended to carry out the associated steps of the procedure. If the responsible individual does not have competency with modeling or model review software, then a two-person team would be needed. One person should have expert knowledge of the modeling software to assist the reviewer with over-the-shoulder review. The second person is the subject matter expert performing the review. The preparation of the documents to be reviewed can be executed by any qualified person except the individuals who will be conducting the review or auditing the review process. Table B-15. 
Process and Procedures. Column headers: Process, Procedures, and Core Competencies. Process: Initiate; Procedures: 1. Identify the individuals who will be acting as reviewers and verify they have sufficient access privileges to execute and document the review; Core Competencies: General knowledge of review procedures. Procedures: 2. Establish a review folder by creating or using an existing review folder. The folder will hold review documentation. Copy files to review into this folder and or provide an index of files with their locations in this folder; Core Competencies: General knowledge of common data environment. Procedures: 3. Create a federated model for reviewers. Ensure all model files relevant to the discipline review are referenced into a single container file. Store this container file in the review folder (listed above) or provide its location in the file index. Verify links to all C A D D files and supplemental documents are working. Core Competencies: Expert knowledge of model authoring software. (continued next page)

Continued from previous page, the text reads: Page 24. Column headers: Process, Procedures, and Core Competencies. Process: Initiate; Procedures: 4. Locate all documents and models required for discipline review including container files, supplemental design documents not linked to container files, calculations, reports, and or analysis model files. Certify they are ready for review. Core Competencies: Expert knowledge of project requirements and design information. Procedures: 5. Initiate review documentation. Select appropriate review form or forms and checklists. Identify reviewers. Provide the name and location of review files and supporting documents. Notify reviewers to proceed. Core Competencies: General knowledge of review procedures. Process: Conduct; Procedures: 6. Verify the overall design satisfies the project requirements in terms of scope, functionality, and design intent; Core Competencies: General knowledge of project requirements and scope. Procedures: 7. Check geometry matches project and design requirements; 8. Verify overall design follows the required design standards; 9. Review discipline-specific reports such as horizontal geometry, profile, design inputs and results, quantities, etc. Reviewers should run any reports not provided by the design team; 10. Follow agency Q A or Q C procedures for checking calculations including any software automated calculations provided as reports; 11. Review limits of discipline model against limits of other disciplines to ensure models are meeting up correctly; and 12. Isolate individual design elements to review: 1. Property data (e.g. pay items, materials). 2. Conformance to agency design standards. 3. Specification compliance. 4. Properties and values consistent with design calculations and analysis; Core Competencies: Expert knowledge of discipline design requirements. Procedures: 13. Evaluate the interface between individual design elements and surrounding design elements for proper assembly. 
Verify individual elements do not overlap or have inappropriate gaps with connecting elements; and 14. Check for conflicts between elements from the same and separate design disciplines (e.g. between road and utilities); Core Competencies: General knowledge of discipline design requirements. Procedures: 15. Review constructability of all elements including potential construction issues with surrounding elements; Core Competencies: Expert knowledge of discipline design requirements. Procedures: 16. Use property information and quantity reports to find unexpected values and duplicates. Identify potential duplicate elements or elements that are being calculated incorrectly; Core Competencies: General knowledge of model authoring software. Procedures: 17. Confirm supplemental documents fully satisfy any design information not provided in the model. Verify design information provided in model and supplemental documents are not in conflict; Core Competencies: Expert knowledge of design package. (continued next page)

Continued from previous page, the text reads: Page 25. Column headers: Process, Procedures, and Core Competencies. Process: Conduct (continued): Procedures: 18. Identify any missing design information; Core Competencies: Expert knowledge of the design package. Process: Document; Procedures: 19. Ensure that all review documentation and finalized review checklists are in the review folder. Reviewers should notify the design team that the review is complete; Core Competencies: General knowledge of review procedures. Process: Resolution; Procedures: 20. Design team views the review documentation and assigns the responsibility of the revision to the appropriate party; 21. Each responsible party addresses the assigned comment in the review documentation in preparation for a comment resolution meeting; and 22. Schedule and mediate a comment resolution meeting where clear direction for model revisions will be developed to provide guidance to model authors. Core Competencies: General knowledge of review procedures. Process: Revision; Procedures: 23. Make specific model revisions per the review documentation comments; Core Competencies: Expert knowledge of model authoring software. Process: Verification; Procedures: 24. Verify model revisions have been made in accordance with the agreed-upon resolutions; Core Competencies: Expert knowledge of model authoring software. Procedures: 25. Finalize review documentation by obtaining signoffs from each reviewer; Core Competencies: General knowledge of review procedures. Process: Audit; Procedures: 26. Transmit review documentation and file location to Q A Manager for a compliance audit; and 27. Review documentation audited for compliance with review standards. Core Competencies: General knowledge of audit process.
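Several of the Conduct steps in Table B-15 lend themselves to partial automation. As one illustration of step 16 (using property information and quantity reports to find unexpected values and duplicates), the sketch below flags duplicate element identifiers and out-of-range quantities in an exported report. It is not part of the guidebook procedure; the row layout, field names, and thresholds are hypothetical.

```python
from collections import Counter

def flag_quantity_issues(rows, qty_min, qty_max):
    """Flag duplicate element IDs and out-of-range quantities.

    rows: iterable of (element_id, pay_item, quantity) tuples,
    assumed to have been exported from the model's quantity report.
    Returns (duplicate_ids, out_of_range) for reviewer follow-up.
    """
    counts = Counter(eid for eid, _, _ in rows)
    duplicate_ids = sorted(eid for eid, n in counts.items() if n > 1)
    out_of_range = [(eid, item, qty) for eid, item, qty in rows
                    if not (qty_min <= qty <= qty_max)]
    return duplicate_ids, out_of_range

# Hypothetical report rows: element P-102 was exported twice,
# and one curb quantity is negative.
report = [
    ("P-101", "Curb, Type 1", 120.0),
    ("P-102", "Pipe, 18 in.", 64.0),
    ("P-102", "Pipe, 18 in.", 64.0),
    ("P-103", "Curb, Type 1", -5.0),
]
dups, bad = flag_quantity_issues(report, qty_min=0.0, qty_max=10000.0)
```

A reviewer would still isolate each flagged element in the model; a screen like this only narrows the search.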

The text reads: Page 26. Appendix C. Job Aids. The research team created and collected several documents that we are classifying as “job aids”. 1. A glossary of 3 D modeling and quality management terms to assist with definitions of items referenced in the procedures. For example, roles for different people involved in the review process have been defined. 2. A model element table (M E T) to assist with organizing various model elements based on their physical counterparts. This M E T is provided as a potential document you could use for a checklist during the various reviews, but it is not required for you to use. 3. Quality control checklists and quality assurance forms from a few State D O Ts. We are providing these as examples of checklists that you may want to use. If your agency has similar checklists, we would appreciate it if you could provide links to their location so we can include them in the final guidebook as an example. Table C-1. Process and Procedures. Document Under Appendix C: 1. Del DOT 3 D Model Integrity Review Checklist; Source of Document: Delaware DOT. Document Under Appendix C: 2. N C DOT Survey Checklist (not for digital delivery, but may be modified to use as a checklist for digital deliverables); Source: North Carolina D O T. Document Under Appendix C: 3. N C H R P 10 dash 113 - 3 D Model Integrity Review Checklist (Excel Attachment); 4. N C H R P 10 dash 113 - M E T (Excel Attachment); and 5. N C H R P 10 dash 113 - 3 D Modeling and Quality Management Terms Glossary; Source: N C H R P 10 dash 113 Research Team. Table C-2. Process and Procedures. Column headers: Document Name and URL to Source Document (Not in Appendix C) and Source of Document. Document Name and URL: M o DOT Survey Report (Form) 237.14.2.2 Survey Report.pdf (modot.org); Missouri D O T.
Document Name and URL: U DOT Modeling Development Standards: Standards U DOT Digital Delivery (utah.gov); U DOT M E B S Spreadsheet: U DOT L O D Spreadsheet Master File.XLSX - Google Sheets; U DOT Digital Delivery Q C Checklist: U DOT Digital Delivery Q C Checklist.pdf - Google Drive; and U DOT Structures 3 D Catalog Final U DOT Structures 3 D catalog Final.pdf - Google Drive. Source: Utah D O T. Document Name and URL: Penn DOT Digital Delivery Resources: Digital Delivery Resources (pa.gov): Digital Delivery Interim Guidelines, Digital Delivery Execution Plan Template, Quality Management Review Checklists, Penn DOT M E B S, and Penn DOT Project Digital File Index. Source: Penn DOT.

The text reads: Page 27. Del DOT 3 D Model Integrity Review Checklist: Directions for Completing the Checklist. 1, This checklist indicates elements that shall be checked by both the designer and the 3 D project file reviewer on a project to verify consistency between the plans and the 3 D project files. 2, This form shall be completed by the designer and submitted with the 3 D project files to the designated 3 D project file reviewer for the project. 3, The form submitted by the designer shall then be used by the designated 3 D project file reviewer to review the elements in the checklist for consistency between the plans and the 3 D project files. 4, This completed checklist will be provided back to the designer along with comments generated during the 3 D project file review. 5, Additional verification and review items may be requested for some projects. In these cases, the Del D O T Project Manager should communicate what additional items are to be reviewed. Similarly, some projects may not require certain elements to be reviewed by the 3 D project file reviewer. Table header: Project Information. Row headers: Contract Number, Contract Name, Maintenance Road Number or numbers, Designer or Engineer of Record, Project Manager, Location of Files. Table cells are blank. Table note: All files necessary for the 3 D Model review task shall be placed under the current project directory on the Y-Drive, in a folder labeled “Model Review Files”. Table header: Review Information. Column headers: Submission, Initial Review Submission, and Final Review Submission. Row headers, under Submission column: Submission Date, Review Completed Date, and Reviewer. Table cells are blank. Table note: “Initial Review Submission” shall be made when all Preliminary Construction Plan review comments have been addressed. 
This typically occurs between the Department-wide Preliminary Plan submission and Semi-Final Plan submission phases and must include the appropriate grades and geometries information that is required for the Semi-Final Construction Plan submission. “Final Review Submission” shall be concurrent with the Pre P S and E Construction Plan review submission.

The text reads: Page 28. Table header: Items to be Submitted for Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have sub-columns with the following headers: Designer and Reviewer. Except for initial review for both ASCII export rows, all cells under these headers have empty check boxes. Row 1, 3 D_Review.dgn: This file is generated by the designer and will be the master 3 D review file with the following project files referenced in the file: 1, Original Ground Terrain (fs dot dgn). 2, Alignments (al dot dgn and or hv dot dgn). 3, Right-of-Way (rw dot dgn). 4, Construction Plans (cp dot dgn). 5, Proposed Construction (pc dot dgn). 6, Grades and Geometries (gg dot dgn). 7, Lighting Plans (li dot dgn). 8, Utilities or Relocations Plans (ut dot dgn). 9, Sign Structures (xx dot dgn). 10, 3 D Features of Proposed Top Surface or Surfaces (md dot dgn). Row 2, Original Ground DTM (fs dot dtm): InRoads S S 10 projects only. Row 3, Geometry File (.ALG) containing all the required C O G O points, alignments, and essential data: InRoads S S 10 projects only. Row 4, ASCII exports for all alignments, both horizontal and vertical, utilized along the project corridors. Row 5, ASCII exports for the proposed grades and geometries data contained within the Grades and Geometries sheets. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 29. Table header: General Model Review Items. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have sub-columns with the headers: Designer and Reviewer. All cells under these headers have empty check boxes. Row 1: Project design files utilize the correct Geographical Coordinate System (G C S). Row 2: Project Plans (P D F) files utilize the correct G C S. (Georeferenced P D F files). Row 3: All guidelines discussed in the “Development and Review of 3 D Engineered Models for Construction” document were followed. (Features, Intervals, etc.). Row 4: Project files utilize the latest C A D D Standards for both the Plans and 3 D Models. Row 5: Review of model for completeness (Visual Checks). 1, No significant gaps along the model. 2, Spikes or depressions along seam lines. 3, Overlapping modeling components. 4, 3 D model ties into the Original Grade surface. Row 6, Vertical clearance clash detection: Interference Checking. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.
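The visual completeness checks in Row 5 (gaps, and spikes or depressions along seam lines) can be supplemented with a simple numerical screen once elevations have been sampled along a seam. The sketch below is illustrative only and is not part of the Del DOT checklist; the sampling interval, units, and tolerance are all assumed.

```python
def find_spikes(elevations, tolerance):
    """Return indices of suspected spikes or depressions.

    elevations: elevations sampled at a regular interval along a
    seam line (assumed already extracted from the model).
    A point is flagged when it sits above or below BOTH neighbors
    by more than `tolerance` (same units as the elevations).
    """
    spikes = []
    for i in range(1, len(elevations) - 1):
        d_prev = elevations[i] - elevations[i - 1]
        d_next = elevations[i] - elevations[i + 1]
        if abs(d_prev) > tolerance and abs(d_next) > tolerance and d_prev * d_next > 0:
            spikes.append(i)
    return spikes

# Hypothetical seam profile in feet; index 3 is a roughly 2 ft spike.
profile = [100.0, 100.1, 100.2, 102.3, 100.4, 100.5]
suspect = find_spikes(profile, tolerance=0.5)
```

Flagged indices would then be inspected visually in the model, as the checklist directs; a screen like this cannot replace the visual review of overlaps and tie-ins.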

The text reads: Page 30. Table header: Typical Sections Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have sub-columns with the headers: Designer and Reviewer. All cells under these headers have empty check boxes. Row 1: 3 D Model feature widths match what is shown in the Typical Sections. (Lane Widths, Shoulder Widths, Ditch Sections, etc.). Row 2: 3 D Model pavement material depths match what is shown in the Typical Sections. Row 3: 3 D Model pavement cross slopes match what is shown on the Typical Sections, including superelevation sections. Row 4: 3 D Model side slope grading matches what is shown in the Typical Sections. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 31. Table header: Horizontal and Vertical Control Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have sub-columns with the headers: Designer and Reviewer. Row 1: Horizontal and Vertical Control ASCII data matches what is shown on the Horizontal and Vertical Control sheets. The two cells under Initial Review are empty. The two cells under Final Review have empty check boxes. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 32. Table header: Construction Plan Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have sub-columns with the headers: Designer and Reviewer. All cells under these headers have empty check boxes. Row 1: 3 D Model break lines match what is shown on the Construction Plans (Lane Widths, Shoulder Widths, etc.). Row 2: 3 D Model features match what is shown on the Construction Plans (Curb Lines, Guardrails, Islands, Slope Tie-Ins, etc.). Row 3: 3 D Model pavement tapers and transitions match what is shown on the Construction Plans. Row 4: 3 D Model roadside ditches, berms, etc. match what is shown on the Construction Plans. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 33. Table header: Profiles Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All cells under these headers have empty check boxes. Row 1: Verification of vertical alignment used in the creation of the 3 D Model matches what is shown in the Profile sheets. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 34. Table header: Grades and Geometrics Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All cells under these headers have empty check boxes, except for Row 1, Initial Review. Row 1: Grades and Geometrics ASCII data matches what is shown on the Grades and Geometrics sheets. Row 2: 3 D Model pavement break lines match what is shown on the Grades and Geometrics sheets. Row 3: 3 D Model pavement tapers and transitions match what is shown on the Grades and Geometrics sheets. Row 4: 3 D Model radii at intersections, entrances, and driveways match what is shown in Grades and Geometrics sheets. Row 5: 3 D Model cross slopes match what is shown on the Grades and Geometrics sheets. (Lane, Shoulder, Sidewalk, Side Slopes, Median Crossovers, etc.). Row 6: 3 D Model superelevation cross slopes and transitions match what is shown on the Grades and Geometrics sheets. Row 7: 3 D Model grades match the grades shown on the Grades and Geometrics sheets. Row 8: 3 D Model roadside Ditches, Berms, etc., match what is shown on the Grades and Geometrics sheets. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.
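Several of the cross-slope rows above reduce to the same arithmetic: a cross slope is the elevation difference between two break lines divided by the horizontal distance between them, compared against the sheet value within a tolerance. A minimal sketch, using hypothetical offsets, elevations, and review tolerance (none of these values come from the checklist):

```python
def cross_slope_matches(inner, outer, sheet_slope_pct, tol_pct=0.1):
    """Compare a model cross slope against the sheet value.

    inner, outer: (offset, elevation) pairs for two break lines, in
    consistent units. sheet_slope_pct: the slope shown on the Grades
    and Geometrics sheet, in percent (negative = falling away from
    the inner break line). tol_pct is a hypothetical review tolerance.
    Returns (matches, model_slope_pct).
    """
    run = outer[0] - inner[0]
    rise = outer[1] - inner[1]
    model_slope_pct = 100.0 * rise / run
    return abs(model_slope_pct - sheet_slope_pct) <= tol_pct, model_slope_pct

# Hypothetical 12 ft lane falling 0.24 ft: model slope is -2.0 percent.
ok, slope = cross_slope_matches((0.0, 100.00), (12.0, 99.76), sheet_slope_pct=-2.0)
```

The same comparison, applied at superelevation transition stations, covers Row 6 as well.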

The text reads: Page 35. Table header: Construction Details Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All cells under these headers have empty check boxes. Row 1: Verify any elements that are included in the Construction Details sheets which require the generation of 3 D data are complete and consistent with the 3 D Model. These elements could include features in the 3 D Model or points to be provided in an ASCII file. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 36. Table header: Stormwater Management Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All four cells under these headers have empty check boxes. Row 1: Verify any elements that are included in the Stormwater Management Plan sheets that require the generation of 3 D data are complete and consistent with the 3 D model. These elements could include features in the 3 D model or points to be provided in an ASCII file. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 37. Table header: Lighting Plan Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All four cells under these headers have empty check boxes. Row 1: Verify any elements that are included in the Lighting Plan sheets that require the generation of 3 D data are complete and consistent with the 3 D model. These elements could include features in the 3 D model or points to be provided in an ASCII file. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 38. Table header: Utility Relocation Plans Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All four cells under these headers have empty check boxes. Row 1: If the 3 D Model includes utility relocation information, verify that the information matches what is shown on the Utility Relocation Plan sheets. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 39. Table header: Sign Structure Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All four cells under these headers have empty check boxes. Row 1: Verify any elements that are included in the Sign Structure sheets which require the generation of 3 D data are complete and consistent with the 3 D model. These elements could include features in the 3 D model or points to be provided in an ASCII file. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 40. Table header: Cross Section Review. Column headers: Description of Item Being Reviewed, Initial Review, Final Review. Initial Review and Final Review each have two sub-columns with the headers: Designer and Reviewer. All 32 cells under these headers have an empty check box. Row 1: Verify the following items within the 3 D model match what is shown on the Cross Section sheets. Row 2: Pavement break line widths. Row 3: Pavement cross slopes. Row 4: Pavement material depths. Row 5: Side slope widths. Row 6: Side slope cross slopes. Row 7: Drainage feature widths, depths, and side slopes. Row 8: Drainage and utility infrastructure sizes and locations. The last two sections of the table are titled Initial Review Comments and Final Review Comments, and they are both blank.

The text reads: Page 41. N C DOT Survey Checklist P D N Stage 1 - Location and Surveys Q C Checklist. SPOT I D or Project T I P number: Click to edit. County: Click to edit. Table header: 1 L S 1 Provide Photogrammetric Control and Initiate Surveys. Column headers: Item number, Review Item, Yes, No, and N A. For each subcategory, the cells under Yes, No, and N A have blank checkboxes. 1, Complete Photogrammetric Control for Preliminary or Planning Mapping (N C Grid Datum). 1.1 Contacted Property Owners. 1.2 Performed panel control target surveys. 1.3 Processed and Developed Panel Control. 1.4 Compiled QC Documentation (Project Review Checklist – P R C). 1.5 Provided panel control to the Photogrammetry Unit. 2 Complete S U E Level D. 2.1 Researched existing utility records. 2.2 Compiled QC Documentation (Project Review Checklist – P R C). 2.3 Developed and Provided S U E Level D C A D D File (N C Grid Datum). 3 Perform an Independent Review of Mapping Limits Polygon. 3.1 Reviewed mapping limits polygon. 3.1.1 Received mapping limits polygon from the Project Team? 3.2 Reviewed and evaluated mapping limits for adequate design and analysis. 3.2.1 Coordinated with the Photogrammetry Unit and Project Team, if applicable? 3.3 Revised and provided mapping limits polygon. 3.3.1 Final mapping limits file adheres to the latest approved N C DOT MicroStation version? 3.3.2 Provided final mapping limits to the Photogrammetry Unit and Project Team, if applicable. 4 Complete Photogrammetric Control for Preliminary or Planning Mapping (Local Datum). 4.1 Developed a local project control network using the current National Spatial Reference System (N S R S) projected onto the North Carolina State Plane Coordinate System. 4.1.1 Is the Project Datum Projection in accordance with N C DOT Location and Surveys Local Project Coordinate Systems? 4.1.2 Has the Project Datum Projection followed standard Geomatics Engineering procedures related to distance, direction, and elevation constraints?
4.1.3 Have contiguous and or neighboring project projections been reviewed and incorporated as part of the Project Datum Projection? 4.1.4 Have coordinate equalities, if unavoidable, been placed in project areas that have little to no impact on the proposed design? 4.2 Contacted Property Owners. 4.3 Performed Panel Control Target Surveys. 4.4 Processed and Developed Panel Control. 4.5 Compiled QC Documentation (Project Review Checklist – P R C). 4.6 Provided Panel Control to the Photogrammetry Unit.

Continued from previous page, the text reads: Page 42. For items marked No that require further explanation, provide comments or action items in the table below. Column headers: Item number, Comments and Action Items. Both sections are blank and say click to edit. This checklist may not be comprehensive for every project. All items may not be applicable for smaller projects. It is the responsibility of the reviewer to ensure that an adequate review is performed. Q C Reviewer name: Click to edit. Date: Click to edit. Q C Reviewer (Signature): Blank.

The text reads: Page 43. N C H R P 10 dash 113 - 3 D Model Integrity Review Checklist: PROJECT METADATA FIELDS. Project Identifier, B I M Execution Plan Location, Reviewer, Review Documentation Location, Review Date, M E T Location, Originator, Q C Model Files Location, Transmittal Date, Auditor, Originator Response Date, Audit completed (Y or N), Verification Date, Audit Date, Certifier, Certification (Y or N), and Certification Date. Table with column headers: Implementation Items, Reviewer Comments, Originator’s Response, and Verifier’s Initials. Cells are blank in all columns except Implementation Items. Row 1: Geographical Coordinate System that has been defined in the model, models, or design file. Row 2: 3 D Baseline or Centerline has been displayed in the model or models. Row 3: Referenced 3 D model break lines match the 2 D planimetric lines. Row 4: Review of model or models for completeness, visually: Gaps along the model; Spikes or lips along seams; Overlapping components; Transitions between corridors and templates; Transitions between varying slope values; Slope harmonization with existing surface; Median Crossovers; and Separator Islands. Row 5: Component Depths match the Typical Section: Pavement Layers; Driveway; Sidewalk; and Concrete. Row 6: Verify Station Offset Elevation at Critical Locations: E O P at Drainage Nodes; Begin or End Taper Transitions; Begin or End Radius. Row 7: Verify Cross Slopes: Pavement Lanes; Shoulders; Sidewalk; Cross Over Medians; Slopes. Row 8: Vertical Clearance. Row 9: Clash Detection: Interference Checking. Row 10: 3 D Deliverable Created: X M L files for Corridor Alignments; X M L files for Existing and Proposed Surfaces (verified against 3 D design); D g n or D w g files for 2 D and 3 D lines; I c m file for OpenRoads Design Delivery. Row 11: Other.
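The metadata fields and the comment, response, and initials columns above suggest a structured record that keeps each review traceable from comment through verification. The sketch below shows one possible shape using a Python dataclass; the field subset and the rule for when an item counts as resolved are assumptions for illustration, not part of the checklist.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ReviewItem:
    """One implementation item from the integrity review checklist."""
    description: str
    reviewer_comment: Optional[str] = None
    originator_response: Optional[str] = None
    verifier_initials: Optional[str] = None

    def is_resolved(self):
        # Assumed rule: an item is resolved once the originator has
        # responded and a verifier has initialed the revision.
        return (self.originator_response is not None
                and self.verifier_initials is not None)

@dataclass
class ReviewRecord:
    """Top-level record tying items to project metadata."""
    project_id: str
    reviewer: str
    review_date: str
    items: list = field(default_factory=list)

    def open_items(self):
        return [it for it in self.items if not it.is_resolved()]

# Hypothetical record with one open and one resolved item.
record = ReviewRecord("example-project", "J. Doe", "2024-01-15")
record.items.append(ReviewItem("Geographical Coordinate System defined",
                               reviewer_comment="G C S missing in survey file"))
record.items.append(ReviewItem("3 D baseline displayed",
                               reviewer_comment="OK",
                               originator_response="No change needed",
                               verifier_initials="JD"))
```

Keeping records in this form supports the traceability goal stated in the packet: any person can track the decisions made for every review performed.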

The text reads: Page 44. N C H R P 10 dash 113 - Model Element Table (M E T). First column header is Survey. Columns 2 through 9 have no headers. Row headers are 1 Cadastral, 1 A Boundaries, 1 B Control Monuments, 1 C Easements, 1 D Legal Description. 2 Construction, 2 A Layout, 2 B Localizations, 2 C Machine Control. 3 Topographic, 3 A Geodetic Control, 3 B Existing Features, 3 C Terrain (or 3 D Surfaces). Roadway. 1 Active Transportation, 1 A Pedestrian Facilities, 1 B Bicycle Facilities. 2 Driveways, 2 A Driveways. 3 Edge Treatments: 3 A Curb and Gutter, 3 B Safety Edge, 3 C Shoulders. 4 Geometry: 4 A Horizontal Alignments, 4 B Vertical Alignments, 4 C Superelevation, 4 D Other Baselines. 5 Interchanges: 5 A Diverge Areas, 5 B Merge Areas, 5 C Ramps, 5 D Overpasses or Underpasses. 6 Pavement: 6 A Base Course, 6 B Binder, 6 C Joints, 6 D Overbuild, 6 E Overlay, 6 F Structural Course, 6 G Wearing Surface. 7 Roadside Appurtenances: 7 A Cattle Grates, 7 B Fences. Drainage. 1 Culverts, 1 A Inlet Controls, 1 B Pipes, 1 C Outlet Controls. 2 Drains: 2 A Concrete Pavement Subdrainage, 2 B Edge Drain, 2 C French Drain, 2 D Trench Drain, 2 E Underdrain. 3 Open Conveyances: 3 A Ditches, 3 B Swales. 4 Retention and Detention Systems: 4 A Infiltration Trenches, 4 B Injection Wells, 4 C Outlet Controls, 4 D Ponds or Drain Basins, 4 E Underground Detention Systems. 5 Storm Sewers. (continued next page)

Continued from previous page, the text reads: Page 45. 5A Grates. 5B Pipes. 5C Structures. Erosion Prevention and Sediment Control. 1 Buffers. 1A Permanent Buffers. 1B Temporary Buffers. 2 Permanent Best Management Practices (BMPs). 2A Amended Soils. 2B Check Dams. 2C Rain Gardens. 2D Riprap. 2E Plantings. 2F Tree Pits. 2G Waterbodies and Wetlands. 3 Temporary Best Management Practices (BMPs). 3A Berms and Diversions. 3B Erosion Prevention Devices. 3C Sediment Control Devices. 3D Sediment Traps. Geotech. 1 Earth Retaining Systems. 1A Earth Retaining Systems. 2 Engineered Embankments. 2A Geofoam. 2B Geosynthetic Reinforced Soil–Integrated Bridge System. 2C Surcharge. 3 Geofabrics. 3A Geofabrics. 4 Geohazards. 4A Geohazards. 5 Subsurface Exploration. 5A Borings. 5B Geophysics. 5C Strata Surfaces. Traffic. 1 Barrier Systems. 1A Cable Barriers. 1B Concrete Barriers. 1C Crash Cushions. 1D Delineators. 1E Guardrail. 1F Walls. 2 Intersections. 2A Bicycle Facilities. 2B Level Crossings. 2C Pedestrian Facilities. 2D Roundabouts. 2E Signalized Intersections. 2F Stop-Controlled Intersections. 3 Intelligent Transportation Systems. 3A Connected Vehicle Systems. 3B Fleet Monitoring Systems. 3C Message Systems. 3D Signal Systems. 3E Tolling Systems. 3F Weather Monitoring Systems. 4 Lighting. 4A Conduit. 4B Control Boxes. 4C High Mast Lighting. 4D Junction Boxes. 4E Luminaires. 5 Pavement Markings. 5A Colored Pavement. 5B Linear Striping. 5C Perpendicular Striping. 5D Raised Pavement Markers. (continued next page)

Continued from previous page, the text reads: Page 46. 5E Stenciled Striping. 6 Rumble Strips. 6A Centerline Rumble Strips. 6B Perpendicular Rumble Strips. 6C Shoulder Rumble Strips. 7 Signs. 7A Double Post Signs. 7B Overhead Signs. 7C Single Post Signs. 8 Temporary Traffic Control. 8A Temporary Traffic Control. Bridge Structures. 1 Approach Structure. 1A Approach Slabs. 1B Sleeper Slabs. 2 Superstructure. 2A Bearings. 2B Curb. 2C Deck. 2D Deck Drain. 2E Deck Joists. 2F Haunch. 2G Girder. 2H Median. 2I Parapet (Barrier). 2J Railing. 2K Sidewalk. 2L Sound Wall (Barrier). 2M Transverse Member. 3 Substructure. 3A Abutment or End Bent. 3B Architectural Feature. 3C Drilled Shaft. 3D Footings or Pile Cap or Drilled Shaft Cap. 3E Micro Pile. 3F Pedestal or Riser. 3G Pier or Bent. 3H Pier Cap. 3I Pier Column. 3J Pier Wall (Crash Wall). 3K Pile. 3L Seal Coat or Tremie Seal or Seal Slab. 3M Shear Key. 3N Wingwall or Stem Wall. 3O Slope Protection. Structural Components. 1 Element Details. 1A Bent Plate. 1B Bolt Assembly. 1C Conduit. 1D Drainage System. 1E Feature. 1F Field Splice. 1G Fill Plate. 1H Gusset. 1I Shear Connector (Stud). 1J Splice Plate. 1K Stiffener. 1L Weld. 2 Reinforcement. 2A Accessories. 2B Drill and Bond Chemical Adhesive Anchor. 2C Drill and Bond Dowel Bar. 2D Post-Tensioning. 2E Prestressing. 2F Structural Component Reinforcement. 2G Transverse Reinforcement. (continued next page)

Continued from previous page, the text reads: Page 47. 3 Coatings. 3A Coatings. 4 Common Buried Structures. 4A Common Buried Structures. 5 Walls. 5A Retaining Walls. 5B Noise Walls. 5C Perimeter Walls. 6 Other Structures. 6A Gantries. 6B Masts. 6C Sign Structures. Utilities. 1 Communications. 1A Cable. 1B Fiber Optic. 1C Telephone. 2 Electrical. 2A Overhead Electrical. 2B Underground Electrical. 3 Landscaping Sprinkler Systems. 3A Landscaping Sprinkler Systems. 4 Oil and Gas. 4A Gas Pipelines. 4B Oil Pipelines. 5 Sanitary Sewer. 5A Combined Storm and Sanitary Systems. 5B Gravity Systems. 5C Pressure Systems. 5D Septic Systems. 6 Water. 6A Water Connections. 6B Water Mains. Rail. 1 Geometry. 1A Cant. 1B Horizontal Alignments. 1C Vertical Alignments. 2 Platforms. 2A Platforms. 3 Power Systems. 3A Catenary System. 3B Third Rail. 4 Rail Signals. 4A Rail Signals. 5 Track. 5A Ballast. 5B Cross-Tie. 5C Railing. 5D Sub-Ballast. 5E Switches or Turn-outs or Cross-overs. Site Development. 1 Grading. 1A Embankment. 1B Excavation. 1C Stabilization. 2 Landscaping. 2A Irrigation Systems. 2B Plantings. 2C Walkways. 2D Street Furniture.

Glossary of Terms

This glossary contains terms related to 3D modeling and quality management principles relevant to project development and delivery. Each term is presented in the following format:

Term (ACRONYM) Synonyms: synonyms. Related terms: related terms. Definition in a sentence. Optional example of usage.

3D Coordination Synonyms: Clash Detection. The process in which models are interrogated to identify spatial conflicts. 3D coordination can be partially automated using spatial clash detection algorithms. It can be applied within a discipline (e.g., to check for conflicts between rebar and post-tensioning) or across disciplines (e.g., to check for conflicts between foundations and subsurface utilities).
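
The spatial interrogation described above can be sketched with axis-aligned bounding boxes. This is a minimal illustration, not any vendor's clash detection engine; the element names and box extents are invented.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box standing in for a model element's extents."""
    name: str
    min_xyz: tuple
    max_xyz: tuple

def boxes_clash(a: Box, b: Box) -> bool:
    """Two elements clash when their boxes overlap on all three axes."""
    return all(a.min_xyz[i] < b.max_xyz[i] and b.min_xyz[i] < a.max_xyz[i]
               for i in range(3))

def find_clashes(elements):
    """Naive pairwise sweep; production tools add spatial indexing and tolerances."""
    return [(elements[i].name, elements[j].name)
            for i in range(len(elements))
            for j in range(i + 1, len(elements))
            if boxes_clash(elements[i], elements[j])]

# Hypothetical cross-discipline check: a foundation against a storm pipe.
footing = Box("footing", (0.0, 0.0, -2.0), (3.0, 3.0, 0.0))
pipe = Box("storm_pipe", (2.0, 1.0, -1.5), (10.0, 2.0, -1.0))
print(find_clashes([footing, pipe]))  # the pipe passes through the footing
```

Real 3D coordination also reports clash geometry and severity, but the overlap test above is the core of the automated portion.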

Asset Synonym: Facility. Fixed facilities within a transportation network managed and operated by public agencies.

Asset Category Primary or top-level method of cataloging transportation assets. Agency priority asset categories include bridges and structures, pavements, drainage networks, traffic safety, and intelligent transportation systems.

Asset Class The second level of cataloging assets within an asset category hierarchy.

Auditor The person who audits a project’s quality documentation.

Audit Date The date on which the Auditor audited the quality documentation.

Attribute Related terms: Non-graphical Data, Property. Non-graphical data that is part of a model element definition. Modern modeling software includes property fields that can be used to embed pay item numbers as attributes to elements in a 3D model.
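
As a minimal sketch of the pay item example above, an element can carry non-graphical data in a key-value attribute set. The field names and element identifier here are illustrative, not any modeling software's schema.

```python
# A model element with separate graphical and non-graphical data
# (illustrative structure only).
element = {
    "id": "GUARDRAIL-014",        # hypothetical element identifier
    "geometry": None,             # graphical data would live here
    "attributes": {},             # non-graphical data (attributes/properties)
}

def set_attribute(elem, name, value):
    """Embed a non-graphical attribute on a model element."""
    elem["attributes"][name] = value

set_attribute(element, "PayItemNumber", "536-1-1")   # invented pay item
set_attribute(element, "Material", "Galvanized steel")
print(element["attributes"]["PayItemNumber"])
```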

Back Checker The person who reviews the Reviewer’s comments and markups and resolves issues or areas of non-concurrence. This may be the Originator. Some DOTs do not require back checks for design reviews.

Back Checker Date The date on which the Back Checker completed the back check.

BIM Execution Plan (BEP) Synonyms: Digital Delivery Execution Plan. Related terms: BIM Manager, Building Information Modeling. A plan to manage the use of BIM, especially collaboration and information delivery, to accomplish project goals.

BIM Manager Related terms: BIM Execution Plan, Building Information Modeling, Model Author, Model Manager. The individual, normally identified in a BEP, who is responsible for overseeing BIM usage on a project.

Building Information Modeling (BIM) Related terms: BIM Execution Plan, BIM Manager. The use of a shared digital representation of a built asset to facilitate design, construction, and operation processes and to form a reliable basis for decisions. (ISO 19650-1:2018(E))

Computer Aided Design and Drafting (CADD) The process of creating computer models based on parameters.

Calculation A mathematical process, manual or automated, that applies formulas, equations, and/or computer programs to achieve a numerical solution or interpretation of data.

Calculation Quality Control Form A specific form to document and certify that the calculation review has been performed.

Check The act of inspecting or testing something to determine its accuracy or quality.

Certifier The Discipline Lead or Design Manager who certifies that the design was completed to the agency’s specifications and in alignment with a comprehensive quality management framework. Typically, only final deliverables are certified.

Certification Yes / No indicating whether the Certifier has certified the design.

Clash Detection Related terms: 3D Coordination. A technique used in BIM or digital delivery processes to identify conflicts or collisions between various model elements.

Common Data Environment (CDE) The agreed-upon source of information for a project or asset, which is used for collecting, managing, and disseminating each information container through a managed process. (ISO 19650-1:2018(E))

Contract Documents A collection of clearly identifiable documents that describe the requirements and terms for a project. Contract documents typically include plans, specifications, and working drawings. The specification defines plans and working drawings, as well as how to coordinate contract documents in the case of a conflict. Models and/or CADD documents may be included in the definition of Plans and Working Drawings or defined as specific contractual entities in the Specifications or Special Provisions.

Corrector The person who made corrections in design documents. This may be the Originator.

Correction Date The date on which corrections were completed.

Data Exchange The process of taking data structured under a source schema to transform and restructure into a target schema, so the target data is an accurate representation of the source data within specified requirements and minimal loss of content.
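.
The source-to-target restructuring can be sketched as a field mapping with a check that no required content is lost. Both schemas and all field names below are invented for illustration; real exchanges (e.g., to an open schema) involve richer validation.

```python
# Hypothetical source record and field mapping (not a real schema).
SOURCE = {"stationStart": 1000.0, "stationEnd": 1500.0, "surfType": "asphalt"}

FIELD_MAP = {                     # source field -> target field
    "stationStart": "begin_station",
    "stationEnd": "end_station",
    "surfType": "surface_type",
}

def exchange(record, field_map, required):
    """Restructure a record into the target schema; fail if required content is lost."""
    target = {field_map[k]: v for k, v in record.items() if k in field_map}
    missing = [f for f in required if f not in target]
    if missing:
        raise ValueError(f"exchange lost required content: {missing}")
    return target

out = exchange(SOURCE, FIELD_MAP, required=["begin_station", "end_station"])
print(out["begin_station"])  # 1000.0
```

The "specified requirements" in the definition map to the `required` list: an exchange that drops mandatory content is rejected rather than silently accepted.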

Design Authoring The process in which 3D design or modeling software is used to develop 3D models based on specific roadway and structural criteria to convey design intent for construction. Core functions of design authoring include development and analysis of design elements, while the functions of modeling software include the development of 3D objects. In some disciplines, the design authoring software and the modeling software are the same product.

Design Element Synonyms: Model Element. A component of a model that represents a physical object (e.g., a sign) or abstract concept (e.g., alignment, north arrow).

Design Review The process in which a 3D model is used to review and provide feedback related to multiple aspects of design, including evaluation of design alternatives and environmental constraints, review and validation of geometric design criteria, and completeness or quality of overall design.

Digital Record Related term: Quality Artifact. A file that contains data stored in a digital format.

Discipline Related term: Functional Area. A description of the discipline that content being reviewed falls under.

Discipline-Specific Model Related term: Federated Model. A model or linked models related to a single discipline. The superstructure model, substructure model, and detailing models are linked together into a federated Structural Discipline Model.

Document Type A description of the type of document being reviewed, which could come from a lookup list (e.g., calculations, design model).

Documentation Documents or records that serve as evidence that a process or activity was performed.

Facility Any physical asset within a transportation or roadway network (e.g., roads, bridges, tunnels, overpasses, rest areas).

Federated Model Related term: Discipline-Specific Model. A model compiled by referencing models together using a common spatial reference frame. A federated model most commonly is used to combine all discipline-specific models together to represent the project as a whole. However, when an individual discipline uses multiple models to develop a discipline-specific design, a federated model could be used to represent a single discipline.

Functional Area Related term: Discipline. A description of the sub-discipline/functional area that content being reviewed falls under.

Graphical Data Related terms: Spatial Data, Non-graphical Data. Data conveyed using shape and arrangement and/or location in space. Graphical data includes spatial data and non-spatial data.

Industry Foundation Classes (IFC) Related term: Open Data. A non-proprietary data schema and format to describe, exchange, and share the physical and functional information for assets within a facility.

Layer Synonyms: Level. A means of segmenting data within a file. Model entities are assigned a layer, and layer properties can be used to control the visual style of elements as well as the editability and the visibility of elements on the entire layer.

Level of Information Need (LOIN) Specifications The minimum requirements for each model element within a discipline and/or project model(s). LOIN defines both the level of detail of the geometry and the level of information attached to model elements. A LOIN specification may define requirements for the final deliverable or may define a progressive specification with increasing detail and information at successive milestones.

Library A software resource file that provides configuration or utilities to aid in the use of the software. A library may contain object definitions, styles, scripts, property sets, configuration data, and more. Generally, libraries are developed to reflect the agency’s model development standards and packaged into the modeling software configuration.

Metadata Information that describes the characteristics of a dataset. Metadata may include structural metadata, which describes data structures (e.g., data format) and descriptive metadata, which describes data contents (e.g., roadway design). Metadata is used to describe and manage documents and other information containers.
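
The structural/descriptive split can be illustrated for a single information container. All keys and values below are invented examples, not a standard metadata schema.

```python
# Illustrative metadata for one information container, split into
# structural metadata (the data's form) and descriptive metadata
# (the data's contents).
metadata = {
    "structural": {
        "format": "LandXML",          # data format
        "schema_version": "1.2",
        "encoding": "UTF-8",
    },
    "descriptive": {
        "title": "Proposed corridor surface",
        "discipline": "Roadway",
        "milestone": "60% design",
    },
}

print(metadata["structural"]["format"])     # how the data is structured
print(metadata["descriptive"]["discipline"])  # what the data is about
```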

Milestone A description of the stage in the project development life cycle during which the review occurred. Should be populated by a lookup list.

Model A representation of a system that allows for investigation of the system properties. (EN ISO 29481-1:2016).

Model Author Related terms: BIM Manager, Model Manager. The individual, normally identified in a BEP, responsible for creating a specific model element or group of model elements.

Model Element Table (MET) A classified list of model elements. A MET can be used to create a Model Progression Specification, which describes how elements of discipline-specific models increase in LOIN throughout the design process. The MET can also be used to document the quality management process.

Model Manager Related terms: BIM Manager, Model Author. The individual, normally identified in a BEP, responsible for a discipline-specific model. Model manager responsibilities are normally documented in the BEP. Typical responsibilities include managing design authoring within the discipline-specific model and implementing quality procedures for the discipline.

Non-graphical Data Related terms: Attribute, Graphical Data, Property. Data that describes attributes and properties of a model element that do not relate to its physical dimensions or location. A globally unique identifier is a common non-graphical attribute.

Open Data Format Related terms: Industry Foundation Classes, Proprietary Data Format. Data that is structured according to a schema that has been published in a format that is free to use and redistribute. Open data formats are frequently supported by software products, enabling the exchange of data.

Originator The individual responsible for content being reviewed.

Project Identifier A project number or other identifier that can be used to connect review documentation to the project dataset.

Property Synonyms: Attribute. Non-graphical information that describes a model element. For instance, the Modulus of Elasticity is a property of a material (e.g., steel).

Proprietary Data Format Related term: Open Data Format. Data that is structured according to a proprietary schema. Data stored in a proprietary data format can usually only be read and written by one vendor’s software products.

Quality Artifact Related term: Digital Record. An auditable record of quality checks that have been performed.

Quality Assurance (QA) All the planned and systematic activities necessary to provide confidence that a product or facility will perform satisfactorily once in service; includes quality control, independent assurance, and acceptance as its three key components.

Quality Control (QC) Actions and considerations necessary to assess production and construction processes to control the level of quality being produced in the end product.

Reference Synonyms: Federated Model. An information container (e.g., a 3D model) stored in a separate file and federated with a 3D model as a read-only backdrop. While the visibility and visual styles of a reference can sometimes be adjusted, the data within the reference cannot be edited.

Review An assessment or examination of something. Review is used in context of examining design models.

Reviewer Synonym: Checker. The individual responsible for reviewing model content.

Reviewer Credentials The Reviewer’s credentials: some DOTs require specific reviewers to be a registered professional, such as Professional Engineer (PE), Structural Engineer (SE), or Land Surveyor (LS).

Review Criteria A description of the standard or code against which the review is executed (design manual, standard code, checklist).

Review Date The date on which the Reviewer completed their review.

Review Title The title of the Reviewer, which demonstrates the expertise that qualifies them to perform the review. Some DOTs have seniority requirements for conducting some review types.

Review Type A description of the review’s specific purpose.

Rule Set A collection of criteria for implementing an algorithm. Rule sets are typically used with clash detection algorithms where they specify clearance envelopes around specific groups of model elements that would constitute a clash.
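
A clearance-envelope rule set of the kind described above can be sketched as a list of (group, group, clearance) criteria applied to element positions. Group names, identifiers, and distances are illustrative assumptions, not any tool's rule syntax.

```python
import math

# Each rule: two element groups and the clearance (assumed feet) that must
# separate their members; anything closer constitutes a clash.
RULE_SET = [
    ("drainage_pipes", "foundations", 1.5),
    ("utilities", "drilled_shafts", 3.0),
]

def apply_rules(groups, rule_set):
    """Return (id_a, id_b, clearance) for every pair violating a rule."""
    violations = []
    for group_a, group_b, clearance in rule_set:
        for ea in groups.get(group_a, []):
            for eb in groups.get(group_b, []):
                if math.dist(ea["center"], eb["center"]) < clearance:
                    violations.append((ea["id"], eb["id"], clearance))
    return violations

groups = {
    "drainage_pipes": [{"id": "P-7", "center": (100.0, 50.0, -4.0)}],
    "foundations": [{"id": "F-2", "center": (100.5, 50.0, -4.5)}],
}
print(apply_rules(groups, RULE_SET))  # P-7 sits inside F-2's clearance envelope
```

Using element centers keeps the sketch short; real rule sets measure surface-to-surface distance between full element geometries.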

Saved View A predetermined set of saved attributes including viewpoints, scale, render style, orientation, and object and display settings saved for future retrieval.

Spatial Data Related terms: Graphical Data, Non-graphical Data, Model. Graphical data placed within a coordinate reference frame that is tied to a geographical coordinate system so that the information is tied to a physical location. Spatial data is often stored in 3D models and Geographical Information Systems (GIS).

Transmittal Date The date the Originator submitted materials for review.

Verifier The person who verified that corrections were adequately addressed. This may be the Reviewer.

Verifier Date The date corrections were verified.

Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 72
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 73
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 74
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 75
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 76
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 77
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 78
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 79
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 80
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 81
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 82
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 83
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 84
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 85
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 86
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 87
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 88
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 89
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 90
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 91
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 92
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 93
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 94
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 95
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 96
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 97
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 98
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 99
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.
Page 100
Suggested Citation: "APPENDIX E: TASK 8 METHODOLOGY REVIEW PACKET." National Academies of Sciences, Engineering, and Medicine. 2025. Quality Management for Digital Model–Based Project Development and Delivery. Washington, DC: The National Academies Press. doi: 10.17226/29172.