This appendix presents a data governance and management maturity model, initially created in 2015 as part of NCHRP Project 08-92 [2] and subsequently updated through a follow-on implementation project, NCHRP Project 20-44(12) [3]. NCHRP Project 23-23 created a web-based tool for conducting maturity assessments and made further updates to the model, primarily to include actions that agencies can consider to advance their maturity levels. The following tables present the maturity model sub-elements, the criteria for each maturity level, and actions for advancing maturity.
Links to relevant guidance sections are provided for selected actions that fall within the scope of NCHRP Project 23-23. Not all actions are covered within the guidance because its scope focuses on data governance implementation, whereas the assessment covers both data governance and data management topics.
| 1.1 Strategy and Direction | |
|---|---|
| Description | Leadership commitment and strategic planning to maximize the value of data to meet agency goals. This sub-element looks at the extent to which agency leadership has demonstrated a commitment to managing data as a strategic asset through establishment of data governance structures, communications, and planning activities to ensure alignment between data investments and business needs. |
| Level 1 | Agency-Wide: Data collection and management is performed by individual business units with little or no agency-wide direction or coordination. Data improvements are not systematically or regularly identified; they are implemented on a reactive or opportunistic basis. Program-Specific: Data improvements are not systematically or regularly identified; they are implemented on a reactive or opportunistic basis. |
| Level 2 | Agency-Wide: Efforts to implement agency-wide data governance or assess agency-wide data needs are being discussed or planned. Data improvement needs are identified and communicated to management in an informal manner. Program-Specific: Data improvement needs are identified and communicated to management in an informal manner. |
| Level 3 | Agency-Wide: Agency leadership has communicated their expectation that business units and information technology functions should collaborate on identifying and implementing data improvements that are of agency-wide benefit. Data business plans or equivalent planning tools have been prepared to identify short- and longer-term data collection and management strategies that align with business objectives. Data improvement needs have been systematically reviewed, assessed, and documented. Program-Specific: Data business plans or equivalent planning tools have been prepared to identify short- and longer-term data collection and management strategies that align with business objectives. Data improvement needs have been systematically reviewed, assessed, and documented. |
| Level 4 | Agency-Wide: Agency leadership regularly communicates and demonstrates active support for data improvements that will lead to improved agency effectiveness and efficiency. Agency leadership actively works to facilitate collaboration across business units on data improvements and maintain strong partnerships between IT and business unit managers. Data business plans or equivalent planning tools are regularly updated. A regular process of data needs assessment is in place and is used to drive budgeting decisions. Program-Specific: Data business plans or equivalent planning tools are regularly updated. A regular process of data needs assessment is in place and is used to drive budgeting decisions. |
| Level 5 | Agency-Wide and Program-Specific: Data governance and planning activities are continually refined to focus on key risks and opportunities and to eliminate activities without demonstrated payoff. Data governance and planning activities would have a high probability of continuing through changes in executive leadership. |
This sub-element looks at the extent to which agency leadership or the data program manager has demonstrated a commitment to managing data as a strategic asset through establishment of data governance structures, communications, and planning activities to ensure alignment between data investments and business needs.
The following boxes show how the data management assessment elements tie to the AASHTO Data Principles. The first element is broad-based and therefore covers all the AASHTO Data Principles. Subsequent elements become more focused on specific aspects of data management and therefore are applicable to selected AASHTO Data Principles.
AASHTO Data Principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, decisions about what data to collect and how to manage it are made in a highly decentralized fashion. As agencies move up the maturity scale, investments in data are made in a more deliberate and coordinated fashion. Agencies and data program managers are better able to answer questions such as “are we collecting the right data?” and “are we managing our data effectively?” Agencies are better able to identify where relatively unproductive or lower-value data investments can be discontinued or diverted to higher value data investments.
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
| 1 | Identify and Convene Data Governance Stakeholders | Identify key stakeholders who would benefit from data governance and champions who would support it. Meet as a group to identify problem areas and discuss potential data governance strategies. | |
| 1 | Initiate Informal Tracking of Data Improvement Needs | Set up an informal way to identify and track data improvement needs across multiple business units through surveys, focus groups, and shared lists. Look for opportunities to collaborate on meeting shared needs. | |
| 2 | Create a Data Strategy and Business Plan | Scope and resource an effort to develop a data strategy and business plan that is aligned with strategic goals and business needs. Engage executive sponsors and ask them to communicate their support. | |
| 2 | Establish Process to Compile Data Improvement Needs | Establish a formal process to systematically identify and categorize data improvement needs. Assign ownership of this process and set up management briefings to review the results. | |
| 3 | Create an Action Plan | Establish funding and resources aligned with the investment needs and actions identified in the data business plan. Ensure business units are appropriately staffed and resourced to support implementation. | |
| 3 | Implement and Monitor the Action Plan | Set responsibilities and target dates for implementation actions, monitor execution, communicate achievements, and hold people accountable for execution. | |
| 4 | Communicate the Data Strategy | Conduct regular outreach to communicate the data strategy and related initiatives, seek feedback, and look for opportunities to coordinate activities. | See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
| 4 | Evaluate the Governance Approach and Outcomes | Conduct an annual evaluation of data governance and planning activities to ensure alignment with current risks, opportunities, and organizational structure. | |
| 1.2 Roles and Accountability | |
|---|---|
| Description | Clear roles, accountability, and decision-making authority for data quality, value, and appropriate use. This sub-element assesses the extent to which roles and accountability for data stewardship have been agreed upon, defined, documented, and assigned to individuals. |
| Level 1 | Agency-Wide and Program-Specific: The agency has not recognized the need to formalize accountability for the quality, value, and appropriate use of data. |
| Level 2 | Agency-Wide: The agency recognizes the need to establish formal accountability for the quality, value, and appropriate use of data, but accountability has not yet been formalized. Program-Specific: A business lead or point person has been designated for each major data set or application, but the responsibilities of the role have not been spelled out. |
| Level 3 | Agency-Wide: The agency has established clear decision authority and responsibilities for data governance and stewardship, as evidenced by one or more of the following. Program-Specific: Role(s) have been designated to identify points of accountability for data quality, value, and appropriate use for priority data programs or data subject categories. Decision-making authority has been defined for collection/acquisition of new data, discontinuation of current data collection, and significant changes to the content of existing data. Capabilities and skills for data management are included in staff position descriptions, agency recruiting, and staff development efforts. |
| Level 4 | Agency-Wide: Agency-wide data governance bodies and/or staff with data stewardship responsibilities are appropriately resourced and trained to carry out defined responsibilities and are active and achieving results recognized as valuable. The agency is also successfully identifying and resolving situations where individual business unit interests are in conflict with agency-wide interests related to data collection and management. Program-Specific: Staff with responsibility for data stewardship and management have sufficient time and training to carry out these responsibilities. Staff with responsibility for data stewardship and management play an active role in defining data improvements and periodically produce reports of progress to their managers. |
| Level 5 | Agency-Wide: The roles and responsibilities of agency data governance bodies and/or staff with data stewardship responsibilities are reviewed periodically and updated based on agency experience and to reflect new or changing data requirements and implementation of new data systems. Staff with responsibility for data stewardship and management coordinate with their peers in the agency and with external data partners to deliver best value for resources invested (for example, as incentivized by data management-related metrics included in employee performance reviews). Program-Specific: Stewardship roles are periodically reviewed and refined to reflect new or changing data requirements and implementation of new data systems. Staff with responsibility for data stewardship and management are coordinating with their peers in the agency and with external data partners to deliver best value for resources invested. Data management-related metrics are routinely considered in employee performance reviews. |
This sub-element assesses the extent to which roles and accountability for data stewardship have been agreed-upon, defined, documented and assigned to individuals.
AASHTO Data Principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, there is a lack of clarity about who “owns” data and who is accountable for making sure that data meets business needs. Responsibilities have not yet been defined for making sure that different business units coordinate on data collection and data management activities to maximize efficiencies. As agencies move up the maturity scale, roles and responsibilities are more formalized. Managers ensure that staff are assigned to data stewardship and management roles and that they are sufficiently trained and resourced. Formalizing and documenting roles and accountability for data creates a consistent and sustainable framework for proper data management. It reduces the agency’s dependence on “heroic efforts” to take care of what needs to be done. It helps to ensure that there are staff who are proactive in providing the right data, with the right quality and in the right form – in an efficient manner.
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
| 1 | Designate a Data Governance Lead | Identify and designate an individual to take the lead on data governance implementation. | |
| 1 | Compile Contact List for Data Programs and Systems | Identify points of contact for each of the agency’s major data programs and data systems. | |
| 1 | Designate Leads (Program-Specific Action) | Designate a business lead or point person for each major data set or application. | |
| 2 | Charter Data Governance Group(s) | Establish and charter one or more data governance bodies. Set up an initial process to ensure coordination of data collection/acquisition activities. Establish priority initiatives and performance measures. | |
| 2 | Formalize Data Stewardship Responsibilities | Adopt a data stewardship model with documented roles and responsibilities for different types of stewards. Document knowledge, skills, and abilities (KSAs) for data stewardship and make these available to HR and hiring managers to include within position descriptions, recruiting, and staff training materials. | |
| 3 | Formalize Data Program Responsibilities | Identify the decision-making authority and processes for collection/acquisition of new data, discontinuation of current data collection, and significant changes to the content of existing data. | Chapter 5: Data Collection/Acquisition Oversight |
| 3 | Plan and Conduct Data Steward Training | Create or compile training for data stewards. Deliver initial training to current stewards and establish a schedule for new steward onboarding and refresher training. | |
| 4 | Schedule Quarterly Review Meetings | Schedule quarterly review meetings with data stewards and the data program manager to discuss data improvement needs and review the status of ongoing improvement initiatives. | |
| 4 | Routinely Evaluate the Data Roles and Responsibilities | Regularly evaluate data governance and stewardship roles and responsibilities to ensure alignment with new or changing data requirements, systems, tools, and processes. | |
| 1.3 Policies and Procedures | |
|---|---|
| Description | Adoption of principles, policies, and business processes for managing data as a strategic agency asset. This sub-element looks at the extent to which the agency has established clear policies and procedures for how data is to be managed as a corporate asset. |
| Level 1 | Agency-Wide and Program-Specific: No formal data strategic goals, objectives, principles, policies, or procedures have been established. |
| Level 2 | Agency-Wide and Program-Specific: Agency leadership has endorsed basic strategic goals, objectives, and/or principles for data management. |
| Level 3 | Agency-Wide and Program-Specific: Basic data policies and processes are in place, including one or more of the following. |
| Level 4 | Agency-Wide and Program-Specific: A comprehensive set of data management policies has been successfully implemented, monitored, and supported. |
| Level 5 | Agency-Wide and Program-Specific: Policies are regularly evaluated and improved. Policy reviews and update decisions consider factors such as awareness/reach within the agency, effectiveness, and cost burden. |
This sub-element looks at the extent to which the agency has established clear policies and procedures about how data is to be managed as a corporate asset.
AASHTO Data Principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, there are no written and adopted policies and procedures related to data governance and management. As agencies move up the maturity scale, policies and procedures are drafted, adopted and implemented throughout the agency. The policies and procedures provide an important mechanism for standardizing how an agency treats data. If implemented well, they should result in higher quality data, more effective use of data, and clear decision making processes around data.
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
| 1 | Adopt Data Principles | Work with executives and data program leadership to formalize the agency’s core data principles and values (e.g., to establish and treat Data as an Asset and/or to adopt the AASHTO Data Principles). | |
| 1 | Communicate Data Principles | Create materials useful for communicating data principles and goals to stakeholders at all levels within the agency. Distribute and/or communicate materials as opportunities allow. | See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
| 2 | Establish Foundational Data Policies | Develop and adopt policies describing the scope of data governance, roles and responsibilities, and expectations for data documentation and data quality management. | |
| 2 | Establish Procedures and Guidance | Develop specific procedures and guidance materials that support data governance principles and policies. | |
| 3 | Develop Training Materials | Create and distribute change management and training materials useful for informing audiences at all levels of data-related policies, processes, and tools. | See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
| 3 | Integrate Data Governance Practices into Business Processes | Review and update processes for approval of new data and technology investments and for data publication and sharing to ensure coordination and support of other data governance principles and policies. Ensure documented policies and procedures are incorporated into routine business practice, providing necessary training, resources, and authority to ensure implementation. | |
| 4 | Implement a Continuous Improvement Process | Establish processes to regularly monitor policy implementation and effectiveness, with resources to close identified gaps. | |
| 1.4 Data Asset Inventory and Value | |
|---|---|
| Description | Tracking agency data assets and value. This sub-element looks at the extent to which the agency or data program manager has documented their data, its uses, and its value to the agency. |
| Level 1 | Agency-Wide: Data sets of agency-wide importance have not been identified or documented (though there may be some business-area-specific or data set-level inventories). Program-Specific: There is limited awareness of how data sets are used and what value is being provided. |
| Level 2 | Agency-Wide: A basic inventory of data sets of agency-wide importance has been compiled with limited information such as business contact, technical contact, location, and description. Program-Specific: There is general awareness of how different data sets are used and what value is being provided, but no records are kept on this. |
| Level 3 | Agency-Wide: An inventory of data sets of agency-wide importance is available to agency staff. Information about the value of each data set of agency-wide importance is available, including one or more of the following. Program-Specific: Primary users and uses of each data set have been identified and documented. Data collection or acquisition costs are tracked. |
| Level 4 | Agency-Wide: Information about agency data sets is regularly updated as new data sets are added and old data sets are retired. The data inventory information is used to identify duplicative data sets that can be eliminated or consolidated. Managers use information about data costs to evaluate opportunities for improved efficiencies. Program-Specific: Managers use information about data costs to evaluate opportunities for improved efficiencies. |
| Level 5 | Agency-Wide and Program-Specific: There is a good understanding of the value provided by each data set with respect to agency efficiency and effectiveness. Data collection and management methods are regularly evaluated and improved. |
This sub-element looks at the extent to which the agency or data program manager has documented their data, its uses and its value to the agency.
AASHTO Data Principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, information about data and its uses resides in the heads of a few staff members – nothing is written down. As agencies move up the maturity scale, they consistently document their data sets and track how they are used. This provides the basis for articulating the value of different types of data to the agency, and weighing data collection and maintenance costs against value added. It also enables agencies to identify areas of duplication and opportunities for consolidation.
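The inventory and cost-value tracking concepts above can be sketched as a simple record structure. This is an illustrative sketch only; the field names and the duplicate-flagging heuristic are hypothetical, loosely following the Level 2 through Level 4 descriptions (business and technical contacts, location, description, users and uses, tracked costs, and identification of duplicative data sets).

```python
from dataclasses import dataclass, field

@dataclass
class DataAssetRecord:
    """One entry in a minimal data asset inventory (hypothetical fields,
    drawn from the Level 2-3 descriptions in the text)."""
    name: str
    description: str
    business_contact: str   # accountable business-side point person
    technical_contact: str  # IT/system point of contact
    location: str           # system or repository where the data live
    primary_users: list = field(default_factory=list)  # who uses the data
    primary_uses: list = field(default_factory=list)   # decisions/reports it feeds
    annual_cost: float = 0.0  # tracked collection/acquisition cost

def find_potential_duplicates(inventory):
    """Flag asset pairs whose descriptions share several words -- a crude
    starting point for the Level 4 duplicate-consolidation review."""
    flagged = []
    for i, a in enumerate(inventory):
        for b in inventory[i + 1:]:
            shared = (set(a.description.lower().split())
                      & set(b.description.lower().split()))
            if len(shared) >= 3:
                flagged.append((a.name, b.name))
    return flagged
```

In practice an agency would hold such records in a shared catalog or database rather than in code; the point of the sketch is that once contacts, uses, and costs are captured as structured fields, duplicate identification and cost-versus-value comparisons become routine queries rather than institutional memory.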
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
| 1 | Compile a Data Asset List | Identify and tap existing sources of information about agency data assets (including IT application inventories and available GIS data listings) to compile an initial data asset list. | |
| 1 | Develop Approach to Tracking Data Cost and Value | Develop an approach for tracking data costs and value added. Cost tracking should consider initial and ongoing costs for data acquisition, collection, and management. Value tracking should consider how the data are used, what decisions they feed, and what reporting requirements they help to meet. | |
| 2 | Implement Cost Tracking Processes (Program-Specific Action) | Initiate tracking of data collection/acquisition costs. | |
| 2 | Develop and Pilot Guidance for Cost Tracking Processes | Develop guidance for a standard approach to tracking data collection/acquisition and management costs. Pilot the approach for one or more data assets, and then support application of the approach for priority data assets. | |
| 2 | Create a Data Inventory | Create an inventory of priority data assets that includes contact information for the producers or providers of the data asset, a description of the data, and identification of primary users and uses of the data. | |
| 2 | Document Data Users and Uses (Program-Specific Action) | Identify and document primary data users and uses for each data set. | |
| 3 | Identify Opportunities to Improve Efficiencies | Use the information on inventory, cost, and data value to identify potentially duplicative data sets and evaluate opportunities to improve efficiencies/reduce costs of data collection and management. | |
| 4 | Implement a Continuous Improvement Process | Establish processes to regularly review and improve data collection and management methods. | |
| 1.5 Relationships with Data Customers (Program-Level Sub-element) | |
|---|---|
| Description | Connections between data producers and users. This sub-element looks at the extent to which data program managers have established channels of communication with data users. |
| Level 1 | Program-Specific: There are no proactive outreach activities to understand data user needs. |
| Level 2 | Program-Specific: Informal, limited outreach to other business units has been conducted to identify how they might utilize available data sets. |
| Level 3 | Program-Specific: Meetings have been held with current or potential new users of the data to understand their needs. This information has been considered in developing plans for improvements. |
| Level 4 | Program-Specific: Input from data customers is routinely solicited, collected, and considered through a variety of online and in-person forums (e.g., Communities of Interest). |
| Level 5 | Program-Specific: There are formal, written agreements that document what data will be provided to customers, when, and how. A process is in place to periodically re-negotiate these agreements. |
This sub-element looks at the extent to which data program managers have established channels of communication with data users.
AASHTO Data Principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, data program managers do not proactively communicate with data users to understand how they use data and obtain feedback on data quality. As agencies move up the maturity scale, data program managers make an effort to reach out to data users, and act on feedback received to make improvements. In some situations, service level agreements can be negotiated to formalize what data are provided, at what frequency and in what form. Strengthening relationships between data providers and data customers helps agencies to avoid situations in which data are being produced but not used as intended. A functioning feedback loop between data providers and customers helps data providers focus on data improvements that add value.
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
| 1 | Identify Data Customers | Create a list of data customers for the data program of interest, identifying contact information and current understanding of how they use the data. | |
| 2 | Conduct Customer Outreach | Meet with data customers to learn more about how they use the data and hear their suggestions for improvements to data quality, availability, and usability. | |
| 2 | Incorporate Customer Feedback into Data Strategy and/or Plans | Incorporate customer input and stakeholder feedback into agency data strategy and/or improvement plans. | |
| 3 | Organize and Regularly Engage Stakeholder Communities | Create regular opportunities for data customers and other key stakeholders to provide input into data program priorities. For example, organize meetings with broad “Communities of Interest” to gather information about how data collected or maintained in key business, data, or subject areas can be improved to better meet their needs. | |
| 3 | Incorporate Customer Feedback into Data Stewardship | Incorporate customer input and stakeholder feedback into day-to-day data stewardship and lifecycle management planning, investments, and activities. | |
| 4 | Establish Data Agreements | Establish formal data sharing or data use agreements with data customers and other key stakeholders, setting clear expectations and commitments for data products and services. | |
| 4 | Implement a Continuous Improvement Process | Establish processes to regularly monitor customer satisfaction and dedicate resources to improve customer relationships and deliver prioritized data products and services. | |
| 1.6 Data Management Workforce Capabilities | |
|---|---|
| Description | Attracting, building, and sustaining a workforce with the knowledge, skills, and abilities to meet changing data management and analysis requirements. This sub-element assesses the extent to which the agency workforce has the right mix of skills for effective data management and the extent to which the agency is able to sustain needed skill sets through staff transitions. |
| Level 1 | Agency-Wide and Program-Specific: There is limited awareness of the need to build workforce capabilities for data management and analysis. There is limited awareness of risks associated with loss of current data experts due to retirements or job changes. |
| Level 2 | Agency-Wide and Program-Specific: Workforce needs and risks associated with data management and analysis are generally recognized and understood, but there is no coordinated strategy in place to address them. |
| Level 3 | Agency-Wide and Program-Specific: Strategies to address needs and mitigate risks have been identified and are being piloted through one or more of the following activities. |
| Level 4 | Agency-Wide and Program-Specific: Strategies to address needs and mitigate risks are routinely and widely executed across the agency. Managers anticipate and plan for staffing requirements related to data management and analysis and ensure that work commitments are in line with available staff resources. |
| Level 5 | Agency-Wide and Program-Specific: Strategies to address needs and mitigate risks are routinely evaluated and improved based on prior experience and changing requirements. |
This sub-element assesses the extent to which the agency is able to sustain data management functions through staff transitions.
AASHTO Data Principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, the agency is not aware of risks associated with departures of staff with specialized knowledge and skills related to particular data sets or data management practices. As agencies move up the maturity scale, these risks are systematically identified and mitigation actions are put in place – including succession plans and mentoring strategies. A proactive approach to ensuring data management sustainability reduces risks of disruption to data access or reporting activities. It also provides for an orderly and efficient transition of responsibilities.
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
| 1 | Perform Informal Data Workforce Needs Outreach | Conduct informal outreach to identify data management/analysis-related workforce needs and opportunities. | |
| 1 | Communicate High-Level Agency Data Workforce Needs | Create materials communicating key data management/analysis-related workforce needs and opportunities. | See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
| 2 | Assess Knowledge Gaps | Assess gaps in data management and analysis competencies in key data-related functions (e.g., data user, data collector, data analyst, data steward, front-line manager, senior manager). Evaluate results to recommend improvements to position descriptions, training, and/or recruitment. | |
| 2 | Plan for Data Knowledge Transfer | Evaluate knowledge capture and transfer needs for critical data-related positions. Pilot succession planning and knowledge transfer tools to support future implementation. | |
| 3 | Develop a Data Literacy Curriculum | Develop a detailed data literacy curriculum and compile associated training materials for key data-related functions. | |
| 3 | Plan for Data Staffing Requirements | Create tools for managers to evaluate and align data management and analysis resources of their programs, including typical work commitments for data program-specific functions (e.g., Data Stewards, Data Stewardship team members, or Data Governance committee members). | |
| 4 | Address Workforce Risks in Strategic Plans | Elevate actions to address high-priority workforce risks into broader agency and/or business unit strategies, priorities, and plans. | |
| 4 | Implement a Continuous Improvement Process | Establish processes to regularly monitor the workforce and capabilities against standards and best practices and dedicate resources to prioritize and close identified gaps. | |
| 2.1 Data Updating | |
|---|---|
| Description | Well-defined and coordinated data update cycles. This sub-element assesses the extent to which update methods and cycles have been defined and documented for key data sets. |
| Level 1 | Agency-Wide and Program-Specific: Data collection and updating cycles and business rules for enterprise data updates have not been established. |
| Level 2 | Agency-Wide and Program-Specific: Enterprise data collection and updating cycles have been established but have not been documented. |
| Level 3 | Agency-Wide and Program-Specific: Agency expectations for data updating processes have been established and documented, including one or more of the following. |
| Level 4 | Agency-Wide and Program-Specific: Agency expectations for data updating processes are generally understood and consistently followed. Business rules for data updating are generally embedded in and enforced by applications (where applicable). |
| Level 5 | Agency-Wide and Program-Specific: Agency expectations for data updating are periodically reviewed, updated, and implemented. Enterprise data collection and data updating methods and cycles are also periodically reviewed to identify and implement opportunities for improved coordination and efficiencies. |
This sub-element assesses the extent to which update methods and cycles have been defined and documented for key data sets.
Data principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, data updates are made on an ad-hoc basis and users are not aware of data updating frequencies or methods. In addition, rules for adding and deleting key data entities (e.g. routes, projects, employees) have not been developed. As agencies move up the maturity scale, they create and maintain business rules for how each major data set is to be updated. Where applicable, business rules are embedded into applications. For example, an HR system may include a wizard for adding a new employee that makes sure that all required data elements are entered. Defining rules for data updates is a critical step that impacts the cost of data maintenance and also the level of quality that will be provided. Formalizing rules for data updates provides clarity for both data users and data managers.
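The idea of embedding data-updating business rules in applications, as in the HR example above, can be sketched in a few lines. This is a hypothetical illustration: the required fields, the ID format rule, and the function name are invented for the example, not drawn from any particular agency system.

```python
# Hypothetical business rules for adding a new employee record.
# An application would run this check before committing the record.
REQUIRED_EMPLOYEE_FIELDS = {"employee_id", "name", "hire_date", "business_unit"}

def validate_new_employee(record: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the record may be added."""
    errors = []
    for missing in sorted(REQUIRED_EMPLOYEE_FIELDS - record.keys()):
        errors.append(f"missing required field: {missing}")
    # Example business rule: identifiers follow an agreed format (invented here).
    emp_id = record.get("employee_id", "")
    if emp_id and not (emp_id.startswith("E") and emp_id[1:].isdigit()):
        errors.append("employee_id must look like 'E12345'")
    return errors
```

A complete record passes with no errors, while a record missing fields or using a malformed identifier is rejected with a specific message for each violation, which is what makes embedded rules easier to follow than written procedures alone.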
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Develop Data Flow Documentation Templates |
Create and share examples for documenting and diagramming data collection and update processes. |
|
|
1 |
Define Method and Standards for Documenting Data Update Cycles |
Establish a standard way to document data update cycles and integrate metadata elements for this within dataset metadata standards. |
|
|
2 |
Create Guidance on Data Collection and Updating |
Develop a guidance document that covers practices for documenting and periodically reviewing data collection and preparation processes, establishing appropriate data update cycles, and defining business rules for how key data entities are added, updated, and deleted. |
|
|
3 |
Conduct Training on Data Collection and Updating |
Provide training and support for application of the guidance on data collection and updating. |
|
3 |
Align Data Systems with Data Collection and Update Requirements |
Update key data collection and publication/access systems to reinforce established data collection and update requirements (e.g. required metadata fields, triggers/reminders to create certain documentation on publication or update, workflows ensuring necessary data collection or update as part of business processes, flags for data that has exceeded retention periods or useful or anticipated life, etc.). |
|
|
4 |
Implement a Regular Monitoring Process |
Establish processes to regularly monitor data collection activities to ensure conformance with documented expectations. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate agency data collection and update methodologies and tools to identify and implement opportunities to improve outcomes. |
| 2.2 Data Access Control | |
|---|---|
|
Well-defined policies and guidelines for managing access to data sets. This sub-element assesses the extent to which the agency manages access to data sets in order to protect sensitive information and maintain data integrity. |
|
|
Level 1 |
Agency-Wide and Program-Specific: No agency policies or guidelines regarding data sharing or access management are in place beyond external agency requirements (e.g. state-level policy, legislative/regulatory requirements). |
|
Level 2 |
Agency-Wide and Program-Specific: The agency is beginning to address risks associated with unauthorized access to data and sharing of sensitive data through one or more of the following activities:
|
|
Level 3 |
Agency-Wide and Program-Specific: Standard policy, practices, and guidance have been developed to identify and protect sensitive data, and define how data can be shared externally to the agency. However, these have not yet been fully implemented. |
|
Level 4 |
Agency-Wide and Program-Specific: Standard policy, practice, and guidance are implemented. Enterprise data sets have been classified for purposes of data protection and access and data owners/providers/stewards are complying with policies and guidelines for data protection and access. |
|
Level 5 |
Agency-Wide and Program-Specific: Data access and protection guidelines and procedures are well established, and are periodically reviewed and updated as part of standard practice. |
Data principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, the agency’s approach to managing access to data is ad-hoc. As agencies move up the maturity scale, there is a standard method for classifying sensitive information and a formalized process for defining access privileges as new data sets are brought online. Standardizing and formalizing data access control supports compliance with applicable information security regulations and prevents data corruption due to unauthorized or unmanaged changes. It also provides a way for agencies to define and apply consistent criteria for what data are to be shared openly versus kept internal to the agency.
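A standard classification scheme with a formalized access check can be sketched as follows. The sensitivity levels, dataset names, and default-deny rule are illustrative assumptions, not an agency standard.

```python
from enum import IntEnum

class Sensitivity(IntEnum):
    """Hypothetical three-tier data classification scheme."""
    PUBLIC = 0      # may be shared openly (e.g. open data portal)
    INTERNAL = 1    # internal agency use only
    RESTRICTED = 2  # contains private or sensitive information

# Invented classification of enterprise datasets for the example.
DATASET_CLASSIFICATION = {
    "pavement_condition": Sensitivity.PUBLIC,
    "project_costs": Sensitivity.INTERNAL,
    "employee_records": Sensitivity.RESTRICTED,
}

def can_access(dataset: str, user_clearance: Sensitivity) -> bool:
    """Allow access when the dataset's sensitivity does not exceed the user's clearance.
    Unclassified datasets default to RESTRICTED (deny by default)."""
    level = DATASET_CLASSIFICATION.get(dataset, Sensitivity.RESTRICTED)
    return user_clearance >= level
```

The default-deny choice mirrors the guidance above: until a dataset has been classified, it is treated as sensitive rather than openly shareable.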
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Evaluate Data Access and Sharing Needs |
Identify needs and opportunities for data access and sharing that support agency business, meet data sharing requirements, and fulfill external customer needs. |
|
1 |
Assess Data Access and Sharing Risks |
Identify risks associated with unauthorized access or sharing of sensitive data and document potential mitigation strategies. |
Chapter 5-Classifying and Protecting Private and Sensitive Data |
|
2 |
Document Method for Identifying Private and Sensitive Data |
Building on established state-level information security policies, establish a methodology and process to identify and classify private and sensitive data for purposes of data protection and sharing. |
Chapter 5-Classifying and Protecting Private and Sensitive Data |
|
2 |
Define Data Access and Sharing Policies |
Establish formal data access and sharing policies intended to protect sensitive data and define how data can be shared both internally and externally to the agency. |
Chapter 5-Classifying and Protecting Private and Sensitive Data |
|
3 |
Classify Enterprise Datasets |
Classify enterprise datasets based on sensitivity level and need for special handling. |
Chapter 5-Classifying and Protecting Private and Sensitive Data |
|
3 |
Provide Data Access and Sharing Training |
Develop and roll out training to ensure awareness and understanding of how to recognize and manage sensitive data. |
Chapter 5-Classifying and Protecting Private and Sensitive Data |
|
4 |
Implement a Regular Monitoring Process |
Establish monitoring processes to ensure that data is shared and accessed in conformance with documented policies and procedures. |
Chapter 5-Classifying and Protecting Private and Sensitive Data |
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate agency data sharing and access methodologies and tools to identify and implement opportunities to improve outcomes. |
| 2.3 Data Findability and Documentation | |
|---|---|
|
Availability of data catalogs and dictionaries that enable discovery and understanding of available agency data assets. This sub-element assesses the extent to which the agency ensures that potential data users can discover what data are available and understand potential applicability of a data set for a given business need. |
|
|
Level 1 |
Agency-Wide and Program-Specific: There is little or no recognition of the need or value of a data inventory, catalog or standardized metadata to support data findability. |
|
Level 2 |
Agency-Wide and Program-Specific: Efforts are underway to improve the ability of agency staff to find available data through one or more of the following activities:
|
|
Level 3 |
Agency-Wide: Standards and templates for data documentation are defined and one or more of the following efforts is underway:
Program-Specific: Standards and policies have been defined to ensure that there is metadata and data dictionary information available for each data set. Templates for describing data collection, updating and reporting processes have been developed and are starting to be utilized. |
|
Level 4 |
Agency-Wide: Standards and templates for data documentation are actively used and this documentation is kept up to date. Quality assurance processes are in place to ensure this documentation is complete and useful. Processes are in place to update catalogs and documentation when data sets are added, modified or discontinued. Program-Specific: Metadata and data dictionary information is available and up to date. Quality assurance processes are in place to ensure that this information is complete and useful. Processes are in place to make updates to metadata and data dictionary information when changes are made to the data. |
|
Level 5 |
Agency-Wide: The agency periodically evaluates opportunities to refine its approach to data documentation based on user needs and feedback, new technologies, and research into best practices. Program-Specific: Documentation of data sets is periodically improved based on feedback from users and research into best practices. |
Data principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, data sets are discovered primarily by word of mouth. As agencies move up the maturity scale, standard information is maintained and made available about what each data set contains – including the meaning of each data element. Providing an easily accessible catalog of data sets (or sources) adds value to existing data by promoting its re-use. It also minimizes the chances that duplicate data will be collected. Documenting the source and derivation of data elements also reduces risks associated with data misuse.
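A minimal catalog entry supporting the kind of discovery described above might look like the following sketch. The metadata fields and the sample datasets are invented for illustration; real catalog schemas carry many more elements.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Hypothetical data catalog record: enough metadata to find and assess a dataset."""
    name: str
    description: str
    steward: str
    source: str                      # where the data come from / how derived
    tags: list = field(default_factory=list)

def search(catalog, keyword: str):
    """Return entries whose name, description, or tags mention the keyword."""
    kw = keyword.lower()
    return [e for e in catalog
            if kw in e.name.lower()
            or kw in e.description.lower()
            or any(kw in t.lower() for t in e.tags)]

# Invented sample entries.
catalog = [
    CatalogEntry("bridge_inventory", "Structures and inspection history",
                 "J. Doe", "Bridge management system", ["assets", "bridges"]),
    CatalogEntry("traffic_counts", "AADT by route segment",
                 "A. Roe", "Continuous count stations", ["traffic"]),
]
```

Even this small structure demonstrates the payoff claimed above: recording the source of each dataset lets a prospective user judge fitness for purpose, and keyword tags make datasets findable without word of mouth.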
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Identify Standard Metadata Elements |
Convene a work group to review and recommend standard metadata elements to be provided at the dataset and data dictionary level. |
|
1 |
Scope a Data Catalog Effort |
Scope an effort to implement an agency data catalog that helps employees to identify, understand and access available datasets. Review available commercial offerings and agree on priority requirements. |
|
|
2 |
Create Base Metadata Templates |
Establish standards and supporting templates for documentation of data dictionaries, business rules, and supporting business glossaries. Establish data subject categories and other tags that can be used for categorization of agency data sets to help users identify and find data of interest. |
|
|
2 |
Compile Data Catalog Information |
Engage data stewards and subject matter experts in an effort to compile information about priority data assets. |
|
|
3 |
Share Data Catalog Information |
Provide access to the data catalog to enable employees and others as appropriate to browse and search for datasets of interest. |
|
|
3 |
Establish Data Catalog Update Processes |
Assign roles and establish business processes and training to review, maintain and update the data catalog information. |
|
|
4 |
Improve Data Documentation Practices based on Feedback |
Through interviews, focus groups or surveys, seek feedback from creators and users of existing data documentation products, and identify improvements in response to their feedback. |
|
|
4 |
Evaluate and Advance Metadata Repository Needs |
Evaluate the need for advanced metadata repository tools to store and access agency metadata documentation. Develop requirements documentation useful for procuring a commercial tool or developing a more robust metadata repository solution that meets the agency’s long-term needs. Fund system implementation. |
| 2.4 Data Backups and Archiving | |
|---|---|
|
Guidelines and procedures for protection and long term preservation of data assets. This sub-element assesses the extent to which active data sets are backed up, and inactive data sets are archived for future use as needed. |
|
|
Level 1 |
Agency-Wide: Data protection policies and procedures have not been specified or consistently implemented, as evidenced by one or more of the following:
Program-Specific: Backups of data sets are made on an ad-hoc basis. |
|
Level 2 |
Agency-Wide: Selected data protection policies and procedures are in place, as evidenced by one or more of the following:
Program-Specific: Backups of data sets are made regularly, but there are no written procedures on backup frequency or storage locations. Archive copies of data sets exist, but there are no written procedures on how to create these and how to retrieve them. |
|
Level 3 |
Agency-Wide: Basic data protection policies and procedures are in place, as evidenced by the following:
Program-Specific: There are written procedures available on backup frequency and storage locations. There are written procedures available on data archiving and retrieval. |
|
Level 4 |
Agency-Wide: Mature data protection policies and procedures are in place, as evidenced by the following:
Program-Specific: Backup procedures are consistently followed. Archiving procedures are consistently followed. Backup procedures have been fully tested. Archiving procedures have been fully tested. |
|
Level 5 |
Agency-Wide and Program-Specific: Data managers and stewards periodically review existing data backup and archiving procedures and update them as appropriate to reflect user feedback or changing needs. |
Data principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, backups and archiving are performed on an ad-hoc basis. As agencies move up the maturity scale, they develop and reliably follow guidance and procedures that specify what types of data will be centrally managed (e.g. stored in enterprise databases), how frequently backups will occur, where backups will be stored, and who is responsible for making and testing backups. In addition, there is a well-defined process for identifying which inactive or historical data sets should be archived, and what type of business user access to the archived information should be provided to meet business needs. Formalizing backup processes and verifying that they are working reduces the risk of data loss due to hardware failures and other sources of data corruption. Formalizing archiving processes allows agencies to identify where data sets can be retired in order to reduce data maintenance costs. It also ensures that business user needs are taken into consideration in determining appropriate archive methods.
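One concrete piece of "making and testing backups" is verifying that a backup copy is byte-identical to its source. The sketch below does this by comparing checksums; the approach is generic, and any file paths used with it are hypothetical.

```python
import hashlib

def file_checksum(path) -> str:
    """SHA-256 checksum of a file's contents, read in chunks to handle large files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_is_valid(source, backup) -> bool:
    """A backup copy is valid when its checksum matches the source's."""
    return file_checksum(source) == file_checksum(backup)
```

A checksum comparison only confirms the copy is intact; as the Level 4 criteria note, a full backup test also restores the data and has business users verify that the recovered data meets their needs.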
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Complete an Internal Practice Review |
Review and document internal data protection and backup practices. Document potential agency best practices for broader implementation. |
|
|
1 |
Verify Current Backup and Archiving Procedures (Program-Specific Action) |
Review current data backup and archiving procedures to verify that program datasets are backed up and archived on a regular basis. |
|
|
1 |
Complete an External Practice Review |
Evaluate national best practices and/or survey peer agencies to identify external agency practices for potential implementation. |
|
|
2 |
Establish Backup Policy and Practice |
Establish backup frequency, storage locations, data archival and retrieval procedures, and backup testing practices for enterprise databases. |
|
2 |
Migrate Desktop Applications |
Create a list of desktop applications with data of agency-wide value, and scope projects to shift these to enterprise applications. Identify opportunities to meet these needs by piggybacking on existing systems or planned system development projects. |
|
|
2 |
Document Backup Procedures (Program-Specific Action) |
Produce and/or update documentation of backup and archiving procedures, including information on backup and archiving frequency, storage locations and data retrieval or restoration processes. |
|
|
3 |
Implement Data Protection Policy |
Establish specific roles and responsibilities to ensure backup policies and procedures are consistently implemented across the agency. Provide appropriate training and resources to both technical and business staff to support full implementation. |
|
|
3 |
Establish Regular Backup Testing Practice |
Establish a practice of periodically testing backups to ensure that data can be successfully recovered, including verification by business users that the recovered data meets their needs. |
|
|
4 |
Implement a Regular Monitoring Process |
Establish processes to regularly and independently confirm that data protection practices are in conformance with documented policies and procedures. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate agency data protection methodologies and tools to identify and implement opportunities to improve outcomes. |
| 2.5 Data Change Management | |
|---|---|
|
Processes to minimize unanticipated downstream impacts of data changes. This sub-element assesses the extent to which procedures are in place to manage the process of making changes to data structures in one data set or system that may impact other systems or reports. |
|
|
Level 1 |
Agency-Wide and Program-Specific: There are no documented agency-wide best practices or guidelines for analyzing how changes to data structures, definitions or unique identifiers in one system may impact reports or other dependent systems. |
|
Level 2 |
Agency-Wide and Program-Specific: The agency recognizes the need for agency-wide data change management policy and guidance, and these are under development. |
|
Level 3 |
Agency-Wide and Program-Specific: A standard change management process has been defined for changes to data elements that may impact multiple systems. This involves consultation and communication with affected data owners and users, and propagation of the changes across databases as needed. Change analysis and propagation processes are mostly manual. |
|
Level 4 |
Agency-Wide: Change management processes are in place and functioning as intended. These processes are supported by integrated and/or automated analysis and tools, as evidenced by one or more of the following:
Program-Specific: A change management process is in place and functioning as intended. |
|
Level 5 |
Agency-Wide and Program-Specific: A periodic review is conducted of the nature and extent of data changes to improve future data architecture and change management practices. |
Data principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, changes to data structures, definitions or unique identifiers are made as needed without awareness of potential unintended consequences. Impacts may be discovered only when downstream applications or reports stop working as a result of changes made. As agencies move up the maturity scale, a more proactive approach is in place for anticipating downstream impacts of changes, communicating with data stewards of these downstream systems, and implementing changes in a controlled, automated and coordinated manner. Such an approach would focus particular attention on management of “master data” that exist across multiple agency systems. Putting proactive and robust change management processes in place helps to avoid business disruptions due to broken reports or queries. It also helps to avoid the introduction of inconsistencies in data structures and definitions across systems that can be a barrier to creating an integrated view of data.
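The metadata-driven flagging of affected systems described above can be sketched as a simple lookup from data elements to their downstream consumers. The element and system names here are invented; in practice this mapping would come from the agency's metadata repository.

```python
# Hypothetical metadata: data element -> systems and reports that consume it.
ELEMENT_CONSUMERS = {
    "route_id": ["pavement_mgmt", "crash_db", "annual_condition_report"],
    "project_id": ["financial_system", "project_dashboard"],
    "employee_id": ["hr_system", "payroll"],
}

def impacted_consumers(changed_elements):
    """Return the sorted set of systems/reports affected by changing the given elements."""
    impacted = set()
    for element in changed_elements:
        impacted.update(ELEMENT_CONSUMERS.get(element, []))
    return sorted(impacted)
```

Running such an analysis before a change is implemented gives data stewards of the downstream systems a concrete list of parties to consult, rather than discovering impacts when reports break.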
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Identify Data Change Issues |
Convene a group of data managers and stewards to discuss situations where unmanaged data changes created downstream problems. Document the issues and their consequences and use the examples to raise awareness of the need for data change management. |
|
|
1 |
Review Existing Data/System Change Management Processes |
Identify existing agency or program practices for managing changes to data and systems, such as change control boards and communication protocols. Determine whether existing processes might be adapted to cover data changes independent of system development or update efforts. |
|
|
2 |
Design a Data Change Management Process |
Design and document a data change management process defining how data owners/trustees, users, and other stakeholders are expected to be informed and engaged prior to implementation of data changes which may impact multiple systems or business areas. |
|
2 |
Pilot a Data Change Management Process |
Work with one or more data programs to pilot application of a standard data change management approach. Capture lessons learned and make adjustments as appropriate. |
|
|
3 |
Implement a Data Change Management Process |
Establish specific roles and responsibilities to ensure that the data change management process is consistently implemented. Provide appropriate training and resources to both technical and business staff to support full implementation. |
|
|
3 |
Automate Data Change Management |
Work in targeted areas to automate data change analysis and propagation processes - for example, to manage changes in reference or master data, or to apply available metadata to automatically flag systems and tables that contain data elements impacted by a proposed change. |
|
|
4 |
Implement a Regular Monitoring Process |
Establish processes to regularly confirm data change management practices are in conformance with documented policies and procedures. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate agency data change management methodologies and tools to identify and implement improvements. |
| 2.6 Data Delivery | |
|---|---|
|
Delivery of data to users in a variety of convenient, useful and usable forms. This sub-element assesses the extent to which data are delivered to end users in convenient forms that are suited to best meet business needs. |
|
|
Level 1 |
Agency-Wide: The agency has not pursued creating centralized data access, analysis, and reporting capabilities. Program-Specific: Data users can access a limited set of standard reports. Other reports are provided by request, but typically require specialist IT resources to implement. |
|
Level 2 |
Agency-Wide: The agency is exploring agency-wide needs and opportunities for improving access to integrated agency data in usable forms. Pilot initiatives may be underway; however, data reporting is typically accomplished in a decentralized fashion. Ad-hoc query tools may be available but are not widely used. Program-Specific: Program data users can access a variety of standard reports. Data program staff have the tools they need to respond to special data requests without requiring use of IT resources. |
|
Level 3 |
Agency-Wide: The agency has implemented enterprise solutions for data access, reporting, visualization and analysis (e.g., data warehouse, data marts, dashboards) to meet common needs across business units. These may include one or more of the following:
Program-Specific: Data users have self-service access to both standard and ad-hoc reports. |
|
Level 4 |
Agency-Wide: The agency has implemented enterprise data self-service solutions to allow employees to access, analyze, and report data to meet their specific needs. Example capabilities include that the agency has the expertise and tools to:
Program-Specific: Data are made available through a variety of formats and platforms (GIS portal, mobile devices, dashboards, etc.) to meet identified business requirements. Business needs for access to both “live” data and “frozen” or “snapshot” data have been addressed. |
|
Level 5 |
Agency-Wide: The agency has implemented a flexible architecture for reporting and mapping that enables them to easily add new data sources and enhance analysis capabilities in response to newly identified requirements. The agency also routinely improves data access and usability based on feedback from users and monitoring of the latest technological developments. Program-Specific: Data are made available outside of the agency via a statewide or national GIS portal or clearinghouse. Access to data is provided through a service or API. |
Data principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, data are collected or acquired without careful consideration of the wide range of potential uses, and the types of delivery formats that would best serve these uses. As agencies move up the maturity scale, the agency implements tools and processes to ensure that data are delivered in usable forms. This may involve a variety of techniques including data integration and transformation (e.g. to combine traffic and pavement data, or to aggregate financial transactions into meaningful categories), development of exception reports, use of GIS portals and business intelligence platforms, creation of open data feeds, etc. An emphasis on data delivery squeezes more value out of data investments by promoting data use and re-use. It also contributes to agency efficiency by reducing the need for time consuming data manipulation and custom report development.
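One of the transformations mentioned above, aggregating financial transactions into meaningful categories, can be sketched as follows; the categories and amounts are illustrative sample data.

```python
from collections import defaultdict

def aggregate_by_category(transactions):
    """Sum raw transaction amounts into report-ready category totals."""
    totals = defaultdict(float)
    for t in transactions:
        totals[t["category"]] += t["amount"]
    return dict(totals)

# Invented sample transactions.
transactions = [
    {"category": "maintenance", "amount": 1200.0},
    {"category": "construction", "amount": 8500.0},
    {"category": "maintenance", "amount": 300.0},
]
```

Delivering the aggregated view rather than the raw transaction file is what spares business users the time-consuming data manipulation the paragraph above describes.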
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Assess Internal Data Delivery Practices and Tools |
Work with key stakeholders to evaluate agency data delivery practices and tools, identifying technology gaps and unmet business needs. |
|
|
1 |
Complete an External Practice Review |
Evaluate national best practices and/or survey peer agencies to identify external agency data access and delivery practices for potential implementation. |
|
|
1 |
Create Standard Reports and Queries |
Create standard reports and make them accessible to program data users. Build standard queries to enable quick response to special data requests without requiring use of IT resources. |
|
|
2 |
Implement or Improve Enterprise Data Solutions |
Scope an effort to implement or enhance available enterprise solutions for data access, reporting, visualization and analysis (e.g., data warehouse, data marts, dashboards) to meet common needs across business units. |
|
|
2 |
Identify and Prioritize Data for Inclusion in the Enterprise Data System |
Work with stakeholders to identify and prioritize data for inclusion in the enterprise data system. |
|
|
2 |
Implement Self-Service Access to Program Data |
Develop an ad-hoc query and reporting capability to provide users with access to frequently requested information. |
|
3 |
Implement Enterprise Data Self-Service Solutions |
Scope an effort to implement or enhance enterprise data self-service solutions to allow employees to access, analyze, and report data to meet their specific needs. |
|
|
3 |
Develop BI/Data Mart Solutions |
Provide dedicated process and resources to request development of data mart solutions that allow employees to easily perform ad-hoc queries and reports without requiring specialized data analysis capabilities or tools. |
|
|
3 |
Develop a Mobile Data Access Solution |
Scope, implement and document standard approaches for accessing and updating agency data from mobile devices. |
|
|
4 |
Establish an External Data Publication Process |
Leverage available data repositories such as state or agency-level open data portals or GIS clearinghouses to make selected data available to the public. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate data delivery and access methodologies and tools to identify and implement opportunities to improve outcomes. |
| 3.1 Location Referencing | |
|---|---|
|
Common location referencing methods across agency data sets. This sub-element assesses the extent to which the agency has standardized methods for location referencing, including linear referencing for its road-related data sets. |
|
|
Level 1 |
Agency-Wide: The agency has not pursued creating a standard linear referencing system (LRS) for agency data. Program-Specific: Data sets including location elements cannot be spatially integrated with other agency data sets. |
|
Level 2 |
Agency-Wide: The agency is working towards establishing a standard LRS. However, most data sets that include location elements cannot be easily and accurately integrated with other agency data sets. Program-Specific: Representation of location information is in the process of being standardized. |
|
Level 3 |
Agency-Wide: The agency has developed an agency standard LRS and is pursuing one or more of the following implementation steps:
Program-Specific: New data sets that include location elements are collected using the agency’s standard LRS. |
|
Level 4 |
Agency-Wide: The agency has implemented a standard LRS, as evidenced by one or more of the following:
Program-Specific: Methods are in place and functioning to propagate changes in location referencing resulting from road network changes to business data sets. Methods are in place and functioning to translate between coordinate-based location referencing (e.g. latitude/longitude) and linear referencing (e.g. route-milepoint). |
|
Level 5 |
Agency-Wide: The agency has an advanced and adaptable location referencing approach, as evidenced by one or more of the following:
Program-Specific: Methods for propagating changes in location referencing resulting from road network changes are automated. Data owners/managers work closely with agency GIS staff and proactively work to improve their data sets’ consistency with agency-wide standards. |
Data principles: Valuable, Available, Reliable, Authorized, Clear, Efficient, Accountable
At lower levels of maturity, different data sets use different methods for location referencing, and standards for location referencing have not been established. This makes it difficult to reliably map information and to integrate different data sets that have a spatial component. As agencies move up the maturity scale, location referencing standards are developed and adopted; existing data sets are transformed as needed to use the standard referencing methods; and the standards are applied as new data sets are collected or acquired. In addition, a process is in place to propagate changes in linear referencing to the various data sets as road changes occur or as errors are corrected. Standardization and management of location referencing is an essential practice that allows agencies to visualize and integrate their data in an efficient manner.
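The translation between linear referencing (route-milepoint) and coordinates can be sketched as interpolation between route calibration points. The calibration data below are invented, and a production LRS service would use full route geometry and proper map projections rather than straight-line interpolation over latitude/longitude.

```python
# Hypothetical calibration points for one route: (milepoint, lat, lon).
ROUTE_CALIBRATION = [(0.0, 40.00, -89.00), (5.0, 40.05, -89.00), (10.0, 40.05, -88.90)]

def milepoint_to_coords(milepoint, calibration=ROUTE_CALIBRATION):
    """Linearly interpolate a (lat, lon) position for a milepoint on the route."""
    points = sorted(calibration)
    if not points[0][0] <= milepoint <= points[-1][0]:
        raise ValueError("milepoint outside route extent")
    # Find the calibration segment containing the milepoint and interpolate.
    for (m0, lat0, lon0), (m1, lat1, lon1) in zip(points, points[1:]):
        if m0 <= milepoint <= m1:
            f = 0.0 if m1 == m0 else (milepoint - m0) / (m1 - m0)
            return (lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0))
```

For example, milepoint 2.5 on this sample route falls halfway along the first calibration segment, interpolating to roughly (40.025, -89.0). The reverse translation, snapping a coordinate to the nearest route and milepoint, requires route geometry and is typically provided by GIS software.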
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Identify Current Methods |
Identify current methods in use for locating assets and activities that occur along the agency’s linear networks (highways, rail lines, etc.). |
|
|
1 |
Convene a Work Group |
Convene a work group to recommend an approach to standardizing the linear referencing system (LRS). |
|
|
1 |
Identify Inconsistencies (Program-Specific Action) |
Review current location referencing methods and data against those for other agency data systems and identify inconsistencies. Assess approaches and level of effort to bring methods in conformance to those used in other systems. |
|
|
2 |
Create a Standard Agency LRS |
Procure services and software (as needed) to create and maintain a standard agency LRS. Migrate existing data to the new LRS. Establish and apply LRS data quality metrics. |
|
|
2 |
Expand Use of the LRS |
Establish requirements and review processes to ensure that new data collection efforts use linear referencing methods that are compatible with the agency’s LRS. Create plans for how changes in the LRS will be kept in sync with stand-alone systems that maintain their own versions of the network (e.g. pavement management, maintenance management). |
|
2 |
Use Standard LRS (Program-Specific Action) |
Phase in use of the agency’s standard LRS for new datasets. |
|
|
3 |
Implement LRS synchronization processes |
Develop, test and implement processes to propagate changes in location referencing resulting from road network changes to business data sets. |
|
|
3 |
Implement LRS translation Services |
Implement services that enable translation of coordinate-based locations to/from LRS-based locations. |
|
|
4 |
Automate LRS synchronization processes |
Identify and pursue opportunities to further automate procedures to propagate LRS changes to disconnected systems. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate practices for managing location referencing and implement opportunities to improve outcomes. |
| 3.2 Geospatial Data Management (Agency-wide Sub-element) | |
|---|---|
|
Standardized approach to collection and management of geospatial data. This sub-element assesses the extent to which the agency has a standard approach to collection, management and integration of spatial data. |
|
|
Level 1 |
Agency-Wide: The agency does not provide enterprise-wide planning and support for management and integration of geospatial data. |
|
Level 2 |
Agency-Wide: The agency has designated responsibilities for enterprise-wide planning and support for managing geospatial data. The agency manages a collection of spatial data sets and makes them available for internal use. |
|
Level 3 |
Agency-Wide: The agency has written policies and standards that define how geospatial data are to be collected, stored, managed, shared and integrated with non-spatial data attributes. The agency includes consideration of spatial data in their information technology strategic plan (or equivalent) that identifies investment needs and priorities for hardware, software and data. |
|
Level 4 |
Agency-Wide: Agency geospatial data policies, standards, and guidance are well-understood and put into practice. For example, the agency has:
|
|
Level 5 |
Agency-Wide: Spatial data collection, management and visualization requirements are fully integrated within the agency’s information technology and data management planning and operational functions. The agency periodically reevaluates and updates its approach to geospatial data management to reflect changes in technology, data availability and cost, and user requirements. |
This sub-element assesses the extent to which the agency has a standard approach to collection, management and integration of spatial data.
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, a variety of methods may be used across the agency for collecting and managing spatial data. Hardware, software and services related to GIS are not standardized and are not well coordinated with “mainstream” agency functions for data management, reporting, integration or application development. As agencies move up the maturity scale, spatial data management and reporting/mapping come to be viewed as integral to the agency’s overall data management and delivery function. Standard methods, processes and tools ensure that GIS data are not managed in a silo but rather integrated with other agency business data. Training and support are made available to agency staff to ensure that they can make effective use of available data. Building a consistent agency approach to managing spatial data promotes efficiency in the use of hardware, software and staff expertise. It standardizes and streamlines data integration processes, reducing the need for time-consuming, repetitive tasks. It also ensures that a variety of data are spatially enabled in order to provide business value.
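One concrete form of the integration described above is joining non-spatial business attributes to spatial features through a shared identifier. The Python sketch below uses invented asset IDs, coordinates and ratings purely for illustration; in practice this join would typically run inside a GIS platform or spatial database, with the unmatched IDs feeding a data quality report.

```python
# Hypothetical join of non-spatial business records (bridge inspection
# ratings) to spatial features by a shared asset ID -- the kind of
# integration that agency geospatial standards are meant to enable.
features = {  # asset_id -> (lon, lat), e.g. from an agency GIS layer
    "BR-001": (-77.04, 38.90),
    "BR-002": (-77.01, 38.88),
}
inspections = [  # non-spatial business records keyed by the same ID
    {"asset_id": "BR-001", "rating": 7},
    {"asset_id": "BR-002", "rating": 4},
    {"asset_id": "BR-999", "rating": 6},  # no matching feature: flagged
]

def spatially_enable(features, records):
    """Attach geometries to records; report IDs that fail to match."""
    joined, unmatched = [], []
    for rec in records:
        geom = features.get(rec["asset_id"])
        if geom is None:
            unmatched.append(rec["asset_id"])
        else:
            joined.append({**rec, "geometry": geom})
    return joined, unmatched
```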
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Initiate Geospatial Data Management Planning |
Initiate enterprise-wide planning and support for management and integration of geospatial data; this includes outlining roles and accountabilities, as well as technical support for business units. |
|
|
1 |
Establish Staff Roles and Responsibilities |
Identify and document roles and responsibilities for geospatial data management. |
|
|
2 |
Create Geospatial Data Management Policies and Procedures |
Create written policies and procedures for collecting, storing, sharing and integrating spatial data. |
|
|
2 |
Integrate Geospatial Data Management Needs within IT Strategic Planning |
Integrate consideration of spatial data within IT strategic planning, including identification of needed investments in software, hardware, and data collection. |
|
|
3 |
Conduct Outreach |
Conduct outreach to ensure awareness and understanding of geospatial data policies and procedures. |
|
3 |
Train Data Stewards and Users |
Provide formal training to relevant staff on data standards and practices. |
|
|
4 |
Integrate Spatial Data Across all Relevant Applications |
Pursue efforts to fully integrate spatial data collection, management and visualization requirements within the agency’s information technology stack. |
|
|
4 |
Implement a Continuous Improvement Process |
Periodically reevaluate and update approaches to geospatial data management to reflect changes in technology, data availability and cost, and user requirements. |
| 3.3 Data Consistency and Integration | |
|---|---|
|
Standards and practices to ensure the use of consistent coding and common linkages so that different data sets can be combined to meet business information needs. This sub-element assesses the extent to which the agency manages database creation and application development processes in order to minimize duplication and ensure integration. |
|
|
Level 1 |
Agency-Wide: The agency has not pursued planning for data integration/linkage across business applications outside of the context of individual application development projects. Common code lists for use across applications have not been developed. Program-Specific: Data sets have not been reviewed to determine consistency with applicable agency or industry standards. |
|
Level 2 |
Agency-Wide: The agency has recognized the need to plan for data integration and linkage across business applications (outside of individual, project-level efforts). Enterprise planning efforts are underway, including one or more of the following activities:
Program-Specific: Cross-reference lists have been developed to allow for data to be used in conjunction with other data sets (e.g. state versus federal project ID). |
|
Level 3 |
Agency-Wide: The agency has begun to implement practices to ensure linkages across data sets and use of consistent coding, as evidenced by one or more of the following:
Program-Specific: Data sets/applications adhere to agency standard link fields that have been established to facilitate cross-system integration. Standard code lists are used within data sets/applications if they are available (e.g. city/county codes, organizational unit codes). |
|
Level 4 |
Agency-Wide: The agency has fully implemented practices to ensure linkages across data sets and use of consistent coding, as evidenced by one or more of the following:
Program-Specific: Procedures are in place to ensure that externally procured data sets and applications adhere to established data standards. |
|
Level 5 |
Agency-Wide: The agency has defined a “to be” data and system architecture to guide system addition, replacement, consolidations and updates. Processes and tools for master data management and data standardization are periodically reviewed and refined to reduce data duplication and inconsistencies and maximize data re-use. Program-Specific: Opportunities to improve data integration and consistency with other agency data sets are reviewed on an annual basis. |
This sub-element assesses the extent to which the agency manages database creation and application development processes in order to minimize duplication and ensure integration.
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, each new database and application development effort is implemented in isolation. Any efforts to ensure linkage with existing data are the result of individual development team initiatives rather than a standard agency process. As agencies move up the maturity scale, they proactively seek opportunities to ensure that different data sets can be linked. They manage the database and application development process to include an architectural review function that enforces standards and leverages common code lists and services. This approach minimizes data duplication and facilitates data integration. It also makes data maintenance more efficient by consolidating code lists and other shared data tables.
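The cross-reference lists mentioned at Level 2 (e.g. state versus federal project ID) are the simplest integration mechanism. The Python sketch below, with invented IDs and dollar figures, shows how such a table lets records from two systems be combined; master data management tooling generalizes the same idea across many entities.

```python
# Hypothetical cross-reference table mapping state project IDs to federal
# project IDs, enabling records from two systems to be combined.
xref = {"S-1001": "F-88-201", "S-1002": "F-88-202"}

state_costs = {"S-1001": 1.2e6, "S-1002": 3.4e6}        # state system
federal_oblig = {"F-88-201": 1.0e6, "F-88-202": 3.0e6}  # federal system

def combine(xref, state, federal):
    """Merge the two systems' records on the cross-referenced IDs."""
    return {
        sid: {"cost": state[sid], "obligated": federal[fid]}
        for sid, fid in xref.items()
        if sid in state and fid in federal
    }
```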
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Assess Data Integration Opportunities and Barriers |
Conduct an assessment of agency/program datasets to identify key integration points, identify data duplication and inconsistencies and identify opportunities for standardizing code lists across applications. |
|
1 |
Create Cross-Reference Tables |
Create cross-reference tables that enable matching of records from one data source to another. |
|
|
2 |
Create a Master and Reference Data Plan |
Create a plan and roadmap that identifies master and reference data entities and the systems that contain data about them, with a prioritized list of initiatives for creating consistent, authoritative master and reference data. |
|
|
2 |
Dedicate Staff to Data Integration |
Assign one or more staff to plan and carry out data architecture and integration activities, including monitoring of current and newly introduced datasets for inconsistencies. |
|
|
2 |
Standardize link fields and code lists (Program-Specific Action) |
Develop and implement a plan to improve consistency of program datasets through standardizing link fields needed for cross-system integration and using standard agency code lists. |
|
|
3 |
Create Data Element Standards |
Create standards for data elements that can be used to link different data sets (e.g. project IDs, agency IDs, asset IDs). |
|
|
3 |
Establish a Data Review Process |
Establish a review process for new data and datasets (including data obtained through external sources) to ensure that they adhere to established standards. |
|
|
4 |
Establish or Expand Enterprise Architecture Practices |
Establish, improve and maintain an enterprise architecture incorporating data, system and technology layers. |
|
|
4 |
Implement a Continuous Improvement Process |
Routinely review data integration tools and processes to identify emerging needs or potential solutions that optimize existing data management workflows. |
| 3.4 Temporal Data Management | |
|---|---|
|
Standardization of date-time data elements to enable trend analysis and integration across data sets that are collected and updated on varying cycles. This sub-element assesses the extent to which requirements for standardizing temporal data elements are considered to ensure that data representing different time periods can be combined as needed to support analysis. |
|
|
Level 1 |
Agency-Wide and Program-Specific: There are no standards, guidelines, or strategies in place regarding date and time-related data elements. |
|
Level 2 |
Agency-Wide and Program-Specific: There is a recognition of the need for standards, guidelines, or strategies to define and standardize date and time-related data elements and there are some common approaches and conventions in place. These may include:
|
|
Level 3 |
Agency-Wide and Program-Specific: There are documented guidelines for ensuring consistency in use of date-and-time-related data elements across data sets and applications. There are established methods for designing data sets to support trend analysis. There are established methods for creating data snapshots representing a specific point in time and for integrating across data sets to create a snapshot-in-time view. |
|
Level 4 |
Agency-Wide: There is consistent use of date-and-time related data elements across the agency datasets and applications. New use cases or requirements for temporal analysis, snapshots, or integration can be met without major changes to data structures or substantial new development effort. Program-Specific: Data user requirements for trend analysis, snapshots and other uses of temporal information can be met without major changes to data structures or substantial new development effort. |
|
Level 5 |
Agency-Wide and Program-Specific: Data user requirements for trend analysis, snapshots and other uses of temporal information can be met through largely automated processes. Agency temporal data requirements and related processes are periodically reassessed to identify opportunities for further improvement. |
This sub-element assesses the extent to which requirements for standardizing temporal data elements are considered to ensure that data representing different time periods can be combined as needed to support analysis.
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, treatment of temporal data elements is not standardized; each new database and application development effort determines its own formats and requirements. As agencies move up the maturity scale, they consider business requirements for time-based queries and trend analysis. Based on these requirements, they establish standards for temporal data elements (e.g. always record month and year so that data can be converted between calendar and fiscal years; always distinguish between the date a record was updated and the effective date of the observation). In addition, they establish processes to create snapshots of data sets that represent point-in-time conditions as needed for specific business purposes (e.g. safety analysis). Analogous to standardization of spatial referencing, a standard approach to temporal referencing is a foundational element that ensures different data sets can be integrated to provide business value. For example, both “when” and “where” are key questions for understanding cause-and-effect relationships among system performance, crashes and fatalities, asset condition, construction project completion, weather events, and land development.
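The distinction between a record’s effective date and its load date is what makes snapshot-in-time views possible. The following Python sketch, using invented condition records, shows a minimal point-in-time query of the kind the guidelines above support:

```python
from datetime import date

# Hypothetical condition records that distinguish the effective date of
# an observation from the date the record was loaded into the system.
records = [
    {"asset": "SEG-1", "condition": "good",
     "effective": date(2022, 6, 1), "loaded": date(2022, 7, 15)},
    {"asset": "SEG-1", "condition": "fair",
     "effective": date(2023, 6, 1), "loaded": date(2023, 7, 10)},
]

def snapshot(records, as_of):
    """Return each asset's latest observation effective on or before as_of."""
    latest = {}
    for rec in records:
        if rec["effective"] <= as_of:
            cur = latest.get(rec["asset"])
            if cur is None or rec["effective"] > cur["effective"]:
                latest[rec["asset"]] = rec
    return latest
```

Because the query keys on the effective date rather than the load date, a late-arriving correction lands in the right snapshot automatically.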
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Review and Assess Current Practices |
Review current naming conventions and practices for distinguishing different types of date-time elements (e.g. data collection dates, data loading or update dates, and data effective dates) |
|
|
1 |
Conduct Outreach |
Conduct outreach to share information about current practice and discuss both short and longer term ways to improve consistency in use of temporal data elements. |
See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
|
2 |
Develop Guidance |
Work with business unit representatives to identify key use cases (trend analysis, snapshots for combining data from different sources). Produce a report with recommended best practices for designing date-time fields that support business needs. |
|
|
2 |
Develop Date-Time Data Standards |
Create data element standards for a limited set of commonly used date-time data elements, specifying data element names, meanings, formats and usage notes. |
|
2 |
Modify Data and Processes to Meet Analysis Requirements (Program-Specific Action) |
Implement changes to program datasets (to add or modify temporal data elements) or data management processes (to create snapshots) so that data analysis requirements can be met without major changes to data structures or substantial new development effort. |
|
|
3 |
Develop Training Materials |
Develop training materials to raise awareness of the meanings of different temporal data elements. Include multiple examples of how these data elements can be tapped to meet different analysis requirements. |
|
|
3 |
Conduct Training |
Conduct training for data managers and analysts on the meaning and use of temporal data elements for analysis. |
|
|
4 |
Implement a Continuous Improvement Process |
Periodically reassess temporal data requirements and related processes to identify opportunities for further improvement. |
| 4.1 Internal Agency Collaboration | |
|---|---|
|
Collaboration across agency business units to leverage opportunities for efficiency in data collection and management. This sub-element assesses the extent to which there is collaboration and coordination across different organizational units on data collection and management. |
|
|
Level 1 |
Agency-Wide: Most data collection efforts in the agency are independent - there have been little or no efforts to coordinate across business units. The agency does not have information about the extent to which data are duplicated. Program-Specific: There have not been any efforts to coordinate data collection or management activities with other business units. |
|
Level 2 |
Agency-Wide: The agency is starting to identify cases of data duplication and discuss opportunities for coordinating data collection and management across business units (e.g., safety and asset management). Program-Specific: Opportunities for coordinating data collection and/or management activities with other business units have been discussed, but no action has been taken on this yet. |
|
Level 3 |
Agency-Wide: The agency has taken steps to reduce data duplication and take advantage of opportunities for data collaboration, as evidenced by one or more of the following:
Program-Specific: A specific opportunity for coordinated data collection has been identified and is being pursued. |
|
Level 4 |
Agency-Wide: Applicable policy and guidelines are being followed across the agency, resulting in routine, effective collaboration across agency business units to generate efficiencies in data collection and management. For example:
Program-Specific: Data collection is routinely coordinated with one or more other business units. |
|
Level 5 |
Agency-Wide: The agency periodically reviews its data collection programs to identify opportunities to leverage new technologies and externally available data sources. The agency regularly seeks opportunities to minimize or reduce redundancy in data collection, storage and processing and the agency monitors progress of related efforts and initiatives. Program-Specific: New internal agency partnerships on data collection and management are actively sought in order to achieve economies of scale and make best use of limited staff and budget. |
This sub-element assesses the extent to which there is collaboration and coordination across different organizational units on data collection and management.
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, data collection and acquisition efforts are planned and executed independently to meet the needs of different business units. Each business unit views the data it collects as “its own” and does not consider the possible value of sharing the data with others in the agency. As agencies move up the maturity scale, data collection efforts are coordinated across business units and data are shared. Data partnerships are encouraged and incentivized. New data collection technologies are pursued that can provide multiple types of data at once (e.g. video logs and LiDAR). In addition, business units work closely with the agency’s information technology group to take advantage of enterprise reporting and other data access platforms. A collaborative approach to data collection and management reduces duplicative efforts and prevents proliferation of multiple overlapping and inconsistent data sets.
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Identify High-Profile Data Duplication Issues |
Informally engage data and business leads to identify cases of data duplication and/or inefficiencies in existing data collection coordination. Discuss opportunities for improvement. |
|
|
1 |
Create an Agency Data Collection Plan |
Scope and carry out an effort to systematically identify opportunities for collaboration on data collection and acquisition activities and create an implementation plan for phasing in collaboration. |
|
|
2 |
Set Policy for Collaboration Prior to New Investments |
Set policies and processes for collaboration prior to new data collection investments. |
|
|
2 |
Conduct a Pilot Collaboration Project |
Identify an opportunity for collaboration across multiple business units on data collection within the agency. Scope and carry out a pilot effort. Document lessons learned for application to future collaboration efforts. |
|
|
3 |
Conduct Training and Outreach |
Provide appropriate training and dedicate time and resources to encourage business staff to collaborate on data collection. |
See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
|
3 |
Establish Regular Coordination Check-ins |
Target priority areas for coordination and establish regular check-ins for cross functional coordination and collaboration. |
|
|
4 |
Implement a Regular Monitoring Process |
Establish monitoring processes to ensure that data collaboration practices are consistent with documented policies and procedures. |
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate agency data collaboration methodologies and tools to identify and implement opportunities to improve outcomes. |
|
|
4 |
Conduct Outreach (Program Specific Action) |
Regularly reach out to other business units to discuss potential collaboration opportunities. |
See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
| 4.2 External Agency Collaboration | |
|---|---|
|
Partnerships with external entities to share data and avoid duplication. This sub-element assesses the extent to which the agency seeks out externally available data and develops data sharing arrangements and partnerships with external public and private sector entities. Such partnerships may involve the agency sharing its data with partners and/or obtaining data from partners. |
|
|
Level 1 |
Agency-Wide: The agency is not proactively partnering with external entities for data sharing, though individual business units may be providing data on request to partner agencies and requesting or obtaining data from them to meet specific needs. Program-Specific: Data sharing with external entities may be occurring to meet specific needs but an ongoing data sharing arrangement has not been established. |
|
Level 2 |
Agency-Wide and Program-Specific: Partnerships with other public and private sector organizations are being explored to share data on an ongoing basis. |
|
Level 3 |
Agency-Wide: The agency is proactively identifying opportunities for partnering on data collection, data sharing, and data management and has some partnering arrangements in place. Program-Specific: Data sharing arrangements are in place with external entities. |
|
Level 4 |
Agency-Wide: The agency has sustained partnerships with external entities involving regular update cycles. Program-Specific: Data sharing arrangements with external entities have been sustained over time (2+ years) and through multiple data update cycles. |
|
Level 5 |
Agency-Wide: The agency routinely reassesses its partnerships for continuous improvement. This may involve one or more of the following:
Program-Specific: New opportunities for data partnerships with external entities are actively sought. Staff liaison responsibilities for managing these external partnerships have been designated. |
This sub-element assesses the extent to which the agency seeks out externally available data and develops data sharing arrangements and partnerships with external public and private sector entities.
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, staff in different business units may seek out and acquire data sets from external entities on a one-time basis as needs arise. External requests for agency data sets are considered on a case-by-case basis. As agencies move up the maturity scale, data sharing agreements are developed as appropriate to make best use of both internal and external data resources. Rather than making or fulfilling a series of one-time, special data requests, regular processes are set up to share data on an ongoing basis. The agency provides self-serve access to key data sets for which there are frequent requests. A proactive approach to external data collaboration saves agency staff time in fulfilling data requests and gives the agency access to a richer pool of data at a lower cost than if it were to collect and manage the data in-house.
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Organize External Outreach |
Engage potential partner agencies to explore data sharing and/or data collection collaboration opportunities. |
See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
|
1 |
Complete an External Practice Review |
Evaluate national best practices and/or survey peer agencies to identify other agency external data collaboration practices for potential implementation. |
|
|
2 |
Organize Regular External Outreach |
Establish formal practices to engage potential partner agencies to explore data sharing and/or data collection collaboration opportunities. |
See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
|
2 |
Provide Data Sharing Agreement Templates |
Evaluate likely data sharing opportunities and draft template or example agreements to facilitate future collaboration opportunities. |
|
|
2 |
Pursue a Data Sharing Opportunity (Program-Specific Action) |
Negotiate a data sharing arrangement with an external partner. |
|
|
3 |
Establish Standard Data Sharing Options |
Identify and implement tools and methods for sharing data with external agency partners. |
|
|
3 |
Renew and Expand Existing Data Sharing Arrangements (Program-Specific Action) |
Check in with external agency partners to discuss opportunities for improvement or expansion to the existing data sharing arrangement. |
|
|
3 |
Capture External Data Sharing Lessons Learned |
Support reviews of existing data sharing arrangements to identify lessons learned for application in ongoing or future collaborations. |
|
4 |
Implement a Regular Monitoring Process |
Establish processes to regularly and independently confirm that data collaboration practices are in conformance with documented policies and procedures. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate agency data sharing methodologies and tools to identify and implement opportunities to improve outcomes. |
| 4.3 Public Data Sharing Policy and Guidance (Agency-Wide Sub-element) | |
|---|---|
|
Policies and guidance for sharing agency data externally. |
|
|
Level 1 |
Agency-Wide: The agency has not pursued establishing policies or guidelines about what data to make available to the public, or how to make it available to the public. |
|
Level 2 |
Agency-Wide: The agency is in the process of establishing policies or guidelines about what data to make available to the public, or how to make it available to the public. |
|
Level 3 |
Agency-Wide: External data sharing policies are developed and in place. Models and/or guidelines for external data sharing have been established. |
|
Level 4 |
Agency-Wide: External data sharing policies are implemented, understood, and followed when sharing and/or publishing data for public use. |
|
Level 5 |
Agency-Wide: The agency periodically seeks feedback from agency data producers, data managers, and external data users about its data sharing policies and practices and uses this feedback to guide improvements. |
This sub-element assesses the extent to which the agency has established policies or guidelines regarding sharing data with the public. Such policies or guidelines would encourage data sharing while ensuring that any data that are shared do not contain personal or sensitive data and meet readiness criteria based on documentation and data quality.
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, the agency may have difficulty meeting state-level open data requirements. In some cases, staff may be reluctant to share data that could be of value to the public. In other cases, data may be made available that are not properly documented or have errors that make them difficult or misleading to use. As agencies move up the maturity scale, there is a clear understanding of what data to share and how to prepare data for sharing with external parties.
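A publication readiness check of the kind implied by such policies can be partly automated. The criteria in the Python sketch below (a list of sensitive column names, presence of a data dictionary, a completeness threshold) are illustrative placeholders for whatever an agency’s policy actually specifies.

```python
# Hypothetical readiness check run before publishing a data set to an
# open data portal. The specific criteria are assumptions, not an
# agency standard.
SENSITIVE = {"ssn", "driver_license", "home_address"}

def ready_to_publish(columns, has_data_dictionary, pct_complete):
    """Return (ok, issues) for a candidate public data set."""
    issues = []
    leaked = SENSITIVE.intersection(c.lower() for c in columns)
    if leaked:
        issues.append(f"sensitive columns present: {sorted(leaked)}")
    if not has_data_dictionary:
        issues.append("no data dictionary")
    if pct_complete < 95.0:
        issues.append(f"only {pct_complete:.0f}% complete")
    return (not issues), issues
```

Running such a check as a gate in the publishing workflow addresses both failure modes described above: withholding shareable data and releasing data that is undocumented or error-prone.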
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Review and Assess Current Practices |
Review and document current public data sharing practices. Identify potential improvements to existing practices. |
|
|
1 |
Complete an External Practice Review |
Evaluate national best practices and/or survey peer agencies to identify external agency practices for potential implementation. |
|
|
2 |
Create Public Data Sharing Policies or Guidance |
Develop a policy and/or guidance identifying what data can be shared with the public, and processes to be followed for public data sharing. |
|
|
2 |
Document Public Data Sharing Models |
Document models and share examples of successful public data sharing practices and implementations. |
|
|
3 |
Establish a Standard Public Data Sharing Process |
Implement tools and processes to make data easily available outside the agency via an open data portal or clearinghouse. |
|
|
3 |
Conduct Outreach |
Build awareness of data sharing policy, guidance and processes through outreach activities. |
See NCHRP Web-Only Document 419: Implementing Data Governance at Transportation Agencies, Volume 2: Communications Guide |
|
4 |
Implement a Regular Monitoring Process |
Establish monitoring processes to confirm that public data sharing practices are consistent with documented policies and procedures. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish processes to regularly evaluate agency data sharing methodologies and tools to identify and implement opportunities to improve outcomes. |
| 5.1 Data Quality Measurement and Reporting | |
|---|---|
|
Metrics and reporting to ensure user understanding of current data quality. This sub-element assesses the extent to which data quality metrics have been defined and used to inform users about the level of currency, accuracy, coverage and completeness of a given data set. (Note that data reliability is considered to be related to accuracy and is not distinguished here as a separate characteristic. Data integrity, consistency and confidentiality are other important aspects of data quality that are considered as part of Assessment Elements 2 and 3.) |
|
|
Level 1 |
Agency-Wide: The agency has not pursued establishing agency-wide policy and/or guidance related to data quality measurement and reporting. Program-Specific: Data quality metrics have not been identified. |
|
Level 2 |
Agency-Wide: The agency is in the process of establishing policy and/or guidance related to data quality measurement and reporting. Some limited guidance and assistance may be available for specific purposes (e.g., to enable data integration or to provide location). Program-Specific: Metrics for data quality are being defined. |
|
Level 3 |
Agency-Wide: Standard agency data quality metrics, verification techniques and/or reports have been defined. Program-Specific: Metrics and standards for accuracy including location accuracy are defined and documented. Metrics and standards for timeliness and currency are defined and documented. Metrics and standards for completeness, including coverage of required entities/areas and inclusion of required attributes, have been defined and documented. |
|
Level 4 |
Agency-Wide: Standard data quality measurement and reporting policies and guidance have been adopted into the routine practices of the business. Program-Specific: Processes are in place to measure and document the level of accuracy, currency and completeness of the agency's data sets. Information about data accuracy, currency and completeness is routinely shared with users. Where data are based on sampling, information about confidence levels is made available to data users. |
|
Level 5 |
Agency-Wide: The agency proactively identifies new areas where common data quality metrics and reporting processes across data programs would be beneficial. Program-Specific: Data quality measurement processes, metrics and measurement techniques are periodically reviewed and refined as needed. |
This sub-element assesses the extent to which data quality metrics have been defined and used to inform users about the level of currency, accuracy, coverage and completeness of a given data set. (Note that data reliability is related to accuracy and is not distinguished here as a separate characteristic. Data integrity, consistency and confidentiality are other important aspects of data quality that are considered as part of Assessment Elements 2 and 3.)
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, there is a lack of awareness about the quality of different agency data sets beyond anecdotal information. As agencies move up the maturity scale, they measure and report on data quality using metrics reflecting key characteristics of concern to potential users. The agency provides standard definitions of different data quality metrics and models for how to measure data quality to facilitate application within different data program areas and enable data users to become familiar with a consistent set of measures. Providing data users with data quality metrics can help users to determine whether a data set is sufficiently accurate to meet their needs. It can also help to address lack of trust in data that users may have based on seeing a single example of an erroneous data element value. Finally, it can provide a basis for initiating data quality improvement efforts and tracking their progress. Data quality measurement can be costly, so it is important to focus on a few essential measures and take advantage of quality metrics that can be automatically generated (e.g. adherence to validation rules).
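The idea of automatically generating quality metrics from validation rules, as mentioned above, can be illustrated with a minimal sketch. The dataset, field names, and rules below are hypothetical examples, not part of the maturity model:

```python
# Minimal sketch: deriving data quality metrics (completeness, validity)
# automatically from documented validation business rules.
# All records, field names, and rules here are hypothetical.

records = [
    {"route_id": "I-95", "aadt": 45200, "year": 2022},
    {"route_id": "SR-12", "aadt": None, "year": 2022},   # missing value
    {"route_id": "US-1", "aadt": -50, "year": 2022},     # fails validity rule
]

# Business rules expressed as field -> predicate on a record's value.
rules = {
    "aadt": lambda v: v is not None and v >= 0,
    "year": lambda v: v is not None and 1990 <= v <= 2030,
}

def quality_metrics(records, rules):
    """Return completeness and validity percentages per field."""
    metrics = {}
    for field, rule in rules.items():
        values = [r.get(field) for r in records]
        complete = sum(v is not None for v in values)
        valid = sum(rule(v) for v in values)
        metrics[field] = {
            "completeness_pct": round(100 * complete / len(values), 1),
            "validity_pct": round(100 * valid / len(values), 1),
        }
    return metrics

print(quality_metrics(records, rules))
```

Because the metrics fall out of rules that already exist for validation, this style of measurement adds little ongoing cost, which is consistent with the guidance above to favor automatically generated metrics.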
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Convene a Work Group |
Convene a work group to explore developing and documenting a standard approach to data quality measurement and reporting. |
|
|
2 |
Define and Document Data Quality Metrics |
Define and document a set of data quality metrics for application within the program area, to include accuracy, timeliness and completeness. |
|
|
2 |
Define a Standard Agency Approach |
Reach agreement on standard agency data quality metrics, verification techniques, and/or reports. |
|
|
3 |
Create Data Quality Dashboards |
Promote transparency in data quality measurement and reporting by publishing up-to-date information on data quality to user-facing data portals, including accessible visualizations of data quality measurements. |
|
|
3 |
Build Error Reporting into Data Systems |
Build functionality that allows data users to report errors and receive feedback on corrective actions. The error correction processes are also included in data quality performance measurements. |
|
|
4 |
Implement a Continuous Improvement Process |
Periodically review data quality measurement and reporting processes and identify improvements. |
| 5.2 Data Quality Assurance and Improvement | |
|---|---|
|
Practices for improving quality of existing data and assuring quality of newly acquired data. This sub-element assesses the extent to which the agency pursues a systematic and proactive approach to data quality assurance and improvement. |
|
|
Level 1 |
Agency-Wide: Data quality is assessed and improved in the context of individual data programs. No agency-wide support is provided. Program-Specific: Data quality is addressed primarily on a reactive basis in response to reported issues. There is no standard approach in place for quality assurance for new data collection and acquisition. |
|
Level 2 |
Agency-Wide: The agency recognizes the need for agency-wide data quality assurance and improvement policy, guidance, and support. Limited technical assistance is available for data program managers or business units on fundamental data quality concepts and practices. Program-Specific: There have been some efforts to work with data users to proactively discuss and define data quality requirements. Standard practices are being defined to ensure quality of data collected or acquired. |
|
Level 3 |
Agency-Wide: The agency has established policy, guidelines, and support for data program managers or business units to apply data quality assurance and improvement practice. These may include one or more of the following:
Program-Specific: Standard, documented data quality assurance and improvement processes are defined. Business rules for assessing data validity have been defined. Specific guidance and procedures for data collection and processing are routinely provided to ensure consistency. A formal process for data certification and acceptance has been defined - including provision for correcting or re-collecting data when they do not meet minimum requirements for accuracy. |
|
Level 4 |
Agency-Wide: Standard data quality assurance and improvement policies, practices, and support are adopted and implemented by the business, as evidenced by one or more of the following:
Program-Specific: Standard, documented data quality assurance processes are routinely followed. Data collection personnel are trained and certified based on a demonstrated understanding of standard practices. Business rules for data validity are built into data entry and collection applications. |
|
Level 5 |
Agency-Wide: The agency periodically reviews and refines its data quality assurance and improvement policies, guidelines, support programs and tools. Program-Specific: Data quality assurance processes are regularly assessed and improved. Data collection and acquisition practices are regularly reviewed to identify lessons learned and areas for improvement. Automated error reporting tools are available for data users. Data validation and cleansing tools are used to identify and address missing or invalid values. |
This sub-element assesses the extent to which the agency pursues a systematic and proactive approach to data quality assurance and improvement.
Valuable
Available
Reliable
Authorized
Clear
Efficient
Accountable
At lower levels of maturity, data quality is addressed as issues are reported. Staff responsible for initiating new data collection efforts do not have any standard agency guidance to follow for inclusion of data quality assurance practices in the effort. As agencies move up the maturity scale, data quality is addressed proactively, using multiple techniques. These include use of standard quality control and quality assurance processes for new data collection, development and application of data validation business rules, use of automated data cleansing processes to identify potentially erroneous data element values, and establishment of efficient error reporting and correction processes. Data quality improvement efforts need to be tailored to specific data types and collection methods. Appropriate application of data quality improvement techniques is important to ensure that data can be used as intended and can be used to produce reliable information that is valuable for decision making.
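Building validation business rules into data entry applications, with failures routed to an error-reporting and correction process as described above, can be sketched as follows. This is a minimal illustration under assumed conditions; the field names, rules, and queue structure are hypothetical:

```python
# Minimal sketch: applying data validation business rules at the point of
# data entry, with invalid records routed to an error-correction queue.
# Field names and rules are hypothetical illustrations.

def make_validator(rules):
    """Return a function that lists the fields of a record failing
    their validation rules (empty list means the record is valid)."""
    def validate(record):
        return [field for field, rule in rules.items()
                if not rule(record.get(field))]
    return validate

rules = {
    "route_id": lambda v: isinstance(v, str) and len(v) > 0,
    "aadt": lambda v: isinstance(v, int) and v >= 0,
}
validate = make_validator(rules)

accepted, error_queue = [], []
for rec in [{"route_id": "I-95", "aadt": 45200},
            {"route_id": "", "aadt": 45200},      # fails route_id rule
            {"route_id": "SR-12", "aadt": -1}]:   # fails aadt rule
    errs = validate(rec)
    if errs:
        error_queue.append((rec, errs))  # held for correction and follow-up
    else:
        accepted.append(rec)
```

Keeping the rules in one documented structure, separate from application code, makes it easier to reuse the same rules for batch cleansing and for the automated error reporting tools described at Level 5.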
| Level | Action | Description | Relevant Guidance Sections |
|---|---|---|---|
|
1 |
Convene a Work Group |
Convene a group of data stewards to share information on current data quality practices and improvement ideas. |
|
|
1 |
Compile and Share Resources on Best Practices |
Establish a resource list and share with data stewards. Identify internal and external data quality management experts who are available to provide support as needed. |
|
|
2 |
Create Guidelines and Templates |
Develop or adapt a standard data quality management approach and supporting templates. Create guidelines on expected data quality management practices to be followed by data stewards for the data resources under their authority. |
|
|
2 |
Procure Data Quality Tools |
Identify available tools supporting data quality management including profiling and automated application of business rules for validation. Procure tools and provide training on how to use them. |
|
2 |
Conduct Practice Reviews and Training |
Conduct a review of data quality management practices in use. Identify improvements and provide training and support for implementation. |
|
|
3 |
Create QA/QC Guidelines for External Partners |
Establish agency data intake/acceptance practices for data from local agency partners (e.g., MPOs, RPCs, municipal agencies) that require adoption of the same QA/QC practices as if the data were supplied by the agency itself. |
|
|
3 |
Document and Integrate Business Rules |
Document business rules for data validation and build them into data entry and collection applications. |
|
|
4 |
Implement a Continuous Improvement Process |
Establish continuous improvement processes for data quality and data quality management. |