Achieving a unified digital enterprise across the Department of the Air Force (DAF) requires identifying and prioritizing common elements that drive digital transformation (DT) success. Currently, due in part to the relatively nascent status of enterprise-wide DT efforts, there is not a proven formula for achieving that transformation.
To address the challenge of identifying common DT success elements, the committee interviewed a variety of DT thought leaders and leaders of organizations engaged in enterprise-wide DT efforts. Each member of the committee then individually authored a “top 10” list of elements that they believed were critical to successful DT efforts. The individual lists, totaling more than 100 inputs, were collected in a common repository.
Because many of the inputs were similar in nature, the committee engaged in a series of categorizing and clustering iterations to abstract the most common elements. What emerged were five groups of common elements necessary for accelerating digital transformation: leadership and adoption culture, funding and incentives, software and data, hardware and secure information technology, and metrics.
Digital engineering (DE) and broader DT efforts within the U.S. Air Force and the U.S. Space Force are still in relatively early stages of implementation. Accordingly, data collection efforts are sparse, and definitive evaluation metrics are still emerging. Recognizing this, the committee does not present these common elements as a final or definitive solution. Rather, they form a framework built from months of interviews, expert insights, and comparative observations—an analytic lens intended to advance discussion, deepen understanding, and support future evaluation of DT efforts.
The committee acknowledges that social and organizational transformations differ significantly from laboratory experiments. What works in one setting (e.g., Miami, Florida) may not generalize to another (e.g., Green Bay, Wisconsin), and this contextual variability is an inherent challenge in studying DT within different organizations and at different organizational levels, from individual programs to the enterprise. Like other scientific progressions, the committee’s approach is iterative and is positioned not as an endpoint but as a step forward—a structured lens for identifying what supports or hinders DT outcomes across programs and environments.
At the same time, the evidence from the Northrop Grumman Corporation (NGC) Model 437 Vanguard (as discussed in Chapter 1) offers compelling proof that DT-enabled environments can accelerate development timelines, reduce costs, and improve quality. While enterprise-wide DT success remains an aspirational goal, the committee contends that the NGC 437 example provides meaningful evidence in support of these elements: a promising step toward greater understanding and a hypothesis worth refining, validating, and evolving.
Conclusion 3-1: The committee identified several high-priority common elements that may aid in the acceleration of digital transformation: leadership and adoption culture, funding and incentives, software and data, hardware and secure information technology, and metrics.
It is still too early in the history of DT evolution to consider these elements a definitively comprehensive view of DT effort drivers. At the same time, the committee believes that the data and experiences thus far support these elements as a framework for comparing the strategies and resources that contribute to—or detract from—a DT program’s success. This chapter provides insights associated with each of the elements. Because the elements overlap and influence one another, some repetition of concepts is unavoidable in describing them.
Additionally, Chapter 4 presents DT use case examples that provide greater specificity and context for these common elements. The evidence from the use cases shows how the element focus areas, when properly aligned, synergistically strengthen and accelerate DT efforts.
Across numerous interviews, DT experts told the committee similar versions of a common theme—DT adoption is not being hampered because of a lack of technical capabilities. Instead, human elements within DT—such as organizational resistance to change, limited resource commitments, existing organizational policies, and personal and organizational protection of intellectual property (IP)—are more often the causes of lower levels of DT adoption and slower DT progress.
Leaders can pair a clear vision with steady, persistent day-to-day focus to cultivate an organizational environment that embraces digital methods across its enterprise. Leaders can also actively champion the digital transition, utilizing structured change management strategies to address risk aversion and promote a “digital first” mindset across their organization’s members. By establishing clear objectives, aligning them with program schedules, and fostering support—or at least non-resistance—from all stakeholders, leaders can create the foundation necessary for sustained success in DT initiatives.
Centralized governance and leadership structures can drive alignment, resource allocation, and enforcement of a cohesive vision, as well as provide supportive policies and infrastructure. A centralized authority, such as an Office of Digital Transformation, can help unify standards, provide minimal yet enforceable guidance, and maintain consistency across programs. This governance model should support authentic adoption by the decentralized divisions and programs, accomplished through consistent shaping of organizational values, norms, policies, and resource decisions with regard to digital objectives.
At the same time, leaders must provide flexibility within their DT initiatives because organizations with a diverse set of programs and divisional units will not be able to find a “one-size-fits-all” set of digital tools or policies that will fit every occasion. A Pareto 80/20 model is suggested for initial consideration: Can program leaders work within their enterprise to find tools that can be used in common by 80 percent of all organizational members? This leaves flexibility at the program level to acquire tools specific to a client’s preference or development need. Rigid, constraining guidance from a centralized authority risks inhibiting DT progress or accelerating obsolescence, given the pace of rapidly evolving technologies and requirements. The DT goal is not only to adopt DE practices, but also to develop systems and processes that are resilient, flexible, and capable of meeting new challenges as they arise.
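One simple reading of the 80/20 question can be sketched computationally: given each program's tool needs, what fraction of programs could work entirely from a candidate common toolset? The program names, tools, and data structure below are hypothetical illustrations, not actual DAF systems.

```python
# Illustrative sketch of an 80/20 toolset coverage check.
# All program and tool names are hypothetical.

CANDIDATE_COMMON_TOOLS = {"requirements_mgr", "mbse_modeler", "data_vault"}

# Hypothetical map of each program to the tools its members need.
program_tool_needs = {
    "Program A": {"requirements_mgr", "mbse_modeler", "data_vault"},
    "Program B": {"requirements_mgr", "mbse_modeler", "cfd_solver"},
    "Program C": {"requirements_mgr", "data_vault"},
    "Program D": {"legacy_cad", "mbse_modeler", "requirements_mgr"},
    "Program E": {"requirements_mgr", "mbse_modeler", "data_vault"},
}

def coverage(common_tools, needs):
    """Fraction of programs whose needs are fully met by the common toolset."""
    covered = sum(1 for tools in needs.values() if tools <= common_tools)
    return covered / len(needs)

ratio = coverage(CANDIDATE_COMMON_TOOLS, program_tool_needs)
print(f"Coverage: {ratio:.0%}")  # 3 of 5 programs fully covered here: 60%
if ratio < 0.80:
    print("Below the 80 percent target; program-level flexibility is needed "
          "for the remaining programs.")
```

A fuller analysis would weight programs by headcount, since the text frames the threshold in terms of organizational members rather than programs.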
Cultural adoption requires a comprehensive strategy that combines training, collaboration, and ongoing policy and communication engagement. Leaders should help their organization’s members recognize that DT is not just a destination—it is a direction travelled with continuous steps of iteration and improvement. This includes providing digital literacy training to all personnel, fostering collaboration
among stakeholders internal and external to an organization, and equipping the appropriate organizational members with the deeper skills and tools necessary for a product’s successful development and sustainment across its DE life cycle.1
An organization may also enlarge its digitally savvy workforce by offering specialized career pathways for digital roles, closing skill gaps in critical areas through training and prioritized recruiting of digitally skilled members. Training and programs should focus on
Leaders should resource collaborative or stand-alone efforts that support skill development for these five roles.
To encourage DT adoption, DAF leaders may also implement agile contracting and milestone methods that allow for flexible, iterative approaches for DE test cases. Early adopters can be incentivized with access to resources and funding, enabling them to overcome initial technical barriers. Leaders must also provide stable, long-term funding to sustain DE efforts and to mitigate unexpected technical pauses during program execution. “Lessons learned” from the early DT efforts can serve as use cases that provide proof of concept to organizational members and also suggest steps for broader organizational adoption. Aligning digital objectives with acquisition schedules, maturity metrics, and key program documents can help create DT accountability while simultaneously providing a clearer roadmap for an acquisition program’s digital journey and the overall enterprise DT efforts.
Collaboration with industry, academia, and other government entities can also prove beneficial. Leaders should engage with industry early, sharing digitally based acquisition strategies and gathering feedback to refine approaches that optimize DE collaboration opportunities. Developing standardized frameworks, and sharing and training on these standards across industry, can encourage joint investments in digital technologies and strengthen partnerships.
Where possible and cost-efficient, digital ecosystem participation could also be grown by developing tools and standards that are accessible to smaller businesses. Creating a DE tool library could help smaller contributing organizations improve product quality and support the broader DE ecosystem while fostering
___________________
1 American Institute of Aeronautics and Astronautics Digital Engineering Integration Committee, 2025, “Digital Engineering Workforce Development: Challenges, Best Practices, Recommendations.”
innovation and competition. Knowledge management systems should also be leveraged to capture, organize, and disseminate information efficiently, including lessons learned from “blameless” postmortems of DE program efforts. Teams of government, academic, and industry members should collectively learn and foster a culture of continuous improvement.
Lastly, leaders may further foster a stronger DE culture through an effective, ongoing communication campaign. A steady communication campaign—encompassing speeches, town hall talks, articles, online resources, and media outreach—can demonstrate leaders’ commitments to DT and inspire confidence among stakeholders. Beyond merely telling an organization about DT, leaders may showcase their operational DT vision, using real-world examples to show how DE has transformed workflows, programs, decision making, and mission outcomes. Creating DE sandboxes for experimentation and collaboration can further engage personnel. Providing concrete examples of DE success and investing in low-risk experimental environments help communicate leader support of DT efforts in tangible ways.
The communication campaign and other DT efforts should not depend on any single leader but should be embedded within the organization’s long-term strategic vision and resourced priorities. A change in leadership may result in DT efforts being deprioritized in favor of other initiatives. To prevent this, organizations should institutionalize DT principles through policy frameworks, resource governance, and cultural reinforcement that extend beyond any individual leader’s tenure. DT initiatives must be integrated into acquisition strategies, training programs, and funding mechanisms so that momentum is preserved even if leadership priorities shift. A well-documented and consistently communicated DT roadmap, supported by measurable milestones and ongoing stakeholder engagement, can help provide continuity by making DT an organizational imperative rather than a leader-specific agenda item.
When leaders align DT priorities and provide structured support—such as training, collaboration environments, and clear communication—they position their organization to operate digitally and to drive DE and DT mission success.
Conclusion 3-2: Leadership and organizational culture play pivotal roles in the successful adoption of digital transformation (DT). Effective leaders embed DT into their long-term strategic vision and resourcing and integrate initiatives with acquisition strategies, training programs, and funding mechanisms. Ideal DT organizational culture combines training, collaboration, and ongoing policy and communication engagement.
Funding and incentives influence the successful adoption and scalability of DT initiatives. A well-structured funding strategy not only supports foundational
DE programs but also encourages DT across various organizational programs and levels. Incentivizing DE adoption, particularly within new programs, can create efficiencies in development and sustainment costs. Relatedly, DT financial models should align with long-term objectives, emphasizing stability, accountability, and a commitment to sustaining the DT effort and benefits. By centralizing some funding mechanisms while maintaining flexibility for program-specific needs, organizations can foster a culture of DE innovation while addressing unique challenges.
Agile, centralized funding models help organizations respond to evolving DT requirements. Early adopters of DE at the program level often face technical and operational barriers to implementing digital practices. Centralized funding can help those program leaders buy down risks and acquire resources that allow their DE programs to succeed.
Funding centralized DE tools can occur in many ways. For example, organizational leaders may choose to spend their annual “discretionary” funds on DE initiatives. Alternatively, leaders may choose to “tax” all of their programs’ budgets within their enterprise in order to fund a common set of DE tools that become available to all program members.
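The enterprise “tax” option can be illustrated with a short sketch that applies a flat levy to each program's budget and pools the proceeds for common DE tools. The tax rate, program names, and budget figures are invented for illustration only.

```python
# Illustrative sketch of the enterprise "tax" funding model.
# The rate and all budget figures are hypothetical.

TAX_RATE = 0.02  # assumed 2 percent levy on each program budget

program_budgets = {  # annual budgets in millions of dollars (hypothetical)
    "Program A": 120.0,
    "Program B": 45.0,
    "Program C": 310.0,
}

# Each program contributes proportionally to its budget.
contributions = {name: budget * TAX_RATE for name, budget in program_budgets.items()}
tool_pool = sum(contributions.values())

for name, share in contributions.items():
    print(f"{name}: contributes ${share:.2f}M")
print(f"Common DE tool pool: ${tool_pool:.2f}M")
```

A proportional levy keeps the burden roughly commensurate with program scale; fixed per-program fees are an alternative where budgets vary widely.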
Collaboration between centralized funding authorities and decentralized program managers can align incentives, which will encourage adoption of useful digital tools and processes at scale. Stable and sustainable funding mechanisms help encourage longer-term thinking. Without such sustained centralized incentives, program managers may be incentivized by shorter-term profitability and budget metrics for which they are accountable during a technology program’s milestone reviews.
By strategically aligning funding mechanisms and incentives with long-term DT goals, organizations can overcome early adoption barriers, foster innovation, and create sustainable pathways for achieving mission-critical efficiencies and capabilities.
Conclusion 3-3: Strategically aligned funding mechanisms and incentives encourage successful adoption of digital transformation (DT) initiatives at scale. The ideal mechanisms evaluate the return on investment for DT efforts, consider simultaneous impacts from centralized and decentralized funding authority, and incentivize feedback for continuous learning and accountability.
Collaborative, interoperable software and data standards form a cornerstone of foundational DT and DE implementation across programs and organizations. By prioritizing openness, flexibility, and resilience, a robust framework for software
and data standards can drive innovation, reduce costs, and accelerate program timelines while ensuring security and reliability.
Engineering practices have continued transitioning from document-based systems to model-based systems engineering (MBSE). The MBSE approach emphasizes the importance of digital models as a single source of truth for a program’s technical baseline, integrating data across all stages of the life cycle. A unified digital thread enables seamless transitions from concept to design, integration, production, operations, sustainment, and dismantlement. This digital continuity minimizes manual handoffs, reduces errors, and allows stakeholders from different organizations to work from a shared authoritative source of truth.
Standardization is a key to achieving interoperability and scalability. Organizations should adopt common engineering tools and standardized formats for digital artifacts, including both models and data. Data-sharing standards should favor the use of open, nonproprietary formats to reduce “vendor lock” and facilitate collaboration. Clear application programming interface definitions and modular open systems approaches further support this goal by enabling third-party adaptability and maintaining long-term maintainability. A unified digital taxonomy and standards-based model, such as a Systems Modeling Language (SysMLv2)-compliant framework,2 helps create consistency across programs and ensure scalability for future requirements.
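As a minimal illustration of the open-format principle above, the sketch below publishes a model artifact's metadata as plain JSON with explicit schema versioning, so any standards-compliant tool can parse it without vendor-specific software. The field names and schema are hypothetical assumptions for illustration and are not drawn from SysML v2 or any published standard.

```python
# Illustrative sketch: sharing digital-artifact metadata in an open,
# nonproprietary format (JSON). Schema and field names are hypothetical.
import json

artifact = {
    "schema_version": "1.0",             # explicit versioning aids long-term use
    "artifact_id": "wing-structure-model-042",
    "model_format": "application/json",  # open format; no vendor lock
    "source_program": "Program A",
    "lifecycle_stage": "design",
    "interfaces": [
        {"name": "load_cases", "format": "text/csv"},
        {"name": "geometry", "format": "model/step"},  # STEP: open CAD exchange
    ],
}

serialized = json.dumps(artifact, indent=2, sort_keys=True)
restored = json.loads(serialized)  # any standards-compliant parser can read it
assert restored["artifact_id"] == "wing-structure-model-042"
```

The round trip through a text format demonstrates the core benefit: no single vendor's tool is required to read, validate, or exchange the artifact's metadata.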
For effective adoption, digital models should also include comprehensive specifications, such as sourcing information for components, repair processes, and test scenarios. These models should be accompanied by robust documentation that supports compatibility testing and root-cause failure analysis. Government stakeholders should test model compatibility in unique scenarios to verify performance across diverse use cases. Additionally, digital models must evolve with each system update in order to remain accurate and relevant throughout the program life cycle.
Integration across a product’s life cycle is a “gold standard.” A robust digital thread that allows real-time updates among stakeholders stems from data consistency and traceability from a product’s design phase through its sustainment. This life-cycle alignment can still be applied to legacy systems with significant capability time remaining—selectively applying DE principles can optimize cost savings and scheduled upgrade activities.
Process re-engineering to common standards may be necessary for effective model adoption. Before implementing new tools or models, organizations should assess and refine their existing processes to align with digital objectives. This includes defining DE and systems engineering process descriptions, such as rules,
___________________
2 A. Ahlbrecht, B. Lukic, W. Zaeske, and U. Durak, 2024, “Exploring SysML v2 for Model-Based Engineering of Safety-Critical Avionics Systems,” In AIAA DATC/IEEE 43rd Digital Avionics Systems Conference (DASC), pp. 1–8, https://doi.org/10.1109/DASC62030.2024.10749311.
standards, and certification criteria. Certification frameworks should establish clear expectations for DE systems, with deviations requiring documented approval. Enterprise-wide configuration management is also essential and should include a documented roadmap for the organization’s evolving DE environment as well as continuity across programs.
During the early stages of an organization’s DT efforts, incremental and flexible adoption of DE capabilities can mitigate risks for adopters while also promoting enterprise-level scalability. Tools and methodologies should support diverse models and be designed for gradual integration into existing workflows. Collaborative digital environments with curated models and data can foster knowledge sharing and experimentation, accelerating learning and enabling incremental adoption. Models and data should align with findability, accessibility, interoperability, and reuse (FAIR) principles to further enhance accessibility and collaboration.
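A lightweight FAIR-alignment check might look like the following sketch, which flags metadata fields missing from a dataset record. The required fields chosen here are an illustrative assumption mapped loosely to the four FAIR principles, not a formal FAIR compliance test.

```python
# Illustrative sketch: flag metadata gaps against minimal FAIR expectations.
# The required fields are an assumption for illustration.

REQUIRED_FIELDS = {
    "identifier": "findable: a persistent, unique ID",
    "access_url": "accessible: a resolvable retrieval location",
    "format": "interoperable: an open, documented format",
    "license": "reusable: clear terms of reuse",
}

def fair_gaps(record):
    """Return the FAIR fields that are missing or empty in a metadata record."""
    return [field for field in REQUIRED_FIELDS if not record.get(field)]

record = {
    "identifier": "doi:10.9999/example",        # hypothetical identifier
    "access_url": "https://data.example.mil/ds/042",
    "format": "text/csv",
    "license": "",                              # reuse terms not yet specified
}

gaps = fair_gaps(record)
print("FAIR gaps:", gaps)  # -> ['license']
```

Automating even a simple check like this at data ingestion helps curated environments stay findable and reusable as contributions scale.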
Artificial intelligence (AI)-driven solutions also offer ways for advancing software and data standards. AI-enabled metamodels can enhance modeling and simulation, improving fidelity and reducing uncertainty at each level. These frameworks can automate aspects of data integration, translation, and validation, streamlining workflows and enabling real-time decision making. AI-powered tools can also assist with translation and interoperability functions, addressing challenges posed by differing data standards and proprietary tools.
Digital sustainment requires careful planning and integration with product life-cycle management (PLM). Sustainment plans should incorporate PLM requirements, integrating legacy data and ensuring alignment with life-cycle goals. This can lead to faster upgrades, improved maintainability, and reduced costs over the long term. These efforts should be underpinned by continuous design reviews that keep models accurate and aligned with evolving operational needs. A comprehensive verification strategy also supports the integrity and reliability of digital models by testing data, providing traceability, and integrating digital artifacts. Collaboration platforms as a part of PLM can accelerate change management and propagation as well as support virtual prototyping and testing.
The successful implementation of software and data standards requires a holistic, end-to-end approach. Aligning software and standards with open and interoperable objectives can lead to a more seamless DE life-cycle management while supporting mission success across programs and the enterprise.
Conclusion 3-4: Reliable and innovative digital efforts require collaborative, interoperable software and data standards driven by sustained consistency management of system and engineering artifacts with considerations for data ownership, security, and scalability.
The success of DT also requires robust hardware and secure information technology (IT) that support the seamless integration of tools, data, and stakeholders across a product’s entire life cycle. An open architecture promotes interoperability and smoother collaboration among integrators, vendors, and other program participants. A DT infrastructure should be planned to meet near-term operational needs and achieve long-term scalability.
Standardized configuration management supports effective IT infrastructure. By adopting common standards and commercial off-the-shelf solutions, organizations can minimize the need for costly customizations and reduce data translation efforts. Pursuing modular designs and plug-and-play capabilities further enhances reusability, creating “assembly line” efficiencies that accelerate development timelines and reduce costs. Balancing transparency and IP protection is also necessary. Stakeholders should have the relevant data rights from the outset of a program to prevent costly delays. Establishing a proactive IP strategy early can safeguard essential data while also promoting trust between the vendor and integrator. Additionally, program participants should have access to high-speed, secure IT capabilities for real-time collaboration and decision making, which reduces delays and improves outcomes.
Resilient and secure IT infrastructure is essential for supporting DE across all phases of a product’s life cycle, from model development to testing and sustainment. The infrastructure must enable efficient storage and sharing of data while adhering to multi-level security requirements. Connected systems introduce additional vulnerabilities, requiring enhanced security measures such as continuous monitoring, “black hat/white hat” testing exercises, and automated threat detection. Proactive cybersecurity strategies should protect the integrity of the digital thread and safeguard data storage from external and internal threats. These measures help maintain trust among stakeholders and support the longevity of DE initiatives.
Market research and planning enable leaders to identify requirements early. Infrastructure and tools should align with a product’s developmental and operational objectives, providing a secure foundation for DE initiatives as well as anticipating future evolutions. By investing in standardized, secure, and scalable hardware and secure IT, leaders can address risk management and interoperability challenges across product life-cycle stages and the enterprise.
Conclusion 3-5: Meeting near-term operational needs and achieving long-term scalability in digital efforts require standardized, secure, and scalable hardware and a resilient, secure information technology infrastructure. The ideal strategy for hardware and secure information technology includes a target architecture that accounts for scalability, existing and legacy infrastructure, sustainment, cybersecurity assessments, and stakeholder integration.
It is useful to think of metrics as values that are measured, predicted, and/or determined by analysis and that are used for comparing current best information against a plan or expected performance. To exploit metrics, an organization should have a DT plan that describes the scheduled transformations leading from the current state to the desired state defined by DT goals. Once the digitally transformed enterprise is fielded, metrics can be collected on how well the enterprise performs and, as important, what users think of it.
Integrating metrics into a DT plan enables stakeholders to identify and quantify the value of DT initiatives, including how product quality, security, and the pace of progress improve or decline. Metrics such as cycle times, milestone completions, or development timelines highlight how DT can accelerate processes compared to legacy systems. Data on defect rates, rework percentages, or customer satisfaction scores also help evaluate how DT initiatives improve the reliability and performance of products.
As DT efforts increase system connectivity and digital points of entry, reports on the number of detected vulnerabilities, response times to cyber incidents, and compliance with security standards may help quantify improvements in cybersecurity. Additionally, collecting metrics such as system uptime, recovery time objectives, or redundancy measures can show how properly implemented DE and DT enhance a mission while also being able to withstand disruptions. Although collecting mission data may present security or performance concerns, possibilities for automated collection and authentication from related DE activities present opportunities to improve mission-specific analysis. DT not only benefits from the collection of these metrics but enables it.
DT measures of effectiveness (MOEs) are associated with enterprise/mission goals and outcomes.3 DT measures of performance (MOPs) are associated with how well systems and processes are working and how efficiently resources are being used. Examples of relevant MOEs for DT include the following:
___________________
3 A.M. Madni and N. Noguchi, 2025, “Exploiting Augmented Intelligence in Realizing and Operating a Digitally Transformed Enterprise,” Paper presented at 2025 Conference on Systems Engineering Research.
The relevant MOPs for DT are as follows:
In sum, MOEs and MOPs are inter-related but distinct concepts. MOEs focus on the overall impact and achievement of objectives, while MOPs focus on the performance (or actions) of systems and processes taken to achieve those objectives. It is important to note that multiple MOPs often contribute to the achievement of a single specific MOE. In the same vein, MOEs contribute to evaluating the relevance, refinement, and modification of MOPs.
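The relationship described above, in which multiple MOPs contribute to a single MOE, can be sketched as a weighted aggregation of normalized MOP scores. The MOP names, targets, and weights below are hypothetical.

```python
# Illustrative sketch: rolling up several MOPs into one composite MOE score.
# All names, targets, and weights are hypothetical.

mops = {
    # name: (observed value, target value, weight, higher_is_better)
    "cycle_time_days": (30.0, 25.0, 0.4, False),
    "defect_rate_pct": (2.0, 2.5, 0.3, False),
    "model_reuse_pct": (60.0, 80.0, 0.3, True),
}

def normalize(observed, target, higher_is_better):
    """Score a MOP as observed-versus-target performance, capped at 1.0."""
    ratio = observed / target if higher_is_better else target / observed
    return min(ratio, 1.0)

# Weights sum to 1.0, so a composite of 1.0 means every target was met.
moe = sum(w * normalize(obs, tgt, hib) for obs, tgt, w, hib in mops.values())
print(f"Composite MOE score: {moe:.2f}")
```

A real framework would validate weights, handle missing observations, and revisit the MOP set as the MOE evolves; this sketch only illustrates the rollup direction of the MOP-to-MOE relationship.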
A comprehensive DT metrics framework should also include measures of program life-cycle dimensions, such as development and design verification. For example, quantitative success criteria could include metrics to measure progress on a DE platform/program (DEP). These metrics guide realistic schedules and milestones and provide clarity on the trajectory toward DEP goals. Establishing a constant feedback loop to monitor progress is equally important, enabling dynamic adjustments to address emerging challenges and opportunities.
Additionally, metrics should align with the principle of “starting small with the big picture in mind,” ensuring that early-stage measurements lead to scalable measures that inform broader organizational strategies. Metrics should also be used to support predictive analyses to inform program and procurement decisions. Because poor metrics could result in misinformed decision making, it is crucial to ensure fidelity when capturing and scaling early-stage measurements.
Organizations should integrate metrics with mechanisms for continuous oversight and evaluation. DE helps enable real-time performance tracking and feedback, improving oversight across all program stages. Key performance indicators and continuous performance tracking are effective tools to measure progress in areas such as cycle time, data alignment, and talent retention, enabling organizations to identify and scale successful practices. Moreover, an adoption facilitation cell or dedicated unit can play a pivotal role in advancing DT initiatives. By partnering with program management offices and program executive officers, such a cell can develop and refine metrics that build trust and momentum among all stakeholders. These metrics not only guide decision making but also foster a culture of accountability and transparency, ensuring that DT efforts deliver on their expected outcomes.
Metrics provide stakeholders with the tools to evaluate the effectiveness of programs using both subjective measures (such as customer survey ratings) and objective criteria (such as budget spend rates). This dual focus gives DT efforts a broad set of insights for understanding and analyzing progress toward goals.
Conclusion 3-6: An established robust framework of metrics assesses both qualitative and quantitative outcomes. Time horizon, measurability, relevance, diagnosticity, and feedback-driven adjustability are quality attributes of metrics. Measurability, relevance, and diagnosticity are important metric qualities, but they are not themselves metrics.
Recommendation 3-1: The Department of the Air Force Digital Transformation Office should establish initial sets of digital transformation metrics for legacy and new systems that are measurable, relevant, and enable diagnosis of state, status, and health. It should then continue to iterate, improve, and adjust these metrics at regular intervals.