As noted above, numerous federal agencies perform formal independent assessments of major programs and use well-defined procedures in these reviews.1,2 The agencies’ processes often start from a generic set of principles and procedures and then develop into a tailored process that reflects the unique nature of the organization or program being reviewed. Our purpose in reviewing the practices of other agencies is to identify useful guiding principles that could underpin a DoD-unique evaluation process.
Federal program evaluation processes are often based on a set of core principles that are consistent across agencies. A good example is NSF’s evaluation policy, which was updated in 2020.3 NSF notes that assessments can have different purposes, such as monitoring progress, guiding improvement efforts, or determining the effectiveness or efficiency of a program. The five evaluation principles in the policy, along with some relevant text from the policy sections, are reproduced here:
___________________
1 Government Accountability Office (GAO), 2021, Technology Assessment Design Handbook, GAO-21-347G, February, https://www.gao.gov/products/GAO-21-347G.
2 See National Science Foundation (NSF), 2020, Business Systems Review (BSR) Guide, NSF 21-046, October, https://www.nsf.gov/pubs/2021/nsf21046/nsf21046.pdf.
3 See NSF, 2020, “Evaluation Policy,” September, https://www.nsf.gov/od/oia/eac/PDFs/nsf_evaluation_policy_september_2020.pdf.
4 Ibid.
In addition to these core principles, some agencies have published detailed guidance to support the development and execution of evaluation processes. For instance, NSF has a guide to support major facility program reviews;5 likewise, the National Aeronautics and Space Administration (NASA) has the NASA Standing Review Board Handbook for mission reviews.6 These are broad guideline documents intended to cover a wide range of agency programs. Both documents note that a specific program review must be tailored, drawing on the larger set of evaluation processes and related metrics, to the unique attributes of the program being evaluated. It is useful for DoD’s MII evaluation process to consider some of the common themes from these guides.
The first important consideration in the evaluation process is the selection of the evaluation team. The review team leads are selected based on their recognized leadership in the community and their relevant expertise in the technology area. The review team lead is also responsible for planning and scheduling review activities as well as for assuring consistent implementation of agency policies and processes in the evaluations. NASA suggests a separate review manager, who works with the review team lead, to address these additional responsibilities. The review team lead is responsible for developing a candidate list of team members. The evaluation process requires a diverse team of experts with knowledge of all of the essential elements of the program. Depth of knowledge in the technology area is important, but capabilities in management, programmatics, testing, or integration might also be relevant. NASA suggests that a review team that supports all of the formal evaluations of a program over its life has benefits in terms of continuity and familiarity with the program’s purpose and history. A suggested best practice is to keep the team to the minimum size required to cover all important aspects of the evaluation.
Any evaluation process needs to be tailored to the specific characteristics of the program being reviewed. NASA’s guide mentions the importance of factoring program maturity into the review process. Both NASA’s and NSF’s guides also stress the need for sensitivity to the cost and organizational burdens that the review process places on the program being reviewed.
Each of the guides divides review activities broadly into pre-site visit, site visit, and post-site visit phases. The pre-site visit phase is where all of the information necessary to support the evaluation process is gathered. For both NSF and NASA, the pre-site visit activities start many months prior to the site visit: NSF’s guide mentions activities starting 4 months in advance, while NASA’s begin earlier, 6 months before the site visit. The earliest activity in the NASA process is a planning session between the review team leads and program leadership, which sets the expectations for the data and information that will be required in the review process. NASA has formal data delivery requirements set at 60 and 20 days before the site visit.
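As an illustrative aid only, the following Python sketch shows how such pre-site visit milestones could be placed on a calendar by counting back from the site-visit date. The 60- and 20-day offsets reflect the NASA requirements noted above; the 180-day planning-session offset and the milestone names are hypothetical placeholders.

```python
from datetime import date, timedelta

# Illustrative pre-site-visit milestones, expressed in days before the
# site visit. The 60- and 20-day data deliveries reflect the NASA
# handbook; the other entries are hypothetical planning placeholders.
MILESTONE_OFFSETS = {
    "planning session with program leadership": 180,  # roughly 6 months out
    "first formal data delivery": 60,
    "final formal data delivery": 20,
}

def milestone_schedule(site_visit: date) -> dict[str, date]:
    """Back-compute a calendar date for each milestone from the site visit."""
    return {name: site_visit - timedelta(days=offset)
            for name, offset in MILESTONE_OFFSETS.items()}

if __name__ == "__main__":
    for name, due in milestone_schedule(date(2021, 9, 15)).items():
        print(f"{due.isoformat()}  {name}")
```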
A related activity is the development of, and agreement to, the terms of reference (TOR) for the evaluation. The review team lead is responsible for drafting the TOR, which covers the nature, scope, schedule, and ground rules for the review. The team lead then works with the government convening authorities and the program leadership to collaboratively develop a TOR that meets the agency’s assessment expectations. NSF’s process also includes formal scheduling, planning, and scoping activities. As the site visit draws near, the review team and the program leadership hold coordination meetings and review progress in assembling the supporting information and data needed for the formal evaluation. The site visit is typically 1 to 3 days in length, depending on the size and complexity of the program. The review team lead is responsible for developing the agenda in concert with the program leadership. During the pre-site visit phase and at the site visit, the review team members—both individually and as a group—perform an assessment of the program’s status against a set of core requirements or strategic criteria. The NSF guide provides a comprehensive catalog of assessment questions organized in focus area modules to support the activities of the review team. The team uses the set of modules and questions to tailor the assessment to the specific program.
___________________
5 See https://www.nsf.gov/bfa/lfo/docs/BSR_Guide_October2020_Draft.pdf.
6 NASA, 2014, NASA Standing Review Board Handbook, NASA/SP-2014-3706, REV A, https://ntrs.nasa.gov/citations/20140008530.
In the NASA process, the team develops an assessment of performance against a set of criteria and reports status as successful, partially successful, or unsuccessful.
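As a purely illustrative sketch (not part of either agency’s documented process), the following Python fragment shows one way a review team could record NASA-style criterion ratings and tally them; the criterion names and ratings are hypothetical.

```python
from collections import Counter
from enum import Enum

class Rating(Enum):
    SUCCESSFUL = "successful"
    PARTIALLY_SUCCESSFUL = "partially successful"
    UNSUCCESSFUL = "unsuccessful"

# Hypothetical criteria and ratings; a real review would use criteria
# tailored to the specific program under evaluation.
assessments = {
    "technical maturity": Rating.SUCCESSFUL,
    "schedule": Rating.PARTIALLY_SUCCESSFUL,
    "cost": Rating.SUCCESSFUL,
    "risk management": Rating.UNSUCCESSFUL,
}

# Summarize how many criteria fall into each status category.
tally = Counter(assessments.values())
for rating in Rating:
    print(f"{rating.value}: {tally[rating]}")
```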
The post-site visit activities include report generation and briefings, which again are tailored for different types of programs. Interestingly, NASA’s process requires that the review lead provide a “Snapshot Report” 24 to 48 hours after the site visit. This one-page report summarizes the review team’s findings and discusses key issues and risks.
Another useful source of insights for the DoD MII evaluation process is the National Institute of Standards and Technology (NIST)-led Manufacturing Extension Partnership (MEP) program.7 MEPs are public–private partnerships (PPPs) dedicated to supporting small- and medium-sized manufacturing companies. MEPs and MIIs share a focus on the development and growth of manufacturing ecosystems. The MEP program has been successfully providing support to industry for more than 30 years. In 2018, the MEP Advisory Group created a Performance and Research Development Working Group, which focused on providing input and guidance to the MEP program on the performance measurement, management, and evaluation processes that support the MEP National Network. The objective of the activity was to improve center evaluation processes and approaches, to promote National Network learning, and to suggest additional data and analysis capabilities for the MEP centers. Some of the findings and observations of the group are of interest. First and foremost, the goal of the MEP evaluation process is to provide accountability to all of the program stakeholders and to demonstrate that the MEPs are making a meaningful difference to their clients and have a broader economic impact. The working group noted that the MEP has had a consistent focus on measuring performance and impact, which has driven improvements in the program and in the centers. A key component of the evaluation is the NIST MEP Client Survey, which captures relevant information from industry, such as sales, cost savings, and hiring. An IMPACT metrics report is also produced that assesses impact and market penetration, and a panel review process focuses on trends in center performance. This information allows center-to-center comparisons to be made and best practices to be identified.
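As a purely illustrative sketch, the following Python fragment shows how client-survey impact data of the kind captured by the NIST MEP Client Survey could be rolled up by center to support center-to-center comparisons; the field names and values are hypothetical and do not reflect the actual survey schema.

```python
from collections import defaultdict

# Hypothetical client-survey records in the spirit of the NIST MEP Client
# Survey; the fields and figures below are illustrative placeholders.
survey_responses = [
    {"center": "Center A", "new_sales": 250_000, "cost_savings": 40_000, "hires": 3},
    {"center": "Center A", "new_sales": 100_000, "cost_savings": 15_000, "hires": 1},
    {"center": "Center B", "new_sales": 400_000, "cost_savings": 90_000, "hires": 5},
]

# Roll up reported impacts by center to enable center-to-center comparison.
totals = defaultdict(lambda: defaultdict(int))
for response in survey_responses:
    for field in ("new_sales", "cost_savings", "hires"):
        totals[response["center"]][field] += response[field]

for center, impact in totals.items():
    print(center, dict(impact))
```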
Many studies over the 30-year life of the MEP program have examined the effectiveness of these PPPs, improvements to the MEP National Network, and improvements to the MEP evaluation processes. Although the MEP program has been in existence for a long time, the working group suggested areas for research to further refine and improve the assessment processes used for the MEPs in the future.
Even though the DoD MIIs have significantly different goals than the agency programs described above, the DoD 5-year evaluation of the MIIs can benefit from the insights and lessons learned provided by these guides and earlier studies. First, the selection of the review team lead is obviously critical: this individual is responsible for all of the planning, execution, and reporting steps in the evaluation and must make a substantial time commitment over the period of the evaluation process. Regarding the review team, the JDMC review team could benefit from members who have knowledge of the technical field, knowledge of the EWD needs in the field, knowledge of DoD’s needs in the technical field, knowledge of the ecosystem in the field, and knowledge of DoD’s management of similar major technology development programs. The Joint Defense Manufacturing Technology Panel (JDMTP) is an important source of manufacturing technology expertise in DoD and can support the MII reviews.
___________________
7 MEP Advisory Board Performance & Research Development Working Group, 2019, Performance Framework Final Report, https://www.nist.gov/system/files/documents/2019/05/08/report_wgperformanceframework_final.pdf.
Although there are many organizational similarities across the set of DoD MIIs, the individual MIIs have some unique characteristics, and the MIIs are at different levels of maturity (institute age). There will therefore be a need to develop a tailored evaluation process for individual MIIs. Collaborative planning activities and the development of a TOR-like agreement among the JDMC team lead, OSD ManTech, and MII leadership will enable the development of a transparent evaluation process that balances DoD’s needs for a strategic assessment against the time and cost resource requirements placed on the MII. The information and data provided prior to the site visit would include OSD ManTech’s perspectives on specific attributes of the MII being evaluated and the MII-prepared responses to the relevant metrics-driven questions developed by OSD ManTech. As with DoD’s reviews of UARCs and FFRDCs, the results of customer satisfaction surveys need to be included in such material. Because the MIIs are PPPs, the customers include both DoD organizations that have sponsored work at the MII and industry members and other stakeholders from the ecosystem in the technology area. Finally, the assessment of MII performance would include consideration of responses to evaluation criteria questions in the strategic areas of importance defined for the DoD MII review process as the four evaluation questions. This topic is discussed further in the section “Strategic Evaluation Criteria for DoD MIIs” below.
At the site visit, the team would interact with MII leadership and institute members and receive detailed briefings on technology developments and transitions, EWD programs, ecosystem development, and the status and direction of the MII business plan. The team will then meet to develop the set of findings and recommendations from the evaluation, and the team lead can provide a recommendation to OSD ManTech and OUSD(R&E) on the future of the MII. The reports developed by the review team will also be critical for the Department’s tracking of trends in the performance of individual MIIs over time.
Finally, as the JDMC MII evaluation process starts in FY 2021, it is expected that an effort will be made to capture lessons learned from the process so that improvements can be incorporated as the 5-year evaluations progress over time. Refinements to the team structure, the data and information sets and the timelines for their delivery prior to the site visit, and the reporting requirements can also be expected as the process matures. A formal meta-evaluation of the first two MII reviews planned for 2021 would be useful to the JDMC. This would be a collaborative effort among OSD ManTech, the JDMC, and MII leadership and would lead to a more effective and efficient evaluation process. In addition, the repository of 5-year evaluation results for all of the MIIs will enable the identification of lessons learned and best practices from individual MIIs, which can be incorporated into future reviews and can support general improvements across the set of MIIs.
The JDMC evaluation of the DoD Manufacturing USA institutes provides an opportunity to better ensure and document the value of the MIIs for DoD. The nature of the JDMC evaluation differs from the annual review of MII performance in that the JDMC has an opportunity to assess the broader impact and value of the institutes in meeting DoD’s mission needs in advanced manufacturing technologies. The metrics previously developed by DoD are very valuable to the JDMC evaluation process. Although the four-question evaluation framework is appropriate for accomplishing the strategic evaluation of the MIIs, there are a number of topics that should be considered to expand the scope of the questions to better reflect an evaluation process designed for the MIIs.
There are three chartering principles, as seen in Figure 3.1, that are common to all DoD MIIs:
The combination of these principles underscores the uniqueness of the MIIs, which necessitates the development of a similarly unique set of evaluation criteria. Assessments within these areas serve as the starting point for the evaluation. There are a number of considerations that will require the JDMC to tailor the MII evaluation to the specific institute being reviewed. DoD started nine MIIs in a wide variety of technical fields from 2012 to 2020. Within the set of MIIs, there are institutes that focus on a facility (or facilities) with advanced technology equipment and resident expertise (e.g., the Manufacturing Innovation Institute NextFlex), institutes that serve as convening bodies for expertise and capabilities distributed among their members (e.g., the Manufacturing Innovation Institute AM), and institutes that combine these characteristics. Over time, the institutes will evolve naturally from their initial, start-up phase to a more mature, execution phase. The technology readiness level (TRL) and manufacturing readiness level (MRL) of technologies will vary with the portfolio mix of breakthroughs, developments, applications, and new concepts from the pipeline of ideas coming to an institute. The number of transitions to DoD applications will also vary over time based on opportunities, viable solution candidates, and application funding. Demonstrations of expansion of the DoD supply chain or enhancements to supply-chain resilience would likewise increase over time. The EWD needs of DoD will evolve as technologies mature and are transitioned to DoD systems. As noted in the section “Evaluation Process for DoD MIIs” above, in anticipation of the MII evaluation, the JDMC team lead will need to work with OSD ManTech and MII leadership to identify the appropriate characteristics to be evaluated and the relevant metrics and discussion topics to be applied to the specific MII. As has been done by other agencies, DoD should have a broad set of evaluation topics within the framework that can be used to guide the design of an evaluation process for a particular MII. The evaluation questions, along with suggested additional critical topics for consideration in the JDMC evaluations, are shown below.
This question is the DoD’s assessment of the importance of the MII being evaluated in the context of DoD’s overall needs in advanced manufacturing technologies. MIIs have unique combinations of organizational attributes that provide advanced technologies, an ecosystem of importance to DoD and the nation, and a knowledgeable workforce in a specific manufacturing technology area. The programmatic focus of the MIIs is on advanced technology development and demonstration, with a concentration on programs at TRLs 4 to 7. MIIs typically engage in pre-competitive technology development projects and also perform technology demonstration projects. Other organizations within DoD focus on the development of technologies up to TRL 4 and on the transition to implementation at TRLs 7 to 9. The question of continuing need for the MII depends on both DoD’s long-term plans for the MII technology area and how the MII is integrated into DoD’s strategy for advanced technology implementation. As noted before, the MII assessment acknowledges that within the set of MIIs, technologies are at different maturities and at different levels of penetration into DoD programs and systems. Furthermore, MIIs are at different points within the start-up and follow-on stages when the 5-year reviews are performed. Therefore, a long-term view of the continuing need is required.
In these strategic reviews, DoD assesses the MII in the context of emerging needs of DoD agencies and military services. The JDMC review process will need input from these organizations and DoD stakeholders. Relevant topics to examine would include the following:
With this question, DoD will assess whether the organizational characteristics of the MII still afford the best approach to achieving DoD’s goals in a specific technology field. The unique organizational characteristic of the MIIs is that they are industry-led public–private partnerships, unlike other sources available to DoD. MIIs were chartered8 and organized for establishing and growing a manufacturing ecosystem, advancing research and technology, and securing human capital. These outcomes support OSD ManTech’s congressionally mandated mission to support the warfighter while also enhancing U.S. manufacturing base capabilities, expertise, and intellectual property.
___________________
8 As outlined in the MII strategy and assessment document MII Strategy, received from ManTech on November 9, 2020.
In assessing whether continuation of the MII is the best way to achieve these outcomes, the JDMC is expected to assess the strengths and weaknesses of available alternatives in the MII’s field of discipline. Relevant topics to examine would include the following:
We note that DoD has well-established best practices for the analysis of alternatives in the acquisition of defense systems.9 A much more streamlined process is called for in the 5-year review of an MII, focused on the decision of whether to continue an agreement that was considered the best alternative at the time of the original award. The primary emphasis should be on updating the identification and qualitative assessment of alternative sources to include both those originally considered and new alternatives that may now be available to DoD. Standard qualitative techniques such as strengths, weaknesses, opportunities, and threats (SWOT) analysis may be appropriate. If the qualitative evaluation determines that a viable competing source may be equally effective in meeting DoD needs, a more detailed quantitative analysis of costs and benefits is required.
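As a purely illustrative sketch of such an escalation from qualitative screening to quantitative comparison, the following Python fragment scores alternatives against weighted criteria; the criteria, weights, and scores are hypothetical placeholders that the JDMC would need to develop for the specific MII under review.

```python
# Minimal weighted-criteria comparison that could follow a qualitative
# SWOT screen. All criteria, weights, and scores below are hypothetical
# illustrations, not DoD-endorsed values.
CRITERIA_WEIGHTS = {
    "meets DoD technical needs": 0.4,
    "ecosystem / supply-chain benefit": 0.3,
    "cost to DoD": 0.2,
    "transition track record": 0.1,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (0-10 scale) into one weighted total."""
    return sum(CRITERIA_WEIGHTS[criterion] * value
               for criterion, value in scores.items())

alternatives = {
    "continue the MII": {"meets DoD technical needs": 8,
                         "ecosystem / supply-chain benefit": 9,
                         "cost to DoD": 6,
                         "transition track record": 7},
    "competing source": {"meets DoD technical needs": 7,
                         "ecosystem / supply-chain benefit": 5,
                         "cost to DoD": 8,
                         "transition track record": 6},
}

for name, scores in alternatives.items():
    print(f"{name}: {weighted_score(scores):.1f}")
```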
This question assesses the performance of the MII against DoD’s goals for the institutes. Consistent with DoD’s chartering principles for the MIIs, there are three components of the assessment—technology development, ecosystem development, and EWD. Since the formation of the institutes, DoD has had an active effort to develop and improve the metrics it uses to perform annual assessments of the MIIs. The metrics provided in the 2020 DoD Metrics List and Final Metrics List10 are of value to the 5-year assessment process in that they provide key inputs on trends in the MII. As a 5-year strategic review, the evaluation needs to examine trajectories toward a desired steady state for the MII. In some cases, this will entail trend analysis of the data collected by DoD and the institute; in other cases, additional topics that relate to DoD mission needs and national stature, as well as the future plans and directions of the institutes, need to be considered.
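As an illustrative sketch of the kind of trend analysis contemplated here, the following Python fragment (Python 3.10+) fits a least-squares line to 5 years of a single hypothetical metric; the metric name and values are placeholders for data that would be drawn from the DoD metrics lists.

```python
from statistics import linear_regression  # available in Python 3.10+

# Hypothetical annual values for one MII metric (e.g., technology
# transitions per year); real inputs would come from the DoD metrics lists.
years = [2016, 2017, 2018, 2019, 2020]
transitions = [2, 3, 3, 5, 6]

# Least-squares slope: the average change per year, a simple trajectory
# indicator for the 5-year strategic review.
slope, intercept = linear_regression(years, transitions)
print(f"trend: {slope:+.2f} transitions/year")
print(f"projected 2021 value: {slope * 2021 + intercept:.1f}")
```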
The defense manufacturing supply chain is critical to both the U.S. economy and national security.11,12
___________________
9 DoD Instruction 5000.84, “Analysis of Alternatives,” August 4, 2020.
10 As provided by ManTech through the Excel books: “Final DoD MII Performance Metrics_cleared” and “2020 MII Complete Metrics List_cleared,” both received from ManTech on December 11, 2020.
11 Executive Office of the President, 2017, National Security Strategy of the United States of America, Washington, DC, p. 55.
12 OSD A&S Industrial Policy, 2021, Fiscal Year 2020 Industrial Capabilities Report to Congress, January, https://media.defense.gov/2021/Jan/14/2002565311/-1/-1/0/FY20-INDUSTRIAL-CAPABILITIES-REPORT.PDF.
Support is needed for emerging technologies such as directed-energy weapons, hypersonics, quantum, and cybersecurity, all of which are vital to national defense.13,14 A strong manufacturing sector not only ensures a ready supply of defense and commercial goods and services but also ensures the integrity and safety of those goods, such as electronics and control systems.
Advanced manufacturing encompasses all aspects of manufacturing, including the ability to quickly respond to customer needs through innovations in product design, production processes, and the supply chain. As manufacturing advances, it is becoming increasingly complex and knowledge-intensive, relying on diverse data streams and partner networks to accelerate the pace at which new products and services are delivered.
Advanced manufacturing ecosystems can provide speed in technology development while enabling partner networks to co-innovate around common challenges.15 This ecosystem approach requires deliberate coordination among the various parties to solve shared challenges and meet shared objectives. In the case of smart manufacturing initiatives, the Deloitte and MAPI Smart Manufacturing Ecosystem Study identified four primary ecosystems that support advanced manufacturing16 (see Table 3.1).
Given the pace of change, maintaining vibrant and resilient manufacturing ecosystems is of paramount importance to U.S. national security and DoD. Hence, establishing and growing manufacturing ecosystems is one of the three chartering principles of the Manufacturing USA MIIs.
Since DoD MIIs vary in the nature, maturity, and intended use of their focus technology, each MII and its stakeholders need to identify the shared challenges and opportunities around which they wish to develop, grow, and maintain their ecosystem.
TABLE 3.1 Four Primary Advanced Manufacturing Ecosystems and the Focus Areas of Shared Objectives
| Advanced Manufacturing Ecosystems | Focus Areas of Shared Objectives |
|---|---|
| Supply Chain Ecosystem | Source raw materials, calibrate supply to demand, facilitate storage and distribution of finished product to customer. |
| Production Ecosystem | Make products that meet customer requirements, quality standards, and cost margins. |
| Customer Ecosystem | Connect and engage with customers, enable customers to order, maintain, and service products. |
| Talent Ecosystem | Create pipelines for skills and roles that are needed to support smart (advanced) manufacturing. |
___________________
13 Ibid.
14 See https://www.nist.gov/mep/manufacturing-infographics/defense-manufacturing-supply-chain.
15 Deloitte, 2020, “Accelerating Smart Manufacturing: The Value of an Ecosystem Approach,” October, https://www2.deloitte.com/us/en/insights/industry/manufacturing/accelerating-smart-manufacturing.html; Deloitte, 2019, “Manufacturing Goes Digital: Smart Factories Have the Potential to Spark Labor Productivity,” Deloitte and MAPI Smart Factory Study, https://www2.deloitte.com/us/en/insights/industry/manufacturing/driving-value-smartfactory-technologies.html.
As public–private partnerships, MIIs provide DoD with an opportunity to broaden and strengthen DoD’s supply chains for advanced technologies through the ecosystem. The character of the ecosystem that best suits the needs of DoD will depend on the specific technology field of the MII. Each MII therefore needs to describe the advanced manufacturing ecosystem that it envisions building, based on its vision of the needs of DoD and what will ultimately be acquired by DoD acquisition and sustainment from the manufacturing community. The MII also needs to explain its role in growing and improving the ecosystem and state why no other entities can play that role better than it can. The health of the MII’s ecosystem is therefore an important aspect of the value of the institute to DoD. Over time, it is natural for members to join and depart from the MII ecosystem based on their own interests. However, it is important for DoD to monitor support for the MIIs among industry representatives in the relevant fields. Strategically, DoD needs to continue to assess the health and growth of the ecosystem over time.
Possible assessment activities for health and growth of the ecosystem are as follows:
One important reason for DoD’s support of MIIs is that they offer effective delivery of manufacturing technologies that support DoD needs. The OSD ManTech metrics list contains the topics that document the contribution of the MIIs to technology development and transition. An additional topic relevant to MII performance is the efficiency of technology transition. Determining the rate-limiting steps in developing an advanced technology is critical to accelerating progress and maximizing impact. The MIIs are responsible for finding innovative approaches to technology development challenges. During an MII evaluation, a key question that can be addressed is, What is limiting the rate of maturation of the technology or of a resilient ecosystem for this advanced manufacturing field? Are there technology gaps, a lack of standards, inadequate equipment and facilities, a lack of qualified suppliers, insufficient demand, or an untrained workforce? Once these rate-limiting issues are identified, the evaluation can focus on the MII’s strategy for addressing them and assess whether adequate resources are appropriately prioritized and aligned to close the gaps efficiently and effectively. The 5-year evaluation would ideally assess the trends in the effectiveness of the MII’s technology development and implementation activities. Also, as the MIIs mature, the degree of engagement with DoD’s acquisition and sustainment communities and the opportunities for technology insertion onto DoD systems and platforms can be expected to increase.
Possible assessment activities include the following:
EWD has been a core mission of the MIIs from the outset, based on an understanding that new production technologies will not be adopted unless there is a workforce ready to apply them. Currently, MIIs collect institute-specific data on the numbers of those trained and reached in their EWD programs. While numerical data provide a useful metric for MII performance, the committee has conducted a detailed review of the kinds of EWD programs undertaken by MIIs and developed a list of “best practices” that various MIIs are pursuing. Although these best practices are less amenable to metrics, evidence that they are being applied represents an important indicator of program quality. It is also indicative of the reach, both potential and ongoing, of the MII’s EWD programs in improving workforce education and skills in the MII’s technology area. While no single institute is undertaking all of the program elements delineated below, the ability to perform a number of them well can be an important indicator of program strength. Questions seeking evidence from the MII of the application of best practices appear below, with a more detailed explanation of the need for and significance of each question set out in Appendix C.
Possible questions for information gathering on EWD issues are set out below, with indicators likely to be developed earlier in the MII’s term listed before later indicators:
Current and former MII stakeholders should be consulted in the development of this information. Supplemental material relevant to and discussing these questions appears in Appendix C.
As with Question 3, the performance of the MII in establishing and operating the institute has been reviewed on an annual basis by DoD. A section of the 2020 DoD Metrics List is directed at operations performance. The data collected from the annual reviews, along with the trends in those data, will be very useful to the JDMC evaluation committee performing the 5-year assessment. In addition, the JDMC review can address the MII’s progress toward achieving DoD’s strategic goals for the MIIs, as well as DoD’s overall mission needs for manufacturing technology. The examination of the adequacy of MII leadership, business plans, and processes in supporting the achievement of these mission needs is relevant to the effectiveness of the governance and management of the MIIs.
Possible assessments for governance and management are as follows:
TABLE 3.2 Summary of Goals and Expected Outcomes for Evaluation Criteria
| Question | 1. Is there a continuing need for the MII? | 2. Is an MII the best alternative? | 3. Has the MII performed well? | 4. Is the governance and management effective? |
|---|---|---|---|---|
| Goals | Assess the importance of the MII in the context of DoD’s overall needs in advanced manufacturing technologies. | Assess whether the organizational characteristics of the MII still afford the best approach to achieving DoD’s goals in a specific technology field. | Assess the performance of the MII against DoD’s goals for the institutes. | Assess the performance of the MII in establishing and operating the institute. |
| Expected Outcome | Long-term view of continuing need, in the context of emerging needs of DoD agencies and military services. | Evaluate strengths and weaknesses of available alternatives in the MII field of discipline. | Examine trajectories toward a desired steady state of the MII for (i) technology development, (ii) ecosystem development, and (iii) EWD. | Review trends in the annual operations performance metrics, progress towards achieving DoD’s strategic goals, and DoD’s overall mission needs for manufacturing technology. |
The goals and expected outcomes of the MII key objectives (advancing research and technology, establishing and growing a manufacturing ecosystem, and securing human capital) in the four critical areas of the evaluation framework are summarized in Table 3.2.
All of the topics in the section “Strategic Evaluation Criteria for DoD MIIs” above represent qualitative metrics that are directly relevant to the three key strategic objectives for the MIIs (advancing research and technology, establishing and growing a manufacturing ecosystem, and securing human capital).
In addition, the trend analysis mentioned above, which will be performed on the data collected over time by OSD ManTech and the institutes, will serve as a quantitative assessment of progress against these key objectives. These quantitative measures are described below.
As has been noted, OSD ManTech and the MIIs already measure and collect data on a number of metrics important to technology development and operations. These include the number of projects being executed, progress toward meeting technical objectives in a timely manner, and how many projects are started or completed over a given time period. The institutes also categorize these metrics in total, by non-DoD government entity, by DoD service, by state or local government, and by commercial entity. In addition to the metrics currently gathered, the committee suggests that the strategic assessment include the following: MIIs should develop a process to track the demonstrated impact of completed projects on cost, lead time, and/or platform performance, and MIIs should track their impact on the roadmaps for the various technologies they are tasked with and how these impacts fit within the larger community’s contributions. The additional metrics suggested for the research and technology objective are:
OSD ManTech and the institutes measure and collect data on a number of quantitative metrics relevant to their manufacturing ecosystems. These include the number of successful technology transitions, the number of patents filed and used, the number of members on funded projects, the ratio of member cash to in-kind contributions, the level of ecosystem participation, the number of DoD subject-matter experts engaged, and the number of states involved and the level of state contributions. For the long-term strategic evaluation, the committee suggests that each institute define the advanced manufacturing ecosystem that the MII and its stakeholders envision serving the needs of its community. The institutes should identify the ecosystem programs and partnerships that are required to establish robust and resilient DoD supply chains, and they should track and report trendlines in (1) the number and growth of successful technology transitions into relevant DoD industrial bases, (2) the growth in the number or percentage of the IP portfolio that contributes to the competitiveness of its ecosystem, (3) the number of MII programs focused on establishing robust and resilient DoD supply chains, and (4) the growth in the number and value of regional, state, and local economic development programs and partnerships focused on nurturing, growing, and maintaining a vibrant and resilient ecosystem. The institutes should address how the ecosystem metrics have progressed or changed over time to put in context their impact on technical development, talent development, production capability, standards development, regulatory requirements, customer engagement, and overall operations. The additional metrics suggested for the ecosystem objective are:
The institutes are currently required to collect quantitative data on such items as the number of students participating in institute projects, internships, or training programs; the number of participants completing institute-related certificates or apprenticeships; the number of teachers or trainers completing institute training programs; the number of EWD projects and their funding, distinguishing longer-term and shorter-term projects; funding to the institutes from outside sources for EWD; and the level and depth of learning experiences for the different EWD participants identified. In addition to the extensive qualitative metrics, the committee suggests additional metrics regarding whether those trained completed their programs and entered positions or performed work based on the training. The MII should be asked to develop trendlines of these data for the MII’s period of operations, applied against the workforce education goals set out in the MII’s skills roadmaps, to indicate progress in meeting the strategic, overall skills-training demands in the MII’s manufacturing technology sector. The additional metric suggested for the securing human capital (EWD) objective is:
When these additional assessment topics are taken into account, it becomes clear that the MII 5-year evaluation will need additional sources of input beyond those shown in Figure 2.2. The committee believes that the JDMC 5-year review would be significantly improved if inputs from the sources shown on the left in Figure 3.2 were obtained. Any data collection efforts will need to be designed in advance to adequately capture and evaluate quantitative data. The additions to the JDMC evaluation process are the following: