Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.

4

DOD’s SBIR/STTR Processes

This chapter reviews the Department of Defense’s (DOD’s) processes for executing the Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) programs, including proposal solicitation, outreach, selection of awardees, and support for awardees during their participation, across the military services and component agencies that administer the awards.1 The chapter provides a detailed assessment of the overall application and award process, including a comparison of internal topic development with recent initiatives (most notably at the Air Force) to introduce an open topics approach for a subset of awards. The chapter also describes key domains in which change might be beneficial, particularly the administrative burdens associated with Foreign Influence Due Diligence, enhanced scrutiny of experienced firms, and issues related to contract types and to strictures concerning both minimum and maximum award sizes for Phase I and Phase II awards.

The principal sources of data for this chapter were discussions between committee members and SBIR/STTR program managers and staff from each of the DOD services and components that issue SBIR/STTR awards, based on a list of program managers for each military service or component provided by the DOD Office of Small Business Programs. A list of interviewees and their offices is provided in Appendix C of this report. These program managers oversee the processes and procedures in the programs and rely on technical experts within each service or component to determine specific research topics and monitor the small businesses’ performance. Each discussion followed a similar protocol (also found in Appendix C) and lasted approximately 60–90 minutes. Additional data and background information were obtained from DOD websites and from discussions with the Office of the Under Secretary of Defense for Research and Engineering (OUSD[R&E]), former DOD procurement and technical points of contact, SBIR and STTR awardees, prime contractors, investors, and other entrepreneurs (see, e.g., NASEM, 2024). The committee acknowledges that DOD’s SBIR/STTR processes and procedures may have changed during the course of this study. For example, program solicitations are offered monthly as of fiscal year (FY) 2025, and the central SBIR/STTR oversight office has been rebranded as the Office for Small Business Innovation.

___________________

1 As of 2025, 15 military services and component agencies offer SBIR/STTR awards (Air Force, Space Force, Army, Navy, Defense Advanced Research Projects Agency, Missile Defense Agency, Defense Health Agency, United States Special Operations Command, Defense Logistics Agency, Defense Threat Reduction Agency, Chemical and Biological Defense Command, National Geospatial-Intelligence Agency, Defense Microelectronics Activity, and Office of Strategic Capital, as well as the Office of the Secretary of Defense). However, this chapter focuses on the 10 services/components listed in Appendix C of this report.

A central finding of the chapter is that, although many broad rules and policies apply across the different DOD services and components, each has substantial autonomy (and initiative) in program emphasis and administrative orientation to best serve its mission. For example, program officers vary significantly in whether they prioritize enhancing the resilience and capabilities of the defense industrial base or enabling the introduction of novel technologies, whether they seek to leverage SBIR/STTR within their component to support agency modernization goals, and whether they are concerned primarily with satisfying their SBIR/STTR obligations in a compliant and responsible manner. These differences in program emphasis and administrative orientation offer insight into the types of outputs the programs produce across the different services/components, and they also convey the breadth of management styles and practices possible within the programs under current statute, regulation, policy, and practice.

ORGANIZATION OF DOD’S SBIR AND STTR PROGRAMS

Figure 4-1 is an organizational chart for DOD’s SBIR/STTR programs. While the autonomy and variation among services/components noted above is substantial, the OUSD(R&E) SBIR/STTR Office oversees and coordinates aspects of the programs across DOD. Specifically, that office oversees the DOD SBIR and STTR programs through the following activities:

  • Serves as the primary contact for Congress, the Small Business Administration (SBA), the Government Accountability Office (GAO), and the interagency SBIR/STTR community.
  • Publishes SBIR/STTR topics from across DOD through Broad Agency Announcements (BAAs) and Commercial Solutions Openings (CSOs).
  • Oversees the development, maintenance, and enhancement of the Defense SBIR/STTR Innovation Portal in collaboration with the participating DOD services/components.
  • Establishes and maintains a web presence where DOD and other government, industry, and academic personnel can find useful and relevant information about DOD’s SBIR/STTR programs.
FIGURE 4-1 DOD SBIR/STTR program organizational chart.
NOTE: Asterisks identify services and components that did not have a representative meet with the committee. GAO = Government Accountability Office; SBA = Small Business Administration.
  • Prepares the policy and guidance documents resulting from new mandates in legislation, such as the annual National Defense Authorization Act and the 2022 SBIR/STTR Extension Act. These documents include guidance on foreign risk management, open topics, increased minimum performance standards for experienced firms, multiple-award recipients, and more.
  • Meets regularly with SBIR/STTR program managers in the services and components to share challenges and ideas. In discussions with the committee, some service/component representatives mentioned holding monthly meetings to discuss challenges.
  • Coordinates with services and components to conduct program outreach and inreach across DOD, in addition to the outreach conducted by the individual programs. One DOD component representative described OUSD(R&E)’s outreach to traditional venues while that component focused on nontraditional venues.
  • Manages the Office of the Secretary of Defense (OSD)–level Transition and Commercialization Program.
  • Manages the execution of OSD’s SBIR/STTR extramural and administrative budgets.

Beyond these functions of the OUSD(R&E) SBIR/STTR Office, DOD’s SBIR/STTR programs are executed by individual offices and personnel across each of the services and components.

Many but not all DOD services/components participate in the SBIR/STTR programs. For example, agencies within the Intelligence Community (IC)2 are exempt from mandatory participation in the programs (see Box 4-1).3 The National Geospatial-Intelligence Agency (NGA) is the only IC member that participates in the programs, finding them valuable enough to warrant voluntary participation.4

Given the myriad objectives of DOD’s SBIR/STTR programs—enhancing the nation’s defense capability through innovation, building the defense-industrial complex, enhancing warfighting capabilities, increasing private-sector commercialization of federal research and development (R&D), and leveraging the inventiveness of the entrepreneurial ecosystem—SBIR/STTR program managers must balance, and resolve conflicts among, these objectives. Specifically, as described in more detail below, DOD executive officers and program managers orient implementation of the SBIR/STTR programs in the manner they deem best suited to their service’s/component’s unique mission and needs. The result is the differences in how the programs operate noted earlier. Some DOD SBIR/STTR programs operate at early-stage research levels (Technology Readiness Levels [TRLs] 1–3), while others operate at the levels of applied R&D (TRLs 4–6). Most of the programs manage portfolios with a range of readiness levels.

___________________

2 See https://www.dni.gov/index.php/what-we-do/members-of-the-ic.

3 The Intelligence Community is considered exempted from the requirement to execute SBIR/STTR programs under 15 U.S.C., Section 638(e)(2).

4 See https://media.defense.gov/2023/Aug/22/2003285640/-1/-1/0/OSD-NGA_SBIR_233.PDF.

BOX 4-1
Extending DOD’s SBIR/STTR Programs to Intelligence Agencies and the National Nuclear Security Administration

The National Geospatial-Intelligence Agency (NGA) is currently the only intelligence agency with dedicated SBIR/STTR programs, funding Phase I awards. The Office of the Under Secretary of Defense for Research and Engineering (OUSD[R&E]) provides the necessary funding for NGA Phase II awards. However, it is noteworthy that the Department of Energy’s National Nuclear Security Administration (NNSA), funded largely by DOD, and other intelligence agencies, including those within DOD, receive waivers from participation in the SBIR/STTR programs.

As the landscape of intelligence and nuclear security continues to evolve, shaped by emerging threats and cutting-edge technologies, small businesses in the commercial sector are increasingly at the forefront of innovation in this domain. An illustrative example is Sandia National Laboratories, a government-owned, contractor-operated entity under the Department of Energy that has become a significant recipient of DOD STTR funding, while also receiving extensive noncompetitively awarded funding from many organizations within DOD and the Intelligence Community (IC). This example underscores the overlapping missions of the DOD SBIR/STTR programs and those of the IC and NNSA.

Small businesses often have the agility and cost-effectiveness to deliver innovative capabilities more efficiently and at significantly lower cost than national laboratories and large defense contractors. Given that DOD is successfully using the SBIR/STTR programs to tap into small business innovation to support broad defense missions, it stands to reason that similar opportunities could be explored for intelligence and nuclear missions.

Program managers can be critical to the success of small businesses participating in the SBIR/STTR programs. They come from a variety of educational backgrounds, although, unlike their counterparts at other federal agencies, DOD SBIR/STTR program managers do not tend to have advanced technical degrees (only one program manager interviewed had a PhD in engineering). Instead, most have some management education; at the same time, there appears to be a lack of hands-on expertise with startups. This gap can limit the kinds of support program managers can provide to small businesses as they participate in the SBIR/STTR programs.

There also appears to be no standardized training for SBIR/STTR program managers across the services and components. Some services/components have thorough training for their program managers, while others have none, potentially contributing to inconsistent support for small businesses. Given the inexperience of SBIR/STTR program managers in working with small businesses, implementing standardized DOD training and sharing best practices focused on the needs of small businesses could enhance the overall effectiveness of the programs, equipping program managers with the skills necessary to better address small business concerns.

DOD’S SBIR/STTR APPLICATION AND AWARD PROCESS

To understand how DOD services/components implement the SBIR/STTR programs and how differences manifest across them, it is useful first to understand the overall process by which applications are solicited, awards are made, and projects may “transition” to follow-on funding and development. The committee’s analysis uncovered important differences in these processes across the DOD services/components.

Topic Development

The first step in the overall SBIR/STTR process is developing the topics that form the basis of the program’s solicitations. In most cases, a technical point of contact (TPOC) leads the development of an SBIR/STTR-specific topic and implementation of the solicitation, source selection, and program execution processes. The specific terminology for these processes varies across programs; in the Army, for example, topics are called projects, and technology broker teams lead their development, selection, and implementation.

Topic development typically involves contributions from military and civilian employees, who are encouraged to submit suggestions and ideas. In some instances, programs may seek input from other DOD agencies. Prioritization of these topics occurs through collaborative discussions within and across programs.

Topic development practices vary among participating DOD services/components. For example, the Defense Health Agency (DHA) actively solicits feedback from at least one military service. Representatives of several services/components, such as the Air Force and Missile Defense Agency (MDA), mentioned inviting prime contractors to provide input on potential topics. The Defense Advanced Research Projects Agency’s (DARPA’s) topic approval process involves discussions with peers and approval by DARPA’s deputy director.

While DOD laboratories are actively engaged in brainstorming ideas for DOD SBIR/STTR topics, other federal labs, such as those of the National Aeronautics and Space Administration (NASA), the National Institute of Standards and Technology, and the Department of Energy, are typically not included in the topic development process. Similarly, DOD labs are not asked to provide input into the topic selection of other agencies, even though they employ the largest federal technical workforce, have significant technical and engineering expertise, and possess a strong knowledge of potential military applications of commercial technologies.


The nature of topics can vary significantly. Some are highly specific and designed with acquisition in mind, targeting defined customers, with the intent of seamlessly funding projects from Phase I into Phase II and subsequent follow-on. MDA, for example, focuses on identifying mission capability gaps or technological needs. In contrast, the Defense Logistics Agency (DLA) derives its topics from immediate needs expressed by DLA personnel in the field, reflecting a more urgent acquisition focus on addressing current operational needs. Other topics are broader, aimed at exploration and learning, leading to numerous Phase I awards with less concern for direct transition into acquisition programs or operational use.

Topic development typically takes from 2 months for small organizations, such as the Chemical and Biological Defense Command (CBD) and Defense Threat Reduction Agency, to 3–4 months for larger services and components. The SBIR topic development cycle occurs every year, while the STTR cycle takes place every other year for some components, such as CBD.

While topic development has mostly followed the process described above, in June 2018 the Air Force’s innovation arm, AFWERX, began experimenting with an alternative approach to attracting proposals: open topics. Open topics solicit R&D proposals from companies addressing a critical technology area, rather than requiring companies to propose projects in response to technology-specific or mission area–specific topics developed by DOD services/components. According to the committee’s discussions with program personnel, the process was designed to

  • attract new small businesses,
  • deliver technology solutions faster,
  • give companies more flexibility in proposing solutions,
  • accelerate R&D, and
  • showcase commercial products that could be adapted for DOD.

Within the Air Force, the open topics process has largely replaced that service’s conventional approach of identifying specific problems and mission needs as the basis for solicitations.

One rationale cited for the use of open topics is to increase the number of new firms submitting SBIR/STTR proposals to DOD. Howell and colleagues (2025) found that the use of open topics in the Air Force increased the adoption of new technologies and attracted new firms to the defense industrial base. In its report AFWERX 2.0, the Air Force states that it added “more than 2,200 new companies to the AFWERX portfolio since the Open Topic approach launched” (AFWERX, 2022, p. 8).

Given the perceived salutary impact of open topics within AFWERX, Congress mandated in 2022 that SBIR/STTR programs at all federal agencies conduct at least one open topics competition annually. This requirement appears in Section 7 of the SBIR and STTR Extension Act of 2022 and was implemented in FY2023 for all DOD SBIR/STTR programs. The DOD-issued BAA notes that small businesses may submit only one proposal under each open topic solicitation (SBIR Program, 2024, p. 7). OUSD(R&E) created and disseminated guidance on SBIR/STTR open topics to provide a framework for meeting the intent of the statutory requirement while allowing flexibility for each service and component to structure its open topic process in a streamlined manner.5

The committee specifically considered the open topics approach first pioneered within AFWERX and since expanded across other services/components. In response to the open topics mandate and OUSD(R&E) guidance, each service and component implements the approach differently. Larger organizations, such as the Air Force’s AFWERX, reported significant benefits in identifying dual-use technologies for the warfighter (Howell et al., 2025), and AFWERX has increased the number of open topic solicitations each year, now issuing four—two for SBIR and two for STTR. Midsize programs, such as that of MDA, report that open topics cause some difficulty but offer the benefit of encouraging more nontraditional (to DOD) small companies to apply. One smaller component (CBD) had a similar experience in attracting a broader range of applicant firms, and it received roughly as many proposals per open topic as for traditional topics (12–30). As a result, it held two open topic competitions in 2024.

Smaller components raised concerns that the number of proposals received in response to open topic solicitations exceeded their capacity to review and evaluate them. They noted significant challenges with the open topic mandate, including difficulty finding technical evaluators, large volumes of proposals that are difficult to evaluate, and proposals incompatible with their organizational requirements and technology needs. Because proposals submitted under open topic solicitations are not always aligned with an existing defense technology gap or mission need, and therefore not naturally aligned with a transition partner, these programs may struggle to transition efforts from SBIR/STTR into their broader science and technology or acquisition programs. As a result, some components, such as DARPA, indicated a need to offer “tailored” open topics to limit the number of proposals, find appropriate reviewers, and meet agency needs.

In summary, open topics have the potential to increase the number of new firms submitting proposals to DOD’s SBIR/STTR programs. These solicitations appear to work better in the larger participating services/components, such as the Air Force. For smaller and more specialized agencies, the number and type of proposals can create a significant administrative burden for processing and review while yielding few proposals suited to their specialized technology needs.

___________________

5 Presentation to the committee by Matthew Williams, Department of Defense, on December 6, 2023, Washington, DC.


Outreach to Applicants

Whether a service or component uses a conventional topic or open topics approach, the ultimate impact of the SBIR/STTR programs depends on attracting high-quality applicants. Additionally, an explicit objective of the SBIR/STTR programs is fostering and encouraging participation in technological innovation by socially and economically disadvantaged small businesses and those that are 51 percent owned and controlled by women. Similarly, DOD has a stated goal of expanding its small business, nontraditional defense industrial base.6

Outreach within the DOD SBIR/STTR programs occurs on both the external and internal fronts; some examples are presented in Box 4-2. External outreach efforts focus on seeking new applicants—particularly small, innovative companies that may not be aware of the SBIR/STTR programs—with the aim of educating potential future applicants about the opportunities available to them. Internally, outreach is directed at identifying customers that can champion topics and facilitate the transition of Phase II awards into Phase III contracts. This internal effort demands persistence, strong networking, and in-depth knowledge of the programs and defense mission requirements.

The current outreach approach of DOD’s SBIR/STTR programs can be seen as a dual strategy: a “pull” mechanism that gathers more proposals and a “push” aspect that emphasizes the importance of transitioning technologies to benefit the warfighter. An ongoing discussion is whether the pull should outweigh the push, and whether dedicated personnel are needed to streamline and improve transition activities.

Personnel often attend both traditional and nontraditional events to connect with potential applicants. Events such as South by Southwest and TechConnect, alongside specialized gatherings for Special Operations Forces, highlight the diverse events that can be leveraged for outreach. Additionally, well-established events such as SBA Road Tours and various conferences provide channels for agencies to promote their programs.

The extent of outreach activity largely depends on the service’s or component’s budget and staffing capabilities. Some centralize their outreach within offices, such as the OUSD(R&E) SBIR/STTR office, or within organizations such as AFWERX. MDA collaborates with AFWERX and Space Ventures in replying to applicants because of overlapping mission areas and common technical interests. Some components refer first-time small business applicants to private-sector accelerators for support, although some accelerators take a percentage of the company’s equity.

Outreach and administration of the programs are currently funded in part by a pilot program originally authorized in the SBIR/STTR Reauthorization Act of 2011, which allows agencies to allocate 3 percent of their SBIR/STTR budgets for purposes of administering the programs.7 This funding addresses a chronic issue within the programs’ original authorized structure, under which no funds allocated for the programs could be used for their administration—in contrast with the vast preponderance of other R&D programs within the federal government.

___________________

6 15 U.S.C., Section 638(ww)(1)(B).

BOX 4-2
Examples of SBIR/STTR Outreach Strategies and Activities

ARMY

As an element of its overall program, the Army’s SBIR/STTR program has undertaken initiatives (xTech and Reverse Pitch Day) to attract potential applicants (Volkwine, 2024). Designed as a prize competition portfolio, xTech aims to widen the pool of participants; an impressive 70 percent of the competing companies had never before collaborated with the government. By partnering with venture capital firms, accelerators, and various organizations, the Army is tapping into broader networks of small businesses. This approach lowers the barrier to entry, allowing new businesses to engage easily, such as by submitting a straightforward one-page white paper. Participants in these prize competitions span various demographics, including historically Black colleges and universities, minority-serving institutions, and international small businesses, among others.

The unique structure of xTech prize competitions not only encourages collaboration between the Army and nontraditional innovators but also offers incentives such as nondilutive cash prizes, educational resources, mentorship opportunities, and networking prospects with Army customers. Notably, recipients gain access to potential follow-on contracts, including Phase I or Phase II SBIR/STTR awards, to develop viable technology solutions for Army challenges.

In addition to xTech, the Army has undertaken other outreach initiatives, such as the Reverse Pitch Day organized with Plug and Play, which attracted 600–700 companies in August 2024. This event provided a platform for Army customers and their programs to communicate directly about the solutions they seek. Furthermore, the Army is making concerted efforts to engage the clean tech sector, using a blend of in-person, hybrid, and virtual formats to maximize outreach efficiency with minimal staffing resources. Collaborations extend to organizations such as the Women’s Chamber of Commerce in North Carolina and Georgia. Industry days facilitated by the Army Applications Laboratory, a component of the Army Futures Command, further promote interaction with nontraditional startups. The Army is also expanding its social media presence across platforms such as LinkedIn, X, and Facebook to broaden its outreach.

NATIONAL GEOSPATIAL-INTELLIGENCE AGENCY (NGA)

Agencies with specific program requirements appear to limit their outreach to known partners and events. For example, NGA uses tech days to engage with internal customers, occasionally including universities through its academic research programs.

DEFENSE ADVANCED RESEARCH PROJECTS AGENCY (DARPA)

DARPA focuses its outreach on seeking nontraditional applicants and new program managers. In addition, it has an extensive online outreach platform, DARPA Connect, aimed at fostering global outreach by allowing users to engage with others participating in the program. The platform offers training modules on subjects such as DARPA 101 and SBIR 101, alongside resources such as the Connect Corner, which features coaching, office hours, monthly webinars, and opportunities for real-time interaction through “ask me anything” sessions. The DARPA Connect Team actively participates in trade shows and events to bolster its presence, aiming to broaden its network and strengthen its outreach initiatives. The SBIR/STTR team in the Office of the Under Secretary of Defense for Research and Engineering is interested in expanding activities modeled on DARPA Connect across all DOD SBIR/STTR programs.

DARPA has also established a robust system of communication through weekly newsletters. These newsletters keep all stakeholders—past and present—aware of important updates, including upcoming events, solicitation release dates, and training opportunities, while also requesting insights about the transition of their SBIR/STTR projects.

DOD-WIDE

DOD’s Office of Small Business Programs administers the Rapid Integrated Scalable Enterprise (RISE) program, which provides a collaborative vehicle for small businesses. RISE is designed to provide DOD with innovative technologies that can be rapidly inserted into acquisition programs that meet specific defense needs.

Since the administrative funding pilot program was established across all SBIR/STTR federal agencies, some agencies have used this funding to facilitate faster proposal processing and commercialization of projects, and to enhance outreach activities such as site visits, conferences, and connection with underrepresented businesses, some of which are described in more detail below. Within DOD’s SBIR/STTR programs, these funds are used broadly for any administration costs associated with running the programs. The pilot program appears to have had a positive impact on outreach overall.

Solicitation of Proposals

The conventional and open topics developed by the participating DOD services/components are included in one or more solicitations using either a BAA or CSO, depending on program needs and desired outcomes (DAU, n.d.a, n.d.b). CSOs are typically used to acquire innovative commercial items, technologies, or services that directly meet program requirements, whereas BAAs are generally restricted to basic and applied research activities (DAU, n.d.c).

___________________

7 U.S. Congress, National Defense Authorization Act for Fiscal Year 2012, P.L. 112–81, Section 5141 (December 31, 2011) and subsequent legislation, which extended the provision.


Evaluating Proposals and Making Awards

Selecting and awarding funding to DOD SBIR/STTR applicants involves several important steps, including reviewing and ranking proposals, as well as attempting to prevent foreign influence among program participants. In contrast with most DOD source selection processes, past performance does not appear to be a major factor in proposal evaluation. Unless specified otherwise in the instructions specific to a service or component, each proposal is evaluated based on three main criteria—technical merit, team qualifications, and commercialization potential.8 There are differences across services and components in how scoring is applied or in the use of consensus or review panels to make recommendations for funding. Final decisions are made by designated authorities, often after reviews and evaluations consistent with their respective ranking frameworks.

Applicants submit a seven-part proposal in accordance with precise guidelines that vary somewhat by service/component. Submissions include technical matters such as problem identification, statement of work, commercialization strategy, and key personnel, as well as discussion of project cost issues, letters of support, and disclosures of foreign affiliations. Many of these elements are similar to those that small businesses would include in proposals for other DOD R&D programs.

Once proposals have been submitted and have undergone an administrative check for completeness, they move through a review process that is generally similar across DOD services and components, although each implements the review process differently according to its needs. Subcriteria are tailored to each topic and subtopic listed in the solicitation. In general, the technical point of contact (TPOC) or project director makes the final proposal recommendations to the SBIR/STTR program manager or designated source selection authority. Key elements of the review process and example review criteria are shown in Box 4-3 and Table 4-1.

The review in Phase II is similar to that in Phase I, with more emphasis on commercialization or transition potential. Phase II approval is based on the proposal’s merits (using the criteria above, as established in the solicitation announcement), Phase I accomplishments, and TPOC feedback. Recommendations are made to the source selection authority, who can delegate authority to expedite the approval process as needed.

Foreign Influence Due Diligence

In 2022, the SBIR and STTR Extension Act established a due diligence program to enhance the security of proposals submitted by small businesses seeking DOD awards. This program officially took effect on June 14, 2023, prompting DOD to communicate requirements to small businesses through

___________________

8 See https://www.defensesbirsttr.mil/SBIR-STTR/#Structure.

various channels, including an email listserv, the Defense SBIR/STTR Innovation Portal banner, the program’s official website,9 and social media platforms.

Reflecting this program guidance, DOD officials indicated to the committee that they conduct thorough reviews of all proposals in response to SBIR/STTR solicitations, focusing on assessing any security risks associated with these small businesses. A critical aspect of this review process involves the

BOX 4-3
Key Elements of Phase I Review Process

Key elements of the Phase I review process include the following:

  • Reviewer selection: This process may include both internal government and external contractor personnel who are subject matter experts.
  • Evaluation criteria: Overall elements include technology feasibility, team qualifications, and commercialization plans, although weighting of these elements varies. Subcriteria vary by topic.
  • Tools and automation: Some services/components use tools such as the Army’s Valid Eval to streamline the process and provide feedback.
  • Decision authority: Recommendations often progress through multiple levels of review, including portfolio managers and source selection authorities.
  • Transparency: Services/components provide feedback to firms, with a goal of making the process more defensible and unbiased.

Proposals are reviewed using the proposal evaluation criteria described in the solicitation, although services and components are allowed to specify different evaluation criteria.a The evaluation factors for Phase I proposals in a recent Broad Agency Announcement are listed below, in descending order of importance (DOD, 2025a, p. 19):

  • The soundness, technical merit, and innovation of the proposed approach and its incremental progress toward topic or subtopic solution.
  • The qualifications of the proposed principal/key investigators, supporting staff, and consultants. Qualifications include the ability to perform the proposed R&D and commercialize the results.
  • The potential for commercial (government or private-sector) application and the expected benefits of this commercialization.

__________________

a For example, the Army’s Phase I evaluation criteria (and their relative importance) are Army benefits (15%), technical approach (35%), programmatic potential (20%), commercial potential (25%), and proposal quality (2%) (Army Evaluation Criteria, Appendixes A–C). Similarly, the Air Force lists (in descending order of importance) defense need, technical approach, and commercialization potential as its evaluation criteria for open topic proposals in the Commercial Solutions Opening (see DAF, 2025).

___________________

9 https://www.defensesbirsttr.mil.

TABLE 4-1 Key Elements of the Review Process, by Service/Component

Service/Component Reviewers Evaluation Criteria
Air Force (AFWERX)
  • Minimum of 3 reviewers
  • Internal to Air and Space Force
  • Open topic reviewers are best-matched from pool of reviewers
  • 3 DOD evaluation criteria plus defense need
  • Reviewer scores are added
  • Conventional topic proposal reviewers adjust weighting of evaluation criteria
  • Open topic proposal reviewers use equal weighting of evaluation criteria
Navy
  • Minimum of 2 reviewers, preferably 3
  • Topic author is a reviewer
  • 3 DOD evaluation criteria plus defense need
  • Reviewers use panel consensus rather than scores
  • TPOC makes final decisions in their topic areas
Army
  • Minimum of 3 reviewers, preferably 5
  • Broad range of SMEs from Army labs and special operations
  • 3 DOD evaluation criteria plus 7 formally defined Army-specific subelements with more under thosea
  • Uses Valid Eval (evaluation software)
  • Scores in four categories from unsatisfactory to superior
  • Panel discussions determine proposal recommendations using technical reviewer and Valid Eval scores
Missile Defense Agency (MDA)
  • Minimum of 2 reviewers
  • Technology area leads and SMEs
  • Mostly MDA personnel supplemented by Federally Funded Research and Development Center employees
  • 3 DOD evaluation criteria plus transition potential and interest from other entities
  • Top proposals are presented to MDA Research Council for endorsement
  • Source selection authority considers transition potential and interest from other entities for final decisions
Defense Advanced Research Projects Agency (DARPA)
  • No formal reviewer structure
  • Program manager evaluates with input from other DARPA program directors
  • 3 DOD evaluation criteria
  • Recommendations from program manager after concurrence from Technical Office deputy director
Chemical and Biological Defense Program
  • No minimum number of reviewers
  • TPOC creates technical team to evaluate proposals
  • Reviewers can include anyone across DOD
  • 3 DOD evaluation criteria
  • Division chiefs make funding decisions based on alignment with technology portfolio
Defense Health Agency
  • Minimum of 3 reviewers
  • 2 technical evaluators plus topic author
  • Topic author assembles reviewers of internal SMEs
  • Do not use external reviewers
  • 3 DOD evaluation criteria
  • Only fund proposals that align with their listed priority areas
Defense Logistics Agency
  • No minimum number of reviewers
  • Generally the project managers serve as reviewers
  • Will use internal and external end-users as reviewers
  • 3 DOD evaluation criteria
  • Review process differs between conventional and open topics
  • Scores based on overall topic criteria, then ranked by summary score
  • Scores in 4 categories from unsatisfactory to superior
Defense Threat Reduction Agency (DTRA)
  • No minimum number of reviewers
  • Topic author is generally the only reviewer, though sometimes another SME is added
  • 3 DOD evaluation criteria plus DTRA-specific standard scoring sheet
  • Additional security evaluations needed due to high-risk nature of DTRA’s work
National Geospatial-Intelligence Agency (NGA)
  • No minimum number of reviewers, preferably 2
  • Not uncommon for topic author or SME to be only reviewer
  • All reviewers internal to NGA
  • Technical evaluators must get permission to review proposals in DSIP
  • 3 DOD evaluation criteria
  • Use source selection panel for evaluating proposals

a DOD, 2024.

NOTE: DSIP = Defense SBIR/STTR Innovation Portal; SME = subject matter expert; TPOC = technical point of contact.
SOURCE: Committee conversations with DOD SBIR/STTR program managers.

information provided by small businesses regarding their foreign affiliations or relationships with foreign countries. The reviews encompass an analysis of cybersecurity practices, patents, employee backgrounds, and potential foreign ownership. Additionally, they involve examining financial ties and obligations to foreign entities, including any surety, equity, and debt commitments.

By using advanced analytical tools and open-source analysis, DOD aims to improve the efficiency and effectiveness of the due diligence assessments mandated by the SBIR and STTR Extension Act of 2022. The SBIR/STTR Office within OUSD(R&E) works to apply the due diligence program consistently across all DOD services and components. Furthermore, OUSD(R&E) is preparing a course designed to help small businesses understand the implications of foreign ownership, control, or influence. More recently, the Department implemented a standard common risk matrix and sought to reduce the administrative burden for both the small businesses and the government.

At the same time, the DOD SBIR/STTR programs are under intense scrutiny from Congress, particularly regarding the timeliness of awards. Even minor delays, such as waiting a couple of days for a waiver decision, can adversely affect the timeliness data, which are actively monitored by GAO.

To streamline the process for DOD’s participating services and components, the Air Force Office of Commercial and Economic Analysis (OCEA) is tasked with conducting reviews of foreign ownership, control, or influence (DOD, 2024). OCEA is an Air Force–led office that performs assessments and analyses in support of efforts to protect DOD and its services/components from commercial and economic risks. This office evaluates proposals and categorizes them as low, medium, or high risk; mitigation measures can be pursued for those deemed high risk by the relevant service or component. For smaller SBIR/STTR programs, this mitigation process is challenging and time-consuming.

One way of reducing the burden of this due diligence, especially for smaller DOD components, is to maintain a database of previous due diligence investigations and to ensure that program managers have access to it and are trained in its use. Additionally, creating a database of high-risk actors would reduce the time burden associated with this mandate. The 2022 SBIR and STTR Extension Act allows for flexibility in how DOD conducts foreign influence due diligence. Currently, all proposals are evaluated, which wastes resources given that most proposals will not be funded. It might therefore be more efficient to conduct due diligence at a later stage, especially if there are no issues with finding proposal reviewers. The Environmental Protection Agency, National Science Foundation, and National Institutes of Health conduct due diligence reviews only on applications being considered for awards and require disclosures only from those applicants (GAO, 2024). If DOD were to adopt the practice of conducting due diligence only for applicants being considered for awards, disclosure forms could still be submitted with the application in order to protect the timeliness of the selection process.

Still, due diligence conducted at the time of a firm’s application for funding may not reveal security concerns that emerge later. For example, supplemental funding from venture capital may raise concerns about foreign ownership, control, or influence because venture capital funds often include foreign investors. Moreover, venture capitalists may require small firms to expand into new and global markets, which may include sales to international rivals.

MULTIPLE-AWARD RECIPIENTS

There has been growing concern about the subset of SBIR/STTR firms that are selected for and receive large numbers of awards within the programs. This concern has led to increased scrutiny and oversight by Congress and GAO. For example, the 2022 SBIR/STTR reauthorization included specific language establishing increased minimum performance standards aimed at certain multiple-award recipients. Recently introduced legislation is also intended to address perceived problems resulting from the activities of multiple-award recipients. Analysis of these experienced firms is discussed in more detail in Chapter 9.

As part of its data collection, the committee sought to understand the prevalence of multiple-award recipients across programs and the extent to which program managers were concerned about either overreliance on known performers or barriers preventing them from achieving the goal of expanding the supply base.

The SBIR/STTR program managers from the services and components varied in their responses as to whether multiple-award recipients were common within their programs and whether they had any concerns about multiple-award recipients as a potential issue or problem for the DOD SBIR/STTR programs overall. About half of the program managers noted that although multiple-award recipients did exist, they were relatively rare in the pool of companies funded through their program; DARPA, for example, given its focus on early-stage technology development, has awarded very few firms multiple awards. These program managers saw multiple-award recipients as a nonissue. The remaining program managers acknowledged the presence of multiple-award recipients but had mixed or neutral opinions as to whether this was a concern, elaborating on both the pros and cons of this practice.

On the positive side, several program managers noted that repeat awardees had the benefit of experience and often were funded because they had the technical knowledge necessary to advance existing projects through sequential Phase II awards. Services and components that require an assured supply chain of specific or high-demand technologies rely on recipients of multiple awards for a variety of reasons: because trust has already been built, program managers believe this to be the fastest way to get required work completed, or these firms can initiate productive work more quickly given their experience working with the agency and its processes. Others acknowledged that this familiarity might lead to a

selection bias on the part of government officials, given their past experiences with these firms and an expectation that such firms can achieve program outcomes.

Furthermore, although multiple awards are associated with a single company, that company may include new researchers on the project team, effectively bringing additional performers with new ideas and capabilities into the mix. On the negative side, a few of the interviewees expressed concern that the repeat funding of previous awardees may effectively be crowding out new applicants, as often occurs in federal university research programs and traditional defense contracting activities, thus undercutting the SBIR/STTR programs’ ability to expand the national security industrial and innovation base. On the other hand, representatives of SBIR/STTR programs, even those in smaller services and components, acknowledged that the open topic solicitations have broadened the applicant pool, altering the mix of applicants.

Most program managers said that multiple-award recipients are subject to the same level of scrutiny as first-time awardees. Per SBIR/STTR evaluation guidelines, past performance is not a criterion for selecting firms for SBIR/STTR awards. A few SBIR/STTR program managers implied that the standard was effectively higher for previous awardees, which had to meet minimum performance benchmark requirements to be eligible to apply for a new Phase I or Direct to Phase II award.

It is interesting to note that no other defense science and technology program or federal agency research program subjects recipients of multiple federal awards to the level of scrutiny applied to small businesses under the SBIR/STTR programs. This is despite the fact that Federally Funded Research and Development Centers (FFRDCs, including Department of Energy laboratories), universities, and large defense contractors (from both the traditional defense industry and the commercial sector) are all repeat recipients of multiple awards, including much larger awards than those of the SBIR/STTR programs. Yet those other programs have experienced no similar controversy and have no better metrics for assessing the benefits to taxpayers or federal missions.

POSTAWARD IMPACT:
PHASE III, TECHNOLOGY TRANSITION, AND FOLLOW-ON CONTRACTING

The third and final phase of the SBIR/STTR programs is known as Phase III. A Phase III award supports work that “derives from, extends, or completes work under a prior SBIR/STTR Funding Agreement, but is funded by sources other than the SBIR/STTR programs” (SBA, 2023, p. 25). SBIR/STTR awardees, including those that receive Phase III awards, also receive certain data rights (see Box 4-4). The goal of a Phase III award is to facilitate the process of developing and delivering “products, processes, technologies, or services for sale to or use by

the federal government or commercial”10 sector through the funding of further R&D to mature technologies or through procurement of technologies, goods, or services. For DOD, this typically implies transition into mainstream R&D programs for eventual incorporation into larger acquisition programs, production, and operational fielding and use. SBIR/STTR awards offer several benefits to the awardee, including the right to sole-source Phase III contracts, exemptions from SBA size standards for contracts, and the retention of SBIR/STTR data rights. In almost all cases, transitioning to a Phase III award is viewed as the ultimate goal of the SBIR/STTR programs by both the small business contractors and DOD program managers.

Transition rates to Phase III are not carefully measured and are therefore difficult to quantify and analyze accurately. These rates likely depend on the agency’s mission and research focus. For example, DARPA and the Defense Health Agency (DHA) focus on earlier-stage technologies (with a higher ex ante likelihood of technical failure and lengthier transition and commercialization timeframes) relative to services and components such as the Army or Navy, which have an extensive set of acquisition activities and mission requirements with which their SBIR/STTR programs must align for purposes of transition. It is reasonable to expect such mission differences to affect transition rates for reasons unrelated to program management quality.

BOX 4-4
Management of Data Rights

A benefit to SBIR/STTR companies is ownership of data rights resulting from their awards. Contractors in the SBIR/STTR programs are considered to have more data rights than is the case in other federal research programs. SBIR/STTR program data rights typically provide proprietary protection of the technical data for a period of 20 years, although data rights do not apply to nontechnical data.

The difference in data rights policy between SBIR/STTR and other DOD research activities does create concern for some defense officials, who are also responsible for supporting DOD’s efforts to maintain access to technical data rights for technologies throughout the acquisition life cycle. For example, DOD believes it needs technical data rights to technologies as they mature to support integration with other systems and to preserve the ability to create more competition in the defense industrial base. DOD also believes it needs technical data rights for technologies as they are used operationally—for example, to perform required system maintenance and upgrade activities.

Representatives of at least one component, National Geospatial-Intelligence Agency (NGA), discussed how they strategically decide which companies to work with through SBIR/STTR contracts or traditional contracts. If NGA wants a company to make its technology or product available for an extended period, it funds the company through an SBIR/STTR contract; but if NGA wants to control the technology, it funds the company through a traditional DOD contract.

___________________

10 15 U.S.C., Section 638(e)(10)(B).

The transition rate is also significantly affected by the complexity of the defense acquisition process, which has created a commonly observed “valley of death”—where a promising technology fails to transition from a Phase II prototype to a Phase III contract or commercialization—for many research activities, frustrating both government program managers and contractors.11 These transition challenges are faced routinely by the mainstream science and technology activities of each of the services and components that executes the DOD SBIR/STTR programs, as well as other agencies, such as the Defense Innovation Unit and the Strategic Capabilities Office. It is not to be expected that SBIR/STTR program managers and their awardee firms will have any greater success in addressing these intrinsic and embedded technology transition challenges that face all defense research and innovation efforts. As discussed further in Chapter 7, the programs’ primary success is in helping small firms secure larger R&D contracts, but DOD could enhance pathways to procurement by helping to build collaborations with prime contractors.

While transition to Phase III is an important milestone and metric, SBIR/STTR projects can support an agency’s mission even without a successful transition. On the one hand, program managers made frequent reference in their interviews to the valley of death. On the other hand, they emphasized that an SBIR/STTR project can reveal useful information—about failed attempts or infeasible technological approaches—even without a Phase III transition. This observation is not generally aligned with the business interests of a small business that is seeking to continue and expand defense contracting activities and sales through larger R&D and procurement awards. However, the multiple channels for mission value are important to keep in mind when evaluating transition rate statistics.

Some program managers track whether their Phase I and II awardees transition to Phase III; examples include annual reports from AFWERX and the Navy. Many smaller programs, however, do not systematically track transitions to Phase III, and their representatives stated that they lack the resources to do so. A centralized DOD-wide database linking Phase I projects to subsequent Phase II and Phase III awards does not exist. In principle, such information could be used to track and evaluate “within-program” changes in transition rates following the introduction of new commercialization initiatives or practices. As discussed in Chapter 7, transition from Phase II directly into procurement appears to be relatively rare. Many contracts or other defense awards that would legally be considered Phase III activities are not coded as such in any database by program officials, which is understandable given that program managers for Phase III projects in most cases are not connected to the original SBIR/STTR programs and have no incentive to be consistent in this reporting. As discussed in Chapter 7, Phase III awards are more likely to be for follow-on research funding, perhaps reflecting the lower starting Technology Readiness Levels (TRLs) of the projects,

___________________

11 See, for example, Specht and O’Halloran (2023).

but there currently is no way to assess this more comprehensively across the SBIR/STTR programs or other defense research activities, since TRLs are not routinely collected across DOD. NASA, the only other SBIR/STTR agency with significant procurement activity, does collect TRL information on its projects.

Services/components and program managers have additional means of advancing the transition of SBIR/STTR projects. Supplemental funding opportunities are available to Phase II awardees through sequential Phase II and subsequent Phase II awards. These awards, explained below, are funded through SBIR/STTR budgets and thus are not considered “transitions” per se. They may, however, enable Phase II awardees to develop their work further in ways that increase the odds of a successful transition. Awardees may receive a total of two Phase II awards per topic, from either the original awarding agency or another agency.

Sequential Phase II awards were introduced in the SBIR/STTR Reauthorization Act of 2011. They provide an additional government-requested Phase II contract on the same topic to the same small business for the same project. The funds enable the awardee to continue work on the initial Phase II project; thus, the work must be within the scope of the initial Phase II award. Sequential Phase II awards are made without competition, with a guideline amount of $1 million and a limit of $1.5 million. A Phase II awardee can receive only one sequential Phase II award per project.

More recently, the 2020 SBIR/STTR Program Directive (SBA, 2020, p. 24) provided separate authority for small businesses that receive a Phase I award from one federal agency to receive a subsequent Phase II award from another agency on the same topic (Navy SBIR, 2022). Subsequent Phase II awards are solicited, evaluated, and awarded on an ongoing basis. Like sequential Phase II awards, they are initiated at the request of the government; unlike sequential Phase II awards, however, subsequent Phase II awards typically are used to fund Phase I proposals that did not receive a Phase II award from the original topic sponsor or agency. The original topic sponsor or agency must grant permission for a subsequent Phase II award on the same topic to be considered. This authority is particularly useful for SBIR/STTR projects with the potential to meet the needs of multiple agencies, which thus have multiple potential pathways for transition.

The DOD services and components have many approaches to transition. Some, especially the smaller ones, are not involved in efforts to support it, often because they lack the necessary programmatic and personnel resources, whereas others have more extensive and integrated practices and formalized transition efforts and programs. Examples of proactive practices include those of MDA, which “designs in” DOD-wide priorities at Phase I through technology leads, research council input, and the proactive development of relationships with program managers at services and components with complementary missions (e.g., Air Force, Space Force, DARPA). Incentivizing technology leads who work in targeted transition activities to connect with program managers at services and components with shared interests (e.g., in developing hypersonic defense

systems) also helps identify opportunities for sequential Phase II awards and new transition pathways.

Four years ago, the Army restructured its SBIR/STTR programs and created transition broker teams. These teams are tasked with bringing more of an acquisition mindset to topic selection and evaluation of early-stage SBIR/STTR projects. A program manager interviewed by the committee highlighted that this approach allows the transition broker teams to understand the program managers’ goals before a topic is approved.

Table 4-2 describes programs, many of them recent DOD initiatives, aimed at facilitating the transition of products resulting from SBIR/STTR activities.

VARIATION IN PROGRAM EMPHASIS AND ADMINISTRATIVE ORIENTATION

While there is considerable uniformity in the management and processes of the DOD SBIR/STTR programs across the services and components, conversations with SBIR/STTR program managers uncovered significant variation in the details of how the programs are implemented. As illustrated in Figure 4-2, the committee examined two key dimensions of this variation: (1) program emphasis and (2) administrative orientation. This flexibility in implementation is useful and important, as it allows the programs to adapt to different needs and missions.

FIGURE 4-2 Two-dimensional typology of alternative approaches to viewing and implementing the SBIR/STTR program within the DOD services and components.

TABLE 4-2 Examples of DOD Programs Aimed at Facilitating the Successful Transition of SBIR/STTR-Funded Technologies into Military Acquisition Programs, Commercial Markets, or Both

Air Force: AFWERX Ventures Strategic Funding Increase (STRATFI) Program^a
  Description: A collaboration between AFWERX and SpaceWERX, the innovation arms of the Department of the Air Force and the U.S. Space Force. Aims to help transition SBIR/STTR projects through added funding and exposure.
  Eligibility: Must have been awarded a Phase II contract within the last 2 years. Various levels of matching funding and avenues for Defense and/or industry matching, depending on the program sought.
  Introduced: 2020. Funding enhancement: $3M to $15M for up to 48 months.

Air Force: AFWERX Ventures Tactical Funding Increase (TACFI) Program^b
  Description: Similar to STRATFI but Air Force only, with a shorter time horizon and smaller funding enhancements.
  Eligibility: Similar to STRATFI.
  Introduced: 2021. Funding enhancement: $375k to $1.9M for up to 24 months.

Army: Army SBIR Catalyst Program^c
  Description: Aims to unite the Army, integrators, and small businesses to create innovative technologies, propelling concepts to transition and commercialization. Provides contracts up to 8x larger than typical SBIR awards.
  Eligibility: Awards CATALYST Phase II Enhancement funds only following successful base performance and if the Army transition partner and integrator have available funding.
  Introduced: 2023. Funding enhancement: up to $7M.

Navy: SBIR/STTR Transition Program^d
  Description: An 11-month program that provides business mentoring, education, and networking opportunities.
  Eligibility: Must have an active Navy-funded Phase II.
  Introduced: 1999. Funding enhancement: n/a (services only).

All agencies: Technical and Business Assistance^e
  Description: Provides additional funding to SBIR/STTR awardees for commercialization and business assistance expenses; allows awardees to select their own commercialization services provider or use services provided by an agency-selected vendor; can be used to cover expenses that are not included in the SBIR/STTR budget or proposal.
  Eligibility: Phase I and Phase II awardees.
  Introduced: 2019. Funding enhancement: up to $6.5k for Phase I; up to $50k for Phase II.

a https://afwerx.com/divisions/ventures/stratfi-tacfi/.

b https://afwerx.com/divisions/ventures/stratfi-tacfi/.

c https://armysbir.army.mil/catalyst/.

d https://navystp.com/.

e https://legacy.www.sbir.gov/node/2088581.


Program emphasis denotes whether a service or component focuses on developing emerging technologies or on expanding the defense industrial base and building supply chain resilience. In some cases, the needed technical capabilities are well understood and the goal is finding the best solution; in other cases, exploration of the technological frontier is required to imagine novel ways of accomplishing mission goals. A core difference among these approaches is whether the emphasis is on transitioning technologies to acquisition programs to meet current military requirements, strengthening the base of suppliers for defense materials and equipment, or developing disruptive technologies and warfighting capabilities.

The administrative orientation refers to whether the SBIR/STTR programs are viewed as merely a legal mandate with an administrative obligation that must be fulfilled or as a unique and valuable opportunity to support efforts to accomplish service and component missions.

  • Services and components that view the programs as an opportunity are entrepreneurial in expanding the program boundaries by using alternative transaction authorities, seeking additional funding, and building partnerships and coalitions beyond the SBIR/STTR office.
  • Services and components that view the program as an obligation tend to emphasize following its rules and attempting to incorporate it into ongoing contracting and programmatic activities. This view generally applies to agencies with lower research budgets and hence fewer resources to allocate to the SBIR/STTR programs.

As illustrated in Figure 4-2, considering these two dimensions simultaneously creates a four-quadrant space defining alternative approaches to the SBIR/STTR programs within DOD. The approaches of the services and components participating in the programs do not necessarily fit exclusively into one quadrant. For example, those services/components that view the SBIR/STTR programs as an opportunity must still comply with the programs’ administrative and regulatory guidance and policies. Similarly, the SBIR/STTR programs that primarily identify innovative solutions must also attract new and diverse companies to increase the number of exceptional innovators working in defense. Instead, the quadrants provide a language and typology that allows discussion of reasonable and useful variation in how the different DOD participants in the SBIR/STTR programs execute their mandate. The typology highlights what works well and sheds light on opportunities for centralization or improvement. The dimensions and quadrants also give program managers a language for developing a strategic approach to their SBIR/STTR programs.

Applying the Framework

The two dimensions and associated quadrants described above imply distinct program philosophies that manifest in how the DOD SBIR/STTR programs are implemented. In this section, that framework is used to consider systematically some tactical differences observed across the services and components. The distinct program elements are first introduced and then supported with a few examples of how they vary according to program emphasis and administrative orientation. Table 4-3 provides a summary of how specific implementation areas vary according to this framework.

The SBIR/STTR programs within the services and components make greater or lesser use of the array of adjacent programs and resources, including supplementary funding, training and outreach programs, and centralized infrastructure. Important themes emerged in discussions with program managers who wanted additional support or resources, as did numerous innovative initiatives in outreach.

Open Topics in This Framework

Enthusiasm for open topics varies across services and components. While some of this variation can be explained by differences in size and resources, it also appears to be driven by the underlying program philosophy. Programs with broader or more entrepreneurial orientations (those seeking novel technologies from the commercial market or aggressively working to expand the supply base) more readily see the benefits of interacting with unknown performers, whereas programs with more clearly defined missions or specific technology transition needs have less incentive to solicit ideas broadly from industry.

The underlying program philosophy, as defined by the four quadrants in Figure 4-2, helps explain some of the differences in the leadership of the topic development process, the personnel participating in topic development, the breadth of the topics’ scope, and the perceived utility of open topics:

  • Programs that are focused on broadly expanding technological capabilities rely heavily on scientists and engineers working on novel technologies. These programs tend to be enthusiastic about open topics as a vehicle for exploration.
  • Programs that are focused on addressing established program requirements involve a broad array of stakeholders, including subject matter experts and end users within services/components, as well as prime contractors. Broad involvement increases the likelihood that topics will meet current military requirements and support existing acquisition programs. For these programs, open topics are less useful, as service/component and program needs are well specified.
  • Programs focused on onboarding nontraditional industry participants rely on small business advocates. Open topics can help connect with unknown performers, but direct outreach to new suppliers is more effective.

TABLE 4-3 Translating the Two-dimensional Typology into Implementation

Quadrant A: Expanding Technological Capabilities (Modernization)
  Lay person description: Exploring unknown horizons
  Advocates: Research and development program managers
  Topic development objective: Developing and delivering disruptive and innovative defense capabilities
  Enthusiasm for open topics: High
  Importance of multiple-award recipients: Low to high; interested in engaging young companies and nontraditional defense contractors
  Transition focus: Experimentation oriented; tolerance for failure
  Outcome metrics: Leveraging private funding, patents, publications

Quadrant B: Addressing Established Program Requirements
  Lay person description: Fulfilling known program needs
  Advocates: Acquisition program managers and program executive officers
  Topic development objective: Meeting current military requirements and supporting acquisition programs
  Enthusiasm for open topics: Low to neutral
  Importance of multiple-award recipients: High; relies on a network of “known performers”
  Transition focus: Engage end users at the start of topic selection
  Outcome metrics: Transition to acquisition programs

Quadrant C: Onboarding Nontraditional Industry Participants
  Lay person description: Identifying and vetting new suppliers
  Advocates: Acquisition personnel and small business advocates
  Topic development objective: Broad topics to increase participation by new entrants
  Enthusiasm for open topics: Neutral
  Importance of multiple-award recipients: Medium; values “known performers” but also wants to expand the base
  Transition focus: Providing resources and support for novice partners to be performers
  Outcome metrics: Broader supply base

Quadrant D: Executing a Funded Program
  Lay person description: Running a mandated program
  Advocates: Legal and compliance officers
  Topic development objective: Execute a smooth process
  Enthusiasm for open topics: Low
  Importance of multiple-award recipients: Neutral; following the mandate
  Transition focus: Transition is for use by other services and components
  Outcome metrics: Compliance with policy directives and other program requirements
  • Programs that are simply executing a mandated funded program are concerned about efficiency and want to manage a predictable schedule of topic releases, as opposed to using the releases opportunistically. They also tend to view open topics as creating an additional administrative burden for overworked and understaffed program offices.
Multiple-Award Recipients in This Framework

As with attitudes toward open topics, these program philosophies help explain whether multiple-award recipients are viewed positively or negatively:

  • Programs that are focused on expanding technological capabilities are eager to tap into exciting young companies and actively work to seek out firms that are working at the cutting edge.
  • Programs that are focused on addressing established program requirements tend to rely on a network of known performers who have demonstrated their capacity to meet known needs.
  • Programs focused on onboarding nontraditional industry participants are open to new businesses but also want to support and advocate for existing disadvantaged businesses.
  • Programs that are executing a mandated funded program follow the rules as imposed without taking a stance on the pros or cons.
Phase III and Transitions in This Framework

The typology of program philosophies in Figure 4-2 helps explain some of the variations in emphasis on transition. It also points to other kinds of outcomes that could be tracked or evaluated. The following are examples:

  • Programs that are focused on expanding technological capabilities can use the SBIR/STTR programs for experimentation purposes. Early-stage scientific and technological experimentation entails a high likelihood of failure, but learning comes from these failures. Under this approach, therefore, a much lower emphasis on transition, as well as lower transition rates, would be expected. Other outcomes that would demonstrate the value of the SBIR/STTR investment would be patents or publications.
  • For programs focused on the commercial technological frontier or dual-use technologies, the value of SBIR/STTR investments could be seen in company growth, valuation, or private capital investments.
  • Programs focused on addressing established program requirements should have a relatively high transition rate. If the program runs well, the path to transition has been set in advance and should result in identifiable follow-on SBIR/STTR funding as well as procurement contracts.
  • Programs focused on onboarding nontraditional industry participants are likely to have mixed success in achieving transition, as new performers may or may not be able to meet DOD’s stringent requirements. It would be important to compare the transition rates for first-time awardees and multiple-award recipients, and to attend to differences by race, gender, veteran status, and geography.
  • Programs that are executing a mandated funded program are unlikely to be concerned with what happens to grantees once they leave the SBIR/STTR programs.

SUMMARY

This chapter has presented an overview of the administration of the SBIR/STTR programs at DOD. The committee’s discussions with program managers for the SBIR/STTR services and components revealed that while many broad rules and policies apply, each service or component exercises substantial autonomy (and initiative) in program emphasis and administrative orientation. These variations reflect differences among the services and components and offer insight into the types of outputs the programs produce. They also suggest the breadth of management styles and practices possible within the programs under current statute, regulation, policy, and practice. Given that DOD’s services and components differ greatly in size and mission, these varying approaches to running the programs can be beneficial.

Caution is necessary in reforming the programs. For example, the introduction of open topics was embraced by some of the services and components and was observed to bring in more new small businesses that may expand the defense supply base. At the same time, less targeted solicitations were found to increase the workload of program managers substantially, especially for smaller and more specialized services and components. For them, processing and reviewing the number and type of proposals garnered from an open topics solicitation can create a significant administrative burden while failing to yield the required specialized capacity. To mitigate this burden, especially for smaller components, DOD could use more targeted open topic solicitations, adopt an approach similar to the National Science Foundation’s Project Pitch, or require letters of intent.

The services and components also differ in their view of making awards to experienced firms. At least one component’s representatives mentioned redacting company names on applications before sending them out for review. The committee notes that this technique may help address unfair selection bias due solely to familiarity with proposing companies. Redacting company names would be a better alternative than barring experienced firms from submitting proposals, which would likely limit the technical options available to program managers for meeting technical mission goals.

The flexibility to implement their programs differently helps services and components use the programs to advance their missions; however, the committee found that some functions could be centralized to help reduce administrative burden. Specifically, the OUSD(R&E) SBIR/STTR Office could further facilitate coordination across the services and components beyond the monthly meetings and facilitate sharing of innovative practices. For example, the quality of proposals could be improved through a more robust applicant assistance program, which would be especially helpful for small businesses that are new to submitting SBIR/STTR proposals. DOD could also enhance pathways from prototyping to fielded solutions through bridging programs or deeper collaboration with prime contractors. Additionally, to aid in the transition to procurement, new or refined solicitations might offer more guidance to SBIR/STTR awardees seeking to move from proof-of-concept innovations into full-scale acquisitions. More centralization of the due diligence process would also be valuable to the smaller services and components. While the current process calls for the Air Force’s OCEA office to conduct these reviews, the smaller services and components find the process complicated and time consuming, especially for proposals classified as high risk. In addition, the OUSD(R&E) SBIR/STTR Office could facilitate the sharing of knowledge about experimentation with initiatives that could be implemented across all parts of DOD, such as the Navy’s SBIR/STTR Transition Program; the AFWERX Ventures Strategic Funding Increase (STRATFI) and Tactical Funding Increase (TACFI) programs; the Army’s Catalyst; and the Army’s xTech prize competition, which identifies potential SBIR/STTR applicants.

All participating organizations aim to avoid contract award delays. DOD, like most agencies, is congressionally mandated to issue awards no more than 180 days after the proposal submission deadline. Centralizing some functions, such as the due diligence process, could help streamline the selection process.

Increased educational opportunities for SBIR/STTR program managers would help improve their ability to assist companies—especially first-time applicants—as well as their ability to administer the programs in a timely and effective manner. These opportunities could be provided through the Defense Acquisition University. While program managers are well versed in the SBIR/STTR legislation, policies, and guidance, they appeared to be less informed about the variety of contract types, cooperative agreements, and grants.

Currently, no more than 3 percent of the funds allocated for the SBIR/STTR programs can be used for program outreach and administration. For larger services and components, this amount may be sufficient to run the programs, but smaller services and components cannot take advantage of economies of scale. DOD could consider allocating additional funding to support these programs or providing centralized support.

Finally, including SBIR/STTR in key strategy documents such as the annual Defense Spending by State report or the Defense Planning Guidance might help highlight the value of these programs to DOD leadership, Congress, and the general public. While topic development, evaluation, and selection processes vary across the services and components, they share common goals of fairness, thoroughness, and a commitment to supporting innovative solutions that benefit both the military and the broader commercial sector.

FINDINGS AND RECOMMENDATIONS

Finding 4-1: DOD’s SBIR/STTR programs vary in terms of size, mission, and operational approaches. Codifying and communicating best practices would help all DOD organizations improve their SBIR/STTR programs.

Finding 4-2: Certain activities related to the implementation of DOD’s SBIR/STTR programs, such as due diligence, application assistance, and commercialization assistance, create an administrative burden for smaller DOD services/components.

Finding 4-3: Open topics help bring into DOD’s SBIR/STTR programs a broader range of firms that could reduce the concentration of awards, but the use of open topics is administratively burdensome for smaller DOD services/components.

Finding 4-4: Opinions vary across the military services (e.g., Army, Navy, Air Force, and Space Force) and components (e.g., Defense Advanced Research Projects Agency, Missile Defense Agency) with regard to the impact of SBIR/STTR open topics, and some services/components find them far more useful than do others.

Finding 4-5: DOD’s SBIR/STTR program managers often lack sufficient expertise concerning the needs of startups and entrepreneurs or the commercialization of outcomes from DOD-funded research and development (R&D).

Finding 4-6: Input from industry stakeholders (for example, Tier 1 contractors/system integrators) on topic selection or transition to procurement could lead to more robust incorporation of SBIR/STTR-supported technologies into products and services for the warfighter.

Finding 4-7: The frequent use of cost contracting methods for DOD SBIR/STTR awards increases the bureaucratic burden on both DOD and awardee firms, creates contracting delays, and may limit participation by those small businesses without dedicated staff to deal with the data reporting requirements associated with these contracts.

Finding 4-8: Citing the SBIR/STTR programs in key strategy documents would elevate the programs’ importance and utility within DOD and help in providing implementation guidance.


Recommendation 4-1: The Department of Defense’s (DOD’s) Under Secretary of Defense for Policy should include in Defense Planning Guidance that the DOD Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) programs should be used as a mechanism for strengthening and broadening the defense industrial system, and direct the Department’s services and components to promote the transition of SBIR/STTR-generated technologies into mainstream science and technology and acquisition programs.

Recommendation 4-2: The Department of Defense’s (DOD’s) Under Secretary of Defense for Policy should include the DOD Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) programs in the current planning, programming, budgeting, and execution processes, or in the proposed Guidance Document, as a mechanism for strengthening the defense industrial base, alongside metrics provided to DOD leadership to measure the strength, resilience, and diversity of the defense innovation system.

Recommendation 4-3: The Department of Defense’s (DOD’s) Office of Local Defense Community Cooperation should include DOD Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) awards in its annual Defense Spending by State report.

Recommendation 4-4: The Department of Defense’s (DOD’s) Office of the Under Secretary of Defense for Research and Engineering (OUSD[R&E]), which is the DOD office of primary responsibility for the Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) programs, should codify and communicate best practices, such as those for integrating the SBIR/STTR awardees into programs of record or improving outreach to new small businesses. In addition, OUSD(R&E) should incentivize early collaborations across services and components for projects with potential multimission transition pathways.

Recommendation 4-5: Congress should allow but not require the use of Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) open topics. Congress should encourage more flexibility for the Department of Defense’s services and components to experiment with approaches that help broaden their supply base.

Recommendation 4-6: Department of Defense Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) program officials, including contracting officers, should encourage the use of fixed-price contracts for Phase I and II awards.

Recommendation 4-7: The Department of Defense’s Office of the Under Secretary of Defense for Research and Engineering should request and Congress should consider appropriating funds for entrepreneurial training for Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) program managers, perhaps by having the National Defense University and Defense Acquisition University develop training modules and a certification for these program managers.

Recommendation 4-8: The Department of Defense’s (DOD’s) Office of the Under Secretary of Defense for Research and Engineering should request and Congress should consider requiring and appropriating funds to provide the requisite tailored training to DOD acquisition officials, through the Defense Acquisition University, on contracting and budget flexibilities available under the Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) programs.

Recommendation 4-9: The Department of Defense’s Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) Program Office should streamline the due diligence process by creating a centralized database for firms that fail to meet the due diligence requirements, and make the initial due diligence/denial process automated within the Defense SBIR/STTR Innovation Portal.

Recommendation 4-10: The Department of Defense’s Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) Program Office should prioritize due diligence reviews for proposals that are being seriously considered for funding.

Recommendation 4-11: The Department of Defense’s (DOD’s) Office of the Under Secretary of Defense for Research and Engineering should revise DOD’s Small Business Innovation Research/Small Business Technology Transfer (SBIR/STTR) instructions, regulations, and guidance to acknowledge program risk. This guidance should take into account the potential for transformational innovation and take into consideration the different needs, strengths, and challenges of large versus small services and components within the Department.

Page 101
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 102
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 103
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 104
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 105
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 106
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 107
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 108
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 109
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 110
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 111
Suggested Citation: "4 DOD's SBIR/STTR Processes." National Academies of Sciences, Engineering, and Medicine. 2026. Review of the SBIR and STTR Programs at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/29329.
Page 112
Next Chapter: 5 Who Applies and Who Gets Funded