Evaluation of the Every Day Counts Program (2025)

CHAPTER 1

Introduction

At the request of the Research and Technology (R&T) Program of the Federal Highway Administration (FHWA), the Transportation Research Board (TRB) engaged RTI International to perform an evaluation of the Every Day Counts (EDC) Program. FHWA began the EDC Program in 2009 as a state-centric initiative for identifying and deploying proven, but underutilized, innovations that can make U.S. surface transportation more adaptable, sustainable, and equitable, and safer for all. By accelerating the adoption of the selected technologies and practices, the EDC Program seeks to promote an innovation-oriented culture in the surface transportation community that will ultimately facilitate and accelerate the diffusion of improvements in the future.

This evaluation project was conducted under the TRB-FHWA Program Evaluation (TFPE) initiative. The TFPE supports the R&T Evaluation Program and aims to assess the following five aspects of FHWA research efforts:

  • Efficiency: Are the project’s activities conducted with an appropriate use of resources, such as budget and staff time (e.g., research and implementation approach, funding level)?
  • Implementation: Is the project being applied and/or adopted by the users (e.g., is there a change in culture or in the state of the practice)?
  • Effectiveness: Is the project achieving the goals and objectives it was intended to accomplish (e.g., what is the impact)?
  • Cost-effectiveness: Does the value or benefit of achieving the project’s goals and objectives exceed the cost of producing them (e.g., what is the return on investment and cost-benefit ratio)?
  • Attribution: Is the project addressing the agency’s strategic goals (e.g., what are the outcomes)?

CRP Special Release 5 provides a summary of the EDC Program evaluation conducted from 2023 through 2024. Chapter 1 presents a brief overview of the EDC Program, including its major activities, as well as the overall objectives of this evaluation and the theoretical framing for the evaluation design. Chapter 2 provides the key details of the evaluation method and plan, including the logic model detailing how the EDC Program attempts to achieve its objectives. Chapter 2 also includes the central evaluation questions (EQs) addressed by this effort. Chapter 3 summarizes the data collected from all sources for the evaluation and the results of the data analysis. Chapter 4 details specific findings derived from the evidence collected, while Chapter 5 provides overall findings for each of the EQs. Chapter 6 presents overall conclusions from this evaluation project.

1.1 Background on the EDC Program and This Evaluation

FHWA is a component agency of the U.S. Department of Transportation. Within FHWA, the Technology and Innovation Deployment Program (TIDP) funds efforts to increase the use of new technologies and approaches that can speed the planning and completion of transportation projects. TIDP’s efforts are authorized by the Infrastructure Investment and Jobs Act of 2021 (also called the Bipartisan Infrastructure Law) and its predecessors. The EDC Program is a key component of the TIDP objective to accelerate the adoption of new technologies, methods, and practices that speed the completion of transportation construction projects, enhance road safety, improve roadway maintenance and operations, and increase the efficiency of the surface transportation system.

EDC Program Development and Activities

In 2009, then-FHWA Administrator Victor Mendez announced the launch of the Every Day Counts Initiative. Mendez envisioned an approach that would increase the rate of adoption of new technologies, organizational practices, and business approaches (Mendez 2013). For the first set of innovations to be supported by the EDC Program, Mendez identified three areas of focus: shortening project delivery, accelerating technology and innovation deployment, and FHWA’s “Going Greener” initiative. In August 2009, a joint committee of key organizations in surface transportation, including the American Association of State Highway and Transportation Officials (AASHTO), the Associated General Contractors of America (AGC), and the American Road & Transportation Builders Association (ARTBA), created a task force that met to develop recommendations for the EDC Initiative, issuing a set of recommendations in August 2010. The EDC Program launched in 2011, designating 14 EDC Innovations for acceleration (EDC Program n.d.).

The Moving Ahead for Progress in the 21st Century (MAP-21) Act, enacted in 2012, assigned FHWA the specific task of promoting the adoption of innovative technologies to speed the delivery of infrastructure projects. The EDC Program became the responsibility of what is now the Accelerating Innovations Program Team (formerly the Center for Accelerating Innovation [CAI]) within the Office of Innovation and Workforce Solutions. Under the Fixing America’s Surface Transportation (FAST) Act, signed in December 2015, FHWA is required by statute to identify and promote new sets of innovations every 2 years, establishing the legal mandate for the EDC Program. The EDC Program is not specifically referenced or mentioned in the Bipartisan Infrastructure Law but continues to operate based on the mandate and processes established previously.

The EDC Program has established a 2-year cycle, or EDC Round, for program activities. The program conducted its first Round in 2011, and its most recent (the seventh) was announced in December 2022. For each Round, the EDC Program defines and selects a specific set of technologies, practices, or approaches—called EDC Innovations—for which it will encourage adoption by state DOTs. As shown in Figure 1, the initial Round of the EDC Program promoted 14 Innovations. The number of Innovations in subsequent EDC Rounds decreased, with seven Innovations promoted in EDC Round 6. (As presented in Appendix A, EDC Round 7 had seven Innovations.)

Within each EDC Round, FHWA staff, state DOT personnel, and other interested parties engage in a structured sequence of activities under the EDC Program as described below and shown in Figure 2.

  1. Identification and Selection of EDC Innovations. FHWA gathers input from the transportation community, including state DOTs and industry associations, on proven but underutilized innovations. Incorporating internal screening criteria and external feedback, FHWA selects a subset of innovations to promote for the coming EDC Round.
  2. Implementation Team Orientation. Program staff work across FHWA to organize an Innovation Deployment Team (IDT) for each EDC Innovation. Each IDT engages subject matter experts from across the transportation community to develop an Implementation Plan to guide and inform state efforts to implement that innovation. The EDC Program also provides training to IDTs on how to refine Implementation Plans and how to promote Innovations, as well as support for engaging consultants and other experts to help with Innovations.

Figure 1. Count of EDC innovations by EDC Round.

Figure 2. Progression of activities in each EDC Round.

  3. EDC Summit. After the public announcement of the EDC Innovations for the Round, FHWA organizes a national summit to present the selected Innovations to the wider transportation community and engage in discussions about each EDC Innovation and its benefits. Originally an in-person conference, the summit shifted to a virtual gathering at the start of the COVID-19 pandemic.
  4. State Transportation Innovation Council (STIC) Caucus. After the EDC Summit, each DOT works with its corresponding FHWA Division Office to convene relevant transportation organizations (e.g., contractors, consultants, municipal authorities) as the STIC. Collectively, the STIC assesses the current adoption level within the state for the Innovations and, during the Caucus, determines which Innovations the state will implement during that EDC Round. Each state is eligible to receive funding of $100,000 per EDC Round to fund projects authorized by its STIC. (Note: the STIC incentive funding was increased to $125,000 per state after this evaluation was conducted.)
  5. Implementation. The EDC Coordinator in each state’s Division Office reports on which Innovations in the EDC Round the state DOT plans to implement. The IDT plans outreach and training and provides the state DOTs with information and technical assistance to facilitate deployment of each EDC Innovation. Common tools developed to support implementation include
    – Educational webinars presented by experts,
    – Videos describing the innovation,
    – Case studies on how other state DOTs implemented the innovation,
    – Peer exchanges where state DOTs can discuss the innovations and share their own strategies and successes in implementing the innovations, and
    – Decision support tools (e.g., cost-benefit analyses and business cases) to help state DOTs understand how to demonstrate the value of an innovation relative to the risks of implementation.

States work with FHWA Division Offices to track the status of implementation efforts for those Innovations during a 2-year period.

EDC Innovations are not necessarily discrete technologies or practices. Instead, they may encompass a broader set of related systems and practices around a given aspect of state DOT operations. For example, the e-Construction Innovation in EDC Round 3 noted that “The e-Construction process includes electronic submission of all construction documentation, electronic document routing and approvals (e-signatures), and digital management of all construction documentation in a secure environment” (EDC Program 2015). Given that the Innovations are focused on existing but underutilized systems and practices, they may bundle together related technologies and assign them a new label for easy reference. One EDC Innovation combined several countermeasures designed to improve pedestrian crossings and reduce accidents and promoted that set of discrete innovations as Safe Transportation for Every Pedestrian, or STEP.

The EDC Program expects adoption of these designated Innovations to advance the mission of FHWA in several domains: shortening the time required for road construction while ensuring accountability in the use of public highway funds, enhancing traveler safety and health, increasing the efficiency and capacity of the surface transportation system, and making the highway system more sustainable and durable. Throughout its history, the program has followed a few core principles that have influenced its operation:

  • A focus on accelerating the deployment of proven, yet underutilized, market-ready innovations.
  • Identification of candidate innovations by collecting input from a diverse range of interested parties across the surface transportation community—an approach often labeled “open innovation” (U.S. GAO [Government Accountability Office] 2016).
  • Provision of information for state DOTs and other transportation groups to decide, based on their own priorities, which Innovations to promote. The EDC Program does not dictate the implementation of Innovations by the states. The EDC Program also does not subsidize deployment of Innovations but does facilitate access to FHWA funding programs supporting deployment (e.g., the Accelerated Innovation Deployment [AID] Demonstration Program).
  • Conduct of systematic reporting and follow-up to track progress in implementing Innovations, document benefits and lessons learned, and assess the diffusion of EDC-supported Innovations across the states.

After a Round is completed, support for many EDC Innovations is continued by FHWA teams, including the FHWA Resource Center and Headquarters Program Offices, ensuring that content developed by the EDC Program remains accessible and that states wishing to implement an EDC Innovation in the future can access those resources.

1.2 Evaluation Objectives

As noted earlier, this evaluation was conducted under the TFPE Program, where FHWA seeks an assessment of the efficiency of FHWA activities, the way that activities are implemented, the effectiveness of activities in terms of performance and cost, and the outcomes attributable to those activities. This evaluation faced several challenges in pursuing these objectives:

  • The efficiency of a program is best measured by comparing it to peer efforts; however, very few federal programs resemble the EDC Program, especially in its focus on the diffusion of market-ready innovations.
  • The EDC Program has encompassed a wide number and diversity of Innovations, leading to variation in implementation practices and patterns.
  • Given the diversity among its Innovations, measures of efficiency (outcomes relative to inputs) also vary widely and cannot be captured by a standard set of metrics.
  • The cost-effectiveness (outcomes relative to funds invested) of the EDC Program cannot be calculated easily, given that some of its key outcomes (such as improvements in the adoption of Innovations at state DOTs) cannot be expressed as financial gains.
  • The outcomes of the EDC Program are influenced by many factors beyond the control of the program. By design, the EDC Program is state-led, and FHWA cannot compel implementation of EDC Innovations. This situation complicates any effort to attribute changes in the surface transportation ecosystem precisely to the EDC Program’s efforts.

The EDC Program works through its Innovations to change behaviors and activities in state DOTs by providing DOTs with information, resources, and guidance in adopting unfamiliar technologies and practices at a faster rate. Adopting an innovation is a risky activity, and state DOTs have a reputation for being risk-averse. The aversion to risk is well-justified; state DOTs often implement costly projects using public funds, and the consequences of failure can be very visible to the public (e.g., cost overruns in a construction project or failure to ensure that roadways are equipped to handle growing traffic or reduce accidents).

The evaluation of the EDC Program focused on how the program influenced state DOT decision-making about the implementation of EDC Innovations. The evaluation was designed to assess four aspects of the EDC Program’s operations and results:

  1. Strategy and Processes. The evaluation looked at how well the EDC Program’s design and the management of program activities supported the EDC Program’s key objective—accelerating deployment of proven yet underutilized Innovations.
  2. Outcomes. The evaluation gathered evidence showing what the state DOTs were able to achieve with the aid of the EDC Program in terms of implementing Innovations and improving the deployment of Innovations over time.
  3. Effectiveness. The evaluation examined how the expected outcomes of the efforts to promote EDC Innovations aligned with FHWA’s mission, consistent with the federal role in the U.S. surface transportation sector.
  4. Lessons Learned. The evaluation explored options for future revisions to the EDC Program’s design and processes, including the potential trade-offs that each option would involve.

The goal of this evaluation was to provide FHWA with data, findings, and perspectives about the accomplishments of the EDC Program, its possible strengths and weaknesses, and ideas to consider for future policy and program changes aimed at improving the program’s outcomes.

1.3 Innovation Diffusion and Adoption Frameworks

To develop the evaluation design, the authors of this Special Release drew on frameworks and theories in the published scholarly literature on the diffusion of innovations. Much of the research on innovation focuses on how organizations generate innovations—less work examines how organizations adopt innovations. A common observation is that even those innovations that offer significant advantages over current technologies or processes take longer than might be expected to achieve widespread acceptance and adoption. The study of the diffusion of innovations emerged from multiple disciplines, including psychology, communication, anthropology, and marketing (Rogers 1976). [Prof. Everett Rogers wrote one of the seminal works integrating various streams of literature, Diffusion of Innovations (Rogers 1983).] The evaluation herein uses insights from Rogers and other scholars of innovation diffusion to analyze how the EDC Program influences the adoption of innovations in state DOTs.

In the context of the EDC Program, an “innovation” is framed using Rogers’s construction. An innovation is not necessarily something “new to the world” (i.e., a technology or system that is newly invented) but rather something new to the adopting organization. This encapsulates the EDC Program’s objective of accelerating the deployment of “proven, yet underutilized” innovations. An EDC Innovation may already be well-tested and mature, but state DOTs may still be largely unaware of it.

Rogers’s theory examines four elements that shape how rapidly an innovation is adopted:

  1. The characteristics of the innovation (i.e., what would convince an organization to use that innovation);
  2. The communication channels for learning about the innovation;
  3. The time required to decide to use the innovation; and
  4. The social dynamics within the organization or its immediate community.

Although somewhat idealized, the model developed by Rogers identifies five stages in adopting an innovation:

  1. Awareness (when the adopter is alerted to the existence of an innovation);
  2. Information and persuasion (when the adopter learns the advantages and disadvantages of adoption);
  3. Decision (when the adopter commits to implementing the innovation or rejects it);
  4. Implementation (when the adopter attempts to use the innovation in an operational context); and
  5. Confirmation (when the adopter validates the decision to adopt and assesses whether and how to continue use).

To accelerate the deployment of innovations, the EDC Program’s activities would need to facilitate progress across all five stages. As noted by some critics of the Rogers theory (van Oorschot, Hofman, and Halman 2018), an organization’s perception of an innovation can slow or complicate adoption. An organization may be reluctant to implement an advantageous innovation if the organization is ill-equipped to appreciate the benefits relative to the costs of that innovation (an information deficit). An innovation may pose one or more of the following types of risks to the adopting organization:

  • Technical—the innovation may not perform as promised/expected;
  • Organizational—the innovation may disrupt norms and processes (e.g., by requiring joint action across organizational silos);
  • Political—implementing the innovation may result in a backlash by the organization’s members and supporters (e.g., by creating a performance problem that undermines public trust); or
  • Workforce and capacity—the organization may lack the skills and expertise needed to implement the innovation properly.

Even with a strong business case for adoption, implementing the innovation may be blocked by an organization’s inherent risk aversion if the organization is skeptical that the benefits of the innovation will exceed the actual risks. As a result, the organization will demand additional evidence or strong external incentives before deciding to implement the innovation (Daglio et al. 2015). This situation is prevalent throughout the public sector across different countries. The Organisation for Economic Co-operation and Development (OECD) has spent considerable time and resources on frameworks for innovation in the public sector during the past decade to “help governments understand and harness their innovative capacity” (Kaur et al. 2022). The OECD’s Observatory of Public Sector Innovation (OPSI) developed the Innovative Capacity Framework to identify approaches to promote innovation in public-sector agencies. The Framework takes a broad view of three levels of the systemic elements and actors within the public sector—the individual (including team dynamics), the organization, and the public-sector system (including broader global and environmental influences)—and relates these to the following four areas:

  • Purpose: What elements across the system are driving the intent to innovate?
  • Potential: What elements across the system may influence whether innovative efforts are attempted?
  • Capacity: What is needed to implement innovative efforts, including testing and piloting?
  • Impact: How is the effect of innovative efforts understood, and how does it inform future practice?

Whereas the Framework identifies factors and evidence across three levels and four focus areas, in practice, the Framework needs to be tailored and contextualized depending on the ambitions, context, and constraints within the focus area (Kaur et al. 2022). This CRP Special Release incorporates guidance provided by the Framework. Similar to previous studies (Arundel, Bloch, and Ferguson 2019; Arundel, Bloch, and Ferguson 2016), this evaluation uses the Framework to adapt questions about private-sector innovation for use in the study of public-sector innovation and analyzes the effects of the EDC Program on innovation capacity using some of the constructs in the Framework. This Special Release also recognizes that external conditions may affect the deployment of innovations within an organization. The Technology Delivery System model addresses the effects of external conditions on innovation deployment, pointing out that contextual factors, such as the existence of technical standards and regulation and the presence of organizations providing complementary skills and capabilities that enable the appropriate use of the innovation, can influence the timing and success of innovation deployment (Wenk and Kuehn 1977).

Even if all conditions favor implementing an innovation, the OECD’s Framework for public-sector innovation points out that an organization’s culture may work against any decision to embrace that innovation. Anthropologists define organizational culture as the set of norms, incentives, and beliefs (often unwritten) that have evolved in an organization to shape behaviors. An organization may have a culture that is so risk-averse that it will reject almost any innovation, no matter how advantageous. Given that organizational culture is tacit rather than explicit, changing organizational culture is difficult and time-consuming. Organizations that have suffered from repeated failures in innovation will tend to create an internal culture that rejects further innovations.

One of the ultimate objectives of this evaluation effort was to characterize or measure the success of the EDC Program in altering the culture within state DOTs to make them more open to deploying innovations. As with risk aversion, repeated success in adopting Innovations makes organizations more amenable to adopting Innovations over time. The OECD’s studies indicate that innovation-oriented organizations develop incentive systems and organizational capacity that facilitate innovation deployment. As discussed in Chapter 2, this evaluation included an investigation of how the EDC Program encourages the development of systematic approaches in state DOTs for assessing potential innovations and developing the capacity to deploy innovations.
