The purpose of this chapter is to summarize the perspective of technology vendors who regularly engage with DOTs. Vendor insights are particularly useful to this research because they reflect best practices from across multiple DOTs spanning a variety of operational environments, needs, use cases, geographies, and levels of size, scale, and complexity. Another benefit of learning from technology vendors is that they are familiar with various DOTs and often interact with other entities with whom DOTs regularly coordinate, such as the federal government, state governments, other state agencies, and a variety of local governments (counties, cities, special districts, regional governments, and other municipalities). This breadth of “insider” knowledge and experience provides greater context and insight to DOTs embarking on an innovative technology initiative. In practice, DOTs can learn from the technology vendor community how to avoid typical pitfalls of the procurement process that might lead technology vendors to decline to respond to solicitations or otherwise negatively influence their proposal responses.
This chapter focuses solely on feedback from technology vendors concerning how DOTs can improve their solicitation practices when procuring innovative technologies.
Finally, the research team notes that interviews with technology vendors also covered post-solicitation best practices. Those interview questions centered on how DOTs can use their procurement processes to better prepare for a successful technology roll-out and adoption process.
The subsequent sections of this chapter are organized as follows:
The content of this chapter is based on interviews with 15 technology vendors with experience working with DOTs (and who often bring additional experience with other state and local governments, airports, and transit authorities) across a variety of technology types and contexts.
Participating vendors had experience with various software systems; reality capture technologies; data analytics and visualization; and professional services for supporting technology implementations. For each vendor, the interview typically involved a small team of two to five participants who represented relevant perspectives within their organizations. Typical job titles/roles of interview participants included:
There were several consistent themes in the vendor responses to this question, each of which is discussed in the subsections that follow.
First, technology vendors noted that different technology types carry different cost bases on which the vendor community will generally price a proposal. Their primary feedback was that DOTs should attempt to understand the cost basis of each new technology being procured so that this information can be emphasized in the SOW when advertising to the vendor community. In other words, DOTs should ensure their SOWs provide the primary inputs vendors will use to prepare their cost proposals.
Following are several common categories, each of which may be a cost driver for a variety of technology types:
The purpose of identifying this information is to ensure clarity and transparency in the pricing methodology used by participating vendors who respond to the RFP. Ultimately, this supports a greater probability of receiving cost proposals that are parallel and thus can be directly compared for evaluation purposes.
However, it is essential to recognize that the pricing frameworks employed by vendors are not perfectly consistent, even for similar technological solutions. Some vendors offer unlimited-use pricing structures, while others price by volume, by users, by total employees, by modules, and so forth. In other cases, the same vendor may have flexibility across multiple pricing models. For example, one vendor noted that they historically had a single pricing model but have since diversified to offer multiple pricing structures. The purpose of diversifying was to best fit the client’s cash flow requirements, which may vary depending on the project being proposed. Even so, this vendor still wanted the primary cost basis information, which can then be input into their various pricing models for consideration.
Based on this feedback, DOTs can start their scoping process by understanding the pricing models likely to be used by vendors in the marketplace. This information can then be prioritized in the scope development process, and a comprehensive scope document can cover the major items noted previously to account for pricing variations across the technology sector. DOTs can also emphasize durability and longevity when outlining technology needs and requirements. This applies to various infrastructure elements such as highways, bridges, traffic management systems, emergency detection and response mechanisms, transit centers, vehicles, and equipment. Special attention may be given to resilience against extreme weather conditions, including heat, freezing temperatures, rain, wind, and snow. Consideration of warranty issues and the availability of ready-to-go replacements or spare parts should also be included in the SOW.
It is important that the DOT carefully consider the integrations that the new technology will be required to perform with the DOT’s existing technology systems. If the DOT wants to achieve competitive pricing on the integrations, then each integration (for which a cost proposal will be requested) needs to be detailed in the RFP’s SOW. This information cannot be left open-ended, or vendors will add substantial contingencies to their pricing.
“System integrations are the biggest—and riskiest—items for vendors to bid on.”
—Technology implementation specialist
For each integration, vendors will want information on the following (however, the DOT should check with their internal cybersecurity team prior to posting this information for public viewing):
Sometimes it is worthwhile to leave the integrations out of the upfront pricing competition or to request them as a separate bid item. This also keeps these costs out of the base pricing, yielding cheaper and more accurate costs. If the DOT keeps integrations in the base price, then vendors may either bid the bare minimum (to achieve the lowest cost) or add padding to cover the worst-case scenario (making them less competitive than the risk-takers). Either way, this type of guesswork over ill-defined integrations leads to an unbalanced cost competition in which proposals are not comparable.
Note that some vendors prefer to help define the requirements first and price them later, because the range in pricing can be so wide. Integrations can cost a few thousand dollars, or tens or hundreds of thousands of dollars if vendors must build middleware between systems.
Vendors stated that the “big picture” behind the DOT’s internal motivation for the project is more important than trying to account for every detail in a list of itemized requirements. In other words, the DOT should be comprehensive in painting the boundaries of the scope instead of trying to perfect every detail and quantity. Sometimes vendors need less detail than the internal project team may think.
One example was shared by an asset management vendor, who said, “Vendors don’t really care about the specific count of how many assets you have; rather, they care more about the different types of assets. So just tell us the list of different types of assets you plan to include, and this will help vendors to better scope out their responses.” This feedback may be beneficial to some DOT project teams who feel overly burdened by expectations of needing to collect comprehensive information about their scope.
Vendors consistently noted that the quality, content, and surrounding procedures of a client’s RFP and SOW can heavily influence their “bid vs. no-bid” decisions. Three primary themes emerged across the interviews and are discussed in this section.
Vendors suggested that DOTs should be careful about the emphasis they place on the cost evaluation factor in their RFPs. Overprioritizing cost (or not specifying its weight at all) may deter vendors by giving the impression that the primary concern is securing a bargain rather than acquiring top-tier technology. Additionally, DOTs should consider a procurement strategy that weighs “best value” against lowest cost, focusing on return on investment and emphasizing the quality, effectiveness, and efficiency of the technological products in the outcome.
Technology vendors noted that they will meticulously examine the set of requirements published in the RFP. At times, these requirements can tilt in favor of a particular technology or vendor. Observing such information makes certain technology vendors believe they are at a disadvantage due to the client’s potential pre-existing inclinations. Therefore, it is advisable for DOT project teams not to embed requirements that exclusively favor (or even give the perception of favoring) a specific technology or vendor.
A scenario where this becomes problematic is when the DOT’s project team has conducted extensive market research prior to releasing the RFP. For example, common market research practices include searching the websites of potential technology vendors, referring to vendor-provided materials such as brochures (perhaps obtained at a conference event or downloaded via the vendor’s website), or even gaining insights from online demonstrations or other “samples” published by technology vendors. Although these avenues may be legitimately helpful for general market research purposes, a risk arises if the DOT’s project team uses any verbatim content from a vendor’s materials.
When the DOT uses verbatim content from a vendor’s website, brochures, or other marketing materials, this is likely to be recognized by competitors. Most technology vendors are familiar with one another’s websites and associated marketing materials. Vendors who participated in these interviews noted that if they see language or terminology that appears to be sourced from a competitor’s content, it can send a message (correctly or incorrectly) that the competitor has formed a prior relationship with the DOT’s project team members, thereby gaining a perceived advantage. Therefore, it is preferable to avoid using vendor language or content, even with the best of intentions. Doing so can inadvertently signal that the evaluation process will be biased, even if that was not at all the intent. DOTs should instead take the time to ensure their SOW and requirements are written in their own words, because this signals a fair process for all technology vendors.
Vendors stressed that DOTs should provide sufficient time for vendors to craft their proposal responses. If a potential client sets an overly aggressive deadline, vendors might decline to participate because the time constraints prevent them from submitting a proposal that meets their quality standards. DOTs should remember that vendors may be weighing multiple RFP opportunities at any given time, especially vendors who provide technologies to several client sectors. Assuming limited resources on the vendor’s part, a client using an overly accelerated timeline may nudge the vendor to “no bid” in favor of opportunities with more reasonable timelines.
Regarding what is a reasonable time period, the vendors interviewed were fairly consistent. In general, vendors advised DOTs to give a minimum of 1 month for vendors to work on their proposal responses. Another suggestion was to ensure a 2-week gap between the release of the final addendum (e.g., in response to submitted questions) and the ultimate submission date. Vendors noted that they often adjust their proposals in response to questions and answers that are published in the addenda, necessitating this time. For larger and more complex projects, vendors suggested broadening the response period to roughly 6 weeks. The main purpose was to allow for two rounds of questions and answers; for example, vendors would have time to submit their initial questions, then review the addendum containing the DOT’s answers, and then ideally vendors would have time to submit any follow-up questions that might be necessary. Doing all this typically requires more than a 1-month duration, hence the suggestion for 6 weeks.
Finally, vendors noted that if hard copy submissions are required, an additional 3 to 4 business days should be factored in for printing, shipping, and delivery.
“If the client has a budget, please release it. This lets each vendor better propose how to implement the project to meet scope and budget constraints.”
—Software provider
When asked about the common pitfalls they observe within the RFPs of their prospective clients, the participating vendors identified a number of common issues. None of these issues was large enough to be a “deal breaker” that would cause a vendor not to propose. However, the vendors indicated that addressing these pitfalls would make the RFP more attractive and enhance their ability to prepare a more complete, accurate, and tailored proposal response. Each of these pitfalls—along with suggested solutions—is discussed in the following subsections.
According to the interviewees, the most commonly omitted section in a client’s RFP and SOW is the description of the current state environment. It is vital to include this description, as vendors highly prioritize this information when building their proposal response and pricing.
In the interviews with various vendors, a recurring theme was their confidence in their product and its capabilities. Their main focus is on understanding the DOT’s existing systems, workflows, business needs, and use cases. If vendors are kept uninformed about the operational scenarios in which their technology will be most used, they are likely to incorporate an additional margin in their quote to account for the uncertainty.
To the extent possible (respecting cybersecurity concerns), vendors said it is crucial to detail the current systems or legacy technologies that will be replaced, highlighting factors such as their operational duration and how well they meet the DOT’s anticipated functions, use cases, or other desired capabilities. If the DOT is transitioning from a legacy system, vendors value insights into the motivation behind the shift. If the DOT’s procurement need is for ITS/traffic operations, it would be advisable to include the respective ITS architecture plan, technology mapping, and other related plans and even policies to help the vendor understand the DOT’s technology goals, roadmap, and needs. What issues or shortcomings does the current system pose? Which elements dissatisfy the DOT?
Vendors suggested that DOTs give a snapshot of their overarching tech infrastructure (again, respecting cybersecurity concerns). This includes details such as whether the client is dependent on a unified tech stack, whether specific security norms are vital for a software vendor to meet, and other database-related aspects that vendors must conform to.
Vendors suggested that the project budget be disclosed, asserting that this is ultimately in the DOT’s best interest. Vendors noted an interest in understanding the client’s budgetary limits because they may be able to offer diverse financial plans for their solutions.
Vendors also suggested that DOTs specify their expected implementation schedule for the new technology’s deployment. This can be as straightforward as indicating, “the DOT aims to begin on X date and conclude by Y date.” This clarity ensures a level playing field for all vendors during the proposal process. The RFP process can also be structured to accommodate alternative schedule options and account for associated benefits. Vendors requested that DOTs identify major legal requirements or other unusual contractual constraints up front in the RFP.
Depending on the type of procurement used, 2 CFR Part 200 and/or the Federal Acquisition Regulation (FAR) may prohibit this practice. If using federal funds, DOTs must follow the CFR and FAR. Also, some state procurement laws may prohibit this practice. Technology vendors could benefit from learning about federal and state procurement laws.
Vendors noted that the expected contract duration should align with the cost proposal format the DOT utilizes in their RFP. For instance, if a 3-year initial contract is anticipated by the client, the cost proposal should solicit pricing for the same 3-year span.
Following are a few other procedures:
Vendors are interested in learning about the DOT’s underlying reasons for the new technology procurement. Vendors suggested that DOTs directly answer the question: “What’s driving the need for this solution?” Essentially, vendors are seeking to understand what challenges the DOT is aiming to address. Multiple vendors noted that this information is as important as granular or itemized requirements.
In addition to understanding the “Why?” behind the project, vendors also want to understand the answer to “Why Now?” This helps explain the urgency of the project and underlying business support. According to the interviewees, if the answer is not convincing or is missing altogether, then the vendor community may fear that the project is at risk of being canceled or experiencing an evaluation process that becomes prolonged and disorganized (stemming from a project team that may be unsure of their immediate needs).
DOTs may also provide insights into the larger initiatives the technology will support. As an example, vendors supplying advanced project management and planning tools would appreciate it if clients could share their anticipated capital expenditure for the upcoming 3 years. This type of information adds greater context to the project and directly illustrates a tangible need or operational motivation.
Vendors requested that DOTs clearly prioritize their list of requirements, which enables vendors to better structure their proposal response, pricing, and general offerings to best fit the DOT’s goals. This can be done simply; for example, one interviewee suggested that the DOT’s itemized list of RFP requirements be tagged with a simple nomenclature system, such as the following:
When it comes to technology demonstrations, vendors requested that DOTs provide a clear agenda for demos that emphasizes the capabilities the evaluation team wishes to see rather than a stringent list of requirements. For effective demonstrations, the script should be structured around use cases that cover the most important functionalities. While many client teams lean heavily on listing requirements they want demonstrated, vendors stressed that DOTs should not feel obligated to dictate every detail of what they wish to see. A more practical approach is to request, “Show us how the XYZ capability functions within your technology” or “Show us how your technology performs XYZ task.” When crafting use cases, the emphasis should be on the desired capabilities (“Show me XYZ”) instead of detailed instructions and mock scenarios. Use cases can also pivot according to different user perspectives and how a technology integrates into workflows.
Vendors also noted that demonstrations should not become so detailed that they overlook what makes a particular technology stand out from competing options (vendors can become overly focused on simply “checking the boxes” on a list of requirements rather than cohesively demonstrating how the new technology can be leveraged to best meet the DOT’s operational needs). Therefore, vendors suggested that DOTs pose questions such as, “What distinguishes your technology?” or “How does your product stand out?” It can be beneficial to engage vendors who can intuitively understand client needs and demonstrate their technology’s fitness based on experience.
Additionally, vendors were dismayed that clients rarely asked about operational efficiencies and cost savings during demonstrations. They suggested that DOTs prompt vendors to highlight the beneficial outcomes their product can achieve, such as enhanced efficiency, monetary savings, and time savings. This will help evaluators understand how the demonstrated capabilities will ultimately deliver value to the end user and, in turn, the DOT. It is acknowledged, however, that public sector agencies can have greater difficulty than the private sector in capturing the benefits of operational efficiencies. This is due to several factors, including differences in budgeting and greater variability in how benefits are measured. For example, the private sector often relies on relatively straightforward measures of profitability, whereas the public sector often measures less directly quantifiable outcomes, such as benefits to the public’s experience (in addition to other metrics).
Finally, to maintain engagement among the DOT’s evaluation team, demos should ideally be capped at 1–2 hours and limited to two demonstrations per half-day.