Suggested Citation: "Summary." National Academies of Sciences, Engineering, and Medicine. 2024. Developing a Strategy to Evaluate the National Climate Assessment. Washington, DC: The National Academies Press. doi: 10.17226/27923.

Summary

The National Climate Assessment (NCA) was created in response to the Global Change Research Act (GCRA) of 1990.1 The GCRA created the U.S. Global Change Research Program (USGCRP, or Program), and called for a scientific assessment that “integrates, evaluates, and interprets the findings,” “discusses the scientific uncertainties,” “analyzes the effects of global change,” and “analyzes current trends in global change.” So far, there have been five NCA reports, along with additional reports and materials on specific topics.

The NCA goes through an extensive review process for technical accuracy, including review by federal agencies and multiple opportunities for public comment. These reviews have focused on the technical content of the report; less attention has been paid to the users and uses of the NCA and to how it has informed decision-making. An evaluation of the NCA completed in 2016 focused largely on the process for creating the report.

Perceiving a need for an updated evaluation with a broader focus, USGCRP asked the National Academies of Sciences, Engineering, and Medicine to convene an expert committee to develop a strategy for examining the uses of the NCA. In response, the National Academies formed a committee with expertise in the NCA development process, climate communication, the uses of the NCA, and engagement with audiences and participants. Notably, the committee was not charged with conducting an evaluation or producing a research design, but with preparing a strategy for evaluation design. See Box S-1 for the committee’s Statement of Task.

Working with USGCRP, the committee first sought to understand the purpose of the evaluation it was charged with informing. USGCRP clarified that the evaluation is intended to support a process of continuous improvement by the Program and that its findings may inform ongoing and future NCAs, as well as other USGCRP products. Thus, the committee observed, an evaluation designed only to measure the outcomes of the most recent NCA would be insufficient: USGCRP also needed information on why its products were or were not working with particular audiences. This in turn suggested that the original list of questions in the Statement of Task might not go far enough. For example, to support continuous improvement, it might also be important to examine the process used to create the NCA and to understand more about its uses, including which aspects have been effective or not for different users. The Statement of Task’s focus on the users of the NCA, their awareness of and engagement with NCA products, and the usefulness of the information contained in the NCA for decision-making critically informed the committee’s evaluation strategy.

The committee also determined that it was premature to create a detailed evaluation design. The Statement of Task requested a strategy, not a design, and much more would need to be known about USGCRP’s needs and priorities to create a design.

___________________

1 Global Change Research Act of 1990, 15 U.S.C. Chapter 56, Public Law 101-606, 104 Stat. 3096-3104, §106.


BOX S-1
Statement of Task

At the request of the U.S. Global Change Research Program (USGCRP), the National Academies of Sciences, Engineering, and Medicine (the National Academies) will establish an ad hoc committee to develop a strategy for evaluating stakeholder use of the National Climate Assessment (NCA) and selected other USGCRP products. The committee will develop criteria to prioritize the stakeholder groups that should be involved in such an evaluation, a conceptual and methodological framework and design for an evaluation, and plans for data collection and other information-gathering activities. The evaluation strategy will be designed to help determine to what extent these products meet decision and informational needs of selected stakeholder groups. Diversity, equity, inclusion, and justice (DEIJ) principles will be considered and incorporated in the evaluation strategy. The committee will not provide a technical review of the assessments.

The evaluation strategy will address the following questions:

Regarding usefulness

  • How and to what extent have stakeholders found NCA materials to be useful? What specifically has been useful?
  • Does the selection of topics, regions, sectors, and level of detail (e.g., time frame, spatial resolution, subsector concerns) in the NCA (and USGCRP products it references as well as related USGCRP products) adequately address the stakeholders’ needs?
  • What decision or informational needs were well-addressed by the NCA? What decisions can stakeholders make given the level of information provided?

Regarding decision making, future needs and missing information/details/tools

  • What future needs are anticipated? What additional types of decisions (if any) do stakeholders anticipate they would revisit given different topics and/or levels of detail? What information would be required to meet needs that USGCRP is not meeting already?
  • What decision or information needs did the stakeholders expect would be met by the NCA but were not?
  • For stakeholders whose decision or informational needs were not met by the NCA and selected USGCRP products, what is the reason? What other products/materials, including other USGCRP and non-USGCRP products, did they use, if any?

Stakeholder awareness and engagement

  • How aware or involved were different stakeholder groups in the NCA development process and how did this influence their use of the report? For stakeholders who were not previously aware of the NCA development process, how did they become aware of the NCA?
  • How effectively does the NCA development process engage historically marginalized communities and underrepresented stakeholders?
  • How understandable and navigable is the NCA, including the report documents and findings, and underlying supporting data? Is the NCA information presented in a format that informs decision making?

Such an effort would likely require a self-study on the part of USGCRP, perhaps with the assistance of a consultant or contractor, and would require a different type of process and schedule than the Statement of Task provided for. In addition, creating an evaluation design will require an iterative process, first prioritizing potential audiences and then developing appropriate strategies for each audience. Instead, this report provides a strategy for creating and implementing an evaluation design.

The committee performed several steps to complete its task. It sought presentations from both past NCA staff and users of the NCA. Working to refine the research questions provided by USGCRP, the committee determined that a logic model was needed first and developed a preliminary illustrative model, recognizing that USGCRP’s input would be needed to finalize it. The Statement of Task did not request a logic model, but the committee judged that creating one is a key step toward developing an evaluation strategy and design. While considering the audiences served by the NCA, the committee also determined that a structure was needed for examining the roles of the various audiences. Because information from the NCA spreads via existing networks, both formal and informal, to form a dynamic web of actors and organizations, the committee decided that an evaluation study would benefit from treating the audiences as having their own networks and from considering communications across audiences as representing a network of networks. Given the multitude of potential audiences, the committee decided that an evaluation study would be forced to prioritize among them and that audiences would differ in the extent to which data could be collected and analyzed. Thus, the committee settled on criteria for prioritization, based first on USGCRP’s information needs and second on the feasibility of engaging with each audience; these criteria can be used to set priorities among the audiences. The committee also explains in this report how different audiences might require different methodological approaches.

In addition to recommending an approach to evaluation design, this report identifies critical questions and decisions to be addressed by USGCRP, to obtain an evaluation that meets its highest-priority needs for enhancing the accessibility, usability, and appropriateness of the NCA, given its wide range of audiences and users and its goal of informing decision-making.

MULTIPLE AUDIENCES AND NETWORKS USE NCA INFORMATION

As specified in the Statement of Task (Box S-1), multiple audiences, with varied needs, might make use of NCA information. This is important in the context of evaluation because evaluators may need to customize approaches for each audience. In addition, the NCA is produced and functions within a complex environment in which there are multiple sources of information on climate change; these sources comprise networks that may retransmit the information from an NCA (with or without attribution to the NCA), customize or translate the NCA information (perhaps by adding new information such as for specific localities), or be based on sources other than the NCA. This situation complicates the task of an evaluator who wishes to focus specifically on the NCA, as it is difficult to isolate its role and impact. Moreover, the committee believes that it is useful to characterize this environment as a network of networks, with multiple nodes through which information is transmitted and modified. In the context of evaluation, developing an understanding of these nodes—specifically, how information from the NCA is used, modified, and transmitted—is thus important for measuring the outcomes of the NCA and understanding how and why it is used.
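
To make the network-of-networks idea concrete, the sketch below shows one way an evaluator might represent information flow from the NCA through intermediaries and flag candidate key nodes. It is illustrative only and is not part of the report: the node names and edges are hypothetical placeholders, and the use of the networkx library and betweenness centrality is one possible analytic choice among many.

```python
# Minimal sketch (not from the report): treating NCA information flow as a directed
# graph and using centrality to flag likely key dissemination nodes.
# All node names and edges below are hypothetical placeholders.
import networkx as nx

G = nx.DiGraph()
# An edge "A -> B" means A passes NCA-derived information to B.
G.add_edges_from([
    ("NCA report", "Federal agency X"),
    ("NCA report", "State climate office"),
    ("State climate office", "City planners"),
    ("State climate office", "Local media"),
    ("Federal agency X", "Extension service"),
    ("Extension service", "Farmers and ranchers"),
    ("Local media", "General public"),
])

# Betweenness centrality highlights nodes that sit on many transmission paths,
# i.e., candidate "key nodes" an evaluation might prioritize for data collection.
centrality = nx.betweenness_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
```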

Recommendation 5-1: In choosing which groups to study as part of an evaluation, the U.S. Global Change Research Program should seek diversity (including a focus on marginalized populations) similar to that of the participants and audiences with which the Program seeks to engage.

Recommendation 4-1: In designing an evaluation of the National Climate Assessment or related products, the U.S. Global Change Research Program should make use of network analysis as a tool for addressing the evaluation questions related to understanding who key actors are, how information is transmitted across multiple entities, which entities serve as key nodes for disseminating information, and how the network of networks supports that flow of information.

LOGIC MODELS TO CONNECT PROGRAMS AND OUTCOMES

In response to the Statement of Task, the committee developed a conceptual basis for the evaluation, using an established approach in evaluation, namely, the development of a logic model or theory of change for the program(s) to be evaluated. If USGCRP has not already done so, this is a first step in creating an evaluation design. A logic model displays how specified inputs, such as the NCA itself, are intended to produce a range of outcomes. Creating a logic model helps to establish what an evaluation should measure and how the relationships across various elements of the logic model should be modeled or considered.

Based on the information available to it, the committee believes it is currently premature to develop a fully refined logic model; such a model would require working directly with USGCRP to determine its goals, priorities, and beliefs about how the NCA is engaging with audiences and informing their decisions. Instead, in Figure S-1, the committee provides an illustrative logic model showing how such a model can be created and used, with the expectation that USGCRP could work with evaluators to modify, refine, or replace it. The column “What We Do” lists the products and services produced as part of the NCA; “Who With” lists USGCRP’s partners and audiences. The next three columns describe the potential outcomes of the NCA for its audiences.

FIGURE S-1 Illustrative logic model for an evaluation of the National Climate Assessment (NCA), shown in matrix form with examples for each logic model component; entries corresponding to requirements in the Global Change Research Act of 1990 are highlighted.
NOTE: GCRA = Global Change Research Act; NGO = nongovernmental organization.
SOURCE: Generated by the committee, adapted from Morton and Cook, 2023.

The column “Ultimate Impacts” lists broader potential societal impacts from the NCA, although these are outside the scope of the planned evaluation effort, which focuses on outcomes for users. Empty boxes are included in the figure to illustrate that these lists may not be complete.
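
As a sketch of how a logic model like Figure S-1 could be encoded for use in an evaluation, the snippet below maps each column to example entries. It is illustrative only: the “What We Do,” “Who With,” and “Ultimate Impacts” column names follow the text’s description of the figure, but the assumption that the three outcome columns correspond to short-, medium-, and long-term outcomes, and all of the specific entries, are hypothetical and are not the committee’s actual model.

```python
# Minimal sketch (illustrative only): a logic model represented as a simple data
# structure. Column names follow the figure as described in the text; the three
# outcome-column names and all entries are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    what_we_do: list = field(default_factory=list)        # products and services
    who_with: list = field(default_factory=list)          # partners and audiences
    short_term_outcomes: list = field(default_factory=list)
    medium_term_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)
    ultimate_impacts: list = field(default_factory=list)  # outside the evaluation's scope

nca_model = LogicModel(
    what_we_do=["NCA report", "Interactive atlas", "Public webinars"],
    who_with=["Federal agencies", "State and local governments", "NGOs", "Educators"],
    short_term_outcomes=["Awareness of NCA findings"],
    medium_term_outcomes=["NCA information used in plans and decisions"],
    long_term_outcomes=["More climate-informed decision-making"],
    ultimate_impacts=["Reduced climate-related losses"],
)

# An evaluation would measure elements in the outcome columns and test the
# hypothesized links from products and audiences to those outcomes.
print(nca_model.medium_term_outcomes)
```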

Recommendation 3-3: The U.S. Global Change Research Program should develop a logic model to describe how its products, including the National Climate Assessment, are hypothesized to achieve their intended outcomes.

OVERARCHING EVALUATION QUESTIONS FOR THE NCA

Using the illustrative logic model, the committee developed a list of six overarching questions. These are not intended to be used directly in the evaluation but instead are meant to guide evaluators in developing more specific data collection questions and ultimately measures of use of the NCA and factors affecting its use. The committee recommends that USGCRP and its evaluators build upon these questions by creating their own logic model and derivative evaluation questions.

  1. To what extent are priority audiences2 aware of NCA products and what are the most effective ways to increase awareness? How, if at all, did involvement in the development process contribute to general awareness and use of the report?
  2. How, and to what extent, did NCA products address information needs among priority audiences (i.e., what did they gain cognitively in terms of knowledge, skills, attitudes, capacities, etc.)?
  3. How, and to what extent, did NCA products address decision needs among priority audiences (i.e., what did they do as a result of using the products)?
  4. How did the attributes of the products and process contribute to how users feel, what they gain (e.g., cognition), and what they do (e.g., behaviors)? What about the products and process could be changed to make them more effective?
  5. How do the contextual factors described in the logic model influence how audiences feel, what they gain (e.g., cognition), what they do (e.g., behaviors), and how they mediate the use of the NCA?
  6. How does the network of networks factor into use?

Note that the sixth question has implications for new types of analysis not included in the previous five. Some types of questions will be relevant regardless of whether one thinks about a network (e.g., what specific parts of the NCA are used, what helps or hinders in accessing or making use of the information), but introducing the concept of a network raises questions such as how the various nodes are connected, which nodes are most used, and what happens to the information as it passes through a node.

Recommendation 3-4: The U.S. Global Change Research Program should use a logic model developed for the evaluation to generate a set of overarching evaluation questions and should consult with partners and selected evaluation users to ascertain whether answering those evaluation questions will meet evaluation users’ needs.

PRIORITIZING AMONG MULTIPLE AUDIENCES

The Statement of Task recognizes that the NCA has multiple audiences, and it requests criteria for prioritizing which audiences to include in the evaluation data collection. Both substantive and practical considerations come into play in evaluating outcomes across audiences and in selecting which audiences to evaluate. For example, two audiences, Congress and the President, are specifically named in the legislation that created USGCRP and the NCA, and thus may be presumed to be top-priority audiences for the NCA. However, a much wider array of audiences makes use of the NCA or could find it applicable to their decision-making and policymaking. The committee did not attempt to set priorities among these audiences, but instead presents criteria that USGCRP can use to select which audiences to include in the evaluation (see Recommendation 5-1). The criteria should also be weighed alongside available resources: the acceptable financial cost of engaging with a high-priority audience may be greater than for other audiences.

___________________

2 It may not be possible to cover all NCA audiences in an evaluation; thus it will be up to USGCRP and the evaluator to prioritize those included. The committee provides a list of criteria for doing so in this report, introduced in subsequent paragraphs.

Recommendation 5-2: The U.S. Global Change Research Program (USGCRP) should select audiences to include in evaluation based on the following criteria: importance in USGCRP’s logic model, including (1) the role of an audience in climate-related decision-making; (2) the role of an audience in the transmission of climate information for decision-making; (3) the generalizability of results from an audience to other populations; and (4) feasibility, diversity, and suitability for the evaluation question and method used. A targeted audience does not need to meet all of these criteria, but the audiences prioritized in an evaluation should meet these criteria collectively.
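
One simple way to apply these criteria is to rate each candidate audience against them and compare the results side by side. The sketch below shows such a comparison; it is illustrative only. The four criterion names paraphrase Recommendation 5-2, but the rating scale, the candidate audiences, and the scores are hypothetical, and an unweighted sum is just one possible way to combine ratings.

```python
# Minimal sketch (hypothetical audiences, scale, and scores): comparing candidate
# audiences against the four criteria in Recommendation 5-2. An audience need not
# score high on every criterion; the point is a side-by-side comparison.
CRITERIA = [
    "role_in_decision_making",
    "role_in_information_transmission",
    "generalizability_of_results",
    "feasibility_diversity_suitability",
]

# Illustrative 0-3 ratings that USGCRP and its evaluators would assign.
candidates = {
    "Congress and the President": {
        "role_in_decision_making": 3, "role_in_information_transmission": 1,
        "generalizability_of_results": 1, "feasibility_diversity_suitability": 1,
    },
    "State climate offices": {
        "role_in_decision_making": 2, "role_in_information_transmission": 3,
        "generalizability_of_results": 2, "feasibility_diversity_suitability": 3,
    },
    "K-12 educators": {
        "role_in_decision_making": 1, "role_in_information_transmission": 2,
        "generalizability_of_results": 3, "feasibility_diversity_suitability": 2,
    },
}

def total_score(ratings: dict) -> int:
    """Unweighted sum across the four criteria (one possible combination rule)."""
    return sum(ratings[c] for c in CRITERIA)

for audience, ratings in sorted(candidates.items(), key=lambda kv: -total_score(kv[1])):
    print(f"{audience}: {total_score(ratings)}")
```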

TAILORING METHODOLOGICAL APPROACHES TO SPECIFIC AUDIENCES

Depending on the characteristics of an audience and what USGCRP wishes to learn from it, different methodological approaches will apply. In-depth qualitative tools, such as personal interviews and focus groups, can be indispensable for understanding the issues involved, while statistical sample surveys can provide nationally representative quantitative measures of outcomes and barriers.

The size of the NCA audiences evaluated also matters. Some groups, such as K–12 educators or the general public, are so large that evaluators would either need to conduct a survey using statistical sampling (or make use of existing surveys that might have collected relevant data) or choose approaches that are more limited in scope but also less representative (e.g., conducting an online focus group; talking with curriculum developers; working with professional associations, such as the National Science Teachers Association).
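
For very large audiences where statistical sampling is the realistic option, a standard sample-size calculation illustrates the scale of effort involved. The sketch below uses the common formula for estimating a proportion, n = z²p(1 − p)/e²; it is a generic planning calculation, not a method prescribed by the report, and the 95 percent confidence level and 3-point margin of error are illustrative assumptions.

```python
# Minimal sketch (not from the report): sample size needed to estimate a proportion
# within a given margin of error, using n = z^2 * p * (1 - p) / e^2 with the
# conservative assumption p = 0.5. Confidence level and margin are illustrative.
import math

def sample_size_for_proportion(margin_of_error: float, z: float = 1.96, p: float = 0.5) -> int:
    """Return the completed responses needed to estimate a proportion within +/- margin_of_error."""
    n = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n)

print(sample_size_for_proportion(0.03))  # about 1,068 completed responses
```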

For some audiences, it may be best to start with exploratory data collection, such as interviews or focus groups, to learn what types of information people are able to provide and how they interact with climate science information. Audiences that are exposed to climate science primarily through intermediaries may require a different approach than those who deal with the NCA more directly.

Recommendation 6-1: The U.S. Global Change Research Program should design its evaluation and data collection plan so that the methods used for priority audiences can answer the overall evaluation questions identified in the logic model. The methods and approach chosen should be tailored to the audience and the evaluation question being investigated.

USING EVALUATIONS TO SUPPORT CONTINUOUS IMPROVEMENT

Finally, the evaluation of the NCA and its products is best considered not as a “one and done” process, but rather as an iterative process of learning over time. USGCRP made clear to the committee that the evaluation is intended to support a process of continuous improvement and that it was desirable to use the findings of the evaluation to inform ongoing and future NCAs, as well as other USGCRP products. For this, USGCRP needs an evaluation designed not only to measure the outcomes of the most recent NCA, but also to provide information on why its products are working or not working with particular audiences. To support continuous improvement, the committee believes that it might also be important to look at how the process used to create the NCA affects its uses, including what aspects have been effective or not for different users.

At times, exploratory analysis will be needed to determine what data can and should be collected. A case study of an audience (such as a particular nongovernmental organization) may be helpful before conducting a more comprehensive study, and it may provide valuable insights that help generate lessons learned to be applied to other entities. If USGCRP creates tracking and feedback mechanisms to be used in ongoing report preparation and dissemination, such data may be used to develop or improve later data collection. To the extent that the NCA continues to change, there will be value in examining how well those changes support USGCRP’s goals.

Recommendation 7-2: The U.S. Global Change Research Program should sequence evaluation into manageable components, allowing for iterative testing and learning about how to best pursue evaluation over time. Sequenced components may include conducting evaluability assessments, piloting focused on certain agencies or chapters of the National Climate Assessment, picking low-hanging fruit first, or developing case studies.

Recommendation 7-3: In communication about evaluation efforts, the U.S. Global Change Research Program (USGCRP) should aim for active two-way communication with users. Communication mechanisms may include ongoing feedback, interim findings, meetings to tailor the communication of evaluation findings to particular situations, and communication about how input was used that helps connect evaluation efforts with USGCRP’s objectives.

The size and diversity of the population that does and could use climate information to make decisions mean that an evaluation of the outcomes of the NCA and related products is necessarily ambitious. In this report, the committee introduces concepts from evaluation practice, including network analysis, that will be essential to understanding how the NCA and related products inform decision-making. The approaches recommended will require a significant commitment of time and resources from USGCRP. The committee believes that an evaluation of use will reveal both the large impact of federal climate science on decisions across the nation and the gaps and frailties that can be addressed in the future. Undertaking outcome evaluation is accordingly a major step for the Program. The committee suggests taking this important step to support and improve national assessments to come and to help prioritize USGCRP’s efforts on the NCA and related products. At the same time, a targeted effort to understand the outcomes and utilization of the NCA among various audiences can support more effective allocation of resources by USGCRP and its member agencies and a better understanding of how to serve the needs of the Program’s priority audiences and participants.

EVALUATION LEADERSHIP AND IMPLEMENTATION

To ensure that an evaluation addresses priority needs and opportunities for the Program and the wide range of audiences and decision-makers who use the NCA, engagement and buy-in by USGCRP leadership are critical, as these individuals have the seniority, authority, and responsibility to define the scope. USGCRP must determine the scope of the evaluation; the makeup of the team; the budget; and whether the needs can be met internally, externally, or through some combination of personnel.

Early in the process, USGCRP may consider forming an evaluation team, made up of leadership, staff, key partners, and/or others, to guide the evaluation process and provide diverse perspectives. The committee’s recommendations identify USGCRP as the primary actor responsible for decision-making. Additionally, the committee identifies actions and activities to be carried out by evaluators with expertise in designing and implementing evaluations.

Recommendation 3-1: The leadership of the U.S. Global Change Research Program should engage from the start in defining the evaluation scope and should ensure that the leadership perspective, as well as the necessary evaluation expertise, is incorporated throughout the design and implementation of the evaluation.

Recommendation 7-4: The U.S. Global Change Research Program should consider bringing in outside expertise and research capabilities—such as through contractors, consultants, grantees, or interagency agreements—to assist in designing and implementing the evaluation.
