Understanding and Addressing Misinformation About Science (2025)

Suggested Citation: "9 Conclusions, Recommendations, and Research Agenda." National Academies of Sciences, Engineering, and Medicine. 2025. Understanding and Addressing Misinformation About Science. Washington, DC: The National Academies Press. doi: 10.17226/27894.

9

Conclusions, Recommendations, and Research Agenda

The committee was tasked with characterizing the nature and scope of misinformation about science and its differential impacts; identifying solutions to limit its spread; and providing guidance on interventions, policies, and research toward reducing harms from it. To this end, the committee examined the existing body of evidence from diverse disciplines investigating misinformation about science (e.g., agricultural science, communication, computational social science, engineering, history, information science, journalism, law, media studies, political science, psychology, and sociology) to yield this consensus report.

This chapter summarizes the committee’s conclusions and outlines 13 recommended actions toward prioritizing capacities, resources, and policies to better understand misinformation about science and intervene, when needed, to the greatest effect. The committee’s recommendations reflect prioritized actions to mitigate misinformation about science based on relative potential for harm. They also reflect today’s complex information ecosystem where action is needed at multiple levels (i.e., individual, community, organizational/institutional, societal) and involve a diversity of actors who are well positioned to employ specific mitigation strategies (e.g., community and civil society organizations, funders, media companies, policymakers, regulators, science communicators, scientists, medical professionals, and scientific institutions/organizations). Given the complex and multi-layered nature of the spread and impact of misinformation about science, the committee’s recommendations have the greatest potential to effectively mitigate the negative consequences of misinformation about science when implemented in concert, rather than independently.


In presenting the committee’s conclusions and recommendations, we begin by defining misinformation about science and describing the contemporary information ecosystem in which the science information environment is embedded. We then describe key sources of (mis)information about science and the possible paths of influence by which it originates and spreads. Third, we discuss the impacts of misinformation about science at different levels (individual, community, and societal) and describe which specific impacts are documented empirically. Finally, we review the range of interventions that are being employed to address misinformation about science and discuss their documented effectiveness.

DEFINING MISINFORMATION ABOUT SCIENCE

Establishing a clear definition of misinformation about science was an essential part of the committee’s charge. Currently, many terms are used in social science research to describe information that deviates from accuracy (e.g., conspiracy theories, disinformation, fabricated news, malinformation, misinformation, propaganda, rumor), but across different disciplines and methods, there is some disagreement about the key concepts within their meaning (Altay et al., 2023; Søe, 2021; Vraga & Bode, 2020). Misinformation, specifically, is often used as a broad term to describe falsehoods, and there is some debate about which phenomena are distinct from this broader concept (Altay et al., 2023; Vraga & Bode, 2020). Defining what constitutes misinformation about science is also nontrivial, in part because of the contingent nature of scientific consensus. Moreover, science can be poorly communicated or misrepresented, and there are currently no bright lines between scientific uncertainty, science done poorly, and misinformation about science. Thus, to provide clarity and focus for its analysis and to offer a guidepost for the broader research community, the committee developed a definition of misinformation about science (see Chapter 2 for further discussion of considerations for defining misinformation about science).

Conclusion 2-1: In both public discourse and in peer-reviewed research, the term misinformation has been used as an umbrella term to refer to various types of false, inaccurate, incorrect, and misleading information. The broad nature of the term has made it difficult to develop a coherent understanding of the nature, scope, and impacts of misinformation, and by extension, misinformation about science. To address the lack of a consistent definition, the committee has developed the following definition: Misinformation about science is information that asserts or implies claims that are inconsistent with the weight of accepted scientific evidence at the time (reflecting both quality and quantity of evidence). Which claims are determined to be misinformation about science can evolve over time as new evidence accumulates and scientific knowledge regarding those claims advances.

Given that scientific knowledge is not static, we reiterate that “at the time” is a significant component of the definition. That is, a scientific claim that is considered to be inconsistent with accepted scientific evidence at one point in time could, through the generation of new empirical evidence, become a reasonable alternative view on the topic at a later point. We also note that while the contingent nature of scientific agreement may be seen as a given from the perspective of scientists, updates to scientific understanding due to new evidence can inadvertently create confusion for non-scientists.

Much attention has also been paid to disinformation and its conceptual relationship to misinformation. In the literature, intent is often included as a distinguishing feature of this informational phenomenon (Freelon & Wells, 2020); however, intent is difficult to evaluate and operationalize. In defining misinformation, the committee determined that the motive or intent of the agent promulgating the information is immaterial to the potential impacts of that information on its recipient. Therefore, the committee considers disinformation about science to be a sub-category of misinformation that is circulated by agents who know that the science information they are circulating is false.

THE CONTEMPORARY INFORMATION ECOSYSTEM

The science information environment is primarily composed of scientific research findings and science news in both print and broadcast forms. Importantly, science news has been significantly affected by decades of newsroom cutbacks and the closing of local newspapers (Abernathy, 2018). As discussed in Chapter 3, the science information environment is nested within and shaped by the broader 21st century information ecosystem. This broader information ecosystem is characterized by advanced information and communication technologies that have greatly increased the volume and speed of the production and dissemination of both accurate and inaccurate science information. Moreover, the media system of the 21st century information ecosystem is a hybrid of interconnected digital technologies and media types (e.g., search engines, social media, internet websites, electronic broadcast media), as opposed to the predominantly analog media system of the 20th century. This means that science information can quickly travel across different channels and media types and, in some cases, can become divorced from the original context needed to appropriately evaluate its accuracy and reliability (e.g., a screenshot of a news article headline and photo circulating through online platforms apart from the associated content of the article). Additionally, online platforms have enabled a flattening of hierarchies across professional and social networks (a phenomenon that has been characterized as “context collapse” [Davis & Jurgenson, 2014]) in ways that enhance information exchange but may also blur the lines between reliable and unreliable sources of science information. Furthermore, the rise in the production and dissemination of information via generative artificial intelligence (AI), along with the more recent integration of AI into search engine results, may make it even more challenging to discern reliable science information (Memon & West, 2024). All of these advancements, though beneficial in many ways, add to the complexity that consumers of science information must navigate within contemporary online environments.

Conclusion 3-1: Though inaccuracy in scientific claims has been a long-standing public concern, recent changes within the information ecosystem have accelerated widespread visibility of such claims. These changes include:

  • the emergence of new information and communication technologies that have facilitated access, production, and sharing of science information at greater volume and velocity than before,
  • the rise of highly participatory online environments that have made it more difficult to assess scientific expertise and credibility of sources of science information, and
  • the decline in the capacity of news media that has likely reduced both the production and quality of science news and information.

In recent years, there has also been increased attention to public trust in science as a reliable source of information, including concerns that declines in trust may contribute to the spread and uptake of misinformation about science (Lupia et al., 2024). Current data suggest that trust in science and confidence in scientific institutions have actually fared better than trust in a number of other institutions over the last five decades (Brady & Kent, 2022; Krause et al., 2019). But a notable decline was observed in 2022, and levels of confidence in the scientific community vary significantly by partisan identity (Davern et al., 2024). Additionally, patterns of trust and mistrust in science and other civic institutions also vary across demographic groups and, in some cases, have been shaped by negative historical experiences with such institutions (Bui, 2021; Peña, 1997; Visperas, 2022). See Chapter 3 for a detailed discussion about trends in trust in different U.S. institutions over time.


Conclusion 3-2: Trust in science has declined in recent years, yet remains relatively high compared to trust in other civic institutions. Although confidence in the scientific community varies significantly by partisan identity, patterns of trust in science across demographic groups also vary as a function of the specific topic, the science organization or scientists being considered, or respective histories and experiences.

SOURCES AND SPREAD OF MISINFORMATION ABOUT SCIENCE

Misinformation about science is produced from a range of institutional and individual sources, and spreads through the same channels and many of the same mechanisms as accurate information about science (see Chapters 4 and 5 for detailed discussions of the various sources of misinformation about science and how it spreads). It can also be difficult to draw a clear line between sources of misinformation about science and the mechanisms for its spread. For example, a single source might create original content, produce it in various formats, and distribute it to others. This content might be picked up and shared by others, but it might also be “re-created” and packaged differently, making an individual both a “spreader” and a “producer.”

Historically, misinformation about science has emerged from news stories (e.g., news reports in the mid-1800s that misrepresented the tomato hornworm as a deadly caterpillar; O’Connor & Weatherall, 2019), from advertising (e.g., the marketing of oil made from rattlesnakes—so called “snake oil”—as an effective cure for various illnesses in the late 1800s; Jaafar et al., 2021), and even from research propagated by prominent scientific institutions (e.g., eugenics research in the 19th and early 20th century—based on the false premise of racial superiority through genetic inheritance; Grant & Mislán, 2020). Over time, it has become clear that misinformation about science is not limited to one type of medium or to a particular source, and there are differences across sources in terms of their relative influence on the spread of misinformation about science, and consequently, the potential impacts on individuals, groups, and society.

As discussed in Chapter 4, strategic campaigns are especially effective in spreading disinformation about science for profit or other personal interests, and in some cases, specific populations have been intentionally targeted by such efforts (e.g., the sugar industry targeting low-income communities and communities of color [Bailin et al., 2014], and the tobacco industry’s marketing to Black communities [Wailoo, 2021]). Misinformation about science can also unintentionally originate from the scientific and medical community, including university press offices, research organizations, and individual scientists and healthcare professionals. This can occur as a result of exaggerations or oversimplifications of research claims in press releases, distortions in the interpretations of scientific data, omissions of details about the preliminary nature of research published in the form of preprints, deviations from evidence-based science communication strategies when engaging in debates about science in the public arena, dissemination of misinformation to patients as a result of a medical provider’s inadequate knowledge of the latest consensus, and the propagation of false information based on racial and cultural biases (e.g., in medical textbooks). Other key sources of misinformation about science with great potential for influence include elite individuals, government actors, and news media outlets.

Conclusion 4-1: Misinformation about science is widely understood to originate from several different sources. These include, but are not limited to:

  • for-profit corporations and non-profit organizations that use strategic communication (e.g., public relations, advertising, promotions, and other marketing campaigns) to intentionally seed and amplify misinformation about science for financial gain, to advance ideological goals, or to mitigate potential losses,
  • governments and politicians that either publicly discredit the weight of evidence on science issues or seed misinformation about science as part of their political agendas,
  • alternative science and health media that advocate for treatments and therapies that are not supported by scientific evidence,
  • entertainment media through fictional and non-fictional narratives and plotlines that oversimplify, exaggerate, or otherwise misrepresent science and scientists to be compelling or for cinematic effect,
  • reputable science organizations, institutions, universities, and individual scientists as a byproduct of poor science communication, distortions of scientific data, the dissemination of research findings before they have been formally vetted and substantiated, and in the worst cases, scientific fraud,
  • press offices and news media organizations due to misrepresentation and misreporting of scientific studies, medical developments, and health issues, and
  • elite and non-elite individuals due to a variety of motivations.

Conclusion 4-2: Not all misinformation about science is equal in its influence. Rather, misinformation about science has greater potential for influence when it originates from authoritative sources; is amplified by powerful actors; reaches large audiences; is targeted to specific populations; or is produced in a deliberate, customized, and organized fashion (e.g., tobacco industry campaigns to cast doubt about the health risks of smoking).

Changes Within Journalism

Many adults in the United States get their science information from news outlets, making the quality and quantity of science news production especially important (Funk et al., 2017). In the past few decades, however, the institution of journalism has experienced decreases in funding, which have led to significant reductions in local news coverage and a major downsizing in reporting across many news organizations (Minow, 2021). At the same time, public trust and confidence in the media and journalists have steadily dropped: while 23% of Americans indicated high levels of confidence in the press in the 1973 General Social Survey (GSS), only 7% did in 2022 (Davern et al., 2024). This constellation of factors creates conditions in which misinformation about science can more easily spread.

Science reporting is often guided by journalistic values and norms that prioritize capturing public attention over careful consideration of the process of science and the nature of scientific evidence (Dunwoody, 2021). Layoffs in journalism have meant that journalists who lack specialized training in science are being assigned to cover science news (Dunwoody, 2021). Insufficient expertise in science and scientific research methods may make it challenging for journalists to correctly interpret scientific research and properly contextualize the findings in their reporting. Moreover, issues related to health and wellness, which in the context of this report are considered a category of science, are often reported on by journalists who specialize in lifestyle, trends, or general news (O’Keeffe et al., 2021; Tanner, 2004; Voss, 2002). This kind of topic-based segmentation within journalism also creates conditions that can lead to the unintentional spread of misinformation about science from news media organizations.

Conclusion 4-3: Journalists, editors, writers, and media/news organizations covering science, medical and health issues (regardless of their assigned beat or specialty areas) serve as critical mediators between producers of scientific knowledge and consumers of science information. Yet, financial challenges in the past two decades have reduced newsroom capacity to report on science, particularly at local levels.

Conclusion 4-4: Science reporting for the general public may be particularly prone to the unintentional spread of misinformation about science. Several factors can influence this, including journalistic norms (e.g., giving equal weight to both sides of a scientific debate, even when the scientific evidence overwhelmingly points in one direction), informational and ideological biases, over-reliance on public relations and other information subsidies (e.g., university press releases), exaggerations and omissions of important details from the original science articles, and insufficient scientific expertise among journalists, particularly at under-resourced news organizations.

Features of Online Platforms

Specific features of online platforms contribute to the spread of misinformation about science. For example, online platforms often have content prioritization algorithms that privilege emotional and controversial content over credibility, along with lax content moderation policies and practices (see Chapter 5). Such conditions also make it easier for dedicated purveyors (individuals and institutions) to spread misinformation about science. Additionally, while social media platforms are designed to make information easily accessible and shareable across social networks and groups, they may also create environments that distract from the truthfulness of the material shared (Epstein et al., 2023). In general, there is strong evidence that people prefer sharing true rather than false information (Pennycook & Rand, 2021) and share information with good intentions. Furthermore, individuals who share information about a science topic that is inconsistent with the weight of accepted scientific evidence at that time may actually believe that what they are sharing is a true interpretation of the scientific evidence. Nevertheless, a variety of factors can lead to either the intentional or unintentional spread of misinformation about science online.

Conclusion 5-1: Individuals share information for a variety of reasons—for example, to improve their social status, to express a particular partisan identity, or to persuade others to adopt a certain viewpoint. Individuals may inadvertently share misinformation in the process of sharing information, and this may be due to their confusion about the credibility of the information, their inattention to accuracy, or their desire to help or warn loved ones, among other reasons.

Conclusion 5-2: In some cases, individuals and organizations may knowingly share misinformation to profit financially, to accrue social rewards (e.g., followers and likes), to accrue and maintain power, to erode trust, or to disrupt existing social order and create chaos (e.g., trolling). These motivations may be especially incentivized in social media environments.


Conclusion 5-3: The spread of misinformation about science through social networks on social media and through online search platforms is affected by design and algorithmic choices (e.g., those shaping individualized feeds based on prior platform activity), permissive and loosely enforced or hard-to-enforce terms of service, and limited content moderation. Moreover, platform companies may not voluntarily implement approaches to specifically address such issues when they are in conflict with other business priorities.

Exploiting Trust in Science

Some purveyors of misinformation about science have leveraged the relatively high trust in science and the authoritative “voice” of science to facilitate the spread of misinformation. In some cases, these efforts take the form of intentional campaigns that employ key strategies to spread misinformation about science, such as manufacturing doubt about established scientific evidence; creating Astroturf campaigns (i.e., concealing conflicts of interest between the message and the source that sponsors it) to create the illusion of grassroots public support; promoting false balance in scientific debates (in part by exploiting journalistic norms requiring the coverage of “both sides”); and leveraging relationships with scientists or medical professionals who disagree with the prevailing weight of the scientific evidence to generate an illusion of credibility (see Chapter 5).

Conclusion 5-4: Science has traditionally been recognized as an authoritative civic institution that produces many benefits for individuals, communities, and societies. Yet, at times, scientific authority has been co-opted by individuals and organizations feigning scientific expertise, and by science and medical professionals acting unethically in ways that contribute to the spread of misinformation about science (e.g., speaking authoritatively on scientific topics outside of one’s area of expertise).

Recommendations to Promote the Spread of Accurate Information About Science

Considering the complexities of the current information ecosystem and what is known about sources and mechanisms of the spread of misinformation about science, the committee recommends the following prioritized actions to enhance the visibility and prevalence of accurate science information while minimizing the spread of misinformation about science.


Recommendation 1: Some corporations, strategic communication companies, and non-profit organizations have at times embarked on systematic campaigns to mislead the public, with negative consequences to individuals and society. Universities, researchers, and civil society organizations should work together to proactively counter such campaigns using evidence from science and science communication to mitigate their impact. For example, researchers, government, and advocacy organizations have come together to counter campaigns from the tobacco industry to reduce the public health impact of tobacco use. Similar efforts should be made for other scientific topics of public interest.

Recommendation 2: To ensure the promotion of accurate science information and reduce the spread of misinformation or misleading information from the scientific community:

  • Press offices of universities, research organizations, and funders of scientific research should consult with scientists to accurately report on their research findings and should have scientists review draft press releases prior to dissemination. Press releases should explicitly state that they have been reviewed by the authors of the underlying research, and the authors should be accountable for the approved content.
  • Universities, research organizations, and public and private funders of scientific research should encourage both their scientists and press offices to provide appropriate context—limitations and weight of evidence—when publicizing research from their organizations.

Recommendation 3: Scientists and medical professionals who are active in the public arena can play a critical role in communicating accurate and reliable science and health information to the public.

  • Scientists, medical professionals, and health professionals who choose to take on high profile roles as public communicators of science should understand how their communications may be misinterpreted in the absence of context or in the wrong context. They should work proactively with professional communicators and draw on evidence-based science communication strategies to include appropriate context, interpretations, and caveats of scientific findings in their public communication.
  • Universities and research organizations that encourage individual scientists to share their research with the public should provide them with training and support to take on such public communication roles.

For some topics (e.g., genetics, epidemiology), legitimate scientific evidence can be misused to support inaccurate beliefs (Lee et al., 2021; Nam & Sawyer, 2024), so it is especially important for researchers in such areas to be mindful of how their findings are communicated and disseminated. As such, the committee acknowledges that even if the actions in the above recommendation are taken, they may not prevent others from sharing the information in ways that distort the original or intended meaning. Therefore, it is also important for these stakeholders (i.e., scientists, medical professionals, health professionals, universities, and research organizations) to evaluate when the benefits of sharing this information publicly outweigh the risks of the information being distorted. Notwithstanding these considerations, the committee sees a critical need for more support for scientists to effectively engage with the public in order to minimize the spread of misinformation about science (see Chapter 4).

Recommendation 4: To promote the dissemination of and broad access to evidence-based science information, funders of scientific research (e.g., federal science agencies, non-profit and philanthropic foundations) and non-partisan professional science organizations (e.g., American Association for the Advancement of Science, American Association for Cancer Research, American Psychological Association, American Society of Plant Biologists) should establish and fund an independent, non-partisan consortium that can identify and curate sources of high-quality (e.g., weight of evidence—quantity and quality) science information on topics of public interest. The consortium should also frequently review the science information from these sources for accuracy, needs, and relevance. It is particularly critical to ensure that such science information is openly and equitably available to all groups, especially underserved groups. Additional possible functions of the consortium could include:

  • identifying which sources should be included for curation,
  • providing ratings of accuracy for different sources,
  • creating short, accessible summaries of science information drawn from high-quality sources on topics determined by the consortium, and
  • reviewing the science information from different sources on a routine basis to update ratings of accuracy.

Recommendation 5: Online platforms, including search engines and social media, are major disseminators of true and false science information. These platforms should prioritize and foreground evidence-based science information that is understandable to different audiences, working closely with non-profit, non-partisan professional science societies and organizations to identify such information.


The approach of privileging accounts or information from organizations deemed credible has been suggested and implemented by some organizations, such as the Council of Medical Specialty Societies, the World Health Organization, and the National Academy of Medicine (Kington et al., 2021). By elevating information from such credible sources of science information, the expectation is that misinformation from other sources will recede into the background. Moreover, in contrast to post-hoc approaches to moderating inaccurate science information online, this recommendation reflects a more proactive approach that platforms can take to increase the supply of high-quality science information.

Recommendation 6: To support and promote high-quality science, health, and medical journalism:

  • Professional science and journalism organizations, funders of news media organizations and journalism, and universities should establish mechanisms for journalists and news media organizations to readily access high-quality science information and scientific sources, and for sharing best practices in science, health, and medical reporting. Such supports are especially important for those working in news media organizations with limited capacity or resources (e.g., local and community-centered newsrooms).
  • Funders of news media organizations and journalism should make intentional investments in local and community media (newspapers, television, radio, among others) to bolster their capacity to serve the science information needs of their audiences.
  • News media organizations should help to increase the visibility of high-quality science journalism and best practices in science and medical reporting through incentives, rewards, and other recognition models.
  • News media organizations should increase access to high-quality science journalism by dropping paywalls around critical and timely science and health issues.

Recommendation 7: In training the next generation of professional communicators in journalism, public relations, and other media and communication industries, universities and other providers of communication training programs should design learning experiences that integrate disciplinary knowledge and practices from communication research and various sciences and support the development of competencies in scientific and data literacy and reasoning. These competencies should be reinforced through continuous learning opportunities offered by organizations that support mass communication and journalism professionals.


The committee recognizes the limitations in adopting some of the recommended actions above, given the current financial challenges faced by the news media. Nonetheless, there is a critical need to support news media to maintain the capacity for high-quality science reporting, especially given its essential role in communicating science information to the general public.

Importantly, during times of crisis and emergencies, and when uncertainty and interest are high, journalists (national and local) become critical frontline communicators of science information (Altay et al., 2022; Van Aelst et al., 2021). Moreover, during times of emergencies, disasters, threats, and new crises, the need for high-quality science information and the potential for the spread of misinformation about science are particularly high. Additionally, researchers have demonstrated the negative impact that bad actors can have, specifically during a crisis event (Bennett & Livingston, 2018). Therefore, experts on emergency preparedness, disaster response, and environmental threat mitigation could also be important sources of credible science information for the public during such times.

Recommendation 8: Government agencies at national, state, and local levels (e.g., Federal Emergency Management Agency, Centers for Disease Control and Prevention, Food and Drug Administration, state public health departments) and civil society organizations (e.g., Association of State and Territorial Health Officials or National Association of County and City Health Officials) that deliver services during times of public health emergencies, natural disasters, threats, and new crises should contribute proactively to building and maintaining preparedness capacity for communicating science information at national, state, and local levels by:

  • developing internal workforce capacity to produce high-quality science information for the public,
  • bolstering capacity to engage and partner with diverse communities to understand their needs, goals, and priorities for high-quality science information,
  • establishing and maintaining trusted channels of communication across national, state, and local levels and between crises, and
  • working collaboratively with local news organizations to ensure that accurate, high-quality science information is disseminated to diverse publics, both during and in preparation for emergencies.

UNDERSTANDING THE IMPACTS OF MISINFORMATION ABOUT SCIENCE

Misinformation about science has the potential to directly and/or indirectly impact individuals, communities, and societies (see Chapter 6 for an in-depth discussion of the evidence on the impacts of misinformation about science). For example, misinformation about science can undermine evidence-based personal and policy decisions. It can also reinforce negative stereotypes about specific groups, exacerbating discrimination and marginalization, and in some cases, stoking violence. However, current evidence substantiating a direct causal relationship between misinformation and harmful behaviors is weak. Importantly, most studies exploring the link between misinformation, misbeliefs, and behaviors measure behavioral intentions (what an individual will likely do), rather than measuring behavior directly. Furthermore, the majority of studies investigating the impacts of misinformation focus on the individual level (e.g., Phillips & Elledge, 2022), with relatively limited evidence about societal-level impacts. The committee notes that impacts at the societal level can be challenging to measure, given that some societal harms are most consequential in the ways they accumulate over time.

Social Inequality

Drivers of social inequality, such as education, race/ethnicity, class, and geography, shape how different communities and groups are situated with respect to access to high-quality science information and also exposure to misinformation about science (Viswanath, 2006; Viswanath et al., 2022a). For example, many low-income communities and communities of color have disproportionately less access to high-quality science and health information, and science information is rarely translated into languages other than English. In addition, in the United States, platforms’ efforts to monitor and flag misinformation tend to prioritize content in the dominant English language (Bonnevie et al., 2023), leaving misinformation in other languages largely unchecked. All of these factors (i.e., disproportionate access to high-quality information, lack of in-language information, and unequal regulation of misinformation in non-English languages) create information voids within these communities.

Conclusion 6-1: Many historically marginalized and under-resourced communities (e.g., communities of color, low-income communities, rural communities) experience disproportionately low access to accurate information, including science-related information. Such long-standing inequities in access to accurate, culturally relevant, and sufficiently translated science-related information can create information voids that may be exploited and filled by misinformation about science.

Individual-Level Impacts

In the view of the committee, a primary reason that misinformation is important to understand is because it can lead to the formation of misbeliefs in individuals (Adams et al., 2023; van der Linden et al., 2023). These misbeliefs can, in turn, lead individuals to make ill-informed decisions for themselves and/or their communities. It is important to note that people can certainly hold misbeliefs independent of exposure to misinformation. Nevertheless, current evidence suggests that exposure to misinformation about science as well as psychological and social factors that increase individual receptivity to it, all play a role in causing potentially consequential misbeliefs about science.

Additionally, while all people have the potential to believe misinformation (Berinsky, 2023), there are a number of factors that, either alone or in combination, influence whether an individual will be more or less receptive to it (i.e., demonstrate more or less openness to engaging with misinformation, and ultimately believe it). These include characteristics of the information itself (e.g., repeated information), how people appraise the sources of information (e.g., credibility, trustworthiness, confidence), and their prior knowledge, attitudes, and beliefs (e.g., science literacy, strong attitudes or firmly held beliefs about a topic, worldviews, values) (see Chapter 6). Importantly, the current empirical evidence does not support a simple and linear relationship between science literacy and resistance to misinformation (i.e., being more discerning about accurate and inaccurate information, more critical of dubious claims or sources, or more open to updating beliefs based on new evidence). In fact, while it is clear that science literacy is an important factor in how people process and interpret science information, including misinformation, other factors are also involved in these processes, such as religious beliefs, values, social identity, media repertoire, and social networks (Kahan et al., 2012; National Academies of Sciences, Engineering, and Medicine, 2017).

Conclusion 6-2: Most research to date on misinformation, including misinformation about science, has focused on its relationship to individuals’ knowledge, attitudes, beliefs, and behavioral intentions. Some research has examined the impact of misinformation on behavior. From this work, it is known that:

  • Misinformation about science can cause individuals to develop or hold misbeliefs, and these misbeliefs can potentially lead to detrimental behaviors and actions. Although a direct causal link between misinformation about science and detrimental behaviors and actions has not been definitively established, the current body of existing evidence does indicate that misinformation plays a role in shaping behaviors, in some cases resulting in negative consequences for individuals, communities, and societies.
  • Individuals are more receptive to misinformation about science, and, consequently, most affected by it, when it aligns with their worldviews and values, originates from a source they trust, is repeated, and/or is about a topic for which they lack strong preexisting attitudes and beliefs.
  • Science literacy is an important competency that enables informed decision making but is not sufficient for individual resilience to misinformation about science.

As noted above, the impacts of misinformation about science on individuals are also shaped by social factors, including race/ethnicity, culture, socio-economic status, geography, community embeddedness (i.e., the closeness of interpersonal relationships and social ties within their community), and access to material and social resources (Crenshaw, 2017; Goulbourne & Yanovitzky, 2021; McCall, 2005; Smedley, 2012; Viswanath et al., 2000). Specifically, these factors influence what information individuals are exposed to, their information-seeking and -sharing behaviors, and what actions they may take in science-related contexts. For example, an individual who is a regular consumer of alternative health media (e.g., popular health-related TV talk shows, health blogs, websites that advocate for treatments not supported by scientific evidence) may hold misbeliefs about the efficacy of natural home remedies for serious illnesses, leading them to refuse conventional medicinal treatments—a decision that is linked to greater risk of death (Johnson et al., 2018). On the other hand, an individual may believe in the safety and effectiveness of vaccines and have access to sufficient vaccination information, but due to logistical difficulties (e.g., available times are inconvenient or inaccessible) or transportation difficulties, they may not get vaccinated.

Conclusion 6-3: Many individual-level factors such as personal values, prior beliefs, interests, identity, preferences, and biases influence how individuals seek, process, interpret, engage with, and share science information, including misinformation. Social factors, including race, ethnicity, socio-economic status, culture, social networks, and geography, also play a critical role in affecting information access. This constellation of factors shapes an individual’s information diet, media repertoires, and social networks, and therefore may also determine how much misinformation about science they encounter, the extent to which they engage with it, and whether it alters their beliefs.

Conclusion 6-4: The accuracy of the science information people consume is only one factor among many that influences an individual’s use of such information for decision making. Even when people have accurate information, additional influences can lead them to make decisions and engage in behaviors that are not aligned with the best available evidence. At the individual level, these include their interests, values, worldviews, religious beliefs, social identity, and political predispositions. At the structural level, access to material and social resources such as healthcare coverage, affordable nutritious food, internet connectivity, and reliable transportation, among others, may play a particularly important role.

Community- and Societal-Level Impacts

While most research on misinformation focuses on the individual level, some findings at the community and societal levels are emerging. Misinformation about science can disrupt the ability to discern reliable information from science for use in collective decision making, distort public opinion in ways that limit productive debate, and diminish trust in institutions that are important to a healthy democracy (see Chapter 6). Additionally, when misinformation about science reflects popular discourses in society about specific groups, it may be more likely to be perceived as factually accurate and accepted as true, with negative consequences for those groups (e.g., racialized discourses equating disease and illness with immigrants and foreigners, which then stoke anti-immigrant sentiments and violence; Paik, 2013). Misinformation about science can also perpetuate long-standing racism, discrimination, and marginalization, which in some cases, negatively impacts the extent to which some communities of color have access to, trust, and use information from the scientific community (Arvin, 2019; Gee & Ford, 2011; Kauanui, 2008; Teaiwa, 1994).

Relatedly, some populations have been direct targets of misinformation about science. For example, misinformation campaigns about menthol cigarettes have targeted African Americans for decades, with severe impacts (Anderson, 2011; U.S. National Cancer Institute, 2017; Wailoo, 2021), and more recently, misinformation about Ebola and vaccines has been targeted to African countries and Black neighborhoods, respectively (Campeau, 2023; Vinck et al., 2019). Indeed, some of the most troubling impacts of misinformation about science occur within public health when it delays the implementation of beneficial interventions; the relationship between misinformation and vaccine hesitancy is the most notable (Pierri et al., 2022; Viswanath et al., 2022b; Wilson & Wiysonge, 2020). For example, the 2017 measles outbreak among the Somali immigrant community in Minnesota was found to be associated with a significant drop in immunization rates among young children of Somali descent (from 90% in 2008 to 36% in 2014), following the targeted spread of misinformation about vaccines (that vaccines cause autism) within this community (Hall et al., 2017).

Given that significant health, educational, and wealth disparities across social groups already contribute to inequitable access to resources to support well-being (including credible science information), the impacts on communities that are typically targeted by misinformation about science may be compounded. These realities highlight the importance of a better understanding of the problem at the scale of the broader information ecosystem in order to delineate which structural factors may exacerbate the impacts of misinformation about science. Additionally, they suggest the critical need for more scrutiny and accountability within the current information ecosystem with respect to sources of information, as well as more supports for consumers of information (individuals and institutions) to navigate the current complexities.

Conclusion 6-5: Misinformation about science that concerns and/or is targeted to historically marginalized communities and populations may create and/or reinforce stereotypes, bias, and negative, untrue narratives that have the potential to cause further harm to such groups.

Conclusion 6-6: Overall, there is a critical need for continuous monitoring of the current information ecosystem to track and document the origins, spread, and impact of misinformation about science across different platforms and communication spheres. Such a process, like monitoring for signals of epidemics, could better support institutions and individuals in navigating the complexities of the current information ecosystem, including proactively managing misinformation about science.

Recommendation 9: Professional scientific organizations, philanthropic organizations engaged in supporting scientific research, and media organizations should collaborate to support an independent entity or entities to track and document the origins, spread, and impact of misinformation across different platforms and communication spheres. The data produced through this effort should be made publicly available and be widely disseminated. Various entities, including public health emergency operations centers, can serve as potential models for such collaborative efforts.


In thinking about more expansive ways to leverage the data collected through monitoring and documentation, these entities could also explore establishing thresholds of concern to inform if and when various stakeholders should intervene. Such intervention thresholds would also need to be commensurate with the values, needs, and priorities of the different communities. To a large extent, such monitoring also depends on the ability to access data from social media to track and identify misinformation about science and allow appropriate actions for interventions. Yet this has become increasingly difficult to do, as most social media platforms’ application programming interfaces (APIs) either disallow access to such data or make it difficult to access and analyze (see Chapter 8).

INTERVENING TO ADDRESS MISINFORMATION ABOUT SCIENCE

Efforts to address misinformation about science operate across many levels—for example, individual, community, platform, and the broader information environment—and have generally been implemented in a topically agnostic fashion, although some have been intentionally employed to address climate misinformation (Farrell et al., 2019; Lewandowsky, 2021a; Spampatti et al., 2024). Efforts to address misinformation about science typically target one or more of the four intervention points: supply, demand, distribution, and uptake (see Chapter 7 for a more detailed discussion of the range of existing approaches to address misinformation).

Supply-based interventions seek to reduce the volume of circulating misinformation and/or shift the balance in the quality of circulating information by either increasing the production of high-quality science content (e.g., by foregrounding credible information online or by providing funding to under-resourced newsrooms) or reducing the production of low-quality science content (e.g., through punitive measures, such as deplatforming, decredentialing, or content moderation). Demand-based interventions are aimed at reducing the consumption of misinformation with the premise that people seek out information to answer pressing questions they may have. These approaches are proactive and may include increasing trust in sources of credible information, increasing people’s ability to notice and avoid misinformation, and/or filling information voids to reduce the likelihood of people turning to misinformation.

Indeed, efforts to develop and promote ongoing trusted channels hold promise for addressing science information voids. Community-based organizations (CBOs), including some locally-owned businesses, non-profit organizations, and faith-based organizations, are trusted sources of information to community residents on many topics, including science and health. To this end, there is moderate evidence to suggest that cultivating relationships with, and relying upon, such trusted intermediaries can be an effective way to overcome informational challenges, as in the case of the Health Advocates In-Reach and Research Campaign (HAIR), which is a community-based network of barbers and stylists who engage in community health promotion around issues such as colorectal cancer, COVID-19, and vaccines (University of Maryland School of Public Health, n.d.). Yet, further research needs to be conducted to determine how to evaluate these interventions and replicate their successes across different communities.

Distribution-based interventions are designed to limit the spread of misinformation about science, and include psychology-based strategies to encourage evaluative thinking in individuals, voluntary actions taken by platforms to reduce the presence of misinformation through algorithmic changes (e.g., demoting or removing content from algorithmic recommendations), and governance approaches that involve law and policy mechanisms for regulating misinformation (e.g., mandated disclosure laws about the use of bots, amendments to Section 230 of the Communications Decency Act of 1996). The use of law or policy as a governance approach is currently an area of ongoing exploration for feasibility and effectiveness in addressing online misinformation about science in the United States and globally. However, long-standing free speech protections within the U.S. context, particularly protections afforded under Section 230 of the Communications Decency Act of 1996, have made it challenging to address misinformation at the distribution or platform level. Given these challenges, several efforts to reform Section 230 have been proposed, and existing laws, such as those dealing with libel or state-level mandatory disclosure laws, are being used to stem misinformation at the platform level. Promising approaches to content moderation are being tried in other jurisdictions, such as the European Union, and their suitability and feasibility for adoption in the United States remain to be fully explored.

Psychology-based interventions related to demand have been more widely adopted. These approaches include “accuracy nudges” to encourage individuals to consider the accuracy of the content they encounter before choosing to share (Pennycook & Rand, 2022a), introducing friction to slow down the speed at which individuals make decisions to share content (Fazio, 2020a), and presenting individuals with the gist or bottom-line meaning of their decision to share content (e.g., sharing or not sharing this content signals alignment with democratic values). Accuracy nudges have been successful in decreasing people’s actual sharing behaviors (Pennycook et al., 2021) and are relatively easy to implement; but similar to some demand-based interventions, the effect sizes are small and non-durable (Pennycook & Rand, 2022b). Moreover, the efficacy of accuracy nudges is also dependent on an individual’s ability to determine if information is accurate or not (Pennycook & Rand, 2022a).


Uptake-based interventions are designed to reduce the effects of misinformation about science on people’s beliefs or behaviors. These interventions assume that misinformation is already in circulation and seek to limit its negative effects on individuals before, during, or after exposure to it. This class of interventions includes prebunking, debunking, motivational interviewing (to enhance motivation in individuals to engage in health protective behaviors), and providing warning labels about source credibility. Uptake-based interventions are the most well-studied, particularly prebunking and debunking techniques. It is important to note that debunking efforts can be time-consuming and, in isolation, cannot sufficiently manage the pace and scale of widespread misinformation due to being inherently reactionary. Additionally, their effectiveness fades over time (lasting anywhere from a few weeks to a few years), and it is often difficult to ensure that fact-checks actually reach the individuals who were exposed to the original misinformation.

Conclusion 7-1: Many initiatives are currently exploring ways to address misinformation through various interventions. Such interventions have been generally implemented in a topically agnostic fashion and target one or more of four intervention points to combat the negative impacts of misinformation: supply, demand, distribution, and uptake. So far, there is no indication that a particular point is the best place to intervene, and many of the most effective interventions target multiple points.

Conclusion 7-2: Community-based organizations (CBOs) have attempted to mitigate misinformation in their communities and are particularly well positioned to do so because of their ties to the local residents, their awareness of local needs and concerns, and the trust that residents have in them. However, whether and when CBOs’ efforts to mitigate misinformation are effective, and whether they have sufficient capacity to do so, are not well-understood.

Conclusion 7-3: The role of legal and regulatory efforts to address misinformation about science remains to be explored more fully. Current approaches include several efforts to amend Section 230 of the U.S. Communications Decency Act of 1996 and mandated disclosure laws at the state level (e.g., laws that require warning labels about “deep fakes” or online “bots”). Other jurisdictions (e.g., the European Union) have deployed regulatory approaches to bolster content moderation practices on online platforms, which are being considered for adaptation to the United States context.


Conclusion 7-4: Providing warnings about common manipulative techniques and false narratives, providing corrective information (especially when accompanied by explanatory content), and encouraging evaluative thinking (e.g., lateral reading, accuracy nudges, friction) are effective individual-level solutions to specifically prevent belief in misinformation about science and reduce the sharing of misinformation about science by individuals, although the durability of these interventions is a common challenge.

Recommendation 10: To enhance the capacity of community-based organizations (CBOs) to provide high-quality, culturally relevant, accurately translated, and timely science information to the communities they serve, funders (e.g., government agencies, public and private philanthropic foundations) should provide direct funding to CBOs:

  • to identify and work with research partners to determine science information voids within the communities they serve and to develop strategies and products to fill them, and
  • to develop internal capacity and capability to routinely assess science information needs and build resilience against misinformation about science, particularly among those serving non-English speaking and other underserved groups (e.g., communities of color, low-income communities, rural communities).

Recommendation 11: Organizations at national, state, and local levels that are specifically engaged in mitigating the uptake of misinformation about science at the individual level should identify and utilize effective approach(es) that are best suited to their goals and the point of intervention (e.g., before or after exposure). For example:

  • When seeking to prevent uptake of misinformation about science prior to exposure, organizations should consider using prebunking techniques, such as anticipating common themes and false narratives widely used in propagating misinformation and proactively developing messages to counter them. For example, public health agencies and media organizations could counter false narratives spread by the tobacco industry to misinform the public about the impact of bans on mentholated cigarettes. Teaching people about common manipulation techniques used by propagators of misinformation about science is also effective.
  • When seeking to prevent beliefs in misinformation about science after exposure, organizations should consider using debunking techniques such as providing detailed corrective information. Instead of merely labeling a claim as false, organizations should explain why the claim is false and, if possible, highlight why the original source might be motivated to spread misinformation (e.g., an organization spreading doubt about climate change is funded by fossil fuel companies).

CHALLENGES TO UNDERSTANDING AND ADDRESSING MISINFORMATION ABOUT SCIENCE

While considerable progress has been made toward understanding the causes, consequences, and potential solutions to misinformation about science, there are also challenges to studying this phenomenon and mitigating its impact. These challenges include the scalability of interventions (e.g., the mismatch between single-shot and/or individual-level interventions, on the one hand, and, on the other, the dynamic, complex nature of misinformation about science at larger scales); testing efficacy versus effectiveness (e.g., the common use of artificial laboratory-based tasks for testing efficacy rather than real-world conditions); and obtaining high-quality data (e.g., inadequate data, including data absenteeism, and inconsistent data collection methods across contexts and populations).

Challenges of Scale and Efficacy

As noted above, approaches (designed or proposed) for addressing misinformation about science are primarily aimed at the individual level. This emphasis has inadvertently placed the onus of mitigating the problem of misinformation on individuals, despite recognition in the field that systems-level action is needed (Altay et al., 2023; Bak-Coleman et al., 2022; van der Linden et al., 2023). Indeed, literacy-focused approaches have been criticized for framing the problem of misinformation as one of individual vigilance and avoidance (boyd, 2017). Some systems-level approaches (e.g., filling information voids, building and maintaining trust in sources of credible information, governance) are already being employed by various types of organizations; however, such approaches have not been rigorously tested for efficacy and durability. More importantly, interventions to address misinformation have been pursued in parallel tracks across sectors and academic disciplines in ways that do not inform one another and, in some cases, may even push in different directions.

Conclusion 7-5: Efforts to mitigate misinformation have become more prevalent over time, although interventions to specifically address misinformation about science are less prevalent than for other topical domains (e.g., political misinformation). Additionally, efforts to intervene have been largely uncoordinated across actors, sectors, disciplinary domains, and intended outcomes.


While many approaches to addressing misinformation about science have demonstrated efficacy in small-scale, controlled experiments, current understanding of their effectiveness in real-world settings is limited. Additionally, many experiments to understand the efficacy of interventions are rooted in methodological individualism. Therefore, it is challenging for a single study to take into account all of the relevant social and cultural factors that also shape how misinformation about science affects individuals and communities.

Current funding mechanisms (e.g., requests for proposals) also play an important role in shaping misinformation studies. Funding support has yielded a better understanding of impacts and interventions at the individual level to an extent, but not at levels beyond the individual (i.e., networks of individuals, communities, or society as a whole). Importantly, the abundance of evidence at the individual level risks giving the perception that this approach is the most effective way to address misinformation about science (Chater & Loewenstein, 2023; Maani et al., 2022). It is also still unknown whether interventions that are effective at the individual level are useful for countering community- and societal-level impacts of misinformation about science.

Conclusion 8-1: There has been considerable emphasis on studying misinformation about science and potential solutions at the individual level. In contrast, there has been limited emphasis on understanding the phenomenon of misinformation at higher levels and larger scales, which may inadvertently place the onus on the individual to mitigate the problem. There has been limited progress on:

  • Understanding how structural and contextual factors, such as social class, race/ethnicity, culture, geography, social networks, and institutions, influence the origin, spread, and impact of misinformation about science.
  • Understanding how other important factors (i.e., social, political, and technological forces) that shape the information ecosystem interact with misinformation about science to influence decision making and well-being.
  • Understanding the larger impact that systematic disinformation campaigns can have and how to effectively intervene to counter misinformation about science from such sources.
  • Understanding the effectiveness of existing approaches to address misinformation about science, either alone or in combination, with an eye toward better design, selection, and implementation.

Recommendation 12: To strengthen the evidence base on the impacts of misinformation about science across levels and the suite of approaches to mitigate them (e.g., community-based, platform- and platform design-based, policy, and regulatory approaches), funding agencies and funding organizations should direct more investments toward systems-level research. Such investments would increase understanding of the ways that structural and individual factors may interact to influence the spread and impacts of misinformation about science.

Challenges of Obtaining High-Quality, Comprehensive Data

Two main challenges exist for obtaining the optimal data to study misinformation about science, as discussed in Chapter 8. The first is the availability of data about online platforms; data are either impossible or prohibitively expensive to obtain. This creates particular challenges for researchers at lesser-resourced institutions, who may lack sufficient funding to conduct research on platforms. As a potential solution to some of these data challenges, some countries have established mechanisms to facilitate adequate data sharing between online platforms and researchers (e.g., Article 40 of the European Union’s Digital Services Act).1 Overall, data on a wider range of platforms are still needed.

Conclusion 8-2: Some progress has been made on understanding the nature of misinformation on select social media platforms; however, a comprehensive picture across all major platforms is lacking. The ability to detect and study misinformation about science on social media platforms is currently limited by inconsistent rules for data access, cost-prohibitive data restrictions, and privacy concerns.

Recommendation 13: To reduce current barriers to obtaining high-quality, comprehensive data about misinformation about science on social media platforms:

  • Social media companies should make a good faith effort to provide access to data to examine the origins, spread, and potential impacts of misinformation about science free of charge and without any restrictions when used for non-commercial purposes, except for privacy-related data restrictions.
  • Universities and other research institutions should facilitate relationships between their individual researchers and social media companies to obtain more reliable data for studying misinformation about science. This should be accomplished while ensuring the independence of researchers from the companies.

___________________

1 For more information, see https://www.eu-digital-services-act.com/Digital_Services_Act_Article_40.html


A second challenge is a lack of data on the impacts of misinformation about science and the effectiveness of mitigation for underserved groups, a phenomenon recognized as data absenteeism (Lee & Viswanath, 2020; Viswanath et al., 2022c). The limited availability or lack of data on underserved groups has posed a significant hurdle in understanding the ways that spread and impact of misinformation about science may vary across different demographic groups (Southwell et al., 2023). Additionally, this kind of data challenge can manifest across methodological approaches (e.g., surveys, clinical trials, observational studies), and it can exist for a variety of reasons, such as low participant recruitment, mismatches between the general population and the user base of a given context of study, and the exclusion of the experiences of certain groups from consideration.

DIRECTIONS FOR FUTURE RESEARCH

Several gaps in the current understanding of misinformation about science emerged over the course of the committee’s examination of the evidence base. These gaps range from accurately estimating how widespread misinformation about science is to understanding when and how to intervene effectively to mitigate harms at different levels. Despite the challenging nature of conducting research on misinformation about science, researchers are still well positioned to advance the science, and funders to support it. Areas of needed attention include expanding research to address more types of misinformation about science; improving and expanding methodological approaches; and expanding the measurement and evaluation of aggregate exposure, impacts, and effective mitigation of misinformation about science through intervention. Many of these areas of need are overlapping and interconnected.

Expanding the Types of Misinformation Studied

From the outset, we note a critical need to expand the range of topics studied under misinformation about science. Topical coverage is currently narrow, with scholarly attention concentrated on politically divisive issues, leaving other areas significantly understudied (e.g., misinformation about women’s health, and environmental issues beyond climate change). There is also misalignment between the topics or scientific claims that people are most likely to encounter and those that attract scholarly attention. With the ability to track the incorrect scientific claims that people are exposed to on a daily basis, it may also be possible to better identify science information voids. Such a process could also serve as an indicator of alignment between the research topics and questions that are studied and the misinformation about science that people are exposed to and that is prospectively harmful.

Additionally, at the time of this report, scholarship on the relationship between AI and misinformation about science was still in its infancy. Some preliminary research explores ways for AI to be leveraged to address misinformation online through its capabilities to analyze patterns and language use and to detect falsehoods (e.g., Joshi et al., 2023; Ozbay et al., 2020). Other work has been geared toward understanding the role of generative AI in the production and spread of misinformation and disinformation (e.g., Kreps et al., 2022; Zhou et al., 2023). Given that these technologies are anticipated to become more integrated across different sectors of society (e.g., science, medicine, education), the committee sees a critical need for more empirical research on their evolving role.

It is also important to note that because misinformation research often touches upon topics of major societal importance, the study of misinformation about science can, at times, become controversial, with these controversies occasioning reputational risks for researchers. In some cases, these risks can translate to threats and harassment, potentially impinging on the free inquiry necessary to conduct rigorous scientific research on misinformation about science. The strengthening of institutional structures that continue to support the scholarship on this topic is therefore crucial.

Measuring Aggregate Exposure to Misinformation

Although there is some empirical evidence on how prevalent misinformation about science is within a given medium/channel (e.g., a social media platform), not much is known about aggregate exposure to misinformation over time and across media/channels. An outsized proportion of the current evidence base reflects studies of X (formerly Twitter), primarily a result of the relatively broad access to Twitter data until early 2023. However, almost none of the research on X directly measured exposure, though some used indirect measures (e.g., by looking at who users followed on X; Grinberg et al., 2019). Here again, we note the difference between the abundance of misinformation within a given medium/channel and the degree of aggregate exposure to misinformation in a given population and on a given topic. It is quite possible for there to be large amounts of misinformation on a platform—as measured by unique pieces of information or by the number of times pieces of misinformation are shared—that is viewed by few people. This could occur either because the sharers of misinformation are substantially isolated from the more general population (e.g., see Grinberg et al., 2019) or because of content moderation policies that reduce the spread of misinformation on a platform (e.g., Vincent et al., 2022). The committee notes the importance of distinguishing between the effects of misinformation when a large number of people are occasionally exposed to misinformation, as compared to the effects when a small number of people are frequently exposed. The literature suggests that these are the two modal scenarios for misinformation exposure, and they may produce very different pathways to adverse effects. Both scenarios merit further research.
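The prevalence/exposure distinction above is quantitative and can be made concrete with a small sketch. The following toy calculation uses entirely hypothetical data (the item names, counts, and event log are invented for illustration, not drawn from the report or any platform): misinformation can dominate a platform's share counts while reaching only a small, heavily exposed subset of users.

```python
# Toy illustration with hypothetical data: prevalence of misinformation on a
# platform (how much is posted/shared) can diverge sharply from aggregate
# exposure (how much is actually seen, and by whom).
from collections import Counter

# Hypothetical view log: (user_id, item_id, is_misinfo) events.
views = [
    ("u1", "post_a", True), ("u1", "post_a", True), ("u1", "post_b", True),
    ("u2", "post_c", False), ("u3", "post_c", False), ("u3", "post_d", False),
]

# Share counts (a prevalence proxy): misinformation dominates shares here...
shares = Counter({"post_a": 900, "post_b": 850, "post_c": 40, "post_d": 10})
misinfo_items = {"post_a", "post_b"}
misinfo_share_fraction = (
    sum(v for k, v in shares.items() if k in misinfo_items) / sum(shares.values())
)

# ...while exposure is concentrated: count misinformation views per user.
misinfo_views_per_user = Counter(u for (u, _, m) in views if m)
exposed_users = len(misinfo_views_per_user)
total_users = len({u for (u, _, _) in views})

print(f"misinfo share of all shares: {misinfo_share_fraction:.0%}")   # 97%
print(f"users exposed to misinfo: {exposed_users}/{total_users}")     # 1/3
```

In this invented scenario, 97 percent of shares are misinformation, yet only one of three users ever sees it (and that user sees it repeatedly), illustrating the "small number, frequently exposed" pathway described above.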

More generally, there is very little research on real-world exposure to misinformation, at both the individual and aggregate levels—far too little, in the committee’s opinion, given that the harms of misinformation will generally be mediated by exposure. Most of what is known about the effects of exposure to misinformation is based on data drawn from experimental and lab settings, which substantially limits generalizability. Additionally, while many studies have documented the prevalence of misinformation on the internet, likely because of the ease of measurement and data collection in this context, far fewer capture exposure (Lazer, 2020). This is likely because the measurement-related affordances of internet platforms have made it far easier to measure what is posted than what is seen. There have been a few important exceptions, such as browsing-based studies that focus on what domains people have seen (Allcott & Gentzkow, 2017; Guess et al., 2018b). Furthermore, some platform-provided data have included exposure and related measures; for example, Meta and Facebook have provided data on exposure and engagement (e.g., Social Science One [Buntain et al., 2023a; King & Persily, 2020] and CrowdTangle [Edelson et al., 2021]). We note that CrowdTangle is now defunct and that Social Science One is essentially moribund, not having been updated in years. A study conducted by Allen et al. (2020) is perhaps the most comprehensive effort to evaluate exposure to civic information and misinformation, using Comscore and Nielsen data from web browsing, television, and mobile use. Yet even this impressive effort omits radio, interpersonal communication, intra-platform content (e.g., social media posts), and messaging apps, let alone news alerts on phones, information from voice assistants such as Alexa, podcasts, and many other sources. Determining the scope of misinformation about science in these additional settings is an important area of need.

Finally, more data are needed from community and social contexts. To the limited extent that there has been research on exposure, very little of that research in turn contextualizes that individual exposure within communities; for example, how do norms, local culture, existing local informational resources, and social capital moderate the downstream effects of misinformation on beliefs and behavior? That is, the committee identified a need for more research on how individuals and groups in specific community contexts are exposed to specific types of misinformation about science and with what effects.


Validating Impacts of Misinformation About Science

Establishing a direct causal link between misinformation about science and a given outcome (i.e., individual-, community-, and societal-level harmful behaviors and/or actions) has been a challenging task, in large part because most consequential real-world outcomes are influenced by many factors outside of exposure to misinformation. Although it is assumed by many that misinformation has widespread and negative effects on individuals, groups, and society, such direct effects have not been well-documented or consistently demonstrated empirically. More research to substantiate causal impacts is a critical need, and equally critical is understanding how the impacts of misinformation about science may be different across social class, race, ethnicity, cultural ideologies, and geography, among other factors. There is also a need to determine how historical and contemporary discrimination, systemic racism, and social determinants of health may compound the impacts of misinformation about science on disparate communities. Understanding the role of such factors is essential to more accurately identify and document community-level impacts of misinformation about science and determine the most important and relevant outcomes for a given science topic, community type, or context. Moreover, as more types of misinformation about science are studied, it will be possible to identify and validate a wider range of impacts.

There are also aspects of the nature of misinformation about science itself that are important to understanding impacts, including how the form of the content might make it more or less impactful (e.g., a message delivered as a comment versus a video, an image, on TV, or via the radio), recognizing that a message interacts with the recipient, the technology, and the social context to determine impact. Establishing criteria for harm (from least harmful to most harmful) is another fundamental need. Such agreement around harms would help in determining the circumstances under which misinformation about science leads to harmful behaviors, as well as in documenting the accumulated harms (over months to years) of misinformation about science on individuals, communities, and society.

Bolstering the Efficacy of Interventions Against Misinformation About Science

A great deal of attention has been directed toward identifying effective solutions to address misinformation about science, with a focus on mitigating the supply, demand, distribution, and/or uptake of misinformation. As noted earlier, a number of these approaches have demonstrated some efficacy in mitigating negative impacts of misinformation about science, but for a given class of interventions, the empirical evidence is uneven across models. Current understanding of effective misinformation interventions is limited by multiple factors, including a lack of data access, the common use of artificial laboratory-based tasks, the disconnect between single-shot interventions and the persistent influence of misinformation, and the fact that many organizations currently intervening to combat misinformation about science lack the time and resources to evaluate their efforts. Additionally, there is not a robust evidence base for interventions designed and implemented beyond the individual level or across multiple levels (e.g., institutional, policy, or combined approaches). More broadly, a theoretical account of why particular interventions are effective is needed to determine their potential for generalizability across contexts and levels of analysis.

Beyond establishing the efficacy of existing approaches, the committee sees a more near-term need to specifically design, deploy, and evaluate interventions tailored to the populations and communities that may benefit most: older adults, who are more likely to share misinformation; those who are not reached by common uptake-based interventions (e.g., fact-checks); and marginalized and underserved communities that lack access to high-quality information. The committee acknowledges that many interventions face implementation challenges, including limited access to data for evaluating effectiveness and for identifying and addressing information voids; design features of platforms that may circumvent policy-related approaches to limit the spread of misinformation online; and limited scalability, durability, and reach with respect to approaches that seek to reduce uptake. Nevertheless, there are still important areas of need for bolstering existing efforts.

For some supply-based interventions, more research is needed to specifically understand and bolster the efficacy of removing and demoting content in search engine results or on social media platforms. In addition, there is a need to evaluate the specific approaches to reducing demand for misinformation about science carried out by community-based organizations (CBOs) to determine which are most effective across different populations (e.g., hardly reached communities, groups that are hesitant to participate in scientist-led studies). Some distribution-based interventions leverage the distributed structures of online platforms (e.g., crowdsourced fact-checking), but whether such approaches can be adopted across all platforms, especially those containing closed or private groups, remains an open area for inquiry. Additionally, interventions of this kind that pause and flag viral content for review before it can spread more widely have shown efficacy in environments where amplification is the main driver of exposure to misinformation, but it is unknown whether this approach would work in environments where the primary audience is primed to be receptive to misinformation (e.g., misinformation-focused groups on social media). Demonstrated efficacy of some uptake-based interventions is also limited, for example, for source labels that indicate the degree of credibility of a particular source and for warnings about common themes and narratives typically associated with misinformation about science. More generally, the best way(s) to implement these kinds of interventions remains to be determined, given that many people do not regularly attend to such signals (e.g., Fazio et al., 2023).
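The "pause and flag" logic described above can be illustrated with a minimal sketch. This is a hypothetical toy, not any platform's actual system: the class name, the share-velocity threshold, and the method are all invented for illustration of the general idea that content spreading faster than some rate is held for human review before further amplification.

```python
# Toy sketch (hypothetical, illustrative only) of a "pause and flag"
# intervention: items whose share velocity exceeds a threshold are held
# in a review queue instead of being amplified further.
from dataclasses import dataclass, field

@dataclass
class ViralityFlagger:
    velocity_threshold: float = 100.0       # shares/hour that triggers review (invented value)
    review_queue: list = field(default_factory=list)

    def on_share_burst(self, item_id: str, shares: int, hours: float) -> str:
        velocity = shares / hours
        if velocity > self.velocity_threshold:
            self.review_queue.append(item_id)  # pause amplification, queue for review
            return "held_for_review"
        return "distributed"                    # normal propagation

flagger = ViralityFlagger()
print(flagger.on_share_burst("claim_x", shares=500, hours=1.0))  # held_for_review
print(flagger.on_share_burst("claim_y", shares=30, hours=2.0))   # distributed
```

As the chapter notes, such a velocity-based gate presumes that amplification drives exposure; in a community already seeking out misinformation, content may never cross a virality threshold yet still reach its intended audience.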

Improving and Expanding Methods for Studying Misinformation About Science

To address the areas of need described above, the field must overcome current methodological limitations. As mentioned above, to date there has been an overreliance on public social media data, particularly from X (formerly Twitter), in large part because of data availability; however, while digital media account for a substantial share of contemporary communication, other important media domains, such as television, radio, podcasts, and private messaging apps, are understudied. Even less is known about the nature of misinformation about science in offline communication contexts, and the empirical record likely under-indexes modes of communication that are ephemeral or cannot easily be converted to a digital format. The most obvious example is interpersonal communication, and the committee found little published evidence on how much misinformation about science people say or hear daily from others. Beyond speech, there are few rigorous methods for analyzing misinformation about science in transient media such as advertisements, billboards, bumper stickers, leaflets, and direct mail, or in non-textual media (e.g., videos, photographs).

There is also a need for more mixed-methods studies (e.g., studies that combine quantitative and qualitative methods) and interdisciplinary research to gain a more comprehensive understanding of misinformation about science. Many community organizations are already engaged in research efforts that support community-based knowledge production. To this end, formal partnerships can be created between researchers, community members, and CBOs to both inform and leverage such data to better understand the ways that misinformation about science moves across platforms, borders, languages, and cultures, and differentially impacts specific communities. Additionally, researchers can work more effectively across silos to establish interdisciplinary collaborations that enable better integration of different theoretical frameworks and methodological approaches into research on misinformation about science, and a greater understanding across science topics. For example, more systematic research informed by insights from psychology (e.g., dual process theory, theory of planned behavior) and public policy (e.g., the Advocacy Coalition Framework) could yield a clearer picture of the impacts of misinformation about environmental issues beyond the individual level. Overall, there is a need to better incorporate survey-based studies, interviews, ethnography (online and offline), focus groups, content analysis, discourse analysis, quasi-experiments, case studies, and longitudinal studies alongside the more widely employed randomized controlled trials, meta-analyses, and computational social science methods. Specifically, the committee sees the following types of studies as starting points:

  • Field studies (offline and online) that can substantiate lab-setting findings about the causes, consequences, and effective solutions for misinformation about science.
  • New models for studying the dynamics of misinformation in nonsocial media and non-textual media contexts (e.g., TV, radio, podcasts, videos, photographs).
  • Mixed-methods studies to better accommodate populations that are not adequately represented in typical datasets (e.g., hardly reached communities and groups who are hesitant to participate in scientific studies), including community-based participatory research and studies co-designed with communities.
  • Studies that employ complex experimental designs and larger, more diverse samples to study the interactions between misinformation about science and the social, political, and technological factors within the contemporary information ecosystem, and how these interactions shape people’s relationship to information and impact personal and policy decision making.

The current state of the science on misinformation about science beyond the individual level underscores the need for more qualitative insights into the social and contextual drivers of the negative impacts experienced at larger scales. Such insights will also be essential for the design and/or selection of appropriate interventions. Qualitative work may also be a way to overcome some of the current challenges to studying misinformation outlined in Chapter 8 of this report. Indeed, as access to technology platforms’ data becomes increasingly restrictive, merging qualitative and quantitative insights in research on algorithmic systems needs much more investment and expansion. The committee believes that this larger suite of methodologies can support efforts to strengthen causal inferences, especially about contexts that may be harder to study using only quantitative methods, such as closed/private interpersonal spaces. Overall, the methodological needs called for in this research agenda reflect a shift from the status quo in the field and will require sustained, adequate funding to realize.


FINAL THOUGHTS

This report provides key conclusions drawn from disparate lines of evidence on the nature of misinformation about science; a conceptual understanding of the influences and mechanisms behind the origins, spread, and impacts of misinformation about science at multiple levels; and actionable recommendations informed by perspectives and expertise from industry, academia, policy, and practice. In mapping the landscape, it became apparent that isolated, individual actions will be insufficient to make progress in this space, given the confluence of forces known to shape the dynamics of misinformation about science. Lastly, the current state of knowledge about the scope and severity of misinformation about science is rapidly evolving but still limited in many domains; the committee has identified and prioritized gaps in understanding to better orient future research in the field.

Next Chapter: References