TRUTH, TRUST, AND HOPE
“This is not our first rodeo with misinformation,” commented moderator Kelly Stoetzel. “History can help us understand what circumstances give rise to bouts of misinformation, where the pressure points lie, and how we address the problem.” Presenters who explored the historical and cultural context of misinformation and disinformation included Rachel Kuo (University of Illinois Urbana–Champaign) and Nat Kendall-Taylor (FrameWorks Institute).
The problems related to inaccurate or wrong information are not new but draw on histories of inequality, Kuo stated. Everyone has received such information, often from a family member or friend. She pointed to a connection between information, belief, and action—for example, in deciding whether to get vaccinated or for whom to vote. Beyond concerns over the personal effect of mis- and disinformation, there has been widespread concern over the transnational ramifications of what happens with information and potential harms, she added.
Mis- and disinformation are linked with power and politics, Kuo said. “Often what we discern as what is true or false is also about applying our political lines, commitments, and values,” she commented; further, recent public concerns about mis- and disinformation “have been tied to a resurgence of ethnonationalism and white supremacy” backed by longer histories. Knowledge production serves particular interests, she asserted. As an example, the COVID-19 vaccine should be considered a disease prevention and harm reduction tool, but instead it has polarized people and revealed fractures in how they think about health and safety. Different histories and lived experiences with public health in some communities can intensify intergenerational distrust of institutions and the state. To counter this, rather than place the responsibility on individuals to get “better information,” she urged participants to think about and analyze intersecting structural harms. “Information has always been bound up in social, economic, and political systems and structures. And so, our information problems today are not just technological ones,” she said.
“Information has always been bound up in social, economic, and political systems and structures. And so, our information problems today are not just technological ones.” —Rachel Kuo
U.S.-based, for-profit tech platforms have become police, judge, and jury of online violence, she continued. She worries that without a nuanced analysis of power and context, they can further marginalize and harm people who are already oppressed. For example, grassroots activists have been removed from social media platforms as “terrorists” when they critique power hierarchies and state violence, such as in the case of the Sikh farmers’ protests in India. What is at stake for the future of democracy, she said, is reckoning with the ways the current system is underwritten by racial and colonial violence. As a solution, she pointed to community groups who have built infrastructures, projects, and networks such as community housing projects, mobile medical clinics, and in-language information about resources and civic processes. They need to be resourced, she said.
As a psychological anthropologist, Kendall-Taylor said, he has spent the last 20 years looking at how culture affects the way people think. In December 2021, the Pew Research Center reported, 22 percent of Americans said they had not much or no confidence in science.1 Trust in science matters, he continued, both for moving society forward and because trust, or its absence, affects funding, influence, and impact. Where trust in science and science literacy are low, susceptibility to misinformation is high, he added.
Most efforts to increase trust in science are educational, but he suggested other routes and tools that are rooted in cultural mindsets. Cultural mindsets are deep, implicit assumptions and understandings that shape people’s thinking below the level of conscious thought. Everyone has multiple mindsets used to make sense of a given social issue. Some are productive and open conversation, allowing people to consider new ideas and solutions. Others are counterproductive in that they close down thinking, derail discussions, and keep people from engaging with new ideas. Leveraging cultural mindsets is a large but untapped strategy for navigating disinformation and building trust in science, he stressed.
___________________
1 For more information, see https://www.pewresearch.org/science/2022/02/15/americans-trust-in-scientists-other-groups-declines. While the survey still shows that scientists are the most trusted group in society, what is notable is the decline in trust from previous years across the board for all groups.
Some cultural mindsets undermine trust in science. For example, some hold the belief that science is capricious. Science becomes something that cannot be trusted because of the rapid pace at which it “flips and flops and doubles back on itself.” Another example is the mindset that science has a hidden agenda, and science and scientists are trying to “pull one over” on the public to manipulate data and rig the system. Conversely, mindsets can build trust in science, such as the belief that science is awesome and unlocks mysteries to understand how the world works. Another mindset holds that science is a valuable tool to solve social problems, serve as an engine of innovation, and move society forward.
Faced with these two types of mindsets, Kendall-Taylor discussed the science of “framing”: that is, how and what people say can selectively activate and foreground certain ways of thinking. This thinking becomes the dominant lens through which people see the world while simultaneously backgrounding other ways of thinking. “What this does is it gives us a set of very concrete and applicable practices that we should avoid doing because of their power in activating those unproductive mindsets,” he said. For example, he recommended avoiding certain commonly used terms: phrases such as “science says,” “the science is undeniable,” or “a preponderance of evidence suggests” can trigger those with the mindset that science has a hidden agenda. To reach those who do not trust science because it seems capricious and fickle, he recommended avoiding unnecessary contradictions and hedges when speaking to the public (as opposed to communicating with other scientists, where qualifying is essential). He urged: “Be clear about what we’re finding, why it matters, and what remains to be learned.”
As a set of practices to advance, he suggested leaning into examples where science has solved problems, providing straightforward explanations, and leading with broadly resonant and generally shared principles, known as values framing (see Figure 2-1). Harnessing the power of cultural mindsets to build trust in science can decrease susceptibility to misinformation. As the 20th century journalist and commentator Walter Lippmann wrote in his 1922 book Public Opinion, “The way in which the world is imagined determines at any particular moment what people will do.”2
___________________
2 Public Opinion is available at https://wwnorton.com/college/history/america9/brief/docs/WLippmann-Public_Opinion-1922.pdf.