Disrupting the Impacts of Implicit Bias
Feature Story
By Sara Frueh
Last update May 28, 2021
Research continues to reveal more about the nature and impacts of implicit bias — something hidden from our direct awareness that causes tangible damage in many spheres of life, even for the very young.
A recent study by Stanford University social psychologist Jennifer Eberhardt, for example, revealed differences in how schoolchildren are disciplined based upon race. She and her colleagues found that if a Black student misbehaves and then, a few days later, a different Black student misbehaves, teachers respond to the second child as if he had misbehaved twice. The pattern did not hold for white students.
“It’s as though the sins of one Black child get piled onto the other,” said Eberhardt. “One Black child can stand in for another, yet white children are treated individually.”
Eberhardt discussed her research at a workshop held this spring by the National Academies’ Committee on Science, Technology, and Law. The workshop engaged researchers and legal scholars in an exploration of implicit bias — unconscious favoritism toward or prejudice against people of a certain race, gender, or group that influences one’s own actions or perceptions — that considered its origins, its consequences, and ways to interrupt its harmful impacts.
Bias’s damaging effects
Like Eberhardt, author and social justice scholar Monique Morris has examined how implicit bias affects the school experience of Black children, and specifically Black girls. Morris described how Black girls experience age compression or “adultification” — being perceived as more adult than their actual ages warrant. This leads adults to view them as less in need of nurturing, protection, and comfort — and also to impose more severe punishments for behaviors seen as problematic. “They are receiving unnecessarily harsh treatment for offenses and actions that are treated as repairable by other groups of girls,” said Morris.
Health care is another arena where biases have long led to inequities, explained Marcella Nunez-Smith, co-chair of the White House COVID-19 Task Force, who also spoke at the workshop. For example, research shows that Black patients are systematically undertreated for pain relative to white patients. “There is implicit bias in the system, and it’s led to earned distrust,” she said.
Former U.S. Attorney General Eric Holder described the impacts of implicit bias in criminal justice, noting a study that found that Black defendants in the federal criminal justice system typically received sentences that were 20% longer than those of white defendants, after controlling for other factors. “Every step of the way there is too much room for implicit — and sometimes explicit — bias to wreak havoc — from the police stops of Black and Brown people that can escalate into dangerous situations, to the discretion prosecutors have in charging decisions, to the views that jurors might hold when deciding on a case, to the jury selection process itself, to the sentencing judge that decides the fate of those convicted of crimes,” said Holder. “This has been painfully clear to me throughout my career.”
Voting, too, is affected by implicit bias, though it has received little attention so far, said Holder. A survey following the 2008 elections found that Black and Hispanic voters were asked for identification more frequently than white voters, for example. And a 2012 study showed that California state legislators were less likely to respond to emails asking about voter identification requirements sent by a fictional Latino person than by a fictional white person.
‘Biases are being caught, not taught’
Where do implicit biases come from? Developmental psychologist Andrew Meltzoff has studied babies and children to try to find an answer. “Newborns, when they come into the world, are not starting off with prejudices and biases, but by the time they are five to six years of age — before they step foot in first grade — they’re absolutely dripping with them.”
“Some of our experiments and those of others are showing that these biases are being caught, not taught,” he said. Children’s brains are highly attuned to observational social learning, and they watch and imitate adults’ behavior.
In one study, for example, 4- and 5-year-old children watched a video in which an adult offered a friendly greeting to a person wearing a black shirt, and then gave a cold greeting to an adult in a red shirt. When asked which of the two adults they liked more or would share toys with, the children showed a preference for the adult wearing the black shirt. What’s more, Meltzoff said, they showed a preference not just for that individual, but also for people in general who wore black shirts. “The children, from a very thin slice of behavior, had their reactions deeply influenced,” he said. “They generalize, not just from the specific target, but to a whole class of people who look like them.”
The take-away for parents is that our children are watching us, he said. “Children are social pattern detectors … They are parsing what they see — their attention is drawn to it, they’re trying to give meaning to it, and they remember it.”
The bias of crowds and contexts
In a different sense, biases may be “caught” even by adults. Our level of bias — and whether we act on it — may depend on the context we’re in, explained researcher and psychologist Keith Payne.
Although bias is often seen as a stable characteristic people carry around with them, individual scores on implicit bias tests, such as the Implicit Association Test, may change from week to week, said Payne. But at the group level — averaged across the population at a city or state level — the level of bias is stable over time. Studies show that the same states that were most biased in 2007 were also the most biased in 2016.
These “aggregate” biases are linked to actual behavior: In cities where there are indications of a stronger implicit association on average between Black people and guns, for example, there is an actual disparity in police use of deadly force, he said.
“If individuals aren’t stable across two weeks, why is it that states are stable across a decade?” asked Payne. “It suggests that there’s something in those contexts that is stable and long-lasting. To answer that, we need to ask not only about the individuals’ learning histories but also about the systems and structures and historical forces that make some places more biased than others.”
Payne’s research found that states and counties that relied heavily on slavery in 1860 have higher levels of implicit bias today. And these areas have greater structural inequalities — larger racial disparities in poverty and upward mobility, and greater segregation in housing — that perpetuate people’s racial biases, he explained.
In light of this, interventions should focus not on changing individuals, but on changing institutional decision-making processes that tend to reproduce inequalities, said Payne. “We should focus on interventions that try to change situations rather than people.”
His emphasis on institutions rather than individuals echoed remarks from workshop co-chair Camara Jones from the Morehouse School of Medicine, who said that implicit bias must be examined in the larger context of structural racism. Psychologist and workshop planning committee member Stacey Sinclair reflected on how structural racism and implicit bias reinforce each other. “Historical, systemic, structural racial hierarchies create the biases, and then the biases … allow us to perpetuate all of those structural things,” she said. “That keeps our society locked in this racial hierarchy.”
Adding guardrails to decisions
How can we interrupt that cycle and keep implicit bias from perpetuating inequalities? Researcher Calvin Lai suggested that the best approach is not to try to extinguish individual biases — something his research has shown to be difficult, if not impossible — but rather to keep those biases from shaping outcomes.
“While we may not be able to combat these biases at the source, we can mitigate their impact by adding guardrails to how we make our decisions, so that we are less likely to act on our bias,” said Lai. For example, employers can write out and pre-commit to their hiring criteria before looking at individual candidates for a position — “structuring the architecture of our decisions so that we’re less likely to discriminate.”
Adding “friction” to slow down decision-making is another important way to dampen biases, according to Stanford’s Eberhardt. As an example, she described an intervention that prompted police officers, before traffic stops, to consider whether they were relying on prior intelligence in pulling over the specific driver and to list the source of that intelligence, rather than relying on intuition.
The workshop’s closing keynote speaker, Bryan Stevenson, founder of the Equal Justice Initiative, stressed that, to truly understand implicit bias and how to reduce its impacts, researchers and other scholars must get close to those who have been excluded because of biases.
“Until you have studied the multiple ways in which people can be disfavored, until you’ve been in proximity with people who have been marginalized and excluded, you will not fully appreciate the multiple ways that bias manifests itself,” said Stevenson. “We cannot stay just in academic settings, we cannot just stay in the courthouse, we cannot just stay in the places where there isn’t the kind of proximity to exclusion and poverty that I believe we have to understand effectively if we’re going to make a difference.”