I really like being right. Chances are you do too, because we humans are psychologically inclined to seek out evidence that suggests we’re right. We tend to interpret neutral information in our favor and contrary information as flawed. These related tendencies are often referred to as confirmation bias.
Confirmation bias is inevitable, and it colors how every one of us sees the world. If I’ve just received an email from a student asking me to bump their grade up so they can get into med school, I might start grumbling to myself about my lazy students. Then as I’m ruminating on lazy students, I might interpret the next student’s well-intentioned question as a manipulative attempt to score higher on the exam. In other words, I may interpret this second interaction as confirming the feeling the first one stirred up, that my students are lazy, even if the second student wasn’t lazy or manipulative at all.
Confirmation bias can, in part, explain why there are still way too many parents who don’t have their children vaccinated. Once they believe that vaccines might be harmful to their children, they seek evidence to confirm that belief — for example, clinging to the very small percentage of people who do have adverse reactions to vaccines. Even an overwhelming amount of data demonstrating the benefits of vaccines, and the fact that the vaccines-and-autism rumors started from completely fraudulent “science,” will not persuade them. They’ve chosen which evidence to believe and which to discard, even if they don’t see it as a conscious choice.
An audience’s confirmation bias can be extremely frustrating for science communicators. It can make communication attempts feel futile, since some audience members will already have their minds made up and will interpret new information through the lens of their current beliefs.
But successful science communication is not just a process of information transmission. The idea that the public simply hasn’t received enough science information, and that once they do, they’ll hold more pro-science beliefs and make more pro-science decisions, is incredibly misguided. Confirmation bias illustrates why heaping information on people will not change their minds if they hold contrary beliefs they seek to confirm. I’ve written about this before, and so have many other great writers.
For science communicators: Your audience is going to have cognitive biases. It’s important not to let your awareness of their biases color how you think of the people you’re communicating with. In fact, if you start to characterize your audience as stubborn or irrational because their biases act as obstacles to accepting the science you want to share, you are falling prey to yet another cognitive bias — the fundamental attribution error, also known as correspondence bias. This bias plays out when we attribute someone else’s behavior to their personality (for example, “they’re not understanding my science because they’re irrational”) more readily than we would attribute our own behavior to our personality.
Remember, you, too, have cognitive biases. Although at times those biases might drive you to stubborn or irrational conclusions, you probably don’t think of yourself as a stubborn or irrational person. Instead, you might recognize that your unique background and current circumstances have led you to make biased decisions. Acting stubbornly in a certain context does not make you a stubborn person. We must remember this is true even when we’re communicating with seemingly stubborn people.
So when you’re communicating, recognize that your audience has cognitive biases — this part, I think, is not too hard. What’s more difficult is to also recognize that you have cognitive biases. No real communication can happen until you do this — until you acknowledge that your audience is composed of human beings, all of whom carry wonderfully complex cognitive baggage, just like you.
We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet into the air and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. On too many important issues, science has reached a consensus that the public has not.
A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model.”
But in reality, just knowing facts doesn’t necessarily guarantee that one’s opinions and behaviors will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Or they read an online article by a scientist about the necessity of vaccines, but leave comments expressing outrage that doctors are trying to further a pro-vaccine agenda. Convincing people that scientific evidence has merit and should guide behavior may be the greatest science communication challenge, particularly in our “post-truth” era.
Luckily, we know a lot about human psychology – how people perceive, reason and learn about the world – and many lessons from psychology can be applied to science communication endeavors.
Consider human nature
Regardless of your religious affiliation, imagine that you’ve always learned that God created human beings just as we are today. Your parents, teachers and books all told you so. You’ve also noticed throughout your life that science is pretty useful – you especially love heating up a frozen dinner in the microwave while browsing Snapchat on your iPhone.
One day you read that scientists have evidence for human evolution. You feel uncomfortable: Were your parents, teachers and books wrong about where people originally came from? Are these scientists wrong? You experience cognitive dissonance – the uneasiness that results from entertaining two conflicting ideas.
One way we subconsciously avoid cognitive dissonance is through confirmation bias – a tendency to seek information that confirms what we already believe and discard information that doesn’t.
This human tendency was first demonstrated by psychologist Peter Wason in the 1960s, in a simple logic experiment. He found that people tend to seek confirmatory information and avoid information that would potentially disprove their beliefs.
The concept of confirmation bias scales up to larger issues, too. For example, psychologists John Cook and Stephan Lewandowsky asked people about their beliefs concerning global warming and then gave them information stating that 97 percent of scientists agree that human activity causes climate change. The researchers measured whether the information about the scientific consensus influenced people’s beliefs about global warming.
Those who initially opposed the idea of human-caused global warming became even less accepting after reading about the scientific consensus. People who had already believed that human actions cause global warming supported their position even more strongly after learning about the consensus. Presenting the same factual information thus ended up further polarizing participants’ views, strengthening everyone’s resolve in their initial positions. It was a case of confirmation bias at work: New information consistent with prior beliefs strengthened those beliefs; new information that conflicted with existing beliefs led people to discredit the message as a way to hold on to their original position.
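To see how this dynamic can play out, consider a minimal, purely illustrative Python sketch of biased assimilation. It is not the researchers’ model; the update rule and all numbers are invented for illustration.

```python
# Toy model of biased assimilation (illustrative only, not the study's model).
# Beliefs live on a 0-1 scale, where 1 means fully convinced that human
# activity causes global warming.

def update_belief(prior, message_strength, trust_threshold=0.5):
    """If the prior is above the threshold, the message is accepted and pulls
    belief upward; otherwise it is discredited and belief drifts downward."""
    if prior >= trust_threshold:
        return min(1.0, prior + message_strength * (1.0 - prior))
    return max(0.0, prior - message_strength * prior)

believer, skeptic = 0.8, 0.2
consensus_message = 0.3  # strength of the "97 percent of scientists" message

print(update_belief(believer, consensus_message))  # 0.86: even more convinced
print(update_belief(skeptic, consensus_message))   # 0.14: even less convinced
```

The point of the sketch is only this: when acceptance of a message depends on prior belief, the identical message can widen the gap between two readers rather than close it, which is the polarization pattern the study describes.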
Overcoming cognitive biases
How can science communicators share their messages in a way that leads people to change their beliefs and actions about important science issues, given our natural cognitive biases?
The first step is to acknowledge that every audience has preexisting beliefs about the world. Expect those beliefs to color the way they receive your message. Anticipate that people will accept information that is consistent with their prior beliefs and discredit information that is not.
Then, focus on framing. No message can contain all the information available on a topic, so any communication will emphasize some aspects while downplaying others. While it’s unhelpful to cherry-pick and present only evidence in your favor – which can backfire anyway – it is helpful to focus on what an audience cares about.
For example, researchers at the University of California point out that the idea of climate change causing rising sea levels may not alarm an inland farmer dealing with drought as much as it does someone living on the coast. Referring to the impact our actions today may have on our grandchildren might be more compelling to those who actually have grandchildren than to those who don’t. By anticipating what an audience believes and what’s important to them, communicators can choose more effective frames for their messages – focusing on the most compelling aspects of the issue for their audience and presenting the information in a way the audience can identify with.
In addition to the ideas expressed in a frame, the specific words used matter. Psychologists Amos Tversky and Daniel Kahneman first showed that when numerical information is presented in different ways, people think about it differently. Here’s an example from their 1981 study:
Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. If Program B is adopted, there is ⅓ probability that 600 people will be saved, and ⅔ probability that no people will be saved.
Both programs have an expected value of 200 lives saved, yet 72 percent of participants chose Program A: When outcomes are framed as gains, people tend to prefer the sure thing over the gamble. We reason about mathematically equivalent options differently when they’re framed differently, and our intuitions are often not consistent with probabilities and other math concepts.
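To make the equivalence explicit, here is the expected-value arithmetic behind that claim (a standard calculation, not drawn from the study’s materials):

$$E[\text{Program A}] = 200, \qquad E[\text{Program B}] = \tfrac{1}{3}(600) + \tfrac{2}{3}(0) = 200.$$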
Metaphors can also act as linguistic frames. Psychologists Paul Thibodeau and Lera Boroditsky found that people who read that crime is a beast proposed different solutions than those who read that crime is a virus – even if they had no memory of reading the metaphor. The metaphors guided people’s reasoning, encouraging them to transfer solutions they’d propose for real beasts (cage them) or viruses (find the source) to dealing with crime (harsher law enforcement or more social programs).
The words we use to package our ideas can drastically influence how people think about those ideas.
We have a lot to learn. Quantitative research on the efficacy of science communication strategies is in its infancy but becoming an increasing priority. As we continue to untangle more about what works and why, it’s important for science communicators to be conscious of the biases they and their audiences bring to their exchanges and the frames they select to share their messages.