Hurdles to Communicating Science & Strategies to Overcome Them

Communicating science is hard in part because doing and understanding science is hard, but science communicators also face some unique hurdles, especially when the information is relevant to policy. James Druckman recently described several challenges that confront people communicating policy-relevant science, along with ways to minimize them.

Value-laden diversity

We all have values, relatively unchanging beliefs that reflect the way we see the world. For example, some people are more “individualist,” while others are more “communitarian.” If scientific information seems to contradict a value, we’ll be hesitant to accept that information. Information about climate change, especially if it contains or implies suggestions for reducing the problem by increasing regulations on businesses, might contradict an individualist’s values, making it hard for that person to even consider the scientific information. For more on this hurdle, see Dan Kahan’s work on the science of science communication, an earlier post on this blog about Kahan’s work, or a great post by Chris Mooney on Mother Jones.

What to do about it

First, communicators have to recognize that their audience will have a diverse set of values, some of which will conflict with the communicator’s values, and that these values will influence the way people receive scientific information.

Next, communicators should minimize the extent to which their message carries value commentary. In other words, they should present the science that bears on a policy decision without pronouncing which decisions are “good” or “competent.”

Motivated Reasoning

Motivated reasoning (or confirmation bias) is our drive to seek information that reinforces our prior beliefs and to disregard information that does not. For example, in work by Druckman & Bolsen (2011), participants initially indicated their support for or opposition to genetically modified (GM) foods. After 10 days, all participants received three types of information: positive information about how GM foods combat diseases, negative information about their possible long-term health consequences, and neutral information about their economic consequences.

People who initially supported GM foods dismissed the negative information, rated the positive information as valid, and perceived the neutral information as indicating benefits of GM foods. People who initially opposed GM foods did the exact opposite: they dismissed the positive information, rated the negative information as valid, and interpreted the neutral information as indicating drawbacks of GM foods. Work on motivated reasoning shows that we interpret information through a lens laden with our prior beliefs.

There have been lots of great articles highlighting motivated reasoning lately. These include Why Facts Don’t Change Our Minds, This Article Won’t Change Your Mind, and Why You Think You’re Right, Even When You’re Wrong.

What to do about it

When motivated reasoning occurs, people are motivated to understand information in a way that aligns with their previous beliefs. Instead, science communicators want to motivate their audience to understand new information in a way that will lead to maximum accuracy. There are a few things communicators can do to encourage people to seek accurate understandings:

  • Show that the issue and information matter for the individual’s life. Show relevance.
  • Present information from a variety of sources, preferably ones with different goals (e.g., both Democratic and Republican sources).
  • Encourage people to explain their position to others (or at least to prepare to explain it). Elaborating on a position requires people to think it through more carefully and to provide explicit evidence for their claims that goes beyond “because I want to believe this.”

Politicization

This term does not mean what we might expect given its name. Politicization is exploiting “the inevitable uncertainties about aspects of science to cast doubt on the science overall…thereby magnifying doubts in the public mind” (Steketee 2010, p. 2). It’s not exactly misinformation, since it doesn’t introduce false findings; instead, it magnifies doubt. It’s especially common in debates about global warming and vaccination. People who politicize these issues send the message that the scientific evidence is not as conclusive as it’s been made out to be.

What to do about it

Politicization takes hold when people perceive scientists or other informants as lacking credibility and when they are motivated to reason in ways consistent with their prior beliefs. Thus, politicization can be countered by addressing those hurdles: establishing credibility and encouraging an accuracy motivation. There are a couple of other things we can do to overcome this hurdle:

  • Warn people of politicization they’re likely to encounter before they encounter it. This is sometimes referred to as an inoculation message, and it points out the strategies politicizers use and why their message is not to be trusted.
  • Correct politicized messages after people have encountered them. Corrections are often not as effective as inoculation messages since people may have already had time to process and begin to believe the politicized message. However, corrections can be effective when people are motivated to reach an accurate understanding.

There’s more

Of course, these aren’t the only hurdles to communicating policy-relevant science. Other hurdles described by Druckman that I haven’t elaborated on include the effort it demands of scientists, getting and maintaining attention, establishing credibility, and overcoming government inaction.

More and more scientists are recognizing the value of communicating their science outside the Ivory Tower. At the same time, the science of science communication is advancing to help us all understand the hurdles we face and how to best overcome them.

Getting a scientific message across means taking human nature into account

I really enjoyed thinking about, researching, and writing this piece for The Conversation, where this work was originally published.

Rose Hendricks, University of California, San Diego

We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. On too many important issues, science has reached a consensus that the public has not.

Scientists and the media need to communicate more science and communicate it better. Good communication ensures that scientific progress benefits society, bolsters democracy, weakens the potency of fake news and misinformation, and fulfills researchers’ responsibility to engage with the public. Such beliefs have motivated training programs, workshops and a research agenda from the National Academies of Sciences, Engineering, and Medicine on learning more about science communication. A resounding question remains for science communicators: What can we do better?

A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model.”

But in reality, just knowing facts doesn’t necessarily guarantee that one’s opinions and behaviors will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Or they read an online article by a scientist about the necessity of vaccines, but leave comments expressing outrage that doctors are trying to further a pro-vaccine agenda. Convincing people that scientific evidence has merit and should guide behavior may be the greatest science communication challenge, particularly in our “post-truth” era.

Luckily, we know a lot about human psychology – how people perceive, reason and learn about the world – and many lessons from psychology can be applied to science communication endeavors.

Consider human nature

Regardless of your religious affiliation, imagine that you’ve always learned that God created human beings just as we are today. Your parents, teachers and books all told you so. You’ve also noticed throughout your life that science is pretty useful – you especially love heating up a frozen dinner in the microwave while browsing Snapchat on your iPhone.

One day you read that scientists have evidence for human evolution. You feel uncomfortable: Were your parents, teachers and books wrong about where people originally came from? Are these scientists wrong? You experience cognitive dissonance – the uneasiness that results from entertaining two conflicting ideas.

[Image: It’s uncomfortable to hold two conflicting ideas at the same time.]

Psychologist Leon Festinger first articulated the theory of cognitive dissonance in 1957, noting that it’s human nature to be uncomfortable with maintaining two conflicting beliefs at the same time. That discomfort leads us to try to reconcile the competing ideas we come across. Regardless of political leaning, we’re hesitant to accept new information that contradicts our existing worldviews.

One way we subconsciously avoid cognitive dissonance is through confirmation bias – a tendency to seek information that confirms what we already believe and discard information that doesn’t.

This human tendency was first demonstrated by psychologist Peter Wason in the 1960s in a simple logic experiment. He found that people tend to seek confirmatory information and avoid information that would potentially disprove their beliefs.

The concept of confirmation bias scales up to larger issues, too. For example, psychologists John Cook and Stephan Lewandowsky asked people about their beliefs concerning global warming and then gave them information stating that 97 percent of scientists agree that human activity causes climate change. The researchers measured whether the information about the scientific consensus influenced people’s beliefs about global warming.

Those who initially opposed the idea of human-caused global warming became even less accepting after reading about the scientific consensus on the issue. People who had already believed that human actions cause global warming supported their position even more strongly after learning about the scientific consensus. Presenting these participants with factual information ended up further polarizing their views, strengthening everyone’s resolve in their initial positions. It was a case of confirmation bias at work: New information consistent with prior beliefs strengthened those beliefs; new information conflicting with existing beliefs led people to discredit the message as a way to hold on to their original position.

[Image: Just shouting louder isn’t going to help.]

Overcoming cognitive biases

How can science communicators share their messages in a way that leads people to change their beliefs and actions about important science issues, given our natural cognitive biases?

The first step is to acknowledge that every audience has preexisting beliefs about the world. Expect those beliefs to color the way they receive your message. Anticipate that people will accept information that is consistent with their prior beliefs and discredit information that is not.

Then, focus on framing. No message can contain all the information available on a topic, so any communication will emphasize some aspects while downplaying others. While it’s unhelpful to cherry-pick and present only evidence in your favor – which can backfire anyway – it is helpful to focus on what an audience cares about.

For example, researchers at the University of California point out that the idea of climate change causing rising sea levels may not alarm an inland farmer dealing with drought as much as it does someone living on the coast. Referring to the impact our actions today may have for our grandchildren might be more compelling to those who actually have grandchildren than to those who don’t. By anticipating what an audience believes and what’s important to them, communicators can choose more effective frames for their messages – focusing on the most compelling aspects of the issue for their audience and presenting it in a way the audience can identify with.

In addition to the ideas expressed in a frame, the specific words used matter. Psychologists Amos Tversky and Daniel Kahneman first showed that when numerical information is presented in different ways, people think about it differently. Here’s an example from their 1981 study:

Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimate of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. If Program B is adopted, there is ⅓ probability that 600 people will be saved, and ⅔ probability that no people will be saved.

Both programs have an expected value of 200 lives saved. But 72 percent of participants chose Program A. We reason about mathematically equivalent options differently when they’re framed differently: Our intuitions are often not consistent with probabilities and other math concepts.
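For readers who want the arithmetic spelled out:

Program A: 200 lives saved, with certainty.
Program B: ⅓ × 600 + ⅔ × 0 = 200 lives saved, in expectation.

The outcomes are identical on average; only the descriptions differ, with Program A framed as a sure gain and Program B as a gamble.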

Metaphors can also act as linguistic frames. Psychologists Paul Thibodeau and Lera Boroditsky found that people who read that crime is a beast proposed different solutions than those who read that crime is a virus – even if they had no memory of reading the metaphor. The metaphors guided people’s reasoning, encouraging them to transfer solutions they’d propose for real beasts (cage them) or viruses (find the source) to dealing with crime (harsher law enforcement or more social programs).

The words we use to package our ideas can drastically influence how people think about those ideas.

What’s next?

We have a lot to learn. Quantitative research on the efficacy of science communication strategies is in its infancy but becoming an increasing priority. As we continue to untangle more about what works and why, it’s important for science communicators to be conscious of the biases they and their audiences bring to their exchanges and the frames they select to share their messages.