Getting a scientific message across means taking human nature into account

I really enjoyed thinking about, researching, and writing this piece for The Conversation, where this work was originally published.

Rose Hendricks, University of California, San Diego

We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. There are too many important issues on which science has reached a consensus but the public has not.

Scientists and the media need to communicate more science and communicate it better. Good communication ensures that scientific progress benefits society, bolsters democracy, weakens the potency of fake news and misinformation and fulfills researchers’ responsibility to engage with the public. Such beliefs have motivated training programs, workshops and a research agenda from the National Academies of Sciences, Engineering, and Medicine on learning more about science communication. A resounding question remains for science communicators: What can we do better?

A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model.”

But in reality, just knowing facts doesn’t necessarily guarantee that one’s opinions and behaviors will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Or they read an online article by a scientist about the necessity of vaccines, but leave comments expressing outrage that doctors are trying to further a pro-vaccine agenda. Convincing people that scientific evidence has merit and should guide behavior may be the greatest science communication challenge, particularly in our “post-truth” era.

Luckily, we know a lot about human psychology – how people perceive, reason and learn about the world – and many lessons from psychology can be applied to science communication endeavors.

Consider human nature

Regardless of your religious affiliation, imagine that you’ve always learned that God created human beings just as we are today. Your parents, teachers and books all told you so. You’ve also noticed throughout your life that science is pretty useful – you especially love heating up a frozen dinner in the microwave while browsing Snapchat on your iPhone.

One day you read that scientists have evidence for human evolution. You feel uncomfortable: Were your parents, teachers and books wrong about where people originally came from? Are these scientists wrong? You experience cognitive dissonance – the uneasiness that results from entertaining two conflicting ideas.

It’s uncomfortable to hold two conflicting ideas at the same time. Image via www.shutterstock.com.

Psychologist Leon Festinger first articulated the theory of cognitive dissonance in 1957, noting that it’s human nature to be uncomfortable with maintaining two conflicting beliefs at the same time. That discomfort leads us to try to reconcile the competing ideas we come across. Regardless of political leaning, we’re hesitant to accept new information that contradicts our existing worldviews.

One way we subconsciously avoid cognitive dissonance is through confirmation bias – a tendency to seek information that confirms what we already believe and discard information that doesn’t.

This human tendency was first demonstrated by psychologist Peter Wason in the 1960s with a simple logic experiment. He found that people tend to seek confirmatory information and avoid information that would potentially disprove their beliefs.

The concept of confirmation bias scales up to larger issues, too. For example, psychologists John Cook and Stephan Lewandowsky asked people about their beliefs concerning global warming and then gave them information stating that 97 percent of scientists agree that human activity causes climate change. The researchers measured whether the information about the scientific consensus influenced people’s beliefs about global warming.

Those who initially opposed the idea of human-caused global warming became even less accepting after reading about the scientific consensus on the issue. People who had already believed that human actions cause global warming supported their position even more strongly after learning about the scientific consensus. Presenting these participants with factual information ended up further polarizing their views, strengthening everyone’s resolve in their initial positions. It was a case of confirmation bias at work: New information consistent with prior beliefs strengthened those beliefs; new information conflicting with existing beliefs led people to discredit the message as a way to hold on to their original position.

Just shouting louder isn’t going to help. Image via www.shutterstock.com.

Overcoming cognitive biases

How can science communicators share their messages in a way that leads people to change their beliefs and actions about important science issues, given our natural cognitive biases?

The first step is to acknowledge that every audience has preexisting beliefs about the world. Expect those beliefs to color the way they receive your message. Anticipate that people will accept information that is consistent with their prior beliefs and discredit information that is not.

Then, focus on framing. No message can contain all the information available on a topic, so any communication will emphasize some aspects while downplaying others. While it’s unhelpful to cherry-pick and present only evidence in your favor – which can backfire anyway – it is helpful to focus on what an audience cares about.

For example, researchers at the University of California point out that the idea of climate change causing rising sea levels may not alarm an inland farmer dealing with drought as much as it does someone living on the coast. Referring to the impact our actions today may have for our grandchildren might be more compelling to those who actually have grandchildren than to those who don’t. By anticipating what an audience believes and what’s important to them, communicators can choose more effective frames for their messages – focusing on the most compelling aspects of the issue for their audience and presenting it in a way the audience can identify with.

In addition to the ideas expressed in a frame, the specific words used matter. Psychologists Amos Tversky and Daniel Kahneman first showed that when numerical information is presented in different ways, people think about it differently. Here’s an example from their 1981 study:

Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimate of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. If Program B is adopted, there is ⅓ probability that 600 people will be saved, and ⅔ probability that no people will be saved.

Both programs have an expected value of 200 lives saved. But 72 percent of participants chose Program A. We reason about mathematically equivalent options differently when they’re framed differently: Our intuitions are often not consistent with probabilities and other math concepts.
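
To see why the two options are mathematically equivalent, here is a quick back-of-the-envelope check (a minimal sketch in Python, simply restating the numbers from the scenario above):

```python
# Expected number of lives saved under each program (600 people at risk)
program_a = 200                       # Program A: 200 saved with certainty
program_b = (1/3) * 600 + (2/3) * 0   # Program B: 1/3 chance all 600 are saved

print(program_a, program_b)  # -> 200 200.0: identical expected values
```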

Metaphors can also act as linguistic frames. Psychologists Paul Thibodeau and Lera Boroditsky found that people who read that crime is a beast proposed different solutions than those who read that crime is a virus – even if they had no memory of reading the metaphor. The metaphors guided people’s reasoning, encouraging them to transfer solutions they’d propose for real beasts (cage them) or viruses (find the source) to dealing with crime (harsher law enforcement or more social programs).

The words we use to package our ideas can drastically influence how people think about those ideas.

What’s next?

We have a lot to learn. Quantitative research on the efficacy of science communication strategies is in its infancy but becoming an increasing priority. As we continue to untangle more about what works and why, it’s important for science communicators to be conscious of the biases they and their audiences bring to their exchanges and the frames they select to share their messages.


Chapter 2 TLDR Guide to Communicating Science Effectively: A Research Agenda

The National Academies of Sciences, Engineering, and Medicine published a thorough (127-page) guide for communicating science effectively, with a detailed description of what the science of science communication has already revealed, but more importantly, with an agenda for the future of research on this topic. It’s long but useful, so I’ve broken it down into an abridged guide. Yesterday I posted my distillation of chapter 1, and today’s focus is chapter 2.


Chapter 2: The complexities of communicating science

Public engagement: seeking and facilitating the sharing and exchange of knowledge, perspectives, and preferences between or among groups who often have differences in expertise, power, and values

  • Public engagement is important for goals of generating excitement, sharing info needed for a decision, and finding common ground on an issue among diverse stakeholders.

Challenges posed by scientific content

Uncertainty. People generally dislike uncertainty and avoid ambiguity. As a result, it might seem that glossing over the uncertainty inherent in science would be a productive way to communicate. However, avoiding discussion of uncertainty is a problem too: it creates a false sense of certainty, and if (or when) new findings require the original information to be revised, people are likely to lose trust in the communicators. So far, presenting relevant narratives seems to be an effective way to engage audiences with scientific issues, helping them to remember and process the information, but we need more research on the role of narratives for communicating science and on broader best practices for communicating scientific uncertainty.

Lifescape series: Ambiguity or Opportunity? by ArtistIvanChew (CC)

Different audiences, different needs

Aspects of audiences that affect science communication help explain why the same information can be understood very differently by different people:

  • Prior knowledge of science
    Plus, scientific knowledge alone doesn’t necessarily lead to holding positive attitudes toward science. Instead, someone’s characteristics, background, values and beliefs, and the information they receive from the media all influence the effect their scientific knowledge has on their attitudes.
  • Ability to understand numeric information
    Communication strategies that rely on quantities, rates, or probabilities are often more successful when they account for the fact that people (including scientists, particularly when the issue is outside their area of expertise) struggle to make sense of numeric information, rather than simply presenting the numbers. In health communications, at least, the following strategies have proven helpful:

    • Don’t avoid the numbers – provide them.
    • Reduce the cognitive effort required by the consumer.
    • Explain what the numbers mean.
    • Draw attention to important information.
  • Ways of interpreting new information
    Everyone has their own beliefs about the way the world works, and these beliefs play prominent roles in making sense of new information. We also rely heavily on mental shortcuts when we encounter new information:

    • Heuristics: We tend to believe information that is consistent with our preexisting beliefs, and information that we encounter often, over inconsistent or rarely encountered info.
    • Emotion: Our initial emotional reactions to new information can shape the way we continue to think about that information, and some research suggests that we tend to pay more attention to negative than positive information.
    • Motivated reasoning: We’re biased to make sense of information in a way that is consistent with our immediately accessible beliefs and feelings.
    • Cognitive dissonance: We’re able to hold two conflicting thoughts, but that often makes us feel uncomfortable, and we try to resolve that conflict for ourselves. If you really love Big Macs, for example, and you also know that health professionals say Big Macs are not good for you, you might feel some dissonance. You can either change your behavior (stop eating Big Macs) or justify your behavior by tweaking your belief (well, I walked into the restaurant instead of using the drive thru, so I got my exercise and can probably have the Big Mac OR well, those scientists are studying mice so really, does that apply to me? OR well, I’m poor and a Big Mac is cheap OR, or, or…).

Presenting information in different forms

The way we present information affects the way it’s received.

Framing is when information is presented in a particular way to influence how people interpret it. When issues are communicated in terms of being a priority or a problem, or when specific causes and solutions are focused on, the issue is being framed. Framing is an inherent part of persuasion and communication about complex topics: You can’t possibly present an issue in its entirety, so a communicator must decide what to highlight and what to downplay. When frames are relevant to the way a person already thinks about the world, they’re most likely to be influential.

  • Gain/loss framing: A 70% success rate and a 30% failure rate are mathematically the same, but depending on the context, may actually influence people in different ways. However, whether framing an issue in terms of potential gains or potential losses influences people more seems to vary based on the issue at hand, so we need more research to understand when each framing is most beneficial.
  • Emphasis framing: Complex issues are often presented as story lines that suggest different trains of thought, which in turn emphasize some features of an issue over others. In particular, scientific information is often presented in terms of personalized stories (episodes) or more generally (themes). Again, the issue at hand determines how productive emphasizing episodes vs. themes will be, so we need more research.

Trust and credibility of science communication

People rely primarily on social information to figure out what and whom to believe about scientific issues:

  • Having common interests, in that the communicator and the audience both want the same outcome from the communication
    • This point relates to the earlier points on the ways we encounter new information. When scientific information conflicts with someone’s political ideology, they might not only reject the information, but their trust in the communicator might also decline.
  • Perceived expertise, which is not equivalent to a communicator’s actual expertise.

Applying the lessons of large-scale science communication efforts

  • A message needs sufficient exposure (aka, a lot) so that it reaches enough of the target audience to bring about change.
  • Communication that’s provided before people form strong opinions on a topic is likely to be more educational than communication after, so timing matters. It can be helpful to expose people early to counterarguments for the misinformation they may eventually receive, as a way of “inoculating” them from misinformation.
  • Duration is also crucial: “long-term and comprehensive approaches” will likely be successful and necessary for communication goals. Isolated attempts are not enough.

An overall theme of this chapter is that because of the many complexities of communicating science, “…an effective science communication strategy will be iterative and adaptable… it will evolve over time based on lessons learned about what is and is not working, as well as shifting needs and opportunities.” (p. 35)


Tomorrow I’ll post a condensed guide to Chapter 3: The Nature of Science-Related Public Controversies.

The problem with things that “cause” cancer

Causation is a tough concept to wrap our heads around. In its simplest sense, we say that one thing causes another when the first made that second thing happen. This is usually a 1:1 relationship. A leads to B, regardless of whether some other things do or don’t happen, and without A, B would not happen.

One common error is to attribute causality when there is none. It’s this type of thinking that leads us to believe that we need a lucky pencil to take tests – with it, we’ll ace the test; without it, we’ll bomb. When two things are correlated (for example, losing fifteen pounds and getting asked on more dates), it’s easy to make a causal inference, even when it’s not warranted. This is the reason that science teachers drill the phrase CORRELATION IS NOT CAUSATION into students’ heads.

Image: xkcd, https://xkcd.com/552/

We can also make the reverse inferential mistake; that is, when one thing does actually cause another, we can interpret it as a mere correlation. This is especially true when accepting causation would require that we change our behavior. For example, we might be less likely to really buy into the idea that obesity leads to heart disease if it suggests that we should change our habits, instead diluting the relationship to a more correlational one in our minds, acknowledging that, yeah, people who are obese tend to have more heart disease, but there are plenty of obese people who don’t, so maybe there’s no need to cut out the Big Macs just yet. This is commonly referred to as cognitive dissonance: having inconsistent thoughts, beliefs, or attitudes, especially as relating to behavioral decisions and attitude change.

To further complicate causal thinking, many things don’t have 1:1 causes. A might cause B, but only in the presence of C, D, and E, or only in the absence of F and G. And sometimes one of those factors that mediates whether A causes B is pure randomness. This is another concept that is really difficult for humans to wrap our heads around, but randomness has played a huge role in making us the creatures we are and making the world the place it is today.
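
To make that structure concrete, here is a schematic sketch (the factors and the 50/50 odds are entirely hypothetical, chosen only to show the shape of the logic):

```python
import random

def b_occurs(a, c, d, e, f, g):
    """A leads to B only when C, D, and E are present, F and G are
    absent, and a random mediator happens to cooperate."""
    enabling = c and d and e
    blocking = f or g
    lucky = random.random() < 0.5  # arbitrary odds standing in for pure randomness
    return a and enabling and not blocking and lucky
```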

This week the World Health Organization (WHO) made a splash by releasing guidelines that placed processed meats in the same “cause” category for cancer as smoking and asbestos. What does this mean? It means that the WHO is confident that processed meats increase our likelihood of developing cancer. It does not mean that they increase our chances of getting cancer as much as asbestos or smoking do, but that the WHO is equally confident that all of these things do in fact increase cancer risk. This is not one of those straightforward A-causes-B types of causation, though. We know that there are some people who eat lots of processed meats and never develop cancer. The causation is one of the more complicated types, most notably involving randomness. If someone eats a lot of these meats and then the right randomness (genetic mutations) takes place, that person is more likely to end up with cancer than someone who didn’t eat any processed meat but experienced the same randomness (though that second person could very well get the disease too, as we know).
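
A tiny simulation can make this kind of “cause” concrete. The risk numbers below are made up purely for illustration (they are not WHO figures); the point is that this sort of cause shows up as a shift in rates across many people, never as a guarantee for any one person:

```python
import random

# Illustrative only: these risks are invented, not WHO estimates.
BASE_RISK = 0.05  # hypothetical lifetime cancer risk without processed meat
MEAT_RISK = 0.06  # hypothetical (slightly higher) risk with lots of it

def develops_cancer(eats_processed_meat):
    """One simulated person; randomness (e.g., mutations) decides the outcome."""
    risk = MEAT_RISK if eats_processed_meat else BASE_RISK
    return random.random() < risk

n = 100_000
eaters = sum(develops_cancer(True) for _ in range(n))
abstainers = sum(develops_cancer(False) for _ in range(n))
print(f"rate among eaters: {eaters/n:.3f}, among abstainers: {abstainers/n:.3f}")
```

Plenty of simulated meat eaters stay healthy and some abstainers don’t, yet across the whole population the elevated rate is reliably there.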

So the word “cause” is not a lie, or even an exaggeration. It’s true. But how do we interpret it? This week, it seems that most people interpreted it as the 1:1 relationship cause, accounting for much of the media hype. It might seem, then, that we should avoid this chaos-inducing word, and instead go for something less anxiety-provoking: maybe “linked to” or “associated with” would get the job done.

These weaker phrases have their own drawbacks, though, precisely because they induce less alarm. They are likely to encourage more cognitive dissonance, more of the reasoning that this is not something that affects me personally and I therefore shouldn’t feel as compelled to overhaul my sausage-filled diet.

There is probably no single verb that can be used in a headline to capture the relationship between certain behaviors and cancer risk, one that will encourage the right amount of alarm. Our best bet is to be aware that there are no perfect words to talk about complex ideas, and that means we will inevitably use imperfect words, words that mislead in different ways. Sometimes it takes some media chaos for an issue to get the attention it needs so that people can understand a situation and make informed decisions. Hopefully this is one of those times.

 

PS: There is a very cool study of science blogs and blog readers going on! I’ll also be receiving information about survey results from my blog readers, so your responses will be helpful to me as well as the researchers looking to learn more about science blogging more generally. To participate, take this survey: http://lsu.qualtrics.com/jfe/form/SV_0dIyegEdCzOFNxr

For completing the survey, readers will be entered into a drawing for a $50.00 Amazon gift card and other prizes, and all participants will receive a small thank-you gift.