Hurdles to Communicating Science & Strategies to Overcome Them

Communicating science is hard in part because doing and understanding science is hard, but there are also some unique hurdles that science communicators face — especially when communicating information that’s relevant for policies. James Druckman recently described some of the challenges that particularly face people communicating policy-relevant science, and ways those challenges can be minimized.

Value-laden diversity

We all have values, relatively unchanging beliefs that reflect the way we see the world. For example, some people are more "individualist," while others are more "communitarian." If scientific information seems to contradict a value, we'll be hesitant to accept that information. Information about climate change, especially if it contains or implies suggestions for reducing the problem by increasing regulations on businesses, might contradict an individualist's values, making it hard for that person to even consider the scientific information. For more on this hurdle, see Dan Kahan's work on the science of science communication, an earlier post on this blog about Kahan's work, or a great post by Chris Mooney on Mother Jones.

What to do about it

First, communicators have to recognize that their audience will have a diverse set of values, some of which will conflict with the communicator’s values, and that these values will influence the way people receive scientific information.

Next, communicators should minimize the extent to which their message contains value commentary. In other words, they should show how the relevant science bears on certain policy decisions without defining what a "good" or "competent" decision would be.

Hurdles start. by robert voors. CC BY-NC-ND.

Motivated Reasoning

Motivated reasoning (or confirmation bias) is our drive to seek information that reinforces our prior beliefs and disregard information that does not. For example, in work by Druckman & Bolsen (2011), participants initially indicated their support for genetically modified (GM) foods. After 10 days, all participants received 3 types of information: positive information about how GM foods combat diseases, negative information about their possible long-term health consequences, and neutral information about their economic consequences.

People who initially supported GM foods dismissed the negative information and rated the positive information as valid, and perceived the neutral information as indicating benefits of GM foods. People who were initially opposed to GM foods did the exact opposite: dismissed the positive information, considered the negative information as valid, and interpreted the neutral information as indicating drawbacks of GM foods. Work on motivated reasoning shows we interpret information through a lens laden with our prior beliefs.

Last of the crop. By Mrs eNil. CC BY-ND-NC.

There have been lots of great articles highlighting motivated reasoning lately. These include Why Facts don’t Change our Minds, This Article won’t Change your Mind, and Why You Think You’re Right, Even When You’re Wrong.

What to do about it

When motivated reasoning occurs, people are motivated to understand information in a way that aligns with their previous beliefs. Instead, science communicators want to motivate their audience to understand new information in a way that will lead to maximum accuracy. There are a few things communicators can do to encourage people to seek accurate understandings:

  • Show that the issue and information matter for the individual’s life. Show relevance.
  • Present information that comes from a variety of sources, preferably ones with different goals (e.g., from both Democratic and Republican sources).
  • Encourage people to explain their position to others (or at least prepare to explain it). Elaborating on a position requires people to think it through more carefully and to provide explicit evidence for their claims that goes beyond "because I want to believe this."

Politicization

This term does not mean what we might expect given its name. Politicization is exploiting "the inevitable uncertainties about aspects of science to cast doubt on the science overall…thereby magnifying doubts in the public mind" (Steketee 2010, p. 2). It's not exactly misinformation, since it doesn't introduce false findings, but instead magnifies doubt. It's especially common in debates about global warming and vaccination. People who politicize these issues send the message that scientific evidence on these issues is not as conclusive as it's been made out to be.

What to do about it

Politicization gains traction when people perceive scientists or other informants as lacking credibility and when they are motivated to reason in ways consistent with their prior beliefs. Thus, politicization can be countered by addressing those hurdles – establishing credibility and encouraging an accuracy motivation. There are a couple of other things we can do to overcome this hurdle:

  • Warn people of politicization they're likely to encounter before they encounter it. This is sometimes referred to as an inoculation message, and it points out the strategies politicizers use and why their message is not to be trusted.
  • Correct politicized messages after people have encountered them. Corrections are often not as effective as inoculation messages since people may have already had time to process and begin to believe the politicized message. However, corrections can be effective when people are motivated to reach an accurate understanding.

There’s more

Of course, these aren’t the only hurdles to communicating policy-relevant science. Other hurdles described by Druckman that I haven’t elaborated on include: communicating policy-relevant science requires effort on the part of scientists, getting and maintaining attention, establishing credibility, and changing government inaction.

More and more scientists are recognizing the value of communicating their science outside the Ivory Tower. At the same time, the science of science communication is advancing to help us all understand the hurdles we face and how to best overcome them.

Scientists Agree on Climate Change: A Gateway Belief

 

https://climate.nasa.gov/scientific-consensus/

It doesn’t get much clearer. The Earth’s climate is warming. Humans are the reason. But how many people are actually aware of the scientific consensus on this issue?

Research by Sander van der Linden and colleagues shows that when people believe that scientists overwhelmingly agree about climate change, they increase (a) their own belief in climate change and (b) their belief that humans are responsible. They also feel (c) more worried about climate change, and as a result of (a), (b), and (c), they support public action to mitigate the effects of climate change.

Figure from the van der Linden et al. study (journal.pone.0118489.g001).

At the beginning of the study, participants indicated the percentage of scientists they thought agree on global warming and they answered some questions about their own climate change beliefs. People then received a message about scientific consensus, which took the form of either a) a simple description, b) a pie chart, or c) a metaphorical comparison related to trusting engineers' consensus about bridges (i.e., if 97% of engineers agreed that a bridge was unsafe, would you use it?) or doctors' consensus about illness. All the messages included the information that "97% of climate scientists have concluded that human-caused climate change is happening."

Then participants again indicated what percentage of scientists they thought agree on global warming and answered questions about their own beliefs. All messages "worked," in the sense that people perceived greater scientific agreement after reading that 97% of scientists agree than if they hadn't read anything about the consensus at all. The simple description and pie chart were more effective than the metaphor, though: people shifted their climate change beliefs more after encountering one of the more straightforward messages than after the more complex metaphor. Great food for thought, as many science communicators insert metaphors wherever they can.

Of course, having people believe that there’s strong scientific consensus about climate is only one step toward the larger goal of having them endorse actions that mitigate the effects of climate change. But in follow-up analyses, the researchers identified that perceiving scientific agreement is a gateway belief: believing that scientists agree about global warming led to other beliefs, ones that get us closer to the goal of actions in favor of mitigating climate change. Specifically, it led to greater belief that climate change was real, human-caused, and worrisome. These beliefs, in turn, led to greater support for public action against climate change. It’s often hard to know what leads to what, especially when it comes to beliefs we keep hidden in our own heads, but with some semi-fancy math, these researchers quantified those relationships.
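That "semi-fancy math" is essentially a mediation analysis: estimate the effect of the consensus message on perceived agreement, estimate the effect of perceived agreement on downstream beliefs, and multiply the two to get the indirect (gateway) effect. Here is a minimal sketch of that logic with simulated data; the variable names and numbers are placeholders, and the actual study used more sophisticated modeling of real survey responses.

```python
# Minimal sketch of the mediation logic behind a "gateway belief" analysis.
# All data are simulated and all variable names are placeholders; this is not
# the authors' actual model, just an illustration of the indirect-effect idea.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

# 0/1 indicator: did the participant see a consensus message?
message = rng.integers(0, 2, n)

# Perceived scientific consensus (the hypothesized "gateway" belief),
# nudged upward by the message.
perceived_consensus = 60 + 10 * message + rng.normal(0, 10, n)

# A downstream belief (e.g., "climate change is real"), driven by perceived consensus.
belief = 0.05 * perceived_consensus + rng.normal(0, 1, n)

# Path a: message -> perceived consensus
a = sm.OLS(perceived_consensus, sm.add_constant(message)).fit().params[1]

# Path b: perceived consensus -> belief, controlling for the message itself
X = sm.add_constant(np.column_stack([perceived_consensus, message]))
b = sm.OLS(belief, X).fit().params[1]

print(f"indirect (mediated) effect of the message on belief: {a * b:.3f}")
```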

Climate 365 by NASA Goddard Space Flight Center. CC BY.

These studies have some clear takeaways for science communicators, especially for communicating about climate change (these ideas may apply to other topics too, but we need more research!):

  • Emphasize scientific consensus, that an overwhelming percentage of scientists agree that climate change is a real problem caused by human activity.
  • Don’t worry so much about immediately pushing for public action against climate change. When people understand that scientists agree, they come to agree themselves that climate change is a problem that should be addressed, and THEN they come to support public action. Be careful about skipping steps.

At the same time, there’s not only one right way to communicate about climate change. There are truly effective ways, ineffective and potentially backfiring ways, and many in between. There aren’t cut-and-dry rules because every audience is unique, and taking the audience into account — their beliefs, values, and past experiences, for example — is crucial. But this work sheds light on communication strategies that are probably pretty far toward the “truly effective” end of the ways-to-communicate-climate-change continuum.

Communicating climate change: Focus on the framing, not just the facts

How you package the information matters.
Frame image via www.shutterstock.com.

Rose Hendricks, University of California, San Diego

Humans are currently in a war against global warming. Or is it a race against global warming? Or maybe it's just a problem we have to deal with?

If you already consider climate change a pressing issue, you might not think carefully about the way you talk about it – regardless of how you discuss it, you already think of global warming as a problem. But the way we talk about climate change affects the way people think about it.

For scientific evidence to shape people’s actions – both personal behaviors like recycling and choices on policies to vote for – it’s crucial that science be communicated to the public effectively. Social scientists have been increasingly studying the science of science communication, to better understand what does and does not work for discussing different scientific topics. It turns out the language you use and how you frame the discussion can make a big difference.

The paradox of science communication

“Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know,” writes Yale law professor Dan Kahan, a leading researcher in the science of science communication.

Kahan’s work shows that just because someone has scientific knowledge, he or she won’t necessarily hold science-supported beliefs about controversial topics like global warming, private gun possession or fracking.

Instead, beliefs are shaped by the social groups people consider themselves to be a part of. We’re all simultaneously members of many social groups – based, for example, on political or religious affiliation, occupation or sexuality. If people are confronted with scientific evidence that seems to attack their group’s values, they’re likely to become defensive. They may consider the evidence they’ve encountered to be flawed, and strengthen their conviction in their prior beliefs.

Unfortunately, scientific evidence does sometimes contradict some groups’ values. For example, some religious people trust a strict reading of the Bible: God said there would be four seasons, and hot and cold, so they don’t worry about the patterns in climate that alarm scientists. In cases like this one, how can communicators get their message across?

A growing body of research suggests that instead of bombarding people with piles of evidence, science communicators can focus more on how they present it. The problem isn’t that people haven’t been given enough facts. It’s that they haven’t been given facts in the right ways. Researchers often refer to this packaging as framing. Just as picture frames enhance and draw attention to parts of an image inside, linguistic frames can do the same with ideas.

One framing technique Kahan encourages is disentangling facts from people’s identities. Biologist Andrew Thaler describes one way of doing so in a post called “When I talk about climate change, I don’t talk about science.” Instead, he talks about things that are important to his audiences, such as fishing, flooding, farming, faith and the future. These issues that matter to the people with whom he’s communicating become an entry into discussing global warming. Now they can see scientific evidence as important to their social group identity, not contradictory to it.

Let me rephrase that

Metaphors also provide frames for talking about climate change. Recent work by psychologists Stephen Flusberg, Paul Thibodeau and Teenie Matlock suggests that the metaphors we use to describe global warming can influence people’s beliefs and actions.

Ready for combat?
Thomas Hawk, CC BY-NC

The researchers asked 3,000 Americans on an online platform to read a short fictional news article about climate change. The articles were identical except for the metaphors they used: One referred to the "war against" and another to the "race against" climate change. For example, each article included phrases about the U.S. seeking to either "combat" (war) or "go after" (race) excessive energy use.

After reading just one of these passages, participants answered questions about their global warming beliefs, like how serious global warming is and whether they would be willing to engage in more pro-environmental behaviors.

Metaphors mattered. Reading about the “war” against global warming led to greater agreement with scientific evidence showing it is real and human-caused. This group of participants indicated more urgency for reducing emissions, believed global warming poses a greater risk and responded that they were more willing to change their behaviors to reduce their carbon footprint than people who read about the “race” against global warming.

The only difference between the articles that participants read was the metaphors they included. Why would reading about a war rather than a race affect people’s beliefs about climate change in such important ways?

The researchers suggest that when we encounter war metaphors, we are reminded (though not always consciously) of other war-related concepts like death, destruction, opposition and struggle. These concepts affect our emotions and remind us of the negative feelings and consequences of defeat. With those war-related thoughts in mind, we may be motivated to avoid losing. If we have these war thoughts swimming around in our minds when we think about global warming, we’re more likely to believe it’s important to defeat the opponent, which, in this case, is global warming.

There are other analogies that are good at conveying the causes and consequences of global warming. Work by psychologists Kaitlin Raimi, Paul Stern and Alexander Maki suggests it helps to point out how global warming is similar to many medical diseases. For both, risks are often caused or aggravated by human behaviors, the processes are often progressive, they produce symptoms outside the normal range of past experiences, there are uncertainties in the prognosis of future events, treatment often involves trade-offs or side effects, it's usually most effective to treat the underlying problem instead of just alleviating symptoms and they're hard to reverse.

People who read the medical disease analogy for climate change were more likely to agree with the science-backed explanations for global warming causes and consequences than those who read a different analogy or no analogy at all.

Golden past or rosy future?

Climate change messages can also be framed by focusing on different time periods. Social psychologists Matthew Baldwin and Joris Lammers asked people to read either a past-focused climate change message (like “Looking back to our nation’s past… there was less traffic on the road”) or a similar future-focused message (“Looking forward to our nation’s future… there is increasing traffic on the road”).

The researchers found that self-identified conservatives, who tend to resist climate change messages more than liberals, agreed that we should change how we interact with the planet more after reading the past-focused passage. Liberals, on the other hand, reported liking the future-focused frame better, but the frames had no influence on their environmental attitudes.

Example of a past-focused image (top) and a future-focused image (bottom) of a reservoir.
Image courtesy of NASA. Used in Baldwin and Lammers, PNAS December 27, 2016 vol. 113 no. 52 14953-14957.

And the frames didn’t have to be words. Conservatives also shifted their beliefs to be more pro-environmental after seeing past-focused images (satellite images that progressed from the past to today) more than after seeing future-focused ones (satellite images that progressed from today into the future). Liberals showed no differences in their attitudes after seeing the two frames.

Many climate change messages focus on the potential future consequences of not addressing climate change now. This research on time-framing suggests that such a forward-looking message may in fact be unproductive for those who already tend to resist the idea.

There’s no one-size-fits-all frame for motivating people to care about climate change. Communicators need to know their audience and anticipate their reactions to different messages. When in doubt, though, these studies suggest science communicators might want to bring out the big guns and encourage people to fire away in this war on climate change, while reminding them how wonderful the Earth used to be before our universal opponent began attacking full force.

Rose Hendricks, Ph.D. Candidate in Cognitive Science, University of California, San Diego

This article was originally published on The Conversation. Read the original article.

Past vs. Future Frames for Communicating Climate Change

Climate change (is it happening? how problematic is it? and are humans responsible?) is a partisan issue. Work by Dan Kahan (which I've written about before) shows that conservatives are more likely than liberals to believe that climate change is not a result of human activity and that, even if nothing is done about it, it will not be as destructive as many people claim. Researchers Matthew Baldwin & Joris Lammers explore the possibility that partisan differences in beliefs about climate change might result from differences in the way conservatives and liberals tend to think about time (their temporal focus).

Their starting point was previous research showing that conservatives focus more on the past than liberals do. They then tested two competing frames: one was future-focused ("Looking forward to our nation's future… there is increasing traffic on the road") and the other was past-focused ("Looking back to our nation's past… there was less traffic on the road"). Each participant read just one of these and then reported their attitudes about climate change and the environment. Conservatives reported liking the past-focused message better than the future-focused one, and they also reported more pro-environmental attitudes after the past-focused frame than after the future-focused one.


They replicated these findings in additional experiments with variations. For example, in one test, instead of using linguistic frames to draw attention to either the past or the future, they used satellite images, either showing a progression from the past to today or a forecasted progression from today to the future. Again, conservatives reported more pro-environmental attitudes after viewing past-focused images than future-focused ones.

Next they investigated the temporal focus that real environmental charities tend to use. Not surprisingly, they found that the charities' messages disproportionately emphasize future consequences, with less focus on the past. Following up on this, they gave participants money that they could divide between two (fake) charities (one whose message was strongly past-focused and one whose message was strongly future-focused), or they could keep some or all of it. Participants saw each charity's logo and mission statement (the past-focused one stated: "Restoring the planet to its original state" and the future-focused one: "Creating a new Earth for the future").


Conservatives donated more to the past- than the future-oriented charity. Liberals did the opposite. Further, looking at just the past-oriented charity, conservatives donated more than liberals did; looking at just the future-oriented one, the opposite pattern emerged. This is a very beautiful interaction (plus the researchers did a few other experiments with slightly varied methods and a meta-analysis, all of which add some weight to these findings).

Considering the finding that climate change communications rely heavily on future-focused appeals, these findings should really make us pause. Is it possible that climate change itself isn't what divides conservatives and liberals so much as the way it's communicated? My intuition is that framing is not entirely to blame for conservatives' and liberals' divergent beliefs about climate change, but this work shows that it may be a big part of the story. It certainly won't hurt for communicators to start diversifying the temporal frames we use for discussing climate change.


For more consideration on this topic, see earlier posts: Climate change is a big problem and we need to find better ways of talking about it; Narratives for Communicating Climate Change; and The paradox of science communication and the new science to resolve it.

All figures from Baldwin, M. & Lammers, J. (2016). Past-focused environmental comparisons promote proenvironmental outcomes for conservatives. PNAS, 113(52), 14953-14957.

For a discussion of why the framing described in this paper might not be enough to change conservatives’ minds about climate change, see This one weird trick will not convince conservatives to fight climate change, by David Roberts for Vox.

Climate change is like a medical disease

I recently wrote for PLOS SciComm about a very cool study on the benefits of using analogies to talk about climate change (aptly called The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change). The researchers found that using any analogy (comparing climate change to a medical disease, a courtroom, or a natural disaster) was helpful, but that the medical disease analogy in particular helped people consider important aspects of climate change that often polarize people along political party lines.

Please check out the full piece here!

 

The paradox of science communication and the new science to resolve it

I recently discovered this interesting paper by Yale Law Professor Dan Kahan: What is the “science of science communication”? He introduces the concept of the science communication paradox: “Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know.” This figure demonstrates the science communication paradox, showing strong disagreements about different risks for people with different political beliefs:

According to Kahan, resolving the science communication paradox is the main goal of the new science of science communication. He lays out two potential explanations for the science communication paradox:

  • Public Irrationality Thesis (PIT): Advocates of this position believe that the public, on the whole, is not scientifically literate and does not think about risk the way scientists do.

If PIT is correct, then as people acquire more scientific literacy, their views on different risks should align more with scientists’ views. This is not what actually happens:

With increasing science comprehension scores, people actually become more polarized along party lines in their beliefs about the risk of climate change (shown in more detail in an earlier publication by the same group). Scientific literacy does not necessarily mean views aligned with scientists'.

  • Cultural Cognition Thesis (CCT): An alternative explanation for the science communication paradox, suggesting that our group identities are fundamental shapers of how we think about risk. Kahan gives the analogy of sports fans for two opposing teams: The opposing fans are likely to actually see a replay of a questionable call differently and in favor of their team. Along these lines, when people feel their group's stance is being threatened, they're more likely to see evidence as confirming their belief or to discount the source if it contradicts what they want to believe. This account is much more consistent with the real-world data (for example, the previous graph) and with the experimental work he and collaborators have done.

It's important to note that risk assessments of most science issues do not demonstrate the science communication paradox: for most issues, people with more science knowledge don't become more polarized in their beliefs. Here are a few examples of issues for which science intelligence predicts people's risk assessments better than their political leaning does:


How can we use awareness of the cultural cognition thesis to improve science communication?

Kahan suggests the disentanglement principle: If polarization results from a clash between people's identities as members of a cultural group and the scientific facts they're encountering, we should work to separate the two. For example, a true/false question might state: "Human beings, as we know them today, developed from earlier species of animals." Someone who belongs to a religious group that doesn't support evolution has to choose between answering this question in a way that's consistent with scientific consensus OR with their group identity. But rewording the statement to something like "According to the theory of evolution, human beings, as we know them today, developed from earlier species of animals" takes away the conflict. The respondent can now demonstrate scientific knowledge by agreeing with the statement without jeopardizing their identity as part of a group that doesn't believe in evolution.


While the original statement produces polarized views (as science knowledge increases, religious and non-religious people's responses diverge more), the reworded framing produces converging responses (scientific knowledge, rather than religious belief, becomes the best predictor of correct responses).

In a great blog post for Southern Fried Science, Andrew Thaler shares that he talks about a lot of things when he talks about climate change, but science isn’t one of them. He talks about fishing, flooding, farming, faith, and the future. These are things that his audiences know deeply, and climate change is relevant to anyone with interest in any of his f’s. He provides a great example of disentangling people’s identities from the scientific issue, and instead actually uses their identities to show the issue’s relevance.

The disentanglement principle offers one way that the new science of science communication begins to reduce the paradox of science communication, but it’s just one drop in a huge pond of paradox. We have to keep working on ways to communicate information that conflicts with people’s cultural identities, and as this work shows, jamming information down people’s throats isn’t the way to close belief gaps.


Feature image: Communication by Joan M. Mas, CC.

All figures from Kahan, D. (2015). What is the “science of science communication”? Journal of Science Communication, 14(3).

Narratives for communicating climate change

Last week I wrote about work by UC researchers on framing climate change, a chapter that focuses on how we can harness our understanding of human psychology — how we learn, think, and behave — to communicate science better. Here’s another paper (one that’s gotten very popular, very quickly) that considers human cognition for the efficacy of communicating about climate change.

The authors of the paper Narrative Style Influences Citation Frequency in Climate Change Science (Ann Hillier, Ryan Kelly, & Terrie Klinger, all from the University of Washington) started with the insight from psychology that people understand and remember story-like (narrative) writing better than explanatory (expository) writing. They considered abstracts from 802 scientific papers about climate change, and looked for different markers of narrative structure:
1) description of setting (where/when the events took place)
2) narrative perspective (the presence of a narrator)
3) sensory language (appealing to the senses or emotions)
4) conjunctions (used often in narratives to connect events logically)
5) connectivity (phrases that create explicit links to something mentioned earlier in the text)
6) appeal (whether the text makes an appeal to the reader or a recommendation for specific action)

The authors crowdsourced this first part of their data analysis: non-scientists on an online job platform (crowdflower.com) were given the authors' instructions for analyzing the abstracts. This way, each abstract was analyzed by 7 independent people, and the analysis involved human interpretation and discretion, which likely provides a more accurate index of narrativity than any computerized method can at the moment.

The authors treated the number of times each paper had been cited by others as a reflection of how much impact it had on subsequent science. They found that 4 of their 6 narrative indicators (sensory language, conjunctions, connectivity, and appeal to the reader) were related to how frequently articles were cited. In other words, papers higher in narrativity were cited more often than those that were more expository.
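To make the shape of that analysis concrete, here is a rough sketch of how one might go from crowdsourced ratings to a narrativity index and test its association with citation counts. The data below are simulated, and the majority-vote aggregation and Spearman correlation are illustrative choices, not necessarily the authors' actual pipeline.

```python
# Illustrative sketch: turn crowdsourced ratings into a narrativity index and
# check its association with citation counts. All data here are invented.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_abstracts, n_raters, n_indicators = 802, 7, 6

# ratings[i, j, k] = 1 if rater j judged indicator k to be present in abstract i
ratings = rng.integers(0, 2, size=(n_abstracts, n_raters, n_indicators))

# Majority vote across the 7 raters for each indicator, then sum to a 0-6 index.
indicator_present = ratings.sum(axis=1) >= 4        # shape: (802, 6)
narrativity_index = indicator_present.sum(axis=1)   # shape: (802,)

# Fake citation counts, loosely increasing with narrativity for illustration.
citations = rng.poisson(5 + 2 * narrativity_index)

# Spearman correlation is a reasonable choice for skewed count data.
rho, p = spearmanr(narrativity_index, citations)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```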

Subset of Figure 1, showing that as articles increase in narrativity, their citations increase as well.

The more citations a paper receives, the more other researchers will see the work. But it's possible that higher-quality work simply lends itself better to a narrative style, so that papers high in narrativity would be cited often regardless of how they're written. Since this study is correlational, we have no way of ruling out that possibility: the best science may be conducive to narrative presentation and would be cited a lot simply because it's good research. The causal arrow is not clear here, but it is clear that impactful research tends to take on a narrative structure. Even though narrative writing doesn't necessarily lead to citations, imitating the style of papers that are cited often doesn't seem like a bad idea.

This work is not the first to suggest that narratives can be helpful for understanding climate change. FrameWorks Institute, a nonprofit organization that designs ways to communicate complex issues and tests their efficacy for cognitive and behavior changes, has a toolkit that uses (visual) narratives to communicate about climate change. (Also note that the toolkit is just the tip of the iceberg for the extensive work FrameWorks has done on communicating climate change.)

Together, the work by FrameWorks and the study of narrativity and citations present a pretty clear takeaway for climate scientists (and likely scientists in many fields): ease off the traditional academic expository style and lean into a more understandable and memorable narrative style.


For an interesting (and more critical) take on this paper, see this post by Randy Olson at scienceneedsstory.com.

Getting a scientific message across means taking human nature into account

I really enjoyed thinking about, researching, and writing this piece for The Conversation, where this work was originally published.


Rose Hendricks, University of California, San Diego

We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. There are too many important issues that science has reached a consensus on that the public has not.

Scientists and the media need to communicate more science and communicate it better. Good communication ensures that scientific progress benefits society, bolsters democracy, weakens the potency of fake news and misinformation and fulfills researchers' responsibility to engage with the public. Such beliefs have motivated training programs, workshops and a research agenda from the National Academies of Sciences, Engineering, and Medicine on learning more about science communication. A resounding question remains for science communicators: What can we do better?

A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model.”

But in reality, just knowing facts doesn’t necessarily guarantee that one’s opinions and behaviors will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Or they read an online article by a scientist about the necessity of vaccines, but leave comments expressing outrage that doctors are trying to further a pro-vaccine agenda. Convincing people that scientific evidence has merit and should guide behavior may be the greatest science communication challenge, particularly in our “post-truth” era.

Luckily, we know a lot about human psychology – how people perceive, reason and learn about the world – and many lessons from psychology can be applied to science communication endeavors.

Consider human nature

Regardless of your religious affiliation, imagine that you’ve always learned that God created human beings just as we are today. Your parents, teachers and books all told you so. You’ve also noticed throughout your life that science is pretty useful – you especially love heating up a frozen dinner in the microwave while browsing Snapchat on your iPhone.

One day you read that scientists have evidence for human evolution. You feel uncomfortable: Were your parents, teachers and books wrong about where people originally came from? Are these scientists wrong? You experience cognitive dissonance – the uneasiness that results from entertaining two conflicting ideas.

It’s uncomfortable to hold two conflicting ideas at the same time. Man image via www.shutterstock.com.

Psychologist Leon Festinger first articulated the theory of cognitive dissonance in 1957, noting that it’s human nature to be uncomfortable with maintaining two conflicting beliefs at the same time. That discomfort leads us to try to reconcile the competing ideas we come across. Regardless of political leaning, we’re hesitant to accept new information that contradicts our existing worldviews.

One way we subconsciously avoid cognitive dissonance is through confirmation bias – a tendency to seek information that confirms what we already believe and discard information that doesn’t.

This human tendency was first exposed by psychologist Peter Wason in the 1960s in a simple logic experiment. He found that people tend to seek confirmatory information and avoid information that would potentially disprove their beliefs.

The concept of confirmation bias scales up to larger issues, too. For example, psychologists John Cook and Stephan Lewandowsky asked people about their beliefs concerning global warming and then gave them information stating that 97 percent of scientists agree that human activity causes climate change. The researchers measured whether the information about the scientific consensus influenced people's beliefs about global warming.

Those who initially opposed the idea of human-caused global warming became even less accepting after reading about the scientific consensus on the issue. People who had already believed that human actions cause global warming supported their position even more strongly after learning about the scientific consensus. Presenting these participants with factual information ended up further polarizing their views, strengthening everyone’s resolve in their initial positions. It was a case of confirmation bias at work: New information consistent with prior beliefs strengthened those beliefs; new information conflicting with existing beliefs led people to discredit the message as a way to hold on to their original position.

Just shouting louder isn’t going to help. Megaphone image via www.shutterstock.com.

Overcoming cognitive biases

How can science communicators share their messages in a way that leads people to change their beliefs and actions about important science issues, given our natural cognitive biases?

The first step is to acknowledge that every audience has preexisting beliefs about the world. Expect those beliefs to color the way they receive your message. Anticipate that people will accept information that is consistent with their prior beliefs and discredit information that is not.

Then, focus on framing. No message can contain all the information available on a topic, so any communication will emphasize some aspects while downplaying others. While it’s unhelpful to cherry-pick and present only evidence in your favor – which can backfire anyway – it is helpful to focus on what an audience cares about.

For example, these University of California researchers point out that the idea of climate change causing rising sea levels may not alarm an inland farmer dealing with drought as much as it does someone living on the coast. Referring to the impact our actions today may have for our grandchildren might be more compelling to those who actually have grandchildren than to those who don’t. By anticipating what an audience believes and what’s important to them, communicators can choose more effective frames for their messages – focusing on the most compelling aspects of the issue for their audience and presenting it in a way the audience can identify with.

In addition to the ideas expressed in a frame, the specific words used matter. Psychologists Amos Tversky and Daniel Kahneman first showed that when numerical information is presented in different ways, people think about it differently. Here's an example from their 1981 study:

Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimate of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. If Program B is adopted, there is ⅓ probability that 600 people will be saved, and ⅔ probability that no people will be saved.

Both programs have an expected value of 200 lives saved. But 72 percent of participants chose Program A. We reason about mathematically equivalent options differently when they’re framed differently: Our intuitions are often not consistent with probabilities and other math concepts.
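For the record, here is the expected-value arithmetic that makes the two programs equivalent:

```latex
% Expected number of lives saved under each program
\[
E[\mathrm{Program\ A}] = 200,
\qquad
E[\mathrm{Program\ B}] = \frac{1}{3}\cdot 600 + \frac{2}{3}\cdot 0 = 200.
\]
```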

Metaphors can also act as linguistic frames. Psychologists Paul Thibodeau and Lera Boroditsky found that people who read that crime is a beast proposed different solutions than those who read that crime is a virus – even if they had no memory of reading the metaphor. The metaphors guided people’s reasoning, encouraging them to transfer solutions they’d propose for real beasts (cage them) or viruses (find the source) to dealing with crime (harsher law enforcement or more social programs).

The words we use to package our ideas can drastically influence how people think about those ideas.

What’s next?

We have a lot to learn. Quantitative research on the efficacy of science communication strategies is in its infancy but becoming an increasing priority. As we continue to untangle more about what works and why, it’s important for science communicators to be conscious of the biases they and their audiences bring to their exchanges and the frames they select to share their messages.

TLDR Guide to Ch 5 of Communicating Science Effectively: A Research Agenda

Each day so far this week, I've shared my highlights of the National Academies of Sciences, Engineering, and Medicine's guide and research agenda for communicating science effectively (ch1, ch2, ch3, ch4). Today I'll cover the final chapter.


Chapter 5: Building the knowledge base for effective science communication

This chapter brings back a number of issues discussed in earlier chapters with a focus on how the science of science communication can continue to be more informative.

Scientific communications often have an underlying assumption that when communication is done well, the public’s understanding of and attitudes about societal issues will be affected. It seems like a reasonable assumption, but it has not been extensively tested, and there are likely many conditions under which the assumption is false. “Good” communication alone won’t suffice for many of science communicators’ goals.

Future steps for science communicators

The report calls for more partnerships between researchers and science communicators to put into practice the lessons revealed by research on science communication. These partnerships will also be important for furthering research on science communication and testing hypotheses about ideal communication practices.


I had never considered the possibility that science communication could be irrelevant for achieving end goals. I think science communicators generally believe that it's important for their messages to be communicated, and in many cases this is probably true, but it's worth considering the relative importance of science communication in creating change compared to all the other things that also matter.

Using a systems approach to guide research on science communication

In cognitive science, we’re often drawn to look at the cognition of a system. For example, we might not just look at neural activity in order to try to understand some cognitive process, but instead will consider the whole body, environment, and culture in which the cognitive act is situated. This report calls us to think about science communication similarly: every communicative effort is part of a larger system, encompassing the content being communicated, its format, the diverse organizations and individuals who make up the communicators and audiences, the channels of communication, and the political and social contexts that the communication takes place in. This kind of holistic perspective takes into account the system-wide complexity instead of focusing on isolated elements, since findings about elements in isolation may not hold in complex and realistic situations. Since research does often need to be specific to be productive, the report suggests that researchers who are focusing on a single level or element in the system should at least be “acutely aware” of the broader context.

communicate by johnny goldstein, CC

More research

We need more research that will inform best practices for communicating science. Some of this research should come in the form of randomized controlled field experiments, which compare conditions (for example, was strategy A more successful than strategy B?) across groups that are otherwise identical (participants are randomly assigned, so that people who received strategy A didn't differ systematically from those who received strategy B except in the strategy they received).
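As a toy illustration of what random assignment buys you in such a field experiment, here is a minimal sketch; the participant IDs, strategy labels, and outcome measure are all hypothetical, not from the report.

```python
# Toy sketch of a randomized field experiment comparing two communication
# strategies. Participant IDs, strategies, and the outcome are hypothetical.
import random
random.seed(42)

participants = [f"participant_{i}" for i in range(200)]
random.shuffle(participants)  # random assignment: groups differ only by chance

group_a = participants[:100]   # receives communication strategy A
group_b = participants[100:]   # receives communication strategy B

def measure_outcome(person, strategy):
    """Placeholder for a real outcome measure (e.g., a post-survey accuracy score)."""
    return random.gauss(0.7 if strategy == "A" else 0.6, 0.1)

mean_a = sum(measure_outcome(p, "A") for p in group_a) / len(group_a)
mean_b = sum(measure_outcome(p, "B") for p in group_b) / len(group_b)
print(f"strategy A mean outcome: {mean_a:.2f}, strategy B mean outcome: {mean_b:.2f}")
```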

The report also calls for more training for researchers at all career levels, both so that the science of science communication can continue to become more rigorous, and also so that all other scientists can improve the way they communicate about their own work.


Seriously, we can all get better. This report is long, but it has a lot of important points for science communicators, which I’ve tried to distill into this series of blog posts. For me, the report provides encouragement: there’s a lot we already know about ways to most effectively communicate science, and there’s a comprehensive agenda for continuing to improve.