Hurdles to Communicating Science & Strategies to Overcome Them

Communicating science is hard in part because doing and understanding science is hard, but there are also some unique hurdles that science communicators face — especially when communicating information that’s relevant for policies. James Druckman recently described some of the challenges that particularly face people communicating policy-relevant science, and ways those challenges can be minimized.

Value-laden diversity

We all have values, relatively unchanging beliefs that reflect the way we see the world. For example, some people are more “individualist,” while others are more “communitarian.” If scientific information seems to contradict a value, we’ll be hesitant to accept that information. Information about climate change, especially if it contains or implies suggestions for reducing the problem by increasing regulations on businesses, might contradict an individualist’s values, making it hard for that person to even consider the scientific information. For more on this hurdle, see Dan Kahan’s work on the science of science communication, an earlier post on this blog about Kahan’s work, or a great post by Chris Mooney on Mother Jones.

What to do about it

First, communicators have to recognize that their audience will have a diverse set of values, some of which will conflict with the communicator’s values, and that these values will influence the way people receive scientific information.

Next, communicators should minimize the extent to which their message contains value commentary. In other words, they should show how the relevant science bears on particular policy decisions without prescribing what a “good” or “competent” decision would be.

Hurdles start. by robert voors. CC BY-NC-ND.

Motivated Reasoning

Motivated reasoning (closely related to confirmation bias) is our drive to seek information that reinforces our prior beliefs and to disregard information that does not. For example, in work by Druckman & Bolsen (2011), participants initially indicated their support for genetically modified (GM) foods. After 10 days, all participants received three types of information: positive information about how GM foods combat diseases, negative information about their possible long-term health consequences, and neutral information about their economic consequences.

People who initially supported GM foods dismissed the negative information and rated the positive information as valid, and perceived the neutral information as indicating benefits of GM foods. People who were initially opposed to GM foods did the exact opposite: dismissed the positive information, considered the negative information as valid, and interpreted the neutral information as indicating drawbacks of GM foods. Work on motivated reasoning shows we interpret information through a lens laden with our prior beliefs.

Last of the crop. By Mrs eNil. CC BY-NC-ND.

There have been lots of great articles highlighting motivated reasoning lately. These include Why Facts don’t Change our Minds, This Article won’t Change your Mind, and Why You Think You’re Right, Even When You’re Wrong.

What to do about it

When motivated reasoning occurs, people are motivated to understand information in a way that aligns with their prior beliefs. Science communicators instead want to motivate their audience to understand new information in a way that leads to maximum accuracy. There are a few things communicators can do to encourage people to seek accurate understandings:

  • Show that the issue and information matter for the individual’s life. Show relevance.
  • Present information that comes from a variety of sources, preferably ones with different goals (e.g., from both Democratic and Republican sources).
  • Encourage people to explain their position to others (or at least to prepare to explain it). Elaborating on a position requires people to think it through more carefully and to provide explicit evidence for their claims that goes beyond “because I want to believe this.”

Politicization

This term does not mean what we might expect given its name. Politicization is exploiting “the inevitable uncertainties about aspects of science to cast doubt on the science overall…thereby magnifying doubts in the public mind” (Steketee 2010, p. 2). It’s not exactly misinformation, since it doesn’t introduce false findings; instead, it magnifies doubt. It’s especially common in debates about global warming and vaccination. People who politicize these issues send the message that the scientific evidence is not as conclusive as it’s been made out to be.

What to do about it

Politicization takes hold when people perceive scientists or other informants as lacking credibility and when they are motivated to reason in ways consistent with their prior beliefs. Thus, politicization can be countered by addressing those hurdles: establishing credibility and encouraging an accuracy motivation. There are a couple of other things we can do to overcome this hurdle as well:

  • Warn people of the politicization they’re likely to encounter before they encounter it. This is sometimes referred to as an inoculation message; it points out the strategies politicizers use and why their message is not to be trusted.
  • Correct politicized messages after people have encountered them. Corrections are often not as effective as inoculation messages since people may have already had time to process and begin to believe the politicized message. However, corrections can be effective when people are motivated to reach an accurate understanding.

There’s more

Of course, these aren’t the only hurdles to communicating policy-relevant science. Other hurdles Druckman describes that I haven’t elaborated on include the effort communicating requires of scientists, getting and maintaining attention, establishing credibility, and overcoming government inaction.

More and more scientists are recognizing the value of communicating their science outside the Ivory Tower. At the same time, the science of science communication is advancing to help us all understand the hurdles we face and how to best overcome them.


Scientists Agree on Climate Change: A Gateway Belief

 

https://climate.nasa.gov/scientific-consensus/

It doesn’t get much clearer. The Earth’s climate is warming. Humans are the reason. But how many people are actually aware of the scientific consensus on this issue?

Research by Sander van der Linden and colleagues shows that when people believe that scientists overwhelmingly agree about climate change, they increase (a) their own belief in climate change and (b) their belief that humans are responsible. They also feel (c) more worried about climate change, and as a result of (a), (b), and (c), they support public action to mitigate the effects of climate change.


At the beginning of the study, participants indicated the percentage of scientists they thought agree on global warming and answered some questions about their own climate change beliefs. People then received a message about scientific consensus, which took the form of (a) a simple description, (b) a pie chart, or (c) a metaphorical comparison to trusting engineers’ consensus about bridges (if 97% of engineers agreed that a bridge was unsafe, would you use it?) or doctors’ consensus about an illness. All the messages included the information that “97% of climate scientists have concluded that human-caused climate change is happening.”

Then participants again indicated what percentage of scientists they thought agree on global warming and answered questions about their own beliefs. All the messages “worked,” in the sense that people perceived greater scientific agreement after reading that 97% of scientists agree than if they hadn’t read anything about the consensus at all. The simple description and the pie chart were more effective than the metaphor, though: people shifted their climate change beliefs more after encountering one of the straightforward messages than after the more complex metaphor. Great food for thought, as many science communicators insert metaphors wherever they can.

Of course, having people believe that there’s strong scientific consensus about climate is only one step toward the larger goal of having them endorse actions that mitigate the effects of climate change. But in follow-up analyses, the researchers identified that perceiving scientific agreement is a gateway belief: believing that scientists agree about global warming led to other beliefs, ones that get us closer to the goal of actions in favor of mitigating climate change. Specifically, it led to greater belief that climate change was real, human-caused, and worrisome. These beliefs, in turn, led to greater support for public action against climate change. It’s often hard to know what leads to what, especially when it comes to beliefs we keep hidden in our own heads, but with some semi-fancy math, these researchers quantified those relationships.
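For the statistically curious: the “semi-fancy math” here is a mediation analysis (the paper fits a fuller path model). Below is a minimal sketch of the idea in Python; the simulated data and effect sizes are hypothetical, not the study’s.

```python
# Minimal mediation sketch: consensus -> belief -> support.
# Simulated data with made-up effect sizes, for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
consensus = rng.normal(size=n)                 # perceived scientific agreement
belief = 0.5 * consensus + rng.normal(size=n)  # belief that warming is real
support = 0.6 * belief + rng.normal(size=n)    # support for public action

# Path a: does the consensus message predict belief?
a = sm.OLS(belief, sm.add_constant(consensus)).fit().params[1]

# Path b: does belief predict support, holding consensus constant?
X = sm.add_constant(np.column_stack([consensus, belief]))
b = sm.OLS(support, X).fit().params[2]

print(f"indirect (gateway) effect a*b = {a * b:.2f}")  # roughly 0.5 * 0.6 = 0.30
```

The product of the two paths (a*b) is the “gateway” part: the portion of the consensus message’s effect on support that flows through the mediating beliefs.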

Climate 365 by NASA Goddard Space Flight Center. CC BY.

These studies have some clear takeaways for science communicators, especially those communicating about climate change (these ideas may apply to other topics too, but we need more research):

  • Emphasize scientific consensus, that an overwhelming percentage of scientists agree that climate change is a real problem caused by human activity.
  • Don’t worry so much about immediately pushing for public action against climate change. When people understand that scientists agree, they come to agree themselves that climate change is a problem that should be addressed, and THEN they come to support public action. Be careful about skipping steps.

At the same time, there’s not only one right way to communicate about climate change. There are truly effective ways, ineffective and potentially backfiring ways, and many in between. There aren’t cut-and-dried rules, because every audience is unique, and taking the audience into account — their beliefs, values, and past experiences, for example — is crucial. But this work sheds light on communication strategies that probably sit pretty far toward the “truly effective” end of the ways-to-communicate-climate-change continuum.

Reframing the war on science

America’s kind of tense right now. Leading up to and following the November 2016 election, there’s been a lot of talk of “the two Americas” and “the Divided States of America.” Americans are divided on a lot of issues, including scientific topics like vaccine safety and global warming. To many, it’s surprising that we disagree about these things, because according to the scientists who research these topics, there are no debates at all: vaccines do not cause autism, and humans are responsible for global warming.

At the same time, the current administration in the US has sent numerous messages that it devalues science (for example, by censoring scientists at organizations like the USDA and EPA and establishing a Committee on Vaccine Safety). Actions like these seem to be only fueling the divided science beliefs.

In response, many people have declared that we’re in a war on science: This idea is expressed in headlines like Facts are the reason science is losing the current war on reason, How the Anti-Vaxxers are Winning, and documentary titles like The Vaccine War. (There are so many pieces that talk about the war on science).

Scientific American

I’m a PhD student in Cognitive Science, a firm believer in the scientific method and in basing beliefs and actions on evidence. I highly value scientific funding, vaccinations, and measures that reduce the effects of climate change. As Americans, we have freedom of speech, and we should exercise that freedom to speak up when scientific knowledge and interests are being trampled on. I agree with the ideas expressed in blog posts like The War on Facts is a War on Democracy and I’m a Scientist. This is what I’ll Fight for, and with many of the ideas that continuously populate Twitter threads like #defendscience and #resist. But I’m much less enthusiastic about the widespread use of a war metaphor to get those ideas across.

Here’s why.

Metaphors shape thought

The metaphors we use to describe complex social problems actually shape the way we think about them. For example, when crime was described as a beast ravaging a town, people tended to suggest harsh law enforcement policies — similar to how they’d likely react to a literal beast ravaging their town. On the other hand, when that same crime was described as a virus, people suggested fewer harsh enforcement policies. Instead, they turned their focus to curing the town of problems that may underlie the crime, like improving education and welfare services.

People make inferences in line with the metaphors used to describe complex issues, so it’s important to reflect on what the war on science implies. It does have some helpful implications. Wars are serious, and often require urgent action. These are probably the messages that those who perpetuate the war on science want us to infer, even if not consciously.

But the war metaphor also suggests that there are enemies and casualties: two sides locked in combat, neither backing down until they win (or they’re decimated). I like this quote from A Gentleman in Moscow, a novel I just happened to be reading while working on this post: “After all, in the midst of armed conflicts, facts are bound to be just as susceptible to injury as ships and men, if not more so.” In other words, we sometimes do stupid things in wars. We shirk thoughtfulness and conscientiousness, and instead we just fight. As I see it, our current political situation (for lack of a better word) needs all the thoughtfulness and conscientiousness we can give it.

I recently expressed this concern in a conversation on Twitter.

The war metaphor challenges those who are not already on the “side” of science. It tells them they’re the enemy. When people feel they’re being attacked, even if only rhetorically, they’re likely to strengthen their stance and gear up to fight back. No matter how many scientists tweet about science or participate in the March for Science on Earth Day, people who have found themselves on the “anti-science” side of this war are not going to decide all of a sudden that climate change must be real after all, or that they should rush their kids to the pediatrician for overdue vaccines (especially if we tell them we’re marching to fight the war on science!). People who have already been labeled the enemy of science may as well go out and buy a new gas guzzler and decide that their kids are just fine without vaccines.

Others have already pointed out that actions like the science march risk isolating anti-science proponents and casting them as opponents (for example, see Daniel Engber’s piece for Slate and Robert Young’s in the New York Times). Using war metaphors only threatens to hammer that point home.

This just doesn’t seem productive. Image: Battle by Thomas Hawk. CC BY-NC 2.0

Alternative frames?

If we want to stop thinking about ourselves as engaged in a war on science, we need an alternative. Proponents of and believers in science are experiencing a sort of struggle, but it doesn’t have to be a fight between the left and right, Democrats and Republicans, Coastal Elite and Middle America. Maybe we can reframe the situation as a challenge that unites all humans. Science communicators want to share how important it is to address climate change and to have children vaccinated for the good of all people. We can all be on the same side, working to better the world we live in, and it’s important that we convey that message in our communications.

Referring to the movie Hidden Figures, NPR blogger Marcelo Gleiser points out that if there is a central lesson in the movie, it is that united we win; that what makes America great is not segregation and intolerance, but openness and inclusiveness.

I considered the possibility that guiding people to trust empirical evidence and the scientific process might be better framed as a puzzle — a challenge, no doubt, but at least everyone’s working toward a common goal.

Marisa made a really important point in that conversation. The peacekeeper in me would love a frame that emphasizes “hey, guys! We’re all in this together!”, but that ship may have already sailed. At this point, it’s important not to downplay the gravity of discrediting and distrusting science. This is not a game.

 

I’ve had quite a few conversations about the war on science, but I still don’t have a one-size-fits-all framing suggestion for talking about America’s divided beliefs about science. But when we’re considering talking about this issue as a war, it helps to step back and assess our goals and the potential consequences of the words we use.

Right now, there are deep social and political divides in American society — and though it’s crucial to stand up for what we believe in (especially science and facts!), we should be careful about taking up arms in a war on science that might deepen those divides. 

I welcome other comments on the framing of the war on science. Do you find the war frame helpful? Why? Are there other frames we could use to avoid deepening ideological divides?


Featured image: United States USA Flag by Mike Mozart. CC BY 2.0

Communicating climate change: Focus on the framing, not just the facts

How you package the information matters.
Frame image via www.shutterstock.com.

Rose Hendricks, University of California, San Diego

Humans are currently in a war against global warming. Or is it a race against global warming? Or maybe it’s just a problem we have to deal with?

If you already consider climate change a pressing issue, you might not think carefully about the way you talk about it – regardless of how you discuss it, you already think of global warming as a problem. But the way we talk about climate change affects the way people think about it.

For scientific evidence to shape people’s actions – both personal behaviors like recycling and choices on policies to vote for – it’s crucial that science be communicated to the public effectively. Social scientists have been increasingly studying the science of science communication, to better understand what does and does not work for discussing different scientific topics. It turns out the language you use and how you frame the discussion can make a big difference.

The paradox of science communication

“Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know,” writes Yale law professor Dan Kahan, a leading researcher in the science of science communication.

Kahan’s work shows that just because someone has scientific knowledge, he or she won’t necessarily hold science-supported beliefs about controversial topics like global warming, private gun possession or fracking.

Instead, beliefs are shaped by the social groups people consider themselves to be a part of. We’re all simultaneously members of many social groups – based, for example, on political or religious affiliation, occupation or sexuality. If people are confronted with scientific evidence that seems to attack their group’s values, they’re likely to become defensive. They may consider the evidence they’ve encountered to be flawed, and strengthen their conviction in their prior beliefs.

Unfortunately, scientific evidence does sometimes contradict some groups’ values. For example, some religious people trust a strict reading of the Bible: God said there would be four seasons, and hot and cold, so they don’t worry about the patterns in climate that alarm scientists. In cases like this one, how can communicators get their message across?

A growing body of research suggests that instead of bombarding people with piles of evidence, science communicators can focus more on how they present it. The problem isn’t that people haven’t been given enough facts. It’s that they haven’t been given facts in the right ways. Researchers often refer to this packaging as framing. Just as picture frames enhance and draw attention to parts of an image inside, linguistic frames can do the same with ideas.

One framing technique Kahan encourages is disentangling facts from people’s identities. Biologist Andrew Thaler describes one way of doing so in a post called “When I talk about climate change, I don’t talk about science.” Instead, he talks about things that are important to his audiences, such as fishing, flooding, farming, faith and the future. These issues that matter to the people with whom he’s communicating become an entry into discussing global warming. Now they can see scientific evidence as important to their social group identity, not contradictory to it.

Let me rephrase that

Metaphors also provide frames for talking about climate change. Recent work by psychologists Stephen Flusberg, Paul Thibodeau and Teenie Matlock suggests that the metaphors we use to describe global warming can influence people’s beliefs and actions.

Ready for combat?
Thomas Hawk, CC BY-NC

The researchers asked 3,000 Americans on an online platform to read a short fictional news article about climate change. The articles were identical except for the metaphors they used: one referred to the “war against” and another to the “race against” climate change. For example, each article included phrases about the U.S. seeking to either “combat” (war) or “go after” (race) excessive energy use.

After reading just one of these passages, participants answered questions about their global warming beliefs, like how serious global warming is and whether they would be willing to engage in more pro-environmental behaviors.

Metaphors mattered. Reading about the “war” against global warming led to greater agreement with scientific evidence showing it is real and human-caused. This group of participants indicated more urgency for reducing emissions, believed global warming poses a greater risk and responded that they were more willing to change their behaviors to reduce their carbon footprint than people who read about the “race” against global warming.

The only difference between the articles that participants read was the metaphors they included. Why would reading about a war rather than a race affect people’s beliefs about climate change in such important ways?

The researchers suggest that when we encounter war metaphors, we are reminded (though not always consciously) of other war-related concepts like death, destruction, opposition and struggle. These concepts affect our emotions and remind us of the negative feelings and consequences of defeat. With those war-related thoughts in mind, we may be motivated to avoid losing. If we have these war thoughts swimming around in our minds when we think about global warming, we’re more likely to believe it’s important to defeat the opponent, which, in this case, is global warming.

There are other analogies that are good at conveying the causes and consequences of global warming. Work by psychologists Kaitlin Raimi, Paul Stern and Alexander Maki suggests it helps to point out how global warming is similar to many medical diseases. For both, risks are often caused or aggravated by human behaviors; the processes are often progressive; they produce symptoms outside the normal range of past experiences; there are uncertainties in the prognosis of future events; treatment often involves trade-offs or side effects; it’s usually most effective to treat the underlying problem instead of just alleviating symptoms; and they’re hard to reverse.

People who read the medical disease analogy for climate change were more likely to agree with the science-backed explanations for global warming causes and consequences than those who read a different analogy or no analogy at all.

Golden past or rosy future?

Climate change messages can also be framed by focusing on different time periods. Social psychologists Matthew Baldwin and Joris Lammers asked people to read either a past-focused climate change message (like “Looking back to our nation’s past… there was less traffic on the road”) or a similar future-focused message (“Looking forward to our nation’s future… there is increasing traffic on the road”).

The researchers found that self-identified conservatives, who tend to resist climate change messages more than liberals, agreed that we should change how we interact with the planet more after reading the past-focused passage. Liberals, on the other hand, reported liking the future-focused frame better, but the frames had no influence on their environmental attitudes.

Example of a past-focused image (top) and a future-focused image (bottom) of a reservoir.
Image courtesy of NASA. Used in Baldwin and Lammers, PNAS December 27, 2016 vol. 113 no. 52 14953-14957.

And the frames didn’t have to be words. Conservatives also shifted their beliefs to be more pro-environmental after seeing past-focused images (satellite images that progressed from the past to today) more than after seeing future-focused ones (satellite images that progressed from today into the future). Liberals showed no differences in their attitudes after seeing the two frames.

Many climate change messages focus on the potential future consequences of not addressing climate change now. This research on time-framing suggests that such a forward-looking message may in fact be unproductive for those who already tend to resist the idea.

There’s no one-size-fits-all frame for motivating people to care about climate change. Communicators need to know their audience and anticipate their reactions to different messages. When in doubt, though, these studies suggest science communicators might want to bring out the big guns and encourage people to fire away in this war on climate change, while reminding them how wonderful the Earth used to be before our universal opponent began attacking full force.

Rose Hendricks, Ph.D. Candidate in Cognitive Science, University of California, San Diego

This article was originally published on The Conversation. Read the original article.

Past vs. Future Frames for Communicating Climate Change

Climate change (is it happening? how problematic is it? are humans responsible?) is a partisan issue. Work by Dan Kahan (which I’ve written about before) shows that conservatives are more likely than liberals to believe that climate change is not a result of human activity and that, left unaddressed, it will not be as destructive as many people claim. Researchers Matthew Baldwin & Joris Lammers explore the possibility that partisan differences in beliefs about climate change might result from differences in the way conservatives and liberals tend to think about time (their temporal focus).

Their starting point was previous research showing that conservatives focus more on the past than liberals do. They then tested two competing frames: one future-focused (“Looking forward to our nation’s future… there is increasing traffic on the road”) and one past-focused (“Looking back to our nation’s past… there was less traffic on the road”). Each participant read just one of these and then reported their attitudes about climate change and the environment. Conservatives reported liking the past-focused message better than the future-focused one, and they also reported more pro-environmental attitudes after the past-focused frame than after the future-focused one.


They replicated these findings in additional experiments with variations. For example, in one test, instead of using linguistic frames to draw attention to either the past or the future, they used satellite images, either showing a progression from the past to today or a forecasted progression from today to the future. Again, conservatives reported more pro-environmental attitudes after viewing past-focused images than future-focused ones.

Next they investigated the temporal focus that real environmental charities tend to use. Not surprisingly, they found that the charities’ messages disproportionately emphasize future consequences, with less focus on the past. Following up on this, they gave participants money that they could divide between two (fake) charities (one whose message was strongly past-focused and one whose message was strongly future-focused), or keep for themselves. Participants saw each charity’s logo and mission statement (the past-focused one read: “Restoring the planet to its original state”; the future-focused one: “Creating a new Earth for the future”).


Conservatives donated more to the past- than the future-oriented charity. Liberals did the opposite. Further, looking at just the past-oriented charity, conservatives donated more than liberals did; looking at just the future-oriented one, the opposite pattern emerged. This is a very beautiful interaction (plus the researchers ran a few other experiments with slightly varied methods and a meta-analysis, all of which add weight to these findings).
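For concreteness, here’s a minimal sketch of how one might test a crossed interaction like this. The 2×2 design mirrors the study, but the donation numbers are invented for illustration; this is not the paper’s data or analysis code.

```python
# Toy 2x2 interaction: ideology (conservative/liberal) x charity frame (past/future).
# All donation amounts are made up; only the crossed pattern mirrors the paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for ideology in ["conservative", "liberal"]:
    for frame in ["past", "future"]:
        matched = (ideology == "conservative") == (frame == "past")
        mean = 6.0 if matched else 4.0  # conservatives favor past, liberals future
        rows += [(ideology, frame, rng.normal(mean, 1.5)) for _ in range(100)]

df = pd.DataFrame(rows, columns=["ideology", "frame", "donation"])
fit = smf.ols("donation ~ ideology * frame", data=df).fit()
# A large interaction coefficient (about -4 here) is the crossed pattern.
print(fit.params["ideology[T.liberal]:frame[T.past]"])
```

If the two groups merely differed overall, the main effects would carry the story; the interaction term is what captures “conservatives and liberals respond to the frames in opposite directions.”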

Considering that climate change communications rely heavily on future-focused appeals, these findings should really make us pause. Is it possible that climate change issues themselves aren’t what divides conservatives and liberals so much as the way those issues are communicated? My intuition is that framing is not entirely to blame for conservatives’ and liberals’ divergent beliefs about climate change, but this work shows that it may be a big part of the story. It certainly won’t hurt for us communicators to start diversifying the temporal frames we use for discussing climate change.


For more consideration on this topic, see earlier posts: Climate change is a big problem and we need to find better ways of talking about it; Narratives for Communicating Climate Change; and The paradox of science communication and the new science to resolve it.

All figures from Baldwin, M., & Lammers, J. (2016). Past-focused environmental comparisons promote proenvironmental outcomes for conservatives. PNAS, 113(52), 14953–14957.

For a discussion of why the framing described in this paper might not be enough to change conservatives’ minds about climate change, see This one weird trick will not convince conservatives to fight climate change, by David Roberts for Vox.

Climate change is like a medical disease

I recently wrote for PLOS SciComm about a very cool study on the benefits of using analogies to talk about climate change (aptly called The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change). The researchers found that using any analogy (comparing climate change to a medical disease, a courtroom, or a natural disaster) was helpful, but that the medical disease analogy in particular helped people consider important aspects of climate change that often polarize people along political party lines.

Please check out the full piece here!

 

Becoming a better teacher: Fish is Fish

This summer, I’ll be the Instructor of Record (real teacher, not Teaching Assistant) for the first time. I’m teaching Research Methods, which is a “lower level” (mainly first- and second-year undergrads) course that I’ve TAed for twice, and I really enjoy its content. Because I’m participating in UCSD’s Summer Graduate Teaching Scholar Program, I have to complete a course called Teaching + Learning at the College Level. We’re two weeks in, and I’ve picked up some interesting nuggets from the readings and class discussions, but one analogy in particular is still on my mind.

We talked about the children’s story Fish is Fish by Leo Lionni. I’m kind of glad I never encountered this story when I was a kid, because its novelty had a great impact on 25-year-old me. The story is about a fish and a tadpole who wonder what life on land is like. Eventually the tadpole becomes a frog who can leave the water to learn about the land. He reports back to the fish, listing off the features of things on land. Cows, he says, have black spots, four legs, and udders. The frog describes birds and people too, and here’s what the fish imagines:

Image: The Eric Carle Museum

The fish and the frog are talking about the same things, and they assume they have common concepts of cows and birds and people, but their actual mental representations — what they see in their mind’s eye — are quite different. If the fish had been given a traditional paper-and-pencil test that asked him to define a cow, he’d be correct in writing that it has four legs, black spots, and udders. He’d ace the test, fooling not only the teacher but also himself into thinking he actually knows what a cow is.

The takeaway, of course, is to make sure your students aren’t fish. Find ways to lead them beyond their fishy cow concepts, which can be especially hard when they’ve never been on land and come to class knowing only what fish are like. Students almost always need foundational knowledge in order to understand a new concept, and there’s a good chance that at least some of the students in any class are missing that foundation. Instructors need to be mindful that there will be times when they have to step back to assess and teach prerequisite knowledge before launching into an hour-long lecture about cows (or cognition). And once they think the students actually know what cows are, it’s important to provide assessments that actually test understanding, not just memorization.

There are plenty of things I might not pull off perfectly when I teach for the first time this summer, but I do feel confident that I’ll at least be on a quest to help the fish in the class become frogs so they can see what cows are really like.

Teachers of all levels and subjects: I invite you to share how you make sure your students are truly understanding and not simply parroting. How do you make sure their concepts of the cow are really cow-like, and not just fish with spots and udders?

The paradox of science communication and the new science to resolve it

I recently discovered this interesting paper by Yale Law Professor Dan Kahan: What is the “science of science communication”? He introduces the concept of the science communication paradox: “Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know.” This figure demonstrates the science communication paradox, showing strong disagreements about different risks for people with different political beliefs:

According to Kahan, resolving the science communication paradox is the main goal of the new science of science communication. He lays out two potential explanations for the paradox:

  • Public Irrationality Thesis (PIT): Advocates of this position believe that the public, on the whole, is not scientifically literate and does not think about risk the way scientists do.

If PIT is correct, then as people acquire more scientific literacy, their views on different risks should align more with scientists’ views. This is not what actually happens:

With increasing science comprehension scores, people actually become more polarized along party lines in their beliefs about the risk of climate change (shown in more detail in an earlier publication by the same group). Scientific literacy does not necessarily translate into views aligned with scientists’.

  • Cultural Cognition Thesis (CCT): The alternative explanation for the science communication paradox, which suggests that our group identities fundamentally shape how we think about risk. Kahan gives the analogy of sports fans of two opposing teams: the opposing fans are likely to actually see a replay of a questionable call differently, each in favor of their own team. Along these lines, when people feel their group’s stance is being threatened, they’re more likely to see evidence as confirming their belief, or to discount the source if it contradicts what they want to believe. This account is much more consistent with the real-world data (for example, the previous graph) and with the experimental work he and his collaborators have done.

It’s important to note that risk assessments of most science issues do not demonstrate the science communication paradox: people with more science knowledge don’t necessarily become more polarized in their beliefs. For a number of issues, science intelligence predicts people’s risk assessments better than their political leanings do.


How can we use awareness of the cultural cognition thesis to improve science communication?

Kahan suggests the disentanglement principle: if polarization results from a clash between people’s identities as members of a cultural group and the scientific facts they’re encountering, we should work to separate the two. For example, a true/false question might state: “Human beings, as we know them today, developed from earlier species of animals.” Someone who belongs to a religious group that doesn’t support evolution has to choose between answering this question in a way that’s consistent with scientific consensus OR with their group identity. But rewording the statement to something like “According to the theory of evolution, human beings, as we know them today, developed from earlier species of animals” takes away the conflict. The respondent can now demonstrate scientific knowledge by agreeing with the statement without jeopardizing their identity as part of a group that doesn’t believe in evolution.


While the original statement leads to polarized views (as science knowledge increases, religious and non-religious people’s responses diverge more), the reworded framing shows converging responses (scientific knowledge, rather than religious belief, becomes the best predictor of correct responses).

In a great blog post for Southern Fried Science, Andrew Thaler shares that he talks about a lot of things when he talks about climate change, but science isn’t one of them. He talks about fishing, flooding, farming, faith, and the future. These are things that his audiences know deeply, and climate change is relevant to anyone with interest in any of his f’s. He provides a great example of disentangling people’s identities from the scientific issue, and instead actually uses their identities to show the issue’s relevance.

The disentanglement principle offers one way that the new science of science communication begins to reduce the paradox of science communication, but it’s just one drop in a huge pond of paradox. We have to keep working on ways to communicate information that conflicts with people’s cultural identities, and as this work shows, jamming information down people’s throats isn’t the way to close belief gaps.


Featured image: Communication by Joan M. Mas. CC.

All figures from Kahan, D. (2015). What is the “science of science communication”? Journal of Science Communication, 14(3).

Narratives for communicating climate change

Last week I wrote about work by UC researchers on framing climate change, a chapter that focuses on how we can harness our understanding of human psychology — how we learn, think, and behave — to communicate science better. Here’s another paper (one that’s gotten very popular, very quickly) that considers human cognition for the efficacy of communicating about climate change.

Narrative Style Influences Citation Frequency in Climate Change Science

The authors of this paper (Ann Hillier, Ryan Kelly, & Terrie Klinger, all from the University of Washington) started with the insight from psychology that people understand and remember story-like (narrative) writing better than explanatory (expository) writing. They considered abstracts from 802 scientific papers about climate change, and looked for six markers of narrative structure:
1) description of setting (where/when the events took place)
2) narrative perspective (the presence of a narrator)
3) sensory language (appealing to the senses or emotions)
4) conjunctions (used often in narratives to connect events logically)
5) connectivity (phrases that create explicit links to something mentioned earlier in the text)
6) appeal (whether the text makes an appeal to the reader or a recommendation for specific action)

The authors crowdsourced this first part of their data analysis: non-scientists using an online job platform (crowdflower.com) were given the authors’ instructions for analyzing the abstracts. This way, each abstract was analyzed by 7 independent people, and the analysis involved human interpretation and discretion, which likely provides a more accurate index of narrativity than any computerized method can at the moment.
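As a point of contrast, here’s what a crude automated stand-in for a few of those markers might look like; it should make clear why human raters were preferred (keyword matching can’t really capture sensory language or an appeal to the reader). This is an invented illustration, not the study’s method.

```python
# Crude keyword-based stand-in for three of the six narrativity markers
# (narrative perspective, conjunctions, connectivity). Illustration only.
FIRST_PERSON = {"i", "we", "our", "my"}
CONJUNCTIONS = {"and", "but", "so", "then", "because", "while"}
CONNECTIVES = {"this", "these", "those", "such"}

def crude_markers(abstract: str) -> dict:
    words = abstract.lower().replace(",", " ").replace(".", " ").split()
    return {
        "narrative_perspective": any(w in FIRST_PERSON for w in words),
        "conjunctions": sum(w in CONJUNCTIONS for w in words),
        "connectivity": sum(w in CONNECTIVES for w in words),
    }

print(crude_markers("We tracked the reef for a decade, and then the bleaching began."))
# {'narrative_perspective': True, 'conjunctions': 2, 'connectivity': 0}
```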

The authors used how many times each paper had been cited by others as a reflection of how much impact it had on subsequent science. They found that 4 of their 6 narrative indicators (sensory language, conjunctions, connectivity, and appeal to the reader) were related to how frequently articles were cited. In other words, papers higher in narrativity were cited more often than those that were more expository.

Subset of Figure 1, showing that as articles increase in narrativity, their citations increase as well.
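The shape of that citation analysis can be sketched in a few lines; the numbers below are fabricated, and the real study related six human-rated indicators to citation counts with more careful modeling.

```python
# Toy version of the narrativity-citations relationship, with fabricated data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n_papers = 802
narrativity = rng.uniform(0, 1, n_papers)          # 0 = expository, 1 = narrative
citations = rng.poisson(lam=5 + 20 * narrativity)  # counts that rise with narrativity

rho, p = spearmanr(narrativity, citations)
print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")   # a positive, noisy association
```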

The more citations a paper receives, the more other researchers will see the work. It’s possible that higher-quality work lends itself better to a narrative style, so papers high in narrativity would also be cited often. Since this study is correlational, we have no way of ruling out the possibility that the best science is conducive to narrative presentation and would be cited a lot regardless of its style, just because it’s good research. The causal arrow is not clear here, but it is clear that impactful research tends to take on a narrative structure. Even though narrative writing doesn’t necessarily lead to citations, imitating the style of papers that are cited often doesn’t seem like a bad idea.

This work is not the first to suggest that narratives can be helpful for understanding climate change. FrameWorks Institute, a nonprofit organization that designs ways to communicate complex issues and tests their efficacy for cognitive and behavior changes, has a toolkit that uses (visual) narratives to communicate about climate change. (Also note that the toolkit is just the tip of the iceberg for the extensive work FrameWorks has done on communicating climate change.)

Together, the work by FrameWorks and the study of narrativity and citations present a pretty clear takeaway for climate scientists (and likely scientists in many fields): ease off the traditional academic expository style and lean into a more understandable and memorable narrative style.


For an interesting (and more critical) take on this paper, see this post by Randy Olson at scienceneedsstory.com.

Getting a scientific message across means taking human nature into account

I really enjoyed thinking about, researching, and writing this piece for The Conversation, where this work was originally published.


Rose Hendricks, University of California, San Diego

We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. There are too many important issues that science has reached a consensus on that the public has not.

Scientists and the media need to communicate more science and communicate it better. Good communication ensures that scientific progress benefits society, bolsters democracy, weakens the potency of fake news and misinformation and fulfills researchers’ responsibility to engage with the public. Such beliefs have motivated training programs, workshops and a research agenda from the National Academies of Sciences, Engineering, and Medicine on learning more about science communication. A resounding question remains for science communicators: What can we do better?

A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model.”

But in reality, just knowing facts doesn’t necessarily guarantee that one’s opinions and behaviors will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Or they read an online article by a scientist about the necessity of vaccines, but leave comments expressing outrage that doctors are trying to further a pro-vaccine agenda. Convincing people that scientific evidence has merit and should guide behavior may be the greatest science communication challenge, particularly in our “post-truth” era.

Luckily, we know a lot about human psychology – how people perceive, reason and learn about the world – and many lessons from psychology can be applied to science communication endeavors.

Consider human nature

Regardless of your religious affiliation, imagine that you’ve always learned that God created human beings just as we are today. Your parents, teachers and books all told you so. You’ve also noticed throughout your life that science is pretty useful – you especially love heating up a frozen dinner in the microwave while browsing Snapchat on your iPhone.

One day you read that scientists have evidence for human evolution. You feel uncomfortable: Were your parents, teachers and books wrong about where people originally came from? Are these scientists wrong? You experience cognitive dissonance – the uneasiness that results from entertaining two conflicting ideas.

It’s uncomfortable to hold two conflicting ideas at the same time. Man image via www.shutterstock.com.

Psychologist Leon Festinger first articulated the theory of cognitive dissonance in 1957, noting that it’s human nature to be uncomfortable with maintaining two conflicting beliefs at the same time. That discomfort leads us to try to reconcile the competing ideas we come across. Regardless of political leaning, we’re hesitant to accept new information that contradicts our existing worldviews.

One way we subconsciously avoid cognitive dissonance is through confirmation bias – a tendency to seek information that confirms what we already believe and discard information that doesn’t.

This human tendency was first exposed by psychologist Peter Wason in the 1960s in a simple logic experiment. He found that people tend to seek confirmatory information and avoid information that would potentially disprove their beliefs.

The concept of confirmation bias scales up to larger issues, too. For example, psychologists John Cook and Stephen Lewandowsky asked people about their beliefs concerning global warming and then gave them information stating that 97 percent of scientists agree that human activity causes climate change. The researchers measured whether the information about the scientific consensus influenced people’s beliefs about global warming.

Those who initially opposed the idea of human-caused global warming became even less accepting after reading about the scientific consensus on the issue. People who had already believed that human actions cause global warming supported their position even more strongly after learning about the scientific consensus. Presenting these participants with factual information ended up further polarizing their views, strengthening everyone’s resolve in their initial positions. It was a case of confirmation bias at work: New information consistent with prior beliefs strengthened those beliefs; new information conflicting with existing beliefs led people to discredit the message as a way to hold on to their original position.

Just shouting louder isn’t going to help. Megaphone image via www.shutterstock.com.

Overcoming cognitive biases

How can science communicators share their messages in a way that leads people to change their beliefs and actions about important science issues, given our natural cognitive biases?

The first step is to acknowledge that every audience has preexisting beliefs about the world. Expect those beliefs to color the way they receive your message. Anticipate that people will accept information that is consistent with their prior beliefs and discredit information that is not.

Then, focus on framing. No message can contain all the information available on a topic, so any communication will emphasize some aspects while downplaying others. While it’s unhelpful to cherry-pick and present only evidence in your favor – which can backfire anyway – it is helpful to focus on what an audience cares about.

For example, these University of California researchers point out that the idea of climate change causing rising sea levels may not alarm an inland farmer dealing with drought as much as it does someone living on the coast. Referring to the impact our actions today may have for our grandchildren might be more compelling to those who actually have grandchildren than to those who don’t. By anticipating what an audience believes and what’s important to them, communicators can choose more effective frames for their messages – focusing on the most compelling aspects of the issue for their audience and presenting it in a way the audience can identify with.

In addition to the ideas expressed in a frame, the specific words used matter. Psychologists Amos Tversky and Daniel Kahneman first showed that when numerical information is presented in different ways, people think about it differently. Here’s an example from their 1981 study:

Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimate of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. If Program B is adopted, there is ⅓ probability that 600 people will be saved, and ⅔ probability that no people will be saved.

Both programs have an expected value of 200 lives saved. But 72 percent of participants chose Program A. We reason about mathematically equivalent options differently when they’re framed differently: Our intuitions are often not consistent with probabilities and other math concepts.
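To spell out the arithmetic behind “mathematically equivalent,” here’s the expected-value calculation, using only the numbers from the vignette:

```python
# Expected lives saved under each program in the Tversky & Kahneman vignette
expected_a = 200                      # Program A: 200 people saved for certain
expected_b = (1/3) * 600 + (2/3) * 0  # Program B: 1/3 chance all 600 are saved
print(expected_a, expected_b)         # 200 200.0 -- equal in expectation
```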

Metaphors can also act as linguistic frames. Psychologists Paul Thibodeau and Lera Boroditsky found that people who read that crime is a beast proposed different solutions than those who read that crime is a virus – even if they had no memory of reading the metaphor. The metaphors guided people’s reasoning, encouraging them to transfer solutions they’d propose for real beasts (cage them) or viruses (find the source) to dealing with crime (harsher law enforcement or more social programs).

The words we use to package our ideas can drastically influence how people think about those ideas.

What’s next?

We have a lot to learn. Quantitative research on the efficacy of science communication strategies is in its infancy but becoming an increasing priority. As we continue to untangle more about what works and why, it’s important for science communicators to be conscious of the biases they and their audiences bring to their exchanges and the frames they select to share their messages.