Vaccinating, metaphorically and literally

There's a lot of bad (either misleading or blatantly false) science information on the Internet. Science communicators often try to combat the bad content by dumping as much accurate information as they can into the world, but that strategy is not as effective as many would hope. One reason is that social circles on the Internet are echo chambers: people tend to follow like-minded others. Scientists and science communicators follow each other, and skeptics follow each other, so we rarely even hear what people outside our circle are talking about. Plus, when we do encounter evidence that contradicts our beliefs, we tend to discount it and keep believing what we already did.

A recent study by Sander van der Linden, Anthony Leiserowitz, Seth Rosenthal, & Edward Maibach (which I recently wrote about) offers a glimmer of hope for escaping this science communication trap: communicators may be able to "vaccinate" their audiences against misinformation. The researchers found that if people are cued in to the kinds of tactics that opponents of climate science deploy, they're less likely to be swayed by those messages. This finding offers some hope at a time when the proliferation of fake and misleading science information seems inevitable. Scientific facts, along with a heads-up about anti-scientific strategies, can help people better evaluate the information they receive and form evidence-based beliefs and decisions.

Does this apply to other scientific issues? Can we vaccinate against anti-vaccination rhetoric?

I don't know. But I'd like to find out. To design a communication that alerts people to the anti-vaccine messages they might encounter, it's important to understand anti-vaccine tactics. I explored some very passionate corners of the Internet (videos, discussion threads, and blog posts by anti-vaccine proponents) for a better understanding. Here are the anti-vaccine tactics I found, many of which are also described in this SciShow video:

Ethos: Appeal to Authority

First, note that this immunologist isn't explicitly saying that children shouldn't be vaccinated. But the quote implies as much. I don't know if that's actually her belief, but regardless, as a consumer of this image, I do get the sense that she looks pretty smart (#educated, in fact), and maybe she knows what she's talking about…

Jargon

[Screenshot: an anti-vaccine ad listing chemical ingredients found in vaccines]

There are four chemical names in the first five lines of the ad above. It sounds like whoever wrote it must really know their science. The message implies that the author has deep scientific knowledge about the chemicals mentioned and wants to warn you of their presence in vaccines. Paired with our society's tendency to treat all things "natural" as good and all things "chemical" as enemies, this jargon-wielding author might come across as someone worth listening to. Most of us (and I am definitely included here) don't know much, if anything, about those chemicals: how do they work? Are they actually dangerous at the doses found in vaccines? The jargon paves the way for persuasion through the naturalistic fallacy, the idea that all natural things are better than non-natural things.

Logos: Appeal to “Logic”

Logical fallacies

A logical fallacy is faulty logic disguised as sound logic, and it's another common tactic among anti-vaccine proponents. In the Tweet above, the author presents two facts and implies that they're connected: that an increase in the number of mandatory vaccines caused America's fall from 20th to 37th in the worldwide ranking of infant mortality rates. But losing ground in a ranking doesn't necessarily mean our mortality rate went up; it's just as likely that many other nations' rates went down. And infant mortality is influenced by many factors besides the number of mandatory vaccines; the Tweet supplies no evidence that vaccines and mortality are related at all. They're just two pieces of information, placed next to each other to create the impression of a causal relationship.
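To see why a falling rank doesn't imply a rising rate, here's a toy numerical sketch. The numbers are invented purely for illustration; they are not real mortality statistics.

```python
# Toy illustration (invented numbers): a country's infant mortality rate can stay
# flat while its worldwide ranking gets worse, simply because other countries improve.
rates_2000 = {"Country A": 7.0, "Country B": 8.5, "Country C": 9.0, "USA": 8.0}
rates_2015 = {"Country A": 4.0, "Country B": 5.0, "Country C": 6.0, "USA": 8.0}

def rank(rates, country):
    """Rank 1 = lowest infant mortality rate."""
    ordered = sorted(rates, key=rates.get)
    return ordered.index(country) + 1

print(rank(rates_2000, "USA"))  # 2 -- second-best of the four
print(rank(rates_2015, "USA"))  # 4 -- last of the four, though its own rate never changed
```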

There are lots of ways logic can be distorted to suggest that vaccines are bad. One that really stands out to me is the suggestion that if vaccines work, we shouldn't care whether some children go unvaccinated. After all, they'll be the ones who get sick… why does it concern the rest of us?

It does. For one, no child should end up with a paralyzing or fatal disease because their parent chose to disregard scientific consensus. And one person's choice not to vaccinate directly affects others, for example, people who CAN'T be vaccinated for health reasons. If everyone else receives vaccines, that one person who cannot is still protected thanks to "community immunity." But if others stop receiving those vaccines, the person who had no choice but to remain unvaccinated becomes susceptible. That person is unjustly put in danger by others' choices.
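To give a rough sense of why high coverage matters, the simplest epidemiological models put the "community immunity" threshold at 1 - 1/R0, where R0 is the number of people one contagious person would infect in a fully susceptible population. The sketch below is a back-of-the-envelope illustration using approximate, commonly cited R0 values; it is not a forecasting model.

```python
# Back-of-the-envelope sketch: in the simplest homogeneous-mixing model, the
# herd-immunity ("community immunity") threshold is 1 - 1/R0.
# R0 values below are rough, commonly cited estimates, not precise figures.
r0_estimates = {"measles": 15, "pertussis": 14, "polio": 6}

for disease, r0 in r0_estimates.items():
    threshold = 1 - 1 / r0  # fraction of the population that must be immune
    print(f"{disease}: roughly {threshold:.0%} of people need to be immune")

# Approximate output: measles ~93%, pertussis ~93%, polio ~83%.
# A dip below these coverage levels leaves people who can't be vaccinated exposed.
```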

Pathos: Appeal to Emotion

Fear

Fear is a powerful motivator, and appeals to ethos and logos can work together to produce an emotional effect. Parents just want to do what's best for their kids, so messages that stir up fears about the harms of vaccines have a good chance of swaying them.

One way of drumming up fear is to portray vaccine proponents as bullies, as this article demonstrates:

[Screenshot: an article portraying vaccine proponents as bullies]

Yeah, that description sounds pretty scary to me… Source: Breakdown Radio.

Considerations for inoculation messages

Of course, I've just scratched the surface of the tactics that anti-vaccine proponents use (you can get an idea of some others in a post on how the anti-vax movement uses psychology to endanger us by Dr. Doom). Messages that vaccinate against misconceptions have to walk an extremely fine line. The goal of such a message is to foreshadow misleading messages a person may encounter and to point out the reasons those messages should be reconsidered.

These inoculation messages might be useful when they introduce new information, but they also need to be proactive, anticipating anti-vaccine rhetoric and alerting people to its flaws. There are a few dangers in doing so, though. For one, it often requires repeating the misconception, and research shows that doing so can backfire and reinforce the inaccuracy instead. In addition, pointing out flaws in an argument that someone might be prone to believing can alienate that person. If the warning message isn't constructed conscientiously (for example, if it suggests that seeing through the misleading information is a no-brainer), it can imply that anyone who might believe the misconceptions is an idiot. A message like this will make some members of the audience feel defensive (wow, am I an idiot? No, I can't be an idiot. Maybe the author of this message is the idiot…).

That doesn’t mean that inoculation messages can’t be effective. We have some evidence to suggest they can, and I think there’s a lot of room to continue honing this strategy. The first step in a successful inoculation message is to uncover the tactics used by those who misrepresent the science. Then it’s important to raise awareness of those tactics without alienating the audience and while being careful not to repeat the misinformation in a way that can be construed as reinforcing it.

Communicators can keep in mind that anti-vaccine messages often attempt to establish authority, tap into emotions, and apply misleading logic in order to convince people of their message. By anticipating these strategies, we can have greater success in counteracting them and promoting vaccines as the life-saving technologies they are.

More information

 


The Pope’s #scicomm: Effects of Laudato si’ on beliefs about climate change

Climate change is an extremely polarized issue: while many people accept the scientific evidence that human-caused climate change is damaging the planet and our health, many others adamantly maintain that it is not a problem. Figuring out how to communicate the gravity of climate change has been an urgent puzzle for climate scientists and communicators (a topic I've written quite a bit about).

Collectively, we're trying many different ways of communicating this issue. I especially love these videos by climate scientist Katharine Hayhoe and others by researcher M. Sanjayan with the University of California and Vox. Pope Francis also contributes to the scicomm effort: in 2015 he published an encyclical called Laudato si': On Care for Our Common Home, which called for global action on climate change (he also gave a copy of the encyclical to Donald Trump when the two met recently).

Was Laudato si’ effective?

Did the document influence beliefs about the seriousness of climate change and its effects on the poor? Recent research by Asheley Landrum and colleagues took up this question.

The work is based on survey results from Americans — the same people reported their beliefs about climate change before and after the encyclical came out.

They found that the encyclical did not directly affect people’s beliefs about the seriousness of climate change or its disproportionate effects on the poor.

But… the encyclical did affect people's views of the pope's credibility on climate change, encouraging them to see him as more of an authority after the document was published than before. This was especially true for liberals, though, reflecting a sort of echo chamber effect: people who already considered climate change a serious issue gave the pope more credit for his stances on climate change after he published the encyclical.

Importantly, these altered views of the pope’s credibility did in turn affect how much people agreed with the pope’s message on climate change. In other words, there wasn’t a direct effect from the publication of the encyclical to agreement with its message; instead, there was first an effect of the document on beliefs about the pope’s credibility, and then an effect of those credibility assessments on agreement with the pope’s message.


This work reminds us that science communication efforts can't be considered in isolation. Whether people agree with a message is influenced by factors like their political beliefs and the credibility of the source. This point suggests two directions for future scicomm. First, communicators should do their best to consider their message and audience holistically: what factors are likely to shape an audience's receptiveness to a message, and how can those factors be addressed? Second, we need more research on the science of science communication, so we can keep working to understand how people perceive scientific issues and communicators, and how they respond to the scicomm they encounter.


Featured Image: Korea.net / Korean Culture and Information Service (Jeon Han)

Hurdles to Communicating Science & Strategies to Overcome Them

Communicating science is hard in part because doing and understanding science is hard, but there are also some unique hurdles that science communicators face — especially when communicating information that’s relevant for policies. James Druckman recently described some of the challenges that particularly face people communicating policy-relevant science, and ways those challenges can be minimized.

Value-laden diversity

We all have values, relatively unchanging beliefs that reflect the way we see the world. For example, some people are more "individualist," while others are more "communitarian." If scientific information seems to contradict a value, we'll be hesitant to accept that information. Information about climate change, especially if it contains or implies suggestions for reducing the problem by increasing regulations on businesses, might contradict an individualist's values, making it hard for that person to even consider the scientific information. For more on this hurdle, see Dan Kahan's work on the science of science communication, an earlier post on this blog about Kahan's work, or a great post by Chris Mooney at Mother Jones.

What to do about it

First, communicators have to recognize that their audience will have a diverse set of values, some of which will conflict with the communicator’s values, and that these values will influence the way people receive scientific information.

Next, communicators should minimize the extent to which their message carries a value commentary. In other words, they should show how the relevant science bears on certain policy decisions without prescribing what counts as a "good" or "competent" decision.

Hurdles start, by robert voors. CC BY-NC-ND.

Motivated Reasoning

Motivated reasoning (or confirmation bias) is our drive to seek information that reinforces our prior beliefs and to disregard information that does not. For example, in work by Druckman & Bolsen (2011), participants initially indicated their support for genetically modified (GM) foods. After 10 days, all participants received three types of information: positive information about how GM foods combat diseases, negative information about their possible long-term health consequences, and neutral information about their economic consequences.

People who initially supported GM foods dismissed the negative information and rated the positive information as valid, and perceived the neutral information as indicating benefits of GM foods. People who were initially opposed to GM foods did the exact opposite: dismissed the positive information, considered the negative information as valid, and interpreted the neutral information as indicating drawbacks of GM foods. Work on motivated reasoning shows we interpret information through a lens laden with our prior beliefs.

Last of the crop, by Mrs eNil. CC BY-NC-ND.

There have been lots of great articles highlighting motivated reasoning lately. These include Why Facts don’t Change our Minds, This Article won’t Change your Mind, and Why You Think You’re Right, Even When You’re Wrong.

What to do about it

When motivated reasoning occurs, people are motivated to understand information in a way that aligns with their previous beliefs. Instead, science communicators want to motivate their audience to understand new information in a way that will lead to maximum accuracy. There are a few things communicators can do to encourage people to seek accurate understandings:

  • Show that the issue and information matter for the individual’s life. Show relevance.
  • Present information that comes from a variety of sources, preferably ones with different goals (e.g., from both Democratic and Republican sources)
  • Encourage people to explain their position to others (or at least to prepare to explain it). Elaborating on a position requires people to think it through more carefully and to provide explicit evidence for their claims, beyond "because I want to believe this."

Politicization

This term does not mean what we might expect given its name. Politicization means exploiting "the inevitable uncertainties about aspects of science to cast doubt on the science overall…thereby magnifying doubts in the public mind" (Steketee 2010, p. 2). It's not exactly misinformation, since it doesn't introduce false findings; instead, it magnifies doubt. It's especially common around issues like global warming and vaccination. People who politicize these issues send the message that the scientific evidence is not as conclusive as it's been made out to be.

What to do about it

Politicization gains traction when people perceive scientists or other informants as lacking credibility and when they are motivated to reason in ways consistent with their prior beliefs. Thus, politicization can be countered by addressing those hurdles: establishing credibility and encouraging an accuracy motivation. There are a couple of other things we can do to overcome this hurdle:

  • Warn people of politicization they're likely to encounter before they encounter it. This is sometimes referred to as an inoculation message, and it points out the strategies politicizers use and why their message is not to be trusted.
  • Correct politicized messages after people have encountered them. Corrections are often not as effective as inoculation messages since people may have already had time to process and begin to believe the politicized message. However, corrections can be effective when people are motivated to reach an accurate understanding.

There’s more

Of course, these aren't the only hurdles to communicating policy-relevant science. Other hurdles described by Druckman that I haven't elaborated on include the effort it demands of scientists, getting and maintaining attention, establishing credibility, and overcoming government inaction.

More and more scientists are recognizing the value of communicating their science outside the Ivory Tower. At the same time, the science of science communication is advancing to help us all understand the hurdles we face and how to best overcome them.

Scientists Agree on Climate Change: A Gateway Belief

 

[Screenshot: scientific consensus graphic from https://climate.nasa.gov/scientific-consensus/]

It doesn’t get much clearer. The Earth’s climate is warming. Humans are the reason. But how many people are actually aware of the scientific consensus on this issue?

Research by Sander van der Linden and colleagues shows that when people believe that scientists overwhelmingly agree about climate change, they increase (a) their own belief that climate change is happening and (b) their belief that humans are responsible, and they feel (c) more worried about climate change. As a result of a, b, and c, they become more supportive of public action to mitigate the effects of climate change.

[Figure: the gateway belief model, from van der Linden et al.'s PLOS ONE paper]

At the beginning of the study, participants estimated the percentage of scientists they thought agree on global warming and answered some questions about their own climate change beliefs. Participants then received a message about scientific consensus, which took the form of either (a) a simple description, (b) a pie chart, or (c) a metaphorical comparison to trusting engineers' consensus about bridges (if 97% of engineers agreed that a bridge was unsafe, would you use it?) or doctors' consensus about an illness. All the messages included the information that "97% of climate scientists have concluded that human-caused climate change is happening."

Then participants again indicated what percentage of scientists they thought agree on global warming and answered questions about their own beliefs. All the messages "worked," in the sense that people perceived greater scientific agreement after reading that 97% of scientists agree than they did without any consensus message, though the simple description and the pie chart were more effective than the metaphor: people shifted their beliefs more after encountering one of the straightforward messages than after the more complex metaphor. That's good food for thought, since many science communicators insert metaphors wherever they can.

Of course, having people believe that there's strong scientific consensus about climate change is only one step toward the larger goal of having them endorse actions that mitigate its effects. But in follow-up analyses, the researchers identified perceiving scientific agreement as a gateway belief: believing that scientists agree about global warming led to other beliefs, ones that get us closer to the goal of action on climate change. Specifically, it led to greater belief that climate change is real, human-caused, and worrisome. Those beliefs, in turn, led to greater support for public action against climate change. It's often hard to know what leads to what, especially when it comes to beliefs we keep hidden in our own heads, but with some semi-fancy math (a mediation analysis), these researchers quantified those relationships.
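For readers curious what that math looks like, here's a minimal sketch of a regression-based mediation test on simulated data. This is not the authors' actual model (they fit a more complete mediation model to real survey responses); the variable names and numbers below are invented purely for illustration.

```python
# Minimal sketch of a regression-based mediation analysis on simulated data.
# Not the authors' actual model; variable names and numbers are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000

msg = rng.integers(0, 2, n)                                      # 0 = no message, 1 = consensus message
perceived_consensus = 60 + 15 * msg + rng.normal(0, 10, n)       # the "gateway" belief
support = 2 + 0.05 * perceived_consensus + rng.normal(0, 1, n)   # support for public action

df = pd.DataFrame({"msg": msg,
                   "perceived_consensus": perceived_consensus,
                   "support": support})

# Path a: does the message shift the gateway belief?
path_a = smf.ols("perceived_consensus ~ msg", data=df).fit()
# Path b: does the gateway belief predict support, controlling for the message?
path_b = smf.ols("support ~ perceived_consensus + msg", data=df).fit()

a = path_a.params["msg"]
b = path_b.params["perceived_consensus"]
print(f"a (message -> perceived consensus): {a:.2f}")
print(f"b (perceived consensus -> support): {b:.2f}")
print(f"indirect 'gateway' effect (a * b):  {a * b:.2f}")
```

The logic is the same as in the study: an effect is chained through an intermediate belief rather than flowing directly from message to outcome, though the published analysis is more involved than this toy version.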

Climate 365 by NASA Goddard Space Flight Center. CC BY.

These studies have some clear takeaways for science communicators, especially when communicating about climate change (these ideas may apply to other topics too, but we need more research):

  • Emphasize scientific consensus, that an overwhelming percentage of scientists agree that climate change is a real problem caused by human activity.
  • Don’t worry so much about immediately pushing for public action against climate change. When people understand that scientists agree, they come to agree themselves that climate change is a problem that should be addressed, and THEN they come to support public action. Be careful about skipping steps.

At the same time, there's not only one right way to communicate about climate change. There are truly effective ways, ineffective and potentially backfiring ways, and many in between. There aren't cut-and-dried rules, because every audience is unique, and taking the audience into account (their beliefs, values, and past experiences, for example) is crucial. But this work sheds light on communication strategies that probably sit pretty far toward the "truly effective" end of the ways-to-communicate-climate-change continuum.

Communicating climate change: Focus on the framing, not just the facts

How you package the information matters.
Frame image via www.shutterstock.com.

Rose Hendricks, University of California, San Diego

Humans are currently in a war against global warming. Or is it a race against global warming? Or maybe it's just a problem we have to deal with?

If you already consider climate change a pressing issue, you might not think carefully about the way you talk about it – regardless of how you discuss it, you already think of global warming as a problem. But the way we talk about climate change affects the way people think about it.

For scientific evidence to shape people’s actions – both personal behaviors like recycling and choices on policies to vote for – it’s crucial that science be communicated to the public effectively. Social scientists have been increasingly studying the science of science communication, to better understand what does and does not work for discussing different scientific topics. It turns out the language you use and how you frame the discussion can make a big difference.

The paradox of science communication

“Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know,” writes Yale law professor Dan Kahan, a leading researcher in the science of science communication.

Kahan’s work shows that just because someone has scientific knowledge, he or she won’t necessarily hold science-supported beliefs about controversial topics like global warming, private gun possession or fracking.

Instead, beliefs are shaped by the social groups people consider themselves to be a part of. We’re all simultaneously members of many social groups – based, for example, on political or religious affiliation, occupation or sexuality. If people are confronted with scientific evidence that seems to attack their group’s values, they’re likely to become defensive. They may consider the evidence they’ve encountered to be flawed, and strengthen their conviction in their prior beliefs.

Unfortunately, scientific evidence does sometimes contradict some groups’ values. For example, some religious people trust a strict reading of the Bible: God said there would be four seasons, and hot and cold, so they don’t worry about the patterns in climate that alarm scientists. In cases like this one, how can communicators get their message across?

A growing body of research suggests that instead of bombarding people with piles of evidence, science communicators can focus more on how they present it. The problem isn’t that people haven’t been given enough facts. It’s that they haven’t been given facts in the right ways. Researchers often refer to this packaging as framing. Just as picture frames enhance and draw attention to parts of an image inside, linguistic frames can do the same with ideas.

One framing technique Kahan encourages is disentangling facts from people’s identities. Biologist Andrew Thaler describes one way of doing so in a post called “When I talk about climate change, I don’t talk about science.” Instead, he talks about things that are important to his audiences, such as fishing, flooding, farming, faith and the future. These issues that matter to the people with whom he’s communicating become an entry into discussing global warming. Now they can see scientific evidence as important to their social group identity, not contradictory to it.

Let me rephrase that

Metaphors also provide frames for talking about climate change. Recent work by psychologists Stephen Flusberg, Paul Thibodeau and Teenie Matlock suggests that the metaphors we use to describe global warming can influence people’s beliefs and actions.

Ready for combat?
Thomas Hawk, CC BY-NC

The researchers asked 3,000 Americans on an online platform to read a short fictional news article about climate change. The articles were exactly the same, but they used different metaphors: One referred to the “war against” and another to the “race against” climate change. For example, each article included phrases about the U.S. seeking to either “combat” (war) or “go after” (race) excessive energy use.

After reading just one of these passages, participants answered questions about their global warming beliefs, like how serious global warming is and whether they would be willing to engage in more pro-environmental behaviors.

Metaphors mattered. Reading about the “war” against global warming led to greater agreement with scientific evidence showing it is real and human-caused. This group of participants indicated more urgency for reducing emissions, believed global warming poses a greater risk and responded that they were more willing to change their behaviors to reduce their carbon footprint than people who read about the “race” against global warming.

The only difference between the articles that participants read was the metaphors they included. Why would reading about a war rather than a race affect people’s beliefs about climate change in such important ways?

The researchers suggest that when we encounter war metaphors, we are reminded (though not always consciously) of other war-related concepts like death, destruction, opposition and struggle. These concepts affect our emotions and remind us of the negative feelings and consequences of defeat. With those war-related thoughts in mind, we may be motivated to avoid losing. If we have these war thoughts swimming around in our minds when we think about global warming, we’re more likely to believe it’s important to defeat the opponent, which, in this case, is global warming.

There are other analogies that are good at conveying the causes and consequences for global warming. Work by psychologists Kaitlin Raimi, Paul Stern and Alexander Maki suggests it helps to point out how global warming is similar to many medical diseases. For both, risks are often caused or aggravated by human behaviors, the processes are often progressive, they produce symptoms outside the normal range of past experiences, there are uncertainties in the prognosis of future events, treatment often involves trade-offs or side effects, it’s usually most effective to treat the underlying problem instead of just alleviating symptoms and they’re hard to reverse.

People who read the medical disease analogy for climate change were more likely to agree with the science-backed explanations for global warming causes and consequences than those who read a different analogy or no analogy at all.

Golden past or rosy future?

Climate change messages can also be framed by focusing on different time periods. Social psychologists Matthew Baldwin and Joris Lammers asked people to read either a past-focused climate change message (like “Looking back to our nation’s past… there was less traffic on the road”) or a similar future-focused message (“Looking forward to our nation’s future… there is increasing traffic on the road”).

The researchers found that self-identified conservatives, who tend to resist climate change messages more than liberals, agreed that we should change how we interact with the planet more after reading the past-focused passage. Liberals, on the other hand, reported liking the future-focused frame better, but the frames had no influence on their environmental attitudes.

Example of a past-focused image (top) and a future-focused image (bottom) of a reservoir.
Image courtesy of NASA. Used in Baldwin and Lammers, PNAS December 27, 2016 vol. 113 no. 52 14953-14957.

And the frames didn’t have to be words. Conservatives also shifted their beliefs to be more pro-environmental after seeing past-focused images (satellite images that progressed from the past to today) more than after seeing future-focused ones (satellite images that progressed from today into the future). Liberals showed no differences in their attitudes after seeing the two frames.

Many climate change messages focus on the potential future consequences of not addressing climate change now. This research on time-framing suggests that such a forward-looking message may in fact be unproductive for those who already tend to resist the idea.

There’s no one-size-fits-all frame for motivating people to care about climate change. Communicators need to know their audience and anticipate their reactions to different messages. When in doubt, though, these studies suggest science communicators might want to bring out the big guns and encourage people to fire away in this war on climate change, while reminding them how wonderful the Earth used to be before our universal opponent began attacking full force.

Rose Hendricks, Ph.D. Candidate in Cognitive Science, University of California, San Diego

This article was originally published on The Conversation. Read the original article.

Past vs. Future Frames for Communicating Climate Change

Climate change (is it happening? how problematic is it? are humans responsible?) is a partisan issue. Work by Dan Kahan (which I've written about before) shows that conservatives are more likely than liberals to believe that climate change is not a result of human activity and that, if left unchecked, it will not be as destructive as many people claim. Researchers Matthew Baldwin & Joris Lammers explore the possibility that partisan differences in beliefs about climate change might result from differences in the way conservatives and liberals tend to think about time (their temporal focus).

Their starting point was previous research showing that conservatives focus more on the past than liberals do. They then tested two competing frames: one future-focused ("Looking forward to our nation's future… there is increasing traffic on the road") and one past-focused ("Looking back to our nation's past… there was less traffic on the road"). Each participant read just one of these and then reported their attitudes about climate change and the environment. The researchers found that conservatives liked the past-focused message better than the future-focused one and also reported more pro-environmental attitudes after the past-focused frame than after the future-focused one.

[Figure from Baldwin & Lammers (2016)]

They replicated these findings in additional experiments with variations. For example, in one test, instead of using linguistic frames to draw attention to either the past or the future, they used satellite images, either showing a progression from the past to today or a forecasted progression from today into the future. Again, conservatives reported more pro-environmental attitudes after viewing past-focused images than future-focused ones.

Next they investigated the temporal focus that real environmental charities tend to use. Not surprisingly, they found that the charities' messages disproportionately emphasize future consequences, with less focus on the past. Following up on this, they gave participants money that they could divide between two (fake) charities (one whose message was strongly past-focused and one whose message was strongly future-focused), or keep some or all of it. Participants saw each charity's logo and mission statement (the past-focused one read, "Restoring the planet to its original state," and the future-focused one, "Creating a new Earth for the future").

[Figure from Baldwin & Lammers (2016)]

Conservatives donated more to the past-oriented charity than to the future-oriented one; liberals did the opposite. Further, looking at just the past-oriented charity, conservatives donated more than liberals did, while the opposite pattern emerged for the future-oriented charity. This is a very beautiful interaction (plus the researchers ran a few other experiments with slightly varied methods and a meta-analysis, all of which add weight to these findings).

Given that climate change communications rely heavily on future-focused appeals, these findings should really make us pause. Is it possible that the issues themselves aren't what divide conservatives and liberals so much as the way those issues are communicated? My intuition is that framing is not entirely to blame for conservatives' and liberals' divergent beliefs about climate change, but this work shows that it may be a big part of the story. It certainly won't hurt for communicators to start diversifying the temporal frames they use for discussing climate change.


For more consideration on this topic, see earlier posts: Climate change is a big problem and we need to find better ways of talking about it; Narratives for Communicating Climate Change; and The paradox of science communication and the new science to resolve it.

All figures from Baldwin, M., & Lammers, J. (2016). Past-focused environmental comparisons promote proenvironmental outcomes for conservatives. PNAS, 113(52), 14953-14957.

For a discussion of why the framing described in this paper might not be enough to change conservatives’ minds about climate change, see This one weird trick will not convince conservatives to fight climate change, by David Roberts for Vox.

Climate change is like a medical disease

I recently wrote for PLOS SciComm about a very cool study on the benefits of using analogies to talk about climate change (aptly called The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change). The researchers found that using any analogy (comparing climate change to a medical disease, a courtroom, or a natural disaster) was helpful, but that the medical disease analogy in particular helped people consider important aspects of climate change that often polarize people along political party lines.

Please check out the full piece here!

 

The paradox of science communication and the new science to resolve it

I recently discovered this interesting paper by Yale Law Professor Dan Kahan: What is the “science of science communication”? He introduces the concept of the science communication paradox: “Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know.” This figure demonstrates the science communication paradox, showing strong disagreements about different risks for people with different political beliefs:

[Figure from Kahan (2015): perceived risks diverge sharply along political lines]

According to Kahan, resolving the science communication paradox is the main goal of the new science of science communication. He lays out two potential explanations for the science communication paradox:

  • Public Irrationality Thesis (PIT): Advocates of this position believe that the public, on the whole, is not scientifically literate and does not think about risk the way scientists do.

If PIT is correct, then as people acquire more scientific literacy, their views on different risks should align more with scientists’ views. This is not what actually happens:

[Figure from Kahan (2015)]

With increasing science comprehension scores, people actually become more polarized along party lines in their beliefs about the risk of climate change (shown in more detail in an earlier publication by the same group). Scientific literacy does not necessarily produce views aligned with scientists'.

  • Cultural Cognition Thesis (CCT): This alternative explanation for the science communication paradox suggests that our group identities fundamentally shape how we think about risk. Kahan gives the analogy of fans of two opposing sports teams: the opposing fans are likely to actually see a replay of a questionable call differently, each in favor of their own team. Along these lines, when people feel their group's stance is being threatened, they're more likely to see evidence as confirming their belief or to discount the source if it contradicts what they want to believe. This account is much more consistent with the real-world data (for example, the previous graph) and with experimental work Kahan and collaborators have done.

It's important to note that risk assessments of most science issues do not demonstrate the science communication paradox: people with more science knowledge don't necessarily become more polarized in their beliefs. Here are a few examples of issues for which science comprehension predicts people's risk assessments better than their political leanings do:

[Figure from Kahan (2015)]

How can we use awareness of the cultural cognition thesis to improve science communication?

Kahan suggests the disentanglement principle: if polarization results from a clash between people's identities as members of a cultural group and the scientific facts they're encountering, we should work to separate the two. For example, a true/false question might state: "Human beings, as we know them today, developed from earlier species of animals." Someone who belongs to a religious group that doesn't support evolution has to choose between answering this question in a way that's consistent with scientific consensus OR with their group identity. But rewording the statement to something like "According to the theory of evolution, human beings, as we know them today, developed from earlier species of animals" takes away the conflict. The respondent can now demonstrate scientific knowledge by agreeing with the statement without jeopardizing their identity as part of a group that doesn't believe in evolution.

[Figure from Kahan (2015)]

While the original statement leads to polarized views (as science knowledge increases, religious and non-religious people's responses diverge more), the reworded framing produces converging responses (scientific knowledge, rather than religious belief, becomes the best predictor of correct responses).

In a great blog post for Southern Fried Science, Andrew Thaler shares that he talks about a lot of things when he talks about climate change, but science isn’t one of them. He talks about fishing, flooding, farming, faith, and the future. These are things that his audiences know deeply, and climate change is relevant to anyone with interest in any of his f’s. He provides a great example of disentangling people’s identities from the scientific issue, and instead actually uses their identities to show the issue’s relevance.

The disentanglement principle offers one way that the new science of science communication begins to reduce the paradox of science communication, but it’s just one drop in a huge pond of paradox. We have to keep working on ways to communicate information that conflicts with people’s cultural identities, and as this work shows, jamming information down people’s throats isn’t the way to close belief gaps.


Feature image: Communication by Joan M. Mas. CC.

All figures from Kahan, D. (2015). What is the “science of science communication”? Journal of Science Communication, 14(3).

Narratives for communicating climate change

Last week I wrote about work by UC researchers on framing climate change, a chapter that focuses on how we can harness our understanding of human psychology (how we learn, think, and behave) to communicate science better. Here's another paper (one that's gotten very popular, very quickly) that draws on human cognition to examine the efficacy of communicating about climate change.

Narrative Style Influences Citation Frequency in Climate Change Science

The authors of this paper (Ann Hillier, Ryan Kelly, & Terrie Klinger, all from the University of Washington) started with the insight from psychology that people understand and remember story-like (narrative) writing better than explanatory (expository) writing. They considered abstracts from 802 scientific papers about climate change and looked for six markers of narrative structure:
1) description of setting (where/when the events took place)
2) narrative perspective (the presence of a narrator)
3) sensory language (appealing to the senses or emotions)
4) conjunctions (used often in narratives to logically connect events and ideas)
5) connectivity (phrases that create explicit links to something mentioned earlier in the text)
6) appeal (whether the text makes an appeal to the reader or a recommendation for specific action)

The authors crowdsourced this first part of their data analysis: non-scientists on an online job platform (crowdflower.com) were given the authors' instructions for analyzing the abstracts. Each abstract was analyzed by 7 independent people, and the ratings involved human interpretation and discretion, which likely provides a more accurate index of narrativity than current computerized methods can.

The authors used the number of times each paper had been cited by others as a reflection of how much impact each paper had on subsequent science. They found that 4 of their 6 narrative indicators (sensory language, conjunctions, connectivity, and appeal to the reader) were related to how frequently articles were cited. In other words, papers higher in narrativity were cited more often than those that were more expository.

Subset of Figure 1, showing that as articles increase in narrativity, their citations increase as well.

The more citations a paper receives, the more other researchers see the work. But it's possible that higher-quality work simply lends itself to a narrative style, so papers high in narrativity would be cited often regardless. Because this study is correlational, we can't rule out that possibility: the best science may be conducive to narrative presentation and would be cited a lot anyway because it's just good research. The causal arrow isn't clear here, but it is clear that impactful research tends to take on a narrative structure. Even though narrative writing doesn't necessarily lead to citations, imitating the style of often-cited papers doesn't seem like a bad idea.
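If you're curious how a narrativity-citation relationship like this can be checked, here's a minimal sketch of a correlational test on simulated data. It is not the authors' actual analysis (they modeled 802 human-coded abstracts much more carefully); the narrativity scores and citation counts below are invented for illustration.

```python
# Minimal sketch of a correlational check on simulated data; not the authors'
# actual analysis. Narrativity scores and citation counts are invented.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_papers = 802  # matching the number of abstracts in the study, for flavor

# Pretend narrativity score per abstract on a 0-1 scale
# (e.g., the share of narrative indicators raters marked as present).
narrativity = rng.uniform(0, 1, n_papers)

# Simulated citation counts that rise, noisily, with narrativity.
citations = rng.poisson(lam=5 + 20 * narrativity)

rho, p_value = spearmanr(narrativity, citations)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
# A positive rho means abstracts scored as more narrative tend to be cited more,
# which is the direction of the pattern the paper reports (with real data).
```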

This work is not the first to suggest that narratives can be helpful for understanding climate change. FrameWorks Institute, a nonprofit organization that designs ways to communicate complex issues and tests their efficacy for cognitive and behavior changes, has a toolkit that uses (visual) narratives to communicate about climate change. (Also note that the toolkit is just the tip of the iceberg for the extensive work FrameWorks has done on communicating climate change.)

Together, the work by FrameWorks and the study of narrativity and citations present a pretty clear takeaway for climate scientists (and likely scientists in many fields): ease off the traditional academic expository style and lean into a more understandable and memorable narrative style.


For an interesting (and more critical) take on this paper, see this post by Randy Olson at scienceneedsstory.com.