Communication

Scientists Agree on Climate Change: A Gateway Belief

 

It doesn’t get much clearer. The Earth’s climate is warming. Humans are the reason. But how many people are actually aware of the scientific consensus on this issue?

Research by Sander van der Linden and colleagues shows that when people believe that scientists overwhelmingly agree about climate change, they a) strengthen their own belief that climate change is happening, b) become more convinced that humans are responsible, and c) feel more worried about climate change. As a result of a, b, and c, they become more supportive of public action to mitigate the effects of climate change.


At the beginning of the study, participants indicated the percentage of scientists they thought agree on global warming, and they answered some questions about their own climate change beliefs. People then received a message about scientific consensus, which took the form of either a) a simple description, b) a pie chart, or c) a metaphorical comparison to trusting engineers' consensus about bridges (i.e., if 97% of engineers agreed that a bridge was unsafe, would you use it?) or doctors' consensus about illness. All the messages included the information that "97% of climate scientists have concluded that human-caused climate change is happening."

Then participants again indicated what percentage of scientists they thought agree on global warming and answered questions about their own beliefs. All the messages "worked," in the sense that people perceived greater scientific agreement after reading that 97% of scientists agree than if they hadn't read anything about the consensus at all. The simple description and pie chart were more effective than the metaphors, though: people shifted their climate change beliefs more after encountering one of the straightforward messages than after the more complex metaphorical ones. That's food for thought, since many science communicators insert metaphors wherever they can.

Of course, having people believe that there’s strong scientific consensus about climate is only one step toward the larger goal of having them endorse actions that mitigate the effects of climate change. But in follow-up analyses, the researchers identified that perceiving scientific agreement is a gateway belief: believing that scientists agree about global warming led to other beliefs, ones that get us closer to the goal of actions in favor of mitigating climate change. Specifically, it led to greater belief that climate change was real, human-caused, and worrisome. These beliefs, in turn, led to greater support for public action against climate change. It’s often hard to know what leads to what, especially when it comes to beliefs we keep hidden in our own heads, but with some semi-fancy math, these researchers quantified those relationships.
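The mediation logic behind that "semi-fancy math" can be sketched in a few lines of Python. Everything below is an illustrative assumption — synthetic data, made-up effect sizes, and invented variable names — not the authors' actual data or statistical model; it just shows how an indirect "gateway" effect can be estimated from two regressions.

```python
# A hedged sketch of a simple mediation analysis: how much of a consensus
# message's effect on support for action flows through perceived agreement?
# All numbers here are synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated experiment: consensus message -> perceived agreement -> support
message = rng.integers(0, 2, n).astype(float)         # 0 = control, 1 = saw the 97% message
perceived = 60 + 15 * message + rng.normal(0, 5, n)   # perceived % of scientists who agree
support = 2 + 0.05 * perceived + rng.normal(0, 1, n)  # support for public action (arbitrary scale)

def ols_coefs(X, y):
    """Least-squares regression coefficients (excluding the intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coef[1:]

# Path a: message -> mediator (perceived agreement)
a = ols_coefs(message.reshape(-1, 1), perceived)[0]
# Path b: mediator -> outcome, controlling for the message itself
b = ols_coefs(np.column_stack([message, perceived]), support)[1]
# The indirect (mediated) effect of the message on support
indirect = a * b

print(f"a = {a:.2f}, b = {b:.3f}, indirect effect = {indirect:.2f}")
```

With data simulated this way, the message shifts perceived agreement (path a), perceived agreement predicts support (path b), and the product a × b quantifies the gateway route — the same structure the researchers estimated, though their actual models were more sophisticated.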

These studies have some clear takeaways for science communicators, especially when communicating about climate change (these ideas may apply to other topics too, but that needs more research):

  • Emphasize scientific consensus, that an overwhelming percentage of scientists agree that climate change is a real problem caused by human activity.
  • Don’t worry so much about immediately pushing for public action against climate change. When people understand that scientists agree, they come to agree themselves that climate change is a problem that should be addressed, and THEN they come to support public action. Be careful about skipping steps.

At the same time, there’s not only one right way to communicate about climate change. There are truly effective ways, ineffective and potentially backfiring ways, and many in between. There aren’t cut-and-dried rules, because every audience is unique, and taking the audience into account — their beliefs, values, and past experiences, for example — is crucial. But this work sheds light on communication strategies that are probably pretty far toward the “truly effective” end of the ways-to-communicate-climate-change continuum.

Reframing the war on science

America’s kind of tense right now. Leading up to and following the November 2016 election, there’s been a lot of talk of “the two Americas” and “the Divided States of America.” Americans are divided on a lot of issues, including scientific topics like vaccine safety and global warming. To many, it’s surprising that we disagree about these things, because according to the scientists who research these topics, there are no debates at all: vaccines do not cause autism, and humans are responsible for global warming.

At the same time, the current administration in the US has sent numerous signals that it devalues science (for example, by censoring scientists at agencies like the USDA and EPA and establishing a Committee on Vaccine Safety). Actions like these seem only to be fueling the divide in science beliefs.

In response, many people have declared that we’re in a war on science. This idea is expressed in headlines like “Facts are the reason science is losing the current war on reason” and “How the Anti-Vaxxers are Winning,” and in documentary titles like The Vaccine War. (There are many, many pieces that talk about the war on science.)

Image: Scientific American

I’m a PhD student in Cognitive Science, a firm believer in the scientific method and in basing beliefs and actions on evidence. I highly value scientific funding, vaccinations, and measures that reduce the effects of climate change. As Americans, we have freedom of speech, and we should exercise that freedom to speak up when scientific knowledge and interests are being trampled on. I agree with the ideas expressed in blog posts like The War on Facts is a War on Democracy and I’m a Scientist. This is what I’ll Fight for, and with many of the ideas that continuously populate Twitter threads like #defendscience and #resist. But I’m much less enthusiastic about the widespread use of a war metaphor to get those ideas across.

Here’s why.

Metaphors shape thought

The metaphors we use to describe complex social problems actually shape the way we think about them. For example, when crime was described as a beast ravaging a town, people tended to suggest harsh law enforcement policies — similar to how they’d likely react to a literal beast ravaging their town. On the other hand, when that same crime was described as a virus, people suggested fewer harsh enforcement policies. Instead, they turned their focus to curing the town of problems that may underlie the crime, like improving education and welfare services.

People make inferences in line with the metaphors used to describe complex issues, so it’s important to reflect on what the war on science implies. It does have some helpful implications. Wars are serious, and often require urgent action. These are probably the messages that those who perpetuate the war on science want us to infer, even if not consciously.

But the war metaphor also suggests that there are enemies and casualties: two sides locked in combat, and neither will back down until they win (or they’re decimated). I like this quote from A Gentleman in Moscow, a novel I just happened to be reading while working on this post: “After all, in the midst of armed conflicts, facts are bound to be just as susceptible to injury as ships and men, if not more so.” In other words, we sometimes do stupid things in wars. We shirk thoughtfulness and conscientiousness, and instead we just fight. As I see it, our current political situation (for lack of a better word) needs all the thoughtfulness and conscientiousness we can give it.

I recently expressed my concern in a conversation on Twitter.

The war metaphor challenges those who are not already on the “side” of science. It tells them they’re the enemy. When people feel that they’re being attacked, even idealistically, they’re likely to strengthen their stance and gear up to fight back. No matter how many scientists tweet about science or participate in the March for Science on Earth Day, people who have found themselves on the “anti-science” side of this war are not going to decide all of a sudden that climate change must be real after all or that they should rush their kids to the pediatrician for overdue vaccines (especially if we tell them we’re marching to fight the war on science!). People who have already been labeled as the enemy of science may as well go out and buy a new gas guzzler and decide that their kids are just fine without vaccines.

Others have already pointed out that actions like the science march risk isolating those skeptical of science as opponents (for example, see Daniel Engber’s piece for Slate and Robert Young’s in the New York Times). Using war metaphors threatens to hammer that point home even harder.


This just doesn’t seem productive. Image: Battle by Thomas Hawk. CC BY-NC 2.0

Alternative frames?

If we want to stop thinking about ourselves as engaged in a war on science, we need an alternative. Proponents of and believers in science are experiencing a sort of struggle, but it doesn’t have to be a fight between the left and right, Democrats and Republicans, Coastal Elite and Middle America. Maybe we can reframe the situation as a challenge that unites all humans. Science communicators want to share how important it is to address climate change and to have children vaccinated for the good of all people. We can all be on the same side, working to better the world we live in, and it’s important that we convey that message in our communications.

Referring to the movie Hidden Figures, NPR blogger Marcelo Gleiser points out that “if there is a central lesson in the movie, it is that united we win; that what makes America great is not segregation and intolerance, but openness and inclusiveness.”

I considered the possibility that guiding people to trust empirical evidence and the scientific process might be better framed as a puzzle — a challenge, no doubt, but at least everyone’s working toward a common goal.

Marisa makes a really important point. The peacekeeper in me would love a frame that emphasizes hey, guys! We’re all in this together!, but that ship may have already sailed. At this point, it’s important not to downplay the gravity of discrediting and distrusting science. This is not a game.

 

I’ve had quite a few conversations on the war on science, but I still don’t have a one-size-fits-all framing suggestion for talking about America’s disconnect in belief in science. But when we’re considering talking about this issue as a war, it’ll be helpful to step back and assess our goals and the potential consequences of the words we use.

Right now, there are deep social and political divides in American society — and though it’s crucial to stand up for what we believe in (especially science and facts!), we should be careful about taking up arms in a war on science that might deepen those divides. 

I welcome other comments on the framing of the war on science. Do you find the war helpful? Why? Are there other frames we could use to avoid deepening ideological divides?


Featured image: United States USA Flag by Mike Mozart. CC BY 2.0

Communicating climate change: Focus on the framing, not just the facts

How you package the information matters.
Frame image via www.shutterstock.com.

Rose Hendricks, University of California, San Diego

Humans are currently in a war against global warming. Or is it a race against global warming? Or maybe it’s just a problem we have to deal with?

If you already consider climate change a pressing issue, you might not think carefully about the way you talk about it – regardless of how you discuss it, you already think of global warming as a problem. But the way we talk about climate change affects the way people think about it.

For scientific evidence to shape people’s actions – both personal behaviors like recycling and choices on policies to vote for – it’s crucial that science be communicated to the public effectively. Social scientists have been increasingly studying the science of science communication, to better understand what does and does not work for discussing different scientific topics. It turns out the language you use and how you frame the discussion can make a big difference.

The paradox of science communication

“Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know,” writes Yale law professor Dan Kahan, a leading researcher in the science of science communication.

Kahan’s work shows that just because someone has scientific knowledge, he or she won’t necessarily hold science-supported beliefs about controversial topics like global warming, private gun possession or fracking.

Instead, beliefs are shaped by the social groups people consider themselves to be a part of. We’re all simultaneously members of many social groups – based, for example, on political or religious affiliation, occupation or sexuality. If people are confronted with scientific evidence that seems to attack their group’s values, they’re likely to become defensive. They may consider the evidence they’ve encountered to be flawed, and strengthen their conviction in their prior beliefs.

Unfortunately, scientific evidence does sometimes contradict some groups’ values. For example, some religious people trust a strict reading of the Bible: God said there would be four seasons, and hot and cold, so they don’t worry about the patterns in climate that alarm scientists. In cases like this one, how can communicators get their message across?

A growing body of research suggests that instead of bombarding people with piles of evidence, science communicators can focus more on how they present it. The problem isn’t that people haven’t been given enough facts. It’s that they haven’t been given facts in the right ways. Researchers often refer to this packaging as framing. Just as picture frames enhance and draw attention to parts of an image inside, linguistic frames can do the same with ideas.

One framing technique Kahan encourages is disentangling facts from people’s identities. Biologist Andrew Thaler describes one way of doing so in a post called “When I talk about climate change, I don’t talk about science.” Instead, he talks about things that are important to his audiences, such as fishing, flooding, farming, faith and the future. These issues that matter to the people with whom he’s communicating become an entry into discussing global warming. Now they can see scientific evidence as important to their social group identity, not contradictory to it.

Let me rephrase that

Metaphors also provide frames for talking about climate change. Recent work by psychologists Stephen Flusberg, Paul Thibodeau and Teenie Matlock suggests that the metaphors we use to describe global warming can influence people’s beliefs and actions.

Ready for combat?
Thomas Hawk, CC BY-NC

The researchers asked 3,000 Americans on an online platform to read a short fictional news article about climate change. The articles were exactly the same, but they used different metaphors: One referred to the “war against” and another to the “race against” climate change. For example, each article included phrases about the U.S. seeking to either “combat” (war) or “go after” (race) excessive energy use.

After reading just one of these passages, participants answered questions about their global warming beliefs, like how serious global warming is and whether they would be willing to engage in more pro-environmental behaviors.

Metaphors mattered. Reading about the “war” against global warming led to greater agreement with scientific evidence showing it is real and human-caused. This group of participants indicated more urgency for reducing emissions, believed global warming poses a greater risk and responded that they were more willing to change their behaviors to reduce their carbon footprint than people who read about the “race” against global warming.

The only difference between the articles that participants read was the metaphors they included. Why would reading about a war rather than a race affect people’s beliefs about climate change in such important ways?

The researchers suggest that when we encounter war metaphors, we are reminded (though not always consciously) of other war-related concepts like death, destruction, opposition and struggle. These concepts affect our emotions and remind us of the negative feelings and consequences of defeat. With those war-related thoughts in mind, we may be motivated to avoid losing. If we have these war thoughts swimming around in our minds when we think about global warming, we’re more likely to believe it’s important to defeat the opponent, which, in this case, is global warming.

There are other analogies that are good at conveying the causes and consequences of global warming. Work by psychologists Kaitlin Raimi, Paul Stern and Alexander Maki suggests it helps to point out how global warming is similar to many medical diseases. For both: risks are often caused or aggravated by human behaviors; the processes are often progressive; they produce symptoms outside the normal range of past experiences; there are uncertainties in the prognosis of future events; treatment often involves trade-offs or side effects; it’s usually most effective to treat the underlying problem instead of just alleviating symptoms; and they’re hard to reverse.

People who read the medical disease analogy for climate change were more likely to agree with the science-backed explanations for global warming causes and consequences than those who read a different analogy or no analogy at all.

Golden past or rosy future?

Climate change messages can also be framed by focusing on different time periods. Social psychologists Matthew Baldwin and Joris Lammers asked people to read either a past-focused climate change message (like “Looking back to our nation’s past… there was less traffic on the road”) or a similar future-focused message (“Looking forward to our nation’s future… there is increasing traffic on the road”).

The researchers found that self-identified conservatives, who tend to resist climate change messages more than liberals, agreed that we should change how we interact with the planet more after reading the past-focused passage. Liberals, on the other hand, reported liking the future-focused frame better, but the frames had no influence on their environmental attitudes.

Example of a past-focused image (top) and a future-focused image (bottom) of a reservoir.
Image courtesy of NASA. Used in Baldwin and Lammers, PNAS December 27, 2016 vol. 113 no. 52 14953-14957.

And the frames didn’t have to be words. Conservatives also shifted their beliefs to be more pro-environmental after seeing past-focused images (satellite images that progressed from the past to today) more than after seeing future-focused ones (satellite images that progressed from today into the future). Liberals showed no differences in their attitudes after seeing the two frames.

Many climate change messages focus on the potential future consequences of not addressing climate change now. This research on time-framing suggests that such a forward-looking message may in fact be unproductive for those who already tend to resist the idea.

There’s no one-size-fits-all frame for motivating people to care about climate change. Communicators need to know their audience and anticipate their reactions to different messages. When in doubt, though, these studies suggest science communicators might want to bring out the big guns and encourage people to fire away in this war on climate change, while reminding them how wonderful the Earth used to be before our universal opponent began attacking full force.

Rose Hendricks, Ph.D. Candidate in Cognitive Science, University of California, San Diego

This article was originally published on The Conversation. Read the original article.

Past vs. Future Frames for Communicating Climate Change

Climate change (is it happening? how problematic is it? and are humans responsible?) is a partisan issue. Work by Dan Kahan (which I’ve written about before) shows that conservatives are more likely than liberals to believe that climate change is not a result of human activity and that if unchanged, it will not be as destructive as many people claim. Researchers Matthew Baldwin & Joris Lammers explore the possibility that partisan differences in beliefs about climate change might result from differences in the way conservatives and liberals tend to think about time (their temporal focus).

Their starting point was previous research showing that conservatives focus more on the past than liberals do. They then tested two competing frames: one future-focused (“Looking forward to our nation’s future… there is increasing traffic on the road”) and one past-focused (“Looking back to our nation’s past… there was less traffic on the road”). Each participant read just one of these and then reported their attitudes about climate change and the environment. Conservatives reported liking the past-focused message better than the future-focused one, and they also reported stronger pro-environmental attitudes after the past-focused frame than after the future-focused one.


They replicated these findings in additional experiments with variations. For example, in one test, instead of using linguistic frames to draw attention to either the past or the future, they used satellite images, either showing a progression from the past to today or a forecasted progression from today to the future. Again, conservatives reported more proenvironmental attitudes after viewing past-focused images than future-focused ones.

Next they investigated the temporal focus that real environmental charities tend to use. Not surprisingly, they found that the charities’ messages disproportionately express future consequences, with less focus on the past. Following up on this, they gave participants money that they could divide between two (fake) charities — one whose message was strongly past-focused and one whose message was strongly future-focused — or keep some or all of it. Participants saw each charity’s logo and mission statement (the past-focused one stated: “Restoring the planet to its original state”; the future-focused one: “Creating a new Earth for the future”).


Conservatives donated more to the past- than the future-oriented charity. Liberals did the opposite. Further, looking at just the past-oriented charity, conservatives donated more than liberals did. Looking just at the future-oriented one, the opposite pattern emerges. This is a very beautiful interaction (plus the researchers did a few other experiments with slightly varied methods and a meta-analysis, all of which add some weight to these findings).

Considering the finding that climate change communications rely heavily on future-focused appeals, these findings should really make us pause. Is it possible that climate change issues themselves may not actually be what divides conservatives and liberals so much, but instead the way they’re communicated might be driving much of the disagreement between them? My intuition is that framing is not entirely to blame for conservatives’ and liberals’ divergent beliefs about climate change, but this work shows that it may be a big part of the story. It certainly won’t hurt for communicators to start diversifying our temporal frames for discussing climate change.


For more consideration on this topic, see earlier posts: Climate change is a big problem and we need to find better ways of talking about it; Narratives for Communicating Climate Change; and The paradox of science communication and the new science to resolve it.

All figures from Baldwin, M., & Lammers, J. (2016). Past-focused environmental comparisons promote proenvironmental outcomes for conservatives. PNAS, 113(52), 14953-14957.

For a discussion of why the framing described in this paper might not be enough to change conservatives’ minds about climate change, see This one weird trick will not convince conservatives to fight climate change, by David Roberts for Vox.

Climate change is like a medical disease

I recently wrote for PLOS SciComm about a very cool study on the benefits of using analogies to talk about climate change (aptly called The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change). The researchers found that using any analogy (comparing climate change to a medical disease, a courtroom, or a natural disaster) was helpful, but that the medical disease analogy in particular helped people consider important aspects of climate change that often polarize people along political party lines.

Please check out the full piece here!

 

Becoming a better teacher: Fish is Fish

This summer, I’ll be the Instructor of Record (real teacher, not Teaching Assistant) for the first time. I’m teaching Research Methods, which is a “lower level” (mainly first- and second-year undergrads) course that I’ve TAed for twice, and I really enjoy its content. Because I’m participating in UCSD’s Summer Graduate Teaching Scholar Program, I have to complete a course called Teaching + Learning at the College Level. We’re two weeks in, and I’ve picked up some interesting nuggets from the readings and class discussions, but one analogy in particular is still on my mind.

We talked about the children’s story Fish is Fish by Leo Lionni. I’m kind of glad I never encountered this story when I was a kid because its novelty had a great impact on 25-year old me. The story is about a fish and tadpole who wonder what life on land is like. Eventually the tadpole becomes a frog who can leave the water to learn about the land. He reports back to the fish, listing off features of things on land. Cows, he says, have black spots, four legs, and udders. The frog describes birds and people too, and here’s what the fish imagines:

The fish and the frog are talking about the same things, and they assume that they have common concepts of cows and birds and people, but their actual mental representations — what they see in their mind’s eye — of these things are quite different. If the fish had been given a traditional paper-and-pencil test that asked him to define a cow, he’d be correct in writing that it has four legs, black spots, and udders. He’d ace the test, fooling not only the teacher, but also himself, into thinking he actually knows what a cow is.

The takeaway, of course, is to try to make sure your students aren’t fish. Find ways to lead them beyond their fishy cow concepts, which can be especially hard when they’ve never been on land and come to class knowing only what fish are like. Students almost always need foundational knowledge in order to understand a new concept, and there’s a good chance that at least some of the students in any class are missing that foundation. Instructors need to be mindful that there will be times when they have to step back to assess and teach prerequisite knowledge before launching into an hour-long lecture about cows (or cognition). And then once they think the students actually know what cows are, it’s important to provide assessments that actually test understanding, and not just memorization.

There are plenty of things I might not pull off perfectly when I teach for the first time this summer, but I do feel confident that I’ll at least be on a quest to help the fish in the class become frogs so they can see what cows are really like.

Teachers of all levels and subjects: I invite you to share how you make sure your students are truly understanding and not simply parroting. How do you make sure their concepts of the cow are really cow-like, and not just fish with spots and udders?

The paradox of science communication and the new science to resolve it

I recently discovered this interesting paper by Yale Law Professor Dan Kahan: What is the “science of science communication”? He introduces the concept of the science communication paradox: “Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know.” This figure demonstrates the science communication paradox, showing strong disagreements about different risks for people with different political beliefs:

According to Kahan, resolving the science communication paradox is the main goal of the new science of science communication. He lays out two potential explanations for the paradox:

  • Public Irrationality Thesis (PIT): Advocates of this position believe that the public, on the whole, is not scientifically literate and does not think about risk the way scientists do.

If PIT is correct, then as people acquire more scientific literacy, their views on different risks should align more with scientists’ views. This is not what actually happens:

With increasing science comprehension scores, people actually become more polarized along party lines in their beliefs about the risk of climate change (shown in more detail in an earlier publication by the same group). Scientific literacy does not necessarily mean views aligned with scientists’.

  • Cultural Cognition Thesis (CCT): An alternative explanation for the science communication paradox, which suggests that our group identities fundamentally shape how we think about risk. Kahan gives the analogy of fans of two opposing sports teams: the opposing fans are likely to actually see a replay of a questionable call differently, each in favor of their own team. Along the same lines, when people feel their group’s stance is being threatened, they’re more likely to see evidence as confirming their belief, or to discount the source if it contradicts what they want to believe. This account is much more consistent with the real-world data (for example, the previous graph) and with experimental work Kahan and collaborators have done.

It’s important to note that risk assessments of most science issues do not demonstrate the science communication paradox: on most issues, people with more science knowledge don’t become more polarized in their beliefs. Here are a few examples of issues for which science intelligence predicts people’s risk assessments better than their political leaning does:


How can we use awareness of the cultural cognition thesis to improve science communication?

Kahan suggests the disentanglement principle: if polarization results from a clash between people’s identities as members of a cultural group and the scientific facts they’re encountering, we should work to separate the two. For example, a true/false question might state: “Human beings, as we know them today, developed from earlier species of animals.” Someone who belongs to a religious group that doesn’t support evolution has to choose between answering this question in a way that’s consistent with scientific consensus OR with their group identity. But rewording the statement to something like “According to the theory of evolution, human beings, as we know them today, developed from earlier species of animals” takes away the conflict. The respondent can now demonstrate scientific knowledge by agreeing with the statement without jeopardizing their identity as part of a group that doesn’t believe in evolution.


While the original statement produces polarized views (as science knowledge increases, religious and non-religious people’s responses diverge more), the reworded framing produces converging responses: scientific knowledge, rather than religious belief, becomes the best predictor of correct answers.

In a great blog post for Southern Fried Science, Andrew Thaler shares that he talks about a lot of things when he talks about climate change, but science isn’t one of them. He talks about fishing, flooding, farming, faith, and the future. These are things that his audiences know deeply, and climate change is relevant to anyone with interest in any of his f’s. He provides a great example of disentangling people’s identities from the scientific issue, and instead actually uses their identities to show the issue’s relevance.

The disentanglement principle offers one way that the new science of science communication begins to reduce the paradox of science communication, but it’s just one drop in a huge pond of paradox. We have to keep working on ways to communicate information that conflicts with people’s cultural identities, and as this work shows, jamming information down people’s throats isn’t the way to close belief gaps.


Featured image: Communication by Joan M. Mas. CC licensed.

All figures from Kahan, D. (2015). What is the “science of science communication”? Journal of Science Communication, 14(3).