Communication

Climate change is like a medical disease

I recently wrote for PLOS SciComm about a very cool study on the benefits of using analogies to talk about climate change (aptly called The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change). The researchers found that using any analogy (comparing climate change to a medical disease, a courtroom, or a natural disaster) was helpful, but that the medical disease analogy in particular helped people consider important aspects of climate change that often polarize people along political party lines.

Please check out the full piece here!

 

Becoming a better teacher: Fish is Fish

This summer, I’ll be the Instructor of Record (real teacher, not Teaching Assistant) for the first time. I’m teaching Research Methods, which is a “lower level” (mainly first- and second-year undergrads) course that I’ve TAed for twice, and I really enjoy its content. Because I’m participating in UCSD’s Summer Graduate Teaching Scholar Program, I have to complete a course called Teaching + Learning at the College Level. We’re two weeks in, and I’ve picked up some interesting nuggets from the readings and class discussions, but one analogy in particular is still on my mind.

We talked about the children’s story Fish is Fish by Leo Lionni. I’m kind of glad I never encountered this story when I was a kid because its novelty had a great impact on 25-year-old me. The story is about a fish and a tadpole who wonder what life on land is like. Eventually the tadpole becomes a frog who can leave the water to learn about the land. He reports back to the fish, listing off features of things on land. Cows, he says, have black spots, four legs, and udders. The frog describes birds and people too, and the fish imagines each of them as a slightly modified fish (a cow, for instance, as a fish with black spots and udders).

The fish and the frog are talking about the same things, and they assume that they have common concepts of cows and birds and people, but their actual mental representations (what they see in their mind’s eye) of these things are quite different. If the fish had been given a traditional paper-and-pencil test that asked him to define a cow, he’d be correct in writing that it has four legs, black spots, and udders. He’d ace the test, fooling not only the teacher, but also himself, into thinking he actually knows what a cow is.

The takeaway, of course, is to try to make sure your students aren’t fish. Find ways to lead them beyond their fishy cow concepts, which can be especially hard when they’ve never been on land and they come to class knowing only what fish are like. Students almost always need foundational knowledge in order to understand a new concept, and there’s a good chance that at least some of the students in any class are missing that foundation. Instructors need to be mindful that there will be times when they have to step back to assess and teach prerequisite knowledge before launching into an hour-long lecture about cows (or cognition). And once they think the students actually know what cows are, it’s important to provide assessments that actually test understanding, not just memorization.

There are plenty of things I might not pull off perfectly when I teach for the first time this summer, but I do feel confident that I’ll at least be on a quest to help the fish in the class become frogs so they can see what cows are really like.

Teachers of all levels and subjects: I invite you to share how you make sure your students are truly understanding and not simply parroting. How do you make sure their concepts of the cow are really cow-like, and not just fish with spots and udders?


The paradox of science communication and the new science to resolve it

I recently discovered this interesting paper by Yale Law Professor Dan Kahan: What is the “science of science communication”? He introduces the concept of the science communication paradox: “Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know.” This figure demonstrates the science communication paradox, showing strong disagreements about different risks for people with different political beliefs:

[Figure from Kahan (2015): perceived risks of various issues, split by political outlook]

According to Kahan, resolving the science communication paradox is the main goal of the new science of science communication. He lays out two potential explanations for the science communication paradox:

  • Public Irrationality Thesis (PIT): Advocates of this position believe that the public, on the whole, is not scientifically literate and does not think about risk the way scientists do.

If PIT is correct, then as people acquire more scientific literacy, their views on different risks should align more with scientists’ views. This is not what actually happens:

[Figure from Kahan (2015): climate change risk perceptions by science comprehension and political outlook]

With increasing science comprehension scores, people actually become more polarized along party lines in their belief about the risk of climate change (shown in more detail in an earlier publication by the same group). Scientific literacy does not necessarily mean views aligned with scientists’.

  • Cultural Cognition Thesis (CCT): This alternative explanation for the science communication paradox suggests that our group identities are fundamental shapers of how we think about risk. Kahan gives the analogy of sports fans of two opposing teams: the opposing fans are likely to actually see a replay of a questionable call differently, each in favor of their own team. Along these lines, when people feel their group’s stance is being threatened, they’re more likely to see evidence as confirming their belief, or to discount the source if it contradicts what they want to believe. This account is much more consistent with the real-world data (for example, the previous graph) and with experimental work he and collaborators have done.

It’s important to note that risk assessments of most science issues do not demonstrate the science communication paradox: people with more science knowledge don’t necessarily become more polarized in their beliefs. Here are a few examples of issues for which science intelligence predicts people’s risk assessments better than political leaning does:

[Figure from Kahan (2015): risk perceptions for several such issues, plotted against science comprehension]

How can we use awareness of the cultural cognition thesis to improve science communication?

Kahan suggests the disentanglement principle: If polarization results from a clash between people’s identities as members of a cultural group and scientific facts they’re encountering, we should work to separate these two. For example, a true/false question might state: “Human beings, as we know them today, developed from earlier species of animals.” Someone who belongs to a religious group that doesn’t support evolution has to choose between answering this question in a way that’s consistent with scientific consensus OR with their group identity. But rewording the statement to something like “According to the theory of evolution, human beings, as we know them today, developed from earlier species of animals” takes away the conflict. The respondent can now demonstrate scientific knowledge by agreeing with the statement without jeopardizing their identity as part of a group that doesn’t believe in evolution.


While the original statement produces polarized responses (as science knowledge increases, religious and non-religious people’s responses diverge more), the reworded statement produces converging responses (scientific knowledge, rather than religious belief, becomes the best predictor of a correct answer).

In a great blog post for Southern Fried Science, Andrew Thaler shares that he talks about a lot of things when he talks about climate change, but science isn’t one of them. He talks about fishing, flooding, farming, faith, and the future. These are things that his audiences know deeply, and climate change is relevant to anyone with interest in any of his f’s. He provides a great example of disentangling people’s identities from the scientific issue, and instead actually uses their identities to show the issue’s relevance.

The disentanglement principle offers one way that the new science of science communication begins to reduce the paradox of science communication, but it’s just one drop in a huge pond of paradox. We have to keep working on ways to communicate information that conflicts with people’s cultural identities, and as this work shows, jamming information down people’s throats isn’t the way to close belief gaps.


Feature image: Communication by Joan M. Mas (CC).

All figures from Kahan, D. (2015). What is the “science of science communication”? Journal of Science Communication, 14(3).


Narratives for communicating climate change

Last week I wrote about work by UC researchers on framing climate change, a chapter that focuses on how we can harness our understanding of human psychology — how we learn, think, and behave — to communicate science better. Here’s another paper (one that’s gotten very popular, very quickly) that takes human cognition into account to communicate about climate change more effectively.

Narrative Style Influences Citation Frequency in Climate Change Science

The authors of this paper (Ann Hillier, Ryan Kelly, & Terrie Klinger, all from the University of Washington) started with the insight from psychology that people understand and remember story-like (narrative) writing better than explanatory (expository) writing. They considered abstracts from 802 scientific papers about climate change and looked for six markers of narrative structure:
1) description of setting (where/when the events took place)
2) narrative perspective (the presence of a narrator)
3) sensory language (appealing to the senses or emotions)
4) conjunctions (used often in narratives to connect ideas logically)
5) connectivity (phrases that create explicit links to something mentioned earlier in the text)
6) appeal (whether the text makes an appeal to the reader or a recommendation for specific action)

The authors crowdsourced this first part of their data analysis: non-scientists who use an online job platform (crowdflower.com) were given the authors’ instructions for analyzing the abstracts. Each abstract was analyzed by 7 independent people, and the analysis relied on human interpretation and discretion, which likely provides a more accurate index of narrativity than any computerized method can at the moment.

The authors treated the number of times each paper had been cited by others as a reflection of how much impact it had on subsequent science. They found that 4 of their 6 narrative indicators (sensory language, conjunctions, connectivity, and appeal to the reader) were related to how frequently articles were cited. In other words, papers higher in narrativity were cited more often than those that were more expository.
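To make that analysis concrete, here’s a minimal sketch in Python, with made-up file and column names (coder_ratings.csv, citations.csv, abstract_id, and so on). It is not the authors’ actual analysis, just an illustration of the basic logic: average the 7 coders’ judgments for each abstract, sum the six markers into a narrativity index, and check whether that index tracks citation counts.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical data: one row per (abstract, coder) judgment.
# Each of the six narrative markers is coded 0 (absent) or 1 (present).
MARKERS = ["setting", "perspective", "sensory", "conjunctions",
           "connectivity", "appeal"]

ratings = pd.read_csv("coder_ratings.csv")    # columns: abstract_id, coder_id, + MARKERS
citations = pd.read_csv("citations.csv")      # columns: abstract_id, citation_count

# Average each marker across the 7 independent coders, then sum the six
# averaged markers into a single 0-6 narrativity index per abstract.
per_abstract = ratings.groupby("abstract_id")[MARKERS].mean()
per_abstract["narrativity"] = per_abstract.sum(axis=1)

merged = per_abstract.join(citations.set_index("abstract_id"))

# A rank correlation is one simple (correlational, not causal) way to ask
# whether more narrative abstracts tend to be cited more often.
rho, p = spearmanr(merged["narrativity"], merged["citation_count"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```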


Subset of Figure 1, showing that as articles increase in narrativity, their citations increase as well.

The more citations a paper receives, the more other researchers will see the work. It’s possible, though, that higher quality work simply lends itself better to a narrative style, so papers high in narrativity would be cited often regardless. Since this study is correlational, we have no way of ruling out that possibility: the best science might be conducive to narrative presentation and would be cited a lot simply because it’s good research. The causal arrow is not clear here, but it is clear that impactful research tends to take on a narrative structure. Even though narrative writing doesn’t necessarily lead to citations, imitating the style of papers that are cited often doesn’t seem like a bad idea.

This work is not the first to suggest that narratives can be helpful for understanding climate change. FrameWorks Institute, a nonprofit organization that designs ways to communicate complex issues and tests their efficacy for cognitive and behavior changes, has a toolkit that uses (visual) narratives to communicate about climate change. (Also note that the toolkit is just the tip of the iceberg for the extensive work FrameWorks has done on communicating climate change.)

Together, the work by FrameWorks and the study of narrativity and citations present a pretty clear takeaway for climate scientists (and likely scientists in many fields): ease off the traditional academic expository style and lean into a more understandable and memorable narrative style.


For an interesting (and more critical) take on this paper, see this post by Randy Olson at scienceneedsstory.com.

Getting a scientific message across means taking human nature into account

I really enjoyed thinking about, researching, and writing this piece for The Conversation, where this work was originally published.

Getting a scientific message across means taking human nature into account

Rose Hendricks, University of California, San Diego

We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. There are too many important issues on which science has reached a consensus but the public has not.

Scientists and the media need to communicate more science and communicate it better. Good communication ensures that scientific progress benefits society, bolsters democracy, weakens the potency of fake news and misinformation and fulfills researchers’ responsibility to engage with the public. Such beliefs have motivated training programs, workshops and a research agenda from the National Academies of Sciences, Engineering, and Medicine on learning more about science communication. A resounding question remains for science communicators: What can we do better?

A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model.”

But in reality, just knowing facts doesn’t necessarily guarantee that one’s opinions and behaviors will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Or they read an online article by a scientist about the necessity of vaccines, but leave comments expressing outrage that doctors are trying to further a pro-vaccine agenda. Convincing people that scientific evidence has merit and should guide behavior may be the greatest science communication challenge, particularly in our “post-truth” era.

Luckily, we know a lot about human psychology – how people perceive, reason and learn about the world – and many lessons from psychology can be applied to science communication endeavors.

Consider human nature

Regardless of your religious affiliation, imagine that you’ve always learned that God created human beings just as we are today. Your parents, teachers and books all told you so. You’ve also noticed throughout your life that science is pretty useful – you especially love heating up a frozen dinner in the microwave while browsing Snapchat on your iPhone.

One day you read that scientists have evidence for human evolution. You feel uncomfortable: Were your parents, teachers and books wrong about where people originally came from? Are these scientists wrong? You experience cognitive dissonance – the uneasiness that results from entertaining two conflicting ideas.

It’s uncomfortable to hold two conflicting ideas at the same time. Man image via www.shutterstock.com.

Psychologist Leon Festinger first articulated the theory of cognitive dissonance in 1957, noting that it’s human nature to be uncomfortable with maintaining two conflicting beliefs at the same time. That discomfort leads us to try to reconcile the competing ideas we come across. Regardless of political leaning, we’re hesitant to accept new information that contradicts our existing worldviews.

One way we subconsciously avoid cognitive dissonance is through confirmation bias – a tendency to seek information that confirms what we already believe and discard information that doesn’t.

This human tendency was first exposed by psychologist Peter Wason in the 1960s in a simple logic experiment. He found that people tend to seek confirmatory information and avoid information that would potentially disprove their beliefs.

The concept of confirmation bias scales up to larger issues, too. For example, psychologists John Cook and Stephan Lewandowsky asked people about their beliefs concerning global warming and then gave them information stating that 97 percent of scientists agree that human activity causes climate change. The researchers measured whether the information about the scientific consensus influenced people’s beliefs about global warming.

Those who initially opposed the idea of human-caused global warming became even less accepting after reading about the scientific consensus on the issue. People who had already believed that human actions cause global warming supported their position even more strongly after learning about the scientific consensus. Presenting these participants with factual information ended up further polarizing their views, strengthening everyone’s resolve in their initial positions. It was a case of confirmation bias at work: New information consistent with prior beliefs strengthened those beliefs; new information conflicting with existing beliefs led people to discredit the message as a way to hold on to their original position.

Just shouting louder isn’t going to help. Megaphone image via www.shutterstock.com.

Overcoming cognitive biases

How can science communicators share their messages in a way that leads people to change their beliefs and actions about important science issues, given our natural cognitive biases?

The first step is to acknowledge that every audience has preexisting beliefs about the world. Expect those beliefs to color the way they receive your message. Anticipate that people will accept information that is consistent with their prior beliefs and discredit information that is not.

Then, focus on framing. No message can contain all the information available on a topic, so any communication will emphasize some aspects while downplaying others. While it’s unhelpful to cherry-pick and present only evidence in your favor – which can backfire anyway – it is helpful to focus on what an audience cares about.

For example, these University of California researchers point out that the idea of climate change causing rising sea levels may not alarm an inland farmer dealing with drought as much as it does someone living on the coast. Referring to the impact our actions today may have for our grandchildren might be more compelling to those who actually have grandchildren than to those who don’t. By anticipating what an audience believes and what’s important to them, communicators can choose more effective frames for their messages – focusing on the most compelling aspects of the issue for their audience and presenting it in a way the audience can identify with.

In addition to the ideas expressed in a frame, the specific words used matter. Psychologists Amos Tversky and Daniel Kahneman first showed that when numerical information is presented in different ways, people think about it differently. Here’s an example from their 1981 study:

Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. If Program B is adopted, there is ⅓ probability that 600 people will be saved, and ⅔ probability that no people will be saved.

Both programs have an expected value of 200 lives saved. But 72 percent of participants chose Program A. We reason about mathematically equivalent options differently when they’re framed differently: Our intuitions are often not consistent with probabilities and other math concepts.
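To spell out the arithmetic behind that equivalence: Program A saves 200 people for certain, while Program B saves 600 people with probability 1/3 and nobody with probability 2/3, for an expected value of (1/3 × 600) + (2/3 × 0) = 200 lives saved. The two programs differ only in certainty, not in expected outcome.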

Metaphors can also act as linguistic frames. Psychologists Paul Thibodeau and Lera Boroditsky found that people who read that crime is a beast proposed different solutions than those who read that crime is a virus – even if they had no memory of reading the metaphor. The metaphors guided people’s reasoning, encouraging them to transfer solutions they’d propose for real beasts (cage them) or viruses (find the source) to dealing with crime (harsher law enforcement or more social programs).

The words we use to package our ideas can drastically influence how people think about those ideas.

What’s next?

We have a lot to learn. Quantitative research on the efficacy of science communication strategies is in its infancy but becoming an increasing priority. As we continue to untangle more about what works and why, it’s important for science communicators to be conscious of the biases they and their audiences bring to their exchanges and the frames they select to share their messages.


Climate change is a big problem and we need to find better ways of talking about it

A team of researchers representing a range of academic departments across most of the schools in the University of California (UC) system recently published a chapter summarizing what we know about efforts to communicate climate disruption and how we can improve on them. It’s full of useful information (especially in the tables, which include things like common climate myths vs. facts and existing communication programs in the UC system). An overarching theme that I’ll focus on is that framing matters.

What’s a frame?

Picture frames often enhance the image inside. Frames can draw attention to the parts of the image that lie inside them and obscure or detract from the parts that lie outside. Linguistic frames do the same thing. The chapter refers to framing as “an effective communication tool for drawing attention to, legitimizing, and providing an interpretive context for abstract, complex, or unfamiliar information” (p. 9). For example, one person might frame a medical procedure by saying that it has a 70% success rate, while another might frame that same procedure as having a 30% failure rate. Although they both reflect the same information, each highlights something different (either success or failure), and psychology research has shown that in many instances, people reason differently when they encounter different frames for the same idea. Truly complex concepts like climate change can’t be communicated without framing, because it’s impossible for a communication to portray everything imaginable that’s known about a topic without highlighting some information and downplaying other information.

The power and ubiquity of framing show us that facts alone are not enough. Frames used to communicate about climate disruption need to be selected conscientiously in order to give people a sense of why they should care about the issue and what they personally can do about it. Climate change can be framed by highlighting the human health issues it creates, the economic gains that can be realized by addressing it, or effects on local versus global levels. Climate change can also be framed using images.

Some images make me think, damn, we need to save the Earth.

Considerations for Frames

There is no one-size-fits-all frame for motivating people to care about and act on climate change. Instead, communicators need to know their audience and anticipate the audience’s reaction to different messages. Tailoring frames for specific audiences becomes even more challenging when audiences are culturally diverse (a very notable point, since the authors are all from California, the most populous and diverse state). But it’s a challenge worth taking up. In the state of CA, for example, a message about rising sea levels may impact someone living on the coast more than someone living inland in an area affected by drought. Anticipating what matters to an audience can help communicators choose the most appropriate frames.

Religion provides an additional opportunity for framing. The major world religions emphasize humans’ responsibility to care for their natural world, and religious leaders have begun explicitly urging their followers to take this message seriously in the context of climate change. Unlike religion, climate change is often associated with political beliefs (almost half of Republicans are skeptical of climate change while just over 10% of Democrats are). In order to get more people to acknowledge the gravity of climate change and the actions we need to take to prevent disaster, communicators should focus on reducing the political divide on the issue, for example having prominent Republican groups and “opinion leaders,” people who have clout in their communities (such as Bible study or PTA leaders), speak about the urgency of addressing global warming.

Economics and business frames are also important to hone. Many people currently see addressing climate change as bringing about job losses, but in reality job prospects in the renewable energy sector are greater than those for traditional energy sources. Communicators need to emphasize these facts and highlight the major companies that are already committed to improving energy practices.

Climate change is one of the most contentious issues nationally (and globally, at least in places where people have even heard of it), and communicating any controversial issue presents challenges (the subject of a chapter in the National Academies’ guide for effective science communication, which I summarized previously). Adequately addressing climate change may involve more scientific innovations, legislation, and a lot of behavior changes… but we won’t get there if we don’t also focus on communicating the gravity of the issue and what can be done about it.

TLDR Guide to Ch 5 of Communicating Science Effectively: A Research Agenda

Each day so far this week, I’ve shared my highlights of the National Academies’ guide and research agenda for communicating science effectively (ch1, ch2, ch3, ch4). Today I’ll cover the final chapter.


Chapter 5: Building the knowledge base for effective science communication

This chapter brings back a number of issues discussed in earlier chapters with a focus on how the science of science communication can continue to be more informative.

Scientific communications often have an underlying assumption that when communication is done well, the public’s understanding of and attitudes about societal issues will be affected. It seems like a reasonable assumption, but it has not been extensively tested, and there are likely many conditions under which the assumption is false. “Good” communication alone won’t suffice for many of science communicators’ goals.

Future steps for science communicators

The report calls for more partnerships between researchers and science communicators to put into practice the lessons revealed by research on science communication. These partnerships will also be important for furthering research on science communication and testing hypotheses about ideal communication practices.


I had never considered the possibility that science communication could be irrelevant to achieving end goals. I think science communicators generally believe that it’s important for their messages to be communicated, and in many cases this is probably true, but it’s worth considering the relative importance of science communication in creating change compared to all the other things that also matter.

Using a systems approach to guide research on science communication

In cognitive science, we’re often drawn to study cognition as part of a larger system. For example, we might not just look at neural activity in order to understand some cognitive process, but instead consider the whole body, environment, and culture in which the cognitive act is situated. This report calls us to think about science communication similarly: every communicative effort is part of a larger system encompassing the content being communicated, its format, the diverse organizations and individuals who make up the communicators and audiences, the channels of communication, and the political and social contexts in which the communication takes place. This kind of holistic perspective takes the system-wide complexity into account instead of focusing on isolated elements, since findings about elements in isolation may not hold in complex, realistic situations. Since research does often need to be specific to be productive, the report suggests that researchers who focus on a single level or element of the system should at least be “acutely aware” of the broader context.

More research

We need more research to inform best practices for communicating science. Some of this research should come in the form of randomized controlled field experiments, which involve comparison conditions (for example, testing whether strategy A is more successful than strategy B) and comparable groups (participants are randomly assigned, so that people who receive strategy A don’t differ systematically from those who receive strategy B except in the strategy they receive).
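As a toy illustration (hypothetical participant counts and strategy names, not something from the report), the random-assignment core of such an experiment can be sketched in a few lines of Python:

```python
import random

def assign_to_conditions(participant_ids, seed=42):
    """Randomly split participants into two groups so that, on average,
    the only systematic difference between them is the strategy they receive."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"strategy_A": ids[:half], "strategy_B": ids[half:]}

# Hypothetical usage: 200 participants; deliver strategy A to one group and
# strategy B to the other, then compare the same outcome measure afterwards.
groups = assign_to_conditions(range(200))
print(len(groups["strategy_A"]), len(groups["strategy_B"]))  # 100 100
```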

The report also calls for more training for researchers at all career levels, both so that the science of science communication can continue to become more rigorous, and also so that all other scientists can improve the way they communicate about their own work.


Seriously, we can all get better. This report is long, but it has a lot of important points for science communicators, which I’ve tried to distill into this series of blog posts. For me, the report provides encouragement: there’s a lot we already know about ways to most effectively communicate science, and there’s a comprehensive agenda for continuing to improve.