Rethinking SciComm’s Spotlight on Jargon

It seems that a resounding theme of every science communication how-to guide is to avoid jargon. Jargon is any language that’s field-specific. On the surface, this “rule” makes total sense — if your audience doesn’t understand the words you’re using, they’re unlikely to understand the ideas you’re communicating. Jargon can definitely be a barrier to effective science communication, but there are two important points that the NO JARGON campaign misses: first, jargon is not always terrible, and second, eliminating jargon is not enough.

Jargon is not all bad

A classic scicomm reminder tells us to never overestimate our audience's knowledge, and never underestimate their intelligence. One way of honoring the first half of that reminder is to avoid terms that require specialized knowledge to understand. Eliminating jargon sounds good.

But the rule to eliminate all jargon violates the second half — to never underestimate an audience's intelligence. Unless you're speaking to an elementary school class or another specific population, your audience can probably handle some jargon. Of course you should define terms that may be new to your audience, but you don't have to cut them out entirely. I didn't attend ComSciCon-Triangle, but I was encouraged to see that speaker Abby Olena also suggested that we don't need to treat jargon as Public Enemy #1 (thanks to Sarah Loftus for recapping the comments in a great blog post).

If you try to replace all jargon with explanations that use less specialized vocabulary, you’re likely to end up with cumbersome and roundabout explanations that lose your audience anyway, and potentially make them feel patronized in the process — we can all tell when someone’s avoiding using a “big word” because they think we can’t handle it.

I am all for cutting unnecessary jargon, or cutting words that actually mean something different in science than they do outside science. The American Geophysical Union’s Sharing Science blog has a great list of these, along with tips for reducing jargon. Let’s continue to cut excessive and unnecessary jargon, but let’s also push back against the advice that proclaims Absolutely No Jargon Anytime.

Jargon is just the tip of the iceberg

Another reason I get frustrated when people fixate on jargon is that reducing it is usually not that hard. It requires care and diligence to anticipate what will be jargon to an audience, but if you have an editor or listener from another field give input, you can usually catch jargon and replace it with more accessible descriptions. It takes time, but it’s rarely too intellectually challenging to remedy.

A sole focus on removing jargon is like “fixing” a hole in the wall by sliding your dresser in front of it. The hole is not fixed; it’s just hidden. Excising jargon eliminates the most glaring signs of inaccessibility, but it doesn’t fix the communication. In many cases, even when jargon is eliminated, the talk or article is still hard to understand. Cutting jargon is the tip of the iceberg.

We need to drastically broaden our acknowledgement of the factors that make information accessible to our audiences. It’s not just the words we use, but the layers of assumptions and experiences that underlie our ideas that can make them hard for people outside our field to understand in the way we want them to.

For example, in scientific research, we place great value on background research — knowing what was already known and what wasn’t before a study began. Background information is almost always described at the beginning of a scientific paper because the scientific community by and large assumes this information is crucial for understanding a given study.

But when communicating outside your discipline, if you assume your audience has the same appreciation for background info, and the same understanding that it sets the stage for what's to come, you might lose them. It's not that they can't understand the background information; they just don't have the scientific experience that tells them why they should care about it. I'm not being revolutionary here — the "inverted pyramid," in which findings are communicated first and background last (the opposite of how scientists communicate to each other), is common practice for journalists. Failing to follow this advice, and instead using an academic frame that assumes your audience expects background information and knows how to integrate it into findings you'll present later, often makes a message inaccessible, whether or not it contains jargon.
[Figure: the inverted pyramid]

Another common problematic assumption in science communication is that the topic is obviously worth studying. If you've been working on a topic for many years, hopefully you consider it an important one. But there's a good chance your audience doesn't yet know why it's important, and this is especially true of basic science. Instead of mentioning at the end (again, as we do in scientific papers) why the topic is worthy of study, frame your whole communication around the idea that this topic is important, and here's why.

To sum up…

I’m not advocating for jargon-filled science communications. When there’s a simpler way to say something, we should always do that. When jargony terms aren’t necessary for understanding the ideas, we should remove them. And when a term means something different in a scientific and a lay context, we should avoid it. But sometimes jargon is necessary, and using it sparingly can result in a clearer communication than banishing it altogether.

I think the science communication community should stop fixating on jargon, partly because it's not always evil, but also because it's not the only evil. The assumptions that underlie our messages can be just as confusing, and just as ineffective, as the incomprehensible words within them. We need to address the factors, like underlying assumptions, that threaten the accessibility of our science communications, even though they're more challenging to fix than swapping out words and phrases. Sure, let's reduce jargon, but in doing so, let's not lose sight of the forest for the trees.

Photo by Annie Spratt on Unsplash

Scientists agree on climate change: How should we communicate that?

Scientists agree: humans are causing climate change, and if we don’t drastically change our behavior, there will be catastrophic consequences.

The Consensus Handbook, a recent publication by communication researchers John Cook, Sander van der Linden, Edward Maibach, and Stephen Lewandowsky, provides a clear and concise compilation of research on communicating scientists' consensus on climate change. Here are some of my highlights from the report*.

First, what percentage of scientists agree? There are a number of ways to measure consensus — examining published research, surveying scientists, or studying public statements made by scientists, for example. Different researchers have studied this question in a variety of ways, but each result has suggested that 91–100% of climate scientists agree that human-caused climate change is occurring. The majority of these studies converge on the estimate that 97% of scientists agree, which is why many of the studies on the effects of consensus messaging use that number. Regardless, agreement is high. Climate scientist Katharine Hayhoe addresses this consensus in a great video on her channel Global Weirding.
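As a toy illustration of how those studies converge, here's a sketch with placeholder numbers. The study names and exact values are invented; only the 91–100% range and the roughly 97% convergence come from the handbook.

```python
from statistics import median

# Placeholder estimates standing in for the published consensus studies
# the handbook summarizes (real estimates fall between 91% and 100%).
estimates = {
    "study_a": 91.0,
    "study_b": 97.0,
    "study_c": 97.1,
    "study_d": 97.5,
    "study_e": 100.0,
}

values = sorted(estimates.values())
low, high = values[0], values[-1]
mid = median(values)
print(f"range: {low:.0f}-{high:.0f}%, median: {mid:.1f}%")
```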


Does the public realize how high scientific consensus is? No.


Why is there a gap between public perception of scientific agreement and actual scientific agreement? There are two primary culprits. The first the authors refer to as a "cultural bias": on average, people who are more conservative report lower consensus than those farther to the left. The report doesn't delve into much detail on the role of ideological worldview in shaping how people think about climate change, but work by Dan Kahan (which I've written about here) is one resource for learning more.

The second — and larger — cause of the perception gap is a combination of a lack of information and misinformation. Misinformation campaigns have been relatively successful at confusing the public about scientific consensus on climate change. The most notable is probably the Global Warming Petition Project, in which “people” (some of whom are not real people and many of whom are not scientists) have signed a petition urging the US government to reject global warming agreements.

Adding fuel to the misinformation fire, the media often presents contrarians' opinions and climate scientists' opinions in comparable ways, suggesting a balance and implying that climate change is still a matter of debate among scientists.


Why is it important for the public to know the true consensus on climate change? Research has shown that it’s a gateway belief:

what people think about expert agreement influences a range of other key climate attitudes, including whether global warming is real, caused by humans, resulting in serious impacts and importantly, whether we should act to solve it.

Since communicating consensus is also helpful for encouraging people to embrace other crucial beliefs held by climate scientists, the authors comment that “the 97% consensus offers a lot of bang for one’s communication buck.”

Given the importance of understanding scientific consensus, how should we communicate about it? The handbook offers a number of evidence-driven suggestions:

  • Use the number (97%). This is more effective than a description of the consensus as “an overwhelming majority” for convincing people of the reality of the consensus.
  • Consider a pie chart to show consensus. A study led by van der Linden (which I’ve written about previously) showed that the pie chart was more convincing than a simple description or analogy.
  • Encourage people to estimate consensus first. Revealing the consensus after people have estimated it has been shown to be more influential than simply revealing the same information.
  • Inoculate against misinformation (I’ve also written about this strategy). Research shows that people can encounter misinformation about the consensus and still come away with favorable climate attitudes if they’ve been warned about tactics that contrarians often use before they encounter them.
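The handbook's pie-chart suggestion above can be sketched in a few lines of matplotlib. The 97/3 split is the figure discussed in this post; the label wording, colors, and output filename are my own choices, not the handbook's.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# The 97/3 consensus split; labels are illustrative wording.
sizes = [97, 3]
labels = ["Climate scientists who agree\nhumans cause global warming (97%)",
          "Other (3%)"]

fig, ax = plt.subplots()
ax.pie(sizes, labels=labels, colors=["#2b8cbe", "#cccccc"], startangle=90)
ax.set_title("Scientific consensus on human-caused climate change")
fig.savefig("consensus_pie.png")
```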

These are all promising tactics for communicating the climate change consensus, but amid these nuanced strategies, we should also not lose sight of the golden rule:

  • People need to encounter straightforward and clear messages that are repeated often and from a range of sources.

*All are figures from Cook, J., van der Linden, S., Maibach, E., & Lewandowsky, S. (2018). The Consensus Handbook. DOI:10.13021/G8MM6P. Available at http://www.climatechangecommunication.org/all/consensus-handbook/

Cover image from NASA: https://www.nasa.gov/topics/earth/images/index.html

Completing a PhD: What worked for me

I defended my PhD on a Wednesday in November. One week later, I boarded a flight to Washington, D.C. from San Diego. I had sold almost all my furniture, donated books and clothing, and packed the rest of my belongings in cardboard boxes that my husband helped me schlep over to USPS. The day before my flight, I filed for my PhD. I had forms signed by each member of my committee, and held my breath as the administrator flipped through my dissertation to ensure that every margin, header, and sub-section met regulations.

I arrived in DC 10 days before starting a new job. In those days I attended a conference, unpacked the few belongings I had shipped, made multiple trips to Target to supplement, and celebrated Thanksgiving with family.

When I started work, I was intellectually overwhelmed by the social and scientific issues I would be working on, the research methods I would hone, and my brilliant colleagues.

With this rapid and major transition, I didn’t have much mental energy to devote to reflecting on my PhD process immediately following my defense. But it’s been a few months, and I have some of that energy now.

I defended my PhD just over four years after I started grad school. For the US, and for someone who started just months after earning a Bachelor's degree, that was pretty quick. But I didn't start grad school with the goal of finishing quickly; the process of earning a PhD is so much more important than the end point that racing to finish will, in many cases, seriously detract from the quality of someone's research and their experience along the way. Finishing quickly is not why I feel my PhD experience was "successful." I am proud, however, that my research was high-quality, that I had a positive experience in grad school (overall, I loved it!), and that after four years I had enough of a sense of what I wanted intellectually and professionally that it made sense to finish.

Here’s what worked for me.

 

Granular Planning: For Academic Expedience

A PhD is a multi-year project, so there's no way around planning. It's natural to break the massive project down into smaller ones, and maybe break those down further, to generate a timeline, and I certainly did this (again, recalibrating often). But my work plans were more granular than that. I often set goals for the week and for each day, and then scheduled the specific time that I would do each task (usually scheduling specifics about two days in advance). I was also conscientious about the time of day I scheduled different types of work. For me, mornings are great for deep work, like challenging statistical analyses and writing, so those tasks were scheduled for mornings. Whenever possible, afternoons were reserved for meetings and reading.

Probably not surprisingly, my weeks almost never went exactly as they were initially scheduled. Some tasks took longer than I had anticipated, and sometimes things just came up and plans were derailed. Luckily, Google Calendar is forgiving: it lets you drag, extend, and move entries, which makes it easy to stay flexible.


But there was always a default plan for how I’d use time, and that was huge. I never sat down at my desk and wondered what I should work on. Even when I had short gaps between meetings or classes, I had deliberately decided what I’d spend that time doing in advance. Without that default plan, I’d inevitably start mindlessly checking email, Facebook, and Twitter and going down rabbit holes until the next commitment.


Introspection: For Charting and Changing Course

I’m introspective by nature — constantly asking myself, What do I like about my situation? What do I not like? What are my personal and professional values and priorities, and how do they fit into my current situation? Sometimes I wrote my responses down. Other times I just talked about them with people I’m close to, or reflected while commuting or jogging.

It’s been invaluable to continually recalibrate my actions and goals when I realize my current situation doesn’t line up with my values and priorities. What I wanted last year might not be what I want today, and my actions tomorrow should reflect that acknowledgment.

For me, quality introspection requires down time. I can’t reflect on how well my daily life aligns with my broader ideals if I have no break from that daily life, if I’m constantly working. I’ve made space for hobbies like crafting (crochet, knit, and greeting cards) and exercising (training for my first half marathon in my first year of grad school did wonders for my mental health and introspection).

Non-Research Research: For Ideas and Opportunities

Reflection can only get you so far when it comes to figuring out what you want to do after earning a PhD. You also need to gather ideas to give you something to reflect on and seek out opportunities that will make it possible to achieve your goals. PhD students hone their critical thinking and information-finding skills, which can be applied to “non-research research” — idea- and opportunity-seeking outside your academic research.

There are many ways to do this non-research research, so individuals can find what's best for them. For me, Twitter was a huge conduit. I followed accounts related to my interests (psychology and cognitive science, language and linguistics, science communication), my location (university and city), and people I came across in real life or online who intrigued me. I followed the digital magazine Aeon, for example, and one day stumbled upon an article by Michael Erard on his work as a "metaphor designer," which put FrameWorks, a communications think tank, on my radar. Today, I work there.

Twitter’s use of hashtags makes it easy to discover more accounts to follow and to find specific content. For example, I learned about ComSciCon, the communicating science workshop for graduate students, by browsing #scicomm. I’ve written about ComSciCon numerous times, so for now I’ll just note that my involvement in this group has been incredibly influential for the path I’ve taken and where I am today.

To sum up…

It’s important to take an active role in your PhD progress and your post-PhD prospects. My own PhD “success” is largely thanks to consistent planning, introspection, and curiosity.


Photo by João Silas on Unsplash

Framing your SciComm Message

When you’re communicating, whether about the frustration of finding facial hair stubble in your bathroom sink or the importance of addressing climate change, it’s useful to think not only about the idea you want to get across, but also how you want to get it across. Which words do you want to use, or which ones do you want to avoid, for fear that they’ll make your spouse or conversation partner feel defensive or closed-minded? How do you want to bring up this topic? What other situations do you want to compare it to?

We’re accustomed to framing our everyday conversations carefully in order to maximize the chances of a desired outcome, like a clean bathroom sink, and minimizing the chances of an undesired one, like offending. We need to use this same meta-cognitive strategy — framing — when we communicate all science, and especially when communicating science that some audiences may want to resist.

I’ve created this handout to give an overview of framing in the context of published research. What does research tell us about how we should communicate issues like the importance of vaccinations or addressing climate change? The handout includes takeaways from each of the topics to help science communicators apply research on the science of scicomm.

Click here to download the pdf.

What other strategies would you like to learn more about? I’m brainstorming my upcoming handouts and would love to hear from readers about topics from the science of science communication that would be most helpful.

I’m biased, and so are you. Considerations for SciComm

I really like being right. Chances are you do too, because we humans are psychologically inclined to seek out evidence that suggests we’re right. We tend to interpret neutral information in our favor and contrary information as flawed. These related tendencies are often referred to as confirmation bias.

Confirmation bias is inevitable, and it colors how every one of us sees the world. If I’ve just received an email from a student asking me to bump their grade up so they can get into med school, I might start grumbling to myself about my lazy students. Then as I’m ruminating on lazy students, I might interpret the next student’s well-intentioned question as a manipulative attempt to score higher on the exam. In other words, I may interpret this latter interaction as confirming my feeling that arose from the prior one — the students’ laziness — even if the second student wasn’t lazy or manipulative at all.

Confirmation bias can, in part, explain why there are still way too many parents who don’t have their children vaccinated. Once they believe that vaccines might be harmful for their children, they seek evidence to confirm that belief — for example, clinging to the very small percentage of people who do have adverse reactions to vaccines. Even an overwhelming amount of data demonstrating the benefits of vaccines and the fact that the vaccines-and-autism rumors started from completely fraudulent “science” will not persuade this person. They’ve chosen which evidence to believe and which to discard, even if they don’t necessarily see it as a conscious choice.

An audience’s confirmation bias can be extremely frustrating for science communicators. It can make it feel like communication attempts are futile, since some members will already have their mind made up, and will interpret new information through the lens of their current belief.

But successful science communication is not just a process of information transmission. The idea that the public simply hasn't received enough science information, and that with more of it they'll hold more pro-science beliefs and make more pro-science decisions, is incredibly misguided. Confirmation bias illustrates why heaping information on people will not change minds if they have contrary beliefs they seek to confirm. I've written about this before, and so have many other great writers.

We need to meet our audience where they’re at: LISTEN, recognize their concerns, find common ground, and empathize with them. Start there, and then share your message.

For science communicators: Your audience is going to have cognitive biases. It's important not to let your awareness of their biases color how you think of the people you're communicating with. In fact, if you start to characterize your audience as stubborn or irrational because their biases act as obstacles to accepting the science you want to share, you are falling prey to yet another cognitive bias — the fundamental attribution error, or correspondence bias. This bias plays out when we attribute someone else's behavior to their personality (for example, "they're not understanding my science because they're irrational") more readily than we attribute our own behavior to our personality.

Remember, you, too, have cognitive biases. Although at times those biases might drive you to make stubborn or irrational conclusions, you probably don’t think of yourself as a stubborn or irrational person. Instead, you might recognize that your unique background and your current circumstances have led you to make biased decisions. Acting stubborn in a certain context does not necessarily make you a stubborn person. We must remember this is true, even when we’re communicating with seemingly stubborn people.

So when you’re communicating, recognize that your audience has cognitive biases — this part, I think, is not too hard. What’s more difficult is to also recognize that you have cognitive biases. No real communication can happen until you do this — until you acknowledge that your audience is comprised of human beings, all of whom have wonderfully complex cognitive baggage, just like you.

Vaccinating, metaphorically and literally

There’s a lot of bad (either misleading or blatantly false) science information on the Internet. Science communicators often try to combat the bad content by dumping as much accurate information as they can into the world, but that strategy is not as effective as many would hope. One reason it’s not effective is that social circles on the Internet are echo chambers: people tend to be follow like-minded others. Scientists and science communicators follow each other, and skeptics follow each other, so we rarely even hear what others outside our circle are talking about. Plus, when we do encounter evidence that contradicts our beliefs, we tend to discount it and keep believing what we already did.

A recent study by Sander van der Linden, Anthony Leiserowitz, Seth Rosenthal, and Edward Maibach (which I recently wrote about) offers a glimmer of hope for escaping this science communication trap: communicators may be able to "vaccinate" their audiences against misinformation. They found that if people are cued in to the kinds of tactics that opponents of climate science deploy, they're less likely to be swayed by those tactics. This finding offers some hope at a time when the proliferation of fake and misleading science information seems inevitable. Scientific facts, along with a heads-up about anti-scientific strategies, can help people better evaluate the information they receive and form evidence-based beliefs and decisions.

Does this apply to other scientific issues? Can we vaccinate against anti-vaccination rhetoric?

I don’t know. But I’d like to find out. In order to design a communication that alerts people about anti-vaccine messages they might encounter, it’s important to understand anti-vaccine tactics. I explored some very passionate corners of the Internet (videos, discussion threads, and blog posts by anti-vaccine proponents) for a better understanding. Here are the anti-vaccine tactics I found, a lot of which are described in this SciShow video:

Ethos: Appeal to Authority

First, note that this immunologist isn’t explicitly saying that children shouldn’t be vaccinated. But the quote implies so much. I don’t know if that’s actually her belief, but regardless, as a consumer of this image, I do get the sense that she looks pretty smart (#educated, in fact), and maybe she knows what she’s talking about…

Jargon

[Image: an anti-vaccine ad]

There are four chemical names in the first five lines of the ad above. It sounds like whoever wrote it must really know their science. The message implies that the author has deep scientific knowledge about the chemicals mentioned and wants to warn you of their presence in vaccines. Paired with our society's tendency to believe that all things "natural" are good and all things "chemical" are enemies, this jargon-wielding author might come across as someone worth listening to. Most of us (and I am definitely included here) know little or nothing about those chemicals — how do they work? Are they actually dangerous in the doses found in vaccines? This jargon paves the way for persuasion via the naturalistic fallacy — the assumption that natural things are always better than non-natural things.

Logos: Appeal to “Logic”

Logical fallacies

A logical fallacy is faulty logic disguised as real logic, and it's another common tactic of anti-vaccine proponents. In the Tweet above, the author presents two facts and implies that they're connected (that an increase in mandatory vaccines caused the US to fall from 20th to 37th in the worldwide ranking of infant mortality rates). But just because America "lost ground" in this ranking doesn't mean our mortality rate went up — it's likely that many other nations' rates went down. Plus, there are many factors beyond the number of mandatory vaccines that influence infant mortality, and the Tweet supplies no evidence that vaccines and mortality are related. They're just two pieces of information, placed next to each other to give the sense of a causal relationship.
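The ranking point is easy to check with a toy calculation (all numbers below are invented): a country's own mortality rate can improve while its rank worsens, so a falling rank by itself says nothing about whether mortality rose.

```python
# Invented figures: every country's infant mortality rate improves,
# but the US slips in rank because others improve faster.
rates_then = {"US": 6.0, "B": 7.0, "C": 8.0}   # deaths per 1,000 births
rates_now  = {"US": 4.0, "B": 3.0, "C": 3.5}   # everyone improved

def rank(rates, country):
    # 1 = lowest (best) mortality rate
    return sorted(rates, key=rates.get).index(country) + 1

print(f"US rate: {rates_then['US']} -> {rates_now['US']} (improved)")
print(f"US rank: {rank(rates_then, 'US')} -> {rank(rates_now, 'US')} (worsened)")
```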

There are lots of ways logic can be distorted to suggest that vaccines are bad. One that really stands out to me is the suggestion that if vaccines work, why should we care if some children are not vaccinated? After all, they’ll be the ones who get sick… why does it concern the rest of us?

It does concern us. For one, no child should end up with a paralyzing or fatal disease because their parent chose to disregard scientific consensus. And one person's choice not to vaccinate directly affects others — for example, people who CAN'T be vaccinated for health reasons. If everyone else receives vaccines, the person who cannot is kept safe by "community immunity." But if others stop receiving those vaccines, the person who had no choice but to remain unvaccinated is susceptible, unjustly in danger as a result of others' choices.
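The community-immunity logic above can be made concrete with the standard herd-immunity threshold from epidemiology, 1 − 1/R0, where R0 is the average number of people one infected person infects in a fully susceptible population. The R0 values below are rough textbook ballparks, not precise figures.

```python
def herd_immunity_threshold(r0: float) -> float:
    # Fraction of the population that must be immune so that each case
    # infects, on average, fewer than one susceptible person.
    return 1 - 1 / r0

# Illustrative R0 ballparks (approximate, for the sake of the example).
for disease, r0 in [("measles", 15.0), ("polio", 6.0), ("flu", 1.5)]:
    print(f"{disease}: ~{herd_immunity_threshold(r0):.0%} of people must be immune")
```

The punchline: the more contagious the disease, the closer to universal vaccination we need, which is why a few unvaccinated children can break community immunity for everyone who can't be vaccinated.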

Pathos: Appeal to Emotion

Fear

Fear is a powerful motivator. Appeals to ethos and logos can work together to have an emotional effect. Parents just want to do their best for their kids, so messages that strike up fears about the harms of vaccines have a good chance of swaying them.

One way of drumming up fear is to portray vaccine proponents as bullies, as this article demonstrates:

[Image: article excerpt]

Yeah, that description sounds pretty scary to me… (Source: Breakdown Radio.)

Considerations for inoculation messages

Of course, I’ve just scratched the surface with these tactics that anti-vaccine proponents use (you can get an idea of some others in a post on how the anti-vax movement uses psychology to endanger us by Dr. Doom) Messages that vaccinate against misconceptions have to walk an extremely fine line. The goal of such a message is to foreshadow misleading messages a person may encounter, and to point out the reasons that message should be reconsidered.

Inoculation messages might be useful when they introduce new information, but they also need to be proactive, anticipating anti-vaccine rhetoric and alerting people to its flaws. There are a few dangers in doing so, though. For one, it often requires repeating the misconception, and research shows that doing so can backfire and reinforce the inaccuracy instead. In addition, pointing out flaws in an argument that someone might be prone to believing can alienate that person. If the warning message isn't constructed conscientiously (for example, if it suggests that seeing through the misleading information is a no-brainer), it can imply that anyone who might believe the misconception is an idiot. A message like this will make some members of the audience feel defensive (wow, am I an idiot? No, I can't be an idiot. Maybe the author of this message is the idiot…).

That doesn’t mean that inoculation messages can’t be effective. We have some evidence to suggest they can, and I think there’s a lot of room to continue honing this strategy. The first step in a successful inoculation message is to uncover the tactics used by those who misrepresent the science. Then it’s important to raise awareness of those tactics without alienating the audience and while being careful not to repeat the misinformation in a way that can be construed as reinforcing it.

Communicators can keep in mind that anti-vaccine messages often attempt to establish authority, tap into emotions, and apply misleading logic in order to convince people of their message. By anticipating these strategies, we can have greater success in counteracting them and promoting vaccines as the life-saving technologies they are.

More information

 

The Pope’s #scicomm: Effects of Laudato si’ on beliefs about climate change

Climate change is an extremely polarized issue: while many people accept the scientific evidence that human-caused climate change is damaging the planet and our health, many others adamantly maintain that it is not a problem. Figuring out how to communicate the gravity of climate change has been an urgent puzzle for climate scientists and communicators (a topic I've written quite a bit about).

Collectively, we’re trying many different ways of communicating this issue. I especially love these videos by climate scientist Katharine Hayhoe and others by researcher M. Sanjayan with the University of California and Vox. Pope Francis also contributes to the scicomm effort — in 2015 he published an encyclical called Laudato si’: On Care for Our Common Home, which called for global action toward climate change (he also gave a copy of this encyclical to Donald Trump recently when the two met).

Was Laudato si’ effective?

Did the document influence beliefs about the seriousness of climate change and its effects on the poor? Recent research by Asheley Landrum and colleagues took up this question.

The work is based on survey results from Americans — the same people reported their beliefs about climate change before and after the encyclical came out.

They found that the encyclical did not directly affect people’s beliefs about the seriousness of climate change or its disproportionate effects on the poor.

But… the encyclical did affect people’s views of the pope’s credibility on climate change, encouraging them to see him as more of an authority after the document was published than before. This was especially true for liberals, though, reflecting a sort of echo chamber effect: people who already found climate change to be an issue gave the pope more credit for his stances on climate change after he published the encyclical.

Importantly, these altered views of the pope’s credibility did in turn affect how much people agreed with the pope’s message on climate change. In other words, there wasn’t a direct effect from the publication of the encyclical to agreement with its message; instead, there was first an effect of the document on beliefs about the pope’s credibility, and then an effect of those credibility assessments on agreement with the pope’s message.


This work reminds us that science communication efforts can’t be considered in isolation. Whether people agree with a message is influenced by factors like their political beliefs and the credibility of the source. This point suggests two directions for future scicomm. First, communicators should consider their message and audience holistically: what factors are likely to shape an audience’s receptiveness to a message, and how can those factors be influenced? Second, we need more research on the science of science communication. We need to continue working to understand how people perceive scientific issues and communicators, and how they respond to the scicomm they encounter.


Featured Image: Korea.net / Korean Culture and Information Service (Jeon Han)

Hurdles to Communicating Science & Strategies to Overcome them

Communicating science is hard in part because doing and understanding science is hard, but there are also some unique hurdles that science communicators face — especially when communicating information that’s relevant for policies. James Druckman recently described some of the challenges that particularly face people communicating policy-relevant science, and ways those challenges can be minimized.

Value-laden diversity

We all have values, relatively unchanging beliefs that reflect the way we see the world. For example, some people are more “individualist,” while others are more “communitarian.” If scientific information seems to contradict a value, we’ll be hesitant to accept that information. Information about climate change, especially if it contains or implies suggestions for reducing the problem by increasing regulations on businesses, might contradict an individualist’s values, making it hard for that person to even consider the scientific information. For more on this hurdle, see Dan Kahan’s work on The Science of Science communication, an earlier post on this blog about Kahan’s work, or a great post by Chris Mooney on Mother Jones.

What to do about it

First, communicators have to recognize that their audience will have a diverse set of values, some of which will conflict with the communicator’s values, and that these values will influence the way people receive scientific information.

Next, communicators should minimize the extent to which their message carries a value commentary. In other words, they should show how the relevant science bears on certain policy decisions without prescribing which decisions are “good” or “competent.”

Hurdles start. by robert voors. CC BY-NC-ND.

Motivated Reasoning

Motivated reasoning (or confirmation bias) is our drive to seek information that reinforces our prior beliefs and to disregard information that does not. For example, in work by Druckman & Bolsen (2011), participants initially indicated their support for genetically modified (GM) foods. After 10 days, all participants received three types of information: positive information about how GM foods combat diseases, negative information about their possible long-term health consequences, and neutral information about their economic consequences.

People who initially supported GM foods dismissed the negative information, rated the positive information as valid, and perceived the neutral information as indicating benefits of GM foods. People who initially opposed GM foods did the exact opposite: they dismissed the positive information, considered the negative information valid, and interpreted the neutral information as indicating drawbacks of GM foods. Work on motivated reasoning shows that we interpret information through a lens laden with our prior beliefs.

Last of the crop. By Mrs eNil. CC BY-ND-NC.

There have been lots of great articles highlighting motivated reasoning lately. These include Why Facts don’t Change our Minds, This Article won’t Change your Mind, and Why You Think You’re Right, Even When You’re Wrong.

What to do about it

When motivated reasoning occurs, people are motivated to understand information in a way that aligns with their previous beliefs. Instead, science communicators want to motivate their audience to understand new information in a way that will lead to maximum accuracy. There are a few things communicators can do to encourage people to seek accurate understandings:

  • Show that the issue and information matter for the individual’s life. Show relevance.
  • Present information from a variety of sources, preferably ones with different goals (e.g., from both Democratic and Republican sources).
  • Encourage people to explain their position to others (or at least to prepare to explain it). Elaborating on a position requires people to think it through more carefully and to provide explicit evidence for their claims that goes beyond “because I want to believe this.”

Politicization

This term does not mean what we might expect given its name. Politicization is the exploitation of “the inevitable uncertainties about aspects of science to cast doubt on the science overall…thereby magnifying doubts in the public mind” (Steketee 2010, p. 2). It’s not exactly misinformation, since it doesn’t introduce false findings; instead, it magnifies doubt. It’s especially common in debates about global warming and vaccination. People who politicize these issues send the message that the scientific evidence is not as conclusive as it’s been made out to be.

What to do about it

Politicization feeds directly on the previous two hurdles: people perceiving scientists or other informants as lacking credibility, and people being motivated to reason in ways consistent with their prior beliefs. Thus, politicization can be countered by addressing those hurdles, establishing credibility and encouraging an accuracy motivation. There are a couple of other things we can do to overcome this hurdle:

  • Warn people of politicization they’re likely to encounter before they encounter it. This is sometimes referred to as an inoculation message; it points out the strategies politicizers use and why their message is not to be trusted.
  • Correct politicized messages after people have encountered them. Corrections are often not as effective as inoculation messages since people may have already had time to process and begin to believe the politicized message. However, corrections can be effective when people are motivated to reach an accurate understanding.

There’s more

Of course, these aren’t the only hurdles to communicating policy-relevant science. Other hurdles described by Druckman that I haven’t elaborated on include the effort that communicating demands of scientists, getting and maintaining attention, establishing credibility, and overcoming government inaction.

More and more scientists are recognizing the value of communicating their science outside the Ivory Tower. At the same time, the science of science communication is advancing to help us all understand the hurdles we face and how to best overcome them.

Scientists Agree on Climate Change: A Gateway Belief

https://climate.nasa.gov/scientific-consensus/

It doesn’t get much clearer. The Earth’s climate is warming. Humans are the reason. But how many people are actually aware of the scientific consensus on this issue?

Research by Sander van der Linden and colleagues shows that when people believe that scientists overwhelmingly agree about climate change, they increase their a) own beliefs in climate change and b) beliefs that humans are responsible. They feel c) more worried about climate change, and as a result of a, b, and c, they support public action to mitigate the effects of climate change.


At the beginning of the study, participants indicated the percentage of scientists they thought agree on global warming, and they answered some questions about their own climate change beliefs. People then received a message about scientific consensus, which took the form of a) a simple description, b) a pie chart, or c) a metaphorical comparison to trusting engineers’ consensus about bridges (if 97% of engineers agreed that a bridge was unsafe, would you use it?) or doctors’ consensus about illness. All the messages included the statement that “97% of climate scientists have concluded that human-caused climate change is happening.”

Then participants again indicated what percentage of scientists they thought agree on global warming and answered questions about their own beliefs. All the messages “worked,” in the sense that people perceived greater scientific agreement after reading that 97% of scientists agree than if they hadn’t read anything about the consensus at all. The simple description and the pie chart were more effective than the metaphor, though: people shifted their climate change beliefs more after encountering one of the more straightforward messages. That’s great food for thought, as many science communicators insert metaphors wherever they can.

Of course, having people believe that there’s strong scientific consensus about climate is only one step toward the larger goal of having them endorse actions that mitigate the effects of climate change. But in follow-up analyses, the researchers identified that perceiving scientific agreement is a gateway belief: believing that scientists agree about global warming led to other beliefs, ones that get us closer to the goal of actions in favor of mitigating climate change. Specifically, it led to greater belief that climate change was real, human-caused, and worrisome. These beliefs, in turn, led to greater support for public action against climate change. It’s often hard to know what leads to what, especially when it comes to beliefs we keep hidden in our own heads, but with some semi-fancy math, these researchers quantified those relationships.
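The “semi-fancy math” behind a gateway-belief claim like this is typically mediation analysis: compare the effect of the message on support for action with and without controlling for the intermediate belief. The part of the effect that disappears when you hold the mediator fixed is the part that flows *through* it. As an illustration only (with simulated data and made-up effect sizes, not the study’s actual numbers), here’s a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# x: whether a person saw the 97%-consensus message (randomized 0/1)
x = rng.integers(0, 2, n).astype(float)

# m: perceived scientific agreement (%), shifted upward by the message
m = 60 + 15 * x + rng.normal(0, 10, n)

# y: support for public action, driven mostly by perceived agreement,
#    with only a small direct effect of the message itself
y = 0.5 * m + 1.0 * x + rng.normal(0, 5, n)

def ols(y, *cols):
    """Least-squares fit with an intercept; returns the coefficients."""
    X = np.column_stack([np.ones(len(y)), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

total = ols(y, x)[1]       # total effect of the message on support
direct = ols(y, x, m)[1]   # direct effect, holding perceived consensus fixed
indirect = total - direct  # the part mediated by perceived consensus

print(f"total={total:.2f} direct={direct:.2f} indirect={indirect:.2f}")
```

In this simulation most of the message’s effect is indirect, mirroring the paper’s pattern: the consensus message moves perceived agreement, and perceived agreement moves support. (The published analysis uses a full structural equation model rather than this two-regression shortcut.)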

Climate 365 by NASA Goddard Space Flight Center. CC BY.

These studies have some clear takeaways for science communicators, especially for communicating about climate change (these ideas may apply to other topics too, but we need more research):

  • Emphasize scientific consensus, that an overwhelming percentage of scientists agree that climate change is a real problem caused by human activity.
  • Don’t worry so much about immediately pushing for public action against climate change. When people understand that scientists agree, they come to agree themselves that climate change is a problem that should be addressed, and THEN they come to support public action. Be careful about skipping steps.

At the same time, there’s not only one right way to communicate about climate change. There are truly effective ways, ineffective and potentially backfiring ways, and many in between. There aren’t cut-and-dried rules, because every audience is unique, and taking the audience into account (their beliefs, values, and past experiences, for example) is crucial. But this work sheds light on communication strategies that probably sit far toward the “truly effective” end of the ways-to-communicate-climate-change continuum.