In a couple of weeks, crowds will march nationwide in the March for Science. It's part of a recent trend pushing back against "anti-science" attitudes that have become all the more prominent since last November's elections. However, many at this march may greatly misunderstand the "anti-science" behavior they will be marching against. This misunderstanding may be misguiding us in our attempts to make science actually matter.

People's relationship with science is much more complex and nuanced than "pro-science" or "anti-science." We need to correct some of the misconceptions we have and show that what is often labeled as "anti-science" or "science denial" is often better understood as isolated incidents of motivated bias. In general, trust in science is much higher than we often realize, in part because it includes a lot of people we might often consider "anti-science."

For instance, a survey of "science lovers" would include people like Sally, an electrical engineer who loves Mythbusters but is skeptical of climate change, and Sam, an environmental advocate who loves science nights at Griffith Observatory but is also part of an anti-vaccinations initiative. Both of these people hold incorrect views about specific scientific topics, but both also self-identify as science lovers. When we label people like Sally and Sam as "anti-science," we misunderstand much about their thought processes and risk pushing them away.

Today, as the danger of so-called anti-science attitudes rises, we need to be more scientific about what we call science denial and ease our trigger finger off the damning label "anti-science." When we do this, we can see that complete denial of the value of empirical science—true anti-science attitudes—is less common than we realize, but other elements that may hinder scientific progress are more common than we'd think.

Seeing at least some of the denial of scientific facts in this light helps us better understand the mindsets of people like Sally and Sam who oppose the scientific consensus on specific issues like climate change and vaccinations, but generally support science. When it comes to "anti-science," there are some true deceivers, and some truly anti-science people, but mostly it involves people with complicated views and biases. We suggest that those who truly love science will see why a better understanding of these complications is necessary to make science matter in this so-called "anti-science" era.

Misconception: People in general distrust scientists.
Reality: There is a deep respect for scientists and the scientific process.

Polling suggests that most Americans respect science and see it as a positive force in society. A 2015 Pew Research Center poll found that 79 percent of respondents say science has "made life easier." These findings held true for specific areas such as health care (79 percent), food (61 percent) and the environment (62 percent). Even on vaccinations, one poll found that 79 percent of respondents believe scientists should play a role in creating policy and 88 percent of participants believe the benefits of vaccination outweigh the risks.

Now, this level of support is down slightly from 2009, when 83 percent of Americans said science makes life easier. Likewise, skepticism remains about scientific expertise on specific issues. Around a third of Americans are not convinced that scientists understand climate change and GMOs. This is discouraging, but the overall picture suggests that science is still held in relatively high esteem by Americans.

Misconception: People have no respect for the use of scientific findings in arguments.
Reality: People often use what they believe to be credible scientific findings to argue against actual, credible scientific findings.

Many people turn to science—or what they believe to be science—to support their viewpoints. Climate change deniers, for instance, frequently cite climate skeptics to argue that the science isn't settled or that it actually disproves human-caused climate change. People who oppose vaccinations look to the discredited work of Andrew Wakefield to make their case. Although these perspectives do not represent the scientific consensus on the issues, they can be persuasive. This is especially true for someone who is already predisposed to reject the established scientific view, since they have both the incentive and opportunity to latch on to discredited voices in the name of "supporting science."

This biased analysis of science is due to the now well-known psychological phenomenon of motivated reasoning. Research suggests that all people tend to seek out information that confirms (or at least does not challenge) the conclusions they want to draw on a given topic. In other words, we will work to discredit or avoid information that might require us to reconsider our pre-existing beliefs. Motivated reasoning is particularly likely when taking the other side might create conflict within our social circle—think making a break from your political party.

The way the media covers scientific issues deemed to be controversial can compound opportunities for people to engage in motivated reasoning. Scholars studying climate change communications have long documented that exposing people to a climate scientist and a climate denier side-by-side reduces their confidence in the scientific consensus that climate change is real and caused by humans. But this phenomenon may extend beyond climate change to our opinions of expertise on many issues. One study by psychologist Derek Koehler suggests that hearing conflicting reports from experts, even when experts are overall in agreement, can cause people to become less confident in the expert consensus.

Importantly, motivated reasoning means that people are still relying on science to bolster their beliefs. However, they may be misinterpreting real findings, falling back on discredited information, or cherry-picking studies that agree with them. This means that people may be best described not as "anti-science," but as science-biased, in the sense that their engagement with science is biased by their other beliefs.

Misconception: It is always disagreement with scientific findings that motivates denial.
Reality: It is often the implied solutions of scientific findings that motivate denial.

In a study conducted by one of us, Troy Campbell, and his colleague Aaron Kay, participants were first shown an article talking about climate change in either regulatory (restrict carbon dioxide emissions) or market-based (invest in green technology) terms. After reading, they were asked how much they agreed with the scientific consensus.

The study found that Republicans were more likely to agree with climate science when presented with the market solution—one that's friendlier to their political ideology. This suggests it wasn't necessarily a general aversion to scientific findings that drove rejection, but an aversion to the implied solution. Similar effects were found with more liberal individuals on issues like gun control. Once again, people were motivated not by disbelief in the scientific method, but by the solutions. This leads to the denial of specific scientific facts, not all of science.

Scientific findings are the same whether we like them or not, so the phenomenon of solution aversion is problematic. However, the tactics needed to get past this block are different from those that would be required if people rejected science as a whole.

Misconception: "Anti-science" people deny the correctness of facts.
Reality: People often deny the relevance of facts, not just their correctness.

People we label as taking an "anti-science" position may sometimes not be rejecting the scientific facts at all, but rejecting the relevance of those facts. That is still a problem, but a very different one from what we usually assume.

In another study, author Campbell and colleagues Aaron Kay and Justin Friesen found that people engage in a manner of reasoning that makes the facts less relevant than their own personal beliefs. One of their experiments asked participants whether the legality of same-sex marriage should be decided based on facts or moral opinion. When presented with studies suggesting the facts were on their side (e.g., gay parents were or were not as good at parenting), participants felt facts should decide the issue. However, when presented with facts that countered their belief, participants changed their reasoning to put emphasis on the inherent morality of issues (e.g., something akin to "it is just wrong or right"). Facts that challenged their beliefs were seen as irrelevant. What this shows is our relationship with science is nuanced, and there are many different ways to resist facts or at least the implication of those facts.

There is great value in calling for evidence-based policies and in showing public support for politicians who respect scientific findings. Indeed, if we are to tackle major social and environmental problems, empirically backed practices are our best bet for success.

However, what we are seeing is not necessarily the rejection of empirical science, but a selective rejection of scientific facts and their relevance to avoid challenges to cherished beliefs. It can be useful, then, for activists working to promote science-based policies to unpack seemingly "anti-science" attitudes.

Figuring out why audiences are rejecting or ignoring specific scientific facts can aid in developing different messages for targeted audiences that will be truly persuasive. Further, the fact that people across the political spectrum who generally trust science will nonetheless resist it in numerous ways when it conflicts with their beliefs should give us pause. The "anti-science" situation is complicated, but if we are more scientific in our understanding of it, we can better help good science do good things.

Selected Cited Works

Campbell, T. H., & Kay, A. C. (2014). Solution aversion: On the relation between ideology and motivated disbelief. Journal of Personality and Social Psychology, 107(5), 809-824.

Friesen, J. P., Campbell, T. H., & Kay, A. C. (2015). The psychological advantage of unfalsifiability: The appeal of untestable religious and political ideologies. Journal of Personality and Social Psychology, 108(3), 515.

Koehler, D. J. (2016). Can journalistic "false balance" distort public perception of consensus in expert opinion? Journal of Experimental Psychology: Applied, 22(1), 24.