The Science of Why We Reject Science

In his 1748 treatise An Enquiry Concerning Human Understanding, David Hume famously wrote, “A wise man ... proportions his belief to the evidence.” This sentiment has become a familiar fixture of our modern lexicon, a basic standard for intellectual honesty and a prescription for belief formation in an ideal world. Of course, it was also Hume who in an earlier volume wrote that reason is “the slave of the passions.” The original context of the passage concerns moral action and the precedence of our goals, motives and desires in relation to reason; specifically, Hume considered reason an ex post facto force in thrall to our moral impulses. But this concept can be duly extended to the psychology of belief formation more generally. Indeed, once we factor in culture, ideology, human psychology, and emotionally laden values, the image of the dispassionate creature responsive to the best evidence and argument gets cut down to size.

The ease with which the reasoning self can be subverted should clue us in that there is more going on behind the scenes than cold-blooded evaluation of facts. How routinely emotion and ideology triumph over reason and evidence is a feature of human psychology we should all be mindful of, but its underlying causes should be of special interest to proponents of science and other veterans of fact-based persuasion.

In particular, what we know about cognition has great import for the information deficit model, the quixotic and woefully undersupported idea that the more facts and information a person is exposed to, the more likely they are to change their mind. As it turns out, our neural wiring tends to lead us in the opposite direction: our emotional attachment to our beliefs and values prods us to double down and organize contradictory information in a way that is consistent with whatever beliefs we already hold—the so-called “backfire effect.” As Chris Mooney put it in 2011: “In other words, when we think we’re reasoning, we may instead be rationalizing.”

I think that on some level anyone who has dealt with science denial for an extended period of time knows this intuitively. Adopting the role of serial debunker, whether online or off, can be a thankless undertaking. We figure that one more “hockey stick” graph, one more image of the vanishing Arctic, one more year of record-topping warmth, one more report on ocean acidification or extreme weather will settle the matter. But it rarely does; discredited claims persist, and frustration sets in.

That these efforts so often end in failure demonstrates the sheer poverty of “Just the facts, ma’am” approaches to persuasion. Summarizing a study on this very topic, Marty Kaplan wrote: “It turns out that in the public realm, a lack of information isn’t the real problem. The hurdle is how our minds work, no matter how smart we think we are.”

Now this is not to say that fighting denial with facts never works, or that people are incapable of changing their minds on anything of consequence—sometimes it does, and sometimes we do. For instance, someone who is already well trained in logic, critical thinking and evidence-based evaluation of claims in one context may yield to the facts in another, even when those facts clash with their existing beliefs. This does, however, appear to be the exception and not the rule. For the most part, the rationality we so aspire to all too often succumbs to inherent constraints and limitations in our cognitive makeup.

More and more, it's looking like the reason clashes with creationists and climate deniers leave us banging our collective heads against the wall is not that we lack the facts. It's that we're bumping up against deeply entrenched cultural norms and attitudes. Gaining traction in disputes over settled science, it seems, is less about rote communication of the facts than about translating the science so that the recipient can freely process the information without feeling that their social identity and worldview are at risk. Science denial thus stems from the felt sense that scientific beliefs are incompatible with the received wisdom of one's social sphere.

The research supporting this view is now extensive. One of the pioneers in the study of the connection between beliefs, evidence and culture—what we might call 'value-based belief formation'—is Dan Kahan, a professor of law and psychology at Yale. The theory he and his colleagues have proposed as a more robust alternative to the deficit model is identity-protective cognition, otherwise known as cultural cognition.

According to Kahan and others, we perceive and interpret scientific facts largely as symbols of cultural affinity. Certain scientific facts in particular—think climate change, evolution, GMOs, stem cell research, vaccination—carry, by dint of being politicized, cultural meanings that have nothing to do with the validity of the underlying science. Depending on the strength of one's political biases, the facts may never enter the analysis at all, except to be argued against and swept aside in order to reinforce a fixed ideological position. The more invested we are in a certain cultural identity, and the more politicized the scientific issue, the more powerfully these impulses operate.

Holding beliefs that are out of step with one's social group, moreover, carries social and interpersonal risks (e.g., loss of trust from peers) for those whose ideology or 'party' has become a deep and meaningful part of their self-concept. This is what prompts the identity-protective mechanism: we selectively credit and discredit evidence in response to those risks. In short, objective evidence is merely subjective fuel for the ideologically beholden. The psychology of group affiliation and the competition for social status frequently override rational assessment of the credibility of scientific data in domains like climate change, evolution, vaccine efficacy, and so on.

tl;dr: It's about cultural identity and values, not facts. Echoing Hume, we are not ruled by reason.

I see Kahan & Co.'s research as a powerful commentary on the hyper-polarization of American politics and culture, and a cogent, if ineluctably depressing, counter-narrative to the information deficit model practiced by so many science communicators today. This leaves us on somewhat insecure footing. We cannot allow misinformation to spread unchallenged, but neither can we continue to labor under the faulty expectation that dousing our debates in facts and figures will extinguish the flames of denial. To the extent this body of research paints an accurate picture of how we actually form beliefs, we would do well to absorb its insights and apply them in the arena of partisan politics and antiscience contrarianism.

As Chris Mooney wrote in 2012: "A more scientific understanding of persuasion, then, should not be seen as threatening. It’s actually an opportunity to do better—to be more effective and politically successful." Charging forward with false notions of human psychology dooms our efforts before they get off the ground. That our ideological commitments often prod us to double down and organize dissonant information in ways that cohere with the expectations of our cultural group—and, indeed, that the more informed and scientifically literate we are, the more prone we are to bending evidence toward our values—is invaluable intel for countering the war on facts.

What does this look like in practice? The strategy urged by Kahan and his colleagues is to lead with values, appeal to common concepts and desires, and emphasize shared goals. Avoid framing issues in terms of left vs. right or science vs. denialism, and don't get bogged down in scientific minutiae. In essence, steer the conversation away from ideological pressure points that are likely to trigger ingroup-outgroup dynamics. By no means is this easy or a surefire path to success. For one, it requires far more preparation, since your arguments must be tailored to your audience. But the evidence suggests this approach is more productive than throwing fact after fact at the wall and seeing what sticks.

Some have taken cultural cognition theory to imply that facts are culturally determined—to support a kind of postmodernist understanding of truth. This couldn't be more mistaken. Identity-protective cognition is an explanatory model for how beliefs—particularly those at the intersection of science and politics—are formed. It is both descriptive and prescriptive: it contends that our cultural experiences and personal identity shape the way we approach and interpret facts, and it points us toward new modes of engagement. It should not be construed as a guide to forming reliably accurate views about the world, as a license to abandon fact-based decision making, or as an embrace of a post-truth era. Scientific facts are still culturally independent descriptions of nature, and the physical laws of the universe don't change depending on who's measuring them.

Rather, this research—like all good psychology—brings to light the imperfect workings of our innate cognitive circuitry. The more we learn about cognition, the more fragile rationality and reasoned thinking appear to be, and the more vigilant we must be to avoid the pitfalls so ingrained in our neurochemistry. After all, the penchant for tribalism and partisanship is more a feature than a bug of the human operating system; such shortcomings have been with us from the beginning. Only by recognizing these features and adopting communication strategies that account for them can we hope to effectively counter entrenched resistance to established science and help guide society out of the pre-Enlightenment era to which we seem to be regressing.

This article has been cross-posted from Waiving Entropy.
