Humans are hardwired to dismiss facts that don’t fit their worldview, no matter their political orientation

Something is rotten in the state of American political life. The U.S. (among other nations) is increasingly characterized by highly polarized, informationally insulated ideological communities occupying their own factual universes.

Within the conservative political blogosphere, global warming is either a hoax or so uncertain as to be unworthy of response. Within other geographic or online communities, vaccines, fluoridated water and genetically modified foods are known to be dangerous. Right-wing media outlets paint a detailed picture of how Donald Trump is the victim of a fabricated conspiracy.

None of that is correct, though. The reality of human-caused global warming is settled science. The alleged link between vaccines and autism has been debunked as conclusively as anything in the history of epidemiology. It’s easy to find authoritative refutations of Donald Trump’s self-exculpatory claims regarding Ukraine and many other issues.

Yet many well-educated people sincerely deny evidence-based conclusions on these matters.

In theory, resolving factual disputes should be relatively easy: Just present evidence of a strong expert consensus. This approach succeeds most of the time when the issue is, say, the atomic weight of hydrogen.

But things don’t work that way when the scientific consensus presents a picture that threatens someone’s ideological worldview. In practice, it turns out that one’s political, religious or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.

“Motivated reasoning” is what social scientists call the process of deciding what evidence to accept based on the conclusion one prefers. As I explain in my book, The Truth About Denial, this very human tendency applies to all kinds of facts about the physical world, economic history and current events.

Denial doesn’t stem from ignorance

The interdisciplinary study of this phenomenon has exploded over just the last six or seven years. One thing has become clear: The failure of various groups to acknowledge the truth about, say, climate change, is not explained by a lack of information about the scientific consensus on the subject.

Instead, what strongly predicts denial of expertise on many controversial topics is simply one’s political persuasion.

A 2015 metastudy showed that ideological polarization over the reality of climate change actually increases with respondents’ knowledge of politics, science and/or energy policy. The chance that a conservative is a climate change denier is significantly higher if he or she is college-educated. Conservatives scoring highest on tests of cognitive sophistication or quantitative reasoning skills are the most susceptible to motivated reasoning about climate science.

This is not just a problem for conservatives. As researcher Dan Kahan has demonstrated, liberals are less likely to accept expert consensus on the possibility of safe storage of nuclear waste, or on the effects of concealed-carry gun laws.

Denial is natural

Our ancestors evolved in small groups, where cooperation and persuasion had at least as much to do with reproductive success as holding accurate factual beliefs about the world. Assimilation into one’s tribe required assimilation into the group’s ideological belief system. An instinctive bias in favor of one’s “in-group” and its worldview is deeply ingrained in human psychology.

A human being’s very sense of self is intimately tied up with his or her identity group’s status and beliefs. Unsurprisingly, then, people respond automatically and defensively to information that threatens their ideological worldview. We respond with rationalization and selective assessment of evidence – that is, we engage in “confirmation bias,” giving credit to expert testimony we like and finding reasons to reject the rest.

Political scientists Charles Taber and Milton Lodge experimentally confirmed the existence of this automatic response. They found that partisan subjects, when presented with photos of politicians, produce an affective “like/dislike” response that precedes any sort of conscious, factual assessment as to who is pictured.

In ideologically charged situations, one’s prejudices end up affecting one’s factual beliefs. Insofar as you define yourself in terms of your cultural affiliations, information that threatens your belief system – say, information about the negative effects of industrial production on the environment – can threaten your sense of identity itself. If it’s part of your ideological community’s worldview that unnatural things are unhealthful, factual information about a scientific consensus on vaccine or GM food safety feels like a personal attack.

Unwelcome information can also threaten in other ways. “System justification” theorists like psychologist John Jost have shown how situations that represent a threat to established systems trigger inflexible thinking and a desire for closure.

For example, as Jost and colleagues extensively review, populations experiencing economic distress or external threat have often turned to authoritarian, hierarchicalist leaders promising security and stability.

Denial is everywhere

This kind of affect-laden, motivated thinking explains a wide range of examples of an extreme, evidence-resistant rejection of historical fact and scientific consensus.

Have tax cuts been shown to pay for themselves in terms of economic growth? Do communities with high numbers of immigrants have higher rates of violent crime? Did Russia interfere in the 2016 U.S. presidential election? Predictably, expert opinion on such matters is treated by partisan media as though the evidence were itself inherently partisan.

Denialist phenomena are many and varied, but the story behind them is, ultimately, quite simple. Human cognition is inseparable from the unconscious emotional responses that go with it. Under the right conditions, universal human traits like in-group favoritism, existential anxiety and a desire for stability and control combine into a toxic, system-justifying identity politics.

When group interests, creeds, or dogmas are threatened by unwelcome factual information, biased thinking becomes denial. And unfortunately these facts about human nature can be manipulated for political ends.

This picture is a bit grim, because it suggests that facts alone have limited power to resolve politicized issues like climate change or immigration policy. But properly understanding the phenomenon of denial is surely a crucial first step to addressing it.

Comment: Readers interested in the wider implications of the topics highlighted in this article may find Laura Knight-Jadczyk’s ‘Comets and the Horns of Moses’ of interest, along with this passage from it:

The Social Contract Theory of Human Society?

One theory of human society is that of the ‘social contract’, which posits that a group of individuals get together and draw up an agreement to their mutual advantage by which they will all abide, and a ‘society’ is thus formed. The problem with this theory is that it relies on circular reasoning: it presupposes that the very thing it purports to explain already exists – that human beings are already constrained by some values that allow them to get together to draw up this alleged contract. Such a group must already be able to conceptualize a situation in the future where they will benefit from being bound to these other people in a contract. Ernest Gellner outlines anthropology’s basic theory of how societies are formed. He writes:

The way in which you restrain people from doing a wide variety of things, not compatible with the social order of which they are members, is that you subject them to ritual. The process is simple: you make them dance round a totem pole until they are wild with excitement and become jellies in the hysteria of collective frenzy; you enhance their emotional state by any device, by all the locally available audio-visual aids, drugs, dance, music and so on; and once they are really high, you stamp upon their minds the type of concept or notion to which they subsequently become enslaved. Next morning, the savage wakes up with a bad hangover and a deeply internalized concept.

The idea is that the central feature of religion is ritual, and the central role of ritual is the endowment of individuals with compulsive concepts which simultaneously define their social and natural world and restrain and control their perceptions and comportment, in mutually reinforcing ways. These deeply internalized notions henceforth oblige them to act within the range of prescribed limits. Each concept has a normative binding content, as well as a kind of organizational descriptive content. The conceptual system maps out social order and required conduct, and inhibits inclinations to thought or conduct which would transgress its limits.

I can see no other explanation concerning how social and conceptual order and homogeneity are maintained within societies which, at the same time, are so astonishingly diverse when compared with each other. One species has somehow escaped the authority of nature, and is no longer genetically programmed to remain within a relatively narrow range of conduct, so it needs new constraints. The fantastic range of genetically possible conduct is constrained in any one particular herd, and obliged to respect socially marked bounds. This can only be achieved by means of conceptual constraint, and that in turn must somehow be instilled. Somehow, semantic, culturally transmitted limits are imposed on men …

As Gellner must have known quite well, this theory of how to control human beings was understood in pretty much this exact way many thousands of years ago. In the course of my reading, I once came across a passage translated from a Hittite tablet found at an archaeological dig, in which the king wrote that the priesthood needed the king to establish their religious authority and the king needed the priests to establish his right to rule. This control comes sharply into view in the falsification of history. History itself becomes part of the control; after all, control of daily information is just history in the making. As to how this process works on the individual level, a passage in Barbara Oakley’s Evil Genes describes what ‘dancing around the totem pole with one’s social group’ does to the human brain – including the brains of scientists and true believers, both of whom have very strong attachments to their belief systems:

(Caption to a figure reproduced in the passage: ‘Ratings of perceived contradictions in statements. Democrats readily identified the contradictions in Bush’s statements but not Kerry’s, whereas Republicans readily identified the contradictions in Kerry’s statements but not Bush’s.’)

A recent imaging study by psychologist Drew Westen and his colleagues at Emory University provides firm support for the existence of emotional reasoning. Just prior to the 2004 Bush-Kerry presidential elections, two groups of subjects were recruited – fifteen ardent Democrats and fifteen ardent Republicans. Each was presented with conflicting and seemingly damaging statements about their candidate, as well as about more neutral targets such as actor Tom Hanks (who, it appears, is a likable guy for people of all political persuasions). Unsurprisingly, when the participants were asked to draw a logical conclusion about a candidate from the other – ‘wrong’ – political party, the participants found a way to arrive at a conclusion that made the candidate look bad, even though logic should have mitigated the particular circumstances and allowed them to reach a different conclusion. Here’s where it gets interesting.

When this ‘emote control’ began to occur, parts of the brain normally involved in reasoning were not activated. Instead, a constellation of activations occurred in the same areas of the brain where punishment, pain, and negative emotions are experienced (that is, in the left insula, lateral frontal cortex, and ventromedial prefrontal cortex). Once a way was found to ignore information that could not be rationally discounted, the neural punishment areas turned off, and the participant received a blast of activation in the circuits involving rewards – akin to the high an addict receives when getting his fix.

In essence, the participants were not about to let facts get in the way of their hot-button decision making and quick buzz of reward. ‘None of the circuits involved in conscious reasoning were particularly engaged,’ says Westen. ‘Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones’ …

Ultimately, Westen and his colleagues believe that ‘emotionally biased reasoning leads to the “stamping in” or reinforcement of a defensive belief, associating the participant’s “revisionist” account of the data with positive emotion or relief and elimination of distress. The result is that partisan beliefs are calcified, and the person can learn very little from new data,’ Westen says. Westen’s remarkable study showed that neural information processing related to what he terms ‘motivated reasoning’ … appears to be qualitatively different from reasoning when a person has no strong emotional stake in the conclusions to be reached.

The study is thus the first to describe the neural processes that underlie political judgment and decision making, as well as to describe processes involving emote control, psychological defense, confirmatory bias, and some forms of cognitive dissonance. The significance of these findings ranges beyond the study of politics: ‘Everyone from executives and judges to scientists and politicians may reason to emotionally biased judgments when they have a vested interest in how to interpret “the facts.”’