Made-up minds
Since political beliefs are rooted in emotions, says Chris Mooney, the facts are often irrelevant
"A MAN WITH a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point." So wrote the celebrated Stanford University psychologist Leon Festinger, in a passage that might have been referring to arguments over the president's birthplace or the causes of climate change and autism. But it was too early for all of that — this was the 1950s — and Festinger was actually describing what would become a famous case study in psychology: a group of Chicago UFO devotees who thought they were communicating with extraterrestrials.
On Dec. 21, 1954, the day the cult’s leader had said the world would end in cataclysm, Festinger and his team were with the Seekers, whom they had decided to study. This was the moment he had been waiting for. How would people so emotionally invested in a belief system react, now that it had been soundly refuted?
When the prophecy failed, the group struggled for an explanation. But then rationalization set in. A new message arrived from the aliens, announcing that the Seekers had been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: "The little group, sitting all night long, had spread so much light that God had saved the world from destruction."
From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. "Their sense of urgency was enormous," wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.
In the annals of denial, it doesn't get more extreme than that. The cultists lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But though the Seekers might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger's day, an array of new discoveries in psychology and neuroscience has further demonstrated how our pre-existing beliefs, far more than any new facts, can skew our thoughts and color what we consider our most dispassionate and logical conclusions. This tendency toward "motivated reasoning" helps explain why we find groups still polarized over matters where the evidence is so unequivocal. It seems that expecting people to be convinced by the facts flies in the face of, you know, the facts.
THE THEORY OF motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion — what researchers often call "affect." Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds — fast enough to detect with an EEG device, but long before we’re aware of them. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators but to data itself.
In other words, by the time we're consciously "reasoning," we may instead be rationalizing our prior emotional commitments. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers. Our "reasoning" is a means to a predetermined end — winning our "case" — and is shot through with biases. These include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to refute views and arguments that we find uncongenial. Plainly put, if I don't want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else.
Modern science originated from an attempt to weed out such subjective lapses — what Francis Bacon, that great 17th-century theorist of the scientific method, dubbed the "idols of the mind." Our individual responses to the conclusions that science reaches, however, are quite another matter. Because researchers employ so much nuance and disclose so much uncertainty, scientific evidence is highly susceptible to selective reading. Giving ideologues or partisans scientific data that's relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.
And it's not just that people twist or selectively read scientific evidence to support their pre-existing views. According to research by Yale Law School professor Dan Kahan and his colleagues, people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place — and thus where they consider "scientific consensus" to lie on contested issues.
Kahan classified individuals, based on their cultural values, as either "individualists" or "communitarians," and as either "hierarchical" or "egalitarian" in outlook. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) could lead to outcomes deleterious to society. Egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. And the groups split dramatically on global warming. But the study subjects weren't "anti-science" — not in their own minds, anyway. It's just that "science" was whatever they wanted it to be.
And that undercuts the standard notion that the way to persuade most people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts — they may hold their wrong views more tenaciously than ever.
Northwestern University sociologist Monica Prasad and her colleagues wanted to test whether they could dislodge the belief that Saddam Hussein and al Qaeda were secretly collaborating among those most likely to believe it — Republican partisans from highly GOP-friendly counties. So the researchers set up a study in which they discussed the topic with the subjects. Then they cited the findings of the 9/11 Commission, as well as a statement by George W. Bush himself denying that his administration had ever "said the 9/11 attacks were orchestrated between Saddam and al Qaeda."
As it turned out, not even Bush’s own words could change the minds of these Bush voters — just 1 of the 49 partisans who originally believed the Iraq–al Qaeda claim changed his or her mind. A far more common reaction was resisting the facts, either by coming up with counterarguments or by simply being unmovable.
BUT HOW "IRRATIONAL" is this in the end? On one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. "It is quite possible to say, 'I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'" explains Stanford social psychologist Jon Krosnick.
Indeed, there's another sense in which science denial could be considered keenly "rational." In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change' — that's going to mark them out as a certain kind of person, and their life is going to go less well."
If you want to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it's an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you're a Republican or a Democrat. The two groups have grown more divided in their views even as the science becomes more unequivocal.
And more education doesn't close the gap. On the contrary: In a 2008 Pew survey, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college-educated Republicans. Among Democrats and independents, higher education correlated with greater acceptance of the science.
Other studies have shown a similar effect: Republicans who say they understand the global-warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn't increase one's concern about it. What's going on here? Well, according to political scientists Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. "People who have a dislike of some policy — for example, abortion — if they're unsophisticated they can just reject it out of hand," says Lodge. "But if they're sophisticated, they can go one step further and start coming up with counterarguments." These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right — and so their minds become harder to change.
IS THERE A case study of science denial that largely occupies the political left? Yes — the disproved claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr.) and Hollywood celebrities (Jenny McCarthy and Jim Carrey). The Huffington Post gives a large megaphone to denialists. And Seth Mnookin, author of the new book The Panic Virus, notes that if you want to find vaccine deniers, hang out at Whole Foods.
Vaccine denial has all the hallmarks of a belief system that's not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined by multiple epidemiological studies — as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.
Yet the true believers persist — critiquing each new study that challenges their views, and rallying to the defense of disgraced researcher Andrew Wakefield, even after his 1998 Lancet paper — which originated the current vaccine scare — was retracted and he subsequently lost his license to practice medicine in the U.K. But then, why should we be surprised? Vaccine deniers have created their own partisan media, such as the website Age of Autism, which instantly blasts out critiques and counterarguments whenever a new development casts further doubt on anti-vaccine views.
The upshot? Left or right, conservative or liberal, we all wear blinders in some situations. Then the question becomes: What can be done to counteract human nature itself? Given the power of our prior beliefs, one idea is becoming clear: If you want someone to accept new evidence, make sure to present it in a context that doesn't trigger a defensive, emotional reaction.
This theory is gaining traction in part because of Kahan's work at Yale. In one study, he and his colleagues packaged the science of climate change into made-up newspaper articles bearing two very different headlines — "Scientific Panel Recommends Anti-Pollution Solution to Global Warming" and "Scientific Panel Recommends Nuclear Solution to Global Warming" — and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into a narrative that appealed to those with a pro-industry worldview.
Followed to its logical conclusion, this means conservatives are more likely to embrace climate science if it comes from a religious or business leader, who can set the issue in a context of values that differ from those of an environmentalist. This effectively signals a détente in what Kahan calls a "culture war of fact." In other words, paradoxically, don’t lead with the facts in order to convince. Lead with the values — so as to give the facts a fighting chance.
From a longer article by Chris Mooney that originally appeared in Mother Jones and is available at MotherJones.com. ©2011 Foundation for National Progress.