Why do people believe conspiracies?

By R. Kelly Garrett, The Ohio State University

Following Justice Antonin Scalia’s death on February 13, a former criminal investigator for Washington, D.C.’s Metropolitan Police Department named William O. Ritchie took to Facebook.

“My gut tells me there is something fishy going on in Texas,” he wrote.

With those words, Ritchie helped draw national attention to an emerging conspiracy theory: that Scalia may have been murdered.

According to The Washington Post, Ritchie continued that he was “stunned that no autopsy was ordered for Justice Scalia,” before pointing out the many flaws he saw in published accounts on the subject.

Ritchie wasn’t the first to float this conspiracy. Conservative radio talk show host Alex Jones suggested as much in a video he posted to Facebook on the day of Scalia’s death (“The question is: Was Antonin Scalia murdered?”).

The next morning, the Drudge Report pointed out that the justice was found with a pillow over his head, presumably providing additional evidence of foul play. Conservative political commentator Michael Savage also weighed in, wondering “Is it a conspiracy theory to ask questions that are so obviously in need of answer, or is it just common sense?”

But as someone who has studied how and why misperceptions emerge and spread, I found Ritchie’s words noteworthy. Unlike Savage or Jones, Ritchie’s livelihood isn’t dependent on appealing to a conservative audience or making shocking allegations.

To the contrary, this is a man who made his living investigating crimes through the accumulation of evidence and the judicious use of reason. It is at least plausible that his primary motivation is to ensure that the truth is known.

The Scalia conspiracy theory is likely to strike many – especially liberal Democrats – as hard to fathom. Why should anyone be surprised at the death of a 79-year-old man when the average life expectancy of a man in the U.S. is 77? Why is the decision by Scalia’s family not to have an autopsy performed met with skepticism? And why second-guess the U.S. Marshals Service, an agency charged with protecting justices of the Supreme Court, when it concluded that there was no foul play?

Is it really so strange?

It may be tempting to assume that reasonable people are immune to conspiracy theories, but doing so would be a mistake. Research into misperception, rumor, and conspiracy theory suggests that even reasonable individuals can reach conclusions that don’t align with the best available evidence.

Researchers have identified numerous strategies that individuals use to assess what is true. Under the right circumstances, each can lead to misperceptions. For example, if a claim is consistent with other things you know, you will likely find it easier to understand, which in turn makes it seem more truthful – a phenomenon psychologists attribute to processing fluency. Claims that contradict a prior belief, in contrast, tend to elicit counterargument.

The coherence and explanatory power of a causal story also influence whether it is believed. Individuals will often stick with a plausible explanation even if the available evidence doesn’t support it. In a classic study, researchers found that individuals would continue to attribute a warehouse fire to paint held in a storage room even after they were told – and could recall – that the storage room was empty at the time of the fire.

For those who see government-supported conspiracies as an everyday reality, the suggestion that Justice Scalia’s death is part of a coordinated plot to gain political advantage may well seem sensible, even if solid evidence is lacking. Without knowing anything about Ritchie’s political views, we can’t say whether this was a factor here, but it is likely to play a role in some individuals' assessments.

One final consideration that individuals use when assessing whether a claim is true is the belief of others. The more often we hear a claim, the easier it is to believe – especially claims that come from people we know and trust.

For example, one of the most important predictors of what Americans think about the risks posed by climate change is what members of Congress say and do about the issue. The more Democratic leaders voice concern about climate change, and the more Republican leaders dismiss it, the more polarized Americans have become.

In light of this, we should expect Donald Trump’s recent speculations about the Scalia conspiracy theory to contribute to growing public concern about this possibility.

The challenge of setting the record straight

Notably, research suggests that misperceptions – including belief in unsupported conspiracy theories – are not primarily due to a lack of information. Nor can such beliefs be attributed to so-called media echo chambers. My colleagues and I have found that most Americans consume news from a diverse range of outlets. (And if you’re skeptical of analyses based on Americans’ self-reported news exposure, note that behavioral studies yield similar results, both online and off.)

So is there anything we can do to defuse the Scalia conspiracy theory or others like it? Many studies, including one that a colleague and I conducted, find that attempts to correct misperceptions often fail. Sometimes they fail spectacularly. Indeed, some scholars conclude that humans are hopelessly irrational, that emotional biases will always win out.

Yet even if we are inherently emotional beings, there may still be hope for humans’ ability to reach reasoned conclusions. Studies suggest that fact-checking – whether by the media or by members of one’s own social network – can help promote more accurate beliefs. Fact-checking can also motivate political leaders to be more careful about the claims they make. Some of the best strategies – as journalist Craig Silverman points out in his article “Lies, Damn Lies, and Viral Content” – include focusing debunking efforts on the idea, not the person, and being mindful of the biases described here.

There may also be things that individuals can do to reduce the biases to which all humans are prone. We can, in psychologist Daniel Kahneman’s words, “think slowly,” striving to reduce our reliance on intuition and gut feelings, instead focusing on more thorough examination of the evidence. Indeed, there is evidence that when individuals are made aware of their biases, they are often able to compensate.

But this will not always be enough, since there is ample evidence that reasoning skills and careful thought can actually increase bias.

Taken as a whole, research suggests that even thoughtful individuals with good intentions – including Ritchie – are prone to embrace claims for which there is little evidence, and to defend those claims in the face of contradictory evidence. This is particularly likely when stakes are high, when outcomes are hard to explain or accept, or when a claim is consistent with one’s political values. The evidence suggests that Ritchie’s speculations are wrong, but there is nothing surprising about his suspicions.

R. Kelly Garrett, Associate Professor of Communication, The Ohio State University

This article was originally published on The Conversation. Read the original article.
