Conspiracy Theorists
I once had a colleague I'll refer to here as M.
M has a Ph.D. from a reputable university and works as a researcher. She's smart, funny, cooperative, and politically liberal.
One day, in the spring of 2021, M remarked that she was unvaccinated. We talked, and here's the gist of what she said: COVID-19 vaccines are unsafe and ineffective. The government has suppressed these facts because it answers to Big Pharma. The vaccines probably don't contain microchips that monitor citizens, but, well, you never know...
I was stunned. How could Dr. M be one of those people?
I had assumed (unwisely) that conspiracy theorists are the most paranoid, gullible, badly educated, hostile, and/or generally nutty people among us.
Certainly the most visible ones create that impression. Last Wednesday, for instance, InfoWars founder Alex Jones posted the following to the platform-formerly-known-as-Twitter:
"Disease X is coming. But it isn't some chaotic pandemic we must manage. It is a genocidal kill weapon agreed upon by the worst members of humanity."
Mr. Jones (arguably one of the worst members of humanity himself) was referring to a WHO panel being held that day at the World Economic Forum entitled "Preparing for Disease X".
However, "Disease X" isn't an actual disease. It's a hypothetical. Since 2018, the WHO has used the term in discussions of how health care systems can respond to the next pandemic. The assumption is that we can't know what might cause the next pandemic, but we can still be better prepared for it.
Nevertheless, social media buzzed for days with the rumor that governments are now plotting to create and spread Disease X to serve their nefarious interests. I saw some of these formerly-Twitter and Reddit posts. Paranoia and nuttiness were on full display.
Misunderstandings like this might be correctible if they're isolated. Unfortunately, conspiracy theorists tend to see conspiracies everywhere.
Sometimes their beliefs are innocuous. Decades ago a taxi driver in Providence told me that "umbrella man" had been hired by the CIA to shoot JFK. I got to my destination just fine, no harm done.
In other cases, conspiracy theories are dangerous. When Edgar Welch shot up the Comet Ping Pong pizza restaurant in 2016, he was hoping to liberate children that Hillary Clinton and other conspirators were keeping there as sex slaves. Here you can see a clear path from theory to action.
Conspiracy theories are also dangerous in a broader, though harder to measure, way. Inaccurate factoids about COVID-19 or the 2020 presidential election become more persuasive when folded into elaborate narratives about the deep state. In the case of Dr. M, her views increased the risk that she and those around her would get COVID. (She did in fact contract a mild case in 2022. As far as I know she's still unvaccinated.)
In rare cases, conspiracy theorists don't actually believe what they're saying. Last week a Quebec resident who'd been claiming online that the Canadian government had been setting wildfires was convicted of starting 14 such fires himself. (As for Alex Jones, it's hard to tell how much he believes what he says. He has offered some moderately nuanced remarks on his own state of mind, but I'm not sure anything he says is believable.)
Most of the time, conspiracy theorists appear to be passionately committed to their beliefs. And sure, this group includes deluded, deeply irrational people like Mr. Welch, but then there's Dr. M: Highly educated, cooperative, liberal, etc.
Who are these people? What could Edgar Welch and Dr. M have in common?
In this newsletter I'll be talking about the causes of conspiracy thinking. (Next week I'll focus on cures.)
What prompted this newsletter is a new study on how conspiracy mindsets impact the ability to recognize deepfakes. Although the study is relatively simple and narrowly focused, the conclusions align with recent scholarship on the causes of conspiracy thinking – specifically, a prominent 2023 meta-analysis that I'll touch on near the end.
This meta-analysis is unusual: it reflects two of the most pervasive statistical errors in social science, yet both the findings and the errors tell us something important about the psychology of conspiracy theorizing.
A brief definition
Conspiracy theorists attribute events to the influence of powerful people who work together in secret. Often the conspirators are said to be benefitting themselves at the expense of the common good. (By definition, conspiracy theories are considered false, or extremely improbable.)
Although I'll be referring sometimes to "conspiracy theorists", it's more accurate to speak of a conspiracy mindset, as this way of thinking is a matter of degree.
At one end of the spectrum are the kinds of people who believe that the earth is ruled by shape-shifting, lizard-like creatures from outer space. (I'm not sure which is worse: Being ruled by lizard aliens, or sharing the right to vote with fellow citizens who believe in lizard aliens.)
The other extreme is illustrated by some of the folks who dispute the conventional view of the JFK assassination (one shooter, no conspiracy). Even if you think they're misreading forensic evidence or making unwarranted inferences about the CIA, their approach to evidence is more sensible than that of the lizard alien crowd.
The new study
The study I'll be describing here showed that people with conspiracy mindsets are especially good at identifying deepfake videos.
This finding surprised the researchers. Prior studies show that conspiracy theorists are more likely to believe fake news. They seem more gullible, or less critical, or whatever. You'd expect them to also be fooled more often by deepfakes, but that's not what the data showed.
The study, which appears in the March 2024 issue of Telematics and Informatics, was conducted by Drs. Ewout Nas and Roy de Kleijn at Leiden University.
Nas and de Kleijn asked 130 adults to view brief video clips of celebrities from a database called Celeb-DF. Each clip showed a celebrity talking. The clips lasted an average of 13 seconds apiece and were presented without sound.
Participants viewed a randomly-ordered mix of 59 authentic clips and 115 deepfakes. Below are two screen shots from the original database:
The face on the left is authentic. It's from an actual clip of Brad Pitt during a 1990 interview (see here.)
The face on the right is a deepfake. It was created by taking publicly available footage of a young Jake Gyllenhaal, digitally modifying it, and then, in effect, superimposing it on Brad Pitt's face.
The celebrities in each clip weren't named. Participants were simply asked to judge whether each clip was authentic or a deepfake.
A key assumption is that participants based their judgments on perceptual details. If they correctly identified the Gyllenhaal clip as a deepfake, it's not because they remembered Pitt's 1990 interview. Rather, something about the way Gyllenhaal's lips moved, or shifts in the pattern of light on his face, or some other odd detail would give it away.
Conspiracy thinking was measured by the Conspiracy Mentality Questionnaire:
I think that…
1. many very important things happen in the world, which the public is never informed about.
2. politicians usually do not tell us the true motives for their decisions.
3. government agencies closely monitor all citizens.
4. events which superficially seem to lack a connection are often the result of secret activities.
5. there are secret organizations that greatly influence political decisions.
Answer options range from 0% (certainly not) to 100% (certain).
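(A hypothetical aside: the study doesn't spell out the scoring, but a short scale like this is often scored by simply averaging the item ratings. A sketch under that assumption:)

```python
# Hedged sketch, not taken from the study itself: scoring a five-item
# scale like the CMQ by averaging the item ratings, each given as a
# 0-100 certainty percentage.

def cmq_score(responses):
    """Return the mean of five item ratings (0-100 each)."""
    assert len(responses) == 5, "the CMQ has five items"
    return sum(responses) / len(responses)

# A respondent who is fairly sure about most items:
print(cmq_score([80, 90, 40, 60, 70]))  # → 68.0
```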
Main findings
Overall accuracy was about 80%. Specifically, participants judged 79% of the authentic videos to be authentic, and 82% of the deepfakes to be deepfakes.
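(A quick back-of-the-envelope check: if the reported per-category hit rates apply to the 59 authentic and 115 deepfake clips, the weighted overall accuracy works out to roughly 81%, consistent with "about 80%".)

```python
# Back-of-the-envelope check of the overall accuracy figure.
# Assumes the reported hit rates apply uniformly to the
# 59 authentic and 115 deepfake clips in the study.

n_authentic, n_deepfake = 59, 115
hit_authentic, hit_deepfake = 0.79, 0.82  # reported accuracy per category

correct = hit_authentic * n_authentic + hit_deepfake * n_deepfake
overall = correct / (n_authentic + n_deepfake)
print(round(overall, 2))  # → 0.81, i.e. "about 80%"
```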
Nas and de Kleijn asked participants whether they recognized the celebrity in each clip. Familiarity boosted accuracy somewhat.
In addition, people who reported spending more time on social media showed slightly greater accuracy. (I take this as good news, because these are the people who would tend to be exposed to more deepfakes.)
The main finding was a significant association between conspiracy mentality and accuracy. Higher scores on the Conspiracy Mentality Questionnaire predicted slightly greater accuracy at distinguishing authentic videos from deepfakes.
In response to an email from me, Dr. Nas graciously ran an analysis, not reported in the article, which included familiarity and conspiracy mentality in the same model. The results were essentially unchanged: Familiarity and conspiracy mentality separately predicted accuracy.
Again, the latter finding was a surprise, given that conspiracy theorists are more susceptible to fake news. Here is Nas and de Kleijn's proposed explanation:
"[P]eople who believe in conspiracy theories tend to be more skeptical and suspicious of information they encounter. This suspiciousness may lead to more critical and alert viewing behavior when watching videos, including deepfakes, whereas news stories provide fewer cues for distinguishing authentic from fake."
This is a good start, but we need more. Skepticism and suspiciousness don't automatically make you a conspiracy theorist. I'm skeptical about a lot of data I encounter, and some of it makes me suspicious, but I don't see hidden conspiracies. Instead I write newsletters describing questionable methods or stats (or, as you'll see in a moment, questionable interpretations of the stats).
Why do people develop conspiracy mindsets?
Nas and de Kleijn's view of conspiracy theorists (they're skeptical and suspicious) is widely held and consistent with the results of a 2023 meta-analysis.
This meta-analysis, published in the leading journal Psychological Bulletin, presents the most complete and up-to-date analysis of conspiratorial thinking that I'm aware of.
The researchers, led by Dr. Shauna Bowes at Vanderbilt, reviewed 170 studies with a total of 158,473 participants. The studies were admirably diverse in demographics, methods, and so on.
If you only read the abstract, you would think that Bowes and colleagues nailed it. What they wrote there is that conspiracy thinking is most strongly related to:
"(a) perceiving danger and threat, (b) relying on intuition and [reporting] odd beliefs and experiences, and (c) being antagonistic and acting superior."
There you go. Conspiracy theorists find the world dangerous, they rely on gut feelings, they have odd beliefs, and they tend to be arrogant about those beliefs. That fits the usual stereotypes (and seems to fold in Nas and de Kleijn's comment about skeptical, suspicious attitudes).
But what about Dr. M? As far as I could tell, little or nothing of this describes her.
Well, one might reply, never mind about M. She's just one person. Bowes and colleagues were describing a general pattern. There are always exceptions to the rule.
The problem though is that the "rule" is not as simple as the abstract suggests. This meta-analysis reflects two pervasive shortfalls in statistical reasoning. The first one comes up a lot in these newsletters; the second one I haven't discussed before.
1. Causal ambiguity.
Bowes and colleagues briefly acknowledge, at the end, that their results "do not shed light on causality or temporal precedence in these relations."
This means, for example, that we can't tell whether perceiving the world as a dangerous place gives rise to conspiracy thinking, or whether people develop conspiracy beliefs first and then increasingly view the world as dangerous.
This kind of ambiguity haunts correlational research and limits what the data can tell us. We want to know what causes conspiracy thinking, just as we want to know what causes lung cancer. If smoking were merely associated with lung cancer, and we couldn't say which causes which, the information wouldn't be very useful.
2. The crud factor.
Here's something that Bowes and colleagues' abstract doesn't tell you: In their actual analyses, they identified more than 100 variables associated with conspiracy thinking.
Conspiracy thinking was related to being less analytical, less open-minded, more cynical, more alienated, more narcissistic, more aggressive, less agreeable, etc. etc. None of the correlations, including the ones they alluded to in their abstract, were very strong.
This illustrates a problem known as the crud factor. In a large dataset, just about everything correlates with everything else, and it's difficult to judge what counts or how variables truly interrelate. We want to know why people become conspiracy theorists, but the data don't give us a starting point: we just have a bunch of weak correlations (and some of them might be flukes anyway). Another way to put it is that the data are consistent with any number of interpretations, and we have little basis for favoring one over the others.
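(To see why weak correlations pile up at these sample sizes, here's a toy simulation. None of this is the authors' data: the sample size matches the meta-analysis total, but the correlation of 0.02 is an arbitrary, trivially small value I've chosen for illustration.)

```python
import numpy as np

# Toy illustration of the crud factor (simulated data, not the
# meta-analysis itself): at a sample size of ~158,000, even a
# trivially weak correlation clears the significance threshold.
rng = np.random.default_rng(0)

n = 158_473       # total participants across the 170 studies
true_r = 0.02     # far too weak to matter in practice (assumed value)

x = rng.standard_normal(n)
y = true_r * x + np.sqrt(1 - true_r**2) * rng.standard_normal(n)

r = np.corrcoef(x, y)[0, 1]
r_crit = 1.96 / np.sqrt(n)  # approximate two-tailed 5% threshold for r

print(f"observed r = {r:.4f}, significance threshold = {r_crit:.4f}")
# The observed r (~0.02) comfortably exceeds the threshold (~0.005),
# so a near-meaningless association registers as "significant".
```

With n this large, the cutoff for statistical significance is around r = 0.005, so dozens of weak correlates clearing that bar tells us little about which ones matter.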
(Note for stats people: Bowes and colleagues briefly acknowledge a potential crud factor, but their solution – giving precedence to the largest effect sizes – doesn't work. The distribution of effects was quite smooth – 0.21, 0.22, 0.23, etc. – and so any decision about which ones count will be arbitrary. More importantly, you can't just compare effects across studies, even with standardized criteria (Cohen's ds, beta weights, etc.) because, in cases like this, methodological differences and variable choices from study to study greatly impacted potential effect sizes for any given variable.)
In sum, we can't say yet that conspiracy thinking is grounded in paranoia, or suspicion, or arrogance, or anything else. We don't know whether each of these variables is a cause or an effect of conspiracy thinking (causal ambiguity problem). And, there's no reason to privilege the weak correlation between any of these variables and conspiracy thinking over the many other such correlations that have been found (crud factor problem).
Conclusion
Now it may seem like I've taken a wrong turn. I've said that Bowes and colleagues' meta-analysis doesn't tell us much, because conspiracy thinking is weakly correlated with over 100 variables, and we don't know whether these variables are causes or effects or even related to conspiracy thinking in the first place.
I do think these observations suggest a conclusion though: Conspiracy thinking is a diverse phenomenon that arises from many sources.
Consider aggression for a moment. Since there are many types of aggression and many contributors to it, social science researchers don't typically study "aggression" in general. Rather, they explore specific types and contributors. One child may be physically aggressive because of poor impulse control. Other children may be physically aggressive because they're modeling behaviors they've observed at home, or their parents neglect them and they crave attention, or they're struggling at school and unable to express their frustration. Same behavior, different causes.
Much the same may be true of conspiracy theorists. Alex Jones, Edgar Welch, any particular lizard alien believer, and Dr. M may represent four different types of conspiracy theorizers. The scope of their theories differs, as does whatever spurred them to think this way in the first place. What Alex Jones and Dr. M have in common may only be the content of certain theories (e.g., their belief that governments are suppressing vaccine data) and some logical reasoning fallacies (which I'll discuss in some other newsletter). Analogously, physically aggressive children may resemble each other in specific behaviors more than in the causes of those behaviors.
It's hard to give up the idea that conspiracy theorists are paranoid, irrational, arrogant people. This is a widely held stereotype, and it's consistent with Bowes and colleagues' data. But the data are correlational, and the associations are weak. Dr. M doesn't seem like an especially paranoid person. When she talks, she sounds rational and humble. (And, for what it's worth, she describes herself as fairly sure – though not certain – that Lee Harvey Oswald acted alone.)
Perhaps our stereotypes about conspiracy theorists have some basis in fact, but much less than we think.
In order to address the problem of conspiracy thinking, we need to view it as multifaceted. We need to be open to the possibility that each person arrived at their particular theories via a more or less unique path. Maybe, as hinted at by Bowes and colleagues' 100+ different correlates, there are dozens of ways one could become a conspiracy theorist, and more than one type out there, disseminating vaccine misinformation or fretting about lizards.
Next week I'll discuss how to prevent conspiracy thinking.
Thanks for reading!