Did you know that Taylor Swift is a Pentagon asset, recruited to help rig the 2024 Super Bowl and presidential election?
We can laugh at (or pity) the folks who believe these things, but when it comes to topics like election integrity or vaccine safety, the impact of conspiracy thinking isn't very funny.
In this newsletter I'll be discussing what can be done to prevent conspiracy theories from emerging.
Quick recap
As I noted last week, conspiracy theorists attribute events to the influence of powerful, malicious people who work together in secret, often to benefit themselves at the expense of others.
Conspiracy thinking reflects a continuum. One extreme is illustrated by Marjorie Taylor Greene (R-GA), who has claimed that the 2018 wildfires in California were ignited by a space laser controlled by a secret corporate group. A wild claim, zero evidence.
At the other extreme are folks who don't ordinarily see conspiracies behind the scenes but who, in a few cases like the JFK assassination, merely stray a bit from reasonable evidence.
Last week I also suggested that different people entertain conspiracy beliefs for different reasons.
That's not a very satisfying description of what causes conspiracy thinking, but I just don't believe the scientific literature justifies anything more specific than that.
For instance, a 2023 meta-analysis concluded that conspiracy theorists (a) find the world dangerous, (b) rely on gut feelings, (c) maintain odd beliefs, and (d) tend to be arrogant about their beliefs.
This seems like a clear profile, but the meta-analysis also found links to hundreds of other variables, and it's unclear which are causes as opposed to effects.
(Do people who find the world dangerous tend to become conspiracy theorists, or is it that conspiracy theorists find the world more dangerous with each new conspiracy? Or, is there a feedback loop: People who see the world as threatening develop a conspiracy mindset, and each new conspiracy they discover increases the seeming dangerousness of the world? The data aren't clear.)
Some good news
Sometimes a problem can be solved even if we don't fully understand what causes it or why the solutions are effective.
This is a common refrain in the history of medicine. Quinine, for instance, was used for centuries to treat malaria before we learned that a parasite causes the disease, and that quinine kills the parasite.
Similarly, even if the causes of conspiracy thinking are unclear, researchers have made progress on developing interventions that reduce the problem. Here I'll discuss three of the most widely studied approaches: Debunking, coursework, and inoculation.
1. Debunking
When we hear a conspiracy theory, our natural tendency is to correct the theorist. This seems like a rational approach, but it turns out to be among the least effective, along with emotional appeals and mockery. Studies show that challenging a conspiracy theorist by questioning their evidence or logic, or by proposing an alternative theory, rarely works.
Why is debunking ineffective? There are as many answers to that question as there are explanations for the problem. For instance, some researchers speculate that debunking fuels the suspicion and fear that drive conspiracy thinking in the first place. Others speculate that the irrational logic of conspiracy theories makes them impervious to debunking.
2. Critical thinking courses
The earliest shtick in western philosophy is the Socratic method. In Plato's dialogues, Socrates asks questions that tease out logical flaws and, one by one, his fellow Athenians end up renouncing their misguided beliefs.
This is a rationalist's dream. Unfortunately, the Socratic method doesn't work well with conspiracy theorists, because it's just one particularly sophisticated type of debunking.
A rationalist might reply that what Socrates supposedly did, and what recent debunking studies do, is merely conversational. You can't change someone's beliefs in a 10-minute conversation. So what about a full-semester course devoted to critical thinking?
At first glance, you'll see happy news: Courses do seem to help. A 2018 study by Drs. Kathleen Dyer and Raymond Hall, for example, showed that conspiracy thinking was reduced among Cal State undergraduates who took "Natural Science 4: Science and Nonsense."
This study is worth pondering for a moment, because it was well-designed, larger than most, and, like many of its kind, yielded results that sound great but turn out to be startlingly unpersuasive once you take a close look.
Pseudoscience beliefs among 203 students in six sections of the critical thinking course were tested at the beginning and end of the semester. Comparison groups (students taking research methods or science classes) were similarly tested.
The researchers' test is called the IEUB ("Inventory of Empirically Unverified Beliefs") and includes a mix of scientific facts and pseudoscientific statements with no factual basis. The higher the score, the more accepting the test-taker is of conspiracy theories and other unverified beliefs.
The main findings are shown in the figure below.
The green line at top shows pre- and post-test IEUB scores for students in science classes. The red line underneath it shows scores for students in the research methods classes. These lines are quite flat, indicating no change from pre-test to post-test.
The good news is that the blue line drops from pre-test to post-test. In other words, students in the critical thinking course became significantly less accepting of unverifiable beliefs by the end of the semester. The results for conspiracy theories in particular showed the same pattern.
Unfortunately, the closer you look at the data, the more the figure above looks like a house of cards in mid-collapse.
1. Small effects.
Students were asked to rate each IEUB statement on a scale ranging from 1 ("totally sure it's false") to 5 ("totally sure it's true"). The mid-point, 3, is labeled as "not sure". As you can see from the graph, the average scores among students in the critical thinking class shifted from about 2.6 to 2. Practically speaking, that's not much change.
2. Item bias.
The IEUB mostly consists of flaky items.
I can't say how many of them are "flaky", because that's just my informal take and not everyone would agree with my impressions. What I have in mind are items on alien abduction, crystal healing, the moon landing hoax, and psychic detectives, among others.
I suppose it's nice if you can dislodge beliefs in such things, but doing so may be easier than changing complex, sensitive beliefs such as climate change denialism. The researchers didn't tap into many beliefs like that, nor did they report item-level analyses.
So, after an entire semester in a critical thinking class, undergraduates become slightly more skeptical about alien abductions and crystals and the like. Not very impressive.
3. Demand characteristics.
Imagine taking a college class in which you learn how to debunk unwarranted statements. Then you take a test that includes many statements the general public considers unwarranted. Regardless of what you truly believe, it's clear what the "right" answers are on this test. The small effects in this study could be attributable to some students answering the way they think they're supposed to answer.
Don't get me wrong: I strongly advocate for critical thinking courses. Every high school and college student should be required to take one, in my opinion, and I do believe that these courses can prevent conspiracy theories from taking root. I just don't think studies like this demonstrate it. In a moment I'll describe more persuasive data.
3. Inoculation
Debunking takes place after the fact: A conspiracy theory is aired; someone then tries to prove it wrong.
But what happens if you can reach someone before their conspiracy beliefs develop?
In a newsletter last year, I described how Dr. Sander van der Linden and colleagues at the University of Cambridge Social Decision-Making Lab have developed techniques for increasing resistance to misinformation, including conspiracy theories. Their techniques rely on something called psychological inoculation. Here's how I described this approach:
"Just as vaccines strengthen your immune system by presenting it with a small, weakened dose of a pathogen, so we can increase a person's resistance to misinformation by exposing them to low levels and explaining how it works."
Inoculation is a form of "pre-bunking", because it's meant to work before a person is exposed to misinformation. (There are many other pre-exposure strategies, such as priming, that I'm skipping over here.) Two key principles of implementation are to get people actively involved in the inoculation process (as opposed to, say, having them read a list of instructions), and to ensure that inoculation increases the ability to recognize misinformation in any domain.
Van der Linden and colleagues have developed interventions such as the internet-based, internationally prominent game Bad News, in which players learn how to use fake news to attract followers on a Twitter-like platform. The game teaches people about misinformation techniques that include floating conspiracy theories.
Does inoculation work? Yes indeed. In 2023 alone, I count more than 10 studies showing that inoculation techniques boost resistance to conspiracy theories and other kinds of misinformation. Methodologically, the studies vary in quality, but some are quite strong, and in a systematic review published in PLOS ONE last year, in which 25 separate interventions for reducing conspiracy thinking were compared, inoculation approaches were shown to have the strongest effects.
The success of inoculation strategies tells us that critical thinking coursework can be successful, even if the studies focusing on actual classes are methodologically weak. Inoculation is eminently teachable and may not even require much time (see here).
In or outside the classroom, inoculation is particularly useful because we can't know what the next conspiracy-du-jour will be. All we can be sure of is that it's already in the works. (For instance, did you know that Beyoncé is being recruited by the woke deep state to rig the World Series and ensure Joe Biden's re-election? You read it here first.)
An integrated approach
Inoculation techniques are clearly effective, but they tend to inhibit new conspiracy theories rather than dismantle existing ones. Stand-alone academic courses could address both, and so this is where I pin my hopes. Critical thinking courses can be (and, in a small number of places, already are) mandatory in secondary and/or post-secondary curricula.
To gain broader insight into what can be done, I reached out this week to Dr. Mikey Biddlestone at the University of Cambridge, an expert who studies conspiracy belief origins and interventions, among other things.
My first question to Dr. Biddlestone was, in essence, how conspiracy thinking can be addressed in settings such as academic courses where sustained attention to the problem is possible.
Dr. Biddlestone recommended a "careful mix" of interventions, starting with ones that focus on logical fallacies and then including those that focus on the manipulation techniques used by misinformation spreaders. Here's how he put it:
"The best way I can describe what I recommend is through the broad use of the inoculation framework (motivating resistance to persuasion then pre-emptively refuting how these persuasion attempts might occur)...
"Step one is...pre-bunking the logical fallacies that make us vulnerable to misinformation.... Next, I would say pre-bunk the manipulation tactics like polarization, false dichotomies, and deflection/trolling. Also important to note is the need to repeatedly administer these pre-bunks so they really become internalised into the students’ psychological vocabulary."
An essential goal of these activities, according to Dr. Biddlestone, is Actively Open-Minded Thinking. AOT, as he describes it, "is the tendency to engage in reflective thinking, and to practice intellectual humility in the face of evidence that refutes one’s beliefs." Reflection here means not just reflecting on evidence but also on one's beliefs, and how well they stand up to the evidence.
My takeaway from the discussion with Dr. Biddlestone, and from recent data, is that the best approach to dealing with conspiracy theorizing would be an integrated one: Work on the cognitive biases that make people susceptible to conspiracy theories. Help them understand how and why conspiracy theories are spread. Promote active open-mindedness that's both reflective and humble. And, in the process, help them reflect more critically on their own conspiracy thinking.
A political caveat
Earlier I advocated for critical thinking courses that rely on inoculation and other techniques to prevent conspiracy theories from taking root. My hope that such courses become mandatory is, admittedly, not very realistic at the moment.
At the K-12 level, critical thinking is woven into state curricular standards, but a common refrain among educators and educational scholars is that there's not enough of it, or that it's not taught to the extent that it should be, owing to many other instructional and assessment needs. It's hard to imagine making time for another mandatory course.
At the college level, the current climate is unfavorable for different reasons. Broadly, the push for more career-focused curricula doesn't typically include a call for more critical thinking courses (though perhaps it should). And there's a specific concern: Since last year, a subcommittee probe, led by House Judiciary Committee chair Jim Jordan (R-Ohio), has been exploring the federal government's alleged collaboration with social media companies for the purpose of silencing conservative voices.
This subcommittee's work includes the investigation of researchers who study how to prevent misinformation about politics and health. According to a Washington Post report, subpoenas, requests for information, and other activities by Jordan's subcommittee have already chilled both funding and research on these topics in the U.S. This is not a time for creating new critical thinking courses, or for including existing ones in general education requirements. Rather, it's a time for pushing back against an ideologically-driven agenda that amounts to an attack on free speech. (For guidance about what you might do, I'd suggest reaching out to the Center for an Informed Public (CIP) at the University of Washington.)
Final thoughts
It's easier to lie than it is to disprove a lie. Conspiracy theories are easily invented, and, once aired, they tend to linger. (More than seven years later, Pizzagate is still with us.) In a future newsletter I'll discuss the reasons for their persistence.
Earlier I suggested that interventions can be effective in spite of uncertainty around the causes of conspiracy thinking. I floated this idea to Dr. Biddlestone and he agreed, while alluding again to the importance of encouraging reflection on conspiracy beliefs:
"I would say, in line with your intuition, it doesn’t really matter [which variables are causes or effects of conspiracy thinking]. While other scholars may disagree with me, I would argue that the cyclical nature of unmet psychological needs motivating belief in conspiracy theories, which then in turn exacerbate the same unmet psychological needs, suggests that simply identifying a strong correlate and drawing people’s attention to how it makes one vulnerable to conspiracy narratives is a fruitful endeavour to break the cycle regardless of whether it’s seen as a consequence or predictor."
In other words, conspiracy thinking might be diminished by helping someone realize, for instance, that paranoia underlies their beliefs, and that, in turn, the conspiracies they discover fuel their paranoia. Reflecting on this cycle may help break it, regardless of whether the paranoia or the conspiracy beliefs came first. With a little guidance, people can inoculate themselves.
In the meantime, we have got to get Taylor Swift off the Pentagon's secret payroll!
Thanks for reading!