Boosting Resistance to Misinformation
Yesterday, roughly four hours before the event itself, the New York Times ran an article titled "Preparing to Fact-Check the First G.O.P. debate..."
The article analyzed false or misleading statements previously made by each Republican candidate, in hopes that readers would be better prepared to recognize misinformation bubbling up during the debate itself.
Spotting misinformation during a presidential debate is like spotting leaves in a forest: You may not see all of them, but you will see many. Still, the Times article was useful, because misinformation can be quite subtle (and the pace of these debates is relentless).
I appreciated the article too because it relied on an important, increasingly influential technique for boosting resistance to misinformation. That technique, psychological inoculation, is the focus of this week's newsletter.
Specifically, this newsletter was prompted by a new study that questions whether inoculation is effective. It turns out that, owing in part to misapplied statistical methods, the new study's results are themselves misinformative. (Ironic, right?) The study illustrates that statistics can't be useful when they don't fit the data being analyzed.
Curing misinformation
Researchers nowadays often treat misinformation like a virus that can "infect" people. As you'll see, this is more than just a clever metaphor. It's a theoretical perspective that allows for better understanding of how misinformation spreads through social networks, and how it can be contained or prevented.
One way to manage disease is via treatment. When we encounter a person who's "infected" by misinformation, we treat the infection by correcting them. We help them understand that one of their beliefs is misguided or flat wrong.
In Plato's dialogues, this kind of treatment works splendidly well. Socrates not only corrects the false beliefs of his fellow Athenians, he also frequently gets them to acknowledge their mistakes.
If only things were so easy... In the real world, correction is not so consistently effective. There are lots of reasons for this, including the fact that people don't like to be corrected (even by famous philosophers). This June, for instance, in a meta-analysis of 74 studies on science-related misinformation, researchers reported that correction was not successful, on average.
Under certain conditions (e.g., well-reasoned and tactful feedback, politically neutral themes, topics other than science, etc.), correction works, but the effects aren't consistently strong. Misinformation may have residual effects even after people are corrected. Meanwhile, the attempt to correct someone introduces a risk of backfire effects (i.e., a strengthening of their original belief).
In sum, whether a fact-checking organization debunks a fake news item, or whether, in a one-on-one conversation, you tactfully correct someone, these treatments are not going to come close to fully "curing" the disease.
Preventing misinformation
If treatments for misinformation are limited, what about prevention?
Metaphorically speaking, preventing misinformation is much like preventing any other kind of disease:
You can try to kill the pathogen (as when a social media platform bans obviously fake news).
You can warn people where the pathogen has been found (as when a social media platform flags potentially misleading or false content).
You can also try to increase people's resistance. For instance, critical thinking skills woven into the K-12 curriculum promote resistance to misinformation, analogous to the way a healthy lifestyle promotes resistance to disease.
This newsletter focuses on a more targeted approach.
Psychological inoculation
Building on prior theoretical work, Dr. Sander van der Linden and his colleagues at the University of Cambridge Social Decision-Making Lab have developed techniques for boosting resistance to misinformation that rely on psychological inoculation.
(For a quick introduction to Dr. van der Linden and his work, check out this excellent profile in Wired magazine. Suffice it to say he's one of the world's leading social psychologists.)
The gist of the inoculation approach is this: Just as vaccines strengthen your immune system by presenting it with a small, weakened dose of a pathogen, so we can increase a person's resistance to misinformation by exposing them to small, weakened doses of it and explaining how it works.
Two principles are especially important to the implementation of this approach:
First, people need to be actively involved in the inoculation process, as opposed to, say, passively reading an essay on misinformation.
Second, the goal of inoculation needs to be broad-spectrum immunity, or the ability to recognize misinformation techniques when they're being applied to any topic.
Experts in cognitive psychology and education would approve of the pedagogy implied here. The first principle calls for active learning strategies. The second principle takes the goal of learning to be the broadest possible transfer. Both principles are embodied in the online interventions that Dr. van der Linden and colleagues have developed and tested.
Bad News
The first intervention that van der Linden and colleagues created is the internet-based, internationally prominent game Bad News.
Bad News is an interactive social media game, patterned on Twitter, in which users become a "fake news tycoon" and post fake news items. During the game, which lasts about 15 minutes, users choose fake news content with the goal of obtaining as many followers as possible while maintaining credibility.
Via prompts, the game teaches users about six misinformation techniques: Impersonating experts, using polarizing descriptions, discrediting fact-checkers, provoking negative emotions, floating conspiracy theories, and trolling people online.
Besides allowing users to choose their fake posts and observe the impact on follower counts, Bad News also provides feedback on choices, offers suggestions, and explains what does and doesn't work. All of this plays out in a very casual way. For instance, when I attempted to post the item "The 25 Most Popular TV Shows!", my follower count barely changed, and one follower responded: "YAAAWN. Boring! Who cares?" I also received a prompt noting that I hadn't used any manipulation techniques. In less than a minute, after a few clicks, I found myself posting the following meme to attack scientists who fret about climate change:
Posting this highly manipulative message increased my follower count from 130 to 219. (See how easy it is to get famous on social media! Please note, though, that the post and followers aren't real, and I strongly support climate scientists.)
Good news about Bad News
Numerous studies have shown that Bad News increases resistance to misinformation. Here's just one of many examples:
In a 2020 study, Dr. van der Linden and a team led by Dr. Melisa Basol (at the time a doctoral student) randomly assigned 197 adults to either play Bad News or Tetris for 15 minutes. At pretest (before playing their assigned game) and at posttest (after playing the game), each participant was presented with 18 fabricated Twitter posts such as the following:
BREAKING: Insurance companies are using your phone to track your fast food consumption.
Medical students only receive 5 hours of tutoring in nutrition. Don't trust doctors' dietary advice.
Three posts apiece were used to represent each of the six misinformation techniques covered in Bad News. (The first post above represents Conspiracy Theories; the second one represents Discrediting Opponents.) There were also three true news items not included in the main analyses.
Participants rated the reliability of each post on a scale from 1 (not reliable at all) to 7 (very reliable). They also rated their confidence in each of their reliability judgments on a 7-point scale. Finally, they described their own political ideology on a scale ranging from 1 (very conservative) to 7 (very liberal).
The main findings were straightforward:
—Among the Tetris group, mean reliability scores barely changed from pretest to posttest. Among the Bad News group, mean reliability scores significantly declined. In short, after playing Bad News, participants tended to view the fake news posts as less reliable.
—Among the Tetris group, mean confidence scores barely changed from pretest to posttest. Among the Bad News group, confidence increased significantly. The increase mainly occurred among participants who tended to view the fake news items as less reliable. (This tells us that Bad News wasn't instilling false confidence about one's own powers of discernment.)
—The impact of Bad News was not influenced by political ideology. In other words, the game benefited people comparably regardless of how liberal or conservative they were.
Overall, there's lots of good news about Bad News here. Although the impact on reliability scores wasn't large (about half a point on the 7-point scale, on average), it was significant, and such effects, as well as larger ones, have been demonstrated in many other studies. These studies have examined Bad News as well as Go Viral!, a version of the game that focuses on misinformation related to COVID-19.
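If you'd like to see the shape of this pre/post analysis in code, here's a minimal sketch in Python. The numbers and column names are made up for illustration; only the structure (1-to-7 reliability ratings, game vs. control group, pretest vs. posttest) mirrors the study.

```python
# Minimal sketch of the pre/post, treatment-vs-control comparison.
# All data below are made up for illustration; only the structure
# mirrors the study (1-7 reliability ratings of fake news items).
import pandas as pd

ratings = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "group": ["Bad News"] * 4 + ["Tetris"] * 4,
    "phase": ["pre", "post"] * 4,
    "reliability": [4.2, 3.6, 4.0, 3.5,   # Bad News: ratings drop
                    4.1, 4.0, 3.9, 4.0],  # Tetris: little change
})

# Mean fake-news reliability rating by group and phase
means = ratings.groupby(["group", "phase"])["reliability"].mean().unstack()
means["change"] = means["post"] - means["pre"]
print(means)
```

With these invented numbers, the Bad News group's mean rating drops by about half a point while the Tetris group's barely moves, which is the qualitative pattern the study reported.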
Inoculation effects or response bias?
Scientific findings rarely go unchallenged. Drs. Ariana Modirrousta-Galian and Philip Higham, both at the University of Southampton, just published a paper in the current issue of the Journal of Experimental Psychology: General purporting to show that the van der Linden team's findings reflect response bias.
Specifically, Modirrousta-Galian and Higham reanalyzed data from five of the published studies on Bad News or Go Viral!. They found that the games make people more skeptical about all sorts of news items, whether false or true.
If Modirrousta-Galian and Higham are correct, Bad News would be like an inoculation that shields you against harmful substances but blocks out the ones you need as well.
Imagine a cholera vaccine that prevents the bacterium from taking hold in the intestines, but also prevents essential nutrients from being absorbed. If people took this vaccine, nobody would ever die of cholera. Rather, we'd all starve to death. This would be a vaccine whose benefits fail to offset the costs, and we wouldn't want anyone to use it.
In short, Modirrousta-Galian and Higham report some very bad news about Bad News. Can we trust their conclusions?
The problem with colanders
In these newsletters, when I complain about a study, I often focus on what takes place right before, or right after, the statistical analyses are run.
What happens right before statistical analysis is the gathering of data – in a word, measurement. As I've mentioned before, good statistics can't save weak measures.
What happens right after the analysis is a process of figuring out what the results tell us – in a word, interpretation. I've also noted that good statistics don't guarantee sensible interpretations.
Here, my complaint focuses on the statistics themselves.
The statistical approach that Modirrousta-Galian and Higham used is actually perfectly fine in other contexts. It just isn't suitable for reanalyzing van der Linden and colleagues' data.
Here's how you might think about it: A colander is a useful tool, but not for holding soufflé. In essence, Modirrousta-Galian and Higham poured soufflés of published data into a colander and complained that they fell apart.
(If you're a stats person and find what I just wrote too impressionistic, the method was a fairly standard application of ROC curve analysis. You can imagine the details based on what I've written in the next section.)
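(Continuing for the stats-minded: the sketch below, using made-up ratings rather than the published data, illustrates the key distinction an ROC reanalysis draws. A uniform drop in all of a person's ratings registers as a shift in response bias while leaving discrimination, measured as AUC, untouched. This is not the authors' code; it just shows the kind of pattern that gets labeled "response bias" in this framework.)

```python
# Toy sketch (hypothetical ratings, not the published data) of what an
# ROC reanalysis measures: discrimination vs. response bias.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

# Hypothetical pretest reliability ratings on the 1-7 scale
real_pre = rng.normal(5.0, 1.0, 500)   # real news items
fake_pre = rng.normal(3.5, 1.0, 500)   # fake news items

# Suppose the game made people rate EVERYTHING one point lower:
real_post, fake_post = real_pre - 1.0, fake_pre - 1.0

def auc(real, fake):
    """P(a random real item outranks a random fake item)."""
    labels = np.r_[np.ones(len(real)), np.zeros(len(fake))]
    return roc_auc_score(labels, np.r_[real, fake])

print(f"AUC pre:  {auc(real_pre, fake_pre):.3f}")    # discrimination...
print(f"AUC post: {auc(real_post, fake_post):.3f}")  # ...unchanged
print(f"mean rating pre:  {np.r_[real_pre, fake_pre].mean():.2f}")
print(f"mean rating post: {np.r_[real_post, fake_post].mean():.2f}")
# A uniform shift lowers all ratings (a "bias" shift) but cannot change
# AUC, because the ranking of items is preserved.
```

Whether "bias" is the right lens for graded trust ratings in the first place is exactly the question taken up below.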
The specific problem
As Modirrousta-Galian and Higham view it, Bad News is meant to get people to identify fake news as fake, while recognizing that true news reports are true. What the researchers found when reanalyzing the data is that from pretest to posttest, both the fake news items and the true ones received lower scores.
However, Bad News doesn't teach people to say: this news item is true, that one is false. Rather, by showing how misinformation works, it helps people judge how much less they might trust a news item when misinformation techniques appear to have been used.
Think back to my description of that 2020 Bad News study. Participants were not asked to judge whether each Twitter post is true or false. Rather, they judged how reliable it is on a 7-point scale. All of the studies are like that. Dr. van der Linden and colleagues are very careful about how they describe their data.
Concretely speaking, here's the problem with Modirrousta-Galian and Higham's approach: When a participant's score for some news item drops slightly from pretest to posttest, this doesn't mean that they're now viewing the item as false. It just means they trust it a little less. A drop from 6 to 5, say, still leaves the item near the "reliable" end of the scale.
Moreover, a close look shows that response bias (as Modirrousta-Galian and Higham viewed it) didn't always surface, or it only surfaced temporarily, or it surfaced among the control group, or it was specific to certain items.
In short, Modirrousta-Galian and Higham provided no clear evidence that playing Bad News makes people reject all sorts of news as fake. Rather, the game helps people recognize when misinformation techniques are being used and figure out how much less reliable they find the information.
The value of colanders
Modirrousta-Galian and Higham's statistical approach wasn't suitable for reanalyzing the Bad News and Go Viral! data. It was a colander attempting to hold up soufflés.
Colanders can be useful, of course. At times, we do need to distinguish between obviously fake and clearly credible news. We sometimes do need to say: this news item is true, that one is false. But distinctions in truthfulness or accuracy are often a matter of degree. Even a single headline might contain multiple statements that vary in degrees of accuracy (see below). It's easy to find news we consider true but presented in a slightly misleading way. And what looks more or less like fake news may often contain an element of truth. For example, consider this Fox News headline from last week:
"Appeals court stops FDA from jeopardizing pregnant women’s lives with abortion pill."
It's true that last week, a U.S. appeals court ruled that access to the abortion pill mifepristone should be restricted. However, this headline is constructed from a mix of false or misleading assumptions:
(a) The ruling doesn't take effect immediately; thus it doesn't "stop" anything.
(b) The ruling concludes that mifepristone should retain FDA approval, but access must be tightly restricted. Thus, if you assume that mifepristone jeopardizes pregnant women's lives, you wouldn't say that the ruling will stop this from happening.
(c) Data reviewed by the FDA suggest that mifepristone is actually safe for pregnant women.
In short, the Fox headline contains a grain of truth buried in deeply misleading or false assumptions.
Bad News doesn't teach people to judge that headlines like this are false. Rather, it teaches them to notice the misinformation techniques being used. In this headline, you can see polarization of a sensitive topic, as well as an attempt to stir up (or tap into) negative emotions such as anger against the FDA and fear for pregnant women. (There's also the plain dumbness of it. Good journalism distinguishes between fact and interpretation. The Fox headline clearly launches straight into the latter. It doesn't convey the actual substance of the ruling.)
Further clarification
On Monday I reached out to Dr. Modirrousta-Galian via email but, as of today, I have not heard back. I also emailed Dr. van der Linden. I asked one broad question, framed as innocuously as possible: "I am wondering how you view their [Modirrousta-Galian and Higham's] analytic approach and whether you agree with their interpretations of your data."
Dr. van der Linden aired the general objection to Modirrousta-Galian and Higham's statistical approach that I've described here. He also discussed item-specific issues. For instance, Modirrousta-Galian and Higham found it troubling that response bias emerged for ambiguous true items. Dr. van der Linden found this unproblematic:
"It could be true that people err on the side of caution when presented with “ambiguous” real news items...However, why is this an issue? Our interventions don’t teach people facts about the world, they only alert people to potential manipulation so when people come across ambiguous stimuli they probably err on the side of caution. It could be that requiring a higher threshold for accepting a stimulus as true leads people to search out more information (presumably a good thing)...."
I would agree that caution and information-seeking are desirable responses to news items that are true but not perfectly clear.
Dr. van der Linden went on to note that in the earlier studies conducted by his team, most of the news items were fake rather than real. His group now has data showing that a more balanced mix continues to yield evidence of discernment rather than response bias.
Not to pile it on, but Dr. van der Linden also pointed out that Modirrousta-Galian and Higham's reanalysis was far from exhaustive. A new meta-analysis, currently in press, looked at the literature more comprehensively and found that psychological inoculation improves the assessment of misinformation as well as real information.
Finally, Dr. van der Linden commented that
"...the beauty of our interventions is that they are “living”, so for the people who are concerned we have implemented a simple feedback tool which gives people feedback on several trials during gameplay which helps consolidate learning and consistently boosts discernment. This paper is under review, but the feedback modules are already live because we want our interventions to be the best they can be and [we can thereby] respond to reasonable criticism."
I especially like that. As van der Linden and his colleagues continue to expand their research, they're continuing to tweak the online games that are currently being played by people all around the world. Not just the virus but improved versions of the vaccine are spreading too.
Thanks for reading!