Fact Checking
What do you do when you encounter a claim that seems misleading or false?
Since a new election year is underway, I'll ask more narrowly: What do you do when you read or hear dubious political news?
Fact-checking organizations seem like a useful resource. Prominent ones like FactCheck.org, Politifact.com, and Snopes.com are free, impartial, and widely praised for their accuracy. But we might ask: Who fact checks the fact checkers?
In the first half of this newsletter, I'll tell you about a recent study that does just that. You'll find some good news there.
Fact-checking organizations can't evaluate every claim, and they may need time to do their work. We're often left to evaluate dubious statements ourselves.
Doing so is easy – we "just google it" – but the internet bristles with misinformation, so we might also ask: How successful is DIY fact-checking?
In the second half of this newsletter, I'll tell you about a new study that addresses this question.
At first glance, the results of the new study are disturbing. The researchers find that online searches make misinformation more believable. You'll see, though, that owing to some statistical missteps, the news from this study isn't as bad as it first appears.
At the end, I'll describe three useful strategies for DIY fact-checking. In the Appendix, I'll share an update on the Statisfied project.
Fact-checking the fact checkers
In late October, Sian Lee and colleagues at Penn State published a study in Misinformation Review that compared the verdicts of prominent fact-checking organizations.
The researchers' main focus was on Snopes and Politifact, owing to the prominence of these organizations as well as similarities in their rating systems.
Key ratings by Snopes include True, Mostly True, Mixture, Mostly False, and False, among others. Key ratings by Politifact include True, Mostly True, Half True, Mostly False, False, and Pants on Fire.
(I love that distinction between False and Pants on Fire. Saying that Elvis was born in Seattle would be False. Saying that you spotted Elvis at a sushi bar last night would be Pants on Fire, because, you know, he didn't like fish.)
Lee and colleagues identified 749 claims posted between January 1, 2016 and August 31, 2022 on which Snopes and Politifact had issued directly comparable verdicts. The two organizations agreed on 748 of the 749.
(The lone exception? A claim that Ben Carson said this: "illegal immigrants who get caught voting should be stripped of their citizenship". Snopes judged this Mostly True; Politifact judged it Mostly False. Why? Because what Carson actually said was that "Anyone caught involved in voter fraud should...have his citizenship revoked." Snopes took "Anyone" to mean "illegal immigrants", hence their Mostly True verdict. Politifact took "Anyone" to mean "any American". There's no way to determine which organization is right without asking Mr. Carson directly what he meant. However, if you looked at each organization's rationale for its verdict, you'd see what Carson actually said and could decide for yourself. Since both organizations provided the actual verbiage, I'd call the extent of their agreement 748.5 out of 749.)
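If you're curious how a comparison like this works mechanically, here's a minimal sketch. The label mappings and verdict pairs are invented for illustration; in particular, collapsing Pants on Fire into the bottom rung of a shared scale is my assumption, not necessarily how Lee and colleagues aligned the two rating systems.

```python
# A toy comparison of two fact-checkers' verdicts on the same claims.
# The mappings and verdict pairs below are hypothetical; this is not
# the Lee et al. dataset, just the logic of the comparison.

# Map each organization's labels onto a shared 5-point truthfulness scale.
# (Folding "Pants on Fire" into the lowest rung is my own assumption.)
SNOPES = {"False": 1, "Mostly False": 2, "Mixture": 3,
          "Mostly True": 4, "True": 5}
POLITIFACT = {"Pants on Fire": 1, "False": 1, "Mostly False": 2,
              "Half True": 3, "Mostly True": 4, "True": 5}

# Paired verdicts on identical claims (invented for illustration).
pairs = [
    ("False", "Pants on Fire"),
    ("Mostly True", "Mostly True"),
    ("Mixture", "Half True"),
    ("Mostly True", "Mostly False"),  # a rare disagreement
]

matches = sum(SNOPES[s] == POLITIFACT[p] for s, p in pairs)
print(f"Agreement: {matches} of {len(pairs)} ({matches / len(pairs):.0%})")
```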
The researchers drew a pretty straightforward conclusion:
"[T]he two fact checkers have established consistent and reliable fact-checking practices, [which] enhances the credibility of fact checkers in the eyes of the public...Furthermore, the consistency of fact-checking among major organizations is crucial for mitigating misinformation online..."
Bait and switch?
Wait, you may be thinking. You told me this was a study on fact checking the fact checkers. The study only shows that the two fact-checking organizations agree with each other. That's not the same as fact-checking their work. Maybe they're both consistently wrong.
That's possible in theory, but then, in theory, you could doubt almost anything, à la Descartes and other philosophers. But you can't live that way. A couple of paragraphs ago, I got up to use the bathroom, but I didn't doubt the existence of my toilet. Of course it's possible that someone had broken into my house and stolen it, but we needn't entertain doubts unless there's a reason to.
In order to survive, we have to trust someone for the information we deem factual. I would say: Those you should trust (while remaining critical) should include fact checkers like Snopes, Politifact, FactCheck, and APNews. Trust them because they're transparent about their evaluation process, because they have good reputations for neutrality and accuracy, and because this recent study shows that two of them agree with each other. They're not perfect, but they're the best we have.
When fact checkers can't help
Fact-checking organizations can't evaluate every questionable claim, and they may take time to reach their verdicts. (Donald Trump's political success has been attributed in part to the volume and speed of his misstatements. The Washington Post tallied 30,573 false or misleading claims during his presidency alone, and fact-checkers often struggled to keep up.)
Snopes and Politifact verdicts yield simple conclusions (True, Mostly True, etc.), but some claims can't be evaluated that way. Consider, for instance, assertions that judicial decisions involving Donald Trump are politically motivated. You can't fact check claims about a judge's unspoken personal biases. You can only amass evidence and speculate.
If a fact-checking organization can't help you evaluate a claim, you may need to do it yourself. How good are we at this?
A new study
This study, published two weeks ago in Nature, was conducted by Dr. Kevin Aslett at the University of Central Florida, along with colleagues at NYU and Stanford.
Study participants were asked to evaluate the truthfulness of recent news articles deemed either "true" or "false" by a team of six professional fact-checkers.
Participants were divided into two groups: One group was instructed to search online for evidence that might support or undermine the central claim of each article. The other group received no instructions and apparently ran no searches. Both groups then judged each article as "true", "false or misleading", or "could not determine". They also rated each article on a 7-point scale ranging from 1 (definitely false) to 7 (definitely true).
Across five experiments, each incorporating thousands of participants, Aslett and colleagues found that online searches actually increased belief in misinformation.
Specifically, compared to participants who didn't search, those who searched online were significantly more likely to judge a false article as true. And, on average, they rated false articles more highly on the 7-point scale.
This seems like bad news. Online searching is widely recommended as a strategy for sussing out misinformation. This study says that doing so is counterproductive.
Do the data really show that online searching is harmful?
No.
The study is rigorously designed, but two missteps limit or invalidate its gloomy conclusion.
1. Tiny effects.
Online search barely influenced participants' judgments. For instance, across the five experiments, group differences never exceeded three-tenths of a point on the 7-point scale.
Three-tenths of a point is, for example, the difference between a mean rating of 6 versus 6.3 on a scale where 7 is labeled "definitely true". That difference may be statistically significant, but it lacks practical importance.
I mean, if I told someone I shared a donut with Elvis this morning, I wouldn't consider them more gullible if they rated my comment a 6.3 as opposed to a 6. Either way, I'd be selling them tickets to the next Fyre Festival.
(By the way, if you're a stats person, all relevant Cohen's d values were less than 0.2, the conventional lower bound for a "small" effect.)
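For the stats people, here's a quick sketch of what a Cohen's d that small looks like. The ratings are simulated, not the study's raw data; the 0.3-point mean difference comes from above, and the spread of about 1.5 points is my assumption.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated ratings on the 7-point scale (1 = definitely false,
# 7 = definitely true). Assumptions: a 0.3-point mean difference
# and a typical spread of about 1.5 points.
no_search = rng.normal(6.0, 1.5, 1000)
searched = rng.normal(6.3, 1.5, 1000)

def cohens_d(a, b):
    """Difference in means divided by the pooled standard deviation."""
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (b.mean() - a.mean()) / pooled_sd

print(f"Cohen's d = {cohens_d(no_search, searched):.2f}")
# Under these assumptions, d lands near 0.3 / 1.5 = 0.2, the conventional
# floor for even a "small" effect. A large sample can make such a
# difference statistically significant while it stays practically trivial.
```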
2. Ecological invalidity.
Participants were recruited through Qualtrics and Mechanical Turk; we don't know much about them other than the fact that their demographics, including age and educational levels, varied widely.
Across the five experiments, thousands of participants were instructed to search online. Doubtless some of them don't ordinarily fact check their news. Perhaps they don't want to; perhaps they lack the necessary skills.
I would expect then that during the study, some of these people found information they mistakenly viewed as corroborating false news items.
I say this because when you google a false claim, your first hits often support a thematically related idea. For instance, if you google "are there viruses on Mars?", you'll get a bunch of reputable sources discussing the scientific possibility of viruses on the planet.
Someone who's not motivated to fact check their news, or not skilled at doing so, may see those related hits and mistakenly assume these sources support whatever they're evaluating.
In short, the study may only show that people who don't fact check their news won't be good at it the first time they try.
(Aslett and colleagues' own data are consistent with this possibility, as one of their experiments links poor search strategies to higher proportions of misleading hits and greater acceptance of false items.)
The researchers could've prevented the problem I've described here by asking people about their fact-checking behavior beforehand and then either excluding inexperienced people or distributing them evenly between the groups that were and were not instructed to run searches.
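Here's a minimal sketch of that second fix, stratified random assignment, with a made-up participant pool and a made-up screening question. It simply guarantees that experienced and inexperienced fact checkers land evenly in both conditions.

```python
import random

random.seed(1)

# Hypothetical participant pool, screened beforehand on whether they
# already fact-check their news (the question I wish had been asked).
participants = [{"id": i, "experienced": i % 3 == 0} for i in range(300)]

def stratified_assign(pool):
    """Split each stratum (experienced / inexperienced) evenly
    between the 'search' and 'no search' conditions."""
    groups = {"search": [], "no_search": []}
    for flag in (True, False):
        stratum = [p for p in pool if p["experienced"] == flag]
        random.shuffle(stratum)
        half = len(stratum) // 2
        groups["search"] += stratum[:half]
        groups["no_search"] += stratum[half:]
    return groups

groups = stratified_assign(participants)
for name, members in groups.items():
    n_exp = sum(p["experienced"] for p in members)
    print(f"{name}: {len(members)} total, {n_exp} experienced")
```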
Bottom line: Any negative impacts of fact checking online in this study are tiny and probably reflect unfamiliarity with the process. More broadly, I think we can still assume that DIY fact checking per se isn't harmful. It's only harmful if done badly.
DIY fact checking: A brief guide
So far I've argued that fact-checking organizations such as Snopes and Politifact are credible, and that DIY fact checking is not inherently problematic.
I'd go further and say that DIY fact checking is highly desirable, given how many dubious claims we encounter. Facebook, TikTok, Twitter (ok, X), Reddit…social media alone is a misinformation universe, despite each platform's more or less sincere efforts to curb the problem.
So, what should we keep in mind when doing our own fact-checking? Here are three suggestions:
1. Identify credible sources.
We depend on others – scholars, practitioners, journalists, etc. – for so much of what we know. Finding credible sources is essential.
What makes a source credible? Being recognized as an expert (e.g., having an advanced degree). Being affiliated with a respected institution (e.g., a university). Sharing expertise in respected outlets (e.g., peer-reviewed journals and reputable news media). Having no apparent conflicts of interest.
Obviously, these criteria are fallible. For instance, yesterday, Dr. Joseph Ladapo, the Surgeon General of Florida, called for a halt to the use of COVID-19 vaccines in his state on the grounds that they contaminate our DNA.
Dr. Ladapo might seem like a credible source. He has a medical degree. He's a state surgeon general. And, he has published in peer-reviewed journals.
However, there's literally no evidence for his claims about DNA contamination from COVID vaccines. The claims are purely speculative, and experts don't even consider them plausible as speculations go.
This brings me to my next suggestion:
2. Look for corroboration across credible sources.
There's widespread, if not total, agreement among experts on the safety of COVID-19 vaccines. At the same time, organizations such as the FDA, editors of leading scientific journals, and a range of scientists concur that Dr. Ladapo engages in baseless speculation. (Given that he answers to Governor Ron DeSantis, who maintains a strong anti-vaccination stance, you might suspect some conflict of interest here.)

Corroboration is especially important in cases like this because, as the prominent epidemiologist Katelyn Jetelina points out, knowledge of molecular biology is needed to distinguish the kernels of truth in Ladapo's pronouncements from the wholly unfounded conclusions he draws. Yes, COVID-19 vaccines contain DNA fragments, but no, they won't contaminate the DNA of recipients (see here for details).
3. Read and listen critically.
Sometimes unreliable sources can be recognized simply by the way they present their topic. But regardless of source, you should be skeptical of any content that seems overly simplistic, exaggerated, incoherent, inaccurate, poorly evidenced, out-of-date, incomplete, or biased. Be especially skeptical of people who claim to have all the answers to complex issues (something you'll hear a lot during the upcoming election). The sources I trust the most take time to acknowledge and reflect on uncertainties.
Final thoughts
Misinformation has no political allegiance. I wouldn't say that Republicans are much more prone to it than Democrats or Independents. But as 2024 gets underway, I can't help feeling that Republicans have the edge when it comes to sharing and accepting some of the most dangerous types. For instance, in a Washington Post-University of Maryland poll published today, 34% of Republicans agreed that the January 6, 2021 attack on the U.S. Capitol was "probably" or "definitely" instigated by the FBI. (Only about 13% of Democrats endorsed this claim.)
Fact-checking organizations, courts of law, and DIY searches have found no credible evidence that the FBI instigated the attacks, and so, at least for the moment, this constitutes misinformation.
DIY fact checking is crucial, even when it requires listening to what folks in the other political party have to say. It will help you keep the facts straight and recognize when others are getting them wrong. As the saying goes, keep your friends close and your enemies closer.
Thanks for reading!
Appendix: Update on Statisfied
2023 was an exciting year for this project.
1. The newsletter acquired 913 new subscribers. (Welcome to each of you, and thanks to my continuing readers!) Although by social media standards this is a paltry number, I'm proud of it, because I haven't promoted myself on social media, and my approach to the newsletter is fairly scholarly (i.e., nerdy). In 2024 the newsletter format will stay much the same.
2. Thanks to a grant from the university where I'm a professor emeritus (i.e., retired), the Statisfied website has been built and is tapping its foot impatiently while I sort out which content is most suitable. I've been indecisive, as the website could still go in so many directions, but I do plan to get it online by late spring or summer.
3. I've made progress on my book and hope to have preliminary discussions with a few agents by the end of the year.