Public Opinion on the War
Like most people, when I see a public opinion poll, I assume the results tell me how Americans feel, right now, about some topic.
Then I look more closely at the poll and realize I've overestimated how much I know.
There's a scientific term for overestimating how much you know (or how well you can perform) when your knowledge (or ability) is minimal: The Dunning-Kruger effect. You can see this effect sometimes when people talk about public opinion: The least informed people tend to be most sure about what others think. (A fuller explanation of Dunning-Kruger is provided in Appendix A.)
In this newsletter I'll be showing how a Fox News journalist misrepresented American attitudes toward the Russia-Ukraine war, owing to reliance on minimal data (i.e., a superficial reading of a recent poll). But my ultimate goal isn't to pick on the journalist. It's easy for anyone to overestimate how much they know about public opinion. My goal here is to present a case study of why this happens, and how it can be prevented.
The Fox News article
On March 7, Liz Peek published an opinion piece in Fox News entitled "End Ukraine slaughter and stand up to Russia, Americans tell Biden in new poll". (Ms. Peek was referring to a poll from Reuters/Ipsos first reported on March 4.)
Ms. Peek's takeaway from the poll mostly consisted of sweeping generalizations: "Americans want Biden to get more aggressive." "Americans want Joe Biden to stop the slaughter in Ukraine." "Americans want to see that the United States...cannot be cowed by a thug in the Kremlin." However, the results of the poll don't support any of these statements. For one thing, the relevant poll questions don't ask about President Biden but about the United States. We don't know whether respondents were thinking about the President, and/or Congress, and/or something else when they answered questions about what the "United States" should do. (Another issue is that the poll didn't ask about topics like aggression, stopping the slaughter, etc.)
Readers who dig deeper into Ms. Peek's opinion piece will find only one reference to the poll's specific findings:
"A new Reuters/Ipsos poll shows an overwhelming majority of Americans (74%) support establishing a no-fly zone over Ukraine, banning U.S. imports of Russian oil (80%), and imposing further penalties on the Russian economy (81%)."
People who read this passage may feel like they learned something about public sentiment, but a closer look at the poll raises serious doubts. Here are three reasons why we learn virtually nothing from these stats:
1. Respondents may not have understood some of the questions.
Consider the finding that 74% of respondents support the creation of a no-fly zone over Ukraine. The polling organization itself acknowledged that respondents may not have known what a no-fly zone is, or realized that enforcement would require us to shoot down Russian aircraft.
I suspect that if the concept of "no-fly zone" had been explained, fewer than 74% of respondents would have supported it, because support for U.S. military engagement with Russia has never exceeded 39% in any nationally representative survey thus far.
2. Some questions may not have been sensitive enough to capture respondents' attitudes.
The conciseness of public opinion polls is a mixed blessing. You can obtain data quickly and easily from large samples, but the price is a relatively superficial understanding of how people think. For example, respondents for this particular Reuters/Ipsos poll were given only two options: Support or not support. "Uncertain" wasn't an option.
In a separate poll conducted by The Economist/YouGov, which did include an "uncertain" option, only 45% of respondents indicated that a no-fly zone would be a "good idea". 20% considered it a "bad idea", and 35% were "uncertain".
The Economist/YouGov data suggest that the 74% who supported a no-fly zone in the Reuters/Ipsos poll included at least some people who were actually uncertain. People who hadn't decided how they felt (or weren't completely sure what a no-fly zone is) may have seen the no-fly zone as a compromise between doing nothing and intensive military involvement.
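We can even put a rough number on this. Purely as an illustration (assuming the two samples are comparable and that everyone who called the no-fly zone a "good idea" would also choose "support" in a forced-choice format; both of these are my assumptions, not the pollsters'), here's what fraction of the "uncertain" group would have to default to "support" for the two polls to agree:

```python
# Back-of-the-envelope reconciliation of the two no-fly-zone results.
# Assumptions (mine, for illustration only): the samples are comparable,
# and every "good idea" respondent would also answer "support" when
# forced to choose between support and oppose.
good_idea = 0.45        # Economist/YouGov: "good idea"
uncertain = 0.35        # Economist/YouGov: "uncertain"
forced_support = 0.74   # Reuters/Ipsos: "support" (no uncertain option)

implied_default = (forced_support - good_idea) / uncertain
print(f"{implied_default:.0%}")  # ~83% of the uncertain group
```

Under those assumptions, roughly four out of five undecided respondents would have had to break toward "support" when denied an "uncertain" option. That's speculative, but it shows how easily a forced-choice format can inflate an apparent "overwhelming majority."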
As for the 81% of respondents who favored imposing further penalties on the Russian economy, the poll doesn't explain what these penalties are, or to what extent they might adversely affect the U.S. The nature of the penalty matters. For example, oil-related sanctions would cause Americans to pay more for fuel and gas, but in response to a separate question, only 58% of respondents indicated that doing so would be worthwhile "in the interest of defending another democratic country."
Here again, if something had been stated about what those "further penalties" might be, or if respondents had been allowed to choose an "uncertain" option, support for such penalties might have differed considerably from 81%.
3. Discrepancies across poll results undermine the validity of all findings.
In the Reuters/Ipsos poll, 37% of respondents supported our sending American troops to help defend Ukraine, and 37% supported air strikes on Russia. However, in the Economist/YouGov poll, only 19% of respondents supported sending American troops to Ukraine to fight Russian soldiers, and only 20% supported air strikes on Russia.
In short, the percentage of Americans who support military involvement is almost twice as high in the Reuters/Ipsos poll, even though both polls consisted of large national samples and were conducted during the same week. What gives?
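Before chalking the gap up to poll design, it's worth ruling out sampling error. Here's a minimal sketch of a two-proportion z-test, treating the two questions as if they measured the same thing (which, as Appendix B discusses, they don't quite) and assuming sample sizes of roughly 1,000 per poll, which is typical for national surveys like these:

```python
from math import sqrt

# Two-proportion z-test: could the gap between the polls be sampling noise?
# The ~1,000-per-poll sample sizes are my assumption, not published figures.
p1, n1 = 0.37, 1000   # Reuters/Ipsos: send troops to help defend Ukraine
p2, n2 = 0.19, 1000   # Economist/YouGov: send troops to fight Russian soldiers

pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
print(f"z = {(p1 - p2) / se:.1f}")  # z ~ 9: far beyond sampling error
```

A z-score that large is essentially impossible to attribute to chance, so the explanation has to lie in how the polls were designed.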
Earlier I mentioned that the Economist/YouGov poll includes an "uncertain" option, while the Reuters/Ipsos poll does not. Another important difference is that with respect to military involvement, the Reuters/Ipsos poll only asks about troops and air strikes. The Economist/YouGov poll includes questions about other kinds of military options, such as drone strikes, as well as cyber attacks and other responses not mentioned in the Reuters/Ipsos poll. In addition, the Economist/YouGov poll distinguishes between sending U.S. troops to help but not fight Russian soldiers, versus sending troops who would fight Russian soldiers.
Why are these differences important? Well, in every poll I've looked at, including these two, the majority of Americans want our government to do more (even if they don't agree on what the government should do). Thus, one might expect more Reuters/Ipsos respondents to favor military involvement, because the poll doesn't provide many options, and it doesn't allow respondents to express uncertainty. (For more on the discrepancy between the polls, see Appendix B.)
Conclusion
Liz Peek, the Fox News journalist who described the Reuters/Ipsos poll, either (a) disregarded what the poll actually showed, because her primary interest was to attack President Biden, or (b) read the poll carelessly, and didn't consult other polls, because she succumbed to a Dunning-Kruger effect. Americans may want our government to do more in response to the Russian invasion, but it's not clear that most of us want our president to be more aggressive, nor is it clear that "overwhelming majorities" of Americans want to see any particular type of response. The polls don't warrant very specific conclusions.
Preventive strategies
How can we avoid Dunning-Kruger effects? When we try to understand public opinion, for example, how can we ensure that we learn as much as possible, while avoiding the tendency to overestimate how much we know?
The strategies that seem most useful are essentially the same regardless of whether we're researchers, journalists, laypersons, etc.:
1. Consider multiple sources of information.
The Fox News writer who cited the Reuters/Ipsos poll would've had to tell a different story if she had considered the Economist/YouGov poll as well. (Beyond polling, other kinds of data might help too, including what can be gleaned from the endless multiverse of social media.)
2. Examine each source of information closely.
In the case of public opinion polls, questions as well as findings should be examined. The Fox News writer who referenced the Reuters/Ipsos poll would've had to tell a different story if she'd merely looked more closely at that poll.
3. Reflect on what you don't know.
How can you reflect on what you don't know? That sounds like something people do when they drop acid. Well, learning more, talking to people, and/or reflection can sometimes help us realize where our knowledge is lacking. For example, as I reflected on the two polls, it struck me that because stats are only reported for individual questions, we can only speculate about broader patterns in the way people are thinking. We might find, for example, that support for American military involvement is highly correlated with how threatened people feel by the Russian invasion. We might find that a small percentage of people are highly supportive of all types of military response, regardless of how extreme, and we might find that those people primarily affiliate with one political party. These are simple examples of how statistical analyses could've revealed more about public opinion.
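As a sketch of what the first of those analyses might look like, here's some respondent-level code run on synthetic data. (The data are invented for illustration; the pollsters publish only question-level percentages, which is precisely the limitation at issue.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # synthetic respondents; real respondent-level data isn't public

# Invented model: feeling threatened by the invasion (0-10 scale) raises
# the probability of supporting U.S. military involvement.
threat = rng.integers(0, 11, size=n)
support = rng.random(n) < (0.05 + 0.06 * threat)

# Question-level reporting stops here...
print(f"Overall support: {support.mean():.0%}")

# ...but respondent-level data would also reveal the underlying pattern.
r = np.corrcoef(threat, support)[0, 1]
print(f"Correlation between perceived threat and support: r = {r:.2f}")
```

The same respondent-level data would also let us cross-tabulate support by party affiliation, which is how the second pattern mentioned above could be checked.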
Appendix A: The Dunning-Kruger effect
David Dunning, Justin Kruger, and others have shown that for a variety of tasks, people with the least competence tend to be the most overconfident about their abilities. Owing to their lack of competence, they make mistakes but are the least able to recognize that they've done so.
The Dunning-Kruger effect is illustrated in the figure below for a test of logical reasoning ability. The y-axis is the percentile ranking for each test-taker compared to the entire group. (Participants in the study were asked to rate their own logical reasoning ability, relative to others, on a percentile scale; they also took a logical reasoning test and their scores were converted to percentiles.)
In this figure, the biggest discrepancy between how good people think they are at logical reasoning (perceived ability) and their score on an actual logical reasoning test can be seen among those in the bottom quartile (i.e., the lowest 25% of scores on the test). As you can see, folks in the lowest quartile consider themselves slightly above average. You can also see, moving from left to right, that as actual test performance increases, perceived ability increases only a little, so that the top quartile of test-takers actually underestimate their ability by a small margin.
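For readers who'd like to see the shape of the effect, here's a minimal plotting sketch. The numbers are illustrative values I chose to match the description above; they are not the published data points from the study.

```python
import matplotlib.pyplot as plt

# Illustrative values only, chosen to match the pattern described above
# (bottom quartile rates itself slightly above average; perceived ability
# rises only a little with actual performance). Not the study's real data.
quartiles = ["Bottom", "2nd", "3rd", "Top"]
actual = [12, 37, 62, 87]      # mean test-score percentile per quartile
perceived = [58, 62, 66, 74]   # mean self-rated percentile per quartile

plt.plot(quartiles, actual, marker="o", label="Actual test score")
plt.plot(quartiles, perceived, marker="s", label="Perceived ability")
plt.xlabel("Quartile of actual test performance")
plt.ylabel("Percentile")
plt.title("The Dunning-Kruger pattern (illustrative data)")
plt.legend()
plt.show()
```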
One of the important roles that journalists play is to convey the results of recent studies, polls, etc. to the general public. As a result, one can probably trace some misinformation to Dunning-Kruger effects that play out simultaneously among both journalists and their readers.
For example, journalists who gather the least information on public opinion (e.g., by skimming a single public opinion poll) may be the ones who most overestimate their understanding of public opinion, and thus they'll write about it with great certainty. An example would be the opinion piece discussed in this newsletter. At the same time, readers of this Fox News piece will know less than readers of journalists who provide more thorough coverage. As a result, the Fox News reader will overestimate how much they know, based on the writing of a journalist who has done the same!
Appendix B: Are the polls consistent on deployment of American troops?
In the Economist/YouGov poll, 33% of respondents indicated it would be a "good idea" to send U.S. soldiers to Ukraine "to provide help, but not to fight Russian soldiers." This is close to the Reuters/Ipsos poll finding that 37% of respondents favored sending U.S. soldiers "to help defend Ukraine". It's only when the Economist/YouGov poll asks about sending U.S. soldiers to "fight Russian soldiers" that support drops to 19%.
Does this difference in wording account for the discrepancy between the polls? Probably not. If virtually everyone in the Reuters/Ipsos poll thought that the question about sending U.S. soldiers "to help defend Ukraine" meant that the soldiers would go there to help, but not fight Russians, then the results of the two polls would be consistent. But I doubt most people think that way. Most people probably assume that sending soldiers "to help defend" a country means that the soldiers could be involved in combat. In any case, we're left to speculate, because the wording of the question in the Reuters/Ipsos poll is ambiguous.
In sum, the inconsistency between these two polls can't be resolved. We don't know to what extent Americans support direct military involvement. Based on these and other polls, perhaps all we can say is: Less than 50% of Americans now prefer any direct strategy that sounds even remotely "military". (In the Economist/YouGov poll, for example, 26% support drone attacks and 35% support cyber attacks.)