Free Speech on Campus
Describing university campuses as "intellectually repressive environments", Florida Governor Ron DeSantis signed a new law that took effect on July 1. The law requires public colleges and universities in Florida to conduct annual surveys documenting the extent of "intellectual freedom and viewpoint diversity" on campus. (The law also prevents any shielding of the campus community from offensive or unwelcome speech, and it allows students to surreptitiously record classroom lectures, without instructor consent, for use in civil or criminal cases against the institution.)
Faculty groups, as well as Democrats outside academia, strongly oppose the new law. My own concerns include, among other things, the annual surveys the law requires. In this newsletter I'll do three things:
1. describe problems with the Florida law's survey requirements;
2. discuss what can go wrong with these surveys (using the 2020 College Free Speech Rankings as an illustration); and
3. offer suggestions on how to make the best of this challenging legal mandate.
The Florida law offers no guidance on survey content or methodology, other than requiring that the surveys be "objective, nonpartisan, and statistically valid."
The first and last requirements are empty, because they don't call for anything survey researchers wouldn't be trying to do anyway (or at least claim they're trying to do). As for "nonpartisan", this could mean that survey creators have no political beliefs of relevance to the topic (which seems unlikely) or that survey questions and results are not aligned with some political perspective (which seems quite difficult to achieve).
In practice, anyone can claim that a survey (or anything else) is "nonpartisan", but you have to look and see. Case in point: a report called the 2020 College Free Speech Rankings, which recently created a fair amount of buzz in the media as well as academia (see here). The report describes itself as "nonpartisan," but it was funded by the Charles Koch Institute, which is openly partisan, and it was commissioned by The Foundation for Individual Rights in Education (FIRE) and RealClearEducation, organizations that describe themselves in multiple places as "nonpartisan" but clearly aren't. FIRE is supported by conservative foundations (the Bradley Foundation, the Sarah Scaife Foundation, and the Charles Koch Institute), and its website clearly reflects a conservative agenda. RealClear Media Group, which includes RealClearEducation, can be described as a moderately conservative source of news and polling data that has become increasingly conservative in recent years (details here).
If the College Free Speech Rankings was commissioned and funded by conservative organizations, what about the organization that actually conducted the surveys? This organization, CollegePulse, is a respected, student-created opinion platform that doesn't seem obviously partisan in its mission, methods, or sources of funding. And yet... There are many indications that CollegePulse didn't take a nonpartisan approach to creating its rankings. Here are some stats that illustrate the point:
Throughout the report, in green font, are 22 quotes from students. All 22 students articulate discomfort about expressing themselves on campus. 22 out of 22. No exceptions. In a report that's supposedly nonpartisan and empirical, no student is quoted as saying, in effect, that they felt comfortable – or at least not uncomfortable – in expressing themselves, even when their opinions seemed unpopular. It's hard to imagine no students ever felt that way. In short, the quotes seem to have been cherry-picked. And there's more:
Seven of the 22 quotes don't indicate anything specific about politics or ideology, and one quote seems difficult to classify. Ten of the remaining 14 quotes were anti-liberal (either in a generic way or from an explicitly Republican or Christian perspective); that is, these 10 students were complaining about liberal sentiments on campus. Only four of the 14 quotes presented the opposite scenario: a liberal complaining about an allegedly non-liberal atmosphere. (Three of these four quotes were drawn from a discussion of one university, Brigham Young.)
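For what it's worth, the breakdown above is easy to tally. In this quick Python sketch, the counts come straight from the quotes as classified above; the category labels are my own shorthand:

```python
# Tally of the report's 22 student quotes, using the counts given above.
# Category labels are my own shorthand, not the report's.
quote_counts = {
    "no clear political content": 7,
    "difficult to classify": 1,
    "anti-liberal complaint": 10,
    "liberal complaint": 4,
}

total = sum(quote_counts.values())
classifiable = quote_counts["anti-liberal complaint"] + quote_counts["liberal complaint"]

print(total)         # 22 quotes in all
print(classifiable)  # 14 quotes with a clear political direction
# Share of politically classifiable quotes that lean anti-liberal:
print(round(quote_counts["anti-liberal complaint"] / classifiable, 2))  # 0.71
```

In other words, of the quotes that can be classified politically at all, roughly seven in ten complain about liberal sentiment on campus.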
In short, there’s bias in the "qualitative" data provided in this report. Specifically, an anti-liberal bias. As for the quantitative survey data, which was the main focus, I'm going to argue that the stats behind the rankings are flawed, owing to the wording of the survey questions.
Here's a quick sketch of how the College Free Speech Rankings are determined: Each institution receives an overall score ranging from 0 to 100, with higher scores indicating greater freedom of speech. This overall score is based on five sets of questions pertaining to openness, tolerance, administrative support for free speech, self-expression, and a speech code rating created by FIRE. However, the dimensions aren't equally weighted: 92% of the overall score is determined by openness (40%), tolerance (40%), and self-expression (12%). I'll focus on these three dimensions to illustrate my concern with the rankings. In each case, you'll see that the survey question is poorly worded, resulting in uninterpretable data.
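To make the weighting concrete, here's a minimal Python sketch of how an overall score could be assembled under this scheme. The 40/40/12 weights are as described above; the component scores are invented, and the remaining 8% (administrative support plus FIRE's speech code rating, whose internal split isn't given here) is lumped into a single component as an assumption:

```python
# Illustrative sketch of the Rankings' weighting scheme.
# Weights for openness, tolerance, and self-expression are as described in the
# text; the remaining 8% (administrative support + FIRE's speech code rating)
# is lumped into one component because its internal split isn't specified here.
WEIGHTS = {
    "openness": 0.40,
    "tolerance": 0.40,
    "self_expression": 0.12,
    "admin_support_and_speech_code": 0.08,
}

def overall_score(components):
    """Weighted sum of component scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * score for name, score in components.items())

# A hypothetical campus with invented component scores:
campus = {
    "openness": 70,
    "tolerance": 55,
    "self_expression": 60,
    "admin_support_and_speech_code": 80,
}
print(round(overall_score(campus), 1))  # weighted overall score out of 100
```

Note that with openness and tolerance jointly worth 80% of the total, any ambiguity in those two question sets dominates the final ranking.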
Openness was measured by eight questions. The prompt was: "Some students say it can be difficult to have conversations about certain issues on campus. Which of the following issues, if any, would you say are difficult to have an open and honest conversation about on your campus?" The eight options were Abortion, Affirmative action, Feminism, Gun control, Immigration, The Israeli/Palestinian conflict, Race, and Transgender issues.
My primary concern here is the vagueness of the term "difficult". There are many reasons why you might consider it "difficult" to have an open and honest conversation about these eight topics, but not all of those reasons implicate a campus that discourages free speech. Consider the following scenarios:
(i) You're unsure of your exact views on a topic.
(ii) You sense that your views are complex and perhaps unclear or self-contradictory.
(iii) You doubt whether you can express your views as eloquently as others do.
In each scenario, the difficulty isn't caused by the campus climate (which might be quite open) but rather by the prospect of not presenting oneself well in public. Students with these kinds of concerns might therefore consider their campuses more open than their survey responses would suggest.
Tolerance was measured by six questions. The prompt was: "Would you support or oppose your school allowing a speaker on campus who promotes the following idea:” The ideas were: “Abortion should be completely illegal.” “Black Lives Matter is a hate group.” “Censoring the news media is necessary.” “Some racial groups are less intelligent than others.” “The U.S. should support Israeli military policy.” “Transgender people have a mental disorder."
Here my concern is with the term "promotes". No doubt the creators of the survey intended it to mean something like "expresses support for the idea during an invited speech." However, it could also be taken to mean "engages in activities on campus that support the idea." A student might favor inviting speakers who verbally support offensive ideas, yet not want the speakers to do anything else on campus to promote those ideas (e.g., distribute leaflets, organize protests, make impromptu speeches around campus, etc.). Students like this might therefore be more tolerant than indicated by their survey responses.
Finally, self-expression was measured with a single question: "Personally, [have you] ever felt you could not express your opinion on a subject because of how students, a professor, or the administration would respond?" The "could not" in this question is vague. As with openness, some students who consider their campus quite amenable to self-expression might not express their opinion on a subject when they feel uninformed, uncertain about their views, or concerned that they can't make a well-articulated statement. These students aren't worried about ideology; they just don't want to be perceived as inarticulate or muddled. Also, consider the term "ever". "Ever" doesn't mean "generally"; it means "one or more times." Even a student who feels that their campus is extremely supportive of self-expression would respond "yes" to this question if they chose not to share an opinion during one conversation with one person on one occasion, and that "yes" would count against their campus.
The self-expression question is arguably the most problematic element of the survey. The "ever" almost guarantees that the respondent isn't presenting a general assessment of campus support for self-expression. And the report summarizes the self-expression results using the term "self-censorship", a term that was picked up by various news organizations. Here's a scary example ("scary" because the source is so influential): On 9/29/20, The Chronicle of Higher Education ran a story on the report titled, in large font, "More than half of college students self-censor when race and other tough topics come up, survey finds" (story here).
The Chronicle's title is inaccurate, or at least deeply misleading. What the survey found is that 60% of students reported at least one instance in which they could not express their opinion "because of how students, a professor, or the administration would respond." That's it. We should ask about that 60% figure: How many of these students didn't express their opinion because they weren't sure of how they felt or whether they could justify themselves? How many of these students didn't express their opinion because of purely interpersonal rather than ideological reasons ("I don't like your haircut, but I'm not sure I know you well enough yet to say so.")? How many of these students suppressed their opinions on just one or two occasions (e.g., in class, because they felt unsure of themselves) but found their campus to be extremely receptive to self-expression overall?
(I'm actually surprised that after spending a few years with thousands of other people on campus, 40% of students could not recall ever choosing not to express an opinion owing to concerns about how others would respond. Are 40% of students (or anyone else) really that consistently expressive?)
The kinds of concerns I've raised here apply to the other questions in the Free Speech Rankings survey as well. Long story short: I don't trust the data. Between the bias in who commissioned and funded the report and the flawed wording of the questions, the stats presented in the report likely overestimate the extent to which students experience a lack of free speech on campus.
To be clear, I'm not saying that students don't experience free speech limitations. Lots of data, scientific as well as anecdotal, tell us that they do. I'm only saying that I don't trust the numbers from the Free Speech Rankings. Owing to methodological flaws, the report tells us nothing credible about the magnitude of the problem.
Back to the Florida law. I don't support the survey mandate, but since it's the law (and may become law in other states), here's what I would recommend to those who create the surveys, as well as those who examine the results later:
1. If outside organizations conduct the surveys, make sure they're nonpartisan. (Don't assume they are just because they claim to be. Look and see. Look at the scope of their activities, their sources of funding, their leadership, etc.)
Since the law doesn't provide funding for colleges and universities to conduct the surveys, many of these institutions may leverage their own resources (e.g., offices of institutional research) rather than hire outside organizations, in which case they should still be aware of the risk of bias and seek a diversity of input on survey design. (A more realistic goal might be "multipartisan" rather than "nonpartisan" surveys.)
2. Construct the surveys carefully. As in any survey study, the specific phrasing of the questions (as illustrated in this newsletter) influences the credibility of the data.
Getting the wording of a survey right can be tricky. Hence the need for caution – and for doing a better job than the 2020 College Free Speech Rankings. I've included some thoughts in the Appendix.
3. Be clear and careful about the treatment of survey data.
In its formal statement opposing Florida's law, the AAUP points out that the law "offers no assurances that the survey’s answers will be anonymous, and there is no clarity on who will use the data and for what purpose." One hopes each institution will guarantee anonymity to survey respondents, only report aggregated or anonymized data, and not use survey responses as grounds for intervening if a student calls out a particular program, class, instructor, etc. This illustrates an obvious point about the use of statistical factoids: Like any other information, they can be weaponized. If institutions of higher ed are forced by law to survey the free speech climate on campus, we need to ensure not only methodologically sound surveys but also ethically appropriate use of the data.
Thanks for reading!
Appendix: Surveying free speech climate
In this newsletter I use the term "free speech" loosely, to cover a number of interrelated phenomena. The Florida law specifically requires campus support for "intellectual freedom and viewpoint diversity", which it defines as "the exposure of students, faculty, and staff to, and the encouragement of their exploration of, a variety of ideological and political perspectives." The law also prohibits "shielding", which means "to limit students', faculty members', staff members' access to, or observation of, ideas and opinions that they may find uncomfortable, unwelcome, disagreeable, or offensive." That's a lot to unpack – and to operationally define in the form of survey questions. What could you do?
Well, you could take the Florida law at its word and ask students to rate (on a scale of 1 to 7, say) questions such as "To what extent have you been exposed to a variety of ideological and political perspectives on campus?" (And, somewhere in the instructions, you would define "on campus" – presumably not just classrooms but also study groups, dorms, virtual learning environments, etc.)
Unfortunately, terms like "variety" (and others in the Florida law) are quite squishy. What would a "variety" of perspectives be? (Can you count perspectives? At what point do subtle distinctions between your perspective and mine mark different perspectives?) One could ask more directly "How diverse are the ideological and political perspectives you've been exposed to on campus?" but that's just another flavor of the same problem. (In particular, how do you quantify "diverse" in this context?)
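To see how slippery this quantification problem is, here's one hypothetical way a survey analyst might operationalize "diversity" of perspectives: Shannon entropy over the counts of distinct perspectives a student reports encountering. The categories and counts below are entirely invented, and that's precisely the point: the hard part, deciding what counts as a distinct "perspective", is exactly the squishiness the law leaves unresolved.

```python
import math

# One hypothetical operationalization of "viewpoint diversity": Shannon entropy
# over counts of distinct perspectives a student reports being exposed to.
# The categories and counts are invented for illustration; choosing the
# categories is itself the unresolved, squishy part.
def shannon_diversity(counts):
    """Entropy in bits; higher means exposure is spread more evenly across perspectives."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c > 0)

exposure = {"progressive": 12, "conservative": 3, "libertarian": 1, "religious-traditional": 2}
print(round(shannon_diversity(exposure), 2))  # lopsided exposure -> lower entropy

even_exposure = {"progressive": 5, "conservative": 5, "libertarian": 5, "religious-traditional": 5}
print(round(shannon_diversity(even_exposure), 2))  # maximal for 4 categories: 2.0
```

Any such index is only as meaningful as its category scheme, so a statute that never defines "variety" can't be rescued by a formula.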
At bottom, the challenge in translating the Florida law's mandate into survey questions is that the law’s vague language ("variety", "encouragement", "limit", etc.) conceals its political motivation. What does Governor DeSantis want for Florida campuses? Surely not every possible point of view. For example, you can always imagine scenarios in which a speaker is so outrageously offensive, violent, and ridiculous that nobody would want that person to speak on campus. ("16 recipes for cooking your grandchildren when they're naughty." You'll never hear that list unless it appears in a stand-up comedy routine or post-modern play.) On the other hand, there's reason to believe that Governor DeSantis would not want Florida campuses to ban, for example, presentations from white supremacists with militant attitudes (see here).
In short, the law seems motivated by the desire to protect specific kinds of speech and inquiry on campus (those most amenable to conservatives, for example), but since the law can't be written that way, the survey questions can't reflect such ideologically-specific motivation. Hence we're left with broad concepts like support for "intellectual freedom and viewpoint diversity", and it's difficult to operationally define these concepts in a survey.
So, how about this: "How often did you choose not to express an opinion on campus because you were among people with a different political viewpoint who would not welcome dissent from that viewpoint?" (Different versions of this question would replace "political viewpoint" with phrases like "religious viewpoint", "perspective on race", "attitude toward the LGBTQ+ community", etc.)
Although it's too wordy, a question like this seems close to what the Florida law has in mind. Even so, it's not an ideal question, because we'd still be unsure why students chose not to express certain opinions. Some students may consider their campuses exceptionally supportive of viewpoint diversity but simply not feel the need to air a dissenting opinion every time they have one. Or, for non-ideological reasons, they just wouldn't consider it acceptable to do so in some settings (commencements, funerals, etc.). And what about students who kept their opinions to themselves for ideological reasons on some occasions and non-ideological reasons on others?
In the end, surveys and stats might not be the right approach. Perhaps we need what anthropologists call ethnographies, or what other social scientists call qualitative case studies, in order to capture the climate of free speech on individual campuses. In other words, we may need to use words rather than numbers to tell this complicated tale. This illustrates an important point about statistics: Not everything can be "statisfied".