If you're cooking a lot this holiday season, you may have been spooked by reports of toxic chemicals in black plastic spatulas and other kitchen utensils.
Some of the headlines were terrifying. "Your favorite spatula could kill you," warned Salon. A report in The Atlantic featured this lovely image:
Guess what? The October study that sparked the commotion contained an error. (The lead author refers to it now as a "typo".)
The study claimed that the wrong utensils could expose you to BDE-209, a toxic flame retardant, at levels approaching the EPA's maximum safe limit of 42,000 nanograms per day.
The EPA's daily limit for BDE-209 is actually 420,000 nanograms, ten times higher than the figure the study used. In other words, the October study shows that we might be ingesting only about one-tenth of the maximum safe amount.
Oops.
The online version of the study was corrected on December 15, but even without the correction, the carcinogenicity and other dangers of black plastic utensils had been exaggerated. (Here is a good summary of what we know about the topic.)
Your favorite spatula won't kill you. It won't even curse at you under its breath.
My Christmas present to you, dear reader, is a description of some other cases where the risk of cancer has been exaggerated.
My point is not that exaggeration is the norm. Rather, I want to describe what motivates experts to magnify risk in some cases.
Last week I described financial incentives. My focus was on shoddy, semi-fraudulent research that exaggerates the risks of benzene exposure. The research, conducted by a chemical testing company called Valisure, is being used to drum up business for the company.
Cancer screening centers sometimes exaggerate risk in order to sell testing that people don't need. Expert witnesses may exaggerate risk on behalf of whoever hired them. And there are folks like Ian Paterson, currently serving a 20-year sentence for conducting unnecessary procedures on more than 1,000 breast cancer patients after overstating or inventing risks.
Clearly, exaggerating the risk of cancer is profitable.
This week I'll be describing how exaggerated risks can stem from the desire for publication, and, in some cases, from a genuine concern for public health.
2. Exaggerating cancer risk for publication
Cancer risk is exaggerated sometimes in order to make a study more publishable.
(Peer-reviewed publication may benefit scientists financially – higher salaries down the road, consulting gigs, book contracts, etc. – so getting published and making money aren't necessarily unrelated motives.)
The example I want to share is an October study showing a higher risk of cancer among people with lower levels of omega-3 and omega-6 polyunsaturated fatty acids, or PUFAs.
PUFAs are essential to the normal functioning of body and brain. Most people get sufficient quantities from their diet, but omega supplements have nonetheless become a multibillion-dollar global industry.
The new study, published in the International Journal of Cancer, was led by Yuchen Zhang, a postdoctoral scholar at University of Georgia.
Zhang and colleagues obtained PUFA data from 253,138 cancer-free adults at some point between 2007 and 2010. By 2023, 29,838 of these individuals had developed cancer.
The point of the study was to figure out what was distinctive about the cancer group. What increased a participant's chances of ending up in that group?
In brief, Zhang and colleagues found an inverse relationship between omega-3 and -6 plasma levels and the risk of various cancers.
In plain English, this means that the higher a person's PUFA levels, the lower their risk. Or, to put it the other way around, the lower the PUFA levels, the higher the risk. (There was one exception: Higher omega-3 levels were associated with a lower risk of prostate cancer.) All effects were small.
Should you start taking fish oil supplements? I wouldn't recommend it, at least not on the basis of this study. Here are a few of the many reasons we can't trust the findings:
1. PUFA levels were only recorded once.
PUFA levels were measured once, at the beginning of the study, but participants' health was tracked for more than a decade. (About 13 years, on average.)
Could each participant's PUFA levels have stayed the same over a period of years? This seems unlikely. Our bodies don't create PUFAs; rather, we obtain them from foods (and, in some cases, supplements). As diet and/or supplement intake changes over time, PUFA levels will also tend to change.
This is what I call the snapshot problem: Taking a "picture" at the beginning of a study, then assuming that what's observed remains constant over time. Any links between cancer and PUFA levels that were measured once, over a decade earlier, may be coincidental.
2. Known contributors to cancer risk were not adequately managed.
With respect to smoking and drinking, each participant was classified as "never", "previous", or "current".
This is far too crude. A two-pack-a-day smoker has a much greater risk of developing cancer than someone who lights an occasional cigar after dinner, but they'd both be classified as "current" smokers in this study. Likewise, someone with alcohol use disorder and someone who drinks a glass of wine on occasion would both be labeled "current."
In the end, differences in how much people smoked and drank (and in other variables), rather than PUFA levels, could've been the main predictors of cancer.
3. The approach to data analysis was not adequately justified.
Researchers need to make many choices about how to record and analyze their data. These "researcher degrees of freedom" allow for choices that bias the results. I see a lot of that in this study.
For instance, with respect to diet, Zhang and colleagues "excluded data from individuals whose total energy intake fell within the highest or lowest percentiles to avoid extreme values." No further clarification is provided.
This is a great example of why the "fine print" matters. Data shouldn't be thrown out just because it's extreme. Rather, only true outliers should be excluded.
To illustrate, suppose you want to know whether a high school basketball player's height predicts how many points they score per minute.
If two players in your sample are 7 feet 4 inches tall, then sure, you should exclude their data if the next tallest players are 6' 5". There's just too much of a gap. Any generalizations you make about the association between height and points scored will be distorted.
However, if your sample also includes some players who are 7' 3", 7' 2", 7' 1", and all the way down, you shouldn't throw out any data, because you have a nice continuous distribution that supports generalizations about height.
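The difference between trimming a fixed percentage and excluding true outliers can be made concrete in a few lines of Python. This is a deliberately simplified sketch of the basketball example (the heights are made up, and the rule only looks for a gap near the top of the distribution, which is where the outliers sit in this illustration):

```python
def gap_outliers(values, max_gap=3):
    """Flag values separated from the rest of the sample by a large gap,
    instead of blindly trimming a fixed percentage from each tail.

    Simplified: assumes any outliers sit at the top of the distribution,
    as in the basketball example."""
    s = sorted(values)
    for i in range(len(s) - 1):
        if s[i + 1] - s[i] > max_gap:
            # Everything beyond the gap stands apart from the rest.
            return [v for v in s if v > s[i]]
    return []  # continuous distribution: nothing to exclude

# Heights in inches; two 7'4" players sit beyond a gap at 6'5".
gappy = [70, 71, 72, 73, 74, 75, 76, 77, 88, 88]
print(gap_outliers(gappy))       # flags the two 7'4" players

# A continuous run of heights from 5'10" to 7'4": no exclusions.
continuous = list(range(70, 89))
print(gap_outliers(continuous))  # []
```

A percentile-based rule, by contrast, would throw out the tallest and shortest players in both samples, whether or not they were truly atypical.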
Zhang and colleagues gave no details on who they excluded or what the distribution of total energy intake looked like. This is one of many red flags suggesting that they threw out data, while massaging the rest, in a way that yielded publishable results.
4. No hypotheses or explanations were presented.
Zhang and colleagues analyzed many variables, including 19 site-specific cancers. Inevitably, some associations turned up significant. But the researchers didn't present hypotheses for how omega-3 and omega-6 might affect the risk of cancer. And, after finding effects for some of those 19 cancers, they didn't offer explanations for how PUFAs play a role.
You might say this is a study that couldn't fail.
The reason we need hypotheses and explanations is that without them, we're just reporting associations between columns in a spreadsheet. Run enough analyses and some will inevitably turn up significant. Thus, for instance, we find that the number of bachelor's degrees awarded in engineering between 2012 and 2021 is highly correlated with Google searches for "dollar store near me".
Is this because engineering degrees don't pay well and engineers tend to shop at dollar stores?
Of course not. We see an association because Tyler Vigen, the source of this figure, runs zillions of correlations on publicly available data in order to demonstrate that a small number of them turn up significant just by chance.
Zhang and colleagues' study couldn't fail, because it was inevitable that some of the many associations they explored would turn out to be significant. Sometimes, when you toss a coin repeatedly, you'll get three or four heads in a row. That doesn't mean the coin is weighted.
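A small simulation makes the point. Suppose you run 19 independent comparisons (one per cancer site) when no real effects exist, using the conventional 0.05 significance threshold; this sketch estimates how often at least one comes out "significant" by chance:

```python
import random

random.seed(1)  # for reproducibility

def false_positive_demo(n_tests=19, alpha=0.05, n_sims=10_000):
    """Estimate how often at least one of n_tests null comparisons
    comes out 'significant' purely by chance."""
    hits = 0
    for _ in range(n_sims):
        # Under the null hypothesis, each test's p-value is uniform on [0, 1].
        if any(random.random() < alpha for _ in range(n_tests)):
            hits += 1
    return hits / n_sims

print(false_positive_demo())  # roughly 0.62
print(1 - (1 - 0.05) ** 19)   # analytic answer, about 0.623
```

In other words, with 19 independent tests and no real effects at all, you'd still expect at least one "significant" finding more than 60 percent of the time.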
3. Exaggerating cancer risk for altruistic reasons
In this newsletter, I make a lot of claims about motives.
For instance, I claim that some experts who exaggerate the risks of cancer are just out to make a buck. That seems clear enough.
As for the PUFA study, I believe that in the interest of getting a paper published, the researchers used a variety of tricks to exaggerate the risks of low omega-3 and -6 levels (or, as they would put it, the benefits of higher levels). But I have to admit that I'm speculating. The data exaggerates the risks, and I do see reasons to blame the researchers, but I can't claim any personal knowledge of their motives.
My discussion of altruism will draw me a bit further into the realm of the speculative. I believe that experts sometimes magnify risks out of concern for public health. They may profit financially and/or professionally by doing so, but they also genuinely care about people and want to protect us from harm.
In other words, instead of Plato's noble lie, we have the noble exaggeration.
My example is a December 3 report from ProPublica, a non-profit organization with a strong reputation for investigative journalism (e.g., they're the first fully online news source to win a Pulitzer Prize).
ProPublica's report concerns formaldehyde, a naturally-occurring chemical that's found pretty much everywhere (including our own bodies, which produce and quickly metabolize about 1.5 ounces per day). Formaldehyde is used in a variety of manufacturing processes and also released into the atmosphere via automobile exhaust, pressed-wood products, and smoke from cigarettes and gas- or wood-burning stoves. (You're probably familiar with this chemical from whatever you dissected in high school biology class.)
I think ProPublica would be the first to acknowledge that their intent is to terrify. The first two paragraphs of the report give you a sense of the overall tone:
"In a world flush with hazardous air pollutants, there is one that causes far more cancer than any other, one that is so widespread that nobody in the United States is safe from it.
It is a chemical so pervasive that a new analysis by ProPublica found it exposes everyone to elevated risks of developing cancer no matter where they live. And perhaps most worrisome, it often poses the greatest risk in the one place people feel safest: inside their homes."
I have no quarrel with the claim that formaldehyde exposure can cause cancer, or that EPA regulations are insufficient. ProPublica tells the story well, and I believe the organization is genuinely concerned about public health. However, their report wildly exaggerates risk.
1. Limited evidence.
ProPublica's report relies heavily on National Cancer Institute studies that tracked more than 25,000 factory workers over a period of decades. The studies showed that workers exposed to the highest levels of formaldehyde were at greater risk of developing certain kinds of cancers (particularly lymphohematopoietic malignancies and myeloid leukemia).
The effects were small, and, in the case of myeloid leukemia, inconsistent, but the main limitation of the studies is what's occasionally called rare events bias.
In these studies, cancer mortality rates are calculated for workers exposed to different levels of formaldehyde. But the actual numbers of those 25,000+ workers who died from cancer are very small (ranging from less than 10 to less than 300, depending on type of cancer and level of exposure).
Because those numbers are small, the impact of formaldehyde may have been exaggerated.
Here's a simple illustration:
Suppose that group A and group B each consist of 100 people.
Scenario 1: 1 person in group A and 2 people in group B die of cancer.
Scenario 2: 10 people in group A and 11 people in group B die of cancer.
In each scenario, only one additional member of group B succumbs to cancer. However, the elevated risk of mortality among group B will be much greater in Scenario 1, according to the standard approach to calculating risk ratios that NCI used.
Specifically, in Scenario 1, being in group B increases mortality risk by 100 percent. (2 is 100% greater than 1.) In Scenario 2, group B's mortality risk is only 10 percent higher (11 is 10% higher than 10.)
In short, when the number of events (i.e., cancer deaths) is very small, estimates of relative risk are unstable and easily inflated. In Scenario 1, a single additional death doubles the apparent risk.
This is a simple illustration of the limitations with the NCI data.
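The arithmetic in the two scenarios can be checked in a few lines of Python (the group sizes and event counts here are the made-up numbers from the illustration, not NCI data):

```python
def risk_ratio(events_a, events_b, n=100):
    """Relative risk for group B vs. group A, with n people per group."""
    return (events_b / n) / (events_a / n)

# Scenario 1: tiny event counts. One extra death doubles the risk.
print(risk_ratio(1, 2))    # 2.0, i.e., a "100% increase"

# Scenario 2: same absolute difference, larger counts.
print(risk_ratio(10, 11))  # about 1.1, i.e., a "10% increase"
```

The absolute difference is identical in both scenarios: one extra death per hundred people. Only the ratio changes, and it changes dramatically when the baseline count is small.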
2. Reliance on anecdote.
In 1978, and then again in 2009, I lost good friends to cancer.
If either of them had blamed the CIA, vaccines, or Martians for their disease, I would've bitten my tongue and nodded. I would not argue with a dying person.
All the same, personal anecdote is the weakest form of evidence. What we know about the causes of cancer derives in part from studies on many people. When someone says their uncle drank lemonade every day and got colon cancer in his seventies, it just doesn't tell us much.
Unfortunately, ProPublica relies in part on the anecdotal approach, describing the lives of two people apparently sickened by formaldehyde exposure.
I have no particular reason to doubt that formaldehyde was the culprit in these cases. But this just doesn't rise to the level of scientific evidence. Rather, it strengthens the impression that ProPublica is relying on scare tactics at the expense of the data.
3. A DYOR problem
Drawing from EPA resources, ProPublica provides a free lookup tool. Enter your city, your zip code, or your street address, and you'll see data on the incremental lifetime cancer risk from formaldehyde in your neighborhood.
I don't doubt ProPublica's good intentions, but the tool is ultimately a limited, potentially scary source of information.
Here's an illustration. For the 20850 zip code (Rockville, Maryland), formaldehyde exposure is expected to cause 1 additional death per 54,000 residents over the course of their lifetimes. Obviously, that's quite a small risk.
The EPA doesn't support the way ProPublica uses this tool. Why not? Because, as the EPA acknowledges, the estimates are extremely imprecise:
Formaldehyde levels aren't directly measured but rather modeled on the basis of various data about each locality. One of the model assumptions is that a person would breathe the same air every day for 70 years.
Estimates of risk are modeled on the basis of formaldehyde estimates for one year. This is the snapshot problem again.
Estimates of risk pertain to outdoor exposure. As ProPublica itself acknowledges, indoor levels tend to create the greatest risk. (In addition, the tool fails to reflect time spent working and playing away from one's residential neighborhood.)
In light of these and other sources of imprecision, I wouldn't conclude anything about formaldehyde risks in Rockville (or anywhere else). There's way too much room for error.
If you're concerned anyway, ProPublica does offer recommendations for minimizing formaldehyde exposure. At the very least, cigarette smoke and automobile exhaust are two sources we should be aiming to reduce anyway.
Final thoughts
There are people who name their tumors.
Tina Tumor. Donald Lump. And a clever fellow who called his tumor "Irving", shortened the name to "Ir" after most of it was removed, then started calling it "Irv" when it grew back a bit.
I admire these people. They're alive to the comedic potential of an otherwise grim situation.
The rest of us tend to view cancer as something to be feared. I don't think we need overstatements of the risks. As I wrote last week, exaggerating risk causes needless anxiety around an already frightening topic. It's particularly troubling when the motive for exaggeration is personal gain.
The best we can do to prevent cancer is to eat and sleep well, manage stress, stay active, and be cautious about activities known to be carcinogenic (e.g., smoking, drinking to excess, overexposure to sunlight and tanning beds, use of toxic chemicals without proper ventilation, and so on – see here for some simple tips).
Happy holidays to you, and thanks for reading!