Here's a quiz for you. Each of the following headlines from this week refers to a health-related study. Which headline is inaccurate?
(a) "Early risers more likely to develop anorexia"
(b) "Super Mario may smash depression symptoms"
(c) "Wearing hearing aids could reduce your risk of dying earlier"
These headlines come from widely-read sources (NBC, MSN, and the Washington Post, respectively), and you can find similar ones elsewhere. The topics – anorexia, depression, and early mortality – are important. Accuracy matters.
However, my "quiz" was a bit of a trick question. All of the headlines are inaccurate. And yet, none of them is completely wrong. Each contains a kernel of truth obscured by at least one error in statistical reasoning.
I was excited (in an embarrassingly nerdy way) to have stumbled across these headlines all in the same week. They illustrate the need for caution with new data on health and other topics of personal relevance. Misrepresentations of the data are common – and sometimes dangerous.
Early risers and anorexia
"Early risers more likely to develop anorexia".
You can find dozens of headlines like this. What prompted them?
You may be a morning person, a night owl, or neither. This preference for when you sleep and when you're most active is called your chronotype.
Researchers have discovered connections between chronotype and mental health. For instance, a study published this January 4 in JAMA Network Open showed a link between the morning chronotype and anorexia. This is the source of the headlines I mentioned.
But the headlines got it wrong, and the stories that followed didn't necessarily clarify. The study, led by Hannah Wilcox at Harvard Medical School, actually showed a bidirectional relationship between the two variables.
Concretely speaking, morning people are indeed more likely to become anorexic. But, independent of this, a person who develops anorexia is more likely to then shift from an evening chronotype (or no preference) to becoming a morning person. These are separate processes.
Wilcox and colleagues identified this bidirectional link in a dataset consisting of 16,992 individuals who had anorexia and 55,525 people who did not, so no complaints here about sample size.
Why pick on the news reports? Even the most inaccurate of them were still half right – i.e., they noted that morning people are more likely to develop anorexia.
The fact that they were half wrong too is part of what makes them dangerous. People with anorexia often deny that they have a problem. This is one of the most common symptoms of the disease. If they happen to be night owls, the news reports give them further grounds for believing there's no cause for alarm.
In other words, we shouldn't say that early risers are prone to anorexia without adding that night owls experience the disease too.
Most importantly, the difference between morning and evening people in rates of anorexia is extremely small. The value of the new study is in setting the stage for more research on genetic links between anorexia and circadian rhythms. In practice, chronotype is irrelevant to a person's risk for the disease.
The researchers acknowledge this rather discreetly, noting that "The clinical implications of our novel bidirectional chronotype findings are unclear." This is a fancy way of saying that the data, though scientifically important, yields no practical takeaways.
If you see signs of anorexia in yourself or others, seek help (for example here). Don't be comforted by the fact that someone is a night owl.
At the same time, if you or someone you know is a morning person, don't stress about the prospect of anorexia unless you observe symptoms (see here for a list).
So, how should the headlines have been phrased? That's a tough question.
"Chronotype associated with anorexia" is accurate but a bit bland and jargony.
"Early risers prone to anorexia, and vice versa" would be acceptable, but only if the article stressed that "prone" refers to the slightest of statistical trends, and emphasized that being a night owl is no buffer. It's potentially harmful not to add these qualifications.
Super Mario Odyssey and depression
"Super Mario may smash depression symptoms."
Headlines like this have popped up over the past two weeks, though the study they refer to was published roughly four months ago.
This study, which appeared in Frontiers in Psychiatry, does show that playing Super Mario Odyssey benefits people with depression. But the headlines, and some of the stories that followed, badly misrepresented the findings, a trend that's continuing now on Reddit and other social media.
Moritz Bergmann and colleagues at the University of Bonn wanted to know whether playing Super Mario Odyssey would help adults being treated for major depressive disorder.
Inviting depressed people to play a commercial video game may seem like an odd or somewhat callous way of offering support. Bergmann and colleagues argued that because depression has affective symptoms (e.g., sadness) as well as cognitive ones (e.g., impaired memory), and because action video games have been shown to improve cognitive functioning in the general population, these games might improve cognition among depressed people and perhaps then reduce their affective symptoms.
Action video games are the ones that require focused attention, as well as rapid, carefully-chosen responses under distracting conditions. (Sounds to me like a kindergarten classroom.)
Each study participant was randomly assigned to one of three conditions:
–Some patients spent six weeks playing Super Mario Odyssey three times per week for 45 minutes.
–Some patients spent the same amount of time using CogPack, a computer-based training program that promotes attention, memory, and other cognitive skills.
–Some patients simply continued to receive their usual treatment (psychotherapy and/or medicine).
At the beginning and end of the six-week period, severity of depression was recorded via the Beck Depression Inventory (BDI), the most widely used measure of its kind. Bergmann and colleagues used a BDI cut-off score to identify people as either clinically depressed or not. They also administered short-term working memory tests.
If you didn't actually read the study, the findings would sound impressive: By the end of the study, there were significantly fewer depressed people in the Super Mario group. Super Mario players also showed significant gains on one of the cognitive tests.
So, what's wrong with the headline "Super Mario may smash depression symptoms"?
"Smash" is sensationalistic, of course. There's no treatment yet that "smashes" depression for everyone, even though some people fully recover. But what about a more sober headline, such as "Super Mario Odyssey helps treat depression"? And what about the stories themselves? Are they accurate?
Here's where things get ugly.
This is a wretched study. Wretched and irresponsible. I know that sounds rude, but when the goal of a study is to evaluate methods for alleviating suffering, methodological competence is essential.
I'll focus here on two fatal flaws, one conceptual, the other statistical. In each case, I blame the researchers, rather than the journalists, for the misrepresentation of the data in the news and on social media.
1. Bad conceptualization.
Cognition and depression are related in many ways.
For instance, most approaches to psychotherapy, following the lead of cognitive behavioral therapy, assume that cognitive biases contribute to depression. Pessimistic attitudes, a tendency to remember personal failures rather than successes, and a tendency to overgeneralize ("I did such-and-such badly this morning; therefore I'm a bad person") are among the many biases that have been identified.
Put simply, cognitive biases are among the causes of depression, at least for some people.
At the same time, impairments in attention, memory, and other aspects of cognitive functioning are among the symptoms of depression. These are not causes but effects. Feeling sad and lethargic makes it hard to concentrate, for instance.
If you want to help a depressed person by improving their cognition, you work on dismantling those cognitive biases. This is a way of treating a cause of depression, and it has some degree of success.
In contrast, treating the cognitive effects of depression isn't helpful, even if you do it successfully. A depressed person whose concentration improves is still a depressed person.
What the researchers did in this study was a sort of bait-and-switch. They said their goal was to improve the cognitive functioning of depressed people, but they didn't address any cognitive biases. Instead, their intervention consisted of a video game, on the grounds that it might improve the cognitive effects of depression.
This is like teaching a person with a broken leg how to move that leg a little more rapidly as they walk. It can be done. The person can hobble a little faster. But you haven't treated the problem.
But wait... Even if the study was poorly conceptualized, Super Mario Odyssey still reduced the incidence of depression, right?
No.
2. Statistical malfeasance.
In each of the three groups, the percentage of clinically depressed people decreased over the six-week period. That decrease was statistically significant only in the Super Mario Odyssey group (and just barely, by conventional standards, at p < .03).
When you run a lot of statistical tests, you introduce the risk of finding at least one significant effect just by chance, even though it's not genuine. The researchers used something called a Bonferroni correction to address this problem.
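For the curious, here's a minimal sketch of how a Bonferroni correction works. The p-values and the number of tests are hypothetical (only the .03 echoes the Super Mario result):

```python
alpha = 0.05  # conventional significance threshold

# Hypothetical p-values from five tests run in a single study;
# the 0.03 echoes the Super Mario result, the rest are invented
p_values = [0.03, 0.21, 0.44, 0.07, 0.58]

# Bonferroni correction: divide alpha by the number of tests,
# so the chance of any false positive across all tests stays near alpha
threshold = alpha / len(p_values)  # 0.05 / 5 = 0.01

for p in p_values:
    verdict = "significant" if p < threshold else "not significant"
    print(f"p = {p:.2f}: {verdict} at the corrected threshold of {threshold}")
```

Notice that p = .03 clears the conventional .05 bar but fails the corrected threshold, which mirrors what happened to the Super Mario effect (the actual number of tests in the study may differ from this sketch).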
Bergmann and colleagues reported that the Super Mario effect disappeared after using the Bonferroni correction. And yet, they went on to say that the game was effective.
If you're not a stats person, here's how I'd describe the situation: Scientists and clinicians use significance testing to determine which results to trust. Significance testing relies on a bunch of rules. According to one of the rules, Super Mario Odyssey showed no benefits in this study. The researchers openly ignored the rules.
This is the Donald Trump approach to statistical analysis.
In any case, the sample size was so small (14 to 16 per group) that the findings wouldn't have had much credibility anyway. There are other statistical problems too, but I'll stop here.
Here are my concerns about the news reports and their accompanying headlines:
1. The researchers didn't really find benefits of Super Mario Odyssey. Improvement was seen in all of the groups. That improvement was mislabeled as significant for the Super Mario group.
2. Participants in all three groups continued to receive psychotherapy and/or medicine during the study. So, even if Mr. Mario had been helpful, he wouldn't have been acting alone.
Bottom line: Don't claim that a treatment is effective when it isn't really, or when it's only part of a broader intervention. The danger here is that news and social media coverage might lead some people to believe that Super Mario can help them or others cope with depression. (It might, but there's no evidence for that here.)
As I said, I don't blame the journalists, because the researchers argued that the game was helpful, and it was actually quite hard to tell from their write-up that all three groups continued to receive treatment. (I reached out to the lead researcher with questions but have not heard back.)
I did find one accurate headline. Interestingly, it wasn't in a news or social media report, but on a gaming website called Gamerant.com. The headline went like this: "Scientists used Mario game to treat depression."
Yep, that's exactly what they did. And they did it very, very badly.
Hearing aids and early mortality
"Wearing hearing aids could reduce your risk of dying earlier"
Headlines like this got my attention, because I've been hard of hearing most of my life but still (childishly) refuse to put one of those things anywhere near my ear.
Perhaps I should reconsider, because a new study, published in the January issue of Lancet Healthy Longevity, showed that people with hearing loss who wear a hearing aid tend to live longer.
Dr. Janet Choi at USC and colleagues obtained recent mortality data on 9,885 adults whose hearing had been tested between 1999 and 2012.
The data came from the National Health and Nutrition Examination Survey (NHANES), a program of annual studies overseen by the CDC.
(Whether you realize it or not, you're familiar with NHANES data. It's the foundation for national dietary guidelines, pediatric growth charts, and many other health standards. When you hear that phrase "According to the CDC..." followed by some alarming news about obesity or sodium intake or whatever, the statistics probably came from NHANES.)
Choi and colleagues' sample included 1,863 participants found to have hearing loss, 12.7% of whom wore hearing aids regularly. Here are three key findings:
1. Hearing loss increased risk of mortality. The more severe the loss, the greater the risk.
2. Among people with hearing loss, those who wore hearing aids regularly had a lower risk of mortality.
3. Occasional use of hearing aids did not reduce the risk of mortality.
(Maybe I should reconsider using one of those things…)
One of the advantages of NHANES data is that Choi and colleagues could statistically adjust for differences among participants on a variety of characteristics such as age, socioeconomic status, and health.
Why is this important?
Well, simply linking hearing loss to mortality doesn't mean much, because hearing naturally declines with age. Folks with poorer hearing tend to pass away earlier because they tend to be older. In research lingo, hearing loss is confounded with age.
Likewise, showing that people who wear hearing aids tend to live longer doesn't mean much per se, because people who wear them may be more affluent, more attentive to their health, etc. The use of a hearing aid is confounded with these variables.
Thus, after statistically adjusting for age, health (BMI, blood pressure, smoking, etc.), and other variables, Choi and colleagues were able to state with confidence that hearing loss increases the risk of mortality, and that wearing a hearing aid reduces the risk.
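To see why adjustment matters, here's a toy illustration with invented numbers (not the NHANES data). Because hearing loss tracks age, a crude comparison exaggerates its link to mortality; comparing within age bands shrinks the gap:

```python
# Invented counts of (deaths, participants) by age band and hearing status.
# Older people are over-represented in the hearing-loss group.
data = {
    "under 60":    {"hearing loss": (2, 100),  "normal hearing": (4, 300)},
    "60 and over": {"hearing loss": (30, 200), "normal hearing": (15, 100)},
}

def crude_rate(status):
    """Mortality rate pooled across age bands (no adjustment)."""
    deaths = sum(data[band][status][0] for band in data)
    total = sum(data[band][status][1] for band in data)
    return deaths / total

# Crude comparison: hearing loss looks much deadlier...
print(f"crude: {crude_rate('hearing loss'):.1%} vs {crude_rate('normal hearing'):.1%}")

# ...but within each age band, the gap mostly vanishes
for band in data:
    hl_deaths, hl_total = data[band]["hearing loss"]
    nh_deaths, nh_total = data[band]["normal hearing"]
    print(f"{band}: {hl_deaths / hl_total:.1%} vs {nh_deaths / nh_total:.1%}")
```

In this made-up example, the crude rates differ by more than a factor of two, yet within each age band they're nearly identical: age, not hearing loss, is doing the work. The regression adjustments Choi and colleagues used are more sophisticated, but the underlying logic is the same.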
However, they also noted the possibility of residual confounders – in other words, confounding variables that weren't measured. I reached out to Dr. Choi for more details, and here's what she had to say:
"Yes, there are residual confounders that we were not able to account for in this dataset, such as health consciousness and access to care. Hearing aid users are likely wealthier, which is associated with lower mortality risks. We accounted for education, income, and insurance status in this study. However, it's also likely that hearing aid users are more health-conscious, adhere more to medical recommendations, and generally have a higher level of access to care - these factors are among the major residual confounders."
I appreciate Dr. Choi's candor, here and in the published article. Methodologically, it's superb research, but we need to be cautious about the data, as no single study could rule out all possible confounds.
So, let's revisit that headline: "Wearing hearing aids could reduce your risk of dying earlier".
This isn't too bad. Perhaps I overstated the case by calling the headline inaccurate: "could" doesn't claim that hearing aids always increase life expectancy, only that they tend to. That's what Choi and colleagues show. For any particular individual (e.g., me), hearing aids might translate into a few extra years, but there's no guarantee.
Unfortunately, most of the articles accompanying headlines like this focused narrowly on the impact of hearing aids. Their takeaway was that if you begin to notice diminished hearing, you should get your ears checked and start using one of these devices.
What the science tells us is that if your hearing begins to decline, you should also get a physical exam, because the cause may be something other than normal changes to the middle or inner ear. Dr. Choi and colleagues touch on this in their article. Hearing loss can result from a variety of medical conditions, such as high blood pressure or diabetes.
Advising people with hearing loss to consider a hearing aid, but saying nothing more, may encourage people to treat a symptom but overlook some progressive underlying cause. This could be like recommending Super Mario to someone with depression.
Also, hearing aids don't directly impact life expectancy. Choi and colleagues acknowledge that we don't know yet exactly why there's a connection, but they suggest some plausible explanations. For instance:
"Previous studies have shown associations between hearing loss and loneliness, as well as between loneliness and mortality. Hearing aid use has been found to have a positive effect on reducing feelings of loneliness, social isolation, and depression, which could have contributed to the lowered mortality risk."
The implication is that a hearing aid may not be helpful unless the person is also taking steps to be more sociable.
I don't know if the headlines concerning this study could've been better. It's the stories themselves that often failed to note the many causes of hearing loss, and to remind readers that, beyond making the world more audible, hearing aids alone may not be enough.
Dangerous health news?
Media misrepresentations of data on health and other topics aren't necessarily dangerous. Not everyone reads the stories or notices the headlines. Those who do may not be affected, or the effects may be trivial.
At the same time, as we see in the news this week, there are potential harms. Budding anorexia dismissed because the person is a night owl. Early risers worrying needlessly. Depressed people being encouraged to play video games rather than seeking more helpful support. People with hearing loss overlooking health problems, or using hearing aids without pursuing other lifestyle changes.
In the end, caring for your health relies in part on taking care with data on health. We're blessed – if not overwhelmed – with information, but with much data comes much responsibility for those who try to make sense of it. We need careful descriptions of new health studies, but that's not always what we get. So, please read (or listen) cautiously.
(Have I become one of those older people who's a little too eager to share medical advice? Seems so...)
Thanks for reading!