Every week we come across a misguided and misleading recipe for “news” that goes something like this:
Find an observational study touching on a lifestyle choice or widespread health concern … take the associations found in the study and erroneously present them with cause-and-effect language … finally, make matters worse by de-emphasizing caveats and limitations but emphasizing clickbait headlines, thereby leading readers down the path to misinformation.
Here’s what you need to know about this latest bumper crop of substandard stories:
The first story is based on an “anti-inflammatory diet index,” or AIDI, that includes 16 food types: 11 designated as “anti-inflammatory” (including fruits, vegetables, nuts, whole grains, wine/beer, coffee, and chocolate) and five as “pro-inflammatory” (such as red meat, soft drinks, chips, and offal).
Newsweek explains inflammation this way:
The immune system triggers inflammation when the body is confronted with a potential threat, like a harmful chemical or microbes. That process can become problematic when inflammation becomes the body’s default state. And evidence suggests that conditions from Alzheimer’s to depression, cancer and heart disease could be caused by chronic inflammation.
The emphasis is ours, on the claim that these conditions “could be caused by” chronic inflammation, to underscore that it is pure speculation. Why include that and not the following?
Short of removing the causal language found throughout the story, the next best thing it could have done was ask sources: What are the limitations of this study, and why do they matter to readers?
Next to get featured was this study from the Canadian Healthy Infant Longitudinal Development cohort. It suggests “emerging links” between maternal use of household cleaning products and children having a higher body mass index by age three. The study authors go on to speculate that this could be due to changes in the gut microbiome, measured at age 3 to 4 months, which differed depending on what type of cleaner a mother reported using.
The study has plenty (and we do mean plenty) of caveats, and Newsweek’s take did offer these cautions:
But why — after acknowledging three such critical limitations — would Newsweek then print a headline suggesting causality between household cleaning products and childhood obesity? And why would the story go on to say:
Nevertheless [the lead author] believes the evidence her team collected is enough to suggest the overuse of disinfectants should be avoided, as it may harm our human microbiota.
This observational study didn’t even address that. Again, that’s pure speculation.
It’s sobering that these two articles were published on the same day by a single news organization. Imagine how much misleading reporting on observational studies reaches the public in a week. A month? A year?
Suffice it to say observational studies are a target-rich environment for news organizations looking for eye-catching headlines on popular health topics. And although observational studies can make valuable contributions to science (see below), their limitations are significant enough to render many of the headlines they generate little more than hype.
Unfortunately, the limitations of these studies fly under the radar of too many journalists.
To address that shortcoming, reporters would serve their readers best by at least including the following:
These may seem like small steps, but they could have a huge impact on readers trying to make well-informed health care decisions.
Please note: Observational studies (as with the research that linked smoking to cancer and other problems) can indeed pile up such overwhelming evidence that it would be prudent to make public health recommendations on that basis.
However, it’s rare that observational studies reported in the news media rise to this level of evidence. Statistical association is not proof of cause and effect. It is not unimportant. But no one should make it out to be more than it is.