Increasingly it appears to be the case that there are two types of public-health publications pertaining to food. There is bad research. And then there is bad reporting on bad research.
Instances of the former are too numerous to list here. Suffice it to say that last year I blogged at Hit & Run about a study claiming that the very notion we choose what we eat is a mere illusion. More recently, I noted in a column for Reason that I couldn't find much substance in a study by Kansas researchers that appears to claim the mere existence of food logos is somehow contributing to childhood obesity.
Instances of the latter—the seemingly rote regurgitation by journalists of sketchy research, often coupled with brainless quotes from a small sample of go-to researchers—are no less common.
A widely circulated recent Associated Press article on the purported dangers of Monster Energy drink, for example, claims a 24-ounce can of Monster "contains 240 milligrams of caffeine, or seven times the amount of the caffeine in a 12-ounce cola." That's true, according to data provided by the Center for Science in the Public Interest and others.
The AP article also notes the FDA limits the amount of added caffeine in cola drinks to 0.02 percent. That's also true. But the limit only applies to caffeine added directly to "cola-type beverages"—a definition that excludes both non-cola-type drinks like Mountain Dew and substances containing caffeine (like guarana or coffee) that are added to any drinks (whether cola-type or not).
These important distinctions are lost on Forbes columnist Melanie Haiken, who writes that
FDA does not allow soda to have more than 0.02 percent caffeine….
A 24-ounce can of Monster Energy Drink supposedly has 240 mg of caffeine, approximately equivalent to seven cups of coffee.
No (cola, not soda) and no (soda, not coffee).
Nevertheless, Haiken concludes her one-sided piece, "Can Energy Drinks Kill? The FDA Investigates, Consumers Worry, A Business Under Fire," with a call to restrict the marketing of all energy drinks.
Another recent example of underwhelming public-health journalism comes from USA Today's Nanci Hellmich in a piece largely focused on a Robert Wood Johnson Foundation publication that claims changes in school nutrition over the last few years may have brought us to "the turning point" in combating childhood obesity.
Hellmich then notes "[t]he gains are pretty small in some communities"—as if perhaps they're large in others.
While Hellmich cites a few examples, I wanted to see the research for myself. Yet I found one immediate difficulty with Hellmich's article: She doesn't name the RWJF publication or note when or where it was published.
Fifteen minutes of searching later, I found what I believe to be the publication—an "issue brief"—after scrolling through several pages of RWJF's website.
Its title? "Declining Childhood Obesity Rates—Where Are We Seeing the Most Progress?"
One of the four key pieces of data supporting RWJF's claim of states and cities demonstrating "the most progress" comes from California. In that case, one set of students in three different grades in 2005 was 38.44 percent obese and/or overweight. In 2010, a different set of students in the same three grades was 38 percent obese and/or overweight. For those keeping score, that's a drop of 0.44 percentage points, or a relative decrease of just 1.1 percent, in the obese/overweight levels of a completely different set of students over five years.
On top of that, the California Center for Public Health Advocacy, the group that carried out the research with UCLA thanks to a grant from RWJF, also notes in a post announcing the 1.1 percent figure that "the majority of counties in the state [are] still registering increases in obesity rates among school-age children."
And remember, this is one of the states and cities RWJF is touting as having made "the most progress"! (Emphasis mine.)
Even some whom one might expect to latch onto the data in the USA Today and Forbes pieces are openly skeptical.
"Sadly, it's not uncommon to see sloppy journalism on important scientific matters," says Michele Simon, a public health lawyer who advocates in favor of regulating the food and alcohol industries, in an email to me.
"Such misinformation can do a lot of damage to advocates pushing for policy change. We all need reliable science to accurately inform the public and policymakers. Journalists can and should do a better job. And advocates should also do their own homework and be careful not to amplify bad science."
In spite of the current crop of mediocre research and reporting, there are a few bright counterweights.
Take Professor Jayson Lusk of Oklahoma State University, whom I interviewed for Reason earlier this year.
Lusk had just published a study in the journal Food Policy, "The Political Ideology of Food," in which he concluded that the great majority of Americans support increasing the extent to which food is regulated.
Lusk, whose forthcoming book is The Food Police: A Well-Fed Manifesto About the Politics of Your Plate, admitted he found his results "a bit disheartening" but published them anyway.
Me? I hate the results (and challenged him on his data in my interview).
But I love the fact that Lusk displayed the intellectual honesty and courage to publish research that doesn't simply reflect his own values and wishes.
For this alone, Lusk deserves a medal. But he's on a very short list.
One who might agree with that assessment, and with Michele Simon's, is NBC News chief science and medical correspondent Robert Bazell, who recently used a Harvard University teaching hospital's blatant overreach in pushing "weak" research on the sweetener aspartame to remind everyone—from scientists to the media and consumers—of the perils of pushing bad science.
"Not all science deserves publicity," writes Bazell. "Some is not done well."
Bravo. And the same can be said of science journalism—especially when it comes to so much public-health research in the area of food.