A recent article by David Freedman, "Survival of the Wrongest," discusses some of the perils of health journalism. David Gorski has already pointed out the irony of the article, given that Freedman's 2011 article on so-called alternative medicine was highly criticized by proponents of science-based medicine as terrible health journalism.

I won't bother repeating David's excellent deconstruction, but I would like to explore some of the same issues he raised.

Irony aside, Freedman does bring up a valid point - medical research is error prone, and the stories that journalists tend to report are the ones that sound the most counterintuitive and dramatic, which are also the ones most likely to be wrong.

This is essentially correct, but there are other challenges to responsible health journalism that don't necessarily deal with error.  

At least as important as the fact that many published studies reach conclusions that will turn out to be incorrect is the fact that medical research is horrifically complex. It really does take a high degree of expertise to sort it out. As an academic physician and educator, I would not even try to make sense of a research area sufficiently outside my area of expertise.

Rather, I rely on experts in the field to make sense of it for me. But even then, it is challenging to dissect and synthesize those expert opinions (proportional to the degree of controversy over the topic of interest). Sometimes I conclude that the task of wrapping my head around an issue is simply too daunting, and I defer giving any opinion about it. Other times, when possible, I simply ask an expert to synthesize the current findings for me (after getting as far as I can on my own). At other times nothing short of several expert opinions will get me there.  

In other words, it takes a fair degree of expertise and background research just to know how to access and summarize the current consensus of scientific opinion on a complex topic, or to fairly represent an ongoing controversy. That is essentially the task of a science journalist.  

So I sympathize with non-expert journalists who try to do this, and I understand why they generally (in my opinion) do such a poor job. Rather than just criticize this, however, I think it would be more fruitful to consider ways of improving the situation.  

I have to say also that there are many excellent science journalists out there. But even the good journalists I respect don't always do a science topic justice, and they make mistakes in their reporting that an expert simply would not make.

There is no one answer to this problem, but here are some rules of thumb that I think would help.  

Recognize and Respect the Complexity of Science  

First, journalists need a greater understanding of the challenges of good science journalism. Unless you have a decent formal science education and voraciously consume science information, you probably should just avoid science stories altogether. This problem, however, has become worse in recent years as news outlets struggle with the changes the internet has wrought on the profession, and if anything it is now more common for non-science journalists to report on science stories.

There is also a bit of a Dunning-Kruger effect going on - journalists who are not scientifically literate are too scientifically illiterate to recognize and understand the implications of their own scientific illiteracy. So they gleefully wade into stories they have no business reporting on.

Use Multiple Experts  

Second, science journalists need to recognize that no single expert can ever give them a complete view of a science story. There will almost always be a range of opinions, and any one scientist is going to give a biased perspective.  

Further, don't assume all experts are equal. Some "experts" may in fact be cranks. This is related to the "false balance" issue, in which non-controversial science is presented as if it were controversial.

Put Science Stories Into Perspective  

Perhaps the most common mistake that I see science journalists make, however, is the failure to put a science news story into any kind of perspective. Research is not done in a vacuum - often a study is performed to address an ongoing question or controversy. The results are just one piece of data, perhaps in a complex back-and-forth.

For example - are low carb diets helpful? Every time a study comes out addressing this question, it is reported as if it were the final word: it turns out low carb diets are finally proven, or low carb diets are no good. Whatever the conclusion of the most recent study, it is reported as definitive and final.

Rather, journalists should report science like it actually works. Lay out the controversy, try to fairly represent the current state of the evidence, then show how this new study fits into the overall picture. This, of course, takes a lot more work and a deeper understanding of science in general and the topic at hand.  

Understand That There Are Different Kinds of Evidence  

Often I see a study presented as just that - a "study," without further clarification. Sometimes the study in question is just a review, and not new data at all, but it's presented in the media as if it is new evidence.  

Different kinds of evidence also serve different purposes. There is basic science, animal research, clinical research, and observational studies of various kinds. Observational studies generally cannot establish cause and effect, and basic science studies cannot dictate clinical claims.  

Unless a journalist has a working knowledge of the different kinds of published research, their strengths, weaknesses, and proper uses, they really have no business reporting on scientific research.  

You Don't Have to Report Everything  

Much of what goes on in science is preliminary and exploratory research. These studies serve the purpose of pointing the way toward further research. It is these studies that are more likely to be wrong than ultimately verified.  

Honestly, this type of research should probably not be reported on at all to the general public. It is the noise in the background of scientific advance. Throwing in caveats at the end pointing out that the research is preliminary has little effect, because most people will remember the outcome (if they remember it at all) but not the source or the status (preliminary).  

I can see popular journals for science nerds reporting on such research, in sections clearly labeled as preliminary findings only of interest to other researchers. Or in articles about possible future science - again clearly pointing out the context that this is speculative and not ready for prime time. Otherwise they can be accessed in online versions of the technical journals themselves, and honestly if you cannot follow the technical article you probably have no legitimate interest in the research itself.  

Stories that are appropriate for general reporting to the public should involve confirmatory research findings that actually affect what we think about a topic, how we practice medicine, or how people should manage their health. Otherwise reporting serves more to confuse people than to inform them.

This is especially true when multiple failings of responsible health journalism are combined - reporting preliminary findings without putting them into proper context, without properly using experts, or without fairly representing the type and reliability of the new data.

The Responsibility of Scientists  

Finally, scientists themselves (and their institutions) need to take responsibility for good science reporting. Scientists should engage more with journalists, and understand something about journalism themselves. Effective science journalism is about cultivating a working relationship between journalists and scientists.

Blogs and other social media have lowered the bar for being a journalist, and some scientists have taken advantage of this to successfully become their own journalists - reporting on their field and even branching out into science reporting in general. I think this is a good thing. In essence you need to be both a scientist and a journalist to be a good science journalist. You can either teach science to journalists, teach journalism to scientists, or have a collaboration between the two. (These options are not mutually exclusive - they all can work.)

Imagine how nice it would be if the media were generally a force for increasing scientific literacy in our society and promoting good science, instead of being a net force dragging society further into pseudoscience and confusion.

Steven Novella, M.D., is the JREF's Senior Fellow and Director of the JREF's Science-Based Medicine project.