Conclusion: toxic fish or toxic reporting?
In the coverage of the Science study, there were many well-written stories that sought to balance discussion of controversial material with caveats and quotes from critics. Among the best were those reported by the New York Times, the Philadelphia Inquirer, the Chicago Tribune, the San Francisco Chronicle, the Vancouver Sun and the Toronto Star.
And even though television scored poorly on our index of key data, a close look at the transcripts shows far more caution about endorsing the conclusions of Hites et al. than many newspaper stories displayed. On one key measure, the EPA’s estimate of the risk posed by the PCB levels found in fish, television news did much better than its print counterparts.
But overall, the media failed to give readers the facts they needed to make sense of the risk from PCBs in salmon. Even when stories provided numeric data, the numbers were incomplete or misleading. For example, many U.S. news stories mentioned only the average PCB level for the entire sample of farmed fish analyzed by Hites et al. (36.6 parts per billion), and not the average levels for farmed fish in the U.S. — or even those for farmed fish sold in the U.S. — both of which were significantly lower.
And it is simply astonishing that, in covering a study claiming to have found an increased risk for cancer, so few news organizations told readers what the likelihood of that risk was, or that the evidence that PCBs are a probable carcinogen in humans is so controversial.
Why this matters
As researchers claim to find risks at ever smaller increments (parts per billion, parts per trillion), we need to ask whether the assumptions behind their reasoning can be justified by evidence and logic. Or rather, journalists need to ask these kinds of questions.
One important consequence of the way the media covered Hites et al. is that the public remains largely clueless about the assumptions that go into assessing the health risks from contaminants in food and the environment, and about how deeply science is divided on the methods for calculating and interpreting risk.
Yet within two weeks of the salmon study’s publication, an editorial in the Los Angeles Times accused the FDA of having “outmoded” standards for contaminants in fish, and of being more concerned about protecting “the food industry’s profit margin” than protecting public health (“The FDA’s Fishy Standards,” 24/01/04). The evidence for these claims? Hites et al. “The report published in Science magazine this month demonstrated the need for the Food and Drug Administration to update its standards for such toxins in fish and all food,” said the Times (emphasis added).
Now, unless you happen to know the backstory, this sounds impressively persuasive; after all, who would argue against updating “outmoded” limits and lowering health risks? Yet the LA Times promotes this policy recommendation on the basis of a partial interpretation of the facts. And, unfortunately, it is through this kind of alchemical certitude that the media, all too frequently, turn science into public policy. A new health-risk study should be treated no differently than a rumor of political scandal: it needs to be checked out, thoroughly, before it appears in print.