
Wednesday, February 22, 2006


Big Fat Science, Over-Simplified

A few recent news stories got me thinking about bad science reporting and bad science promotion. The big story, with potentially very dangerous consequences, was the results of the Women’s Health Initiative long-term study. You probably heard about it as “study shows low-fat diets have no health benefits”. In fact, as far as I can tell, the study was simply a failure because the women in the study didn’t actually follow the diet. The researchers had a goal of reducing the fat intake of some 40,000 women in their 50s and 60s to a level of 20% of their calories from fat. Most small-scale studies suggest dramatic benefits from diets with 7-10% of calories from fat, but that was considered too ambitious by these researchers. So, through a series of seminars, they taught the women how to moderately limit their fat intake, then asked them every few months to report on their eating habits, and also had them come in for a medical exam every year. They had a “control” group of some 60,000 women who were asked to eat “normally” but were given the same questionnaires and exams. I saw one site that said something like $400 million was spent on this study over a period of 10-15 years, although I haven’t verified the accuracy of that number. However, a study involving 100,000+ women over about a decade is an enormous investment of researchers’ time and taxpayers’ money.

The result? Well, the self-reported fat intake during the first year was 24%, and the “low-fat” group lost an average of 5 pounds. Not what they wanted, but perhaps enough to say something, since the control group reported about 35% fat intake. But by year 5, the women in the “low-fat” group were reporting 29% fat intake and were within a pound of the control group. Now, remember, they were given extensive lessons in what the researchers wanted from them. They knew very well what they had agreed to do, and their diet was self-reported not through daily logs of what they ate but through questionnaires about their eating habits over the previous several months. This is almost a classic exercise in how to get biased data. Undoubtedly, their reported intake was significantly less than their actual intake. The evidence of this bias is the essentially identical weight of the two groups. So, in other words, what really happened was that, for the most part, the women in the “low-fat” group moderately lowered their fat intake for a year or two, then pretty much went back to eating the way they had before. I.e., the study was a massive failure, with the data largely useless because the study subjects didn’t do what they were supposed to. And, surprise, surprise, there was no significant difference between the two groups for diseases which take decades to develop.

But, since many millions of taxpayers’ dollars had been spent, a result had to be announced. And so “there was no significant difference in cancer or heart disease rates in the low-fat group” becomes “Study shows no health benefits from a low-fat diet”. And commentators start saying “break out the Ben and Jerry’s, since it doesn’t matter anyway”.

At the same time this report was coming out, I heard a report on the BBC about a neutrino detector being built in Antarctica. Now, when you hear “news” about telescopes in the process of being built, you can be sure that they are up for a budget review and are trying to get some favorable press to impress the politicians. Anyway, this report contained some obviously false statements by the scientists and the commentator (although they could have been a result of editing that stripped out context). The telescope is being built to detect high-energy neutrinos coming from sources of ultra-high-energy cosmic rays. But in the report, the words “high-energy” were avoided, presumably because that would be too confusing. Instead they made statements like “these elusive particles, called neutrinos, have yet to be detected” and “these things we call cosmic rays, and only 12 have ever been detected” (these quotes aren’t exact, since they are from memory, but they are accurate paraphrases). Now, many thousands of neutrinos have been detected by instruments throughout the world. And millions of cosmic rays have been detected, at energies from MeV to TeV. Every time an astronomer takes an image with a CCD camera, there will be a few saturated pixels from cosmic rays hitting the chip. But at energies approaching a ZeV (that’s 10^21 eV), only a handful have been detected. However, to “simplify”, the good scientist decided to say only 12 cosmic rays had been detected, rather than 12 of these ultra-high-energy cosmic rays.

It also makes them sound more exotic and interesting. But I guess that is just coincidence.

This over-simplification is a problem in a lot of science reporting. Using the simplification excuse, scientists and science reporters bias their reports to be more “exciting” or “controversial”. How many times did Hubble “prove” the existence of black holes? What about the discovery of “quark stars” or “hypernovae”? In the former case, it was often just another measurement similar in principle to many that had been done before, without any real new evidence for the existence of astrophysical black holes. The latter two came from taking the simplest interpretation of the data while ignoring well-known effects that are probably relevant (neutron star atmospheres and beaming, respectively, at least in the initial press releases).

Astronomers can tell themselves that such things don’t really matter. Unlike dietary studies, lives don’t depend on the information. But they do undermine the public’s impression of the legitimacy of science and the statements of scientists. And each example of sloppy reporting or over-interpretation of data makes it that much more difficult to get out important information to the public that DOES affect lives.

So go ahead, eat three cheeseburgers and a pint of ice cream with a double cappuccino chaser. It’s just as good for your heart as brown rice, spinach, and orange juice. Tomorrow’s headlines may say the opposite, but that just proves scientists don’t know anything anyway, doesn’t it?
