Fund the answer you want…

Aleks sends in this article by David Michaels:

More than 90 percent of the 100-plus government-funded studies performed by independent scientists found health effects from low doses of BPA, while none of the fewer than two dozen chemical-industry-funded studies did. . .

Having a financial stake in the outcome changes the way even the most respected scientists approach their research. Scientists make many decisions about the doses, exposure methods and disease definitions they use in their experiments, and each decision affects the result.

For instance, when assessing the risk of exposure to perchlorate, a rocket-fuel ingredient that can affect the thyroid and contaminates many water supplies, scientists on a National Academy of Sciences panel chose perchlorate’s effect on thyroid iodine uptake as the most important indicator of its effect on health. On the other hand, scientists working for companies that might have to bear the costs of perchlorate cleanup selected the chemical’s effect on one thyroid hormone as the basis of their risk estimation. These scientists estimated a safe level for perchlorate exposure nearly three times higher than that of the NAS scientists. . .

Industry researchers design studies in ways that make the products of their sponsor appear to be superior to those of their competitors. . . . “tricks of the trade” include testing your drug against a treatment that either does not work or does not work very well; testing your drug against too low or too high a dose of the comparison drug because this will make your drug appear more effective or less toxic; publishing the results of a single trial many times in different forms to make it appear that multiple studies reached the same conclusions; and publishing only those studies, or even parts of studies, that are favorable to your drug, and burying the rest.

The problem is equally apparent in review articles and meta-analyses, in which an author selects a group of papers and synthesizes an overall message or pattern. Decisions about which articles to include in a meta-analysis and how heavily to weight them have an enormous impact on the conclusions. This was apparent in two different conclusions that came out of National Toxicology Program-sponsored reviews of BPA literature. Two independent expert groups made different decisions about including and weighting studies with particular exposure routes, and the groups expressed different levels of concern about the effects on prostate and mammary glands of fetuses and children exposed to low doses of BPA.

No comment, except that I don’t think it would be difficult to bias results in social science as well. In my own work, I typically feel my way through an analysis, double-checking when things don’t make sense, adjusting the questions I’m asking to fit the data I have, and so forth. And I’m not even in the position of looking for specific outcomes. I can easily imagine how hard it would be to keep the inquiry open when there’s pressure from the funding source (or an implicit incentive to get more funding if you find the right thing).
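
To make the selective-publication and meta-analysis points from the excerpt concrete, here is a minimal sketch of my own (the effect size, trial sizes, and publication rule are made-up assumptions, not numbers from Michaels’ article). It simulates many small two-arm trials of a drug whose true effect is essentially zero, “publishes” only the trials that come out statistically significant in the drug’s favor, and then averages the published estimates as a crude stand-in for a pooled meta-analytic estimate. The published-only average comes out far larger than the true effect.

```python
# Made-up illustration of selective publication feeding a meta-analysis:
# simulate many small two-arm trials of a drug with a near-zero true effect,
# "publish" only the favorable significant results, then pool them.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect = 0.05   # standardized treatment effect, essentially negligible
n_per_arm = 50       # small trials
n_trials = 200

all_estimates, published = [], []
for _ in range(n_trials):
    treated = rng.normal(true_effect, 1.0, n_per_arm)
    control = rng.normal(0.0, 1.0, n_per_arm)
    estimate = treated.mean() - control.mean()
    _, p_value = stats.ttest_ind(treated, control)
    all_estimates.append(estimate)
    if p_value < 0.05 and estimate > 0:   # only favorable results get written up
        published.append(estimate)

print(f"true effect:                 {true_effect:.2f}")
print(f"average over all trials:     {np.mean(all_estimates):.2f}")
print(f"average over published only: {np.mean(published):.2f} "
      f"({len(published)} of {n_trials} trials published)")
```

Changing the inclusion rule or the weight given to each trial changes the pooled number, which is the same issue the two NTP expert groups ran into with the BPA literature.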

3 thoughts on “Fund the answer you want…”

  1. Having done both the academic and industrial bits, I'd say these biases are present in both areas. As an academic researcher, my co-workers and I knew what would get published and/or funded, and that influenced what we worked on. Now, as a scarce resource in an industrial environment without the grant-funding constraints, I have much more freedom to say no…

  2. There is bias in both directions.

    Industry bias is well known, but in medicine at least, government-funded bias is just as obvious if you look for it.

    For example, agencies with regulatory roles fund studies that show the need for more regulation; health-promotion agencies fund studies that show health promotion works; and where the state funds health care, government-funded studies usually report zero advantage for new, expensive treatments (and it is quite easy to design studies that fail to detect a difference between interventions; see the power sketch below).

    The difference is that those who work for the government or are government-funded get to fill in the 'declaration of interest' section with "none".
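
As a rough illustration of that last parenthetical, here is a quick power calculation of my own (the 0.3 standardized effect and the sample sizes are assumptions, not anything from the comment). With only a few dozen patients per arm, a real but modest difference between interventions is missed more often than not, so “no significant difference” is the expected finding.

```python
# Made-up power calculation: probability that a two-arm trial detects a real
# standardized difference of 0.3 at the two-sided 5% level, for several
# sample sizes (normal approximation, outcome SD = 1 in each arm).
import math
from scipy.stats import norm

effect = 0.3                 # assumed standardized difference
z_alpha = norm.ppf(0.975)    # critical value for two-sided alpha = 0.05

for n_per_arm in (30, 60, 120, 250, 500):
    se = math.sqrt(2.0 / n_per_arm)   # SE of the difference in means
    power = norm.cdf(effect / se - z_alpha) + norm.cdf(-effect / se - z_alpha)
    print(f"n per arm = {n_per_arm:3d}: power ≈ {power:.2f}")
```

With these assumptions the power climbs from about 0.2 at 30 patients per arm to above 0.9 only around 250 per arm.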
