Meta-analysis, game theory, and incentives to do replicable research

One of the key insights of game theory is to solve problems in reverse time order. You first figure out what you would do in the endgame, then decide a middle-game strategy to get you where you want to be at the end, then you choose an opening that will take you on your desired path. All conditional on what the other players do in their turn.

In an article from 1989, “Meta-analysis in medical research: Strong encouragement for higher quality in individual research efforts,” Keith O’Rourke and Allan Detsky apply this principle to the process of publication of scientific research:

From the statistical point of view, there really is no escape from performing a de facto meta-analysis. One can either judge the effectiveness of a therapy based solely on the most recent study and ignore all previous studies, a method which is equivalent to giving the most recent study weight 1.0 and all previous studies weight 0, or try to choose the weights on some scientific basis . . . If important differences in study findings exist they must be identified and explained.

That most researchers realize the need for stating their results in the context of previous trials is evidenced by the literature review section in almost all scientific articles. Meta-analysis is a further development and refinement of this approach offering a more rigorous and coherent treatment of past research work. It is tempting to propose that no experimental results should be published without inclusion of an appropriate meta-analysis. In effect, one might suggest that a literature review section ought to be based on an explicitly described methodology in place of the usual ad hoc approach.

So far, nothing exceptional. But then O’Rourke and Detsky continue:

What is it about meta-analysis that will actually help bring about improvement in individual research efforts? . . . the comprehensive, rigorous, and public peer review that a meta-analysis entails will encourage high quality participation by members of the research community in the resolution of the inadequacies. . . .

With a better understanding of meta-analysis in the context of the full scientific research process, meta-analysis is seen as a key element for improving individual research efforts and their reporting in the literature. This in turn will further enhance the role of meta-analysis in helping clinicians and policy makers answer clinical questions.

The idea (if I’m reading O’Rourke and Detsky correctly) is that not only is meta-analysis appropriate for summarizing existing data; the threat or promise of meta-analysis also provides an incentive for researchers to follow better practices in their new projects. If you know (or think there’s a high probability) that your work will be processed through a rigorous meta-analysis, this motivates you to be careful, to supply replication materials (otherwise your study will get a low weight), etc. That’s where the game theory comes in.
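
To make the weighting point concrete, here is a minimal sketch of “choosing the weights on some scientific basis”: a fixed-effect, inverse-variance-weighted meta-analysis. The estimates and standard errors below are invented for illustration, and giving the most recent study weight 1.0 simply corresponds to setting every other study’s weight to zero.

    import math

    # Hypothetical treatment-effect estimates and standard errors from four studies
    estimates = [0.30, 0.10, 0.25, -0.05]
    std_errors = [0.15, 0.20, 0.10, 0.25]

    # Inverse-variance weights: more precise studies count for more
    weights = [1.0 / se**2 for se in std_errors]
    total_weight = sum(weights)

    # Pooled (fixed-effect) estimate and its standard error
    pooled = sum(w * est for w, est in zip(weights, estimates)) / total_weight
    pooled_se = math.sqrt(1.0 / total_weight)

    print(f"Pooled estimate: {pooled:.3f} (SE {pooled_se:.3f})")
    for est, w in zip(estimates, weights):
        print(f"  study estimate {est:+.2f} gets relative weight {w / total_weight:.2f}")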

8 thoughts on “Meta-analysis, game theory, and incentives to do replicable research”

  1. Umm.. you appear to have copy-pasted two columns together into the paragraph starting “That most researchers realize the need..”.

  2. I would have thought that having your work read by peers would have the same effect. Although I suppose getting a quality score is a further incentive to improve the quality of your work (even if the use of quality scores is discouraged).

  3. This can be turned around; faced with published results from scientists whose utilities you believe only reflect “can we get this published?”, the rational thing to do is to discount them – by differing degrees, depending on how strong you think the regression to the mean effect is. (Getting the discounting right is not easy, however.)
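
    One way such discounting could be formalized is to shrink the published estimate toward a skeptical prior centered at zero (normal-normal updating); a smaller prior standard deviation means heavier discounting. A minimal sketch, with invented numbers:

        def discounted_estimate(published_est, published_se, prior_sd):
            """Combine a published estimate with a zero-centered skeptical prior
            (normal-normal updating); smaller prior_sd means heavier discounting."""
            prior_precision = 1.0 / prior_sd**2
            data_precision = 1.0 / published_se**2
            post_var = 1.0 / (prior_precision + data_precision)
            post_mean = post_var * data_precision * published_est
            return post_mean, post_var**0.5

        # A published effect of 0.50 (SE 0.20) under mild vs. strong skepticism
        for prior_sd in (1.0, 0.25):
            mean, sd = discounted_estimate(0.50, 0.20, prior_sd)
            print(f"prior sd {prior_sd}: discounted estimate {mean:.2f} (posterior SD {sd:.2f})")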

  4. One of the key insights of game theory is to solve problems in reverse time order. You first figure out what you would do in the endgame, then decide a middle-game strategy to get you where you want to be at the end, then you choose an opening that will take you on your desired path.

    Gee, this is how you write murder mysteries, an insight which has been around since before there was game theory.

  5. Yes, it was very much about “incentives for researchers to follow better practices” as well as research funders to reconsider the costs/benefits of funding research that was carried out and reported this way (Detsky was a health economist trained at Harvard).

    “Strong encouragement for better” was more polite than “Penalties for crummy,” but the paper was worked through from the management science perspective (which I had learned in MBS school) that “if folks are not doing what you want, the perceived rewards and penalties need to be changed,” rather than from formal game theory (which surely informs that).

    Most statisticians (and all those at my institution) seemed dead set against helping anyone do any kind of a meta-analysis of published results, fearing that (unrecognized) garbage in would surely pollute everyone’s understanding of current research results. Sanitizing the garbage (bias modelling) was perceived as hopeless and even impossible (not identified) by some. But refusing to help just left the status quo as it was, and it needed (and still needs) to be changed.

    What we missed pointing out until years later was that bias modelling was still required even when putting weight 1 on the current study one was involved in – unless one could credibly argue it was non-exchangeable with other similar studies being conducted (which is hard to do, or at least prone to being seen as arrogant). The bias terms should be attached to the individual study if it comes from a class of studies subject to the biasing possibilities.

    So, most statisticians just stuck with the attitude of “I only have to be concerned with the study I was involved in [and not the class it came from].”

    If the paper had much of an impact, it was more likely with the clinical research funding agencies, which started to demand a meta-analysis prior to funding proposed RCTs (unless none at all had been attempted before); that practice started, I believe, in Australia around 2001.

  6. Pingback: Science is broken. | The Thinker
