When the evidence is unclear

A few months ago I posted on a paper by Bernard Tanguy et al. on a field experiment in Ethiopia where I couldn’t figure out, from the article, where the empirical support for the claims being made was. This was not the first time I’d had this feeling about a claim made in social science research; sometimes there seem to be various gaps between the data, the statistical analyses, and the substantive claims, and the resulting disagreements about the strength of evidence can be contentious. In the case of the Tanguy et al. paper, as with an earlier discussion of a paper by Gertler et al., there were no contentious disagreements, but possibly only because there were no direct exchanges between me and the authors of these papers.

Anyway, I sent the link to the Tanguy et al. paper to political scientist John Bullock, and he replied:

* I [Bullock] would be interested in reading, on your blog, or elsewhere, an appraisal of “classic” empirical political science books like Voting or The American Voter. Those books are still highly respected and widely assigned. Their analyses are simple. But when I read them, it’s sometimes tough for me to figure out exactly what the authors are doing. I wonder whether you would have a similar problem.

* You don’t mention authors’ incentives to be unclear. I fear that they are real. If you make things simple, some people will appreciate it. But some will think that you are simple-minded, and they will penalize you.
Perhaps this reasoning doesn’t explain the lack of clarity that you noted in the Tanguy et al. paper. But I do think that it helps to explain why social scientists are rarely as clear as they should be, especially at the level of the sentence and the paragraph.

7 thoughts on “When the evidence is unclear”

  1. Early in my career, I remember one referee report that said (I’m paraphrasing here), “the result is either obvious or wrong.” I fear that clear results invite that reaction. Then there is the famous Lemons paper by Akerlof, which was rejected 3 or 4 times before being published. By his own account, it was a better paper in the early versions; he had to add considerable mathematics (less clarity, in my view) to get it published.

    • Yeah – but now it has been cited 20,000 times! And it is still short and sweet and clear.

      I think in general there is some incentive to be, if not downright unclear, at least a little bit difficult. For instance, why doesn’t every single RCT paper have a simple 2×2 before/after treatment/control table – just means, easy to read? Because a fancy table with lots of columns and multiple classes of controls and F-stats and whatever else makes it look like it would be really hard to do that analysis. It is a complicated shibboleth.

      But that said, I don’t think that rule applies to the very best work, including Akerlof’s. He got punished for having a new idea – everyone does, in every field, but his “punishment” wasn’t much, people caught on pretty fast that his idea was genius. Similarly with yesterday’s winner: “Aujourd’hui, maman est morte” is about as simple as it gets. You just gotta have the quality and chutzpah to be simple and clear.

  2. Bullock’s comment:

    You don’t mention authors’ incentives to be unclear. I fear that they are real. If you make things simple, some people will appreciate it. But some will think that you are simple-minded, and they will penalize you.
    Perhaps this reasoning doesn’t explain the lack of clarity that you noted in the Tanguy et al. paper. But I do think that it helps to explain why social scientists are rarely as clear as they should be, especially at the level of the sentence and the paragraph.

    The obvious comment is that most “results” in the social sciences aren’t very useful and typically have to be conditioned so much on the dataset being analyzed (forking paths, anyone?) that the incentive is to oversell a result as profound when it is not, rather than to be unclear. I think the lack of clarity comes more from researchers in social science lacking any rigorous training in logic, mathematics, or scientific inquiry, so they often can’t specify exactly what their hypotheses are. To quote Wasserman, most useful ideas are simple.

  3. I don’t have anything substantial to add here. Just that the first author’s first name is Tanguy and his last name is Bernard, not the contrary. So it should be Bernard et al., not Tanguy et al. (you can delete this comment).
