
Bank Shot

Tom Clark writes:

I came across this paper and thought of you. You might be aware of some published papers about the effect of military surplus equipment aid given to police departments. Some economists have claimed to find that it reduces crime.

My coauthors and I thought the papers were likely flawed because of an ecological fallacy problem. They conducted their analyses at the county level, whereas aid goes to local jurisdictions. And, within counties, there is often a lot of missing crime data. So, we did a re-analysis (by Anna Gunderson, Elisha Cohen, Kaylyn Jackson, Tom Clark, Adam Glynn, and Michael Owens), disaggregating to the jurisdiction level. In the course of doing so, we uncovered massive data problems. Those problems, plus the ecological issue, make their results go away.

In any event, the paper I came across today is mind-boggling. It claims to find that military aid reduces suicides. They aggregate not to the county level but to the state level. They then speculate that the mechanism is that people feel safer with militarized police and so have fewer guns in their homes. As with all of these papers, they instrument military aid. The instruments are all weird. (We do not deal with that in our paper in an effort to focus on the data and aggregation problems.)

Here is the suicide paper (by Alexander McQuoid and David Vitt).

Given how significant the policy implications of these claims are, I thought you might want to think about blogging this stuff. Or, maybe you have something to say. I suspect real-world policy-makers are paying attention to these results. It’s maddening.

I’ve not read either of these papers in detail, so I’ll just offer the general comment that ecological regression can make sense—but the treatment effects and outcome have to be on the right scale. Remember that “Reduction in Firearm Injuries during NRA Annual Conventions” story from last year? The numbers just didn’t add up.

Again speaking generally, without reference to the specific work discussed above, I see a naivety in some social-science research, especially but not limited to economics, where some natural experiment or identification strategy is taken to allow causal inference, with all the usual concerns about observational studies forgotten. (Here’s another example, which we discussed a few years ago.)

Causal identification is important, no doubt. But if you have a weak and indirect state-level treatment, and a noisy state-level outcome, you're drawing dead, as they say in poker, and all the instruments and regression discontinuities and difference-in-differences analyses aren't gonna save you.
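To see the point, here's a minimal simulation sketch. All the numbers are made up (this is not based on either paper's data): the true effect of "aid" on the outcome is zero, the instrument is weak, and the outcome is noisy. Two-stage least squares then produces estimates that swing wildly from sample to sample, with no tendency to tell you the truth:

```python
import numpy as np

rng = np.random.default_rng(0)

def tsls_estimate(n=50, instrument_strength=0.1):
    """One simulated 'state-level' dataset; returns the 2SLS coefficient."""
    z = rng.normal(size=n)                                  # instrument
    u = rng.normal(size=n)                                  # unobserved confounder
    aid = instrument_strength * z + u + rng.normal(size=n)  # weak first stage
    y = 0.0 * aid + u + 2.0 * rng.normal(size=n)            # true effect = 0, noisy outcome
    # with a single instrument, 2SLS reduces to the Wald ratio cov(z, y) / cov(z, aid)
    return np.cov(z, y)[0, 1] / np.cov(z, aid)[0, 1]

estimates = np.array([tsls_estimate() for _ in range(2000)])
print(np.percentile(estimates, [5, 50, 95]))  # very wide spread around the true value of 0
```

The identification strategy is formally valid here (the instrument really is exogenous), but with a weak first stage and a noisy outcome the estimator's sampling distribution is so dispersed that any individual estimate is essentially uninformative.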

One Comment

  1. Adrian says:

    …And often the identification strategy itself is just as shaky as the other statistical assumptions, or the identification strategy (even if correct) ends up making things worse from other points of view (e.g., IV estimators and their high variance).

    Having said that, I don't think this is naivety. It's a lack of rigor, given that there are several diagnostics that can be run to check as many of these assumptions as possible.
