
Zero Dark Thirty and Bayes’ theorem

A moviegoing colleague writes:

I just watched the movie Zero Dark Thirty, about the hunt for Osama Bin Laden. What struck me about it was: (1) Bayes’ theorem underlies the whole movie; (2) the CIA top brass do not know Bayes’ theorem (at least as portrayed in the movie).

Obviously one does not need to know physics to play billiards, but it helps with the reasoning.

Essentially, at some point the key CIA agent locates what she strongly believes is OBL’s hiding place in Pakistan. Then it takes the White House some 150 days to make the decision to attack the compound. Why so long? And why, even on the eve of the operation, were the senior brass only some 60% sure OBL was there?

Fear of false positives is the answer. After all, the compound could belong to a drug lord, or some other terrorist. Here is the math:

There are two possibilities, according to the movie: OBL is in a compound (C) in a city, or he is in the mountains in the tribal regions. Say P(OBL in C) = 0.5.

A diagnosis is made on the compound on the basis of X criteria (e.g. high walls, nobody enters or leaves except a courier, etc.).

Say the “compound test” is positive (+) when all criteria are met and negative otherwise. The CIA agent is very sure this is OBL’s compound. In particular, say P(+|OBL in C)=0.9.

This would seem like a “slam dunk,” but what if, say, P(+|OBL not in C)=0.6? Then, by Bayes’ theorem, P(OBL in C|+) = (0.5)(0.9) / [(0.5)(0.9) + (0.5)(0.6)] = 0.45/0.75 = 0.6, i.e. only 60%, which is what the senior brass believe on the eve of the operation.
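
The arithmetic above can be checked in a few lines of Python (a sketch; the 0.5, 0.9, and 0.6 inputs are the movie-derived guesses from this post, not real intelligence estimates):

```python
def posterior_obl(prior=0.5, p_pos_given_obl=0.9, p_pos_given_not=0.6):
    """Bayes' theorem: P(OBL in C | compound test is +)."""
    joint_obl = prior * p_pos_given_obl          # P(OBL in C and +)
    joint_not = (1 - prior) * p_pos_given_not    # P(OBL not in C and +)
    return joint_obl / (joint_obl + joint_not)

print(posterior_obl())                       # ≈ 0.6 with a 60% false-positive rate
print(posterior_obl(p_pos_given_not=0.1))    # ≈ 0.9 with a 10% false-positive rate
```

Dropping the false-positive rate from 0.6 to 0.1 is what moves the posterior from the brass’s 60% to the agent’s 90%.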

One can play around with the parameters, but I draw three observations:

First, I don’t know if this was the right assessment or not, but what I gather from the discussion, as portrayed in the movie, is that the top brass did not focus explicitly on debating the inputs to Bayes’ formula. The evidence was not structured, but spoken casually in hunches and guesses.

Second, it seems the debacle over WMD in Iraq has made the CIA more worried about false positives!

Third, given what the film portrays about the compound, I think the chance of it belonging to a drug lord was pretty minimal, so I would say P(+|OBL not in C)=0.1. If so, P(OBL in C|+) is 90%. This suggests the White House was extremely risk averse: it seemed very likely they had OBL and not somebody else, yet they sat on it for half a year (at substantial risk to the homeland)!!!! (Alternatively, the fear of false positives was getting in the way of the top brass’s judgment; e.g. maybe they thought it was 90% but did not want to say it.)


  1. pritesh says:

    The risk aversion arises from the method the President chose; there is no risk if you use drones to blow up the compound.

    • MGC says:

      There is risk to the extent that clearly someone important was hiding in the compound, the location is near a military school, and Pakistan is an unstable country, has nuclear weapons, and is a key “ally” against the Taliban.

      Shooting from afar does not imply you are out of harm’s way, or that you are not hurting your other objectives and goals.

  2. Entsophy says:

    From my past and current adventures in the Intelligence community, I would say you’re about 50+ years too late if you want to make Bayesian/Decision Theoretic methods commonplace there.

    Individual analysts often do this kind of Bayesian analysis as a sanity check even though they may not show it to anyone. The analysts may not have a good sense of each probability, but they can often work backwards from their supposed conclusions and show they would imply implausible values for the input probabilities.

    More importantly, there are many situations where a mass of data has to be analyzed in real time. China building up for an invasion of Taiwan is a classic example. The buildup would likely appear as “signals” in a mass of very diverse data sets/indicators. If these could be analyzed in real time, they would probably give a very accurate warning of an invasion. It may not be practical to do this by hand in real time, so a computer will have to be programmed to assess and draw conclusions from this mass of data. If you’re the one programming the computer, you don’t have many choices other than to use a Bayesian/Decision Theoretic model.

    I’ve heard tell, though, that Frequentists believe the Bayesian/Decision Theoretic stuff is metaphysical nonsense and were pushing the US government to bribe China into invading Taiwan 1000 times so that they could calibrate their Frequentist models. I haven’t heard anything about that proposal in a while, so maybe it’s been dropped.

  3. Zach says:

    I was hoping a discussion of the treatment of uncertainty in Zero Dark Thirty would appear here.

    There were two scenes where CIA agents attempted to quantify their uncertainty. In scene 1, some analysts rattle off some overly precise sounding estimates of the probabilities that various types of people live in the compound. I can’t imagine what model they used or what their methods were. It seemed stupid.

    In scene 2, agents go around the table assigning a rough number value to their certainty level that UBL is in the compound given all the evidence. I thought they went about this in a reasonable way. They seemed to basically make a composite predictor called “strength of evidence” and perform a univariate mental regression with past cases they’ve experienced as observations and whether-the-evidence-pointed-to-the-correct-conclusion as the response. Iraq rightly weighed heavily because the strength of evidence had seemed very strong but pointed to the wrong conclusion. In situations like this where it would be very hard to construct appropriate quantitative models, I think the approach they took was quite reasonable.

    Now, the moviegoing colleague (MGC) from the post above writes, “given what the film portrays about the compound, I think the chance of it belonging to a drug lord was pretty minimal, so I would say P(+|OBL not in C)=0.1. If so, P(OBL in C|+) is 90%. This suggests the White House was extremely risk averse.” MGC is ignoring the possibility that someone other than a drug lord or UBL is living in the compound. It could have been another high-ranking terrorist. Indeed, one of the agents says something like “I’m virtually certain there’s a high value target in there. I’m just not sure it’s Bin Laden.” The only reason they thought it was Bin Laden and not another terrorist was that someone who had delivered messages for Bin Laden in the past was also living there. Certainly good evidence that it’s Bin Laden (justifying the 60%-80% certainty levels people were expressing), but hardly ironclad.

    Finally, I liked the way the movie didn’t completely dismiss the people who were uncertain as hapless wafflers lacking the courage of their convictions. Yes, that reading was made available. (The protagonist states her certainty level as, “100%. Fine, 95% since certainty freaks you guys out, but it’s 100.”) But I thought the movie also made available the reading that Jessica Chastain was not being reasonable about her uncertainty even if her point estimate (the same as everyone else’s at the table) was correct. And that’s a major step forward! Movies almost always unreservedly lionize the person who follows their gut in the face of pathetic vacillating bureaucrats. At least this movie acknowledged the possibility that the process of thinking through uncertainty might have a place.

    • revo11 says:

      In some fields, overly precise estimates are used as a signaling mechanism to show that your estimate is derived from some thorough data-based calculation and you aren’t pulling a number out of thin air. “73.24%” sounds like it must have been derived from a calculation, whereas “70%” sounds like you might have just made a number up. This is frustrating to deal with when it’s obvious a priori that such precision isn’t warranted.

      I haven’t seen the movie, but that “100%” comment reminds me of a part in Nate’s book. He talks about how a poker player in the World Series of Poker could be completely wrong in their probability calculation, but because the TV audience can see all the cards and the player happens to get lucky, they come off looking like psychic poker Jesus.

    • Nameless says:

      There is a hole in the argument.

      Are we sure at any point that there is at most one compound in Pakistan that fits these criteria?

      We assume that, if OBL were in Pakistan, he would most likely modify his own compound to fit them: P(+|OBL in C, this is OBL’s compound)=0.9.

      We can also theorize that there is an expected number ‘x’ of compounds satisfying these criteria (housing drug lords, other high ranking terrorists, etc) in Pakistan even if OBL is not there. Suppose that x=0.5 and the distribution is Poisson.

      If I did the math correctly, that makes P(OBL in C|+) = 0.67. Furthermore, the number that we really want: the probability that this particular compound, which fits the criteria, happens to house Bin Laden, is only 0.53.

      Even if we lower x to 0.105 (which would mean a 90% probability of there being no such compounds in the absence of OBL), P(this is OBL’s compound|+)=0.846.
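
      One way to formalize the multiple-compounds worry is to condition on “at least one ‘+’ compound exists,” with third-party ‘+’ compounds Poisson-distributed. This is a sketch under those stated assumptions; it is not necessarily the exact conditioning Nameless used, so its output differs slightly from the 0.67 quoted above:

```python
import math

def p_obl_given_some_plus(prior=0.5, p_plus_own=0.9, x=0.5):
    """P(OBL is in a compound | at least one '+' compound exists),
    where the number of third-party '+' compounds is Poisson(x)."""
    p_none_third = math.exp(-x)                        # P(no third-party '+' compound)
    p_plus_if_obl = 1 - (1 - p_plus_own) * p_none_third  # some '+' exists, OBL in C
    p_plus_if_not = 1 - p_none_third                     # some '+' exists, OBL not in C
    num = prior * p_plus_if_obl
    return num / (num + (1 - prior) * p_plus_if_not)

print(round(p_obl_given_some_plus(), 3))   # ≈ 0.705 under this particular variant
```

      Either way, the qualitative point stands: once other compounds can trigger a ‘+’, the posterior drops well below the single-compound 0.9 calculation.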

      • Nameless says:

        It also means that we can’t believe simultaneously that P(this is OBL’s compound|this compound is ‘+’,OBL in C)=0.9 and that P(there exists a ‘+’|OBL not in C)=0.6.

        If x is relatively high, say, 0.912, then there is a significant risk of a false positive, namely that OBL is not even in a compound: P(there exists a ‘+’|OBL not in C)=0.6. But then there’s also a significant risk that OBL is in a compound, just not in that particular compound:

        P(OBL is in one of other ‘+’s in the country|this compound is ‘+’,OBL in C) = 0.32
        P(OBL is not in a ‘+’|this compound is ‘+’,OBL in C) = 0.06
        P(this is OBL’s compound|this compound is ‘+’,OBL in C) = 0.62

        On the other hand, if x is low, then the first probability is high and the second must be low. To get P(this is OBL’s compound|this compound is ‘+’,OBL in C)=0.9, we need to set x=0.177, which makes P(there exists a ‘+’|OBL not in C)=0.16.

      • MGC says:


        Like any argument, if you change the assumptions you change the conclusions. My argument is logical; not very realistic, but useful.

        Like you, I also struggled with the possibility of there being other compounds that could pass the test (i.e. get a +), but then decided to ignore it (to simplify). The idea is that there is only one suspect compound, and then they apply further tests, so everything is conditional on its being suspect to begin with.

        The above seems like mixing the likelihood and the prior. More realistically, one could think of the inferential sequence as follows: (i) a suspect compound is found (this sets the agent’s prior); (ii) a sequence of tests is carried out from Washington, like satellite evidence, and for each step in the sequence a likelihood and posterior are calculated, generating an updated prior for the next step; (iii) at some point the evidence is so strong that an attack is launched.

        My analysis above is a gross oversimplification of what in reality must be a complicated and haphazard process. Even so, I think the main conclusions hold.
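
        The sequential scheme in (i)-(iii) is easiest to sketch in odds form, where each test multiplies the current odds by its likelihood ratio. The specific ratios below are made-up illustrations, not numbers from the movie:

```python
def update_odds(prior_prob, likelihood_ratios):
    """Sequential Bayesian updating: posterior odds = prior odds * product of LRs."""
    odds = prior_prob / (1 - prior_prob)
    for lr in likelihood_ratios:   # LR = P(evidence | OBL there) / P(evidence | not)
        odds *= lr
    return odds / (1 + odds)       # convert odds back to a probability

# hypothetical tests: satellite imagery, courier pattern-of-life, no phone/internet
print(update_odds(0.5, [1.5, 3.0, 2.0]))   # each test with LR > 1 strengthens the case
```

        The order of the tests does not matter here; only the product of the likelihood ratios does, which is why “the posterior becomes the next prior” is equivalent to one big update.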

        • Nameless says:

          Yes, simplifications are great, as long as they are justified. Here one of the key factors we’re dealing with is the likelihood that someone other than OBL might have a compound that fits our criteria. It seems to me that it should be one of the input variables.

          At the very least we need to spell out our assumptions and our definitions carefully. For example, as the post is written, the definition of P(+|OBL in C) is ambiguous. If we mean “probability that criteria are met, conditional on having OBL in this compound”, that’s a useful input variable, but saying that “P(+|OBL in C)=0.9” is not at all the same as “the CIA agent is very sure this is OBL’s compound”. What the CIA agent is really “very sure” of is that, if OBL is in _a_ compound somewhere, then he is in this particular compound. You don’t need to go full Poisson to see that “the likelihood that someone other than OBL might have a ‘+’ compound” and “the likelihood that OBL is in this specific ‘+’ compound, assuming that he is in a compound” are not independent.

          Ignoring the possibility of having multiple compounds from the beginning blurs things and makes it harder to think straight. You can model third-party compounds with a Poisson distribution, or you can simplify by allowing the existence of at most one third-party compound, or you can do that and then apply Bayes’ theorem one more time by introducing an observation that “there’s exactly one compound in Pakistan that fits the criteria”. Those are all good options. But trying to squeeze everything at once into a single application of Bayes’ theorem would be oversimplifying.

          • MGC says:

            I agree the language in my post is not clear. I wrote it off the cuff after the movie, and kept it simple so high school kids or undergrads might relate to it. It’s a fun example.

            In light of your comments, I would make it extra clear that the example starts off conditional on a suspect compound. Then the question is whether OBL is in that specific compound as we gather more evidence.

      • Steve Sailer says:

        “Are we sure at any point that there is at most one compound in Pakistan that fits these criteria?”

        Good question.

        The night of the SEAL Team Six raid on Bin Laden’s compound, I got on Google Maps and, going off of verbal descriptions, located the compound a mile north of the Pakistani military academy. No problem finding it.

        The next day, however, I found out I had gotten the wrong compound. Bin Laden’s compound was a mile south of the military academy. Sorry Mr. Non-Bin Laden Compound-Dweller!

        They were the same size, same look.

    • MGC says:


      You state: “MGC is ignoring the possibility that someone other than a drug lord or UBL is living in the compound. It could have been another high ranking terrorist.”

      I stated: “Third, given what the film portrays about the compound, I think the chance of it belonging to a drug lord was pretty minimal, so I would say P(+|OBL not in C)=0.1.” The reference to the drug lord is just an example; the conditional probability includes all other possible inhabitants except OBL. I.e., P(+|OBL not in C) means the probability of a positive conditional on *anyone* other than OBL being in C.

      • Zach says:

        Yes, I agree, the conditional probability P(+|OBL not in C) includes everyone other than OBL (or UBL as they call him in the movie, which it took me a while to get). That was really my only point. You seemed to be saying that it’s unlikely a drug lord would have a compound like that, so P(+|OBL not in C) is low. I’m saying that it’s not unlikely that another high ranking terrorist would have a compound like that, so P(+|OBL not in C) is not that low.

      • Paul says:

        I went through a similar mental calculation while watching the movie, eliminating the possibility that the compound’s inhabitant was a drug lord. Then yesterday I came across an article which mentions that there were huge duffels of raw opium under each bed in each room in the compound.

  4. Jonathan (a different one) says:

    Actually, what I liked most about the probabilistic reasoning was when one guy says, “It’s 60 percent, and that factors in Jessica Chastain’s 95 percent.” What is unstated is whether that’s because she didn’t get much weight or because the other non-JC posteriors were so low. But the implication is that her certainty isn’t new evidence, or certainly not independent evidence in the calculation.

  5. numeric says:

    This suggests the White House was extremely risk averse: it seemed very likely they had OBL and not somebody else yet they sat on it for half a year (at substantial risk to homeland)!!!!

    Agreed, Obama is risk-averse (this is why he uses a teleprompter so much: not because he can’t speak without one, but because he knows that if he phrases something even somewhat ambiguously, as with “didn’t build that,” where what he meant “that” referred to was different from what the Republicans claimed, it will be seized upon). But there was no “substantial” risk to the homeland. Bin Laden, through the slow process of relying on messengers rather than electronic means, had long ago lost any operational control over Al Qaeda. He was more of a senior theorist, like the non-productive faculty members around most academic departments who can nonetheless muck things up. What his death does allow is for the US to leave Afghanistan with the claim that it has accomplished its goals (which was presumably killing Bin Laden). So that is a face-saving result for America.

    And it helped Obama get re-elected, but the real story of the election (and of all American elections since 1960; sorry, Andrew, Red State Blue State should be called Racist State, Not So Racist State) was the continuing racially polarized voting in the electorate. Just as a thought experiment, think of all those academic models where voters choose a president on the basis of how “close” they are to a party’s positions. Then consider how far to the right Romney moved (under pressure in the primaries, of course), and how it didn’t matter to white voters (61% for Romney)! The problem for national Republicans is that there is now an offsetting bloc of minority voters who are just as obdurate in their voting propensities as whites. As another thought experiment, plot the minority composition of the electorate versus the Democratic share of the vote since 1964. Then do the same for the economic models. Which would have greater explanatory power?

  6. Dan says:

    A lot of these posts (and the government analysts in the film) focus on the problem as if it’s just a random guarded house in Pakistan. Except OBL’s courier led them to the house, which (rightly) affects both the probability of whether or not he’s in a compound, and whether or not he’s in THIS compound.

  7. joe says:

    What I believe many people are forgetting is that we need to factor in (as the movie mentions) that a year earlier there had been a false positive resulting in X number of deaths. What was the evidence there, and if it was similar, perhaps an update shifting the probability from “Maya” toward “brass” is warranted.

  8. Mike says:

    Ironically, this is a great example of the pitfalls of Bayesian analysis. Let me explain. As Nameless pointed out, the fundamentals of this calculation are way off. We’re interested in the probability that OBL is in that particular compound, not just a compound. The prior for that is certainly not 0.5, an overestimate by a factor in the thousands or more. Given that low prior, you may as well set P(info | OBL there) to be 1 — the update will be almost entirely driven by P(info | OBL not there). That probability needs to be exceedingly low, which is exactly why details of the compound were not enough. They needed the link to OBL’s courier.

    This demonstrates a sneaky problem with Bayesian analysis: the ability of the analyst to tweak the prior to get the conclusion he or she wants. This is especially dangerous when using ex post knowledge. It may seem reasonable now to say that of course OBL was hiding there, what was Obama waiting for, but at some point it was just some house in the middle of nowhere.
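
    Mike’s point about the prior can be made concrete: with a per-compound prior on the order of one in ten thousand, the posterior is driven almost entirely by how rare the evidence is under the “not OBL” hypothesis. The numbers below are purely illustrative assumptions, not estimates from the film:

```python
def posterior(prior, p_evidence_if_there=1.0, p_evidence_if_not=1e-5):
    """P(OBL in THIS compound | evidence), with P(evidence | OBL there) set to 1
    as Mike suggests, so the false-positive rate does all the work."""
    num = prior * p_evidence_if_there
    return num / (num + (1 - prior) * p_evidence_if_not)

# a generic fortified compound: such evidence is common, so the posterior stays tiny
print(posterior(1e-4, p_evidence_if_not=1e-2))   # ≈ 0.0099
# add the courier link: evidence now exceedingly rare elsewhere, posterior jumps
print(posterior(1e-4, p_evidence_if_not=1e-6))   # ≈ 0.99
```

    This is why the compound’s physical features alone were never going to be enough, while the courier link could be.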

    • MGF says:

      I hear your concerns. More than tweaking priors, the difficulty as I see it is the continuous learning process. You seem to want to start from a position where we know nothing. After all, OBL could have been in any house anywhere in the world: why limit it to Pakistan? If so, the probability he is in any given compound is infinitesimal.

      I choose to start from the suspect compound so everything learned up to that point is the prior. Now the question is whether OBL is in that specific compound.

  9. Steve Sailer says:

    Here’s a question for you that might put this in perspective: Was this the least bad statistical analysis scene in the history of action movies?


    Let’s not get spoiled by “Moneyball” into not appreciating statistically pretty good scriptwriting.

  10. MGC says:

    One aspect not discussed in the post is what the best hiding strategy for OBL would be. The problem with OBL’s compound is that it was so obviously trying to hide someone that it was actually quite transparent. Had OBL known the sort of criteria used to make the decision to attack, he might have modified his hiding strategy. Obviously “hiding in plain sight” is an option, but with attendant risks. What is the nature of the tradeoff? How does one formalize it? Anyone want to run some simulations for the optimal concealment strategy?

    • Steve Sailer says:

      “The problem with OBL’s compound is that it was so obviously trying to hide someone it actually was quite transparent.”

      It worked for a half-dozen years despite a $25 million bounty on his head.

      • MGC says:

        Ah yes. Maybe it has something to do with the ineptitude of the pursuers:

        1. See point in post about top brass not disciplining their thoughts using Bayes

        2. See the moment in the film when an intern reports that much of the intel received after 9/11 was not filed adequately and went overlooked. This in the age of Big Data, scanners, OCR, etc.

  11. APC says:

    How would you apply a Bayesian analysis to this question:

    Is keeping a gun in your home more likely to save the life of someone in your home (by shooting or scaring off an intruder with murderous intent) or to cause the death of someone in your home (by accident, murder, or suicide)? How would you also take into account the risk that your gun will be incorrectly used to save the life of someone in your home, i.e., result in the killing of an innocent person (like the kid shot because he pulled into the wrong driveway)?

    • Steve Sailer says:

      What about the social impact? I live in a liberal-voting part of Los Angeles where Obama had what might be his single most lucrative fundraiser in 2012. Homeowners here tend to be heavily armed (especially since the Rodney King riots). The sporting goods store nearest to CBS Studios, for example, sells a large assortment of guns, and not many locals are hunters. My general impression is that people in the entertainment industry pack a lot of heat.

      The result: burglary, home invasion, car jacking, even graffiti have dropped sharply over the years.

      In general, armed neighbors make for a lower risk of property crime. It’s a lot like vaccinations. The ideal individual outcome is for everybody else’s child to be vaccinated, but not your own child.

  12. Steve_NM says:

    A fundamental assumption held to be true in all this debate is that statistical modelling is an appropriate tool to use in this situation (or in any situation similar to this one).

    What evidence can be presented to show that this is the case?

  13. Glen says:

    For the White House decision, it seems the main focus was on how to do it, not whether he was there. The fact that they decided to strike in a high-risk operation implies they thought there was more than a 60% probability he was there. I’d guess it was 75%+, considering the risks of extraction after the operation were high, as interception by scrambled Pakistani jets would have been disastrous.

    Once they decided not to go with the drone strike (as identifying Bin Laden’s body may have been impossible), then the planning and training needs for the operation were considerable. Getting that done in under a couple of months is high speed.


  16. David Karger says:

    Using Bayes’ theorem to get a probability of OBL’s presence isn’t enough; the important calculation is the expected utility of sending in the SEAL team. If the possible loss (should OBL be missing) is large compared to the possible gain (if OBL is present), then one needs an extremely high probability of OBL’s presence to justify taking the risk.

    • MGC says:

      Yes, that is a good point, but there are many possible courses of action other than the SEALs. According to the movie, the White House’s fear was not failing to get a target so much as making sure the target was OBL. That was the major risk.

      So the status quo (SQ) is to do nothing, and doing something is a lottery between the best outcome (OBL in C) and the worst outcome (OBL not in C). If the probability of success is 0.9, I would say it’s reasonable to assume EU(do something) > U(SQ) for a risk-neutral president.
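
      The expected-utility framing can be sketched with made-up payoffs; the utility numbers below are purely illustrative, and only the lottery structure follows the comment:

```python
def expected_utility(p_obl, u_success, u_failure):
    """EU of launching the raid: a lottery over the best and worst outcomes."""
    return p_obl * u_success + (1 - p_obl) * u_failure

U_STATUS_QUO = 0.0    # do nothing: the baseline
U_SUCCESS = 100.0     # OBL is in the compound (hypothetical payoff)
U_FAILURE = -200.0    # wrong compound: diplomatic fallout, lives at risk

for p in (0.6, 0.9):
    eu = expected_utility(p, U_SUCCESS, U_FAILURE)
    print(p, eu, "go" if eu > U_STATUS_QUO else "hold")
```

      With these (assumed) asymmetric payoffs, a risk-neutral decision maker holds at 60% confidence but goes at 90%, which is one way to read the gap between the brass’s number and the agent’s.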


  18. Bob the Builder says:

    Why is everyone focusing on the compound in isolation? If the movie is true to what actually happened, we know that someone who was suspected to be OBL’s TOP courier resided at this compound. That alone disqualifies nearly every other compound in Pakistan, no?