
UnConMax – uncertainty consideration maxims 7 +/- 2

Warning – this blog post is meant to encourage some loose, fuzzy and possibly distracting thoughts about the practice of statistics in research endeavours. There may be spelling and grammatical errors, as well as a lack of proper sentence structure. It may not be understandable to many, or possibly any, readers.

But somewhat more seriously, it’s better than “ConUnMax”

So far I have six maxims

1. Explicit models of uncertainty are useful but – always wrong and can always be made less wrong
2. If the model is formally a probability model – always use probability calculus (Bayes)
3. It is always useful to make the model a formal probability model – no matter what (Bayesianism)
4. Never use a model that is not empirically motivated and strongly empirically testable (Frequentist – of the anti-Bayesian flavour)
5. Quantitative tools are always just a means to grasp and manipulate models – never an end in themselves (i.e. don’t obsess over “baby” mathematics)
6. If one really understood statistics, they could always successfully explain it to any zoombie
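To make maxim 2 a little more concrete, here is a minimal sketch of what “just use probability calculus (Bayes)” looks like for a toy coin problem. The hypotheses and numbers are invented purely for illustration:

```python
# Maxim 2 in miniature: once the model is a formal probability model,
# updating beliefs is nothing more than probability calculus (Bayes' rule).
# All numbers below are made up for illustration.

# Prior over two hypotheses about a coin: fair vs biased toward heads.
prior = {"fair": 0.5, "biased": 0.5}
likelihood_heads = {"fair": 0.5, "biased": 0.8}  # P(heads | hypothesis)

# Observe one head; posterior is prior times likelihood, renormalized.
unnorm = {h: prior[h] * likelihood_heads[h] for h in prior}
total = sum(unnorm.values())
posterior = {h: unnorm[h] / total for h in unnorm}

print(posterior)  # the biased hypothesis gains probability: ~0.385 fair, ~0.615 biased
```

Nothing beyond multiplication and renormalization is needed – which is the point of the maxim.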



  1. Are there really any maxims for Uncertainty Consideration?

    I think one might add…a picture's worth a thousand words; two wrongs do make a right (in formal logic); and the flaw of averages.

    All are not necessarily "uncertainty considerations" but are fundamental considerations in mathematics that certainly frame and color our consideration of uncertainty.
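    The “flaw of averages” point can be made concrete with a toy calculation – the distribution and the convex “loss” below are invented for illustration:

```python
# A toy illustration of the "flaw of averages": plugging the average input
# into a nonlinear function is not the same as averaging the function's
# outputs (Jensen's inequality). Values are invented.

outcomes = [0.0, 10.0]          # X takes each value with probability 1/2
f = lambda x: x ** 2            # a convex "loss"

f_of_average = f(sum(outcomes) / len(outcomes))             # f(E[X]) = 25.0
average_of_f = sum(f(x) for x in outcomes) / len(outcomes)  # E[f(X)] = 50.0

print(f_of_average, average_of_f)  # 25.0 50.0 - the average hides the risk
```

    Reporting only the average input understates the average loss by half here, which is exactly how the flaw of averages bites.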

  2. ahuri says:

    How about this one (found on John Cook's blog) :

    “I beseech you in the bowels of Christ think it possible you may be mistaken.”

    [Oliver Cromwell, in a letter to the general assembly of the Church of Scotland. 1650.]

    In my opinion, it somehow links to A. J. P. Taylor's "Extreme views weakly held" that Prof. Gelman wrote about…

  3. george says:

    If standard use of e.g. linear regression is to be allowed by your maxims, Number 4 seems untenable. (Is that your intention?)

    I disagree about Number 5; while quantitative tools do help us understand uncertainty, there seems no reason for them to "always" go through models.

    Number 3 seems a very blinkered view. And as for 'zoombies', is that what you really meant?

  4. K? O'Rourke says:

    "MaxUnCon" would have been worse.

    Mark – thanks, 7. Always plot how the model ingredients "come together or not".

    George – the linearity in the data model is an empirically testable assumption; sometimes the prior assumptions for, say, the variance parameter may not be.
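    One rough sketch of what “empirically testable” means for the linearity assumption (simulated data, not the promised R solution): fit a straight line, then ask whether the residuals still carry structure, e.g. correlation with a curvature term:

```python
# Sketch: the linearity assumption is empirically testable.
# Fit a line by ordinary least squares, then check whether the residuals
# still correlate with x^2; data and check are invented for illustration.
import random

random.seed(1)
x = [i / 10 for i in range(50)]
y = [2.0 + 3.0 * xi + random.gauss(0, 0.5) for xi in x]  # truly linear data

def simple_ols(x, y):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def corr(u, v):
    """Pearson correlation of two equal-length lists."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v))
    den = (sum((ui - mu) ** 2 for ui in u)
           * sum((vi - mv) ** 2 for vi in v)) ** 0.5
    return num / den

a, b = simple_ols(x, y)
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]

# If the linear model is adequate, the residuals should show little
# relation to a curvature term such as x^2.
print(round(corr(resid, [xi ** 2 for xi in x]), 3))
```

    A prior on a variance parameter has no such direct check against the data, which is where maxim 4 bites.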

    We can only think through representations (signs, models, semiosis) …

    And sorry about the zoombie spelling mistake – it was not on purpose but there was a disclaimer at the top.

    And the maxims were not meant to be consistent with each other, but as loose ways to think about serious statistical issues.

    Hope to post on maxim 6 this weekend – with a course outline for zombies with a suggested exercise and R coded solution.

    Well, hopefully this weekend – but zombies are eternally patient – or so I hope.