What are the most important statistical ideas of the past 50 years?

Many of you have heard of this article (with Aki Vehtari) already—we wrote the first version in 2020, then did some revision for its publication in the Journal of the American Statistical Association.

But the journal is not open access, so there may be people who are interested in reading the article but aren’t aware of it or don’t know how to access it.

Here’s the article [ungated]. It begins:

We review the most important statistical ideas of the past half century, which we categorize as: counterfactual causal inference, bootstrapping and simulation-based inference, overparameterized models and regularization, Bayesian multilevel models, generic computation algorithms, adaptive decision analysis, robust inference, and exploratory data analysis. We discuss key contributions in these subfields, how they relate to modern computing and big data, and how they might be developed and extended in future decades. The goal of this article is to provoke thought and discussion regarding the larger themes of research in statistics and data science.

I really love this paper. Aki and I present our own perspective—that’s unavoidable; indeed, if we didn’t have an interesting point of view, there’d be no reason to write or read the article in the first place—but we also worked hard to give a balanced view, including ideas that we think are important but which we have not worked on or used ourselves.

Also, here’s a talk I gave a couple years ago on this stuff.
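To give a flavor of one of the ideas on the list—bootstrapping and simulation-based inference—here is a minimal sketch of the nonparametric bootstrap, not taken from the paper, just an illustration on made-up data: resample the observed data with replacement many times, recompute the statistic of interest each time, and use the spread of those recomputed values to approximate sampling uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: suppose we want uncertainty for the sample median.
data = rng.exponential(scale=2.0, size=100)

n_boot = 10_000
boot_medians = np.empty(n_boot)
for b in range(n_boot):
    # Resample with replacement, same size as the original sample.
    resample = rng.choice(data, size=data.size, replace=True)
    boot_medians[b] = np.median(resample)

# A percentile interval from the bootstrap distribution approximates
# the sampling uncertainty of the median.
lo, hi = np.percentile(boot_medians, [2.5, 97.5])
print(f"median = {np.median(data):.2f}, "
      f"95% bootstrap interval = ({lo:.2f}, {hi:.2f})")
```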
