First Wikipedia, then the Times (featuring Yair Ghitza), now Slashdot (featuring Allen “PyStan” Riddell). Just get us on Gawker and we’ll have achieved total media saturation.
Next step, backlash. Has Stan jumped the shark? Etc. (We'd love to have a "jump the shark" MCMC algorithm, but I don't know if or when we'll get there. I'm still struggling to get wedge sampling to work.)
I think Metropolis jumped the shark when it teamed up with Gibbs to try to get wider distribution.
Interesting paper referenced on the Stan Wikipedia page: http://onlinelibrary.wiley.com/doi/10.1002/pst.1595/pdf
It's an actual survey of statisticians and the difficulties they perceive in adopting Bayesian approaches in their work.
A number of glib comments about what novices should know and be able to discuss:
“[if] non-informative priors are to be used, interpretation of the results would need to be understood by the team,
along with assumptions and potential drawbacks”
“explain what the return will be on this investment [adopting Bayes]”
“good understanding of the pros and cons of Bayesian as well as frequentist statistics”
As if these are things novices/early adopters could easily pick up from most of the available literature!
Then they point to some arguably useful (expert, though not well known?) materials on those questions.
“Examples for integrating the best of Bayesian and frequentist ideas can be found in [47–52]. For example, Little advocated ‘calibrated Bayes’, a concept that builds upon work by Box, Rubin, and Gelman.”
The paper was obviously written by a committee, but it's well worth the read.
I can see the Buzzfeed headline now. "QUIZ: Which Bayesian computational tool are you?" with selections such as numerical integration, grid computation, MCMC, HMC, RHMCMC…
Someone make this a reality!
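For anyone who hasn't met the quiz options: grid computation is the simplest of the bunch. Here's a minimal sketch, using a hypothetical coin-flip example (6 heads in 9 tosses, flat prior) that is not from the post itself:

```python
import numpy as np

# Grid approximation of the posterior for a coin's heads-probability theta,
# after observing 6 heads in 9 flips, with a flat prior.
grid = np.linspace(0, 1, 1001)           # candidate values of theta
prior = np.ones_like(grid)               # flat (uniform) prior
likelihood = grid**6 * (1 - grid)**3     # Bernoulli likelihood for 6/9 heads
unnorm = prior * likelihood              # unnormalized posterior on the grid
posterior = unnorm / unnorm.sum()        # normalize so weights sum to 1

# Posterior mean; analytically this is the Beta(7, 4) mean, 7/11 ~= 0.636
mean = posterior @ grid
print(mean)
```

The same posterior is what MCMC or HMC would sample from; the grid version just stops scaling once theta has more than a few dimensions, which is why the fancier quiz answers exist.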
“jump-the-shark” seems like the perfect name for an algorithm for multi-modal posteriors.
I had a paper using Stan reviewed at Science, which we've revised and are submitting to Nature. Hopefully it goes through and gets Stan some more publicity.