(This post is by Yuling, not Andrew, though many of the ideas originated with Andrew.) This post is intended to advertise our new preprint Adaptive Path Sampling in Metastable Posterior Distributions by Collin, Aki, Andrew, and me, in which we develop an automated implementation of path sampling and adaptive continuous tempering. But I have recently been reading a writing book […]


## How good is the Bayes posterior for prediction really?

It might not be common courtesy on this blog to comment on a very recently arXiv-ed paper. But I have seen two copies of this paper, entitled “How good is the Bayes posterior in deep neural networks really,” left on the tray of the department printer during the past weekend, so I cannot overstate the popularity of […]

## “Machine Learning Under a Modern Optimization Lens” Under a Bayesian Lens

I (Yuling) read this new book, Machine Learning Under a Modern Optimization Lens (by Dimitris Bertsimas and Jack Dunn), after I grabbed it from Andrew’s desk. Apparently machine learning is now such a wide-ranging area that we have to access it through some sub-manifold so as to evade the curse of dimensionality, and it is the same […]