Distortion estimates for approximate Bayesian inference

Published in Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence, 2020

Current literature on posterior approximation for Bayesian inference offers many alternative methods. Does our chosen approximation scheme work well on the observed data? The best existing generic diagnostic tools treat this kind of question by looking at performance averaged over data space, or otherwise lack diagnostic detail. However, if the approximation is bad for most data but good at the observed data, then we may discard a useful approximation. We give graphical diagnostics for posterior approximation at the observed data. We estimate a "distortion map" that acts on univariate marginals of the approximate posterior to move them closer to the exact posterior, without recourse to the exact posterior.

Download paper here
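
The sketch below is only a hypothetical illustration of the general idea of a one-dimensional distortion map, not the estimation procedure from the paper: a monotone transform, built here from empirical quantiles of an approximate marginal and a stand-in reference sample, is applied to approximate-posterior draws to move them toward a corrected marginal. All names and the reference sample are assumptions for the toy example.

```python
# Hypothetical sketch (not the paper's algorithm): a univariate "distortion map"
# represented as a monotone quantile-to-quantile transform, applied to samples
# from an approximate posterior marginal.
import numpy as np

def fit_distortion_map(approx_samples, reference_samples, n_grid=101):
    """Fit a monotone map sending quantiles of the approximate marginal to
    quantiles of a reference sample (a stand-in for whatever the diagnostic
    estimates, since the exact posterior is unavailable)."""
    probs = np.linspace(0.005, 0.995, n_grid)
    x = np.quantile(approx_samples, probs)      # approximate-marginal quantiles
    y = np.quantile(reference_samples, probs)   # reference quantiles
    return lambda t: np.interp(t, x, y)         # piecewise-linear monotone map

# Toy illustration: the approximate marginal is too narrow and shifted.
rng = np.random.default_rng(0)
approx = rng.normal(0.0, 0.8, size=5000)        # draws from approximate marginal
reference = rng.normal(0.3, 1.0, size=5000)     # stand-in "corrected" target

distort = fit_distortion_map(approx, reference)
corrected = distort(approx)                     # distorted (corrected) marginal draws
print(corrected.mean(), corrected.std())        # roughly 0.3 and 1.0
```

Comparing the corrected and original marginals graphically (for example, overlaid histograms or quantile-quantile plots) is the kind of visual check the diagnostics are aimed at.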