Efficient Multiscale Sampling from Products of Gaussian Mixtures

Part of Advances in Neural Information Processing Systems 16 (NIPS 2003)


Authors

Alexander Ihler, Erik Sudderth, William Freeman, Alan Willsky

Abstract

The problem of approximating the product of several Gaussian mixture distributions arises in a number of contexts, including the nonparametric belief propagation (NBP) inference algorithm and the training of product of experts models. This paper develops two multiscale algorithms for sampling from a product of Gaussian mixtures, and compares their performance to existing methods. The first is a multiscale variant of previously proposed Monte Carlo techniques, with comparable theoretical guarantees but improved empirical convergence rates. The second makes use of approximate kernel density evaluation methods to construct a fast approximate sampler, which is guaranteed to sample points to within a tunable parameter ε of their true probability. We compare both multiscale samplers on a set of computational examples motivated by NBP, demonstrating significant improvements over existing methods.
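
To make the problem concrete, the sketch below (in Python/NumPy; the names product_mixture and sample_product are illustrative, not from the paper) shows the exact baseline that the multiscale algorithms are designed to accelerate. The product of two 1-D Gaussian mixtures with d1 and d2 components is itself a Gaussian mixture with d1·d2 components, each obtained from a closed-form pairwise product of Gaussians; a product of M such mixtures therefore has on the order of d^M components, which is why exact sampling quickly becomes intractable and approximate samplers are needed.

```python
import numpy as np

def product_mixture(w1, m1, v1, w2, m2, v2):
    """Exact product of two 1-D Gaussian mixtures (weights w, means m, variances v).

    Uses the identity N(x; m1, v1) * N(x; m2, v2) = z * N(x; m, v), where
    v = (1/v1 + 1/v2)^-1, m = v*(m1/v1 + m2/v2), and z = N(m1; m2, v1 + v2).
    """
    v = 1.0 / (1.0 / v1[:, None] + 1.0 / v2[None, :])                 # combined variances
    m = v * (m1[:, None] / v1[:, None] + m2[None, :] / v2[None, :])   # combined means
    s = v1[:, None] + v2[None, :]
    # z_ij measures how strongly component i of mixture 1 overlaps component j of mixture 2
    z = np.exp(-0.5 * (m1[:, None] - m2[None, :]) ** 2 / s) / np.sqrt(2 * np.pi * s)
    w = w1[:, None] * w2[None, :] * z
    w = w / w.sum()                                                   # renormalize the weights
    return w.ravel(), m.ravel(), v.ravel()

def sample_product(w, m, v, n, rng=None):
    """Draw n exact samples from the (flattened) product mixture."""
    rng = rng or np.random.default_rng()
    idx = rng.choice(len(w), size=n, p=w)          # pick a component by weight
    return rng.normal(m[idx], np.sqrt(v[idx]))    # then sample that Gaussian

# Example: product of two 2-component mixtures (parameters are arbitrary).
w1, m1, v1 = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([0.5, 0.5])
w2, m2, v2 = np.array([0.3, 0.7]), np.array([0.0, 2.0]), np.array([1.0, 0.25])
w, m, v = product_mixture(w1, m1, v1, w2, m2, v2)
samples = sample_product(w, m, v, n=1000)
```

Because the component count grows multiplicatively with each additional mixture in the product, this exact construction is only practical for small products; the paper's multiscale Monte Carlo and ε-approximate samplers avoid enumerating all d^M components.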