Posterior Matching for Arbitrary Conditioning

Part of Advances in Neural Information Processing Systems 35 (NeurIPS 2022) Main Conference Track

Authors

Ryan Strauss, Junier B Oliva

Abstract

Arbitrary conditioning is an important problem in unsupervised learning, where we seek to model the conditional densities $p(x_u \mid x_o)$ that underlie some data, for all possible non-intersecting subsets $o, u \subseteq \{1, \dots, d\}$. However, the vast majority of density estimation methods focus only on modeling the joint distribution $p(x)$, in which important conditional dependencies between features are opaque. We propose a simple and general framework, coined Posterior Matching, that enables Variational Autoencoders (VAEs) to perform arbitrary conditioning, without modification to the VAE itself. Posterior Matching applies to the numerous existing VAE-based approaches to joint density estimation, thereby circumventing the specialized models required by previous approaches to arbitrary conditioning. We find that Posterior Matching is comparable or superior to current state-of-the-art methods for a variety of tasks with an assortment of VAEs (e.g., discrete, hierarchical, VaDE).
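To make the setting concrete, the following is a minimal PyTorch sketch of one plausible way a posterior-matching objective could be attached to a pretrained, frozen VAE: an auxiliary encoder that sees only the observed features $x_o$ is trained to match the VAE's posterior over latents computed from the full $x$. The module and function names (`PartialEncoder`, `posterior_matching_loss`) and the mask-concatenation input handling are illustrative assumptions based on the abstract's description, not the authors' exact architecture or training procedure.

```python
import torch
import torch.nn as nn


class PartialEncoder(nn.Module):
    """Hypothetical encoder that conditions only on the observed features x_o.

    Missing entries are zeroed out and the observation mask is appended to the
    input (an assumption, not necessarily the paper's architecture).
    """

    def __init__(self, x_dim, z_dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * z_dim),  # outputs mean and log-variance
        )

    def forward(self, x, mask):
        h = self.net(torch.cat([x * mask, mask], dim=-1))
        mu, logvar = h.chunk(2, dim=-1)
        return mu, logvar


def posterior_matching_loss(vae_encoder, partial_encoder, x, mask):
    """KL( q(z | x) || q(z | x_o) ): push the partially observed posterior
    toward the posterior of the pretrained VAE, which is left unchanged."""
    with torch.no_grad():  # the VAE itself is not modified
        mu_full, logvar_full = vae_encoder(x)
    mu_part, logvar_part = partial_encoder(x, mask)
    # Closed-form KL divergence between two diagonal Gaussians.
    var_full, var_part = logvar_full.exp(), logvar_part.exp()
    kl = 0.5 * (
        logvar_part - logvar_full
        + (var_full + (mu_full - mu_part) ** 2) / var_part
        - 1.0
    )
    return kl.sum(dim=-1).mean()
```

Under this reading, arbitrary conditioning at inference time would amount to sampling $z \sim q(z \mid x_o)$ from the partial encoder and decoding with the unmodified VAE decoder to obtain draws of the unobserved features $x_u$.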