Rapid Convergence of the Unadjusted Langevin Algorithm: Isoperimetry Suffices

Part of: Advances in Neural Information Processing Systems 32 (NIPS 2019) pre-proceedings

Authors

Santosh Vempala, Andre Wibisono

Conference Event Type: Poster

Abstract

We study the Unadjusted Langevin Algorithm (ULA) for sampling from a probability distribution $\nu = e^{-f}$ on $\mathbb{R}^n$. We prove a convergence guarantee in Kullback-Leibler (KL) divergence assuming $\nu$ satisfies a log-Sobolev inequality and $f$ has a bounded Hessian. Notably, we do not assume convexity or bounds on higher derivatives. We also prove convergence guarantees in Rényi divergence of order $q > 1$ assuming the limit of ULA satisfies either a log-Sobolev or a Poincaré inequality.
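For concreteness, ULA is the Euler-Maruyama discretization of the Langevin diffusion: $x_{k+1} = x_k - h \nabla f(x_k) + \sqrt{2h}\, z_k$ with $z_k \sim N(0, I)$ and step size $h > 0$. Below is a minimal NumPy sketch of this iteration; the function name `ula`, its signature, and the standard-Gaussian example target are illustrative assumptions, not from the paper.

    import numpy as np

    def ula(grad_f, x0, step, n_iters, rng=None):
        """Unadjusted Langevin Algorithm (illustrative sketch).

        Iterates x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * z_k,
        where z_k is standard Gaussian noise. Returns the full trajectory.
        """
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float)
        traj = [x.copy()]
        for _ in range(n_iters):
            noise = rng.standard_normal(x.shape)
            x = x - step * grad_f(x) + np.sqrt(2.0 * step) * noise
            traj.append(x.copy())
        return np.array(traj)

    # Example target: nu = e^{-f} with f(x) = ||x||^2 / 2 (standard Gaussian),
    # so grad_f(x) = x.
    samples = ula(grad_f=lambda x: x, x0=np.zeros(2), step=0.01, n_iters=5000)

Because ULA omits the Metropolis accept/reject correction, its iterates converge to a biased limit distribution that differs from $\nu$ for any fixed step size, which is why the paper's Rényi guarantees are stated with respect to the limit of ULA.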