NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019 at the Vancouver Convention Center
Paper ID: 6392
Title: Generative Modeling by Estimating Gradients of the Data Distribution

The paper proposes to run Langevin dynamics in the data space (as opposed to the latent space) of a deep generative model as a means of sampling from the data distribution. This reduces the difficult problem of estimating the data distribution to the somewhat easier problem of estimating its gradients, i.e., the score, which is estimated with variants of score matching. The paper builds mainly on recent work on score matching with random projections (sliced score matching). The result is a new generative model whose sample quality is comparable to that of GANs while avoiding the adversarial training paradigm. This is a strong contribution. As a minor point of criticism, the reviewers wished for a more thorough analysis of the effect of the extra noise added to the data, which could be provided for the camera-ready version.
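
For concreteness, the core sampling procedure fits in a few lines. Below is a minimal sketch of Langevin dynamics in data space, assuming a pre-trained score network score_fn(x) that approximates the gradient of the log data density; the function name, step size, and iteration count are illustrative choices, not values from the paper:

    import torch

    def langevin_sample(score_fn, x, n_steps=100, step_size=2e-5):
        # Iterate x_{t+1} = x_t + (eps / 2) * score(x_t) + sqrt(eps) * z_t,
        # where z_t is standard Gaussian noise and eps is the step size.
        for _ in range(n_steps):
            noise = torch.randn_like(x)
            x = x + 0.5 * step_size * score_fn(x) + (step_size ** 0.5) * noise
        return x

In the paper this update is annealed over a sequence of decreasing noise levels, which is precisely where the reviewers' question about the effect of the added noise arises.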