Sun Dec 8 through Sat Dec 14, 2019, at the Vancouver Convention Center
This paper investigates the convergence of the unadjusted Langevin algorithm (ULA) under a log-Sobolev inequality condition and establishes convergence with respect to the Rényi divergence, which generalizes existing results for the KL divergence. The analysis of convergence in Rényi divergence is novel and interesting.

However, Theorems 1 and 2 require further discussion comparing them with existing work. In particular, Reviewer 3 pointed out the following paper: Ma, Yi-An, et al. "Is There an Analog of Nesterov Acceleration for MCMC?" arXiv preprint arXiv:1902.00996 (2019), which shows exponential convergence under the log-Sobolev inequality in its Sec. 3.1. In addition, the following paper also uses the log-Sobolev inequality to show exponential convergence of the relative entropy and the convergence of ULA: Raginsky, Rakhlin, Telgarsky, "Non-convex Learning via Stochastic Gradient Langevin Dynamics: A Nonasymptotic Analysis," COLT 2017. Hence, we recommend that the authors clarify the novelty of Theorems 1 and 2 by situating their contribution appropriately in the literature.

Despite this issue, the paper contains a novel result on the convergence of ULA with respect to the KL and Rényi divergences.
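For context, the algorithm under review can be sketched in a few lines. The following is a minimal, hypothetical implementation of the ULA iterate x_{k+1} = x_k − η ∇f(x_k) + √(2η) ξ_k with ξ_k ~ N(0, I), targeting a density proportional to exp(−f); the function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def ula(grad_f, x0, step=0.01, n_iters=1000, rng=None):
    """Run the unadjusted Langevin algorithm.

    grad_f : gradient of the potential f (target density ∝ exp(-f))
    x0     : initial point (array-like)
    step   : step size η (must be small for the bias to be small)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        # Euler-Maruyama discretization of the Langevin diffusion:
        # gradient step plus Gaussian noise scaled by sqrt(2 * step).
        x = x - step * grad_f(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
    return x

# Example: sample from a standard Gaussian, where f(x) = ||x||^2 / 2
# so grad_f(x) = x. Running many independent chains gives samples
# whose mean is near 0 and variance near 1 (up to discretization bias).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = np.array(
        [ula(lambda x: x, np.zeros(1), step=0.05, n_iters=400, rng=rng)[0]
         for _ in range(500)]
    )
    print(samples.mean(), samples.var())
```

Note that ULA never applies a Metropolis correction, which is precisely why its stationary distribution is biased and why non-asymptotic bounds in KL or Rényi divergence, as studied here, are of interest.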