Riemannian SVRG: Fast Stochastic Optimization on Riemannian Manifolds

Part of Advances in Neural Information Processing Systems 29 (NIPS 2016)


Authors

Hongyi Zhang, Sashank J. Reddi, Suvrit Sra

Abstract

We study optimization of finite sums of geodesically smooth functions on Riemannian manifolds. Although variance reduction techniques for optimizing finite sums have witnessed tremendous attention in recent years, existing work is limited to vector space problems. We introduce Riemannian SVRG (RSVRG), a new variance-reduced Riemannian optimization method. We analyze RSVRG for both geodesically convex and nonconvex (smooth) functions. Our analysis reveals that RSVRG inherits the advantages of the usual SVRG method, but with factors depending on the curvature of the manifold that influence its convergence. To our knowledge, RSVRG is the first provably fast stochastic Riemannian method. Moreover, our paper presents the first non-asymptotic complexity analysis (novel even for the batch setting) for nonconvex Riemannian optimization. Our results have several implications; for instance, they offer a Riemannian perspective on variance-reduced PCA, which promises a short, transparent convergence analysis.
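
To make the variance-reduction scheme concrete, here is a minimal Python sketch of an RSVRG-style loop on the unit sphere, one of the simplest Riemannian manifolds, applied to the leading-eigenvector (PCA) problem mentioned above. This is an illustrative assumption, not the authors' implementation: the paper's analysis uses the exact exponential map and parallel transport, whereas this sketch substitutes a projection retraction and projection-based vector transport; the name rsvrg_sphere and all parameter choices are hypothetical.

import numpy as np

def rsvrg_sphere(grads, x0, n, eta=0.01, epochs=10, inner=50, rng=None):
    """Sketch of Riemannian SVRG on the unit sphere S^{d-1}.

    grads: list of n callables, each returning the Euclidean gradient of f_i.
    Riemannian gradients are obtained by projecting onto the tangent space;
    exponential map and parallel transport are approximated by a projection
    retraction and projection-based vector transport (an assumption here).
    """
    rng = rng or np.random.default_rng(0)
    # Tangent-space projection at a unit vector x.
    proj = lambda x, g: g - (x @ g) * x
    # Projection retraction: step in the tangent direction, renormalize.
    retract = lambda x, v: (x + v) / np.linalg.norm(x + v)

    x_ref = x0
    for _ in range(epochs):
        # Full Riemannian gradient at the reference (snapshot) point.
        full = proj(x_ref, sum(g(x_ref) for g in grads) / n)
        x = x_ref
        for _ in range(inner):
            i = rng.integers(n)
            # Correction term, computed in the tangent space at x_ref ...
            corr = proj(x_ref, grads[i](x_ref)) - full
            # ... then "transported" to the tangent space at x by projection.
            v = proj(x, grads[i](x)) - proj(x, corr)
            x = retract(x, -eta * v)
        x_ref = x
    return x_ref

# Demo (hypothetical): leading eigenvector of A = sum_i a_i a_i^T,
# i.e. minimize f(x) = (1/n) sum_i -(a_i^T x)^2 over the sphere.
d, n = 20, 100
rng = np.random.default_rng(1)
a = rng.normal(size=(n, d))
grads = [lambda x, ai=ai: -2.0 * ai * (ai @ x) for ai in a]
x0 = rng.normal(size=d)
x0 /= np.linalg.norm(x0)
x_star = rsvrg_sphere(grads, x0, n)

Projection-based transport is cheap to compute but, unlike parallel transport, not isometric, so this sketch should be read as a practical approximation of the scheme the paper analyzes rather than the method itself.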