Part of Advances in Neural Information Processing Systems 29 (NIPS 2016)
Srinadh Bhojanapalli, Behnam Neyshabur, Nati Srebro
We show that there are no spurious local minima in the non-convex factorized parametrization of low-rank matrix recovery from incoherent linear measurements. With noisy measurements, we show that all local minima are very close to a global optimum. Together with a curvature bound at saddle points, this yields a polynomial-time global convergence guarantee for stochastic gradient descent from random initialization.
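The setting above can be sketched in a toy experiment; this is an illustrative NumPy implementation under assumed choices (Gaussian measurement matrices A_i, a symmetric PSD ground truth X* = U*U*^T, plain full-batch gradient descent, and the dimensions and step size below), not the authors' code or experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 8, 2, 200  # ambient dimension, rank, number of measurements

# Ground-truth low-rank PSD matrix X* = U* U*^T (scaled so spectrum is O(1))
U_star = rng.standard_normal((n, r)) / np.sqrt(n)
X_star = U_star @ U_star.T

# Linear measurements b_i = <A_i, X*> with random Gaussian A_i
A = rng.standard_normal((m, n, n))
b = np.einsum('mij,ij->m', A, X_star)

def loss(U):
    # f(U) = (1/2m) sum_i (<A_i, U U^T> - b_i)^2
    resid = np.einsum('mij,ij->m', A, U @ U.T) - b
    return 0.5 / m * np.sum(resid ** 2)

def grad(U):
    # grad f(U) = (1/m) sum_i resid_i (A_i + A_i^T) U
    resid = np.einsum('mij,ij->m', A, U @ U.T) - b
    G = np.einsum('m,mij->ij', resid, A) / m
    return (G + G.T) @ U

# Gradient descent on the factorized objective from RANDOM initialization
U = rng.standard_normal((n, r)) / np.sqrt(n)
eta = 0.05
for _ in range(2000):
    U -= eta * grad(U)

rel_err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
print(loss(U), rel_err)  # both should shrink toward 0
```

With enough Gaussian measurements the result suggests why random initialization suffices here: despite non-convexity in U, descent does not get trapped, consistent with the absence of spurious local minima claimed in the abstract. The step size and iteration count are untuned illustrative values.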