NeurIPS 2020

Nonasymptotic Guarantees for Spiked Matrix Recovery with Generative Priors


Meta Review

The reviewers appreciate the analysis showing that the landscape of the optimization problem has tractable structure despite the complexity of the model. This is similar to the Hand-Voroninski result (extended here to spiked PCA) but remains one of the most impressive theoretical phenomena in optimization for deep networks. Reviewers and the meta-reviewer were concerned that this does not yet yield a proof that gradient-based optimization converges to the global optimum in polynomial time, but we expect such a result to follow. The empirical evaluation is limited and the problem is somewhat contrived, but perhaps the authors or others in the community can find a good application for this type of structure. There was debate about whether the authors oversell the absence of a "statistical-to-computational gap," but the meta-reviewer believes this is established in an average-case sense. For these reasons the paper is a great fit for NeurIPS.