NeurIPS 2020

Least Squares Regression with Markovian Data: Fundamental Limits and Algorithms


Meta Review

The paper makes strong theoretical contributions to the understanding of SGD under Markovian data for the least squares regression problem. After discussion, the reviewers agreed on the novelty of the lower bound and the optimality of SGD-DD. The analysis of SGD with experience replay, albeit in a very narrow setting, is considered interesting. In the final version, please take the reviewer comments into account. In particular, the following should be addressed:
- A detailed discussion of, and comparison to, the prior information-theoretic lower bounds under Markovian noise for general convex problems established in [Duchi et al., 2012]
- A clear explanation of the noise model
- Additional numerical experiments on a more complicated example