NeurIPS 2020

Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model

Meta Review

The submission considers noiseless (or low-noise) linear regression with a non-linear transformation of the input data, and shows that in this setting SGD achieves faster convergence rates. This is a very nice contribution with applicability to important problems, as the authors note in their feedback. We urge the authors to incorporate the points they made in response to the reviews.
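The setting in question can be illustrated with a small simulation: noiseless targets generated through a non-linear feature map, fit by plain constant-step SGD. This is a hypothetical sketch for intuition only; the feature construction (random cosine features), dimensions, and step size below are illustrative choices, not the paper's actual model or rates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: n samples, d raw inputs, D non-linear features.
n, d, D = 200, 5, 50
X = rng.standard_normal((n, d))
W = rng.standard_normal((d, D))            # random projection directions


def phi(x):
    """Non-linear transformation of the input (cosine features)."""
    return np.cos(x @ W)


# Noiseless linear model in feature space: targets are exactly realizable.
theta_star = rng.standard_normal(D)
y = phi(X) @ theta_star

# Constant-step single-sample SGD on the squared loss. In the noiseless
# (interpolation) regime there is no gradient-noise floor at the optimum,
# which is what allows faster-than-usual convergence.
theta = np.zeros(D)
step = 0.01
for t in range(20000):
    i = rng.integers(n)
    f = phi(X[i:i + 1]).ravel()
    theta -= step * (f @ theta - y[i]) * f

mse = np.mean((phi(X) @ theta - y) ** 2)   # training error shrinks toward 0
```

With noisy targets the same constant-step scheme would stall at a noise-dependent error floor; here the residual keeps contracting, which is the qualitative behavior behind the submission's improved rates.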