NeurIPS 2020
### A Dynamical Central Limit Theorem for Shallow Neural Networks

### Meta Review

The paper provides CLT-like results for the dynamics of single-hidden-layer, wide neural networks in the mean-field limit. The authors also show that, under certain conditions, the long-time fluctuations can be controlled by a Monte Carlo-type resampling error. The reviewers assessed the finite-width analysis positively and acknowledged the strength of some of the technical contributions. They did, however, raise a variety of concerns regarding the asymptotic nature of the results (in both n and t), the assumptions on D̂, and the lack of results covering discretization. While some of these concerns were alleviated by the authors’ response, the more critical reviewers maintained their scores, and one positive reviewer slightly decreased theirs from 8 to 7.

I agree with the reviewers that CLT-type results for finite width are indeed interesting; in particular, compared with prior work such as [16], the dependence on time in these results is not exponential. I also agree with R2 that “the asymptotic analysis leaves open a number of questions that cannot be answered unless the analysis is refined”. Relative to the other papers I am handling, I believe this paper is slightly above the acceptance threshold. I strongly urge the authors to address the excellent points raised by the reviewers in their final manuscript.