NeurIPS 2020

Deep learning versus kernel learning: an empirical study of loss landscape geometry and the time evolution of the Neural Tangent Kernel

Meta Review

The reviews for this paper were overall positive. The paper presents an empirical inquiry into the loss landscape of the data-dependent neural tangent kernel. The authors examine the dynamics of the kernel and the loss landscape, comparing the learned kernel to corresponding neural networks. The reviewers appreciated the evaluation of the so-called 'parent-child spawning' phenomenon and of the approximation accuracy of data-dependent neural tangent kernels. The authors made a laudable effort to put common preconceptions about neural tangent kernels to the test through an extensive set of numerical experiments, leading to interesting empirical observations. We recommend that the authors carefully read the reviewers' comments and suggestions and take them into account while preparing the camera-ready final version. The authors should pay particular attention to correcting several misleading terms and to nuancing several claims. If the final version is sufficiently nuanced and polished, the paper will be an important and timely contribution to the field. Accept.