NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 4222
Title: How degenerate is the parametrization of neural networks with the ReLU activation function?


The paper provides interesting new theoretical results on inverse stability. As with much current theoretical work on neural networks, it is unclear whether this research direction will prove fruitful for practical insights, and this uncertainty is reflected in the reviews. However, the paper has sufficient redeeming qualities to merit acceptance: the results are clean, the paper is very well written, and the rebuttal gives substantive indications of how the results can have an impact.

Further remarks:
* I agree with reviewer #2 that the claim about extensions to deeper networks seems unnecessary and detracts from the paper. You might replace it with a discussion of the potential difficulties of such an extension.
* For future work, I would suggest to the authors that establishing the importance of this research direction should take higher priority than generalizing the results to deeper networks.