NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 4561
Title: Online Normalization for Training Neural Networks

The authors propose a new normalization technique for training deep networks, called online normalization, as an alternative to batch normalization, and provide both theoretical analysis and experimental results for the proposed approach. The topic is likely to be of broad interest to the NeurIPS audience given the prevalence of batch normalization in deep learning. All four reviewers found significant merit in the ideas in the paper, but they also raised a number of specific technical questions (e.g., from R1 and R4) that should be addressed in the final version of the paper.