Part of Advances in Neural Information Processing Systems 31 (NeurIPS 2018)
Yixi Xu, Xiao Wang
This paper presents a general framework for norm-based capacity control of $L_{p,q}$ weight-normalized deep neural networks. We establish an upper bound on the Rademacher complexity of this family. For $L_{p,q}$ normalization with $q \le p^*$, where $1/p + 1/p^* = 1$, we discuss properties of a width-independent capacity control, which depends on the depth only through a square-root term. We further analyze the approximation properties of $L_{p,q}$ weight-normalized deep neural networks. In particular, for an $L_{1,\infty}$ weight-normalized network, the approximation error can be controlled by the $L_1$ norm of the output layer, and the corresponding generalization error depends on the architecture only through the square root of the depth.
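As an illustration of the central quantity, here is a minimal NumPy sketch of the $L_{p,q}$ norm of a weight matrix and a scaling-based projection onto the corresponding norm ball. The function names, the row-wise convention (each row holding one hidden unit's incoming weights), and the projection-by-rescaling step are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def lpq_norm(W, p, q):
    """L_{p,q} norm of W: the l_q norm of the vector of l_p norms of W's rows.

    Assumed convention: each row of W holds one hidden unit's incoming
    weights, so the inner l_p norm is per unit and the outer l_q norm
    aggregates across units.
    """
    row_norms = np.sum(np.abs(W) ** p, axis=1) ** (1.0 / p)
    if np.isinf(q):  # q = inf gives the max, matching L_{p,inf}
        return row_norms.max()
    return np.sum(row_norms ** q) ** (1.0 / q)

def project_lpq(W, p, q, c=1.0):
    """Rescale W so that ||W||_{p,q} <= c."""
    norm = lpq_norm(W, p, q)
    return W if norm <= c else W * (c / norm)

# L_{1,inf} example: the max over units of the l_1 norm of incoming weights.
W = np.random.randn(4, 3)
print(lpq_norm(W, p=1, q=np.inf))
W_proj = project_lpq(W, p=1, q=np.inf, c=1.0)
print(lpq_norm(W_proj, p=1, q=np.inf))  # <= 1.0
```

Because the $L_{p,q}$ norm is absolutely homogeneous, rescaling by $c/\|W\|_{p,q}$ when the constraint is violated yields a matrix whose norm is exactly $c$, which is why a single multiplicative factor suffices for this projection.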