Minimax Bounds for Generalized Linear Models

Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Kuan-Yun Lee, Thomas Courtade
We establish a new class of minimax prediction error bounds for generalized linear models. Our bounds significantly improve on previous results when the design matrix is poorly structured, including natural cases where the matrix is wide or does not have full column rank. Beyond the typical $L_2$ risks, we study a class of entropic risks that recovers the usual $L_2$ prediction and estimation risks, and we demonstrate that a tight analysis of Fisher information can uncover an underlying structural dependence on the spectrum of the design matrix. The minimax approach we take differs from the traditional metric entropy approach and can be applied to many other settings.
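For context, one standard way to formalize the minimax $L_2$ prediction risk in this setting is sketched below; the notation ($X$, $\Theta$, $\mathfrak{M}$) and the exact formulation are illustrative assumptions on our part, not necessarily the paper's definitions.

$$
\mathfrak{M}(X, \Theta) \;=\; \inf_{\hat{\theta}} \, \sup_{\theta \in \Theta} \, \mathbb{E}_{\theta}\!\left[ \frac{1}{n} \big\| X\hat{\theta}(y) - X\theta \big\|_2^2 \right],
$$

where $X \in \mathbb{R}^{n \times d}$ is the design matrix with rows $x_i^\top$, the observations $y = (y_1, \dots, y_n)$ are drawn independently from a generalized linear model with natural parameters $x_i^\top \theta$, the infimum runs over all estimators $\hat{\theta}$, and $\Theta \subseteq \mathbb{R}^d$ is the parameter set. Note that when $X$ is wide or rank-deficient, $\theta$ itself need not be identifiable, yet $X\theta$ remains estimable; this is one reason the prediction risk, rather than the estimation risk $\mathbb{E}\|\hat{\theta} - \theta\|_2^2$, is the natural object for poorly structured designs.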