Effective Dimension and Generalization of Kernel Learning

Part of Advances in Neural Information Processing Systems 15 (NIPS 2002)


Authors

Tong Zhang

Abstract

We investigate the generalization performance of some learning problems in Hilbert function spaces. We introduce a concept of scale-sensitive effective data dimension, and show that it characterizes the convergence rate of the underlying learning problem. Using this concept, we can naturally extend results for parametric estimation problems in finite dimensional spaces to non-parametric kernel learning methods. We derive upper bounds on the generalization performance and show that the resulting convergence rates are optimal under various circumstances.
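To make the central concept concrete, the following is a hedged sketch of one standard form of scale-sensitive effective dimension used in kernel learning; the paper's exact definition may differ, and the eigenvalues λ_i and scale parameter λ below are notational assumptions, not taken from the abstract. For a kernel operator with eigenvalues λ_1 ≥ λ_2 ≥ ⋯ ≥ 0, the effective dimension at scale λ > 0 is commonly written as

\[
d(\lambda) \;=\; \sum_{i \ge 1} \frac{\lambda_i}{\lambda_i + \lambda},
\]

which, roughly speaking, counts the eigendirections whose eigenvalues are large relative to λ. Heuristically, generalization bounds then scale with d(λ)/n for sample size n plus an approximation term depending on λ, recovering the familiar parametric d/n rate when the function space is finite dimensional and yielding non-parametric rates otherwise.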