Part of Advances in Neural Information Processing Systems 16 (NIPS 2003)
Jean-Yves Audibert, Olivier Bousquet
There exist many different generalization error bounds for classification. Each of these bounds improves on the others in certain situations. Our goal is to combine these different improvements into a single bound. In particular, we combine the PAC-Bayes approach introduced by McAllester [1], which is interesting for averaging classifiers, with the optimal union bound provided by the generic chaining technique developed by Fernique and Talagrand [2]. This combination is quite natural since the generic chaining is based on the notion of majorizing measures, which can be considered as priors on the set of classifiers, and such priors also arise in the PAC-Bayesian setting.
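To make the role of the prior concrete, the following is a sketch of one standard form of a PAC-Bayes bound in the spirit of McAllester [1]; the notation (prior \(\pi\), posterior \(\rho\), risks \(R\) and \(\hat{R}_n\)) is introduced here for illustration, constants differ across variants, and this is not the bound derived in the paper itself. For any prior \(\pi\) on the set of classifiers, fixed before seeing the data, with probability at least \(1-\delta\) over an i.i.d. sample of size \(n\), simultaneously for all posteriors \(\rho\),

\[
\mathbb{E}_{f \sim \rho}\, R(f)
\;\le\;
\mathbb{E}_{f \sim \rho}\, \hat{R}_n(f)
\;+\;
\sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln \frac{2\sqrt{n}}{\delta}}{2n}},
\]

where \(R\) denotes the true risk, \(\hat{R}_n\) the empirical risk, and \(\mathrm{KL}\) the Kullback-Leibler divergence. The key point for the combination described above is that \(\pi\) plays the same structural role as the majorizing measure in the generic chaining argument: both weight the classifiers a priori, and classifiers given more weight pay a smaller complexity penalty.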