Boosted Dyadic Kernel Discriminants

Part of Advances in Neural Information Processing Systems 15 (NIPS 2002)

Authors

Baback Moghaddam, Gregory Shakhnarovich

Abstract

We introduce a novel learning algorithm for binary classification with hyperplane discriminants based on pairs of training points from opposite classes (dyadic hypercuts). This algorithm is further extended to nonlinear discriminants using kernel functions satisfying Mercer's conditions. An ensemble of simple dyadic hypercuts is learned incrementally by means of a confidence-rated version of AdaBoost, which provides a sound strategy for searching through the finite set of hypercut hypotheses. In experiments with real-world datasets from the UCI repository, the generalization performance of the hypercut classifiers was found to be comparable to that of SVMs and k-NN classifiers. Furthermore, the computational cost of classification (at run time) was found to be similar to, or better than, that of SVMs. Similarly to SVMs, boosted dyadic kernel discriminants tend to maximize the margin (via AdaBoost). In contrast to SVMs, however, we offer an on-line and incremental learning machine for building kernel discriminants whose complexity (number of kernel evaluations) can be directly controlled (traded off for accuracy).
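
To make the idea concrete, the following is a minimal sketch, not the authors' reference implementation, of boosting dyadic kernel hypercuts. The RBF kernel, the exhaustive search over opposite-class pairs, the midpoint threshold candidates, and the discrete AdaBoost hypothesis weights are illustrative assumptions; the paper uses a confidence-rated AdaBoost variant. All function names (rbf_kernel, hypercut_outputs, boost_dyadic_hypercuts, predict) are hypothetical.

```python
# Sketch: boosting dyadic kernel hypercuts with (discrete) AdaBoost.
# Assumptions: RBF kernel, exhaustive pair search per round, threshold
# chosen by minimizing weighted error; the paper uses confidence-rated
# AdaBoost rather than the discrete variant shown here.
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # K(a, b) = exp(-gamma * ||a - b||^2), a Mercer kernel.
    return np.exp(-gamma * np.sum((a - b) ** 2))

def hypercut_outputs(X, x_pos, x_neg, gamma=1.0):
    # Projection induced by one dyad: f(x) = K(x_pos, x) - K(x_neg, x).
    return np.array([rbf_kernel(x_pos, x, gamma) - rbf_kernel(x_neg, x, gamma)
                     for x in X])

def best_threshold(f, y, w):
    # Choose the bias b minimizing the weighted error of sign(f - b).
    order = np.argsort(f)
    candidates = np.concatenate(([f[order[0]] - 1.0],
                                 (f[order[:-1]] + f[order[1:]]) / 2.0))
    best_err, best_b = np.inf, 0.0
    for b in candidates:
        err = np.sum(w[np.sign(f - b) != y])
        if err < best_err:
            best_err, best_b = err, b
    return best_err, best_b

def boost_dyadic_hypercuts(X, y, rounds=10, gamma=1.0):
    # y in {-1, +1}; each round searches the finite set of dyads
    # (one point from each class) for the lowest weighted error.
    n = len(y)
    w = np.full(n, 1.0 / n)
    pos_idx, neg_idx = np.where(y == 1)[0], np.where(y == -1)[0]
    ensemble = []
    for _ in range(rounds):
        best = None
        for i in pos_idx:
            for j in neg_idx:
                f = hypercut_outputs(X, X[i], X[j], gamma)
                err, b = best_threshold(f, y, w)
                if best is None or err < best[0]:
                    best = (err, i, j, b, f)
        err, i, j, b, f = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # hypothesis weight
        h = np.sign(f - b)
        w *= np.exp(-alpha * y * h)             # reweight training points
        w /= w.sum()
        ensemble.append((alpha, i, j, b))
    return ensemble

def predict(ensemble, X_train, X_test, gamma=1.0):
    # Final discriminant: sign of the alpha-weighted sum of hypercut votes;
    # the number of kernel evaluations is controlled by the ensemble size.
    score = np.zeros(len(X_test))
    for alpha, i, j, b in ensemble:
        f = hypercut_outputs(X_test, X_train[i], X_train[j], gamma)
        score += alpha * np.sign(f - b)
    return np.sign(score)
```

Because each weak hypothesis depends on only two training points, the run-time cost of the final classifier scales with the number of boosting rounds rather than with the number of support vectors, which is the complexity/accuracy trade-off noted in the abstract.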