Incremental and Decremental Support Vector Machine Learning

Part of Advances in Neural Information Processing Systems 13 (NIPS 2000)


Authors

Gert Cauwenberghs, Tomaso Poggio

Abstract

An on-line recursive algorithm for training support vector machines, one vector at a time, is presented. Adiabatic increments retain the Kuhn-Tucker conditions on all previously seen training data, in a number of steps each computed analytically. The incremental procedure is reversible, and decremental "unlearning" offers an efficient method to exactly evaluate leave-one-out generalization performance. Interpretation of decremental unlearning in feature space sheds light on the relationship between generalization and geometry of the data.
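
To make the bookkeeping behind the abstract concrete, here is a minimal Python sketch (scikit-learn, the synthetic data, and the tolerance threshold are assumptions for illustration, not part of the paper). It trains a batch SVM and partitions the training set by the Kuhn-Tucker conditions into margin support vectors (g_i = 0), error support vectors (g_i < 0, alpha_i = C), and the remaining reserve vectors (g_i > 0, alpha_i = 0); the incremental procedure preserves this partition as each new vector is added. The naive retraining loop at the end shows what leave-one-out evaluation computes; decremental unlearning obtains the same result analytically, without retraining from scratch.

```python
import numpy as np
from sklearn import svm

# Hypothetical toy data; any binary set with labels in {-1, +1} works.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(1, 1, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

C = 1.0
clf = svm.SVC(C=C, kernel="linear").fit(X, y)

# Recover the dual coefficients alpha_i. sklearn stores y_i * alpha_i
# for support vectors only; all other points have alpha_i = 0.
alpha = np.zeros(len(X))
alpha[clf.support_] = np.abs(clf.dual_coef_.ravel())

# Margin function g_i = y_i * f(x_i) - 1, the quantity the
# Kuhn-Tucker conditions constrain for every training point.
g = y * clf.decision_function(X) - 1

tol = 1e-6  # numerical slack; an assumption of this sketch
margin_set = np.where((alpha > tol) & (alpha < C - tol))[0]  # g_i = 0
error_set = np.where(alpha >= C - tol)[0]                    # g_i <= 0
reserve_set = np.where(alpha <= tol)[0]                      # g_i >= 0

# Naive leave-one-out error by full retraining, shown only as a
# baseline; decremental unlearning yields the same count exactly,
# without retraining on each reduced set.
loo_errors = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    clf_i = svm.SVC(C=C, kernel="linear").fit(X[mask], y[mask])
    loo_errors += clf_i.predict(X[i:i + 1])[0] != y[i]
print(f"LOO error rate: {loo_errors / len(X):.2f}")
```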