Improving the Accuracy and Speed of Support Vector Machines

Part of Advances in Neural Information Processing Systems 9 (NIPS 1996)



Christopher J. C. Burges, Bernhard Schölkopf


Support Vector Learning Machines (SVM) are finding application in pattern recognition, regression estimation, and operator inversion for ill-posed problems. Against this very general backdrop, any methods for improving the generalization performance, or for improving the speed in test phase, of SVMs are of increasing interest. In this paper we combine two such techniques on a pattern recognition problem. The method for improving generalization performance (the "virtual support vector" method) does so by incorporating known invariances of the problem. This method achieves a drop in the error rate on 10,000 NIST test digit images from 1.4% to 1.0%. The method for improving the speed (the "reduced set" method) does so by approximating the support vector decision surface. We apply this method to achieve a factor of fifty speedup in test phase over the virtual support vector machine. The combined approach yields a machine which is both 22 times faster than the original machine, and which has better generalization performance, achieving 1.1% error. The virtual support vector method is applicable to any SVM problem with known invariances. The reduced set method is applicable to any support vector machine.
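The core of the virtual support vector idea is to exploit a known invariance (here, translation invariance of digit images) by generating transformed copies of the support vectors and retraining on the enlarged set. The sketch below shows only the virtual-example generation step; the one-pixel shifts, the `np.roll` wrapping behavior, and the toy 4x4 image are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def virtual_examples(image, shifts=((0, 1), (0, -1), (1, 0), (-1, 0))):
    """Generate translated copies of an image.

    Toy stand-in for virtual support vector generation: each shift
    (dy, dx) produces one virtual example. One-pixel shifts and
    wrap-around via np.roll are assumptions for illustration.
    """
    virtuals = []
    for dy, dx in shifts:
        shifted = np.roll(np.roll(image, dy, axis=0), dx, axis=1)
        virtuals.append(shifted)
    return virtuals

# A 4x4 toy "digit": a single bright pixel at row 1, column 1.
img = np.zeros((4, 4))
img[1, 1] = 1.0

augmented = virtual_examples(img)
print(len(augmented))       # four virtual examples, one per shift
print(augmented[2][2, 1])   # shift (1, 0) moves the pixel down one row
```

In the full method, these virtual examples would be labeled like their source support vectors and used to train a second SVM, whose decision surface then respects the encoded invariance.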