Invariant Pattern Recognition by Semidefinite Programming Machines

Part of Advances in Neural Information Processing Systems 16 (NIPS 2003)
Thore Graepel, Ralf Herbrich
Abstract

Knowledge about local invariances with respect to given pattern transformations can greatly improve the accuracy of classification. Previous approaches are either based on regularisation or on the generation of virtual (transformed) examples. We develop a new framework for learning linear classifiers under known transformations based on semidefinite programming. We present a new learning algorithm, the Semidefinite Programming Machine (SDPM), which is able to find a maximum margin hyperplane when the training examples are polynomial trajectories instead of single points. The solution is found to be sparse in dual variables and allows one to identify those points on the trajectory with minimal real-valued output as virtual support vectors. Extensions to segments of trajectories, to more than one transformation parameter, and to learning with kernels are discussed. In experiments we use a Taylor expansion to locally approximate rotational invariance in pixel images from USPS and find improvements over known methods.
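As a hedged sketch of the construction the abstract describes (the notation below is ours, inferred from the abstract rather than quoted from the paper): if each training example is a polynomial trajectory $\mathbf{x}_i(\theta)=\sum_{k=0}^{2r}\theta^{k}\mathbf{x}_{ik}$, for instance a truncated Taylor expansion of the transformation, then the maximum margin problem with the margin enforced along the entire trajectory can be written as

$$
\min_{\mathbf{w},\,P_{1},\dots,P_{m}}\ \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2}
\quad\text{s.t.}\quad
y_{i}\,\langle\mathbf{w},\mathbf{x}_{i}(\theta)\rangle-1
= v(\theta)^{\top}P_{i}\,v(\theta)\ \ \forall\theta\in\mathbb{R},
\qquad P_{i}\succeq 0,\ \ i=1,\dots,m,
$$

where $v(\theta)=(1,\theta,\dots,\theta^{r})^{\top}$. A univariate polynomial is non-negative on all of $\mathbb{R}$ exactly when it is a sum of squares, so the positive semidefinite matrices $P_i$ certify $y_i\langle\mathbf{w},\mathbf{x}_i(\theta)\rangle\ge 1$ for every $\theta$; equating coefficients of powers of $\theta$ turns the functional equality into finitely many linear constraints, leaving a semidefinite program. The extension to segments of trajectories mentioned above would rest on the analogous sum-of-squares characterisation of polynomials non-negative on an interval.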
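For the lowest non-trivial degree (a quadratic trajectory), each certificate is a single 2x2 positive semidefinite matrix, which makes the idea easy to prototype. The following is a minimal sketch in Python with cvxpy, not the authors' implementation: the synthetic data, the scaling trajectory x_i(theta) = exp(theta) * x_i, and the bias term b are all illustrative assumptions (the paper's experiments instead use Taylor expansions of image rotation on USPS).

    import cvxpy as cp
    import numpy as np

    # Synthetic, linearly separable toy data (hypothetical; stands in for
    # the USPS digit images used in the paper's experiments).
    rng = np.random.default_rng(0)
    m, d = 20, 5
    X = rng.normal(size=(m, d))      # base examples x_i
    y = np.sign(X @ np.ones(d))      # labels separable through the origin
    # Quadratic trajectory from the scaling transformation
    # x_i(theta) = exp(theta) * x_i, truncated at second order:
    # first- and second-derivative coefficients both equal x_i.
    T, S = X, X

    w = cp.Variable(d)               # hyperplane normal
    b = cp.Variable()                # bias (our assumption, not the paper's)
    constraints = []
    for i in range(m):
        # Real-valued output along the trajectory is quadratic in theta:
        # p_i(theta) = c0 + c1*theta + c2*theta^2, required >= 0 for all theta.
        c0 = y[i] * (X[i] @ w + b) - 1
        c1 = y[i] * (T[i] @ w)
        c2 = 0.5 * y[i] * (S[i] @ w)
        # Sum-of-squares certificate: p_i(theta) = [theta, 1] Q [theta, 1]^T
        # with Q positive semidefinite.
        Q = cp.Variable((2, 2), PSD=True)
        constraints += [Q[0, 0] == c2, Q[1, 1] == c0, Q[0, 1] + Q[1, 0] == c1]

    prob = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w)), constraints)
    prob.solve()
    print(prob.status, prob.value)

At the optimum, a certificate Q of an active constraint is singular, and its null vector is proportional to (theta*, 1), where theta* is the root of the corresponding p_i; that theta* marks the point of minimal real-valued output on the trajectory, i.e. a virtual support vector in the abstract's terminology.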