A General and Efficient Multiple Kernel Learning Algorithm

Sören Sonnenburg, Gunnar Rätsch, Christin Schäfer

Advances in Neural Information Processing Systems 18 (NIPS 2005)

While classical kernel-based learning algorithms are based on a single kernel, in practice it is often desirable to use multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for classification, leading to a convex quadratically constrained quadratic program. We show that it can be rewritten as a semi-infinite linear program that can be efficiently solved by recycling standard SVM implementations. Moreover, we generalize the formulation and our method to a larger class of problems, including regression and one-class classification. Experimental results show that the proposed algorithm helps with automatic model selection, improves the interpretability of the learning result, and works for hundreds of thousands of examples or hundreds of kernels to be combined.
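As a rough illustration of the "conic combination of kernels" idea referred to in the abstract (this is not the authors' implementation; the RBF base kernels, widths, and weights below are placeholder assumptions), a minimal Python sketch of forming K = sum_j beta_j K_j with beta_j >= 0 might look as follows:

    import numpy as np

    def rbf_kernel(X1, X2, gamma):
        """Gaussian RBF kernel matrix between the rows of X1 and X2."""
        sq_dists = (np.sum(X1 ** 2, axis=1)[:, None]
                    + np.sum(X2 ** 2, axis=1)[None, :]
                    - 2.0 * X1 @ X2.T)
        return np.exp(-gamma * sq_dists)

    def combined_kernel(X1, X2, gammas, betas):
        """Conic (non-negative weighted) combination of base kernel matrices."""
        assert all(b >= 0 for b in betas), "conic combination requires beta_j >= 0"
        K = np.zeros((X1.shape[0], X2.shape[0]))
        for gamma, beta in zip(gammas, betas):
            K += beta * rbf_kernel(X1, X2, gamma)
        return K

    # Example usage with toy data and arbitrary weights
    X = np.random.randn(5, 3)
    K = combined_kernel(X, X, gammas=[0.1, 1.0, 10.0], betas=[0.5, 0.3, 0.2])

Multiple kernel learning additionally optimizes the weights beta_j from data, which is the subject of this paper; the sketch only shows how the combined kernel is assembled for fixed weights.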

f(x) = \mathrm{sign}\left( \sum_{i=1}^{N} \alpha_i y_i\, k(x_i, x) + b \right),
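For concreteness, a minimal sketch of evaluating this SVM decision function in Python (the function and argument names are placeholders, not part of the paper; any kernel function k could be plugged in):

    import numpy as np

    def svm_decision(x, support_vectors, alphas, labels, b, kernel):
        """Evaluate f(x) = sign(sum_i alpha_i * y_i * k(x_i, x) + b)."""
        score = sum(a * y * kernel(xi, x)
                    for a, y, xi in zip(alphas, labels, support_vectors))
        return np.sign(score + b)

    # Example usage with a toy linear kernel and hand-picked parameters
    linear_kernel = lambda u, v: float(np.dot(u, v))
    sv = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
    print(svm_decision(np.array([0.5, -0.5]), sv, alphas=[1.0, 1.0],
                       labels=[+1, -1], b=0.0, kernel=linear_kernel))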