Unifying Framework for Fast Learning Rate of Non-Sparse Multiple Kernel Learning

Part of Advances in Neural Information Processing Systems 24 (NIPS 2011)


Authors

Taiji Suzuki

Abstract

In this paper, we give a new generalization error bound for Multiple Kernel Learning (MKL) under a general class of regularizations. Our main target is dense-type regularizations, including ℓp-MKL, which imposes ℓp-mixed-norm regularization instead of ℓ1-mixed-norm regularization. Recent numerical experiments indicate that sparse regularization does not necessarily perform better than dense-type regularizations. Motivated by this fact, this paper gives a general theoretical tool for deriving fast learning rates that is applicable to arbitrary monotone norm-type regularizations in a unifying manner. As a by-product of our general result, we show a fast learning rate for ℓp-MKL that is the tightest among existing bounds. We also show that our general learning rate achieves the minimax lower bound. Finally, we show that, when the complexities of the candidate reproducing kernel Hilbert spaces are inhomogeneous, dense-type regularization achieves a better learning rate than sparse ℓ1 regularization.
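
For reference, a minimal sketch of the ℓp-mixed-norm regularized estimator the abstract refers to, assuming the standard MKL decomposition f = f_1 + ... + f_M over candidate RKHSs H_1, ..., H_M; the symbols f_m, H_m, ℓ, λ, and n are notation introduced here for illustration and are not quoted from the paper:

% Illustrative sketch of the \ell_p-mixed-norm MKL objective;
% p = 1 recovers the sparse \ell_1-MKL penalty, p > 1 gives dense-type regularization.
\[
  \hat{f} \;=\; \operatorname*{arg\,min}_{f_m \in \mathcal{H}_m,\; m=1,\dots,M}\;
    \frac{1}{n} \sum_{i=1}^{n} \ell\!\Big(y_i,\; \textstyle\sum_{m=1}^{M} f_m(x_i)\Big)
    \;+\; \lambda \Big( \sum_{m=1}^{M} \| f_m \|_{\mathcal{H}_m}^{\,p} \Big)^{1/p},
  \qquad 1 \le p \le 2 .
\]

The paper's analysis covers such ℓp penalties (and, more generally, arbitrary monotone norm-type regularizations), rather than only the sparse ℓ1 case.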