Efficiency versus Convergence of Boolean Kernels for On-Line Learning Algorithms

Part of Advances in Neural Information Processing Systems 14 (NIPS 2001)


Authors

Roni Khardon, Dan Roth, Rocco A. Servedio

Abstract

We study online learning in Boolean domains using kernels which capture feature expansions equivalent to using conjunctions over basic features. We demonstrate a tradeoff between the computational efficiency with which these kernels can be computed and the generalization ability of the resulting classifier. We first describe several kernel functions which capture either limited forms of conjunctions or all conjunctions. We show that these kernels can be used to efficiently run the Perceptron algorithm over an exponential number of conjunctions; however, we also prove that using such kernels the Perceptron algorithm can make an exponential number of mistakes even when learning simple functions. We also consider an analogous use of kernel functions to run the multiplicative-update Winnow algorithm over an expanded feature space of exponentially many conjunctions. While known upper bounds imply that Winnow can learn DNF formulae with a polynomial mistake bound in this setting, we prove that it is computationally hard to simulate Winnow's behavior for learning DNF over such a feature set, and thus that such kernel functions for Winnow are not efficiently computable.
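To make the first idea concrete, below is a minimal sketch of a kernelized Perceptron over the all-conjunctions expansion. The kernel formula K(x, y) = 2^same(x,y), where same(x, y) counts the coordinates on which the two Boolean vectors agree, is the standard construction for conjunctions over positive and negated literals (each agreeing coordinate contributes one literal that may or may not be included); the abstract does not state the formula, so treat it as an assumption, and the function names here are illustrative rather than taken from the paper.

```python
import numpy as np

def all_conjunctions_kernel(x, y):
    """Count conjunctions (over literals and their negations) satisfied
    by both x and y in {0,1}^n.

    A literal on variable i can appear in a shared conjunction only if
    x_i == y_i, and each such coordinate independently contributes the
    choice "include the literal or not", giving 2^same(x, y).
    (Assumed formula: the standard all-conjunctions kernel.)
    """
    same = int(np.sum(x == y))
    return 2.0 ** same

def kernel_perceptron(examples, labels, epochs=1):
    """Kernelized Perceptron: the exponentially long weight vector is
    never materialized; predictions are sums of kernel evaluations
    against previously misclassified examples."""
    mistakes = []  # (example, label) pairs on which we erred
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            score = sum(yi * all_conjunctions_kernel(xi, x)
                        for xi, yi in mistakes)
            prediction = 1 if score > 0 else -1
            if prediction != y:
                mistakes.append((x, y))
    return mistakes

# Toy usage: three examples over n = 3 Boolean features.
X = np.array([[1, 0, 1], [0, 0, 1], [1, 1, 1]])
y = [1, -1, 1]
support = kernel_perceptron(X, y, epochs=3)
```

Each kernel call costs O(n), so every Perceptron update runs in time polynomial in n and the number of mistakes, even though the implicit feature space has 3^n + 1 conjunctions; the paper's negative result is that this efficiency does not translate into a good mistake bound.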