Part of Advances in Neural Information Processing Systems 14 (NIPS 2001)
Adam Kowalczyk, Alex Smola, Robert C. Williamson
We give results on the learnability and the complexity of logical formulae required to solve classification problems. These results are obtained by linking propositional logic with kernel machines. In particular, we show that decision trees and disjunctive normal forms (DNF) can be represented with the help of a special kernel, linking regularized risk to separation margin. Subsequently, we derive a number of lower bounds on the required complexity of logic formulae, using properties of algorithms for generating linear estimators, such as perceptron and maximal perceptron learning.
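To illustrate the kind of kernel-based representation of boolean formulae referred to above, the following minimal sketch assumes the standard "all-subsets" kernel over boolean inputs, whose implicit feature space is spanned by all conjunctions of input bits (the building blocks of DNF formulae). This is an illustrative choice, not necessarily the exact kernel used in the paper; the function names are hypothetical.

```python
import numpy as np
from itertools import combinations

def all_subsets_kernel(x, z):
    """All-subsets kernel over boolean vectors:
    k(x, z) = prod_i (1 + x_i * z_i).
    Its implicit feature space has one coordinate per
    conjunction (monomial) of input bits."""
    return np.prod(1 + x * z)

def conjunction_features(x):
    """Explicit feature map: one coordinate per subset of bit
    positions, valued by the conjunction (product) of those bits.
    Shown only to verify the kernel; it has 2^n coordinates."""
    n = len(x)
    return np.array([
        np.prod(x[list(idx)])          # empty product is 1 (the constant feature)
        for r in range(n + 1)
        for idx in combinations(range(n), r)
    ])

x = np.array([1, 0, 1])
z = np.array([1, 1, 1])

# The kernel value equals the inner product in the explicit
# conjunction-feature space, here 2^(number of shared 1-bits) = 4.
assert all_subsets_kernel(x, z) == conjunction_features(x) @ conjunction_features(z)
print(all_subsets_kernel(x, z))  # 4
```

Because every DNF formula is a disjunction of such conjunctions, a linear separator in this feature space can encode a DNF classifier, which is what lets margin-based bounds for kernel machines translate into statements about the complexity of logic formulae.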