Part of Advances in Neural Information Processing Systems 17 (NIPS 2004)
Tong Zhang
We consider the problem of deriving class-size independent generalization bounds for some regularized discriminative multi-category classification methods. In particular, we obtain an expected generalization bound for a standard formulation of multi-category support vector machines. Based on the theoretical result, we argue that the formulation over-penalizes misclassification error, which in theory may lead to poor generalization performance. A remedy, based on a generalization of multi-category logistic regression (conditional maximum entropy), is then proposed, and its theoretical properties are examined.
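To make the contrast in the abstract concrete, the following is an illustrative sketch (not necessarily the exact formulation analyzed in the paper) of a commonly used multi-category SVM objective, in which the hinge penalty is summed over all incorrect classes, alongside the multi-category logistic regression (conditional maximum entropy) objective that the proposed remedy builds on. The notation $f_c(x)$ for per-class scores, $K$ classes, and regularization weight $\lambda$ is assumed here for illustration.

\[
  \hat{f} = \arg\min_{f}\; \frac{1}{n}\sum_{i=1}^{n}
    \sum_{c \neq y_i} \bigl(1 - (f_{y_i}(x_i) - f_c(x_i))\bigr)_{+}
    \;+\; \lambda \sum_{c=1}^{K} \|f_c\|^2 ,
\]
\[
  \hat{f} = \arg\min_{f}\; \frac{1}{n}\sum_{i=1}^{n}
    \Bigl(\ln \sum_{c=1}^{K} e^{f_c(x_i)} - f_{y_i}(x_i)\Bigr)
    \;+\; \lambda \sum_{c=1}^{K} \|f_c\|^2 .
\]

In the first objective a single misclassified example can incur a penalty that grows with the number of violated class margins, which is one intuitive sense in which such a formulation may over-penalize misclassification error; the logistic (maximum entropy) loss in the second objective stays bounded relative to the best competing class.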