Part of Advances in Neural Information Processing Systems 16 (NIPS 2003)
Saharon Rosset, Ji Zhu, Trevor Hastie
Margin maximizing properties play an important role in the analysis of classification models, such as boosting and support vector machines. Margin maximization is theoretically interesting because it facilitates generalization error analysis, and practically interesting because it presents a clear geometric interpretation of the models being built. We formulate and prove a sufficient condition for the solutions of regularized loss functions to converge to margin maximizing separators, as the regularization vanishes. This condition covers the hinge loss of SVM, the exponential loss of AdaBoost and logistic regression loss. We also generalize it to multi-class classification problems, and present margin maximizing multi-class versions of logistic regression and support vector machines.
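The sketch below is a minimal numerical illustration of the abstract's central claim, not code from the paper: on separable data, as the L2 regularization of logistic regression vanishes, the normalized solution direction approaches the (L2) margin maximizing separator. The data, parameter values, and the use of scikit-learn's LogisticRegression and LinearSVC (a nearly hard-margin SVM standing in for the exact max-margin separator) are all assumptions for illustration.

```python
# A minimal sketch (assumptions noted above): watch the regularized logistic
# regression direction converge to an approximate max-margin direction as
# the regularization vanishes (C grows).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
# Linearly separable toy data: two Gaussian blobs.
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# Approximate the margin maximizing direction with a nearly hard-margin SVM.
svm = LinearSVC(C=1e6, fit_intercept=False, max_iter=100000).fit(X, y)
w_svm = svm.coef_.ravel()
w_svm /= np.linalg.norm(w_svm)

# Shrink the regularization (grow C) and compare normalized directions.
for C in [0.01, 1.0, 100.0, 10000.0]:
    lr = LogisticRegression(C=C, fit_intercept=False, max_iter=100000).fit(X, y)
    w = lr.coef_.ravel()
    w /= np.linalg.norm(w)
    angle = np.degrees(np.arccos(np.clip(w @ w_svm, -1.0, 1.0)))
    print(f"C={C:>8}: angle to max-margin direction = {angle:.4f} deg")
```

The printed angle should shrink toward zero as C grows, consistent with the convergence result the abstract states for logistic regression loss; the hinge and exponential losses it mentions admit analogous demonstrations.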