Part of Advances in Neural Information Processing Systems 17 (NIPS 2004)
Dori Peleg, Ron Meir
A novel linear feature selection algorithm is presented based on the global minimization of a data-dependent generalization error bound. Feature selection and scaling algorithms often lead to non-convex optimization problems, which in many previous approaches were addressed through gradient descent procedures that can only guarantee convergence to a local minimum. We propose an alternative approach, whereby the global solution of the non-convex optimization problem is derived via an equivalent optimization problem. Moreover, the convex optimization task is reduced to a conic quadratic programming problem for which efficient solvers are available. Highly competitive numerical results on both artificial and real-world data sets are reported.
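As a minimal illustrative sketch (not the authors' formulation), the snippet below shows what a conic quadratic (second-order cone) program looks like when handed to an off-the-shelf solver via the cvxpy library; the toy data A and b, the variable names, and the objective are placeholders chosen only to demonstrate the problem class mentioned in the abstract.

```python
import cvxpy as cp
import numpy as np

# Toy data standing in for whatever problem data the actual bound would produce.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

# Conic quadratic program: minimize t subject to ||A x - b||_2 <= t.
x = cp.Variable(3)
t = cp.Variable()
constraints = [cp.SOC(t, A @ x - b)]  # second-order cone constraint
prob = cp.Problem(cp.Minimize(t), constraints)
prob.solve()

print("optimal t:", t.value)
print("optimal x:", x.value)
```

Because the feasible region and objective are convex, a conic solver returns the global optimum, which is the practical payoff of reducing the original non-convex feature selection problem to this class.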