Part of Advances in Neural Information Processing Systems 37 (NeurIPS 2024) Main Conference Track
Ilias Diakonikolas, Nikos Zarifis
We study the problem of PAC learning γ-margin halfspaces in the presence of Massart noise. Without computational considerations, the sample complexity of this learning problem is known to be Θ̃(1/(γ²ϵ)). Prior computationally efficient algorithms for the problem incur sample complexity Õ(1/(γ⁴ϵ³)) and achieve 0-1 error of η + ϵ, where η < 1/2 is the upper bound on the noise rate. Recent work gave evidence of an information-computation tradeoff, suggesting that a quadratic dependence on 1/ϵ is required for computationally efficient algorithms. Our main result is a computationally efficient learner with sample complexity Θ̃(1/(γ²ϵ²)), nearly matching this lower bound. In addition, our algorithm is simple and practical, relying on online SGD on a carefully selected sequence of convex losses.
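The abstract does not specify the loss sequence, so the sketch below is only a minimal illustration of the general recipe it names: online projected SGD on convex surrogate losses for a margin halfspace. The LeakyReLU-style surrogate and the leakage choice `lam = eta_bound` are stand-ins borrowed from earlier work on learning with Massart noise, not this paper's construction, and the names `online_sgd_halfspace` and `make_stream` are hypothetical.

```python
import numpy as np

def online_sgd_halfspace(stream, dim, eta_bound, lr=0.1):
    """Illustrative online SGD for learning a halfspace from a stream
    of (x, y) pairs with labels in {-1, +1}.

    Assumed surrogate (LeakyReLU-type, from prior Massart-noise work):
        L(w; x, y) = LeakyReLU_lam(-y <w, x>),
    where LeakyReLU_lam(z) = (1 - lam) * z for z >= 0 and lam * z otherwise.
    """
    lam = eta_bound            # leakage parameter (assumed choice)
    w = np.zeros(dim)
    for t, (x, y) in enumerate(stream, start=1):
        m = y * np.dot(w, x)   # signed margin of the current hypothesis
        # Subgradient of the LeakyReLU surrogate with respect to w
        g = (-(1.0 - lam) if m <= 0 else -lam) * y * x
        w -= (lr / np.sqrt(t)) * g       # decaying step size
        nrm = np.linalg.norm(w)
        if nrm > 1.0:                    # project back onto the unit ball
            w /= nrm
    return w

# Toy usage: a unit-norm target halfspace with Massart-style label flips
rng = np.random.default_rng(0)
w_star, eta = np.array([1.0, 0.0]), 0.2

def make_stream(n):
    for _ in range(n):
        x = rng.normal(size=2)
        x /= np.linalg.norm(x)
        y = np.sign(w_star @ x) or 1.0
        if rng.random() < eta:           # flip with probability at most η
            y = -y
        yield x, float(y)

w_hat = online_sgd_halfspace(make_stream(20_000), dim=2, eta_bound=eta)
```

The projection onto the unit ball reflects the usual normalization for margin halfspaces, where both the weight vector and the examples have norm at most one; the paper's carefully selected loss sequence replaces the fixed surrogate used here.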