Part of Advances in Neural Information Processing Systems 35 (NeurIPS 2022) Main Conference Track
Xuan Zhang, Necdet Serhat Aybat, Mert Gurbuzbalaban
We propose a new stochastic method, SAPD+, for solving nonconvex-concave minimax problems of the form \min_x \max_y \mathcal{L}(x,y) := f(x) + \Phi(x,y) - g(y), where f, g are closed convex functions and \Phi(x,y) is a smooth function that is weakly convex in x and (strongly) concave in y. For both the strongly concave and merely concave settings, SAPD+ achieves the best known oracle complexities of \mathcal{O}(L\kappa_y\epsilon^{-4}) and \mathcal{O}(L^3\epsilon^{-6}), respectively, without assuming compactness of the problem domain, where \kappa_y is the condition number and L is the Lipschitz constant. We also propose SAPD+ with variance reduction, which enjoys the best known oracle complexity of \mathcal{O}(L\kappa_y^2\epsilon^{-3}) in the weakly convex-strongly concave setting. We demonstrate the efficiency of SAPD+ on a distributionally robust learning problem with a nonconvex regularizer and on a multi-class classification problem in deep learning.
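To make the problem template concrete, the following is a minimal sketch of a generic stochastic proximal gradient descent-ascent loop on a toy instance of \min_x \max_y f(x) + \Phi(x,y) - g(y); it is not the SAPD+ method itself, and all problem data, step sizes (tau, sigma), and the batch size below are illustrative assumptions rather than values from the paper. Here f(x) = \lambda\|x\|_1 is handled through its proximal map (soft-thresholding), g \equiv 0, and \Phi(x,y) = y^\top(Ax - b) - (\mu_y/2)\|y\|^2 is linear (hence trivially weakly convex) in x and \mu_y-strongly concave in y, matching the strongly concave regime described in the abstract.

```python
# Illustrative sketch only -- NOT the authors' SAPD+ algorithm.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 20
lam, mu_y = 0.1, 1.0                     # l1 weight and strong concavity modulus (assumed)
A = rng.standard_normal((n, d)) / np.sqrt(n)
b = rng.standard_normal(n)

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1, i.e., the prox step for f(x) = lam * ||x||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x, y = np.zeros(d), np.zeros(n)
tau, sigma, batch = 0.05, 0.05, 32       # primal/dual step sizes and batch size (assumed)

for _ in range(2000):
    idx = rng.choice(n, size=batch, replace=False)   # stochastic first-order oracle
    scale = n / batch                                # rescaling keeps the estimators unbiased
    # Stochastic partial gradients of Phi(x, y) = y^T (A x - b) - (mu_y / 2) ||y||^2.
    grad_x = scale * (A[idx].T @ y[idx])             # estimate of A^T y
    grad_y = -mu_y * y                               # deterministic strongly concave part
    grad_y[idx] += scale * (A[idx] @ x - b[idx])     # sampled estimate of A x - b
    # Proximal descent in x, gradient ascent in y (g = 0, so no dual prox).
    x = soft_threshold(x - tau * grad_x, tau * lam)
    y = y + sigma * grad_y

print("final l1 norm of x:", np.abs(x).sum())
```

The loop only illustrates the oracle model: each iteration draws a minibatch, forms unbiased stochastic partial gradients of \Phi, and takes a proximal primal step and a dual ascent step; SAPD+ additionally employs acceleration (and, in its variance-reduced variant, recursive gradient estimators) to reach the stated complexity bounds.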