Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Cheng Chen, Luo Luo, Weinan Zhang, Yong Yu
The Frank-Wolfe algorithm is a classic method for constrained optimization problems. It has recently gained popularity in many machine learning applications because its projection-free property leads to more efficient iterations. In this paper, we study projection-free algorithms for convex-strongly-concave saddle point problems with complicated constraints. Our method combines Conditional Gradient Sliding with Mirror-Prox, and we show that it requires only $\tilde{\mathcal{O}}(1/\sqrt{\epsilon})$ gradient evaluations and $\tilde{\mathcal{O}}(1/\epsilon^2)$ linear optimizations in the batch setting. We also extend our method to the stochastic setting and propose the first stochastic projection-free algorithms for saddle point problems. Experimental results demonstrate the effectiveness of our algorithms and verify our theoretical guarantees.
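To illustrate the projection-free property the abstract refers to, here is a minimal sketch of the classic Frank-Wolfe iteration on the probability simplex; it is not the paper's CGS + Mirror-Prox scheme, and the names `lmo_simplex` and `frank_wolfe` are hypothetical. Each step calls only a linear minimization oracle (LMO) over the constraint set, never a projection.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    argmin_{s in simplex} <grad, s> is the vertex whose gradient
    coordinate is smallest -- a linear optimization, no projection."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(grad_f, x0, num_iters=100):
    """Classic Frank-Wolfe: each iteration uses one gradient
    evaluation and one LMO call, which is why such methods are
    called projection-free."""
    x = x0
    for t in range(num_iters):
        g = grad_f(x)
        s = lmo_simplex(g)        # linear optimization step
        gamma = 2.0 / (t + 2.0)   # standard open-loop step size
        x = x + gamma * (s - x)   # convex combination stays feasible
    return x

# Example: minimize ||x - b||^2 over the simplex (b is feasible here).
b = np.array([0.1, 0.7, 0.2])
x_star = frank_wolfe(lambda x: 2.0 * (x - b), np.ones(3) / 3.0)
```

The complexity bounds quoted in the abstract count exactly these two operations separately: gradient evaluations ($\tilde{\mathcal{O}}(1/\sqrt{\epsilon})$) and LMO calls ($\tilde{\mathcal{O}}(1/\epsilon^2)$).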