An adaptive Mirror-Prox method for variational inequalities with singular operators

Part of Advances in Neural Information Processing Systems 32 (NeurIPS 2019)


Authors

Kimon Antonakopoulos, Veronica Belmega, Panayotis Mertikopoulos

Abstract

Lipschitz continuity is a central requirement for achieving the optimal $O(1/T)$ rate of convergence in monotone, deterministic variational inequalities (a setting that includes convex minimization, convex-concave optimization, nonatomic games, and many other problems). However, in many cases of practical interest, the operator defining the variational inequality may become singular at the boundary of the feasible region, thereby precluding the use of fast gradient methods that attain this rate (such as Nemirovski's mirror-prox algorithm and its variants). To address this issue, we propose a novel smoothness condition which we call Bregman smoothness, and which relates the variation of the operator to that of a suitably chosen Bregman function. Leveraging this condition, we derive an adaptive mirror-prox algorithm which attains an $O(1/T)$ rate of convergence in problems with possibly singular operators, without any prior knowledge of the problem's Bregman constant (the Bregman analogue of the Lipschitz constant). We also present an extension of our algorithm to stochastic variational inequalities, where the algorithm achieves an $O(1/\sqrt{T})$ convergence rate.
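
To make the setting concrete, below is a minimal sketch of the classical (non-adaptive) mirror-prox iteration with an entropic Bregman prox on the simplex, applied to a convex-concave matrix game, one of the problem classes the abstract mentions. The fixed step size and the `mirror_prox_saddle` helper are illustrative assumptions; the paper's adaptive Bregman step-size rule is not reproduced here.

```python
import numpy as np

def simplex_prox(x, g):
    """Entropic (KL) Bregman prox step on the simplex:
    argmin_{x'} <g, x'> + KL(x' || x), i.e. a multiplicative update."""
    z = x * np.exp(-g)
    return z / z.sum()

def mirror_prox_saddle(A, T=1000, step=0.1):
    """Mirror-prox for the matrix game min_x max_y x^T A y on simplices.
    The VI operator is V(x, y) = (A y, -A^T x).  A fixed step size is used
    as a placeholder for the adaptive rule proposed in the paper."""
    m, n = A.shape
    x, y = np.full(m, 1.0 / m), np.full(n, 1.0 / n)
    x_avg, y_avg = np.zeros(m), np.zeros(n)
    for _ in range(T):
        # Leading (extrapolation) step: prox with the operator at (x, y)
        x_half = simplex_prox(x, step * (A @ y))
        y_half = simplex_prox(y, step * (-A.T @ x))
        # Update step: prox with the operator at the leading point
        x = simplex_prox(x, step * (A @ y_half))
        y = simplex_prox(y, step * (-A.T @ x_half))
        # Ergodic average of the leading iterates (achieves the O(1/T) rate)
        x_avg += x_half
        y_avg += y_half
    return x_avg / T, y_avg / T

# Example: a random 5x5 zero-sum game; the duality gap of the averaged
# iterates shrinks at the O(1/T) rate discussed above.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
x_bar, y_bar = mirror_prox_saddle(A)
gap = (A.T @ x_bar).max() - (A @ y_bar).min()
print(f"duality gap ~ {gap:.4f}")
```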