Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Nicholas Harvey, Christopher Liaw, Tasuku Soma
We consider the problem of nonnegative submodular maximization in the online setting. At time step t, an algorithm selects a set S_t ∈ C ⊆ 2^V, where C is a feasible family of sets. An adversary then reveals a submodular function f_t, and the algorithm receives a reward of f_t(S_t). The goal is to design an efficient algorithm for minimizing the expected approximate regret. In this work, we give a general approach for improving regret bounds in online submodular maximization by exploiting "first-order" regret bounds for online linear optimization.

- For monotone submodular maximization subject to a matroid constraint, we give an efficient algorithm which achieves a (1 − c/e − ε)-regret of O(√(kT ln(n/k))), where n is the size of the ground set, k is the rank of the matroid, ε > 0 is a constant, and c is the average curvature. Even without assuming any curvature (i.e., taking c = 1), this regret bound improves on previous results of Streeter et al. (2009) and Golovin et al. (2014).
- For nonmonotone, unconstrained submodular functions, we give an algorithm with 1/2-regret O(√(nT)), improving on the results of Roughgarden and Wang (2018).

Our approach is based on Blackwell approachability; in particular, we give a novel first-order regret bound for the Blackwell instances that arise in this setting.
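For concreteness, the expected approximate regret referenced above can be formalized in the standard way used in this literature; the symbol R_α below is our own notation and is not taken verbatim from the paper:

\[
  \mathcal{R}_\alpha(T) \;=\; \alpha \cdot \max_{S \in \mathcal{C}} \sum_{t=1}^{T} f_t(S) \;-\; \mathbb{E}\!\left[\sum_{t=1}^{T} f_t(S_t)\right],
\]

where the expectation is over the algorithm's internal randomness. The bounds stated above correspond to α = 1 − c/e − ε for monotone maximization subject to a matroid constraint and α = 1/2 for unconstrained nonmonotone maximization.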