Part of Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Robert Kleinberg, Kevin Leyton-Brown, Brendan Lucier, Devon Graham
Algorithm configuration methods optimize the performance of a parameterized heuristic algorithm on a given distribution of problem instances. Recent work introduced an algorithm configuration procedure (``Structured Procrastination'') that provably achieves near-optimal performance with high probability and with nearly minimal runtime in the worst case. It also offers an anytime property: it keeps tightening its optimality guarantees the longer it is run. Unfortunately, Structured Procrastination is not adaptive to characteristics of the parameterized algorithm: it treats every input like the worst case. Follow-up work (``LeapsAndBounds'') achieves adaptivity but trades away the anytime property. This paper introduces a new algorithm, ``Structured Procrastination with Confidence'', that preserves the near-optimality and anytime properties of Structured Procrastination while adding adaptivity. In particular, the new algorithm will perform dramatically faster in settings where many algorithm configurations perform poorly. We show empirically both that such settings arise frequently in practice and that the anytime property is useful for finding good configurations quickly.
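To make the problem setting concrete, the sketch below illustrates what "configuring a parameterized algorithm on a distribution of instances with capped runs" means in code. It is a minimal, hypothetical illustration only: the names (run_algorithm, sample_instance, naive_configuration) and the simulated runtimes are placeholders and are not part of the paper's procedures, which allocate runs and runtime caps far more carefully than this naive loop.

```python
# Minimal sketch of the algorithm-configuration objective described above.
# All names and the simulated runtime model are hypothetical placeholders;
# this is NOT Structured Procrastination or LeapsAndBounds, which adaptively
# schedule runs and runtime caps rather than using a fixed budget per config.
import random


def run_algorithm(config, instance, cap):
    """Run the parameterized algorithm on one instance, capped at `cap` seconds;
    return the capped runtime. Here the runtime is simulated for illustration."""
    true_runtime = random.expovariate(1.0 / config["mean_runtime"])
    return min(true_runtime, cap)


def naive_configuration(configs, sample_instance, runs_per_config=100, cap=60.0):
    """Estimate each configuration's mean capped runtime over sampled instances
    and return the apparently best one (the quantity the paper's methods
    optimize with provable guarantees)."""
    best_config, best_mean = None, float("inf")
    for config in configs:
        total = sum(
            run_algorithm(config, sample_instance(), cap)
            for _ in range(runs_per_config)
        )
        mean = total / runs_per_config
        if mean < best_mean:
            best_config, best_mean = config, mean
    return best_config, best_mean


if __name__ == "__main__":
    # Three hypothetical configurations; many perform poorly, the setting in
    # which the paper's adaptive method is shown to be dramatically faster.
    configs = [{"mean_runtime": m} for m in (2.0, 5.0, 30.0)]
    best, mean = naive_configuration(configs, sample_instance=lambda: None)
    print(f"best configuration: {best}, estimated mean capped runtime: {mean:.2f}s")
```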