A Necessary and Sufficient Stability Notion for Adaptive Generalization
Part of: Advances in Neural Information Processing Systems 32 (NeurIPS 2019) pre-proceedings
Conference Event Type: Poster
Abstract
We introduce a new notion of the stability of computations, one that holds under post-processing and adaptive composition. We show that the notion is both necessary and sufficient to ensure generalization in the face of adaptivity, for any computation that responds to bounded-sensitivity linear queries while providing accuracy with respect to the data sample. The stability notion is based on quantifying the effect of observing a computation's outputs on the posterior over the elements of the data sample. We show a separation between this stability notion and previously studied notions, and observe that all differentially private algorithms also satisfy this notion.
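For concreteness, the bounded-sensitivity linear queries referenced above are usually formalized as follows (this is a sketch of the standard definition from the adaptive data analysis literature; the paper's exact parameterization may differ). A linear query is specified by a function $q : \mathcal{X} \to [0,1]$, and its answer on a sample $S = (x_1, \dots, x_n)$ is the empirical average
\[
  q(S) \;=\; \frac{1}{n}\sum_{i=1}^{n} q(x_i),
  \qquad
  \bigl| q(S) - q(S') \bigr| \;\le\; \frac{1}{n}
  \quad \text{whenever } S \text{ and } S' \text{ differ in a single element,}
\]
so each query has sensitivity $1/n$. Generalization in the face of adaptivity then asks that the returned answers remain close to the population values $\mathbb{E}_{x \sim \mathcal{D}}[q(x)]$ even when each query may depend on the answers to the previous ones.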