NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 6133
Title: A Necessary and Sufficient Stability Notion for Adaptive Generalization


The paper introduces a new notion of stability (LSS) and formally shows that it is both necessary and sufficient for generalization in adaptive settings; the necessity result is particularly interesting. The paper also gives a rigorous comparison with the notions of approximate max-information and differential privacy. Overall, the paper makes a clear, solid theoretical contribution to the area of adaptive data analysis.

The paper would have been much stronger had it provided (i) new non-trivial mechanisms achieving LSS, and (ii) extended comparisons with other notions such as typical stability and maximal leakage. As recommended in the post-rebuttal discussion, the authors are encouraged to add at least some discussion of possible directions for developing non-trivial LSS mechanisms and of the general properties they would expect such mechanisms to have. The authors are also encouraged to make a more comprehensive comparison with other existing notions of stability that guarantee generalization in adaptive data analysis.