NeurIPS 2020

Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms

Meta Review

This paper advances a recent line of work on information-theoretic generalization bounds in statistical learning, focusing in particular on mutual information and conditional mutual information, and deriving a new (and sometimes tighter) variant. The reviewers agree that the progress made in this paper is valuable and worthy of publication, and they highlight the application to studying Langevin dynamics as a particularly nice component of the paper. However, they also note that the paper could have been written more accessibly.