NeurIPS 2020

Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Distributions


Meta Review

This paper proposes the concept of "profile entropy" for measuring the complexity of discrete distributions. The authors show that this concept is useful for understanding both the estimation of discrete distributions and the compression of sample profiles. The reviewers agree that the work is novel, useful, and mostly well executed. They suggested a number of changes to the presentation, which the authors have already promised to address in their author feedback. With these changes, the paper should be ready for publication.