NeurIPS 2020

Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification

Meta Review

A high-quality paper that demonstrates how the information-bottleneck principle can be adapted to train invertible neural networks, explores its theoretical and practical implications, and connects it to generative classification. All reviewers agree that the theory is interesting and novel. Well done. The reviews raised some interesting points (e.g., CI(X, Z) as a lower bound and the effect of sigma), which the rebuttal addressed. I would encourage the authors to use the extra ninth page to add these details to the camera-ready version as well.