NeurIPS 2020

Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts


Meta Review

The work has clever ideas and makes considerable advances on Prior Networks, and all reviewers found the paper interesting and well-written. However, there are notable limitations that the work should be honest about, and I strongly recommend that the authors make revisions following the reviews and rebuttal. For example:

+ There are a number of limitations in using encoders and normalizing flows (and aiming to solve class-conditional density estimation) as an alternative to typical classification networks. The paper would benefit considerably from ablations as well as a discussion of these limitations, e.g., how well normalizing flows work here, which choices matter most when others aim to reproduce the work, and how the method scales with the number of classes.

+ Section 4's loss function is uninteresting from a novelty perspective. As R1 states, it is exactly the ELBO with a uniform prior, and I am surprised the authors did not make that connection. I strongly recommend they tone down the significance of this contribution.
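
R1's observation about the loss can be sketched as follows (a hedged reconstruction; the notation $q$, $\mathbf{p}$, and $\mathrm{Dir}$ is assumed here, not quoted from the paper):

```latex
% Sketch (assumed notation): a loss of the form
%   expected cross-entropy under a predicted Dirichlet q(p), plus
%   a KL regularizer toward the uniform Dirichlet Dir(1),
\mathcal{L}
  = \mathbb{E}_{q(\mathbf{p})}\!\left[-\log p(y \mid \mathbf{p})\right]
  + \mathrm{KL}\!\left(q(\mathbf{p}) \,\middle\|\, \mathrm{Dir}(\mathbf{p};\mathbf{1})\right)
% is exactly the negative ELBO,
%   -\mathbb{E}_{q}[\log p(y \mid \mathbf{p})] + \mathrm{KL}(q \,\|\, \text{prior}),
% with the prior chosen as the uniform Dirichlet Dir(1) -- R1's point.
```

Under this reading, minimizing the loss is standard variational inference with a flat Dirichlet prior rather than a new objective.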