NeurIPS 2020

Depth Uncertainty in Neural Networks


Meta Review

This paper proposes to treat the depth of a network as a random variable and marginalize over it to obtain better uncertainty estimates. The authors show that this marginalization can be performed efficiently in a single forward pass, and they demonstrate improved uncertainty estimates on both regression and classification benchmarks (including corrupted versions and out-of-distribution evaluation). The reviewers initially raised several questions, particularly on the experimental setup, the flexibility of the depth posterior, and comparisons to stochastic depth and dropout. During the discussion, the reviewers agreed that the author rebuttal satisfactorily addressed the major concerns, and some of them raised their scores accordingly. Overall, this is a good paper and I recommend acceptance.
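To make the summary concrete, here is a minimal sketch of marginalizing predictions over depth in a single forward pass. The toy architecture, weights, and the uniform depth posterior `q_d` are all illustrative assumptions, not the authors' implementation: an output head is applied to the activation after each block, and the final prediction is the average of the per-depth predictions weighted by the posterior over depth.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Toy dimensions (hypothetical): D blocks, input/hidden/output widths.
D, d_in, d_hid, d_out = 4, 3, 8, 2

W_in = rng.normal(size=(d_in, d_hid))
blocks = [rng.normal(size=(d_hid, d_hid)) / np.sqrt(d_hid) for _ in range(D)]
W_out = rng.normal(size=(d_hid, d_out))  # shared output head
q_d = np.full(D, 1.0 / D)                # posterior over depth (uniform here)

def forward_marginal(x):
    """Single forward pass: record a prediction at every depth,
    then average the predictions under q(d)."""
    h = relu(x @ W_in)
    per_depth = []
    for W in blocks:
        h = relu(h @ W)
        per_depth.append(h @ W_out)      # head after each block
    per_depth = np.stack(per_depth)      # shape (D, d_out)
    return q_d @ per_depth               # marginal prediction, shape (d_out,)

x = rng.normal(size=d_in)
print(forward_marginal(x).shape)
```

The key point reflected here is that all per-depth predictions fall out of one pass through the D blocks, so the marginalization adds essentially no cost over a single deterministic forward pass.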