NeurIPS 2020

Predictive inference is free with the jackknife+-after-bootstrap

Meta Review

This submission is a hard one. Given the reviewer scores and their relatively low confidence in their write-ups, I reviewed this paper myself. (The reviewers all discussed their comments and updated their reviews as well.) I am torn.

On one hand, the reviewers raise valid concerns about the experimental study: the results in the appendix highlight important aspects of the method as applied to non-tree-based learners, yet these are not addressed in the main text. The authors' response to this point is very unsatisfying. ("Not enough space" and "our theory proves everything" are not valid responses to what can come across as cherry-picked results.) A trivial response would have been "yes, we can shorten the unnecessarily verbose Section 2.2 and bring these results into the main text -- and here's how to interpret these graphs." Instead, the authors chose to defend a one-page Section 2.2, which constitutes basic background for anyone interested in reading this paper in the first place.

On the other hand, the method is clever. It can save significant computation, and the theory appears correct. In settings where ensemble learning is already used, this approach seems like a practical method.

In spite of my frustration with the authors' response, I am recommending that this paper be accepted. However, I will be very disappointed if a significant effort is not made to improve the experimental study and to move key experimental results from the appendix into the main text, especially given the extra page that accepted papers will enjoy this year.