NeurIPS 2020

Distribution Aligning Refinery of Pseudo-label for Imbalanced Semi-supervised Learning

Meta Review

This paper proposes an approach to semi-supervised learning under class imbalance. It is indeed non-trivial to combine local/global/perturbation consistency-based semi-supervised methods with fully supervised methods for imbalanced classes; this paper may be the first work in this direction. The method is quite general and can be applied on top of any pseudo-labeling-based semi-supervised method: it first estimates the true class-prior probability and then refines the pseudo labels by pushing their class-prior probability toward the estimate via a constrained convex optimization. While the reviewers initially had some concerns (mainly about clarity and the small number of datasets), the authors did a particularly good job in their rebuttal (showing that the class-prior probability can be estimated rather than having to be given). In the end, all reviewers agreed to accept this paper for publication!

PS: Reviewers all serve voluntarily and for free; they are busy and are not asked to go through everything in the supplementary file. The authors should not treat the supplementary material as an extension of the main paper and rely on all reviewers reading the appendices, because this implicitly breaks the page limit and is extremely unfair to other papers. If the authors consider certain material important (e.g., Appendix F and the experiments on additional datasets), it should be moved to the main file at submission time. A strange organization of the material severely hurts clarity, and without clarity it is meaningless to discuss novelty or significance; the authors should take this as a lesson. Moreover, please be more discreet in using the confidential comments to ACs, since we all serve voluntarily and for free!
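To make the summarized idea concrete, here is a minimal illustrative sketch of aligning pseudo-label distributions to a target class prior. Note this is not the paper's actual constrained convex program; it substitutes a simple Sinkhorn-style alternating normalization, and all names (`align_pseudo_labels`, `target_prior`, `iters`) are hypothetical.

```python
import numpy as np

def align_pseudo_labels(probs, target_prior, iters=50):
    """Push pseudo-label probabilities toward a target class prior.

    probs: (N, K) row-stochastic matrix of pseudo-label probabilities.
    target_prior: (K,) desired class-prior distribution.

    Alternates column scaling (so the average prediction matches the
    prior) with row renormalization (so each row stays a distribution).
    This is a Sinkhorn-style heuristic standing in for the paper's
    constrained convex optimization, not the authors' exact method.
    """
    q = probs.copy()
    for _ in range(iters):
        # Scale each class so the mean pseudo-label matches the prior.
        col_mean = q.mean(axis=0)
        q *= target_prior / np.maximum(col_mean, 1e-12)
        # Renormalize each example's distribution to sum to 1.
        q /= q.sum(axis=1, keepdims=True)
    return q

# Toy example: pseudo-labels biased toward class 0, uniform target prior.
rng = np.random.default_rng(0)
probs = rng.dirichlet([5.0, 1.0, 1.0], size=1000)  # skewed toward class 0
refined = align_pseudo_labels(probs, np.array([1 / 3, 1 / 3, 1 / 3]))
print(refined.mean(axis=0))  # average distribution is now near-uniform
```

In this toy run the refined pseudo-label matrix remains row-stochastic while its class-average is pulled toward the uniform target, which is the qualitative effect the meta-review describes.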