NeurIPS 2020

Comprehensive Attention Self-Distillation for Weakly-Supervised Object Detection

Meta Review

This paper proposes a Comprehensive Attention Self-Distillation (CASD) training method for weakly-supervised object detection. The method is empirically assessed on the COCO and PASCAL VOC benchmarks, where it shows impressive performance. A concern shared across all reviews was the number of hyper-parameters the method introduces. Reviewers found the authors' detailed response very helpful, and some raised their initial scores. Another common, though less critical, concern was the writing, which can be improved for the final version. Overall, the paper is seen as an important contribution, and I therefore recommend acceptance.