This paper presents an approach for distilling self-supervised models. All the reviewers acknowledge that the paper presents a simple approach that outperforms several baselines. There are some concerns regarding: (a) the speed at which the SSL field changes and the applicability of the method to new approaches; (b) the clarity of the tables; (c) the claim of outperforming supervised AlexNet. The authors' rebuttal addressed some of these concerns. The AC agrees with the authors that we should not wait for better models before working on model compression. Based on the discussions, the AC recommends acceptance.