NeurIPS 2020

Robust Correction of Sampling Bias using Cumulative Distribution Functions


Meta Review

The paper extends the statistical invariants idea of Vapnik and Izmailov to the covariate shift setting. The resulting method basically uses the V-matrix of the target distribution while using the source labels. The application is not straightforward; moreover, the authors perform an analysis and discuss the theoretical properties of the method. The resulting method is also empirically evaluated on several classical datasets.

The paper received mixed scores from the reviewers. In summary:
- All reviewers agreed on the novelty, correctness, clarity, and theoretical value of the paper.
- R#1 and R#4 raised issues with the empirical study, stating that the margins are narrow and that the baselines could be improved with more hyper-parameter search.
- R#3 raised issues stating that SVMs are fairly old, that the method should use NNs, and that it should be compared with NN-based methods.

First of all, I disagree with R#3 about extending the method to NNs. Extending the proposed method to NNs is highly non-trivial due to the curse of dimensionality and the heavy use of RKHS/kernels in the V-matrix, which have no counterpart in NNs. The remaining issues are clearly not valid reasons for rejecting a paper, because:
1) As long as there are non-trivial theoretical/algorithmic advancements and the empirical study is solid, we do not need state-of-the-art results with large margins.
2) The proposed method does not use hyper-parameter search, so expecting it from the baselines is not fair.
3) Not all methods need to use neural networks; we still need diverse ideas and models at NeurIPS for a healthy field.