NeurIPS 2020

Meta-Learning with Adaptive Hyperparameters


Meta Review

The reviewers generally agreed that this paper brings an important contribution to the NeurIPS community. The experiments are thorough. The results are quite strong, and were surprising to some reviewers. The approach is also scalable. There are multiple changes that we urge the authors to make for the camera-ready version of the paper:

- The title is far too broad and not informative of the key ideas in the paper. It should be revised so that it conveys that this is a meta-learning paper, and so that it is specific enough that it could not also describe other existing papers. [For example, the current title could describe essentially any gradient-based meta-learning paper.]
- The scalability of the approach was unclear from the text. In the camera-ready version, the authors should include some of the clarifications mentioned in the author response.
- Of course, the new experiments in the author response should also be included in the revised paper. Note that the first-order MAML results in the author response are from an out-of-date version of the Meta-Dataset paper. The camera-ready paper should report the updated results from v4 of the Meta-Dataset arXiv paper (https://arxiv.org/abs/1903.03096). It should also include results from other top-performing methods, including this paper: https://arxiv.org/abs/2003.11539
- For the regression experiments in the rebuttal, the authors should also run a comparison that controls for the total number of parameters. Vanilla MAML often performs significantly better on this problem when using a larger architecture.