NeurIPS 2020

Counterexample-Guided Learning of Monotonic Neural Networks


Meta Review

The reviewers agreed that this was an interesting and novel approach to imposing monotonicity, and that the paper was mostly well-written (although R4’s review contains some suggested improvements). The main criticisms were that (i) the datasets in the experiments were small, and (ii) using an SMT solver at evaluation time might be too expensive for many applications. R3 also noted that the restriction to ReLU networks could be somewhat limiting. The reviewers agreed, however, that these issues were outweighed by the paper’s strengths, and all recommended acceptance. Please read the reviews carefully and take them seriously when revising: the paper is already very good, and while experiments should not of course be overhauled between submission and the final version, implementing some of the reviewers’ suggestions (especially adding a more in-depth discussion of evaluation-time costs and their impact on real-world systems) could improve it even further.