NeurIPS 2020

Learning Restricted Boltzmann Machines with Sparse Latent Variables

Meta Review

This paper presents an algorithm for provably learning RBMs in which each visible node is connected to a small number of hidden units, and proves bounds that improve over previous results in a specific regime. While the reviewers agree that the results appear sound, the paper does little to convince them of the significance of this regime, and reviewer requests for additional intuition were not effectively addressed in the author response. On balance, though, the work appears novel and sound, and the consensus is in favor of acceptance. I would strongly encourage the authors to address R2's questions. (R2's response after rebuttal: "I didn't find the response very helpful, unfortunately. I will have to look at the paper again, but my intuition was that $s$ should bound the size of the Markov blanket, which should lead to better than $O(n^d)$ scaling, and they don't seem to have addressed this except to say it doesn't.")