NeurIPS 2020
### Learning Some Popular Gaussian Graphical Models without Condition Number Bounds

### Meta Review

This paper studies the well-known problem of structure learning in Gaussian graphical models: recovering the zero/non-zero pattern of the precision matrix (the inverse covariance matrix) of an unknown Gaussian distribution from samples. All known efficient algorithms for this problem have running time and sample complexity that depend on the condition number of the unknown covariance matrix. This paper gives an efficient algorithm whose guarantees do not depend on that condition number, for covariance matrices satisfying certain structural assumptions (walk-summability, attractiveness) under which the covariance can still be arbitrarily ill-conditioned.

The reviewers (with clarifications from the author feedback) were convinced of both the motivation and the non-triviality of the assumptions made, and found the algorithmic contributions of this paper important.
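To make the task concrete, here is a minimal sketch (not the paper's algorithm) of what "learning the zero/non-zero structure of the precision matrix" means, using a hypothetical 4-node chain graph and the naive invert-and-threshold baseline. The graph, sample size, and threshold are all illustrative choices; this baseline's sample complexity degrades with the condition number, which is precisely the dependence the paper removes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a 4-node chain graph. The precision matrix Theta
# is tridiagonal, so the "structure" to learn is its off-diagonal support.
theta = np.array([
    [ 2.0, -0.8,  0.0,  0.0],
    [-0.8,  2.0, -0.8,  0.0],
    [ 0.0, -0.8,  2.0, -0.8],
    [ 0.0,  0.0, -0.8,  2.0],
])
sigma = np.linalg.inv(theta)  # covariance of the underlying Gaussian

# Draw samples and form the empirical covariance.
n = 20000
x = rng.multivariate_normal(mean=np.zeros(4), cov=sigma, size=n)
sigma_hat = (x.T @ x) / n

# Naive structure estimate: invert the empirical covariance and threshold.
# (Illustrative threshold; a principled choice depends on n and the model.)
theta_hat = np.linalg.inv(sigma_hat)
support = np.abs(theta_hat) > 0.3
print(support.astype(int))
```

With enough samples, the thresholded support recovers the chain's tridiagonal pattern; for ill-conditioned covariances, however, this approach needs many more samples, motivating the condition-number-free guarantees above.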
This paper makes progress on a fundamental problem in graphical model learning. I am pleased to recommend accepting it to the NeurIPS program.