Model Selection in Gaussian Graphical Models: High-Dimensional Consistency of $\ell_1$-regularized MLE

Part of Advances in Neural Information Processing Systems 21 (NIPS 2008)


Authors

Garvesh Raskutti, Bin Yu, Martin J. Wainwright, Pradeep Ravikumar

Abstract

We consider the problem of estimating the graph structure associated with a Gaussian Markov random field (GMRF) from i.i.d. samples. We study the performance of the $\ell_1$-regularized maximum likelihood estimator in the high-dimensional setting, where the number of nodes in the graph p, the number of edges in the graph s, and the maximum node degree d are allowed to grow as a function of the number of samples n. Our main result provides sufficient conditions on (n, p, d) for the $\ell_1$-regularized MLE to recover all the edges of the graph with high probability. Under some conditions on the model covariance, we show that model selection can be achieved for sample sizes n = Ω(d^2 log(p)), with the error decaying as O(exp(-c log(p))) for some constant c. We illustrate our theoretical results via simulations and show good correspondence between the theoretical predictions and the behavior observed in simulations.
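The abstract describes recovering the edge set of a sparse GMRF by the $\ell_1$-regularized MLE as the sample size scales like d^2 log(p). Below is a minimal simulation sketch, not the authors' code: it uses scikit-learn's GraphicalLasso as a stand-in for the $\ell_1$-regularized MLE, a simple chain-graph precision matrix, and illustrative choices of the regularization level alpha and the constant in front of d^2 log(p); none of these specific values come from the paper.

```python
# Illustrative sketch (assumed setup, not the paper's experiments):
# estimate the graph of a sparse GMRF with the l1-regularized MLE
# (graphical lasso) and check whether the true edge set is recovered.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p, d = 50, 2                      # nodes and maximum node degree (chain graph)
n = int(10 * d**2 * np.log(p))    # sample size scaling n ~ d^2 log(p)

# Build a sparse precision matrix Theta for a chain graph: Theta[i, i+1] != 0.
Theta = np.eye(p)
for i in range(p - 1):
    Theta[i, i + 1] = Theta[i + 1, i] = 0.4
Sigma = np.linalg.inv(Theta)      # model covariance

# Draw i.i.d. Gaussian samples and fit the l1-regularized MLE.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
est = GraphicalLasso(alpha=0.05).fit(X)   # alpha is an illustrative choice

# Compare estimated and true edge sets (off-diagonal support of the precision).
true_edges = np.abs(Theta) > 1e-8
est_edges = np.abs(est.precision_) > 1e-3
np.fill_diagonal(true_edges, False)
np.fill_diagonal(est_edges, False)
print("edges recovered exactly:", np.array_equal(true_edges, est_edges))
```

Repeating this over a range of n for fixed (p, d) is one way to visualize the kind of sample-size threshold for exact edge recovery that the theoretical result predicts.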