Bayesian Estimation of Gaussian Graphical Models with Projection Predictive Selection

17 Jan 2018  ·  Donald R. Williams, Juho Piironen, Aki Vehtari, Philippe Rast

Gaussian graphical models are used for determining conditional relationships between variables. This is accomplished by identifying off-diagonal elements of the inverse-covariance (precision) matrix that are non-zero. When the ratio of variables (p) to observations (n) approaches one, the maximum likelihood estimator of the covariance matrix becomes unstable and requires shrinkage estimation. Whereas several classical (frequentist) methods have been introduced to address this issue, Bayesian methods remain relatively uncommon in both applied practice and the methodological literature. Here we introduce a Bayesian method for estimating sparse precision matrices, in which conditional relationships are determined with projection predictive selection. This method uses Kullback-Leibler divergence and cross-validation for variable selection, in addition to the horseshoe prior for regularization. Through simulation and an applied example, we demonstrate that the proposed method often outperforms classical methods, such as the graphical lasso, as well as an alternative Bayesian method, with respect to edge identification and frequentist risk. Further, projection predictive selection consistently had the lowest false positive rate, with both simulated and real data. We end by discussing future directions and contributions to the Bayesian literature on the topic of sparsity.
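The abstract's first claim — that conditional relationships correspond to non-zero off-diagonal elements of the precision matrix — can be illustrated with a minimal NumPy sketch. This is not the paper's method; it only shows, for a hypothetical sparse precision matrix, how partial correlations (and hence the edge set of the graph) are read off the inverse covariance:

```python
import numpy as np

# In a Gaussian graphical model, variables i and j are conditionally
# independent given all others iff the precision matrix entry
# Theta[i, j] is zero. Here we use a made-up sparse 4x4 precision
# matrix with edges (0,1) and (1,2) only.
Theta = np.array([
    [2.0, 0.6, 0.0, 0.0],
    [0.6, 2.0, 0.5, 0.0],
    [0.0, 0.5, 2.0, 0.0],
    [0.0, 0.0, 0.0, 2.0],
])

Sigma = np.linalg.inv(Theta)  # implied (dense) covariance matrix

# Partial correlations: rho_ij = -Theta_ij / sqrt(Theta_ii * Theta_jj).
d = np.sqrt(np.diag(Theta))
partial_corr = -Theta / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# Recover the edge set from the non-zero off-diagonal entries.
edges = [(i, j) for i in range(4) for j in range(i + 1, 4)
         if abs(partial_corr[i, j]) > 1e-10]
print(edges)  # [(0, 1), (1, 2)]
```

In practice Theta is unknown and must be estimated from data; the instability of the sample-covariance inverse when p/n approaches one is precisely what motivates the shrinkage and selection methods the paper compares.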
