glasso
The graphical lasso: new insights and alternatives. The graphical lasso [5] is an algorithm for learning the structure of an undirected Gaussian graphical model, using ℓ₁ regularization to control the number of zeros in the precision matrix Θ = Σ⁻¹ [2, 11]. The R package glasso [5] is popular, fast, and allows one to efficiently build a path of models for different values of the tuning parameter. Convergence of glasso can be tricky: the converged precision matrix might not be the inverse of the estimated covariance, and occasionally the algorithm fails to converge with warm starts. In this paper we explain this behavior and propose new algorithms that appear to outperform glasso. By studying the “normal equations” we see that glasso solves the dual of the graphical lasso penalized likelihood by block coordinate ascent, a result that can also be found in [2]. In this dual, the target of estimation is the covariance matrix Σ rather than the precision matrix Θ. We propose analogous primal algorithms, p-glasso and dp-glasso, which also operate by block coordinate descent but take Θ as the optimization target. We study all of these algorithms, and in particular the different approaches to solving their coordinate subproblems, and conclude that dp-glasso is superior from several points of view.
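The software described here is the R package glasso. As a minimal illustration of the same ℓ₁-penalized Gaussian likelihood problem, the sketch below uses scikit-learn's GraphicalLasso, an independent Python implementation (not the p-glasso or dp-glasso algorithms proposed in this paper); the synthetic data and the choice of alpha are arbitrary for demonstration.

```python
# Sketch: estimating a sparse precision matrix Theta = inv(Sigma) by
# l1-penalized Gaussian maximum likelihood, via scikit-learn's
# GraphicalLasso (a stand-in for the R package glasso).
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Synthetic data: 5 variables, with dependence induced between vars 0 and 1.
n, p = 500, 5
Theta_true = np.eye(p)
Theta_true[0, 1] = Theta_true[1, 0] = 0.5   # nonzero precision entry
Sigma_true = np.linalg.inv(Theta_true)
X = rng.standard_normal((n, p)) @ np.linalg.cholesky(Sigma_true).T

# Larger alpha -> stronger l1 penalty -> more zeros in the precision matrix.
model = GraphicalLasso(alpha=0.2).fit(X)
Theta = model.precision_    # sparse estimate of Sigma^{-1}
Sigma = model.covariance_   # the paired covariance estimate

print(np.round(Theta, 2))
```

A path of models over the tuning parameter, as the abstract mentions, would be obtained by refitting over a decreasing grid of alpha values (scikit-learn's GraphicalLassoCV automates this with cross-validation). Note that, exactly as the paper warns for glasso, the returned precision and covariance estimates need not be exact matrix inverses of each other at a given tolerance.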
References in zbMATH (referenced in 315 articles, 1 standard article)
Showing results 1 to 20 of 315, sorted by year.
- Chen, Jingnan; Dai, Gengling; Zhang, Ning: An application of sparse-group Lasso regularization to equity portfolio optimization and sector selection (2020)
- Fang, Qian; Yu, Chen; Zhang, Weiping: Regularized estimation of precision matrix for high-dimensional multivariate longitudinal data (2020)
- Farnè, Matteo; Montanari, Angela: A large covariance matrix estimator under intermediate spikiness regimes (2020)
- Nystrup, Peter; Lindström, Erik; Pinson, Pierre; Madsen, Henrik: Temporal hierarchies with autocorrelation for load forecasting (2020)
- Pan, Yuqing; Mai, Qing: Efficient computation for differential network analysis with applications to quadratic discriminant analysis (2020)
- Talukdar, Saurav; Deka, Deepjyoti; Doddi, Harish; Materassi, Donatello; Chertkov, Michael; Salapaka, Murti V.: Physics informed topology learning in networks of linear dynamical systems (2020)
- Abbruzzo, Antonino; Vujačić, Ivan; Mineo, Angelo M.; Wit, Ernst C.: Selecting the tuning parameter in penalized Gaussian graphical models (2019)
- Audouze, Christophe; Nair, Prasanth B.: Sparse low-rank separated representation models for learning from data (2019)
- Banerjee, Sayantan; Akbani, Rehan; Baladandayuthapani, Veerabhadran: Spectral clustering via sparse graph structure learning with application to proteomic signaling networks in cancer (2019)
- Bashir, Amir; Carvalho, Carlos M.; Hahn, P. Richard; Jones, M. Beatrix: Post-processing posteriors over precision matrices to produce sparse graph estimates (2019)
- Belomestny, Denis; Trabs, Mathias; Tsybakov, Alexandre B.: Sparse covariance matrix estimation in high-dimensional deconvolution (2019)
- Bhadra, Anindya; Datta, Jyotishka; Polson, Nicholas G.; Willard, Brandon: Lasso meets horseshoe: a survey (2019)
- Bien, Jacob: Graph-guided banding of the covariance matrix (2019)
- Bollhöfer, Matthias; Eftekhari, Aryan; Scheidegger, Simon; Schenk, Olaf: Large-scale sparse inverse covariance matrix estimation (2019)
- Brzyski, Damian; Gossmann, Alexej; Su, Weijie; Bogdan, Małgorzata: Group SLOPE -- adaptive selection of groups of predictors (2019)
- Castelletti, Federico; Consonni, Guido: Objective Bayes model selection of Gaussian interventional essential graphs for the identification of signaling pathways (2019)
- Celeux, Gilles; Maugis-Rabusseau, Cathy; Sedki, Mohammed: Variable selection in model-based clustering and discriminant analysis with a regularization approach (2019)
- Chakrabarti, Arnab; Sen, Rituparna: Some statistical problems with high dimensional financial data (2019)
- Chen, Li-Pang; Yi, Grace Y.; Zhang, Qihuang; He, Wenqing: Multiclass analysis and prediction with network structured covariates (2019)
- Choi, Young-Geun; Lim, Johan; Roy, Anindya; Park, Junyong: Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage (2019)