glasso
The graphical lasso: new insights and alternatives. The graphical lasso [5] is an algorithm for learning the structure in an undirected Gaussian graphical model, using ℓ1 regularization to control the number of zeros in the precision matrix Θ = Σ⁻¹ [2, 11]. The R package glasso [5] is popular, fast, and allows one to efficiently build a path of models for different values of the tuning parameter. Convergence of glasso can be tricky; the converged precision matrix might not be the inverse of the estimated covariance, and occasionally it fails to converge with warm starts. In this paper we explain this behavior and propose new algorithms that appear to outperform glasso. By studying the "normal equations" we see that glasso is solving the dual of the graphical lasso penalized likelihood by block coordinate ascent, a result that can also be found in [2]. In this dual, the target of estimation is Σ, the covariance matrix, rather than the precision matrix Θ. We propose similar primal algorithms p-glasso and dp-glasso, which also operate by block-coordinate descent, but where Θ is the optimization target. We study all of these algorithms, and in particular different approaches to solving their coordinate sub-problems. We conclude that dp-glasso is superior from several points of view.
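The estimation problem the abstract describes can be illustrated without the R package itself. Below is a minimal sketch in Python using scikit-learn's `GraphicalLasso` estimator, an independent implementation of the same ℓ1-penalized precision-matrix estimation (not the glasso R package or the paper's p-glasso/dp-glasso algorithms). The chain-graph setup and the penalty value `alpha=0.1` are illustrative choices, not from the source.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Sample from a 5-dimensional Gaussian whose true precision matrix is
# tridiagonal (a chain graph), so most conditional dependencies are zero.
n, p = 500, 5
Theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# alpha is the l1 tuning parameter: larger values shrink more
# off-diagonal entries of the estimated precision matrix to exactly zero.
model = GraphicalLasso(alpha=0.1).fit(X)
Theta_hat = model.precision_    # estimate of the precision matrix Sigma^{-1}
Sigma_hat = model.covariance_   # the accompanying covariance estimate
```

Refitting over a grid of `alpha` values traces out the path of models mentioned in the abstract; the zero pattern of `Theta_hat` is read off as the estimated graph structure.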
References in zbMATH (referenced in 456 articles, 1 standard article)
Showing results 1 to 20 of 456, sorted by year.
- Dong, Yan; Li, Daoji; Zheng, Zemin; Zhou, Jia: Reproducible feature selection in high-dimensional accelerated failure time models (2022)
- Gillard, Jonathan; O’Riordan, Emily; Zhigljavsky, Anatoly: Simplicial and minimal-variance distances in multivariate data analysis (2022)
- Hamada, Naoki; Ichiki, Shunsuke: Free disposal hull condition to verify when efficiency coincides with weak efficiency (2022)
- Nguyen, Viet Anh; Kuhn, Daniel; Esfahani, Peyman Mohajerin: Distributionally robust inverse covariance estimation: the Wasserstein shrinkage estimator (2022)
- Werner, Tino: Asymptotic linear expansion of regularized M-estimators (2022)
- Barceló, Pablo; Baumgartner, Alexander; Dalmau, Victor; Kimelfeld, Benny: Regularizing conjunctive features for classification (2021)
- Bartlett, Thomas E.; Kosmidis, Ioannis; Silva, Ricardo: Two-way sparsity for time-varying networks with applications in genomics (2021)
- Bian, Fengmiao; Liang, Jingwei; Zhang, Xiaoqun: A stochastic alternating direction method of multipliers for non-smooth and non-convex optimization (2021)
- Burkina, M.; Nazarov, I.; Panov, M.; Fedonin, G.; Shirokikh, B.: Inductive matrix completion with feature selection (2021)
- Byrd, Michael; Nghiem, Linh H.; McGee, Monnie: Bayesian regularization of Gaussian graphical models with measurement error (2021)
- Fan, Xinyan; Zhang, Qingzhao; Ma, Shuangge; Fang, Kuangnan: Conditional score matching for high-dimensional partial graphical models (2021)
- Ha, Min Jin; Stingo, Francesco Claudio; Baladandayuthapani, Veerabhadran: Bayesian structure learning in multilayered genomic networks (2021)
- Huling, Jared D.; Smith, Maureen A.; Chen, Guanhua: A two-part framework for estimating individualized treatment rules from semicontinuous outcomes (2021)
- Kashlak, Adam B.: Non-asymptotic error controlled sparse high dimensional precision matrix estimation (2021)
- Kereta, Željko; Klock, Timo: Estimating covariance and precision matrices along subspaces (2021)
- Khalili, Atefeh; Eskandari, Farzad; Nematollahi, Nader: Estimation of undirected graph with finite mixture of nonparanormal distribution (2021)
- Lapucci, Matteo; Levato, Tommaso; Sciandrone, Marco: Convergent inexact penalty decomposition methods for cardinality-constrained problems (2021)
- Laverny, Oskar; Masiello, Esterina; Maume-Deschamps, Véronique; Rullière, Didier: Dependence structure estimation using copula recursive trees (2021)
- Lee, Kyoungjae; Cao, Xuan: Bayesian inference for high-dimensional decomposable graphs (2021)
- Li, Yaguang; Xu, Wei; Gao, Xin: Graphical-model based high dimensional generalized linear models (2021)