glmnet

R package glmnet: Lasso and elastic-net regularized generalized linear models. Extremely efficient procedures for fitting the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression models, Poisson regression and the Cox model. Two recent additions are the multiresponse Gaussian and the grouped multinomial models. The algorithm uses cyclical coordinate descent in a pathwise fashion, as described in the paper listed below.
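
As an illustration, the following is a minimal sketch of a typical glmnet workflow in R; the data are simulated purely for demonstration, and only standard exported functions of the package (glmnet, cv.glmnet, coef, predict) are used.

  # Minimal illustrative sketch: lasso fit on simulated data (not from the source).
  library(glmnet)

  set.seed(1)
  n <- 100; p <- 20
  x <- matrix(rnorm(n * p), n, p)              # predictor matrix
  beta <- c(rep(2, 5), rep(0, p - 5))          # sparse true coefficient vector
  y <- drop(x %*% beta + rnorm(n))             # Gaussian response

  # Fit the full lasso path (alpha = 1); 0 < alpha < 1 gives the elastic net.
  fit <- glmnet(x, y, family = "gaussian", alpha = 1)
  plot(fit, xvar = "lambda")                   # coefficient paths versus log(lambda)

  # Cross-validation to select the penalty parameter lambda.
  cvfit <- cv.glmnet(x, y, alpha = 1)
  coef(cvfit, s = "lambda.min")                # coefficients at the CV-optimal lambda
  predict(cvfit, newx = x[1:5, ], s = "lambda.1se")

Other response types mentioned above are selected through the family argument (for example family = "binomial", "multinomial", "poisson", "cox", or "mgaussian").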


References in zbMATH (referenced in 136 articles)

Showing results 1 to 20 of 136, sorted by year (citations).


  1. Chow, Yat Tin; Wu, Tianyu; Yin, Wotao: Cyclic coordinate-update algorithms for fixed-point problems: analysis and applications (2017)
  2. Mak, Simon; Wu, C. F. Jeff: cmenet: a new method for bi-level variable selection of conditional main effects (2017) arXiv
  3. Angelopoulos, Nicos; Abdallah, Samer; Giamas, Georgios: Advances in integrative statistics for logic programming (2016)
  4. Beinrucker, Andre; Dogan, Ürün; Blanchard, Gilles: Extensions of stability selection using subsamples of observations and covariates (2016)
  5. Blum, Yuna; Houée-Bigot, Magalie; Causeur, David: Sparse factor model for co-expression networks with an application using prior biological knowledge (2016)
  6. Fitzpatrick, Trevor; Mues, Christophe: An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market (2016)
  7. Fountoulakis, Kimon; Gondzio, Jacek: Performance of first- and second-order methods for $\ell_1$-regularized least squares problems (2016)
  8. Frandi, Emanuele; Ñanculef, Ricardo; Lodi, Stefano; Sartori, Claudio; Suykens, Johan A. K.: Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee (2016)
  9. Furmańczyk, Konrad: Variable selection using stepdown procedures in high-dimensional linear models (2016)
  10. Gillberg, Jussi; Marttinen, Pekka; Pirinen, Matti; Kangas, Antti J.; Soininen, Pasi; Ali, Mehreen; Havulinna, Aki S.; Järvelin, Marjo-Riitta; Ala-Korpela, Mika; Kaski, Samuel: Multiple output regression with latent noise (2016)
  11. Guhaniyogi, Rajarshi; Dunson, David B.: Compressed Gaussian process for manifold regression (2016)
  12. Kwemou, Marius: Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model (2016)
  13. Laurin, Charles; Boomsma, Dorret; Lubke, Gitta: The use of vector bootstrapping to improve variable selection precision in Lasso models (2016)
  14. Neykov, Matey; Liu, Jun S.; Cai, Tianxi: On the characterization of a class of Fisher-consistent loss functions and its application to boosting (2016)
  15. Oneto, Luca; Ridella, Sandro; Anguita, Davide: Tikhonov, Ivanov and Morozov regularization for support vector machine learning (2016)
  16. Perthame, Émeline; Friguet, Chloé; Causeur, David: Stability of feature selection in classification issues for high-dimensional correlated data (2016)
  17. Pillonetto, Gianluigi; Chen, Tianshi; Chiuso, Alessandro; De Nicolao, Giuseppe; Ljung, Lennart: Regularized linear system identification using atomic, nuclear and kernel-based norms: the role of the stability constraint (2016)
  18. Scheinberg, Katya; Tang, Xiaocheng: Practical inexact proximal quasi-Newton method with global complexity analysis (2016)
  19. Shah, Rajen D.: Modelling interactions in high-dimensional data with backtracking (2016)
  20. Teisseyre, Paweł; Kłopotek, Robert A.; Mielniczuk, Jan: Random subspace method for high-dimensional regression with the R package regRSM (2016)
