glmnet: Lasso and elastic-net regularized generalized linear models. Extremely efficient procedures for fitting the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression models, Poisson regression, and the Cox model. Two recent additions are the multiresponse Gaussian model and the grouped multinomial model. The algorithm uses cyclical coordinate descent in a pathwise fashion, as described in the paper listed below.
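To illustrate the pathwise cyclical coordinate descent strategy the description refers to, here is a minimal Python sketch for the plain lasso (squared-error loss, `alpha = 1` in glmnet's terms). It is not the glmnet implementation itself, just the core idea: solve for a decreasing sequence of penalties, warm-starting each fit from the previous one, and cycling coordinate-wise soft-threshold updates. Function names (`soft_threshold`, `lasso_path_cd`) are illustrative, and the predictors are assumed standardized.

```python
import numpy as np

def soft_threshold(z, g):
    """Soft-thresholding operator S(z, g) = sign(z) * max(|z| - g, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def lasso_path_cd(X, y, n_lambda=20, eps=1e-3, n_iter=200):
    """Pathwise cyclical coordinate descent for the lasso, minimizing
    (1/2n)||y - X b||^2 + lambda * ||b||_1.
    Assumes the columns of X are standardized (mean 0, unit variance)."""
    n, p = X.shape
    # Smallest penalty at which the solution is entirely zero.
    lam_max = np.max(np.abs(X.T @ y)) / n
    lambdas = lam_max * np.logspace(0, np.log10(eps), n_lambda)
    betas = np.zeros((n_lambda, p))
    beta = np.zeros(p)   # warm start carried along the path
    r = y.copy()         # running residual y - X @ beta
    for k, lam in enumerate(lambdas):
        for _ in range(n_iter):
            for j in range(p):
                # Partial residual adds back the current contribution of x_j.
                z = (X[:, j] @ r) / n + beta[j]
                b_new = soft_threshold(z, lam)
                if b_new != beta[j]:
                    r -= X[:, j] * (b_new - beta[j])
                    beta[j] = b_new
        betas[k] = beta.copy()
    return lambdas, betas
```

The warm starts are what make the pathwise scheme fast: solutions change little between adjacent penalty values, so each fit typically needs only a few sweeps. The real package adds an elastic-net mixing parameter, active-set and strong-rule screening, and iteratively reweighted updates for the non-Gaussian families.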

References in zbMATH (referenced in 133 articles)

Showing results 1 to 20 of 133.
Sorted by year (citations)


  1. Angelopoulos, Nicos; Abdallah, Samer; Giamas, Georgios: Advances in integrative statistics for logic programming (2016)
  2. Beinrucker, Andre; Dogan, Ürün; Blanchard, Gilles: Extensions of stability selection using subsamples of observations and covariates (2016)
  3. Blum, Yuna; Houée-Bigot, Magalie; Causeur, David: Sparse factor model for co-expression networks with an application using prior biological knowledge (2016)
  4. Fitzpatrick, Trevor; Mues, Christophe: An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market (2016)
  5. Fountoulakis, Kimon; Gondzio, Jacek: Performance of first- and second-order methods for $\ell_1$-regularized least squares problems (2016)
  6. Frandi, Emanuele; Ñanculef, Ricardo; Lodi, Stefano; Sartori, Claudio; Suykens, Johan A. K.: Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee (2016)
  7. Furmańczyk, Konrad: Variable selection using stepdown procedures in high-dimensional linear models (2016)
  8. Gillberg, Jussi; Marttinen, Pekka; Pirinen, Matti; Kangas, Antti J.; Soininen, Pasi; Ali, Mehreen; Havulinna, Aki S.; Järvelin, Marjo-Riitta; Ala-Korpela, Mika; Kaski, Samuel: Multiple output regression with latent noise (2016)
  9. Guhaniyogi, Rajarshi; Dunson, David B.: Compressed Gaussian process for manifold regression (2016)
  10. Kwemou, Marius: Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model (2016)
  11. Laurin, Charles; Boomsma, Dorret; Lubke, Gitta: The use of vector bootstrapping to improve variable selection precision in Lasso models (2016)
  12. Neykov, Matey; Liu, Jun S.; Cai, Tianxi: On the characterization of a class of Fisher-consistent loss functions and its application to boosting (2016)
  13. Oneto, Luca; Ridella, Sandro; Anguita, Davide: Tikhonov, Ivanov and Morozov regularization for support vector machine learning (2016)
  14. Perthame, Émeline; Friguet, Chloé; Causeur, David: Stability of feature selection in classification issues for high-dimensional correlated data (2016)
  15. Pillonetto, Gianluigi; Chen, Tianshi; Chiuso, Alessandro; De Nicolao, Giuseppe; Ljung, Lennart: Regularized linear system identification using atomic, nuclear and kernel-based norms: the role of the stability constraint (2016)
  16. Scheinberg, Katya; Tang, Xiaocheng: Practical inexact proximal quasi-Newton method with global complexity analysis (2016)
  17. Shah, Rajen D.: Modelling interactions in high-dimensional data with backtracking (2016)
  18. Teisseyre, Paweł; Kłopotek, Robert A.; Mielniczuk, Jan: Random subspace method for high-dimensional regression with the R package regRSM (2016)
  19. Treister, Eran; Turek, Javier S.; Yavneh, Irad: A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression (2016)
  20. Tutz, Gerhard; Koch, Dominik: Improved nearest neighbor classifiers by weighting and selection of predictors (2016)
