glmnet

R package glmnet: Lasso and elastic-net regularized generalized linear models. Extremely efficient procedures for fitting the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression models, Poisson regression, and the Cox model. Two recent additions are the multi-response Gaussian and the grouped multinomial families. The algorithm uses cyclical coordinate descent in a pathwise fashion, as described in the standard article referenced below.
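A minimal usage sketch in R illustrating the pathwise fit and cross-validated lambda selection; the data are simulated here purely for illustration, while glmnet, cv.glmnet, plot, and coef are the package's documented interface:

    library(glmnet)

    set.seed(1)
    n <- 100; p <- 20
    x <- matrix(rnorm(n * p), n, p)                   # predictor matrix
    y <- drop(x[, 1:3] %*% c(2, -1, 0.5)) + rnorm(n)  # sparse true signal

    # Fit the full lasso path (alpha = 1); alpha = 0 gives ridge,
    # intermediate values give the elastic net.
    fit <- glmnet(x, y, family = "gaussian", alpha = 1)
    plot(fit, xvar = "lambda")                        # coefficient paths over lambda

    # Choose lambda by 10-fold cross-validation
    cvfit <- cv.glmnet(x, y, alpha = 1)
    coef(cvfit, s = "lambda.min")                     # coefficients at the CV-optimal lambda

The same calls apply to the other families by changing the family argument (e.g. family = "binomial" for logistic regression or family = "cox" for the Cox model).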


References in zbMATH (referenced in 432 articles, 1 standard article)

Showing results 1 to 20 of 432, sorted by year (citations).


  1. Bertsimas, Dimitris; van Parys, Bart: Sparse high-dimensional regression: exact scalable algorithms and phase transitions (2020)
  2. Boehmke, Brad; Greenwell, Brandon M.: Hands-on machine learning with R (2020)
  3. Canhong Wen, Aijun Zhang, Shijie Quan, Xueqin Wang: BeSS: An R Package for Best Subset Selection in Linear, Logistic and Cox Proportional Hazards Models (2020) not zbMATH
  4. Cao, Xuan; Khare, Kshitij; Ghosh, Malay: High-dimensional posterior consistency for hierarchical non-local priors in regression (2020)
  5. Chavez, Gordon V.: Dynamic tail inference with log-Laplace volatility (2020)
  6. Chen, Kedong; Li, William; Wang, Sijian: An easy-to-implement hierarchical standardization for variable selection under strong heredity constraint (2020)
  7. Choiruddin, Achmad; Cuevas-Pacheco, Francisco; Coeurjolly, Jean-François; Waagepetersen, Rasmus: Regularized estimation for highly multivariate log Gaussian Cox processes (2020)
  8. Fan, Jianqing; Ke, Yuan; Wang, Kaizheng: Factor-adjusted regularized model selection (2020)
  9. Feng, Yang; Liu, Qingfeng; Okui, Ryo: On the sparsity of Mallows model averaging estimator (2020)
  10. Furmańczyk, Konrad; Rejchel, Wojciech: High-dimensional linear model selection motivated by multiple testing (2020)
  11. García-Portugués, Eduardo; Álvarez-Liébana, Javier; Álvarez-Pérez, Gonzalo; González-Manteiga, Wenceslao: Goodness-of-fit tests for functional linear models based on integrated projections (2020)
  12. Gold, David; Lederer, Johannes; Tao, Jing: Inference for high-dimensional instrumental variables regression (2020)
  13. Huang, Yimin; Kong, Xiangshun; Ai, Mingyao: Optimal designs in sparse linear models (2020)
  14. James, Gareth M.; Paulson, Courtney; Rusmevichientong, Paat: Penalized and constrained optimization: an application to high-dimensional website advertising (2020)
  15. Jeon, Jong-June; Kim, Yongdai; Won, Sungho; Choi, Hosik: Primal path algorithm for compositional data analysis (2020)
  16. Lai, Yuanhao; McLeod, Ian: Ensemble quantile classifier (2020)
  17. Liu, Wenchen; Tang, Yincai; Wu, Xianyi: Separating variables to accelerate non-convex regularized optimization (2020)
  18. Mainak Jas; Titipat Achakulvisut; Aid Idrizović; Daniel E. Acuna; Matthew Antalek; Vinicius Marques; Tommy Odland; Ravi Prakash Garg; Mayank Agrawal; Yu Umegaki; Peter Foley; Hugo L Fernandes; Drew Harris; Beibin Li; Olivier Pieters; Scott Otterson; Giovanni De Toni; Chris Rodgers; Eva Dyer; Matti Hamalainen; Konrad Kording; Pavan Ramkumar: Pyglmnet: Python implementation of elastic-net regularized generalized linear models (2020) not zbMATH
  19. Nikooienejad, Amir; Wang, Wenyi; Johnson, Valen E.: Bayesian variable selection for survival data using inverse moment priors (2020)
  20. Oda, Ryoya; Yanagihara, Hirokazu: A fast and consistent variable selection method for high-dimensional multivariate linear regression with a large number of explanatory variables (2020)
