SLEP: Sparse Learning with Efficient Projections.

Main features:
1) First-order method. Each iteration evaluates only the function value and the gradient, so the algorithms can handle large-scale sparse data.
2) Optimal convergence rate. The convergence rate O(1/k²) is optimal for smooth convex optimization via first-order black-box methods.
3) Efficient projection. The projection problem (proximal operator) can be solved efficiently.
4) Pathwise solutions. The SLEP package provides functions that efficiently compute the pathwise solutions corresponding to a series of regularization parameters by the "warm-start" technique.
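To make the four features concrete, here is a minimal sketch (not SLEP's actual code, which is a MATLAB package) of the general recipe they describe, applied to the lasso: an accelerated proximal-gradient (FISTA-style) loop whose projection step is the soft-thresholding proximal operator, plus a warm-started sweep over a decreasing sequence of regularization parameters. All function names here are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrinks each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, x0=None, max_iter=500, tol=1e-8):
    # Accelerated proximal gradient for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # First-order: each iteration needs one gradient of the smooth part
    # and one cheap proximal ("projection") step.
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(n) if x0 is None else x0.copy()
    y, t = x.copy(), 1.0
    for _ in range(max_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Nesterov momentum step: yields the optimal O(1/k^2) rate.
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x, t = x_new, t_new
    return x

def lasso_path(A, b, lams):
    # "Warm start": solve for decreasing lambdas, seeding each solve
    # with the previous solution, which greatly reduces iterations.
    x, path = None, []
    for lam in sorted(lams, reverse=True):
        x = fista_lasso(A, b, lam, x0=x)
        path.append((lam, x.copy()))
    return path
```

The same template covers the other regularizers SLEP supports (group lasso, fused lasso, tree-structured norms): only the proximal operator changes, which is why an efficient projection routine is the central building block.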

References in zbMATH (referenced in 42 articles)

Showing results 1 to 20 of 42.
Sorted by year (citations)


  1. Curtis, Frank E.; Dai, Yutong; Robinson, Daniel P.: A subspace acceleration method for minimization involving a group sparsity-inducing regularizer (2022)
  2. Feng, Xiaodong; Wu, Sen: Robust sparse coding via self-paced learning for data representation (2021)
  3. Li, Mei; Kong, Lingchen; Su, Zhihua: Double fused Lasso regularized regression with both matrix and vector valued predictors (2021)
  4. Wang, Rui; Xiu, Naihua; Zhou, Shenglong: An extended Newton-type algorithm for ℓ₂-regularized sparse logistic regression and its efficiency for classifying large-scale datasets (2021)
  5. Jeong, Jun-Yong; Kang, Ju-Seok; Jun, Chi-Hyuck: Regularization-based model tree for multi-output regression (2020)
  6. Won, Daehan; Manzour, Hasan; Chaovalitwongse, Wanpracha: Convex optimization for group feature selection in networked data (2020)
  7. Zhang, Yangjing; Zhang, Ning; Sun, Defeng; Toh, Kim-Chuan: An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems (2020)
  8. Zhu, Li; Huo, Zhiguang; Ma, Tianzhou; Oesterreich, Steffi; Tseng, George C.: Bayesian indicator variable selection to incorporate hierarchical overlapping group structure in multi-omics applications (2019)
  9. Barbero, Álvaro; Sra, Suvrit: Modular proximal optimization for multidimensional total-variation regularization (2018)
  10. Jin, Fei; Lee, Lung-fei: Irregular N2SLS and Lasso estimation of the matrix exponential spatial specification model (2018)
  11. Sjöstrand, Karl; Clemmensen, Line; Larsen, Rasmus; Einarsson, Gudmundur; Ersbøll, Bjarne: SpaSM: A MATLAB toolbox for sparse statistical modeling (2018) not zbMATH
  12. Liu, Yanqing; Tao, Jiyuan; Zhang, Huan; Xiu, Xianchao; Kong, Lingchen: Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression (2018)
  13. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: On efficiently solving the subproblems of a level-set method for fused lasso problems (2018)
  14. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
  15. Xue, Wei; Zhang, Wensheng; Yu, Gaohang: Least absolute deviations learning of multiple tasks (2018)
  16. Li, Ying-Yi; Zhang, Hai-Bin; Li, Fei: A modified proximal gradient method for a family of nonsmooth convex optimization problems (2017)
  17. Yau, Chun Yip; Hui, Tsz Shing: LARS-type algorithm for group Lasso (2017)
  18. Zeng, Bilin; Wen, Xuerong Meggie; Zhu, Lixing: A link-free sparse group variable selection method for single-index model (2017)
  19. Frandi, Emanuele; Ñanculef, Ricardo; Lodi, Stefano; Sartori, Claudio; Suykens, Johan A. K.: Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee (2016)
  20. Zhang, Liangliang; Yang, Longqi; Hu, Guyu; Pan, Zhisong; Li, Zhen: Link prediction via sparse Gaussian graphical model (2016)
