FPC_AS (fixed-point continuation and active set) is a MATLAB solver for the $\ell_1$-regularized least squares problem, implementing a fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation. We propose a fast algorithm for solving the $\ell_1$-regularized minimization problem $\min_{x \in \mathbb{R}^n} \mu\|x\|_1 + \|Ax - b\|_2^2$ for recovering sparse solutions to an underdetermined system of linear equations $Ax = b$. The algorithm is divided into two stages that are performed repeatedly. In the first stage, a first-order iterative "shrinkage" method yields an estimate of the subset of components of $x$ likely to be nonzero in an optimal solution. Restricting the decision variables $x$ to this subset and fixing their signs at their current values reduces the $\ell_1$-norm $\|x\|_1$ to a linear function of $x$. The resulting subspace problem, which involves the minimization of a smaller and smooth quadratic function, is solved in the second stage. Our code FPC_AS embeds this basic two-stage algorithm in a continuation (homotopy) approach by assigning a decreasing sequence of values to $\mu$. This code exhibits state-of-the-art performance in terms of both its speed and its ability to recover sparse signals.
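The two-stage scheme described above can be sketched in a few lines. The following is a minimal illustration in Python/NumPy (FPC_AS itself is a MATLAB package); the function names, continuation factors, iteration counts, and tolerances are illustrative assumptions, not the solver's actual interface, and the subspace step below solves the sign-linearized quadratic without the sign restrictions the real code enforces:

```python
import numpy as np

def shrink(y, t):
    """Soft-thresholding (shrinkage) operator."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def fpc_as_sketch(A, b, mu_final, factors=(100.0, 10.0, 1.0),
                  n_shrink=200, tol=1e-8):
    """Sketch of the two-stage shrinkage/subspace method with continuation.

    Minimizes mu*||x||_1 + ||A x - b||_2^2 for a decreasing sequence of
    mu values ending at mu_final.  Hypothetical parameters, not FPC_AS's API.
    """
    m, n = A.shape
    # grad of ||Ax-b||^2 is 2 A^T (Ax-b), Lipschitz constant 2||A||_2^2,
    # so this fixed step size keeps the shrinkage iteration monotone
    tau = 0.99 / (2.0 * np.linalg.norm(A, 2) ** 2)
    x = np.zeros(n)
    for f in factors:                       # continuation: decreasing mu
        mu = mu_final * f
        # Stage 1: iterative shrinkage estimates the active set
        for _ in range(n_shrink):
            g = 2.0 * A.T @ (A @ x - b)
            x = shrink(x - tau * g, tau * mu)
        T = np.flatnonzero(np.abs(x) > tol)  # estimated support
        if T.size == 0 or T.size > m:
            continue
        # Stage 2: with signs s fixed, ||x||_1 = s^T x_T is linear, so
        # min_z mu*s^T z + ||A_T z - b||^2 is a smooth quadratic; solve
        # its normal equations directly (tiny ridge for numerical safety)
        s = np.sign(x[T])
        AT = A[:, T]
        H = AT.T @ AT + 1e-12 * np.eye(T.size)
        z = np.linalg.solve(H, AT.T @ b - 0.5 * mu * s)
        x = np.zeros(n)
        x[T] = z
    return x
```

On a small synthetic compressed-sensing instance (`A` Gaussian, `b = A @ x_true` with `x_true` sparse), `fpc_as_sketch(A, b, mu_final=1e-3)` returns a sparse vector whose objective value is far below that of the zero vector.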

References in zbMATH (referenced in 43 articles, 1 standard article)

Showing results 1 to 20 of 43, sorted by year (citations).


  1. Cheng, Wanyou; Dai, Yu-Hong: Gradient-based method with active set strategy for $\ell _1$ optimization (2018)
  2. Li, Chong-Jun; Zhong, Yi-Jun: A pseudo-heuristic parameter selection rule for $l^1$-regularized minimization problems (2018)
  3. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
  4. Eghbali, Reza; Fazel, Maryam: Decomposable norm minimization with proximal-gradient homotopy algorithm (2017)
  5. Karimi, Sahar; Vavasis, Stephen: IMRO: A proximal quasi-Newton method for solving $\ell_1$-regularized least squares problems (2017)
  6. Stella, Lorenzo; Themelis, Andreas; Patrinos, Panagiotis: Forward-backward quasi-Newton methods for nonsmooth optimization problems (2017)
  7. Sun, Tao; Jiang, Hao; Cheng, Lizhi: Global convergence of proximal iteratively reweighted algorithm (2017)
  8. Byrd, Richard H.; Chin, Gillian M.; Nocedal, Jorge; Oztoprak, Figen: A family of second-order methods for convex $\ell _1$-regularized optimization (2016)
  9. De Santis, Marianna; Lucidi, Stefano; Rinaldi, Francesco: A fast active set block coordinate descent algorithm for $\ell_1$-regularized least squares (2016)
  10. Hager, William W.; Zhang, Hongchao: An active set algorithm for nonlinear optimization with polyhedral constraints (2016)
  11. Shen, Yuan; Wang, Hongyong: New augmented Lagrangian-based proximal point algorithm for convex optimization with equality constraints (2016)
  12. Treister, Eran; Turek, Javier S.; Yavneh, Irad: A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression (2016)
  13. Cheng, Wanyou; Chen, Zixin; Li, Donghui: Nonmonotone spectral gradient method for sparse recovery (2015)
  14. Huang, Yakui; Liu, Hongwei: A Barzilai-Borwein type method for minimizing composite functions (2015)
  15. Lin, Qihang; Xiao, Lin: An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization (2015)
  16. Lorenz, Dirk A.; Pfetsch, Marc E.; Tillmann, Andreas M.: Solving basis pursuit: heuristic optimality check and solver comparison (2015)
  17. Ulbrich, Michael; Wen, Zaiwen; Yang, Chao; Klöckner, Dennis; Lu, Zhaosong: A proximal gradient method for ensemble density functional theory (2015)
  18. Yin, Penghang; Lou, Yifei; He, Qi; Xin, Jack: Minimization of $\ell_1-2$ for compressed sensing (2015)
  19. Zhang, Li; Zhou, Wei-Da: Time series prediction using sparse regression ensemble based on $\ell _2$-$\ell _1$ problem (2015)
  20. Zhao, ZhiHua; Xu, FengMin; Li, XiangYang: Adaptive projected gradient thresholding methods for constrained $l_0$ problems (2015)
