SPGL1

SPGL1 is a solver for large-scale sparse reconstruction, accompanying the paper "Probing the Pareto frontier for basis pursuit solutions". The basis pursuit problem seeks a minimum one-norm solution of an underdetermined least-squares problem. Basis pursuit denoise (BPDN) fits the least-squares problem only approximately, and a single parameter determines a curve that traces the optimal trade-off between the least-squares fit and the one-norm of the solution. We prove that this curve is convex and continuously differentiable over all points of interest, and show that it gives an explicit relationship to two other optimization problems closely related to BPDN. We describe a root-finding algorithm for computing arbitrary points on this curve; the algorithm is suitable for problems that are large scale and for problems in the complex domain. At each iteration, a spectral gradient-projection method approximately minimizes a least-squares problem with an explicit one-norm constraint. Only matrix-vector operations are required. The primal-dual solution of this subproblem supplies the function and derivative information needed by the root-finding method. Numerical experiments on a comprehensive set of test problems demonstrate that the method scales well to large problems.
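The two-level scheme in the abstract can be sketched in Python. This is a simplified illustration under stated assumptions, not the SPGL1 implementation: the inner solver uses a plain Barzilai-Borwein (spectral) step with best-iterate tracking in place of SPGL1's nonmonotone line search, and the function names (`project_l1_ball`, `spg_lasso`, `bpdn_via_rootfind`) are invented for this sketch.

```python
import numpy as np

def project_l1_ball(v, tau):
    """Euclidean projection of v onto the ball {x : ||x||_1 <= tau}."""
    if tau <= 0:
        return np.zeros_like(v)
    if np.abs(v).sum() <= tau:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]            # sorted magnitudes, descending
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, v.size + 1) > css - tau)[0][-1]
    theta = (css[k] - tau) / (k + 1.0)      # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def spg_lasso(A, b, tau, iters=500):
    """Approximately solve min ||Ax - b||_2 s.t. ||x||_1 <= tau by
    projected gradient with a spectral (Barzilai-Borwein) step.
    Only matrix-vector products with A and A^T are needed."""
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # safe first step: 1/L
    best_x, best_f = x, np.linalg.norm(b)
    for _ in range(iters):
        x_new = project_l1_ball(x - step * g, tau)
        g_new = A.T @ (A @ x_new - b)
        f = np.linalg.norm(A @ x_new - b)
        if f < best_f:                      # keep the best iterate seen
            best_f, best_x = f, x_new
        s, y = x_new - x, g_new - g
        sy = s @ y
        step = (s @ s) / sy if sy > 1e-12 else step  # BB step length
        x, g = x_new, g_new
    return best_x

def bpdn_via_rootfind(A, b, sigma, newton_iters=15):
    """Solve BPDN, min ||x||_1 s.t. ||Ax - b||_2 <= sigma, by Newton
    root-finding on phi(tau) = ||A x_tau - b||_2 - sigma, using the
    derivative phi'(tau) = -||A^T r||_inf / ||r||_2 from the dual."""
    tau, x = 0.0, np.zeros(A.shape[1])
    for _ in range(newton_iters):
        x = spg_lasso(A, b, tau)
        r = b - A @ x
        phi = np.linalg.norm(r)
        if phi <= sigma + 1e-6:
            break
        dphi = -np.linalg.norm(A.T @ r, np.inf) / phi
        tau -= (phi - sigma) / dphi         # Newton step along the Pareto curve
    return x
```

Each Newton step moves along the Pareto curve toward the target misfit `sigma`; because the curve is convex and decreasing in `tau`, the iterates approach the root monotonically from the left when the subproblems are solved accurately.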


References in zbMATH (referenced in 107 articles)

Showing results 1 to 20 of 107, sorted by year (citations).

  1. Chkifa, Abdellah; Dexter, Nick; Tran, Hoang; Webster, Clayton G.: Polynomial approximation via compressed sensing of high-dimensional functions on lower sets (2018)
  2. Gibali, Aviv; Liu, Li-Wei; Tang, Yu-Chao: Note on the modified relaxation CQ algorithm for the split feasibility problem (2018)
  3. Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.: Compressive sensing with cross-validation and stop-sampling for sparse polynomial chaos expansions (2018)
  4. Keshavarzzadeh, Vahid; Kirby, Robert M.; Narayan, Akil: Numerical integration in multiple dimensions with designed quadrature (2018)
  5. Li, Chong-Jun; Zhong, Yi-Jun: A pseudo-heuristic parameter selection rule for $l^1$-regularized minimization problems (2018)
  6. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: On efficiently solving the subproblems of a level-set method for fused lasso problems (2018)
  7. Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
  8. Wang, Fenghui: Polyak’s gradient method for split feasibility problem constrained by level sets (2018)
  9. Xiao, Xiantao; Li, Yongfeng; Wen, Zaiwen; Zhang, Liwei: A regularized semi-smooth Newton method with projection steps for composite convex programs (2018)
  10. Yu, Yongchao; Peng, Jigen: The matrix splitting based proximal fixed-point algorithms for quadratically constrained $\ell_1$ minimization and Dantzig selector (2018)
  11. Aravkin, Aleksandr; Burke, James V.; Ljung, Lennart; Lozano, Aurelie; Pillonetto, Gianluigi: Generalized Kalman smoothing: modeling and algorithms (2017)
  12. Bastounis, Alexander; Hansen, Anders C.: On the absence of uniform recovery in many real-world applications of compressed sensing and the restricted isometry property and nullspace property in levels (2017)
  13. Bouwmans, Thierry; Sobral, Andrews; Javed, Sajid; Jung, Soon Ki; Zahzah, El-Hadi: Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset (2017)
  14. Boyd, Nicholas; Schiebinger, Geoffrey; Recht, Benjamin: The alternating descent conditional gradient method for sparse inverse problems (2017)
  15. Drusvyatskiy, D.; Krislock, N.; Voronin, Yuen-Lam; Wolkowicz, H.: Noisy Euclidean distance realization: robust facial reduction and the Pareto frontier (2017)
  16. Guo, Ling; Narayan, Akil; Zhou, Tao; Chen, Yuhang: Stochastic collocation methods via $\ell_1$ minimization using randomized quadratures (2017)
  17. Hart, J. L.; Alexanderian, A.; Gremaud, P. A.: Efficient computation of Sobol’ indices for stochastic models (2017)
  18. Hu, Jun; Zhang, Shudao: Global sensitivity analysis based on high-dimensional sparse surrogate construction (2017)
  19. Karimi, Sahar; Vavasis, Stephen: IMRO: A proximal quasi-Newton method for solving $\ell_1$-regularized least squares problems (2017)
  20. Lu, Zhaosong: Randomized block proximal damped Newton method for composite self-concordant minimization (2017)