SPGL1
SPGL1: A solver for large-scale sparse reconstruction.
Probing the Pareto frontier for basis pursuit solutions: The basis pursuit problem seeks a minimum one-norm solution of an underdetermined least-squares problem. Basis Pursuit DeNoise (BPDN) fits the least-squares problem only approximately, and a single parameter determines a curve that traces the optimal trade-off between the least-squares fit and the one-norm of the solution. We prove that this curve is convex and continuously differentiable over all points of interest, and show that it gives an explicit relationship to two other optimization problems closely related to BPDN. We describe a root-finding algorithm for finding arbitrary points on this curve; the algorithm is suitable for problems that are large scale and for those that are in the complex domain. At each iteration, a spectral gradient-projection method approximately minimizes a least-squares problem with an explicit one-norm constraint. Only matrix-vector operations are required. The primal-dual solution of this problem gives the function and derivative information needed by the root-finding method. Numerical experiments on a comprehensive set of test problems demonstrate that the method scales well to large problems.
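The inner gradient-projection step described above hinges on one operation: Euclidean projection onto the one-norm ball. Below is a minimal pure-Python sketch of that projection using the standard sort-and-threshold scheme; this is an illustration of the technique, not the SPGL1 implementation, and the function name `project_l1_ball` is an assumption for the example.

```python
def project_l1_ball(x, tau):
    """Euclidean projection of x onto the one-norm ball {v : ||v||_1 <= tau}.

    Sort-and-threshold scheme: find the threshold theta, then soft-threshold
    the magnitudes of x by theta and restore the original signs.
    """
    if sum(abs(xi) for xi in x) <= tau:
        return list(x)  # already feasible: the projection is x itself
    u = sorted((abs(xi) for xi in x), reverse=True)
    cumsum, theta = 0.0, 0.0
    for i, ui in enumerate(u, start=1):
        cumsum += ui
        t = (cumsum - tau) / i  # candidate threshold from the i largest magnitudes
        if ui > t:
            theta = t           # keep the threshold for the largest valid i
    return [(1.0 if xi >= 0.0 else -1.0) * max(abs(xi) - theta, 0.0) for xi in x]
```

For example, projecting `[3.0, -1.0, 0.5]` onto the ball of radius 2 yields `[2.0, 0.0, 0.0]`. In the outer loop of the method described in the abstract, each solve of such a one-norm-constrained subproblem supplies the function value and derivative that a Newton-type root-finder uses to update the constraint radius toward the target misfit.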
References in zbMATH (referenced in 123 articles, 2 standard articles)
Showing results 1 to 20 of 123.
Sorted by year:
- Adcock, Ben; Gelb, Anne; Song, Guohui; Sui, Yi: Joint sparse recovery based on variances (2019)
- Kim, Kyung-Su; Chung, Sae-Young: Greedy subspace pursuit for joint sparse recovery (2019)
- Liu, Michelle; Kumar, Rajiv; Haber, Eldad; Aravkin, Aleksandr: Simultaneous-shot inversion for PDE-constrained optimization problems with missing data (2019)
- Manohar, Krithika; Kaiser, Eurika; Brunton, Steven L.; Kutz, J. Nathan: Optimized sampling for multiscale dynamics (2019)
- Adcock, Ben: Infinite-dimensional compressed sensing and function interpolation (2018)
- Adcock, Ben; Bao, Anyi; Jakeman, John D.; Narayan, Akil: Compressed sensing with sparse corruptions: fault-tolerant sparse collocation approximations (2018)
- Boyd, Nicholas; Hastie, Trevor; Boyd, Stephen; Recht, Benjamin; Jordan, Michael I.: Saturating splines and feature selection (2018)
- Chen, Xiaojun; Womersley, Robert S.: Spherical designs and nonconvex minimization for recovery of sparse signals on the sphere (2018)
- Chkifa, Abdellah; Dexter, Nick; Tran, Hoang; Webster, Clayton G.: Polynomial approximation via compressed sensing of high-dimensional functions on lower sets (2018)
- Gibali, Aviv; Liu, Li-Wei; Tang, Yu-Chao: Note on the modified relaxation CQ algorithm for the split feasibility problem (2018)
- Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.: Compressive sensing with cross-validation and stop-sampling for sparse polynomial chaos expansions (2018)
- Keshavarzzadeh, Vahid; Kirby, Robert M.; Narayan, Akil: Numerical integration in multiple dimensions with designed quadrature (2018)
- Li, Chong-Jun; Zhong, Yi-Jun: A pseudo-heuristic parameter selection rule for (l^1)-regularized minimization problems (2018)
- Lin, Qihang; Nadarajah, Selvaprabu; Soheili, Negar: A level-set method for convex optimization with a feasible solution path (2018)
- Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: On efficiently solving the subproblems of a level-set method for fused lasso problems (2018)
- Li, Xudong; Sun, Defeng; Toh, Kim-Chuan: A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems (2018)
- Schaeffer, Hayden; Tran, Giang; Ward, Rachel: Extracting sparse high-dimensional dynamics from limited data (2018)
- Shen, Jinglai; Mousavi, Seyedahmad: Least sparsity of (p)-norm based optimization problems with (p>1) (2018)
- Wang, Fenghui: Polyak’s gradient method for split feasibility problem constrained by level sets (2018)
- Wang, Yuepeng; Cheng, Yue; Zhang, Zongyuan; Lin, Guang: Calibration of reduced-order model for a coupled Burgers equations based on PC-EnKF (2018)