LSMR: an iterative algorithm for sparse least-squares problems. An iterative method, LSMR, is presented for solving linear systems Ax = b and least-squares problems min ∥Ax − b∥₂, where A may be sparse or a fast linear operator. LSMR is based on the Golub-Kahan bidiagonalization process. It is analytically equivalent to the MINRES method applied to the normal equations A^T A x = A^T b, so the quantities ∥A^T r_k∥ are monotonically decreasing (where r_k = b − Ax_k is the residual for the current iterate x_k). We observe in practice that ∥r_k∥ also decreases monotonically, so that compared with LSQR (for which only ∥r_k∥ is monotonic) it is safer to terminate LSMR early. We also report some experiments with reorthogonalization.
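
As a minimal illustration of the kind of problem LSMR addresses, the sketch below solves a small sparse least-squares problem with SciPy's implementation, `scipy.sparse.linalg.lsmr` (the test matrix and tolerances here are illustrative, not from the paper; SciPy is assumed to be available):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsmr

rng = np.random.default_rng(0)

# Overdetermined system: 100 rows, 20 columns, made sparse by zeroing
# roughly 70% of the entries (an arbitrary illustrative density).
A_dense = rng.standard_normal((100, 20))
A_dense[rng.random((100, 20)) < 0.7] = 0.0
A = csr_matrix(A_dense)
b = rng.standard_normal(100)

# lsmr returns (x, istop, itn, normr, normar, norma, conda, normx);
# normar is ||A^T r_k||, the quantity LSMR drives down monotonically.
x, istop, itn, normr, normar, *_ = lsmr(A, b, atol=1e-10, btol=1e-10)

# At convergence the normal-equations residual A^T (b - A x) is near zero.
print(np.linalg.norm(A_dense.T @ (b - A_dense @ x)))
```

The stopping criteria are controlled by `atol` and `btol`; because ∥A^T r_k∥ decreases monotonically for LSMR, its convergence tests behave more predictably under early termination than LSQR's.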

References in zbMATH (referenced in 35 articles, 1 standard article)

Showing results 1 to 20 of 35.
Sorted by year (citations)


  1. Fong, Justin; Tan, Ying; Crocher, Vincent; Oetomo, Denny; Mareels, Iven: Dual-loop iterative optimal control for the finite horizon LQR problem with unknown dynamics (2018)
  2. Ahmadi-Asl, Salman; Beik, Fatemeh Panjeh Ali: Iterative algorithms for least-squares solutions of a quaternion matrix equation (2017)
  3. Chung, Julianne; Saibaba, Arvind K.: Generalized hybrid iterative methods for large-scale Bayesian inverse problems (2017)
  4. Gould, Nicholas; Scott, Jennifer: The state-of-the-art of preconditioners for sparse linear least-squares problems (2017)
  5. Hnětynková, Iveta; Kubínová, Marie; Plešinger, Martin: Noise representation in residuals of LSQR, LSMR, and CRAIG regularization (2017)
  6. Ji, Hao; Li, Yaohang: Block conjugate gradient algorithms for least squares problems (2017)
  7. Mojarrab, M.; Toutounian, F.: Global LSMR (Gl-LSMR) method for solving general linear systems with several right-hand sides (2017)
  8. Renaut, Rosemary A.; Vatankhah, Saeed; Ardestani, Vahid E.: Hybrid and iteratively reweighted regularization by unbiased predictive risk and weighted GCV for projected systems (2017)
  9. Scott, Jennifer: On using Cholesky-based factorizations and regularization for solving rank-deficient sparse linear least-squares problems (2017)
  10. Scott, Jennifer; Tuma, Miroslav: Solving mixed sparse-dense linear least-squares problems by preconditioned iterative methods (2017)
  11. Spantini, Alessio; Cui, Tiangang; Willcox, Karen; Tenorio, Luis; Marzouk, Youssef: Goal-oriented optimal approximations of Bayesian linear inverse problems (2017)
  12. Zwaan, Ian N.; Hochstenbach, Michiel E.: Multidirectional subspace expansion for one-parameter and multiparameter Tikhonov regularization (2017)
  13. Deadman, Edvin; Higham, Nicholas J.: Testing matrix function algorithms using identities (2016)
  14. Diamond, Steven; Boyd, Stephen: Matrix-free convex optimization modeling (2016)
  15. Greif, C.; Paige, C. C.; Titley-Peloquin, D.; Varah, J. M.: Numerical equivalences among Krylov subspace algorithms for skew-symmetric matrices (2016)
  16. van Leeuwen, T.; Herrmann, F. J.: A penalty method for PDE-constrained optimization in inverse problems (2016)
  17. Zhang, Xiaowei; Cheng, Li; Chu, Delin; Liao, Li-Zhi; Ng, Michael K.; Tan, Roger C. E.: Incremental regularized least squares for dimensionality reduction of large-scale data (2016)
  18. Chung, Julianne M.; Kilmer, Misha E.; O’Leary, Dianne P.: A framework for regularization via operator approximation (2015)
  19. Chung, Julianne; Palmer, Katrina: A hybrid LSMR algorithm for large-scale Tikhonov regularization (2015)
  20. Conder, James A.: Fitting multiple Bell curves stably and accurately to a time series as applied to Hubbert cycles or other phenomena (2015)
