Blendenpik: supercharging LAPACK's least-squares solver.

Several innovative random-sampling and random-mixing techniques for solving problems in linear algebra have been proposed in the last decade, but they have not yet made a significant impact on numerical linear algebra. We show that by using a high-quality implementation of one of these techniques, we obtain a solver that performs extremely well by the traditional yardsticks of numerical linear algebra: it is significantly faster than high-performance implementations of existing state-of-the-art algorithms, and it is numerically backward stable. More specifically, we describe a least-squares solver for dense, highly overdetermined systems that achieves residuals similar to those of direct QR-factorization-based solvers (LAPACK), outperforms LAPACK by large factors, and scales significantly better than any QR-based solver.
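The approach the abstract describes combines random mixing, row sampling, and preconditioned iteration. A minimal sketch of that idea in Python with NumPy/SciPy follows: mix the rows of A with a random-sign discrete cosine transform, sample a few times n of the mixed rows, take the R factor of a QR of the sample as a right preconditioner, and run LSQR on the preconditioned system. The function name, the choice of DCT as the mixing transform, and the sampling factor `gamma` are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from scipy.fft import dct
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)

def blendenpik_style_lstsq(A, b, gamma=4):
    """Solve min ||Ax - b||_2 for tall A via randomized preconditioning.

    Illustrative sketch only: mix rows with a random-sign DCT, sample
    about gamma*n mixed rows, QR-factor the sample, and use its R factor
    as a right preconditioner for LSQR on the original system.
    """
    m, n = A.shape
    # Random mixing: flip row signs, then apply an orthonormal DCT so
    # every mixed row carries information about the whole column space.
    D = rng.choice([-1.0, 1.0], size=m)
    M = dct(D[:, None] * A, axis=0, norm="ortho")
    # Uniformly sample roughly gamma*n of the mixed rows.
    k = min(m, int(gamma * n))
    idx = rng.choice(m, size=k, replace=False)
    # The R factor of the sampled rows preconditions the full problem.
    R = np.linalg.qr(M[idx], mode="r")
    # Solve min ||(A R^{-1}) y - b||_2, then recover x = R^{-1} y.
    # (Forming A @ inv(R) explicitly keeps the sketch short; a real
    # solver would apply R^{-1} via triangular solves.)
    y = lsqr(A @ np.linalg.inv(R), b, atol=1e-12, btol=1e-12)[0]
    return np.linalg.solve(R, y)
```

Because the preconditioned matrix A R^{-1} is well conditioned with high probability, LSQR converges in few iterations, which is the source of the speedup over a full QR factorization of A.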

References in zbMATH (referenced in 33 articles)

Showing results 1 to 20 of 33.
Sorted by year (citations)


  1. Zhang, Liping; Wei, Yimin: Randomized core reduction for discrete ill-posed problem (2020)
  2. Bjarkason, Elvar K.: Pass-efficient randomized algorithms for low-rank matrix approximation using any number of views (2019)
  3. Mor-Yosef, Liron; Avron, Haim: Sketching for principal component regression (2019)
  4. Trogdon, Thomas: On spectral and numerical properties of random butterfly matrices (2019)
  5. Wang, Haiying: More efficient estimation for logistic regression with optimal subsamples (2019)
  6. Wu, Tao; Gleich, David F.: Multiway Monte Carlo method for linear systems (2019)
  7. Zhou, Quan; Guan, Yongtao: Fast model-fitting of Bayesian variable selection regression using the iterative complex factorization algorithm (2019)
  8. Battaglino, Casey; Ballard, Grey; Kolda, Tamara G.: A practical randomized CP tensor decomposition (2018)
  9. Shabat, Gil; Shmueli, Yaniv; Aizenbud, Yariv; Averbuch, Amir: Randomized LU decomposition (2018)
  10. Yang, Jiyan; Chow, Yin-Lam; Ré, Christopher; Mahoney, Michael W.: Weighted SGD for (\ell_p) regression with randomized preconditioning (2018)
  11. Avron, Haim; Clarkson, Kenneth L.; Woodruff, David P.: Faster kernel ridge regression using sketching and preconditioning (2017)
  12. Zhang, Xiaojia Shelly; de Sturler, Eric; Paulino, Glaucio H.: Stochastic sampling for deterministic structural topology optimization with many load cases: density-based and ground structure approaches (2017)
  13. Clarkson, Kenneth L.; Drineas, Petros; Magdon-Ismail, Malik; Mahoney, Michael W.; Meng, Xiangrui; Woodruff, David P.: The fast Cauchy transform and faster robust linear regression (2016)
  14. Scott, Jennifer; Tuma, Miroslav: Preconditioning of linear least squares by robust incomplete factorization for implicitly held normal equations (2016)
  15. Wei, Yimin; Xie, Pengpeng; Zhang, Liping: Tikhonov regularization and randomized GSVD (2016)
  16. Zhang, Xiaowei; Cheng, Li; Chu, Delin; Liao, Li-Zhi; Ng, Michael K.; Tan, Roger C. E.: Incremental regularized least squares for dimensionality reduction of large-scale data (2016)
  17. Bourgain, Jean; Dirksen, Sjoerd; Nelson, Jelani: Toward a unified theory of sparse dimensionality reduction in Euclidean space (2015)
  18. Holodnak, John T.; Ipsen, Ilse C. F.: Randomized approximation of the Gram matrix: exact computation and probabilistic bounds (2015)
  19. Ma, Ping; Mahoney, Michael W.; Yu, Bin: A statistical perspective on algorithmic leveraging (2015)
  20. Zhang, Hongyang; Lin, Zhouchen; Zhang, Chao; Gao, Junbin: Relations among some low-rank subspace recovery models (2015)
