Blendenpik
Blendenpik: supercharging LAPACK's least-squares solver. Several innovative random-sampling and random-mixing techniques for solving problems in linear algebra have been proposed in the last decade, but they have not yet made a significant impact on numerical linear algebra. We show that by using a high-quality implementation of one of these techniques, we obtain a solver that performs extremely well by the traditional yardsticks of numerical linear algebra: it is significantly faster than high-performance implementations of existing state-of-the-art algorithms, and it is numerically backward stable. More specifically, we describe a least-squares solver for dense, highly overdetermined systems that achieves residuals similar to those of direct QR-factorization-based solvers (LAPACK), outperforms LAPACK by large factors, and scales significantly better than any QR-based solver.
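The abstract gives only the high-level idea; the underlying scheme mixes the rows of the matrix with a randomized fast transform, samples a subset of the mixed rows, QR-factors that small sample, and uses the resulting R factor as a preconditioner for an iterative least-squares solver (LSQR). The Python sketch below illustrates that structure only; the function name blendenpik_style_lstsq, the choice of a DCT as the mixing transform, the oversampling factor gamma, and the LSQR tolerances are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a Blendenpik-style randomized least-squares solve
# (illustrative only; not the published code).
import numpy as np
from scipy.fft import dct
from scipy.linalg import solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

def blendenpik_style_lstsq(A, b, gamma=4.0, seed=None):
    """Solve min ||Ax - b||_2 for a dense, highly overdetermined A (m >> n)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # 1. Row mixing: random sign flips followed by a fast orthogonal transform
    #    (a DCT here, applied column-wise so that it mixes the rows).
    signs = rng.choice([-1.0, 1.0], size=m)
    M = dct(signs[:, None] * A, axis=0, norm="ortho")
    # 2. Uniformly sample roughly gamma*n mixed rows; the mixing step spreads
    #    the row norms so that uniform sampling is representative.
    s = min(m, int(np.ceil(gamma * n)))
    rows = rng.choice(m, size=s, replace=False)
    # 3. QR-factor the small sample; its R factor is the preconditioner.
    R = np.linalg.qr(M[rows, :], mode="r")
    # 4. Solve the preconditioned problem min ||A R^{-1} y - b||_2 with LSQR,
    #    applying R^{-1} via triangular solves, then recover x = R^{-1} y.
    matvec = lambda y: A @ solve_triangular(R, y, lower=False)
    rmatvec = lambda z: solve_triangular(R, A.T @ z, lower=False, trans="T")
    AR = LinearOperator((m, n), matvec=matvec, rmatvec=rmatvec)
    y = lsqr(AR, b, atol=1e-14, btol=1e-14)[0]
    return solve_triangular(R, y, lower=False)
```

On a well-conditioned random test problem, x = blendenpik_style_lstsq(A, b) should agree with np.linalg.lstsq(A, b, rcond=None)[0] to roughly machine precision; the production solver described in the paper additionally tunes the transform and sampling rate and includes safeguards (such as falling back to a LAPACK solve when the sampled factor is too ill-conditioned) that are omitted from this sketch.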
References in zbMATH (referenced in 45 articles)
Showing results 1 to 20 of 45, sorted by year.
- Huang, Guangxin; Liu, Yuanyuan; Yin, Feng: Tikhonov regularization with MTRSVD method for solving large-scale discrete ill-posed problems (2022)
- Shustin, Paz Fink; Avron, Haim: Semi-infinite linear regression and its applications (2022)
- Ailon, Nir; Yehuda, Gal: The complexity of computing (almost) orthogonal matrices with ε-copies of the Fourier transform (2021)
- Chi, Jocelyn T.; Ipsen, Ilse C. F.: Multiplicative perturbation bounds for multivariate multiple linear regression in Schatten p-norms (2021)
- Du, Yi-Shu; Hayami, Ken; Zheng, Ning; Morikuni, Keiichi; Yin, Jun-Feng: Kaczmarz-type inner-iteration preconditioned flexible GMRES methods for consistent linear systems (2021)
- Sobczyk, Aleksandros; Gallopoulos, Efstratios: Estimating leverage scores via rank revealing methods and randomization (2021)
- Chung, Julianne; Chung, Matthias; Tanner Slagel, J.; Tenorio, Luis: Sampled limited memory methods for massive linear inverse problems (2020)
- Homrighausen, Darren; McDonald, Daniel J.: Compressed and penalized linear regression (2020)
- Malik, Osman Asif; Becker, Stephen: Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument (2020)
- Malik, Osman Asif; Becker, Stephen: Fast randomized matrix and tensor interpolative decomposition using CountSketch (2020)
- Richtárik, Peter; Takáč, Martin: Stochastic reformulations of linear systems: algorithms and convergence theory (2020)
- Zhang, Liping; Wei, Yimin: Randomized core reduction for discrete ill-posed problem (2020)
- Bjarkason, Elvar K.: Pass-efficient randomized algorithms for low-rank matrix approximation using any number of views (2019)
- Mor-Yosef, Liron; Avron, Haim: Sketching for principal component regression (2019)
- Trogdon, Thomas: On spectral and numerical properties of random butterfly matrices (2019)
- Wang, Haiying: More efficient estimation for logistic regression with optimal subsamples (2019)
- Wu, Tao; Gleich, David F.: Multiway Monte Carlo method for linear systems (2019)
- Zhou, Quan; Guan, Yongtao: Fast model-fitting of Bayesian variable selection regression using the iterative complex factorization algorithm (2019)
- Battaglino, Casey; Ballard, Grey; Kolda, Tamara G.: A practical randomized CP tensor decomposition (2018)