LOBPCG

Preconditioned low-rank methods for high-dimensional elliptic PDE eigenvalue problems. We consider elliptic PDE eigenvalue problems on a tensorized domain, discretized such that the resulting matrix eigenvalue problem Ax=λx exhibits Kronecker product structure. In particular, we are concerned with the case of high dimensions, where standard approaches to the solution of matrix eigenvalue problems fail due to the exponentially growing degrees of freedom. Recent work shows that this curse of dimensionality can in many cases be addressed by approximating the desired solution vector x in a low-rank tensor format. In this paper, we use the hierarchical Tucker decomposition to develop a low-rank variant of LOBPCG, a classical preconditioned eigenvalue solver. We also show how the ALS and MALS (DMRG) methods known from computational quantum physics can be adapted to the hierarchical Tucker decomposition. Finally, a combination of ALS and MALS with LOBPCG and with our low-rank variant is proposed. A number of numerical experiments indicate that such combinations represent the methods of choice.
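To make the starting point concrete, the following is a minimal sketch of classical (full-vector) LOBPCG applied to a Kronecker-structured eigenvalue problem, using SciPy's `scipy.sparse.linalg.lobpcg`. This is *not* the paper's low-rank hierarchical Tucker variant: here the 2D discrete Laplacian A = T ⊗ I + I ⊗ T is formed and the iterates are stored as full vectors, which is exactly the approach that breaks down in high dimensions. The exact LU-based preconditioner is purely illustrative.

```python
# Sketch: classical LOBPCG on a Kronecker-structured 2D Laplacian.
# Not the paper's low-rank variant -- vectors are stored in full.
import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import LinearOperator, lobpcg, splu

n = 50
# 1D Dirichlet Laplacian stencil (unscaled): tridiag(-1, 2, -1)
T = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
I = identity(n, format="csr")
# 2D operator with Kronecker product structure, as in the problem class above
A = (kron(T, I) + kron(I, T)).tocsr()

# Illustrative preconditioner: exact inverse via sparse LU (feasible only
# in low dimensions; the paper replaces this with low-rank machinery)
lu = splu(A.tocsc())
M = LinearOperator(A.shape, matvec=lu.solve)

rng = np.random.default_rng(0)
X = rng.standard_normal((n * n, 4))  # random initial block of 4 vectors
vals, vecs = lobpcg(A, X, M=M, largest=False, tol=1e-9, maxiter=200)

# Smallest eigenvalue of A is 2 * lambda_1(T) = 8 * sin(pi / (2*(n+1)))**2
print(np.sort(vals)[0])
```

For d-dimensional problems the operator becomes a sum of d Kronecker products and the vector length grows as n^d, which is why the paper approximates the iterates in the hierarchical Tucker format instead of storing them explicitly.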


References in zbMATH (referenced in 32 articles)

Showing results 1 to 20 of 32, sorted by year (citations).

  1. Shi, Tianyi; Townsend, Alex: On the compressibility of tensors (2021)
  2. Elman, Howard C.; Su, Tengfei: Low-rank solution methods for stochastic eigenvalue problems (2019)
  3. Huang, Ruihao; Mu, Lin: A new fast method of solving the high dimensional elliptic eigenvalue problem (2019)
  4. Rakhuba, Maxim; Novikov, Alexander; Oseledets, Ivan: Low-rank Riemannian eigensolver for high-dimensional Hamiltonians (2019)
  5. Kazeev, Vladimir; Schwab, Christoph: Quantized tensor-structured finite elements for second-order elliptic PDEs in two dimensions (2018)
  6. Rakhuba, M. V.; Oseledets, I. V.: Jacobi-Davidson method on low-rank matrix manifolds (2018)
  7. Zhang, Junyu; Wen, Zaiwen; Zhang, Yin: Subspace methods with local refinements for eigenvalue computation using low-rank tensor-train format (2017)
  8. Bachmayr, Markus; Dahmen, Wolfgang: Adaptive low-rank methods: problems on Sobolev spaces (2016)
  9. Bachmayr, Markus; Schneider, Reinhold; Uschmajew, André: Tensor networks and hierarchical tensors for the solution of high-dimensional partial differential equations (2016)
  10. Bachmayr, M.; Dahmen, W.: Adaptive low-rank methods for problems on Sobolev spaces with error control in (L_2) (2016)
  11. Etter, Simon: Parallel ALS algorithm for solving linear systems in the hierarchical Tucker representation (2016)
  12. Kressner, Daniel; Steinlechner, Michael; Vandereycken, Bart: Preconditioned low-rank Riemannian optimization for linear systems with tensor product structure (2016)
  13. Kressner, Daniel; Uschmajew, André: On low-rank approximability of solutions to high-dimensional operator equations and eigenvalue problems (2016)
  14. Lee, Namgil; Cichocki, Andrzej: Regularized computation of approximate pseudoinverse of large matrices using low-rank tensor train decompositions (2016)
  15. Bachmayr, Markus; Dahmen, Wolfgang: Adaptive near-optimal rank tensor approximation for high-dimensional operator equations (2015)
  16. Dolgov, Sergey; Khoromskij, Boris N.; Litvinenko, Alexander; Matthies, Hermann G.: Polynomial chaos expansion of random coefficients and the solution of stochastic partial differential equations in the tensor train format (2015)
  17. Hackbusch, Wolfgang: Solution of linear systems in high spatial dimensions (2015)
  18. Kazeev, Vladimir; Schwab, Christoph: Tensor approximation of stationary distributions of chemical reaction networks (2015)
  19. Hackbusch, Wolfgang: Numerical tensor calculus (2014)
  20. Savostyanov, Dmitry V.: Quasioptimality of maximum-volume cross interpolation of tensors (2014)