OptShrink

OptShrink is a simple, completely data-driven algorithm for denoising a low-rank signal matrix buried in noise by optimal, data-driven singular value shrinkage. It takes as input the signal-plus-noise matrix and an estimate of the signal matrix rank, and returns an improved estimate of the signal matrix. It computes this estimate by shrinking the singular values of the truncated SVD (TSVD) in the manner prescribed by random matrix theory. It can be used in the missing-data setting and for a large class of noise models, of which the i.i.d. Gaussian setting is a special case. There are no tuning parameters involved, so it can be used in a black-box manner wherever improved low-rank matrix estimation is desirable. The algorithm significantly outperforms the TSVD in the low-to-moderate SNR regime and never does worse than the TSVD. The underlying theory also explains why it always does better than singular value thresholding.
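As a rough illustration of the idea, the following NumPy sketch implements one common formulation of OptShrink-style shrinkage: the singular values beyond the estimated rank are treated as noise, used to form an empirical D-transform, and the leading singular values are replaced by the data-driven weights w_i = -2 D(sigma_i) / D'(sigma_i). The function name and exact expressions here are a hedged reconstruction, not the author's reference implementation.

```python
import numpy as np

def optshrink(X, r):
    """Sketch of OptShrink-style singular value shrinkage (not the
    reference implementation).

    X : (m, n) noisy data matrix; r : estimated signal rank.
    Returns the denoised estimate and the shrunken singular values.
    """
    m, n = X.shape
    if m > n:  # work with the wide orientation for simplicity
        Xhat, w = optshrink(X.T, r)
        return Xhat.T, w

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    sig = s[:r]      # singular values attributed to the signal
    noise = s[r:]    # remaining singular values, used to probe the noise
    q1 = m - r
    q2 = n - r       # includes (n - m) implicit zero singular values

    def D_and_Dprime(z):
        # phi(z) = z/(z^2 - s^2); phi'(z) = -(z^2 + s^2)/(z^2 - s^2)^2
        d = z**2 - noise**2
        phi = z / d
        dphi = -(z**2 + noise**2) / d**2
        D1 = phi.sum() / q1
        D1p = dphi.sum() / q1
        # the second factor pads with (n - m) zeros: z/(z^2 - 0) = 1/z
        D2 = (phi.sum() + (n - m) / z) / q2
        D2p = (dphi.sum() - (n - m) / z**2) / q2
        return D1 * D2, D1p * D2 + D1 * D2p

    # data-driven optimal shrinkage weight for each signal singular value
    w = np.empty(r)
    for i in range(r):
        D, Dp = D_and_Dprime(sig[i])
        w[i] = -2.0 * D / Dp

    Xhat = (U[:, :r] * w) @ Vt[:r, :]
    return Xhat, w
```

Note that, consistent with the description above, the weights shrink each leading singular value toward zero by an amount determined entirely by the observed noise singular values; no threshold or tuning parameter is supplied by the user.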


References in zbMATH (referenced in 20 articles, 1 standard article)

Showing results 1 to 20 of 20.
Sorted by year (citations)

  1. Bigot, Jérémie; Deledalle, Charles: Low-rank matrix denoising for count data using unbiased Kullback-Leibler risk estimation (2022)
  2. Zhang, Anru R.; Cai, T. Tony; Wu, Yihong: Heteroskedastic PCA: algorithm, optimality, and applications (2022)
  3. Ding, Xiucai; Yang, Fan: Spiked separable covariance matrices and principal components (2021)
  4. Leeb, William: Rapid evaluation of the spectral signal detection threshold and Stieltjes transform (2021)
  5. Leeb, William E.: Matrix denoising for weighted loss functions and heterogeneous signals (2021)
  6. Cordero-Grande, Lucilio: MIXANDMIX: numerical techniques for the computation of empirical spectral distributions of population mixtures (2020)
  7. Ding, Xiucai: High dimensional deformed rectangular matrices with applications in matrix denoising (2020)
  8. Dobriban, Edgar: Permutation methods for factor analysis and PCA (2020)
  9. Dobriban, Edgar; Leeb, William; Singer, Amit: Optimal prediction in the linearly transformed spiked model (2020)
  10. Johnstone, Iain M.; Onatski, Alexei: Testing in high-dimensional spiked models (2020)
  11. Lazzaro, Damiana; Morigi, Serena: Matrix completion for matrices with low-rank displacement (2020)
  12. Lettau, Martin; Pelger, Markus: Estimating latent asset-pricing factors (2020)
  13. Prasadan, Arvind; Nadakuditi, Raj Rao: Time series source separation using dynamic mode decomposition (2020)
  14. Dobriban, Edgar; Owen, Art B.: Deterministic parallel analysis: an improved method for selecting factors and principal components (2019)
  15. Hong, David; Balzano, Laura; Fessler, Jeffrey A.: Asymptotic performance of PCA for high-dimensional heteroscedastic data (2018)
  16. Liu, Lydia T.; Dobriban, Edgar; Singer, Amit: ePCA: high dimensional exponential family PCA (2018)
  17. Bigot, Jérémie; Deledalle, Charles; Féral, Delphine: Generalized SURE for optimal shrinkage of singular values in low-rank matrix denoising (2017)
  18. Chatterjee, Sourav: Matrix estimation by universal singular value thresholding (2015)
  19. Dobriban, Edgar: Efficient computation of limit spectra of sample covariance matrices (2014)
  20. Nadakuditi, Raj Rao: OptShrink: an algorithm for improved low-rank signal matrix denoising by optimal, data-driven singular value shrinkage (2014)