softImpute: Matrix Completion via Iterative Soft-Thresholded SVD. Iterative methods for matrix completion that use nuclear-norm regularization. There are two main approaches. The first uses iterative soft-thresholded SVDs to impute the missing values; the second uses alternating least squares. Both have an "EM" flavor, in that at each iteration the matrix is completed with the current estimate. For large matrices there is a special sparse-matrix class named "Incomplete" that efficiently handles all computations. The package includes procedures for centering and scaling rows, columns, or both, and for computing low-rank SVDs on large sparse centered matrices (i.e., principal components).
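The soft-thresholded SVD iteration described above can be sketched in a few lines. The following is a minimal Python/NumPy illustration of the idea (complete the matrix with the current estimate, take an SVD, soft-threshold the singular values, repeat), not the package's R API; the function name `soft_impute` and its parameters are chosen here for illustration.

```python
import numpy as np

def soft_impute(X, mask, lam, n_iter=100, tol=1e-4):
    """Illustrative soft-thresholded SVD iteration for matrix completion.

    X    : observed matrix (values at unobserved entries are ignored)
    mask : boolean array, True where X is observed
    lam  : nuclear-norm regularization level (soft-threshold on singular values)
    """
    Z = np.zeros_like(X, dtype=float)            # current low-rank estimate
    for _ in range(n_iter):
        # "EM" step: fill the missing entries with the current estimate
        filled = np.where(mask, X, Z)
        # SVD of the completed matrix
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        # soft-threshold the singular values (nuclear-norm proximal step)
        s_thr = np.maximum(s - lam, 0.0)
        Z_new = (U * s_thr) @ Vt
        # stop when the estimate changes little between iterations
        if np.linalg.norm(Z_new - Z) <= tol * max(np.linalg.norm(Z), 1.0):
            return Z_new
        Z = Z_new
    return Z
```

On a low-rank matrix with a few missing entries and a small `lam`, the iteration converges to a slightly shrunken low-rank fit whose values at the missing positions serve as imputations. The package's implementation additionally exploits sparse-plus-low-rank structure so that large matrices never need to be densely completed.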

References in zbMATH (referenced in 74 articles, 2 standard articles)

Showing results 1 to 20 of 74, sorted by year (citations).


  1. Yuxuan Zhao, Madeleine Udell: gcimpute: A Package for Missing Data Imputation (2022) arXiv
  2. Bauch, Jonathan; Nadler, Boaz; Zilber, Pini: Rank 2r iterative least squares: efficient recovery of ill-conditioned low rank matrices from few entries (2021)
  3. Chen, Yuxin; Fan, Jianqing; Ma, Cong; Yan, Yuling: Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data (2021)
  4. Dong, Shuyu; Absil, P.-A.; Gallivan, K. A.: Riemannian gradient descent methods for graph-regularized matrix completion (2021)
  5. Huang, Jianwen; Wang, Jianjun; Zhang, Feng; Wang, Hailin; Wang, Wendong: Perturbation analysis of low-rank matrix stable recovery (2021)
  6. Lin, Kevin Z.; Lei, Jing; Roeder, Kathryn: Exponential-family embedding with application to cell developmental trajectories for single-cell RNA-seq data (2021)
  7. Tanaka, Masahiro: Bayesian matrix completion approach to causal inference with panel data (2021)
  8. Wang, Wendong; Zhang, Feng; Wang, Jianjun: Low-rank matrix recovery via regularized nuclear norm minimization (2021)
  9. Chen, Yuxin; Chi, Yuejie; Fan, Jianqing; Ma, Cong; Yan, Yuling: Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization (2020)
  10. Kuang, Shenfen; Chao, Hongyang; Li, Qia: Majorized proximal alternating imputation for regularized rank constrained matrix completion (2020)
  11. Mayer, Imke; Sverdrup, Erik; Gauss, Tobias; Moyer, Jean-Denis; Wager, Stefan; Josse, Julie: Doubly robust treatment effect estimation with missing attributes (2020)
  12. Mazumder, Rahul; Saldana, Diego; Weng, Haolei: Matrix completion with nonconvex regularization: spectral operators and scalable algorithms (2020)
  13. Mazumder, Rahul; Weng, Haolei: Computing the degrees of freedom of rank-regularized estimators and cousins (2020)
  14. Robin, Geneviève; Klopp, Olga; Josse, Julie; Moulines, Éric; Tibshirani, Robert: Main effects and interactions in mixed and incomplete data frames (2020)
  15. Sportisse, Aude; Boyer, Claire; Josse, Julie: Imputation and low-rank estimation with missing not at random data (2020)
  16. Sun, Dengdi; Bao, Yuanyuan; Ge, Meiling; Ding, Zhuanlian; Luo, Bin: Dual-graph regularized sparse low-rank matrix recovery for tag refinement (2020)
  17. Yu, Guan; Li, Quefeng; Shen, Dinggang; Liu, Yufeng: Optimal sparse linear prediction for block-missing multi-modality data without imputation (2020)
  18. Zhang, Chelsea; Taylor, Sean J.; Cobb, Curtiss; Sekhon, Jasjeet: Active matrix factorization for surveys (2020)
  19. Bhadra, Anindya; Datta, Jyotishka; Polson, Nicholas G.; Willard, Brandon: Lasso meets horseshoe: a survey (2019)
  20. Kumar, Anil; Liang, Che-Yuan: Credit constraints and GDP growth: evidence from a natural experiment (2019)
