Parallel stochastic gradient algorithms for large-scale matrix completion. This paper develops Jellyfish, an algorithm for solving data-processing problems with matrix-valued decision variables regularized to have low rank. Particular examples of problems solvable by Jellyfish include matrix completion problems and least-squares problems regularized by the nuclear norm or $\gamma_2$-norm. Jellyfish implements a projected incremental gradient method with a biased, random ordering of the increments. This biased ordering allows for a parallel implementation that admits a speed-up nearly proportional to the number of processors. On large-scale matrix completion tasks, Jellyfish is orders of magnitude more efficient than existing codes. For example, on the Netflix Prize data set, prior art computes rating predictions in approximately 4 hours, while Jellyfish solves the same problem in under 3 minutes on a 12-core workstation.
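The key idea behind the biased ordering can be illustrated with a small sketch. In a factored model $X \approx LR^T$, each observed entry $(i, j)$ touches only row $i$ of $L$ and row $j$ of $R$; grouping entries by (row-block, column-block) pairs means entries from non-conflicting blocks update disjoint factor rows and could run in parallel. The sketch below (a hypothetical simplification, not the authors' implementation; the function name, block assignment, and step sizes are illustrative assumptions) honors such a block ordering sequentially:

```python
import numpy as np

def jellyfish_sketch(observed, shape, rank=2, steps=50, lr=0.05, blocks=2, seed=0):
    """Sketch of Jellyfish-style incremental gradient matrix completion.

    observed: list of (i, j, value) triples; shape: (m, n) of the matrix.
    Each update for entry (i, j) touches only L[i] and R[j].  Entries are
    processed in a biased order grouped by (row-block, column-block), the
    property Jellyfish exploits for lock-free parallelism.  This sketch
    runs the blocks sequentially for clarity.
    """
    rng = np.random.default_rng(seed)
    m, n = shape
    L = 0.1 * rng.standard_normal((m, rank))
    R = 0.1 * rng.standard_normal((n, rank))
    # Assign rows and columns to blocks (fixed partition, for simplicity).
    row_block = rng.integers(blocks, size=m)
    col_block = rng.integers(blocks, size=n)
    for _ in range(steps):
        # Biased ordering: group entries by their (row-block, col-block) pair.
        order = sorted(observed, key=lambda e: (row_block[e[0]], col_block[e[1]]))
        for i, j, v in order:
            err = L[i] @ R[j] - v          # residual on this entry
            L[i], R[j] = L[i] - lr * err * R[j], R[j] - lr * err * L[i]
    return L, R
```

In a parallel implementation, entries whose block pairs share no row-block or column-block would be handed to different processors in the same round, since their updates cannot conflict.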

References in zbMATH (referenced in 15 articles)


  1. Ashraphijuo, Morteza; Wang, Xiaodong; Aggarwal, Vaneet: Rank determination for low-rank data completion (2017)
  2. Boyd, Nicholas; Schiebinger, Geoffrey; Recht, Benjamin: The alternating descent conditional gradient method for sparse inverse problems (2017)
  3. Bhaskar, Sonia A.: Probabilistic low-rank matrix completion from quantized measurements (2016)
  4. Tappenden, Rachael; Richtárik, Peter; Gondzio, Jacek: Inexact coordinate descent: complexity and preconditioning (2016)
  5. Boumal, Nicolas; Absil, P.-A.: Low-rank matrix completion via preconditioned optimization on the Grassmann manifold (2015)
  6. Harchaoui, Zaid; Juditsky, Anatoli; Nemirovski, Arkadi: Conditional gradient algorithms for norm-regularized smooth convex optimization (2015)
  7. Lin, An-Ya; Ling, Qing: Decentralized and privacy-preserving low-rank matrix completion (2015)
  8. Mackey, Lester; Talwalkar, Ameet; Jordan, Michael I.: Distributed matrix completion and robust factorization (2015)
  9. Mai, The Tien; Alquier, Pierre: A Bayesian approach for noisy matrix completion: optimal rate under general sampling distribution (2015)
  10. Mankad, Shawn; Michailidis, George: Analysis of multiview legislative networks with structured matrix factorization: does Twitter influence translate to the real world? (2015)
  11. Wang, Zheng; Lai, Ming-Jun; Lu, Zhaosong; Fan, Wei; Davulcu, Hasan; Ye, Jieping: Orthogonal rank-one matrix pursuit for low rank matrix completion (2015)
  12. Xu, Yangyang; Yin, Wotao: Block stochastic gradient iteration for convex and nonconvex optimization (2015)
  13. Aravkin, Aleksandr; Kumar, Rajiv; Mansour, Hassan; Recht, Ben; Herrmann, Felix J.: Fast methods for denoising matrix completion formulations, with applications to robust seismic data interpolation (2014)
  14. Kyrillidis, Anastasios; Cevher, Volkan: Matrix recipes for hard thresholding methods (2014)
  15. Recht, Benjamin; Ré, Christopher: Parallel stochastic gradient algorithms for large-scale matrix completion (2013)