LowRankModels

LowRankModels.jl is a Julia package for modeling and fitting generalized low rank models (GLRMs). A GLRM models a data array by a low rank matrix, and this model class includes many well-known models in data analysis, such as principal components analysis (PCA), matrix completion, robust PCA, nonnegative matrix factorization, k-means, and many more. For more information on GLRMs, see our paper. There is a Python interface to this package, and a GLRM implementation in the H2O machine learning platform with interfaces in a variety of languages.

LowRankModels.jl makes it easy to mix and match loss functions and regularizers to construct a model suitable for a particular data set. In particular, it supports:

- using different loss functions for different columns of the data array, which is useful when data types are heterogeneous (e.g., real, Boolean, and ordinal columns);
- fitting the model to only some of the entries in the table, which is useful for data tables with many missing (unobserved) entries; and
- adding offsets and scalings to the model without destroying sparsity, which is useful when the data is poorly scaled.
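To make the model class concrete, here is a minimal sketch of the simplest GLRM: rank-1 factorization under quadratic loss, fit by alternating least squares over only the observed entries (one of the features listed above). This is illustrative Python pseudocode under stated assumptions, not the package's Julia API; all names are hypothetical.

```python
# Hypothetical sketch: rank-1 GLRM with quadratic loss, fit only on the
# observed entries via alternating least squares. Not the Julia API of
# LowRankModels.jl; for exposition only.

def fit_rank1_glrm(A, observed, iters=200):
    """Fit A[i][j] ~ x[i] * y[j] over the observed set of (i, j) pairs."""
    m, n = len(A), len(A[0])
    x = [1.0] * m
    y = [1.0] * n
    for _ in range(iters):
        # Closed-form least-squares update of each row factor x[i],
        # using only the entries of row i that are observed.
        for i in range(m):
            num = sum(A[i][j] * y[j] for j in range(n) if (i, j) in observed)
            den = sum(y[j] ** 2 for j in range(n) if (i, j) in observed)
            if den > 0:
                x[i] = num / den
        # Symmetric update of each column factor y[j].
        for j in range(n):
            num = sum(A[i][j] * x[i] for i in range(m) if (i, j) in observed)
            den = sum(x[i] ** 2 for i in range(m) if (i, j) in observed)
            if den > 0:
                y[j] = num / den
    return x, y

# Rank-1 data with one entry held out ("missing"): the low rank model
# imputes the unobserved entry from the observed ones.
A = [[3.0, 4.0, 5.0],
     [6.0, 8.0, 10.0]]  # outer product of [1, 2] and [3, 4, 5]
observed = {(i, j) for i in range(2) for j in range(3)} - {(1, 2)}
x, y = fit_rank1_glrm(A, observed)
imputed = x[1] * y[2]  # estimate of the held-out entry A[1][2] = 10
```

Swapping the quadratic loss for a hinge or ordinal loss per column, and adding regularizers on `x` and `y`, gives the heterogeneous-data models the package is designed for.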


References in zbMATH (referenced in 27 articles, 1 standard article)

Showing results 1 to 20 of 27, sorted by year (citations).


  1. Aggarwal, Charu C.: Linear algebra and optimization for machine learning. A textbook (2020)
  2. Bossmann, Florian; Ma, Jianwei: Enhanced image approximation using shifted rank-1 reconstruction (2020)
  3. Chen, Yunxiao; Li, Xiaoou; Zhang, Siliang: Structured latent factor analysis for large-scale data: identifiability, estimability, and their implications (2020)
  4. Hong, David; Kolda, Tamara G.; Duersch, Jed A.: Generalized canonical polyadic tensor decomposition (2020)
  5. Kallus, Nathan; Udell, Madeleine: Dynamic assortment personalization in high dimensions (2020)
  6. Landgraf, Andrew J.; Lee, Yoonkyung: Dimensionality reduction for binary data through the projection of natural parameters (2020)
  7. Li, Xinrong; Xiu, Naihua; Zhou, Shenglong: Matrix optimization over low-rank spectral sets: stationary points and local and global minimizers (2020)
  8. Lumbreras, Alberto; Filstroff, Louis; Févotte, Cédric: Bayesian mean-parameterized nonnegative binary matrix factorization (2020)
  9. Robin, Geneviève; Klopp, Olga; Josse, Julie; Moulines, Éric; Tibshirani, Robert: Main effects and interactions in mixed and incomplete data frames (2020)
  10. Shen, Rui; Meng, Zhiqing; Jiang, Min: Smoothing partially exact penalty function of biconvex programming (2020)
  11. Sportisse, Aude; Boyer, Claire; Josse, Julie: Imputation and low-rank estimation with missing not at random data (2020)
  12. Alaya, Mokhtar Z.; Klopp, Olga: Collective matrix completion (2019)
  13. Bai, Jushan; Ng, Serena: Rank regularized estimation of approximate factor models (2019)
  14. Balcan, Maria-Florina; Liang, Yingyu; Song, Zhao; Woodruff, David P.; Zhang, Hongyang: Non-convex matrix completion and related problems via strong duality (2019)
  15. Daneshmand, Amir; Sun, Ying; Scutari, Gesualdo; Facchinei, Francisco; Sadler, Brian M.: Decentralized dictionary learning over time-varying digraphs (2019)
  16. Driggs, Derek; Becker, Stephen; Aravkin, Aleksandr: Adapting regularized low-rank models for parallel architectures (2019)
  17. Gillis, Nicolas; Shitov, Yaroslav: Low-rank matrix approximation in the infinity norm (2019)
  18. Ungun, Baris; Xing, Lei; Boyd, Stephen: Real-time radiation treatment planning with optimality guarantees via cluster and bound methods (2019)
  19. Fithian, William; Mazumder, Rahul: Flexible low-rank statistical modeling with missing data and side information (2018)
  20. Liu, Lydia T.; Dobriban, Edgar; Singer, Amit: ePCA: high dimensional exponential family PCA (2018)
