LowRankModels
LowRankModels.jl is a Julia package for modeling and fitting generalized low rank models (GLRMs). GLRMs model a data array by a low rank matrix, and include many well-known models in data analysis, such as principal components analysis (PCA), matrix completion, robust PCA, nonnegative matrix factorization, k-means, and many more. For more information on GLRMs, see our paper. There is a Python interface to this package, and a GLRM implementation in the H2O machine learning platform with interfaces in a variety of languages.

LowRankModels.jl makes it easy to mix and match loss functions and regularizers to construct a model suitable for a particular data set. In particular, it supports:

- using different loss functions for different columns of the data array, which is useful when data types are heterogeneous (e.g., real, boolean, and ordinal columns);
- fitting the model to only some of the entries in the table, which is useful for data tables with many missing (unobserved) entries; and
- adding offsets and scalings to the model without destroying sparsity, which is useful when the data is poorly scaled.
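The mix-and-match design described above can be sketched as follows. This is an illustrative example based on the package's documented API (`GLRM`, `fit!`, `QuadLoss`, `HingeLoss`, and `QuadReg` are exports of LowRankModels.jl); the toy data matrix and dimensions are made up for the sketch:

```julia
using LowRankModels

# A small heterogeneous data matrix: two real-valued columns
# and one boolean column encoded as ±1.
A = [1.0  2.0   1;
     3.0  4.0  -1;
     5.0  6.0   1]

# A different loss for each column: quadratic losses for the
# real columns, a hinge loss for the boolean column.
losses = [QuadLoss(), QuadLoss(), HingeLoss()]

# Quadratic regularization on both factors.
rx = QuadReg(0.1)
ry = QuadReg(0.1)

k = 2  # target rank

# Form and fit the GLRM; the fitted factors satisfy A ≈ X'Y,
# with X of size k × m and Y of size k × n.
glrm = GLRM(A, losses, rx, ry, k)
X, Y, ch = fit!(glrm)
```

Swapping in other losses (e.g., an ordinal loss for an ordinal column) or other regularizers (e.g., a nonnegativity constraint) changes the model without changing the fitting code, which is the sense in which losses and regularizers "mix and match".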
References in zbMATH (referenced in 31 articles, 1 standard article)
Showing results 1 to 20 of 31.
- Abdolali, Maryam; Gillis, Nicolas: Simplex-structured matrix factorization: sparsity-based identifiability and provably correct algorithms (2021)
- Lin, Kevin Z.; Lei, Jing; Roeder, Kathryn: Exponential-family embedding with application to cell developmental trajectories for single-cell RNA-seq data (2021)
- Aggarwal, Charu C.: Linear algebra and optimization for machine learning. A textbook (2020)
- Bossmann, Florian; Ma, Jianwei: Enhanced image approximation using shifted rank-1 reconstruction (2020)
- Chen, Yunxiao; Li, Xiaoou; Zhang, Siliang: Structured latent factor analysis for large-scale data: identifiability, estimability, and their implications (2020)
- Galuzzi, B. G.; Giordani, I.; Candelieri, A.; Perego, R.; Archetti, F.: Hyperparameter optimization for recommender systems through Bayesian optimization (2020)
- Hong, David; Kolda, Tamara G.; Duersch, Jed A.: Generalized canonical polyadic tensor decomposition (2020)
- Kallus, Nathan; Udell, Madeleine: Dynamic assortment personalization in high dimensions (2020)
- Landgraf, Andrew J.; Lee, Yoonkyung: Dimensionality reduction for binary data through the projection of natural parameters (2020)
- Li, Xinrong; Xiu, Naihua; Zhou, Shenglong: Matrix optimization over low-rank spectral sets: stationary points and local and global minimizers (2020)
- Lumbreras, Alberto; Filstroff, Louis; Févotte, Cédric: Bayesian mean-parameterized nonnegative binary matrix factorization (2020)
- Robin, Geneviève; Klopp, Olga; Josse, Julie; Moulines, Éric; Tibshirani, Robert: Main effects and interactions in mixed and incomplete data frames (2020)
- Shen, Rui; Meng, Zhiqing; Jiang, Min: Smoothing partially exact penalty function of biconvex programming (2020)
- Sportisse, Aude; Boyer, Claire; Josse, Julie: Imputation and low-rank estimation with missing not at random data (2020)
- Alaya, Mokhtar Z.; Klopp, Olga: Collective matrix completion (2019)
- Bai, Jushan; Ng, Serena: Rank regularized estimation of approximate factor models (2019)
- Balcan, Maria-Florina; Liang, Yingyu; Song, Zhao; Woodruff, David P.; Zhang, Hongyang: Non-convex matrix completion and related problems via strong duality (2019)
- Daneshmand, Amir; Sun, Ying; Scutari, Gesualdo; Facchinei, Francisco; Sadler, Brian M.: Decentralized dictionary learning over time-varying digraphs (2019)
- Driggs, Derek; Becker, Stephen; Aravkin, Aleksandr: Adapting regularized low-rank models for parallel architectures (2019)
- Gillis, Nicolas; Shitov, Yaroslav: Low-rank matrix approximation in the infinity norm (2019)