Python library lsd: Learning with Synthetic Data. The lsd package provides a lightweight framework for running learning experiments on synthetic datasets. In particular, it implements USV-layers, which make entropy computation tractable during learning via the heuristic replica formula from statistical physics proposed in the standard article (Gabrié et al., 2019) listed below. Computing the entropies requires the dedicated companion package dnner (DNNs Entropy from Replicas).
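The USV-layer construction underlying the entropy computation can be sketched as follows. This is an illustrative NumPy sketch of the general idea, assuming the parameterization described in the companion paper (weights factorized as W = U diag(s) V with U, V fixed random orthogonal matrices and only the singular values s trained); it is not the lsd package's actual API, and the class and function names here are hypothetical:

```python
import numpy as np

def random_orthogonal(n, rng):
    # QR decomposition of a Gaussian matrix yields a Haar-distributed
    # orthogonal matrix (after fixing the signs of R's diagonal).
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))

class USVLayer:
    """Hypothetical USV-layer: W = U @ diag(s) @ V, with U and V fixed
    and only the singular values s treated as trainable parameters."""

    def __init__(self, n_in, n_out, rng):
        self.U = random_orthogonal(n_out, rng)   # fixed, never trained
        self.V = random_orthogonal(n_in, rng)    # fixed, never trained
        self.s = np.ones(min(n_in, n_out))       # trainable singular values

    def weight(self):
        # Assemble the (possibly rectangular) weight matrix from s.
        W = np.zeros((self.U.shape[0], self.V.shape[0]))
        k = len(self.s)
        W[:k, :k] = np.diag(self.s)
        return self.U @ W @ self.V

    def forward(self, x):
        return self.weight() @ x

rng = np.random.default_rng(0)
layer = USVLayer(4, 4, rng)
y = layer.forward(np.ones(4))
```

Because U and V are fixed and rotationally invariant, the learned weight matrix stays within the random-matrix ensembles for which the replica formula is expected to hold, which is what makes the entropy tractable to evaluate.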
References in zbMATH (referenced in 7 articles, 1 standard article)
- Benigni, Lucas; Péché, Sandrine: Eigenvalue distribution of some nonlinear models of random matrices (2021)
- Pandit, Parthe; Sahraee-Ardakan, Mojtaba; Rangan, Sundeep; Schniter, Philip; Fletcher, Alyson K.: Matrix inference and estimation in multi-layer models (2021)
- Barbier, Jean; Chan, Chun Lam; Macris, Nicolas: Concentration of multi-overlaps for random dilute ferromagnetic spin models (2020)
- Gribonval, Rémi; Blanchard, Gilles; Keriven, Nicolas; Traonmilin, Yann: Compressive statistical learning with random feature moments (2020)
- Merkh, Thomas; Montúfar, Guido: Factorized mutual information maximization (2020)
- Barbier, Jean; Macris, Nicolas: The adaptive interpolation method: a simple scheme to prove replica formulas in Bayesian inference (2019)
- Gabrié, Marylou; Manoel, Andre; Luneau, Clément; Barbier, Jean; Macris, Nicolas; Krzakala, Florent; Zdeborová, Lenka: Entropy and mutual information in models of deep neural networks (2019)