TheCommitteeMachine
The committee machine: computational to statistical gaps in learning a two-layers neural network. Heuristic tools from statistical physics have been used in the past to locate phase transitions and to compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks. In this paper, we provide a rigorous justification of these approaches for a two-layer neural network model called the committee machine, under a technical assumption. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that allows optimal learning in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists in those cases and unveiling a large computational gap.
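For readers unfamiliar with the model, the following is a minimal sketch of the teacher-student data-generating process the abstract refers to: a committee machine labels an input by a majority vote over K sign perceptrons, and the student must learn the teacher's weights from input-label pairs. The variable names, sizes, and tie-breaking rule below are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Teacher-student committee machine (illustrative sketch).
# A committee machine with K hidden units labels an input x by
#     y = sign( sum_k sign(w_k . x / sqrt(n)) ),
# i.e. a majority vote over K sign perceptrons.

rng = np.random.default_rng(0)
n, K, m = 1000, 3, 5000  # input dimension, hidden units, samples (assumed values)

# Teacher: K Gaussian weight vectors, one per hidden unit.
W_teacher = rng.standard_normal((K, n))

# i.i.d. Gaussian inputs, one sample per row.
X = rng.standard_normal((m, n))

def committee(W, X):
    """Majority vote of K sign units; ties broken toward +1 (assumed convention)."""
    hidden = np.sign(X @ W.T / np.sqrt(X.shape[1]))  # (m, K) hidden activations
    return np.where(hidden.sum(axis=1) >= 0, 1, -1)

y = committee(W_teacher, X)  # labels the student learns from (X, y)
```

The paper's AMP algorithm then attempts to recover W_teacher from (X, y) in polynomial time; the computational gap mentioned above concerns regimes where this recovery is information-theoretically possible but AMP (and, conjecturally, any efficient algorithm) fails.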
References in zbMATH (referenced in 5 articles, 1 standard article)
Sorted by year:
- Oostwal, Elisa; Straat, Michiel; Biehl, Michael: Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (2021)
- Barbier, Jean; Chan, Chun Lam; Macris, Nicolas: Concentration of multi-overlaps for random dilute ferromagnetic spin models (2020)
- Huang, Hanwen; Yang, Qinglong: Large scale analysis of generalization error in learning using margin based classification methods (2020)
- Aubin, Benjamin; Maillard, Antoine; Barbier, Jean; Krzakala, Florent; Macris, Nicolas; Zdeborová, Lenka: The committee machine: computational to statistical gaps in learning a two-layers neural network (2019)
- Barbier, Jean; Macris, Nicolas: The adaptive interpolation method: a simple scheme to prove replica formulas in Bayesian inference (2019)