THE MNIST DATABASE of handwritten digits. The MNIST database of handwritten digits, available from this page, has a training set of 60,000 examples and a test set of 10,000 examples. It is a subset of a larger set available from NIST. The digits have been size-normalized and centered in a fixed-size image. It is a good database for people who want to try learning techniques and pattern recognition methods on real-world data while spending minimal effort on preprocessing and formatting.
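The database is distributed as four files in the simple IDX binary format (big-endian magic number and dimension counts, followed by raw unsigned-byte pixel or label data). As a minimal sketch, assuming the files have already been downloaded and decompressed, the images and labels can be parsed with nothing more than the standard library and NumPy:

```python
import struct

import numpy as np


def parse_idx_images(data: bytes) -> np.ndarray:
    """Parse an IDX image file into a (n, rows, cols) uint8 array.

    The 16-byte header holds four big-endian uint32 values:
    magic number (2051 for image files), image count, rows, cols.
    """
    magic, n, rows, cols = struct.unpack(">IIII", data[:16])
    if magic != 2051:
        raise ValueError(f"not an IDX image file (magic={magic})")
    pixels = np.frombuffer(data, dtype=np.uint8, offset=16)
    return pixels.reshape(n, rows, cols)


def parse_idx_labels(data: bytes) -> np.ndarray:
    """Parse an IDX label file (magic 2049) into a (n,) uint8 array."""
    magic, n = struct.unpack(">II", data[:8])
    if magic != 2049:
        raise ValueError(f"not an IDX label file (magic={magic})")
    return np.frombuffer(data, dtype=np.uint8, offset=8, count=n)
```

For the real dataset one would read `train-images-idx3-ubyte` and `train-labels-idx1-ubyte` from disk and pass the raw bytes to these functions; the returned arrays are then ready for any of the learning methods cited below.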

References in zbMATH (referenced in 264 articles)

Showing results 1 to 20 of 264.
Sorted by year (citations)


  1. Chen, Qipin; Hao, Wenrui; He, Juncai: A weight initialization based on the linear product structure for neural networks (2022)
  2. Pfannschmidt, Karlson; Gupta, Pritha; Haddenhorst, Björn; Hüllermeier, Eyke: Learning context-dependent choice functions (2022)
  3. Rudin, Cynthia; Chen, Chaofan; Chen, Zhi; Huang, Haiyang; Semenova, Lesia; Zhong, Chudi: Interpretable machine learning: fundamental principles and 10 grand challenges (2022)
  4. Stehr, Mark-Oliver; Kim, Minyoung; Talcott, Carolyn L.: A probabilistic approximate logic for neuro-symbolic learning and reasoning (2022)
  5. Sukumar, N.; Srivastava, Ankit: Exact imposition of boundary conditions with distance functions in physics-informed deep neural networks (2022)
  6. Watanabe, Satoru; Yamana, Hayato: Topological measurement of deep neural networks using persistent homology (2022)
  7. Aquilanti, Laura; Cacace, Simone; Camilli, Fabio; De Maio, Raul: A mean field games model for finite mixtures of Bernoulli and categorical distributions (2021)
  8. Benaissa, Ayoub; Retiat, Bilal; Cebere, Bogdan; Belfedhal, Alaa Eddine: TenSEAL: a library for encrypted tensor operations using homomorphic encryption (2021) arXiv
  9. Basani, Jasvith Raj; Bhattacherjee, Aranya: Continuous-variable deep quantum neural networks for flexible learning of structured classical information (2021)
  10. Baskerville, Nicholas P.; Keating, Jonathan P.; Mezzadri, Francesco; Najnudel, Joseph: The loss surfaces of neural networks with general activation functions (2021)
  11. Benning, Martin; Betcke, Marta M.; Ehrhardt, Matthias J.; Schönlieb, Carola-Bibiane: Choose your path wisely: gradient descent in a Bregman distance framework (2021)
  12. Benny, Yaniv; Galanti, Tomer; Benaim, Sagie; Wolf, Lior: Evaluation metrics for conditional image generation (2021)
  13. Chang, Woonyoung; Ahn, Jeongyoun; Jung, Sungkyu: Double data piling leads to perfect classification (2021)
  14. Chzhen, Evgenii; Denis, Christophe; Hebiri, Mohamed: Minimax semi-supervised set-valued approach to multi-class classification (2021)
  15. Cloninger, A.; Mhaskar, H. N.: Cautious active clustering (2021)
  16. De Loera, Jesús A.; Haddock, Jamie; Ma, Anna; Needell, Deanna: Data-driven algorithm selection and tuning in optimization and signal processing (2021)
  17. Frye, Charles G.; Simon, James; Wadia, Neha S.; Ligeralde, Andrew; Deweese, Michael R.; Bouchard, Kristofer E.: Critical point-finding methods reveal gradient-flat regions of deep network losses (2021)
  18. Geiger, Mario; Petrini, Leonardo; Wyart, Matthieu: Landscape and training regimes in deep learning (2021)
  19. Ghods, Alireza; Cook, Diane J.: A survey of deep network techniques all classifiers can adopt (2021)
  20. Giffon, Luc; Emiya, Valentin; Kadri, Hachem; Ralaivola, Liva: Quick-means: accelerating inference for K-means by learning fast transforms (2021)
