MNIST

The MNIST database of handwritten digits has a training set of 60,000 examples and a test set of 10,000 examples. It is a subset of a larger set available from NIST. The digits have been size-normalized and centered in a fixed-size image. It is a good database for people who want to try learning techniques and pattern recognition methods on real-world data while spending minimal effort on preprocessing and formatting.
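Because the digits are already size-normalized and centered, getting started requires essentially no custom preprocessing. As a minimal loading sketch, assuming TensorFlow/Keras is installed (its bundled keras.datasets loader downloads the files on first use), the canonical train/test split described above can be retrieved like this:

    # Load the standard MNIST train/test split via the Keras loader.
    from tensorflow.keras.datasets import mnist

    (x_train, y_train), (x_test, y_test) = mnist.load_data()

    # Shapes match the description above: 60,000 training and 10,000 test
    # images, each a 28x28 grayscale array, with integer labels 0-9.
    print(x_train.shape, y_train.shape)  # (60000, 28, 28) (60000,)
    print(x_test.shape, y_test.shape)    # (10000, 28, 28) (10000,)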


References in zbMATH (referenced in 185 articles)

Showing results 1 to 20 of 185, sorted by year (citations).

  1. Abin, Ahmad Ali; Bashiri, Mohammad Ali; Beigy, Hamid: Learning a metric when clustering data points in the presence of constraints (2020)
  2. Bauvin, Baptiste; Capponi, Cécile; Roy, Jean-Francis; Laviolette, François: Fast greedy \(\mathcal{C}\)-bound minimization with guarantees (2020)
  3. Bellavia, Stefania; Krejić, Nataša; Morini, Benedetta: Inexact restoration with subsampled trust-region methods for finite-sum minimization (2020)
  4. Boutin, Victor; Franciosini, Angelo; Ruffier, Franck; Perrinet, Laurent: Effect of top-down connections in hierarchical sparse coding (2020)
  5. Carlsson, Gunnar; Gabrielsson, Rickard Brüel: Topological approaches to deep learning (2020)
  6. Challa, Aditya; Danda, Sravan; Sagar, B. S. Daya; Najman, Laurent: Power spectral clustering (2020)
  7. He, Chaoyang; Li, Songze; So, Jinhyun; Zhang, Mi; Wang, Hongyi; Wang, Xiaoyang; Vepakomma, Praneeth; Singh, Abhishek; Qiu, Hang; Shen, Li; Zhao, Peilin; Kang, Yan; Liu, Yang; Raskar, Ramesh; Yang, Qiang; Annavaram, Murali; Avestimehr, Salman: FedML: a research library and benchmark for federated machine learning (2020) arXiv
  8. Duan, Shiyu; Yu, Shujian; Chen, Yunmei; Principe, Jose C.: On kernel method-based connectionist models and supervised deep learning without backpropagation (2020)
  9. Erway, Jennifer B.; Griffin, Joshua; Marcia, Roummel F.; Omheni, Riadh: Trust-region algorithms for training responses: machine learning methods using indefinite Hessian approximations (2020)
  10. Frady, E. Paxon; Kent, Spencer J.; Olshausen, Bruno A.; Sommer, Friedrich T.: Resonator networks. I: An efficient solution for factoring high-dimensional, distributed representations of data structures (2020)
  11. Fung, Samy Wu; Tyrväinen, Sanna; Ruthotto, Lars; Haber, Eldad: ADMM-softmax: an ADMM approach for multinomial logistic regression (2020)
  12. Kumar, Sandeep; Ying, Jiaxi; Cardoso, José Vinícius de M.; Palomar, Daniel P.: A unified framework for structured graph learning via spectral constraints (2020)
  13. Leimkuhler, Benedict; Sachs, Matthias; Stoltz, Gabriel: Hypocoercivity properties of adaptive Langevin dynamics (2020)
  14. Liang, Tengyuan; Rakhlin, Alexander: Just interpolate: kernel “ridgeless” regression can generalize (2020)
  15. Marschall, Owen; Cho, Kyunghyun; Savin, Cristina: A unified framework of online learning algorithms for training recurrent neural networks (2020)
  16. Nguyen, Hien D.; Forbes, Florence; McLachlan, Geoffrey J.: Mini-batch learning of exponential family finite mixture models (2020)
  17. Romano, Yaniv; Aberdam, Aviad; Sulam, Jeremias; Elad, Michael: Adversarial noise attacks of deep learning architectures: stability analysis via sparse-modeled signals (2020)
  18. Sa-Couto, Luis; Wichert, Andreas: Storing object-dependent sparse codes in a Willshaw associative network (2020)
  19. Song, Hwanjun; Kim, Sundong; Kim, Minseok; Lee, Jae-Gil: Ada-boundary: accelerating DNN training via adaptive boundary batch selection (2020)
  20. Tomita, Tyler M.; Browne, James; Shen, Cencheng; Chung, Jaewon; Patsolic, Jesse L.; Falk, Benjamin; Priebe, Carey E.; Yim, Jason; Burns, Randal; Maggioni, Mauro; Vogelstein, Joshua T.: Sparse projection oblique randomer forests (2020)
