THE MNIST DATABASE of handwritten digits

The MNIST database of handwritten digits has a training set of 60,000 examples and a test set of 10,000 examples. It is a subset of a larger set available from NIST. The digits have been size-normalized and centered in a fixed-size image. It is a good database for people who want to try learning techniques and pattern recognition methods on real-world data while spending minimal effort on preprocessing and formatting.
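The database is distributed as IDX files (big-endian binary arrays of unsigned bytes). As a minimal sketch of how such a file can be read, assuming NumPy is available and the files are the standard gzip archives from the MNIST page; `load_idx` is an illustrative name, not part of the database:

```python
import gzip
import struct

import numpy as np


def load_idx(path):
    """Parse a file in the IDX format that MNIST is distributed in."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rb") as f:
        data = f.read()
    # The header is two zero bytes, a type code, and the number of
    # dimensions, followed by one 4-byte big-endian size per dimension.
    _zeros, dtype_code, ndim = struct.unpack(">HBB", data[:4])
    if dtype_code != 0x08:  # 0x08 = unsigned byte (used for images and labels)
        raise ValueError("unexpected IDX type code: %#x" % dtype_code)
    shape = struct.unpack(">" + "I" * ndim, data[4:4 + 4 * ndim])
    return np.frombuffer(data, dtype=np.uint8, offset=4 + 4 * ndim).reshape(shape)


# Usage (assuming the archives have been downloaded from the MNIST page):
#   images = load_idx("train-images-idx3-ubyte.gz")  # shape (60000, 28, 28)
#   labels = load_idx("train-labels-idx1-ubyte.gz")  # shape (60000,)
```

Because all values are single bytes, pixel intensities arrive in the range 0–255 and are typically rescaled before training.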

References in zbMATH (referenced in 176 articles)

Showing results 1 to 20 of 176, sorted by year (citations).


  1. Abin, Ahmad Ali; Bashiri, Mohammad Ali; Beigy, Hamid: Learning a metric when clustering data points in the presence of constraints (2020)
  2. Bellavia, Stefania; Krejić, Nataša; Morini, Benedetta: Inexact restoration with subsampled trust-region methods for finite-sum minimization (2020)
  3. Carlsson, Gunnar; Gabrielsson, Rickard Brüel: Topological approaches to deep learning (2020)
  4. He, Chaoyang; Li, Songze; So, Jinhyun; Zhang, Mi; Wang, Hongyi; Wang, Xiaoyang; Vepakomma, Praneeth; Singh, Abhishek; Qiu, Hang; Shen, Li; Zhao, Peilin; Kang, Yan; Liu, Yang; Raskar, Ramesh; Yang, Qiang; Annavaram, Murali; Avestimehr, Salman: FedML: a research library and benchmark for federated machine learning (2020) arXiv
  5. Duan, Shiyu; Yu, Shujian; Chen, Yunmei; Principe, Jose C.: On kernel method-based connectionist models and supervised deep learning without backpropagation (2020)
  6. Erway, Jennifer B.; Griffin, Joshua; Marcia, Roummel F.; Omheni, Riadh: Trust-region algorithms for training responses: machine learning methods using indefinite Hessian approximations (2020)
  7. Fung, Samy Wu; Tyrväinen, Sanna; Ruthotto, Lars; Haber, Eldad: ADMM-softmax: an ADMM approach for multinomial logistic regression (2020)
  8. Kumar, Sandeep; Ying, Jiaxi; Cardoso, José Vinícius de M.; Palomar, Daniel P.: A unified framework for structured graph learning via spectral constraints (2020)
  9. Leimkuhler, Benedict; Sachs, Matthias; Stoltz, Gabriel: Hypocoercivity properties of adaptive Langevin dynamics (2020)
  10. Liang, Tengyuan; Rakhlin, Alexander: Just interpolate: kernel “ridgeless” regression can generalize (2020)
  11. Marschall, Owen; Cho, Kyunghyun; Savin, Cristina: A unified framework of online learning algorithms for training recurrent neural networks (2020)
  12. Nguyen, Hien D.; Forbes, Florence; McLachlan, Geoffrey J.: Mini-batch learning of exponential family finite mixture models (2020)
  13. Romano, Yaniv; Aberdam, Aviad; Sulam, Jeremias; Elad, Michael: Adversarial noise attacks of deep learning architectures: stability analysis via sparse-modeled signals (2020)
  14. Sa-Couto, Luis; Wichert, Andreas: Storing object-dependent sparse codes in a Willshaw associative network (2020)
  15. Tomita, Tyler M.; Browne, James; Shen, Cencheng; Chung, Jaewon; Patsolic, Jesse L.; Falk, Benjamin; Priebe, Carey E.; Yim, Jason; Burns, Randal; Maggioni, Mauro; Vogelstein, Joshua T.: Sparse projection oblique randomer forests (2020)
  16. Tran, K. H.; Ngolè Mboula, F. M.; Starck, J. L.; Prost, V.: Semisupervised dictionary learning with graph regularized and active points (2020)
  17. Liu, Yaohua; Liu, Risheng: BOML: a modularized bilevel optimization library in Python for meta learning (2020) arXiv
  18. Ariafar, Setareh; Coll-Font, Jaume; Brooks, Dana; Dy, Jennifer: ADMMBO: Bayesian optimization with unknown constraints using ADMM (2019)
  19. Auricchio, Gennaro; Bassetti, Federico; Gualandi, Stefano; Veneroni, Marco: Computing Wasserstein barycenters via linear programming (2019)
  20. Balakrishnan, Harikrishnan Nellippallil; Kathpalia, Aditi; Saha, Snehanshu; Nagaraj, Nithin: Chaosnet: a chaos based artificial neural network architecture for classification (2019)
