Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms. We present Fashion-MNIST, a new dataset comprising 28x28 grayscale images of 70,000 fashion products from 10 categories, with 7,000 images per category. The training set has 60,000 images and the test set has 10,000 images. Fashion-MNIST is intended to serve as a direct drop-in replacement for the original MNIST dataset for benchmarking machine learning algorithms, as it shares the same image size, data format, and training/test split structure. The dataset is freely available at this https URL
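Because Fashion-MNIST reuses MNIST's IDX binary format, any existing MNIST loader works unchanged. The sketch below parses the shared IDX header (a zero-padded magic number encoding dtype and dimensionality, followed by big-endian 32-bit dimension sizes); the function name and the synthetic header are illustrative, not from the source.

```python
import struct

def parse_idx_header(buf: bytes):
    """Parse the IDX header used by both MNIST and Fashion-MNIST files.

    Bytes 0-1 are always zero; byte 2 encodes the element dtype
    (0x08 = unsigned byte) and byte 3 the number of dimensions.
    Each dimension size follows as a big-endian 32-bit integer.
    """
    zero, dtype, ndim = struct.unpack_from(">HBB", buf, 0)
    if zero != 0:
        raise ValueError("not an IDX file")
    dims = struct.unpack_from(">" + "I" * ndim, buf, 4)
    return dtype, dims

# Synthetic header mimicking train-images-idx3-ubyte:
# magic 0x00000803, then dimensions 60000 x 28 x 28.
header = struct.pack(">HBB", 0, 0x08, 3) + struct.pack(">III", 60000, 28, 28)
print(parse_idx_header(header))  # (8, (60000, 28, 28))
```

Since the training-image file for Fashion-MNIST has the same magic number and dimensions as MNIST's, code keyed on this header needs no changes when the files are swapped.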

References in zbMATH (referenced in 52 articles)

Showing results 1 to 20 of 52.
Sorted by year (citations)


  1. Calder, Jeff; Park, Sangmin; Slepčev, Dejan: Boundary estimation from point clouds: algorithms, guarantees and applications (2022)
  2. Flores, Mauricio; Calder, Jeff; Lerman, Gilad: Analysis and algorithms for (\ell_p)-based semi-supervised learning on graphs (2022)
  3. Grabovoy, A. V.; Strijov, V. V.: Probabilistic interpretation of the distillation problem (2022)
  4. Guo, Qian; Qian, Yuhua; Liang, Xinyan: GLRM: logical pattern mining in the case of inconsistent data distribution based on multigranulation strategy (2022)
  5. Han, Zhixian; Sereno, Anne: Modeling the ventral and dorsal cortical visual pathways using artificial neural networks (2022)
  6. Lakhmiri, Dounia; Le Digabel, Sébastien: Use of static surrogates in hyperparameter optimization (2022)
  7. Park, Seonho; Adosoglou, George; Pardalos, Panos M.: Interpreting rate-distortion of variational autoencoder and using model uncertainty for anomaly detection (2022)
  8. Akuzawa, Kei; Iwasawa, Yusuke; Matsuo, Yutaka: Information-theoretic regularization for learning global features by sequential VAE (2021)
  9. Amir, Guy; Wu, Haoze; Barrett, Clark; Katz, Guy: An SMT-based approach for verifying binarized neural networks (2021)
  10. Arachie, Chidubem; Huang, Bert: A general framework for adversarial label learning (2021)
  11. Biau, Gérard; Sangnier, Maxime; Tanielian, Ugo: Some theoretical insights into Wasserstein GANs (2021)
  12. Boubekki, Ahcène; Kampffmeyer, Michael; Brefeld, Ulf; Jenssen, Robert: Joint optimization of an autoencoder for clustering and embedding (2021)
  13. Assunção, Filipe; Lourenço, Nuno; Ribeiro, Bernardete; Machado, Penousal: Fast-DENSER: fast deep evolutionary network structured representation (2021) not zbMATH
  14. Fregier, Yael; Gouray, Jean-Baptiste: Mind2Mind: transfer learning for GANs (2021)
  15. Giffon, Luc; Emiya, Valentin; Kadri, Hachem; Ralaivola, Liva: Quick-means: accelerating inference for K-means by learning fast transforms (2021)
  16. Gouk, Henry; Frank, Eibe; Pfahringer, Bernhard; Cree, Michael J.: Regularisation of neural networks by enforcing Lipschitz continuity (2021)
  17. Grabovoy, A. V.; Strijov, V. V.: Bayesian distillation of deep learning models (2021)
  18. Hou, Jun; Qin, Tong; Wu, Kailiang; Xiu, Dongbin: A non-intrusive correction algorithm for classification problems with corrupted data (2021)
  19. Huang, Junhao; Sun, Weize; Huang, Lei: Joint structure and parameter optimization of multiobjective sparse neural network (2021)
  20. Iwen, Mark A.; Krahmer, Felix; Krause-Solberg, Sara; Maly, Johannes: On recovery guarantees for one-bit compressed sensing on manifolds (2021)
