Fashion-MNIST: a Novel Image Dataset for Benchmarking Machine Learning Algorithms. We present Fashion-MNIST, a new dataset comprising 70,000 28x28 grayscale images of fashion products from 10 categories, with 7,000 images per category. The training set has 60,000 images and the test set has 10,000 images. Fashion-MNIST is intended to serve as a direct drop-in replacement for the original MNIST dataset for benchmarking machine learning algorithms, since it shares the same image size, data format, and training/test split structure. The dataset is freely available at this https URL
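Because Fashion-MNIST reuses MNIST's data format, any existing MNIST loader works unchanged. As a minimal sketch, the shared format is the IDX binary layout: a big-endian header (magic number 2051 for image files, image count, rows, columns) followed by one unsigned byte per pixel. The parser below is illustrative, not from the paper, and is demonstrated on a synthetic in-memory file rather than the real download:

```python
import struct

import numpy as np

def load_idx_images(raw: bytes) -> np.ndarray:
    """Parse the IDX image format shared by MNIST and Fashion-MNIST.

    Header: magic number (2051 for images), image count, rows, cols,
    each a big-endian uint32; then one unsigned byte per pixel.
    """
    magic, count, rows, cols = struct.unpack(">IIII", raw[:16])
    if magic != 2051:
        raise ValueError(f"unexpected magic number: {magic}")
    pixels = np.frombuffer(raw, dtype=np.uint8, offset=16)
    return pixels.reshape(count, rows, cols)

# Demonstrate on a synthetic two-image file with Fashion-MNIST's 28x28 size.
header = struct.pack(">IIII", 2051, 2, 28, 28)
body = (bytes(range(256)) * 7)[: 2 * 28 * 28]  # dummy pixel data
images = load_idx_images(header + body)
print(images.shape)  # (2, 28, 28)
```

The same function applied to the (gzip-decompressed) `train-images-idx3-ubyte` file from either dataset would yield an array of shape `(60000, 28, 28)`.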

References in zbMATH (referenced in 36 articles)

Showing results 1 to 20 of 36.
Sorted by year (citations)


  1. Amir, Guy; Wu, Haoze; Barrett, Clark; Katz, Guy: An SMT-based approach for verifying binarized neural networks (2021)
  2. Arachie, Chidubem; Huang, Bert: A general framework for adversarial label learning (2021)
  3. Biau, Gérard; Sangnier, Maxime; Tanielian, Ugo: Some theoretical insights into Wasserstein GANs (2021)
  4. Assunção, Filipe; Lourenço, Nuno; Ribeiro, Bernardete; Machado, Penousal: Fast-DENSER: Fast Deep Evolutionary Network Structured Representation (2021) not zbMATH
  5. Giffon, Luc; Emiya, Valentin; Kadri, Hachem; Ralaivola, Liva: Quick-means: accelerating inference for K-means by learning fast transforms (2021)
  6. Gouk, Henry; Frank, Eibe; Pfahringer, Bernhard; Cree, Michael J.: Regularisation of neural networks by enforcing Lipschitz continuity (2021)
  7. Hou, Jun; Qin, Tong; Wu, Kailiang; Xiu, Dongbin: A non-intrusive correction algorithm for classification problems with corrupted data (2021)
  8. Huang, Junhao; Sun, Weize; Huang, Lei: Joint structure and parameter optimization of multiobjective sparse neural network (2021)
  9. Iwen, Mark A.; Krahmer, Felix; Krause-Solberg, Sara; Maly, Johannes: On recovery guarantees for one-bit compressed sensing on manifolds (2021)
  10. Jones, Ilenna Simone; Kording, Konrad Paul: Might a single neuron solve interesting machine learning problems through successive computations on its dendritic tree? (2021)
  11. Kopetzki, Anna-Kathrin; Günnemann, Stephan: Reachable sets of classifiers and regression models: (non-)robustness analysis and robust training (2021)
  12. Mizutani, Tomohiko: Convex programming based spectral clustering (2021)
  13. Nielsen, Frank; Sun, Ke: Chain rule optimal transport (2021)
  14. Sakai, Tomoya; Niu, Gang; Sugiyama, Masashi: Information-theoretic representation learning for positive-unlabeled classification (2021)
  15. Saul, Lawrence K.: An EM algorithm for capsule regression (2021)
  16. Shi, Junjie; Bian, Jiang; Richter, Jakob; Chen, Kuan-Hsun; Rahnenführer, Jörg; Xiong, Haoyi; Chen, Jian-Jia: MODES: model-based optimization on distributed embedded systems (2021)
  17. Wu, Mike; Parbhoo, Sonali; Hughes, Michael C.; Roth, Volker; Doshi-Velez, Finale: Optimizing for interpretability in deep neural networks with tree regularization (2021)
  18. Baldassi, Carlo; Pittorino, Fabrizio; Zecchina, Riccardo: Shaping the learning landscape in neural networks around wide flat minima (2020)
  19. Blusseau, Samy; Ponchon, Bastien; Velasco-Forero, Santiago; Angulo, Jesús; Bloch, Isabelle: Approximating morphological operators with part-based representations learned by asymmetric auto-encoders (2020)
  20. Cui, Zhenghang; Charoenphakdee, Nontawat; Sato, Issei; Sugiyama, Masashi: Classification from triplet comparison data (2020)