CIFAR

The CIFAR-10 and CIFAR-100 datasets are labeled subsets of the 80 million tiny images dataset. They were collected by Alex Krizhevsky, Vinod Nair, and Geoffrey Hinton.

The CIFAR-10 dataset consists of 60000 32x32 colour images in 10 classes, with 6000 images per class. There are 50000 training images and 10000 test images. The dataset is divided into five training batches and one test batch, each with 10000 images. The test batch contains exactly 1000 randomly-selected images from each class. The training batches contain the remaining images in random order; an individual training batch may contain more images from one class than another, but between them the training batches contain exactly 5000 images from each class.

The CIFAR-100 dataset is just like CIFAR-10, except it has 100 classes containing 600 images each. There are 500 training images and 100 testing images per class. The 100 classes in CIFAR-100 are grouped into 20 superclasses. Each image comes with a "fine" label (the class to which it belongs) and a "coarse" label (the superclass to which it belongs).
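As a minimal sketch of the batch layout described above: each CIFAR-10 batch stores its 10000 images as a 10000x3072 uint8 array, where each row holds the 1024 red bytes, then 1024 green, then 1024 blue of one 32x32 image in row-major order. The snippet below reshapes such an array into conventional (N, 32, 32, 3) images; it uses a synthetic zero-filled array as a stand-in for a real unpickled batch, so no data files are assumed.

```python
import numpy as np

def batch_to_images(data):
    """Reshape a (N, 3072) CIFAR batch array into (N, 32, 32, 3) images.

    Each 3072-byte row is channel-major (1024 R, 1024 G, 1024 B), so we
    first split into (N, 3, 32, 32) and then move channels last.
    """
    return data.reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)

# Synthetic stand-in for one batch; a real batch comes from unpickling a
# data_batch file and reading its 'data' entry.
fake_batch = np.zeros((10000, 3072), dtype=np.uint8)
images = batch_to_images(fake_batch)
print(images.shape)  # (10000, 32, 32, 3)
```

The same reshaping applies to the CIFAR-100 files, which use the identical per-image byte layout.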


References in zbMATH (referenced in 151 articles)

Showing results 41 to 60 of 151, sorted by year (citations).
  1. Wang, Yifei; Jia, Zeyu; Wen, Zaiwen: Search direction correction with normalized gradient makes first-order methods faster (2021)
  2. Wen, Ming; Xu, Yixi; Zheng, Yunling; Yang, Zhouwang; Wang, Xiao: Sparse deep neural networks using (L_{1,\infty})-weight normalization (2021)
  3. Wilson, Paul; Zanasi, Fabio: Reverse derivative ascent: a categorical approach to learning Boolean circuits (2021)
  4. Xiao, Danyang; Mei, Yuan; Kuang, Di; Chen, Mengqiang; Guo, Binbin; Wu, Weigang: EGC: entropy-based gradient compression for distributed deep learning (2021)
  5. Yang, Hongfei; Ding, Xiaofeng; Chan, Raymond; Hu, Hui; Peng, Yaxin; Zeng, Tieyong: A new initialization method based on normed statistical spaces in deep networks (2021)
  6. Yu, Jiahui; Spiliopoulos, Konstantinos: Normalization effects on shallow neural networks and related asymptotic expansions (2021)
  7. Zhao, Xing; Papagelis, Manos; An, Aijun; Chen, Bao Xin; Liu, Junfeng; Hu, Yonggang: Zipline: an optimized algorithm for the elastic bulk synchronous parallel model (2021)
  8. Abbasnejad, M. Ehsan; Shi, Javen; van den Hengel, Anton; Liu, Lingqiao: GADE: a generative adversarial approach to density estimation and its applications (2020)
  9. Aryal, Sunil; Ting, Kai Ming; Washio, Takashi; Haffari, Gholamreza: A comparative study of data-dependent approaches without learning in measuring similarities of data objects (2020)
  10. Bang, Duhyeon; Kang, Seoungyoon; Shim, Hyunjung: Discriminator feature-based inference by recycling the discriminator of GANs (2020)
  11. Borisyak, Maxim; Ryzhikov, Artem; Ustyuzhanin, Andrey; Derkach, Denis; Ratnikov, Fedor; Mineeva, Olga: ((1+\varepsilon))-class classification: an anomaly detection method for highly imbalanced or incomplete data sets (2020)
  12. Carlsson, Gunnar; Gabrielsson, Rickard Brüel: Topological approaches to deep learning (2020)
  13. Cui, Zhenghang; Charoenphakdee, Nontawat; Sato, Issei; Sugiyama, Masashi: Classification from triplet comparison data (2020)
  14. Duan, Shiyu; Yu, Shujian; Chen, Yunmei; Principe, Jose C.: On kernel method-based connectionist models and supervised deep learning without backpropagation (2020)
  15. Frazier-Logue, Noah; Hanson, Stephen José: The stochastic delta rule: faster and more accurate deep learning through adaptive weight noise (2020)
  16. Fung, Samy Wu; Tyrväinen, Sanna; Ruthotto, Lars; Haber, Eldad: ADMM-softmax: an ADMM approach for multinomial logistic regression (2020)
  17. George Kyriakides, Konstantinos Margaritis: NORD: A python framework for Neural Architecture Search (2020) not zbMATH
  18. Georgiev, Dobromir; Gurov, Todor: Distributed deep learning on heterogeneous computing resources using gossip communication (2020)
  19. Gu, Xue; Meng, Ziyao; Liang, Yanchun; Xu, Dong; Huang, Han; Han, Xiaosong; et al.: ESAE: evolutionary strategy-based architecture evolution (2020)
  20. Harshvardhan, G. M.; Gourisaria, Mahendra Kumar; Pandey, Manjusha; Rautaray, Siddharth Swarup: A comprehensive survey and analysis of generative models in machine learning (2020)