CIFAR

The CIFAR-10 and CIFAR-100 datasets are labeled subsets of the 80 million tiny images dataset, collected by Alex Krizhevsky, Vinod Nair, and Geoffrey Hinton.

The CIFAR-10 dataset consists of 60000 32x32 colour images in 10 classes, with 6000 images per class: 50000 training images and 10000 test images. The dataset is divided into five training batches and one test batch, each with 10000 images. The test batch contains exactly 1000 randomly selected images from each class. The training batches contain the remaining images in random order; an individual training batch may contain more images from one class than another, but between them the five batches contain exactly 5000 images from each class.

The CIFAR-100 dataset is just like CIFAR-10, except it has 100 classes containing 600 images each: 500 training images and 100 testing images per class. The 100 classes are grouped into 20 superclasses, and each image comes with a "fine" label (the class to which it belongs) and a "coarse" label (the superclass to which it belongs).
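The batch layout described above maps directly onto the files in the Python version of the official download, where each batch is a pickled dict holding a 10000x3072 uint8 image array (1024 red, then 1024 green, then 1024 blue values per row) and a list of class indices. Below is a minimal loading sketch under that assumption; the directory name cifar-10-batches-py is the default from the extracted archive, and paths should be adjusted to the local setup.

```python
import pickle
import numpy as np

def unpickle(path):
    # Each CIFAR batch file is a pickled dict; with Python 3 the keys
    # come back as bytes, hence encoding="bytes".
    with open(path, "rb") as f:
        return pickle.load(f, encoding="bytes")

def load_batch(path):
    batch = unpickle(path)
    # b"data" is a 10000x3072 uint8 array: each row is one 32x32 image,
    # stored channel-first (all red, then green, then blue values).
    # Reshape to (N, 3, 32, 32) and transpose to (N, 32, 32, 3) HWC.
    data = batch[b"data"].reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)
    labels = np.array(batch[b"labels"])
    return data, labels

# Assumed local path from the extracted cifar-10-python archive.
train_images, train_labels = load_batch("cifar-10-batches-py/data_batch_1")
print(train_images.shape, train_labels.shape)  # (10000, 32, 32, 3) (10000,)
```

The CIFAR-100 Python version follows the same format, with single train and test files and with b"fine_labels" and b"coarse_labels" keys in place of b"labels", corresponding to the class and superclass labels described above.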


References in zbMATH (referenced in 167 articles)

Showing results 101 to 120 of 167, sorted by year (citations).
  1. Wu, Bijiao; Wang, Dingheng; Zhao, Guangshe; Deng, Lei; Li, Guoqi: Hybrid tensor decomposition in neural network compression (2020)
  2. Wu, Jiasong; Wu, Fuzhi; Yang, Qihan; Zhang, Yan; Liu, Xilin; Kong, Youyong; Senhadji, Lotfi; Shu, Huazhong: Fractional spectral graph wavelets and their applications (2020)
  3. Wu, Min; Wicker, Matthew; Ruan, Wenjie; Huang, Xiaowei; Kwiatkowska, Marta: A game-based approximate verification of deep neural networks with provable guarantees (2020)
  4. Xu, Jian; Liu, Heng; Wu, Dexin; Zhou, Fucai; Gao, Chong-zhi; Jiang, Linzhi: Generating universal adversarial perturbation with ResNet (2020)
  5. Zheng, Qinghe; Tian, Xinyu; Yang, Mingqiang; Wu, Yulin; Su, Huake: PAC-Bayesian framework based drop-path method for 2D discriminative convolutional network pruning (2020)
  6. Zheng, Qinghe; Yang, Mingqiang; Tian, Xinyu; Jiang, Nan; Wang, Deqiang: A full stage data augmentation method in deep convolutional neural network for natural image classification (2020)
  7. Zhou, Dongruo; Xu, Pan; Gu, Quanquan: Stochastic nested variance reduction for nonconvex optimization (2020)
  8. Zou, Difan; Cao, Yuan; Zhou, Dongruo; Gu, Quanquan: Gradient descent optimizes over-parameterized deep ReLU networks (2020)
  9. Altalhi, A. H.; Forcén, J. I.; Pagola, M.; Barrenechea, E.; Bustince, H.; Takáč, Zdenko: Moderate deviation and restricted equivalence functions for measuring similarity between data (2019)
  10. Chang, Bo; Chen, Minmin; Haber, Eldad; Chi, Ed H.: AntisymmetricRNN: a dynamical system view on recurrent neural networks (2019) arXiv
  11. Chaudhari, Pratik; Choromanska, Anna; Soatto, Stefano; LeCun, Yann; Baldassi, Carlo; Borgs, Christian; Chayes, Jennifer; Sagun, Levent; Zecchina, Riccardo: Entropy-SGD: biasing gradient descent into wide valleys (2019)
  12. Cherubin, Giovanni: Majority vote ensembles of conformal predictors (2019)
  13. Durkan, Conor; Bekasov, Artur; Murray, Iain; Papamakarios, George: Neural spline flows (2019) arXiv
  14. Cowen, Benjamin; Saridena, Apoorva Nandini; Choromanska, Anna: LSALSA: accelerated source separation via learned sparse coding (2019)
  15. Ding, Hu; Yu, Haikuo; Wang, Zixiu: Greedy strategy works for k-center clustering with outliers and coreset construction (2019)
  16. Dong, Yinpeng; Ni, Renkun; Li, Jianguo; Chen, Yurong; Su, Hang; Zhu, Jun: Stochastic quantization for learning accurate low-bit deep neural networks (2019)
  17. Han, Huimei; Li, Ying; Zhu, Xingquan: Convolutional neural network learning for generic data classification (2019)
  18. He, Juncai; Xu, Jinchao: MgNet: a unified framework of multigrid and convolutional neural network (2019)
  19. Hidaka, Akinori; Watanabe, Kenji; Kurita, Takio: Sparse discriminant analysis based on estimation of posterior probabilities (2019)
  20. Higham, Catherine F.; Higham, Desmond J.: Deep learning: an introduction for applied mathematicians (2019)