mixup

mixup: Beyond Empirical Risk Minimization

Large deep neural networks are powerful, but exhibit undesirable behaviors such as memorization and sensitivity to adversarial examples. In this work, we propose mixup, a simple learning principle to alleviate these issues. In essence, mixup trains a neural network on convex combinations of pairs of examples and their labels. By doing so, mixup regularizes the neural network to favor simple linear behavior in-between training examples. Our experiments on the ImageNet-2012, CIFAR-10, CIFAR-100, Google commands and UCI datasets show that mixup improves the generalization of state-of-the-art neural network architectures. We also find that mixup reduces the memorization of corrupt labels, increases the robustness to adversarial examples, and stabilizes the training of generative adversarial networks.
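The core idea above, training on convex combinations of input/label pairs, can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' reference implementation; the function name `mixup_batch` and the default `alpha=0.2` are illustrative choices (the paper draws the mixing coefficient from a Beta(α, α) distribution and pairs examples randomly within a batch):

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Return mixup examples: convex combinations of random pairs
    of inputs and their (one-hot) labels from the same batch."""
    rng = np.random.default_rng(rng)
    lam = rng.beta(alpha, alpha)          # mixing coefficient lambda ~ Beta(alpha, alpha)
    idx = rng.permutation(len(x))         # random pairing within the batch
    x_mix = lam * x + (1 - lam) * x[idx]  # mixed inputs
    y_mix = lam * y + (1 - lam) * y[idx]  # mixed labels (stay valid distributions)
    return x_mix, y_mix
```

A training loop would simply replace each minibatch `(x, y)` with `mixup_batch(x, y)` before the forward pass; because labels are mixed with the same coefficient as inputs, the network is encouraged to behave linearly between training examples.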


References in zbMATH (referenced in 11 articles)


  1. Jain, Niharika; Olmo, Alberto; Sengupta, Sailik; Manikonda, Lydia; Kambhampati, Subbarao: Imperfect imaGANation: implications of GANs exacerbating biases on facial data augmentation and snapchat face lenses (2022)
  2. Kopetzki, Anna-Kathrin; Günnemann, Stephan: Reachable sets of classifiers and regression models: (non-)robustness analysis and robust training (2021)
  3. Liang, Senwei; Khoo, Yuehaw; Yang, Haizhao: Drop-activation: implicit parameter reduction and harmonious regularization (2021)
  4. Northcutt, Curtis G.; Jiang, Lu; Chuang, Isaac L.: Confident learning: estimating uncertainty in dataset labels (2021)
  5. Shu, Xin; Cheng, Xin; Xu, Shubin; Chen, Yunfang; Ma, Tinghuai; Zhang, Wei: How to construct low-altitude aerial image datasets for deep learning (2021)
  6. Yu, Suxiang; Zhang, Shuai; Wang, Bin; Dun, Hua; Xu, Long; Huang, Xin; Shi, Ermin; Feng, Xinxing: Generative adversarial network based data augmentation to improve cervical cell classification model (2021)
  7. Gros, Charley; Lemay, Andreanne; Vincent, Olivier; Rouhier, Lucas; Bucquet, Anthime; Cohen, Joseph Paul; Cohen-Adad, Julien: ivadomed: a medical imaging deep learning toolbox (2020) arXiv
  8. Chen, Yiming; Pan, Tianci; He, Cheng; Cheng, Ran: Efficient evolutionary deep neural architecture search (NAS) by noisy network morphism mutation (2020)
  9. Sohn, Kihyuk; Berthelot, David; Li, Chun-Liang; Zhang, Zizhao; Carlini, Nicholas; Cubuk, Ekin D.; Kurakin, Alex; Zhang, Han; Raffel, Colin: FixMatch: simplifying semi-supervised learning with consistency and confidence (2020) arXiv
  10. Oberman, Adam M.: Partial differential equation regularization for supervised machine learning (2020)
  11. van Engelen, Jesper E.; Hoos, Holger H.: A survey on semi-supervised learning (2020)