AlexNet

AlexNet is a convolutional neural network that is 8 layers deep. You can load a pretrained version of the network trained on more than a million images from the ImageNet database [1]. The pretrained network can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals. As a result, the network has learned rich feature representations for a wide range of images. The network has an image input size of 227-by-227. For more pretrained networks in MATLAB®, see Pretrained Deep Neural Networks.
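
As a quick illustration of the workflow described above, the pretrained network can be loaded and applied in a few lines of MATLAB. This is a minimal sketch, assuming Deep Learning Toolbox and its AlexNet support package are installed; the example image peppers.png is only a placeholder that ships with MATLAB.

    % Load the pretrained AlexNet (assumes the AlexNet support package is installed)
    net = alexnet;

    % The first layer reports the expected input size, 227-by-227-by-3
    inputSize = net.Layers(1).InputSize;

    % Read an example image, resize it to the network input size, and classify it
    img = imread('peppers.png');
    img = imresize(img, inputSize(1:2));
    label = classify(net, img);

Because the network was trained on ImageNet, the returned label is one of the 1000 object categories mentioned above.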


References in zbMATH (referenced in 343 articles)

Showing results 1 to 20 of 343, sorted by year (citations).


  1. Adcock, Ben; Dexter, Nick: The gap between theory and practice in function approximation with deep neural networks (2021)
  2. Chen, Tianbo; Sun, Ying; Li, Ta-Hsin: A semi-parametric estimation method for the quantile spectrum with an application to earthquake classification using convolutional neural network (2021)
  3. Chi, Heng; Zhang, Yuyu; Tang, Tsz Ling Elaine; Mirabella, Lucia; Dalloro, Livio; Song, Le; Paulino, Glaucio H.: Universal machine learning for topology optimization (2021)
  4. Effland, Alexander; Kobler, Erich; Pock, Thomas; Rajković, Marko; Rumpf, Martin: Image morphing in deep feature spaces: theory and applications (2021)
  5. Fan, Jianqing; Ma, Cong; Zhong, Yiqiao: A selective overview of deep learning (2021)
  6. Fresca, Stefania; Dedè, Luca; Manzoni, Andrea: A comprehensive deep learning-based approach to reduced order modeling of nonlinear time-dependent parametrized PDEs (2021)
  7. Gambella, Claudio; Ghaddar, Bissan; Naoum-Sawaya, Joe: Optimization problems for machine learning: a survey (2021)
  8. Gao, Yu; Zhang, Kai: Machine learning based data retrieval for inverse scattering problems with incomplete data (2021)
  9. Gordon, Andrew S. (ed.); Miller, Rob (ed.); Morgenstern, Leora (ed.); Turán, György (ed.): Preface (2021)
  10. Haghighat, Ehsan; Juanes, Ruben: SciANN: a keras/tensorflow wrapper for scientific computations and physics-informed deep learning using artificial neural networks (2021)
  11. Hao, Jie; Zhu, William: Architecture self-attention mechanism: nonlinear optimization for neural architecture search (2021)
  12. Ivek, Tomislav; Vlah, Domagoj: BlackBox: generalizable reconstruction of extremal values from incomplete spatio-temporal data (2021)
  13. Jia, Fan; Liu, Jun; Tai, Xue-Cheng: A regularized convolutional neural network for semantic image segmentation (2021)
  14. Kalogeris, Ioannis; Papadopoulos, Vissarion: Diffusion maps-aided neural networks for the solution of parametrized PDEs (2021)
  15. Keller, Rachael T.; Du, Qiang: Discovery of dynamics using linear multistep methods (2021)
  16. Khatri, Rajendra K. C.; Caseria, Brendan J.; Lou, Yifei; Xiao, Guanghua; Cao, Yan: Automatic extraction of cell nuclei using dilated convolutional network (2021)
  17. Kroemer, Oliver; Niekum, Scott; Konidaris, George: A review of robot learning for manipulation: challenges, representations, and algorithms (2021)
  18. Li, Yanting; Jin, Junwei; Zhao, Liang; Wu, Huaiguang; Sun, Lijun; Philip Chen, C. L.: A neighborhood prior constrained collaborative representation for classification (2021)
  19. Loiseau, Jean-Christophe; Brunton, Steven L.; Noack, Bernd R.: From the POD-Galerkin method to sparse manifold models (2021)
  20. Long, Ziang; Yin, Penghang; Xin, Jack: Global convergence and geometric characterization of slow to fast weight evolution in neural network training for classifying linearly non-separable data (2021)