Adam

Adam: A Method for Stochastic Optimization. We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has low memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and for problems with very noisy and/or sparse gradients. The hyper-parameters have intuitive interpretations and typically require little tuning. Some connections to related algorithms, by which Adam was inspired, are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results demonstrate that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we discuss AdaMax, a variant of Adam based on the infinity norm.
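Since the abstract describes the method only in words, a minimal NumPy sketch of the update rules may be useful. The recursions and the suggested defaults (alpha = 0.001, beta1 = 0.9, beta2 = 0.999, eps = 1e-8 for Adam; alpha = 0.002 for AdaMax) follow the paper; the function names and the toy usage are illustrative, not part of the original.

```python
import numpy as np

def adam_step(theta, grad, m, v, t,
              alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad        # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # biased second raw-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)              # bias-corrected second moment
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

def adamax_step(theta, grad, m, u, t,
                alpha=0.002, beta1=0.9, beta2=0.999):
    """One AdaMax update: the second moment is replaced by an
    exponentially weighted infinity-norm estimate, so no eps is needed."""
    m = beta1 * m + (1 - beta1) * grad
    u = np.maximum(beta2 * u, np.abs(grad))   # infinity-norm accumulator
    theta = theta - (alpha / (1 - beta1 ** t)) * m / u
    return theta, m, u

# Toy usage: minimize f(x) = ||x||^2, whose gradient is 2x.
theta = np.array([1.0, -2.0])
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 1001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
```

The bias-correction terms (1 - beta^t) compensate for the zero initialization of the moment estimates, which would otherwise bias early steps toward zero.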


References in zbMATH (referenced in 453 articles)

Showing results 1 to 20 of 453, sorted by year (citations).


  1. Adcock, Ben; Dexter, Nick: The gap between theory and practice in function approximation with deep neural networks (2021)
  2. Ainsworth, Mark; Dong, Justin: Galerkin neural networks: a framework for approximating variational equations with error control (2021)
  3. Anderson, Lara B.; Gerdes, Mathis; Gray, James; Krippendorf, Sven; Raghuram, Nikhil; Ruehle, Fabian: Moduli-dependent Calabi-Yau and SU(3)-structure metrics from machine learning (2021)
  4. Angeli, Andrea; Desmet, Wim; Naets, Frank: Deep learning for model order reduction of multibody systems to minimal coordinates (2021)
  5. Ao, Wenqi; Li, Wenbin; Qian, Jianliang: A data and knowledge driven approach for SPECT using convolutional neural networks and iterative algorithms (2021)
  6. Bakhtin, Anton; Deng, Yuntian; Gross, Sam; Ott, Myle; Ranzato, Marc’aurelio; Szlam, Arthur: Residual energy-based models for text (2021)
  7. Barakat, Anas; Bianchi, Pascal: Convergence and dynamical behavior of the ADAM algorithm for nonconvex stochastic optimization (2021)
  8. Paaßen, Benjamin; McBroom, Jessica; Jeffries, Bryn; Koprinska, Irena; Yacef, Kalina: ast2vec: utilizing recursive neural encodings of Python programs (2021) arXiv
  9. Bertsimas, Dimitris; Dunn, Jack; Wang, Yuchen: Near-optimal nonlinear regression trees (2021)
  10. Bohra, Navdeep; Bhatnagar, Vishal: Group level social media popularity prediction by MRGB and Adam optimization (2021)
  11. Boumezoued, Alexandre; Elfassihi, Amal: Mortality data correction in the absence of monthly fertility records (2021)
  12. Cai, Shengze; Wang, Zhicheng; Fuest, Frederik; Jeon, Young Jin; Gray, Callum; Karniadakis, George Em: Flow over an espresso cup: inferring 3-D velocity and pressure fields from tomographic background oriented Schlieren via physics-informed neural networks (2021)
  13. Canchumuni, Smith W. A.; Castro, Jose D. B.; Potratz, Júlia; Emerick, Alexandre A.; Pacheco, Marco Aurélio C.: Recent developments combining ensemble smoother and deep generative networks for facies history matching (2021)
  14. Cao, Yi; Liu, Xiaoquan; Zhai, Jia: Option valuation under no-arbitrage constraints with neural networks (2021)
  15. Carbonneau, Alexandre: Deep hedging of long-term financial derivatives (2021)
  16. Chen, Li-Wei; Cakal, Berkay A.; Hu, Xiangyu; Thuerey, Nils: Numerical investigation of minimum drag profiles in laminar flow using deep learning surrogates (2021)
  17. Chi, Heng; Zhang, Yuyu; Tang, Tsz Ling Elaine; Mirabella, Lucia; Dalloro, Livio; Song, Le; Paulino, Glaucio H.: Universal machine learning for topology optimization (2021)
  18. Choi, Hee-Sun; An, Junmo; Han, Seongji; Kim, Jin-Gyun; Jung, Jae-Yoon; Choi, Juhwan; Orzechowski, Grzegorz; Mikkola, Aki; Choi, Jin Hwan: Data-driven simulation for general-purpose multibody dynamics using deep neural networks (2021)
  19. Salinas, David; Flunkert, Valentin; Gasthaus, Jan: DeepAR: probabilistic forecasting with autoregressive recurrent networks (2021) arXiv
  20. De Loera, Jesús A.; Haddock, Jamie; Ma, Anna; Needell, Deanna: Data-driven algorithm selection and tuning in optimization and signal processing (2021)
