darch

darch: Package for deep architectures and Restricted Boltzmann Machines. The darch package is built on the basis of the code from G. E. Hinton and R. R. Salakhutdinov (available under Matlab Code for deep belief nets; last visit: 01.08.2013). This package is for generating neural networks with many layers (deep architectures) and training them with the method introduced by the publications "A fast learning algorithm for deep belief nets" (G. E. Hinton, S. Osindero, Y. W. Teh) and "Reducing the dimensionality of data with neural networks" (G. E. Hinton, R. R. Salakhutdinov). This method includes a pre-training with the contrastive divergence method published by G. E. Hinton (2002) and a fine-tuning with commonly known training algorithms such as backpropagation or conjugate gradient.
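The pre-training step described above trains each Restricted Boltzmann Machine layer with contrastive divergence. As a rough illustration of that idea (not the darch package's R implementation — function names, the CD-1 variant, and the learning rate here are illustrative assumptions), a single CD-1 update for a binary RBM can be sketched in NumPy:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_vis, b_hid, v0, lr=0.1, rng=None):
    """One contrastive-divergence (CD-1) update for a binary RBM.

    Illustrative sketch only, not the darch API.
    W: weights, shape (n_vis, n_hid); v0: data batch, shape (n, n_vis).
    """
    rng = rng or np.random.default_rng(0)
    # Positive phase: hidden probabilities given the data
    h0_prob = sigmoid(v0 @ W + b_hid)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # Negative phase: one Gibbs step back to visible, then hidden again
    v1_prob = sigmoid(h0 @ W.T + b_vis)
    h1_prob = sigmoid(v1_prob @ W + b_hid)
    # Approximate gradient: data statistics minus reconstruction statistics
    n = v0.shape[0]
    W = W + lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / n
    b_vis = b_vis + lr * (v0 - v1_prob).mean(axis=0)
    b_hid = b_hid + lr * (h0_prob - h1_prob).mean(axis=0)
    return W, b_vis, b_hid
```

In a deep belief net, each layer's RBM is pre-trained this way on the activations of the layer below, after which the whole stack is fine-tuned with backpropagation or conjugate gradient as the description notes.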


References in zbMATH (referenced in 223 articles)

Showing results 1 to 20 of 223, sorted by year (citations).


  1. Cui, Ying; He, Ziyu; Pang, Jong-Shi: Multicomposite nonconvex optimization for training deep neural networks (2020)
  2. Desana, Mattia; Schnörr, Christoph: Sum-product graphical models (2020)
  3. Gong, Maoguo; Pan, Ke; Xie, Yu; Qin, A. K.; Tang, Zedong: Preserving differential privacy in deep neural networks with relevance-based adaptive noise imposition (2020)
  4. Har-Peled, Sariel; Jones, Mitchell: On separating points by lines (2020)
  5. Li, Xiang; Ning, Shaowu; Liu, Zhanli; Yan, Ziming; Luo, Chengcheng; Zhuang, Zhuo: Designing phononic crystal with anticipated band gap through a deep learning based data-driven method (2020)
  6. Oishi, Atsuya; Yagawa, Genki: A surface-to-surface contact search method enhanced by deep learning (2020)
  7. Puligilla, Shivakanth Chary; Jayaraman, Balaji: Assessment of end-to-end and sequential data-driven learning for non-intrusive modeling of fluid flows (2020)
  8. Ruehle, Fabian: Data science applications to string theory (2020)
  9. Stanko, Ivana: The architectures of Geoffrey Hinton (2020)
  10. Tsionas, Mike G.; Andrikopoulos, Athanasios: On a high-dimensional model representation method based on copulas (2020)
  11. van Engelen, Jesper E.; Hoos, Holger H.: A survey on semi-supervised learning (2020)
  12. Zheng, Kunming; Hu, Youmin; Wu, Bo: Intelligent fuzzy sliding mode control for complex robot system with disturbances (2020)
  13. Zhou, Ding-Xuan: Theory of deep convolutional neural networks: downsampling (2020)
  14. Zhou, Ding-Xuan: Universality of deep convolutional neural networks (2020)
  15. Zhou, Yicheng; Lu, Zhenzhou; Hu, Jinghan; Hu, Yingshi: Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square (2020)
  16. Bühlmann, Peter: Comments on “Data science, big data and statistics” (2019)
  17. Choi, Arthur; Wang, Ruocheng; Darwiche, Adnan: On the relative expressiveness of Bayesian and neural networks (2019)
  18. Chui, Charles K.; Lin, Shao-Bo; Zhou, Ding-Xuan: Deep neural networks for rotation-invariance approximation and learning (2019)
  19. Comsa, Iulia M.; Firsching, Moritz; Fischbacher, Thomas: SO(8) supergravity and the magic of machine learning (2019)
  20. Czaja, Wojciech; Li, Weilin: Analysis of time-frequency scattering transforms (2019)
