darch: Package for deep architectures and Restricted Boltzmann Machines. The darch package is built on the basis of the code from G. E. Hinton and R. R. Salakhutdinov (available under Matlab Code for deep belief nets; last visit: 01.08.2013). This package is for generating neural networks with many layers (deep architectures) and training them with the method introduced by the publications "A fast learning algorithm for deep belief nets" (G. E. Hinton, S. Osindero, Y. W. Teh) and "Reducing the dimensionality of data with neural networks" (G. E. Hinton, R. R. Salakhutdinov). This method includes pre-training with the contrastive divergence method published by G. E. Hinton (2002) and fine-tuning with commonly known training algorithms such as backpropagation or conjugate gradient.
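The pre-training step mentioned above (contrastive divergence, Hinton 2002) can be sketched for a single binary restricted Boltzmann machine. The following is an illustrative NumPy sketch of one CD-1 update, not darch's actual R interface; the function name `cd1_update` and all parameter names are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1):
    """One CD-1 step for a binary RBM (illustrative sketch).

    v0 : (n_vis,) binary visible vector (one training example)
    W  : (n_vis, n_hid) weight matrix
    b  : (n_vis,) visible biases; c : (n_hid,) hidden biases
    """
    # Positive phase: hidden probabilities and a binary hidden sample
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to visible, then hidden probs
    pv1 = sigmoid(h0 @ W.T + b)
    ph1 = sigmoid(pv1 @ W + c)
    # CD-1 gradient approximation: data statistics minus model statistics
    W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
    b += lr * (v0 - pv1)
    c += lr * (ph0 - ph1)
    return W, b, c
```

In the deep-belief-net scheme described in the cited papers, each layer's RBM is pre-trained this way on the activations of the layer below, after which the whole stacked network is fine-tuned with backpropagation or conjugate gradient.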

References in zbMATH (referenced in 76 articles)

Showing results 1 to 20 of 76, sorted by year (citations).


  1. Cárdenas-Peña, David; Collazos-Huertas, Diego; Castellanos-Dominguez, German: Centered kernel alignment enhancing neural network pretraining for MRI-based dementia diagnosis (2016)
  2. Chen, Yutian; Bornn, Luke; de Freitas, Nando; Eskelin, Mareija; Fang, Jing; Welling, Max: Herded Gibbs sampling (2016)
  3. Mocanu, Decebal Constantin; Mocanu, Elena; Nguyen, Phuong H.; Gibescu, Madeleine; Liotta, Antonio: A topological insight into restricted Boltzmann machines (2016)
  4. Read, Jesse; Reutemann, Peter; Pfahringer, Bernhard; Holmes, Geoff: MEKA: a multi-label/multi-target extension to WEKA (2016)
  5. Sokolovska, Nataliya; Clément, Karine; Zucker, Jean-Daniel: Deep kernel dimensionality reduction for scalable data integration (2016)
  6. Yuille, Alan; Mottaghi, Roozbeh: Complexity of representation and inference in compositional models with part sharing (2016)
  7. Zhang, Shiliang; Jiang, Hui; Dai, Lirong: Hybrid orthogonal projection and estimation (HOPE): a new framework to learn neural networks (2016)
  8. Arora, Sanjeev; Ge, Rong; Moitra, Ankur; Sachdeva, Sushant: Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders (2015)
  9. Bengio, Yoshua (ed.): Editorial introduction to the neural networks special issue on deep learning of representations (2015)
  10. Berglund, Mathias; Raiko, Tapani; Cho, Kyunghyun: Measuring the usefulness of hidden units in Boltzmann machines with mutual information (2015)
  11. Cang, Zixuan; Mu, Lin; Wu, Kedi; Opron, Kristopher; Xia, Kelin; Wei, Guo-Wei: A topological approach for protein classification (2015)
  12. Elfwing, S.; Uchibe, E.; Doya, K.: Expected energy-based restricted Boltzmann machine for classification (2015)
  13. Fischer, Asja; Igel, Christian: A bound for the convergence rate of parallel tempering for sampling restricted Boltzmann machines (2015)
  14. Iocchi, Luca; Holz, Dirk; Ruiz-del-Solar, Javier; Sugiura, Komei; van der Zant, Tijn: RoboCup@Home: analysis and results of evolving competitions for domestic and service robots (2015)
  15. Kim, Sangwook; Yu, Zhibin; Kil, Rhee Man; Lee, Minho: Deep learning of support vector machines with class probability output networks (2015)
  16. Montúfar, Guido; Ay, Nihat; Ghazi-Zahedi, Keyan: Geometry and expressive power of conditional restricted Boltzmann machines (2015)
  17. Montúfar, Guido; Morton, Jason: Discrete restricted Boltzmann machines (2015)
  18. Schmidhuber, Jürgen: Deep learning in neural networks: an overview (2015)
  19. Schuld, Maria; Sinayskiy, Ilya; Petruccione, Francesco: Simulating a perceptron on a quantum computer (2015)
  20. Schulz, Hannes; Cho, Kyunghyun; Raiko, Tapani; Behnke, Sven: Two-layer contractive encodings for learning stable nonlinear features (2015)
