LSTM

The human brain is a recurrent neural network (RNN): a network of neurons with feedback connections. It can learn many behaviors, sequence-processing tasks, algorithms, and programs that are not learnable by traditional machine learning methods. This explains the rapidly growing interest in artificial RNNs for technical applications: general computers that can learn algorithms to map input sequences to output sequences, with or without a teacher. They are computationally more powerful and biologically more plausible than other adaptive approaches such as Hidden Markov Models (no continuous internal states), feedforward networks, and Support Vector Machines (no internal states at all). Our recent applications include adaptive robotics and control, handwriting recognition, speech recognition, keyword spotting, music composition, attentive vision, protein analysis, stock market prediction, and many other sequence problems.
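
As a minimal sketch of this sequence-to-sequence mapping with a continuous internal state, the following assumes PyTorch is available; the class name Seq2SeqLSTM and all layer sizes are illustrative choices, not part of any reference implementation.

```python
# Minimal sketch (assumes PyTorch): an LSTM mapping an input sequence
# to an output sequence, trainable with a teacher (supervised learning).
import torch
import torch.nn as nn

class Seq2SeqLSTM(nn.Module):
    def __init__(self, input_size=8, hidden_size=32, output_size=4):
        super().__init__()
        # The LSTM carries a continuous internal state (h, c) across time
        # steps -- the property the text contrasts with HMMs, feedforward
        # networks, and SVMs.
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, output_size)

    def forward(self, x):           # x: (batch, time, input_size)
        h, _ = self.lstm(x)         # h: (batch, time, hidden_size)
        return self.readout(h)      # one output per time step

model = Seq2SeqLSTM()
x = torch.randn(2, 10, 8)           # two illustrative sequences of length 10
y = model(x)                        # y has shape (2, 10, 4)
```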


References in zbMATH (referenced in 24 articles, 1 standard article)

Showing results 1 to 20 of 24, sorted by year (citations).

  1. Heider, Yousef; Wang, Kun; Sun, WaiChing: SO(3)-invariance of informed-graph-based deep neural network for anisotropic elastoplastic materials (2020)
  2. Willmott, Devin; Murrugarra, David; Ye, Qiang: Improving RNA secondary structure prediction via state inference with deep recurrent neural networks (2020)
  3. Fernández-González, Daniel; Gómez-Rodríguez, Carlos: Faster shift-reduce constituent parsing with a non-binary, bottom-up strategy (2019)
  4. Yu, Yong; Si, Xiaosheng; Hu, Changhua; Zhang, Jianxun: A review of recurrent neural networks: LSTM cells and network architectures (2019)
  5. Aggarwal, Charu C.: Neural networks and deep learning. A textbook (2018)
  6. Cinar, Goktug T.; Sequeira, Pedro M. N.; Principe, Jose C.: Hierarchical linear dynamical systems for unsupervised musical note recognition (2018)
  7. Fischer, Thomas; Krauss, Christopher: Deep learning with long short-term memory networks for financial market predictions (2018)
  8. Zhu, Henghui; Paschalidis, Ioannis Ch.; Hasselmo, Michael E.: Neural circuits for learning context-dependent associations of stimuli (2018)
  9. Er, Meng Joo; Zhang, Yong; Wang, Ning; Pratama, Mahardhika: Attention pooling-based convolutional neural network for sentence modelling (2016)
  10. Schmidhuber, Jürgen: Deep learning in neural networks: an overview (2015)
  11. Graves, Alex: Supervised sequence labelling with recurrent neural networks (2012)
  12. Namikawa, Jun; Tani, Jun: Learning to imitate stochastic time series in a compositional way by chaos (2010)
  13. Liwicki, Marcus; Bunke, Horst: Combining diverse on-line and off-line systems for handwritten text line recognition (2009)
  14. Namikawa, Jun; Tani, Jun: A model for learning to segment temporal sequences, utilizing a mixture of RNN experts together with adaptive variance (2008)
  15. Kara, Sadik; Okandan, Mustafa: Atrial fibrillation classification with artificial neural networks (2007)
  16. Skowronski, Mark D.; Harris, John G.: Automatic speech recognition using a predictive echo state network classifier (2007)
  17. Grüning, André: Stack-like and queue-like dynamics in recurrent neural networks (2006)
  18. Graves, Alex; Schmidhuber, Jürgen: Framewise phoneme classification with bidirectional LSTM and other neural network architectures (2005)
  19. Jacobsson, Henrik: Rule extraction from recurrent neural networks: A taxonomy and review (2005)
  20. Gabrijel, Ivan; Dobnikar, Andrej: On-line identification and reconstruction of finite automata with generalized recurrent neural networks (2003)
