• DeepMath

  • Referenced in 7 articles [sw27551]
  • Selection. We study the effectiveness of neural sequence models for premise selection in automated theorem...
  • OpenNMT

  • Referenced in 5 articles [sw26505]
  • initiative for neural machine translation and neural sequence modeling. Since its launch in December...
  • DENFIS

  • Referenced in 56 articles [sw24183]
  • fuzzy inference systems, denoted as dynamic evolving neural-fuzzy inference system (DENFIS), for adaptive online ... fuzzy rule set for a DENFIS online model; and (2) creation of a first-order ... high-order one, for a DENFIS offline model. A set of fuzzy rules ... models, is also introduced. It is demonstrated that DENFIS can effectively learn complex temporal sequences...
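The first-order rule form this entry mentions can be sketched as plain Takagi-Sugeno inference: each fuzzy rule pairs a membership function with a linear consequent, and the output is the firing-strength-weighted mean. The membership centers, widths, and consequent coefficients below are illustrative, not taken from DENFIS itself:

```python
import math

def gaussian(x, center, width):
    # Gaussian membership: how strongly input x belongs to this rule's region
    return math.exp(-((x - center) ** 2) / (2 * width ** 2))

def ts_infer(x, rules):
    """First-order Takagi-Sugeno inference.

    rules: list of (center, width, a, b), read as
    IF x is near `center` THEN y = a*x + b.
    Output is the firing-strength-weighted average of the rule outputs.
    """
    num = den = 0.0
    for center, width, a, b in rules:
        w = gaussian(x, center, width)   # rule firing strength
        num += w * (a * x + b)           # weighted linear consequent
        den += w
    return num / den

rules = [(0.0, 1.0, 1.0, 0.0),    # near 0: y = x
         (5.0, 1.0, -1.0, 10.0)]  # near 5: y = 10 - x
y = ts_infer(2.5, rules)  # midway, both rules fire equally: (2.5 + 7.5) / 2
```

DENFIS's contribution is evolving this rule set online as data arrives; the fixed rule list here only shows the inference step itself.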
  • ART 3

  • Referenced in 27 articles [sw08755]
  • model to implement parallel search of compressed or distributed pattern recognition codes in a neural ... slow learning, and can robustly cope with sequences of asynchronous input patterns in real-time ... interactions that enable presynaptic transmitter dynamics to model the postsynaptic short-term memory representation...
  • NCRF++

  • Referenced in 1 article [sw26488]
  • quick implementation of different neural sequence labeling models with a CRF inference layer. It provides ... building the custom model structure through configuration file with flexible neural feature design and utilization ... most state-of-the-art neural sequence labeling models such as LSTM-CRF, facilitating reproducing...
  • LSTM

  • Referenced in 24 articles [sw03373]
  • human brain is a recurrent neural network (RNN): a network of neurons with feedback connections ... learn many behaviors / sequence processing tasks / algorithms / programs that are not learnable by traditional machine ... learn algorithms to map input sequences to output sequences, with or without a teacher. They ... other adaptive approaches such as Hidden Markov Models (no continuous internal states), feedforward networks...
  • DenseCap

  • Referenced in 2 articles [sw27203]
  • layer, and Recurrent Neural Network language model that generates the label sequences. We evaluate...
  • DeepAPI

  • Referenced in 1 article [sw15213]
  • approaches utilize information retrieval models to search for matching API sequences. These approaches treat queries ... learning based approach to generate API usage sequences for a given natural language query. Instead ... sequence of associated APIs. DeepAPI adapts a neural language model named RNN Encoder-Decoder...
  • Holophrasm

  • Referenced in 1 article [sw30124]
  • using a neural-network-augmented bandit algorithm and a sequence-to-sequence model for action...
  • rna-state-inf

  • Referenced in 1 article [sw33162]
  • recurrent neural networks. The problem of determining which nucleotides of an RNA sequence are paired ... learning techniques. Successful state inference of RNA sequences can be used to generate auxiliary information ... state inference, such as hidden Markov models, exhibit poor performance in RNA state inference, owing ... neural networks have emerged as a powerful tool that can model global nonlinear sequence dependencies...
  • LSTMVis

  • Referenced in 2 articles [sw27157]
  • State Dynamics in Recurrent Neural Networks. Recurrent neural networks, and in particular long short-term ... remarkably effective tool for sequence modeling that learn a dense black-box hidden representation ... input. Researchers interested in better understanding these models have studied the changes in hidden state ... LSTMVIS, a visual analysis tool for recurrent neural networks with a focus on understanding these...
  • Neural Monkey

  • Referenced in 2 articles [sw26583]
  • Neural Monkey: Neural Sequence Learning Using TensorFlow. The Neural Monkey package provides ... higher level abstraction for sequential neural network models, most prominently in Natural Language Processing...
  • GRAIL

  • Referenced in 1 article [sw17329]
  • localization and modeling system, called GRAIL. GRAIL is a multiple sensor-neural network-based system ... localizes genes in anonymous DNA sequence by recognizing features related to protein-coding regions ... neural network system. Localized coding regions are then ”optimally” parsed into a gene model. Through ... localization of genes on their newly sequenced...
  • Sockeye

  • Referenced in 2 articles [sw26584]
  • open-source sequence-to-sequence toolkit for Neural Machine Translation (NMT). Sockeye is a production ... ready framework for training and applying models as well as an experimental platform for researchers...
  • iTIS-PseKNC

  • Referenced in 3 articles [sw25196]
  • improvements have been achieved in developing computational models; however, development of accurate and reliable automated ... protocol for identification of TIS. Three protein sequence representation methods including dinucleotide composition, pseudo-dinucleotide ... Neural Network are assessed for their performance using the constructed descriptors. The proposed model iTIS...
  • OpenSeq2Seq

  • Referenced in 1 article [sw26483]
  • most effectively explore different sequence-to-sequence architectures. The efficiency is achieved by fully supporting ... building blocks for training encoder-decoder models for neural machine translation and automatic speech recognition...
  • ConvS2S

  • Referenced in 1 article [sw26536]
  • sequence via recurrent neural networks. We introduce an architecture based entirely on convolutional neural networks ... Compared to recurrent models, computations over all elements can be fully parallelized during training...
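The parallelism the ConvS2S entry claims follows from the core operation: every output position is produced by the same fixed-width causal filter, so positions do not depend on each other the way RNN steps do. A minimal sketch (filter weights are illustrative):

```python
def causal_conv1d(xs, kernel):
    """Causal 1-D convolution with left zero-padding.

    Each output element depends only on the current and past inputs, and
    the loop iterations are independent of one another -- so, unlike an
    RNN where step t waits on step t-1, all positions could be computed
    in parallel during training.
    """
    k = len(kernel)
    padded = [0.0] * (k - 1) + list(xs)
    return [sum(kernel[j] * padded[t + j] for j in range(k))
            for t in range(len(xs))]

# A width-2 kernel averaging the previous and current input at each position.
ys = causal_conv1d([1.0, 2.0, 3.0, 4.0], kernel=[0.5, 0.5])
```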
  • StrBioLib

  • Referenced in 2 articles [sw16906]
  • alignments between biopolymers based on either sequence or structure. Interfaces are provided to interact with ... commonly used bioinformatics applications, including (psi)-blast, modeller, muscle and Primer3, and tools are provided ... data. The library includes a general-purpose neural network object with multiple training algorithms ... used to build the astral compendium for sequence and structure analysis, and has been extensively...
  • char-rnn

  • Referenced in 1 article [sw27212]
  • model takes one text file as input and trains a Recurrent Neural Network that learns ... predict the next character in a sequence. The RNN can then be used to generate ... vanilla RNN, has more supporting code for model checkpointing, and is of course much more...
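The generation loop such character-level models use can be sketched independently of the network: feed one character, get a distribution over the next, pick one, repeat. The toy bigram table below stands in for the trained RNN; all names are illustrative, not char-rnn's actual API:

```python
def generate(model, seed, length):
    """Greedily extend `seed` by `length` characters.

    `model` maps a character to a dict of next-character probabilities.
    char-rnn samples from this distribution; here we pick the argmax
    to keep the output deterministic.
    """
    out = seed
    for _ in range(length):
        dist = model(out[-1])              # P(next char | current char)
        out += max(dist, key=dist.get)     # greedy pick
    return out

# Toy bigram "model" over the alphabet {a, b}, standing in for the RNN.
table = {"a": {"a": 0.1, "b": 0.9},
         "b": {"a": 0.8, "b": 0.2}}
text = generate(table.__getitem__, "a", 4)  # alternates: "ababa"
```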
  • ByteNet

  • Referenced in 2 articles [sw26537]
  • neural network that is composed of two parts, one to encode the source sequence ... other to decode the target sequence. The two network parts are connected by stacking ... preserving the temporal resolution of the sequences. To address the differing lengths of the source ... linear in the length of the sequences and it sidesteps the need for excessive memorization...