Long Short-Term Memory (LSTM) RNN Training Algorithm
A Long Short-Term Memory (LSTM) RNN Training Algorithm is a recurrent NNet training algorithm that estimates the parameters of an LSTM model.
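For reference, the per-time-step computation of the LSTM cell that such an algorithm fits, in the forget-gate formulation of (Gers et al., 2000) cited below, is:

```latex
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) &\quad&\text{(forget gate)}\\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) &&\text{(input gate)}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) &&\text{(output gate)}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) &&\text{(candidate cell state)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t &&\text{(cell state update)}\\
h_t &= o_t \odot \tanh(c_t) &&\text{(hidden state)}
\end{aligned}
```

Training then amounts to estimating the weights W, U, and b by gradient descent, with gradients computed by backpropagation through time.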
- Context:
- It can be implemented by an LSTM training system to train an LSTM neural network (a minimal training-loop sketch follows this list).
- Example(s):
- (Sutskever et al., 2014)'s sequence-to-sequence training of a multilayered LSTM (see the quote in the references below).
- …
- Counter-Example(s):
- See: Bidirectional LSTM.
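As noted in the Context above, such an algorithm is typically realized by an LSTM training system. The following is a minimal sketch of a gradient-based LSTM training loop; PyTorch, the toy next-step prediction task, and all hyperparameters are illustrative assumptions, not part of the concept's definition:

```python
# Minimal sketch of an LSTM training loop (assumes PyTorch; the task,
# dimensions, and hyperparameters are illustrative, not canonical).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: predict the next value of a noisy sine wave.
t = torch.linspace(0, 20, 400)
series = torch.sin(t) + 0.05 * torch.randn_like(t)
x = series[:-1].view(1, -1, 1)   # (batch, seq_len, input_size)
y = series[1:].view(1, -1, 1)    # next-step targets

class LSTMRegressor(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        out, _ = self.lstm(x)        # out: (batch, seq_len, hidden_size)
        return self.head(out)        # one prediction per time step

model = LSTMRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# Gradient-based training: backpropagation through time is handled by autograd.
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # Gradient clipping is a common safeguard against exploding gradients
    # (cf. Hochreiter et al., 2001, in the references below).
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
```

Modern autograd frameworks implement backpropagation through time automatically; gradient clipping is a common safeguard against the exploding-gradient problem analyzed in (Hochreiter et al., 2001).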
References
2014a
- (Sutskever et al., 2014) ⇒ Ilya Sutskever, Oriol Vinyals, and Quoc V. Le. (2014). “Sequence to Sequence Learning with Neural Networks.” In: Advances in Neural Information Processing Systems.
- QUOTE: Deep Neural Networks (DNNs) are powerful models that have achieved excellent performance on difficult learning tasks. Although DNNs work well whenever large labeled training sets are available, they cannot be used to map sequences to sequences. In this paper, we present a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure. Our method uses a multilayered Long Short-Term Memory (LSTM) to map the input sequence to a vector of a fixed dimensionality, and then another deep LSTM to decode the target sequence from the vector.
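For illustration only, a minimal sketch of the encoder-decoder pattern this quote describes, assuming PyTorch and teacher forcing (the vocabulary sizes, dimensions, and class name are hypothetical; this is a sketch of the idea, not the paper's implementation):

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Encoder LSTM compresses the source sequence into its final
    (hidden, cell) state; a decoder LSTM conditioned on that state
    generates the target sequence (illustrative sketch)."""
    def __init__(self, src_vocab, tgt_vocab, emb=64, hidden=128, layers=2):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, layers, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, layers, batch_first=True)
        self.proj = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # The encoder's final state is the fixed-dimensional vector
        # summarizing the input sequence.
        _, state = self.encoder(self.src_emb(src_ids))
        # Teacher forcing: the decoder reads the gold target tokens as input.
        out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.proj(out)   # per-step logits over the target vocabulary

# Illustrative usage with random token ids (vocab sizes are assumptions).
model = Seq2Seq(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (8, 15))   # (batch, src_len)
tgt = torch.randint(0, 1000, (8, 12))   # (batch, tgt_len)
logits = model(src, tgt)                # (8, 12, 1000)
# In actual training, the cross-entropy targets would be the decoder
# inputs shifted by one time step.
loss = nn.functional.cross_entropy(logits.reshape(-1, 1000), tgt.reshape(-1))
```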
2014b
- (Sak et al., 2014) ⇒ Haşim Sak, Andrew Senior, and Françoise Beaufays. (2014). “Long Short-term Memory Recurrent Neural Network Architectures for Large Scale Acoustic Modeling.” In: Fifteenth Annual Conference of the International Speech Communication Association (INTERSPEECH 2014).
2001
- (Hochreiter et al., 2001) ⇒ Sepp Hochreiter, Yoshua Bengio, Paolo Frasconi, and Jürgen Schmidhuber. (2001). “Gradient Flow in Recurrent Nets: The Difficulty of Learning Long-term Dependencies.” In: A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press.
2000
- (Gers et al., 2000) ⇒ Felix A. Gers, Jürgen Schmidhuber, and Fred Cummins. (2000). “Learning to Forget: Continual Prediction with LSTM.” In: Neural Computation, 12(10).
1997
- (Hochreiter & Schmidhuber, 1997) ⇒ Sepp Hochreiter, and Jürgen Schmidhuber. (1997). “Long Short-term Memory.” In: Neural Computation, 9(8).