Recurrent Highway Neural Network
A Recurrent Highway Neural Network (RHN) is a Recurrent Neural Network that extends the LSTM architecture to allow step-to-step transition depths larger than one.
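At each time step an RHN passes the recurrent state through a stack of $L$ highway layers. The following is a paraphrase of the recurrence in Zilly et al. (2017), with the notation lightly adapted; here $s_0^{[t]} = s_L^{[t-1]}$ and the output of the last layer, $s_L^{[t]}$, becomes the new state:

$$s_\ell^{[t]} \;=\; h_\ell^{[t]} \odot t_\ell^{[t]} \;+\; s_{\ell-1}^{[t]} \odot c_\ell^{[t]}, \qquad \ell = 1, \dots, L,$$
$$h_\ell^{[t]} = \tanh\!\big(W_H x^{[t]}\,\mathbb{I}_{\{\ell=1\}} + R_{H_\ell} s_{\ell-1}^{[t]} + b_{H_\ell}\big),$$
$$t_\ell^{[t]} = \sigma\!\big(W_T x^{[t]}\,\mathbb{I}_{\{\ell=1\}} + R_{T_\ell} s_{\ell-1}^{[t]} + b_{T_\ell}\big),$$
$$c_\ell^{[t]} = \sigma\!\big(W_C x^{[t]}\,\mathbb{I}_{\{\ell=1\}} + R_{C_\ell} s_{\ell-1}^{[t]} + b_{C_\ell}\big),$$

where $h$ is the candidate update, $t$ the transform gate, $c$ the carry gate, and the indicator $\mathbb{I}_{\{\ell=1\}}$ means the input $x^{[t]}$ feeds only the first layer of the stack; the recurrence depth $L$ is the step-to-step transition depth.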
- AKA: RHN, Recurrent Highway Network.
- Context:
- It is composed of Highway Layers stacked within each recurrent state transition (see the code sketch after the See: list below).
- It can be considered a gated variant of a Residual Neural Network (ResNet) in which an additional weight matrix is used to learn the weight of the skip connection, rather than fixing it to the identity.
- Example(s):
- the deep-transition RHN language models evaluated in Zilly et al. (2017).
- Counter-Example(s):
- a Bidirectional Associative Memory (BAM) Network;
- a DAG Recurrent Neural Network;
- an Echo State Network;
- an Elman Network;
- a Gated Recurrent Unit (GRU) Network;
- a Hopfield Recurrent Neural Network;
- a Jordan Network;
- a Long Short-Term Memory (LSTM) Network;
- a Recurrent Multilayer Perceptron Network;
- an Encoder-Decoder RNN;
- a seq2seq Network;
- a Transformer-based Network.
- See: Recursive Neural Network, Handwriting Recognition, Speech Recognition, Directed Cycle.
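The following is a minimal sketch of the deep-transition recurrence above, written in plain NumPy. The class and parameter names (RHNCell, depth, W_H, R_H, and so on) are illustrative assumptions, not the authors' reference implementation.

# Minimal RHN cell sketch (assumed names and shapes; not the reference code).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RHNCell:
    def __init__(self, input_size, hidden_size, depth, seed=0):
        rng = np.random.default_rng(seed)
        self.depth = depth  # recurrence depth L: the step-to-step transition depth
        init = lambda *shape: rng.normal(0.0, 0.1, shape)
        # Input projections; the input feeds only the first highway layer.
        self.W_H, self.W_T, self.W_C = (init(hidden_size, input_size) for _ in range(3))
        # Per-layer recurrent weights and biases for the candidate (H),
        # transform gate (T), and carry gate (C) nonlinearities.
        self.R_H = [init(hidden_size, hidden_size) for _ in range(depth)]
        self.R_T = [init(hidden_size, hidden_size) for _ in range(depth)]
        self.R_C = [init(hidden_size, hidden_size) for _ in range(depth)]
        self.b_H = [np.zeros(hidden_size) for _ in range(depth)]
        self.b_T = [np.zeros(hidden_size) for _ in range(depth)]
        self.b_C = [np.zeros(hidden_size) for _ in range(depth)]

    def step(self, x_t, s_prev):
        """One time step: pass the state through `depth` stacked highway layers."""
        s = s_prev
        for l in range(self.depth):
            x_in = x_t if l == 0 else np.zeros_like(x_t)  # input only at layer 1
            h = np.tanh(self.W_H @ x_in + self.R_H[l] @ s + self.b_H[l])   # candidate
            t = sigmoid(self.W_T @ x_in + self.R_T[l] @ s + self.b_T[l])   # transform gate
            c = sigmoid(self.W_C @ x_in + self.R_C[l] @ s + self.b_C[l])   # carry gate
            s = h * t + s * c  # gated mix of the new candidate and the carried state
        return s

# Usage: unroll the cell over a toy input sequence.
cell = RHNCell(input_size=8, hidden_size=16, depth=4)
s = np.zeros(16)
for x_t in np.random.default_rng(1).normal(size=(5, 8)):
    s = cell.step(x_t, s)
print(s.shape)  # -> (16,)

A trained RHN would also need output weights and a loss; this sketch only shows the gated deep transition within each time step, which is what distinguishes the architecture from a standard single-layer LSTM step.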
References
2017
- (Zilly et al., 2017) ⇒ Julian Georg Zilly, Rupesh Kumar Srivastava, Jan Koutník, and Jürgen Schmidhuber. (2017). “Recurrent Highway Networks.” In: Proceedings of the 34th International Conference on Machine Learning - Volume 70. arXiv:1607.03474
2015
- (Srivastava et al., 2015) ⇒ Rupesh Kumar Srivastava, Klaus Greff, and Jürgen Schmidhuber. (2015). “Training Very Deep Networks.” In: Advances in Neural Information Processing Systems 28: Annual Conference on Neural Information Processing Systems 2015.