Neural-based Character-Level Language Model (LM) Training Algorithm
A Neural-based Character-Level Language Model (LM) Training Algorithm is a character-level language modeling algorithm that is a Neural-based LM Algorithm.
- Context:
- It can be implemented by a Neural-based Character-Level Language Modeling System.
- Example(s):
- an LSTM-based Character-Level LM Algorithm, such as in (Karpathy, 2015); a minimal training sketch follows this list.
- …
- Counter-Example(s):
- a Neural-based Word-Level LM Training Algorithm, such as the PyTorch word-level language modeling example.
- See: Language Modeling.
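The following is a minimal, illustrative sketch (not the referenced implementations) of how such a training algorithm can look in PyTorch: a multi-layer LSTM is trained to predict the next character of a text corpus. The stand-in corpus, model sizes, and hyperparameters are assumptions chosen only for illustration.

```python
# Minimal sketch of training a character-level LSTM language model in PyTorch.
# The corpus string and all hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

text = "hello world. " * 200                      # stand-in corpus
chars = sorted(set(text))                         # character vocabulary
stoi = {c: i for i, c in enumerate(chars)}        # char -> integer id
data = torch.tensor([stoi[c] for c in text])

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=128, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state              # logits over the next character

model = CharLSTM(len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=3e-3)
loss_fn = nn.CrossEntropyLoss()
seq_len, batch_size = 64, 16

for step in range(200):
    # sample random subsequences; the target is the input shifted by one character
    starts = torch.randint(0, len(data) - seq_len - 1, (batch_size,)).tolist()
    x = torch.stack([data[s : s + seq_len] for s in starts])
    y = torch.stack([data[s + 1 : s + seq_len + 1] for s in starts])
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In practice, a system of this kind would replace the stand-in corpus with a large text file and add regularization and truncated backpropagation through time, as in the referenced AWD-LSTM code.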
References
2018
- https://github.com/PetrochukM/PyTorch-NLP/tree/master/examples/awd-lstm-lm
- QUOTE: This repository contains the code used for two Salesforce Research papers:
- Regularizing and Optimizing LSTM Language Models.
- An Analysis of Neural Language Modeling at Multiple Scales.
This code was originally forked from the PyTorch word level language modeling example.
2015
- (Karpathy, 2015) ⇒ Andrej Karpathy. (2015). “The Unreasonable Effectiveness of Recurrent Neural Networks.” Blog post, 2015-05-21.
- QUOTE: ... By the way, together with this post I am also releasing code on Github that allows you to train character-level language models based on multi-layer LSTMs. You give it a large chunk of text and it will learn to generate text like it one character at a time. ...
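The quoted post describes generating text one character at a time from a trained multi-layer LSTM. Below is a hedged sketch of such sampling (not Karpathy's released Torch code), assuming the CharLSTM model and vocabulary from the training sketch earlier on this page; the prompt and temperature are illustrative assumptions.

```python
# Sketch of character-by-character sampling from a trained CharLSTM model.
# `model`, `stoi`, and `chars` are assumed to come from the training sketch above.
import torch

def sample(model, stoi, chars, prompt="h", length=100, temperature=1.0):
    model.eval()
    ids = [stoi[c] for c in prompt]
    with torch.no_grad():
        # warm up the hidden state on the prompt characters
        logits, state = model(torch.tensor([ids]))
        for _ in range(length):
            probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
            next_id = torch.multinomial(probs, 1).item()
            ids.append(next_id)
            logits, state = model(torch.tensor([[next_id]]), state)
    return "".join(chars[i] for i in ids)
```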