Text-String Probability Function Training System
A Text-String Probability Function Training System is a Language Modeling System that implements a text-string probability function training algorithm to solve a text-string probability function training task.
- AKA: Text Probability Function Training System, String Probability Function Training System.
- Context:
- It can range from being a Character-Level Language Modeling System to being a Word-Level Language Modeling System, ...
- It can range from being a Maximum Likelihood-based Language Modeling System to being a Neural Network-based Language Modeling System (such as an LSTM-based LM System), ... (a minimal maximum-likelihood sketch follows this list).
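The following is a minimal, illustrative sketch (not from any of the toolkits listed below) of the maximum-likelihood case: a word-level bigram model whose counts are estimated from a toy corpus, with add-one smoothing, and which then assigns a log-probability to a text string. The corpus, token markers, and function names are assumptions made for the example.

```python
# Minimal sketch: word-level, maximum-likelihood bigram language model.
# Estimates P(w_1..w_n) = prod_i P(w_i | w_{i-1}) from raw counts,
# using add-one (Laplace) smoothing to avoid zero probabilities.
from collections import Counter, defaultdict
import math

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

BOS, EOS = "<s>", "</s>"  # sentence boundary markers

unigram_counts = Counter()
bigram_counts = defaultdict(Counter)

# Training step: collect unigram and bigram counts over the corpus.
for sentence in corpus:
    tokens = [BOS] + sentence.split() + [EOS]
    unigram_counts.update(tokens)
    for prev, cur in zip(tokens, tokens[1:]):
        bigram_counts[prev][cur] += 1

vocab_size = len(unigram_counts)

def bigram_prob(prev, cur):
    """Add-one-smoothed maximum-likelihood estimate of P(cur | prev)."""
    return (bigram_counts[prev][cur] + 1) / (unigram_counts[prev] + vocab_size)

def sentence_log_prob(sentence):
    """Log-probability the trained model assigns to a text string."""
    tokens = [BOS] + sentence.split() + [EOS]
    return sum(math.log(bigram_prob(p, c)) for p, c in zip(tokens, tokens[1:]))

print(sentence_log_prob("the cat sat on the log"))
```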
- Example(s):
- a Python-based Language Modeling System, ...
- an IRSTLM - Free software for language modeling
- a KenLM - Fast, free software for language modeling (see the usage sketch after this list)
- an MITLM - The MIT Language Modeling (MITLM) toolkit
- an OpenGrm NGram library - Free software for language modeling, built on OpenFst
- a Positional Language Model
- a RandLM - Free software for randomised language modeling
- a VariKN - Free software for creating, growing and pruning Kneser-Ney smoothed n-gram models.
- an SRILM - The SRI Language Modeling toolkit
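As a usage illustration for one of the toolkits above, the sketch below queries a KenLM model through its Python bindings; it assumes the `kenlm` module is installed and that an ARPA model file was trained beforehand with KenLM's `lmplz` tool (e.g. `lmplz -o 3 < corpus.txt > model.arpa`). The file names are placeholders.

```python
# Illustrative sketch: query a previously trained KenLM n-gram model.
# The training itself is done offline with KenLM's lmplz binary;
# here we only load the resulting text-string probability function.
import kenlm

model = kenlm.Model("model.arpa")  # load the trained n-gram model
# Total log10 probability of the string, with begin/end-of-sentence markers.
log10_prob = model.score("the cat sat on the mat", bos=True, eos=True)
print(log10_prob)
```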
- Counter-Example(s):
- See: Language Modeling Algorithm, Distributional Word Representation System.