n-Gram Probability Model
An n-Gram Probability Model is a probability function for n-grams (see the sketch after the list below).
- AKA: n-gram Probability Function.
- Context:
- It can be a Character n-Gram Model, or a Word n-Gram Model.
- See: n-Gram Language Model, n-Gram Dataset.
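A minimal formula sketch, assuming the standard chain-rule factorization under an [math]\displaystyle{ (n-1) }[/math]-order Markov assumption (consistent with the quote below, though not stated explicitly on this page): for a sequence [math]\displaystyle{ w_1, \ldots, w_m }[/math],
[math]\displaystyle{ P(w_1, \ldots, w_m) \approx \prod_{i=1}^{m} P(w_i \mid w_{i-(n-1)}, \ldots, w_{i-1}) }[/math]
so the model itself is the conditional distribution [math]\displaystyle{ P(w_i \mid w_{i-(n-1)}, \ldots, w_{i-1}) }[/math], typically estimated from counts in an n-Gram Dataset.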
References
2012
- (Wikipedia, 2011) ⇒ http://en.wikipedia.org/wiki/N-gram
- QUOTE: An n-gram model is a type of probabilistic language model for predicting the next item in such a sequence in the form of a [math]\displaystyle{ (n - 1) }[/math]–order Markov model. n-gram models are now widely used in probability, communication theory, computational linguistics (for instance, statistical natural language processing), computational biology (for instance, biological sequence analysis), and data compression. The two core advantages of n-gram models (and algorithms that use them) are relative simplicity and the ability to scale up – by simply increasing n a model can be used to store more context with a well-understood space–time tradeoff, enabling small experiments to scale up very efficiently.
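The following is a minimal Python sketch (not part of the quoted source) of one way such a model can be estimated by maximum likelihood from raw counts; the function name train_ngram_model and the toy corpus are illustrative assumptions. It illustrates the quote's scaling point: increasing n only makes the stored context tuples longer, trading memory for more context.
<pre>
from collections import defaultdict

def train_ngram_model(tokens, n=2):
    """Maximum-likelihood n-gram model: maps each (n-1)-token context
    to a dict of next-token probabilities P(next_item | context)."""
    # Count how often each next item follows each (n-1)-item context.
    context_counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(tokens) - n + 1):
        context = tuple(tokens[i:i + n - 1])  # the preceding n-1 items
        next_item = tokens[i + n - 1]         # the item being predicted
        context_counts[context][next_item] += 1
    # Normalize counts into conditional probabilities.
    model = {}
    for context, counts in context_counts.items():
        total = sum(counts.values())
        model[context] = {tok: c / total for tok, c in counts.items()}
    return model

# Toy usage (a word bigram model, n=2); larger n stores longer contexts.
tokens = "the cat sat on the mat the cat ran".split()
model = train_ngram_model(tokens, n=2)
print(model[("the",)])  # {'cat': 0.666..., 'mat': 0.333...}
</pre>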