Viterbi-based Decoding System
A Viterbi-based Decoding System is an exact decoding system that applies a Viterbi algorithm.
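As a sketch of the underlying algorithm, the following illustrative Python function performs exact Viterbi decoding over a toy discrete HMM (the weather/activity tables below are hypothetical, not taken from any of the linked libraries):

```python
# A minimal sketch of Viterbi decoding for a discrete HMM.
# All model tables below are illustrative assumptions.
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path and its log-probability."""
    # V[t][s]: best log-probability of any path ending in state s at time t
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = prev
    # Trace back from the best final state to recover the full path.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    path.reverse()
    return path, V[-1][last]

# Hypothetical two-state example
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
path, logp = viterbi(("walk", "shop", "clean"), states, start_p, trans_p, emit_p)
print(path)  # → ['Sunny', 'Rainy', 'Rainy']
```

Because it maximizes over predecessor states at each step rather than summing, this yields the single most probable state sequence, which is what distinguishes Viterbi decoding from the Forward-Backward algorithm listed below.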
- Example(s):
- Koen Dejonghe's Perl-based Viterbi Library, http://search.cpan.org/~koen/Algorithm-Viterbi-0.01/lib/Algorithm/Viterbi.pm
- Paul Fodor's Java-based Viterbi Program, http://www.cs.stonybrook.edu/~pfodor/viterbi/Viterbi.java
- Amit Kansal's Soft-Decision Viterbi Decoding with Puncturing Program, https://www.google.com/search?q=Soft-Decision+Viterbi+Decoding+with+Puncturing+Amit+Kansal
- Jacob Shin's General Viterbi Program, https://www.google.com/search?q=Jacob+Shin's+General+Viterbi+Algorithm
- Counter-Example(s):
- See: Forward Backward Algorithm, Encoder-Decoder Neural Network.
References
2021
- (Wikipedia, 2021) ⇒ https://en.wikipedia.org/wiki/Viterbi_decoder Retrieved:2021-7-4.
- A Viterbi decoder uses the Viterbi algorithm for decoding a bitstream that has been
encoded using a convolutional code or trellis code.
There are other algorithms for decoding a convolutionally encoded stream (for example, the Fano algorithm). The Viterbi algorithm is the most resource-consuming of these, but it performs maximum likelihood decoding. It is most often used for decoding convolutional codes with constraint lengths k≤3, but values up to k=15 are used in practice.
Viterbi decoding was developed by Andrew J. Viterbi and published in 1967. There are both hardware (in modems) and software implementations of a Viterbi decoder.
Viterbi decoding is used in the iterative Viterbi decoding algorithm.
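The convolutional-code use of the algorithm described above can be sketched as follows. This is a hedged, illustrative hard-decision decoder for the rate-1/2, constraint-length-3 code with generators 7 and 5 (octal); the function names and message are assumptions for the example, not from any cited implementation:

```python
# Illustrative hard-decision Viterbi decoding of a rate-1/2, K=3
# convolutional code with generator polynomials 7 and 5 (octal).
G1, G2 = 0b111, 0b101  # generator polynomials

def encode(bits):
    """Convolutionally encode a bit list; the state is the last two input bits."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state  # [current bit, m1, m2]
        out += [bin(reg & G1).count("1") % 2, bin(reg & G2).count("1") % 2]
        state = (reg >> 1) & 0b11  # shift register: new state = (b, m1)
    return out

def viterbi_decode(received, n_bits):
    """Recover the most likely input bits from a (possibly noisy) codeword."""
    INF = float("inf")
    metric = [0] + [INF] * 3  # path metric per trellis state; start in state 0
    history = []
    for t in range(n_bits):
        r = received[2 * t: 2 * t + 2]
        new_metric = [INF] * 4
        back = [None] * 4
        for state in range(4):
            if metric[state] == INF:
                continue
            for b in (0, 1):
                reg = (b << 2) | state
                expect = [bin(reg & G1).count("1") % 2,
                          bin(reg & G2).count("1") % 2]
                nxt = (reg >> 1) & 0b11
                # Branch metric: Hamming distance to the received pair
                m = metric[state] + (expect[0] != r[0]) + (expect[1] != r[1])
                if m < new_metric[nxt]:
                    new_metric[nxt] = m
                    back[nxt] = (state, b)
        history.append(back)
        metric = new_metric
    # Trace back from the best-metric final state.
    state = min(range(4), key=lambda s: metric[s])
    bits = []
    for back in reversed(history):
        state, b = back[state]
        bits.append(b)
    return bits[::-1]

msg = [1, 0, 1, 1, 0, 0]               # example message
code = encode(msg)
code[3] ^= 1                           # flip one bit to simulate channel noise
print(viterbi_decode(code, len(msg)))  # → [1, 0, 1, 1, 0, 0]
```

Even with one transmitted bit corrupted, the decoder recovers the original message, illustrating the error-correcting property that makes maximum-likelihood Viterbi decoding worth its cost.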
2009
- http://www.stanford.edu/class/cs224s/hw4.html
- Today's homework ... is to decode a mystery sequence of digits that are originally from a speech file. You are going to do this by implementing the Viterbi decoding algorithm, and applying it to a file we will give you that contains phone likelihoods from GMMs, together with a lexicon. The output of your program should be the correct word sequence and the path probability of the most-likely path.
2005
- (Finkel et al., 2005) ⇒ Jenny Rose Finkel, Trond Grenager, and Christopher Manning. (2005). “Incorporating Non-local Information Into Information Extraction Systems by Gibbs Sampling.” In: Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics. doi:10.3115/1219840.1219885
- QUOTE: To verify the effectiveness of Gibbs sampling and simulated annealing as an inference technique for hidden state sequence models, we compare Gibbs and Viterbi inference methods for a basic CRF, without the addition of any non-local model. The results, given in Table 1, show that if the Gibbs sampler is run long enough, its accuracy is the same as a Viterbi decoder.