Skip-Gram NNLM Algorithm

A [[Skip-Gram NNLM Algorithm]] is an [[NNLM algorithm]] that uses [[skip-gram co-occurrence statistic]]s (to predict [[surrounding word]]s using the [[target word]]).
* <B>Context:</B>
** It can be a [[Skip-Gram with Negative-Sampling (SGNS)]].
** It can be implemented by a [[Skip-Gram Word Embedding System]] (such as [[word2vec]]).
** It has a [[training complexity]] proportional to <math>Q = C \times (D + D \times \log_2(V))</math>, where <math>C</math> is the [[maximum distance]] of the words, <math>D</math> is the word vector dimensionality, and <math>V</math> is the [[vocabulary]] size.
** It has performance similar to [[CBOW NNLM]].
* <B>Example(s):</B>
** https://code.google.com/p/word2vec/source/browse/trunk/word2vec.c?spec=svn42&r=42#482
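The objective above (predicting the surrounding words from the target word) can be sketched in a few lines of code. This is a minimal illustrative sketch, not the word2vec implementation: the toy token ids, vector dimensions, learning rate, and function names are all assumptions introduced here.

```python
import math
import random

# Minimal sketch of skip-gram training-pair generation and one
# Skip-Gram-with-Negative-Sampling (SGNS) update step. Token ids,
# dimensions, and function names are illustrative assumptions.

def skipgram_pairs(tokens, window=2):
    """Return (target, context) id pairs within a maximum distance `window`."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

def sgns_step(W_in, W_out, target, context, negatives, lr=0.025):
    """One SGNS gradient step: pull the true context closer, push negatives away."""
    v = W_in[target]
    grad_v = [0.0] * len(v)  # gradient for the target vector, applied at the end
    for ctx, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        u = W_out[ctx]
        score = 1.0 / (1.0 + math.exp(-sum(a * b for a, b in zip(v, u))))
        g = lr * (label - score)  # positive pair raises the score, negatives lower it
        grad_v = [gv + g * uu for gv, uu in zip(grad_v, u)]
        W_out[ctx] = [uu + g * vv for uu, vv in zip(u, v)]
    W_in[target] = [vv + gv for vv, gv in zip(v, grad_v)]

random.seed(0)
V, D = 10, 4  # toy vocabulary size and vector dimensionality
W_in = [[random.uniform(-0.5, 0.5) for _ in range(D)] for _ in range(V)]
W_out = [[0.0] * D for _ in range(V)]
for target, context in skipgram_pairs([0, 1, 2, 3], window=2):
    sgns_step(W_in, W_out, target, context, negatives=[5, 7])
```

Accumulating the target-vector gradient and applying it once per step mirrors the structure of the word2vec.c training loop linked above.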
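The training-complexity estimate <math>Q = C \times (D + D \times \log_2(V))</math> can be evaluated directly; the parameter values below are assumptions chosen for illustration, not figures stated on this page.

```python
import math

# Illustrative evaluation of the training-complexity estimate
# Q = C * (D + D * log2(V)): C context positions, each costing D for the
# projection plus D * log2(V) for a hierarchical-softmax output layer.
# The parameter values below are assumptions chosen for illustration.

def skipgram_complexity(C, D, V):
    """Per-training-example complexity estimate for the skip-gram model."""
    return C * (D + D * math.log2(V))

q = skipgram_complexity(10, 300, 1_000_000)  # e.g. C=10, D=300, V=1M
```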

Revision as of 03:18, 19 February 2015



