Skip-Gram with Negative Sampling Algorithm
A Skip-Gram with Negative Sampling Algorithm is a skip-gram algorithm that replaces the full softmax over the vocabulary with a binary classification objective: for each observed (target word, context word) pair it performs negative sampling, drawing k random context words from a noise distribution, and trains the word vectors to distinguish the observed pair from the sampled ones.
- AKA: SGNS.
- Example(s): the negative-sampling training method of Mikolov et al.'s word2vec tool (a single training step is sketched after this list).
- See: Continuous-BoW Algorithm.
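The following is a minimal sketch of one SGNS update step in Python/NumPy, assuming toy hyperparameters and a uniform noise distribution for illustration; the array names (W_in, W_out) and the sgns_step helper are illustrative assumptions, not the reference word2vec implementation.

```python
# Minimal SGNS sketch: one stochastic update for a single (target, context) pair.
# Per-pair objective being maximized (see Goldberg & Levy, 2014):
#   log sigmoid(w . c)  +  sum over k negatives of  log sigmoid(-w . c_neg)
import numpy as np

rng = np.random.default_rng(0)

vocab_size, dim, k, lr = 1000, 50, 5, 0.025              # hypothetical toy hyperparameters
W_in = rng.normal(scale=0.01, size=(vocab_size, dim))    # target ("input") vectors
W_out = np.zeros((vocab_size, dim))                      # context ("output") vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(target, context, noise_probs):
    """One SGD update for one observed (target, context) pair."""
    # Draw k negative context words from the noise distribution.
    negatives = rng.choice(vocab_size, size=k, p=noise_probs)
    w = W_in[target]
    grad_w = np.zeros(dim)
    # The observed pair gets label 1, each sampled pair gets label 0.
    for c, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        score = sigmoid(np.dot(w, W_out[c]))
        g = lr * (label - score)        # gradient of the log-likelihood w.r.t. the score
        grad_w += g * W_out[c]          # accumulate before updating the context vector
        W_out[c] += g * w
    W_in[target] += grad_w

# Usage with a uniform noise distribution, for illustration only;
# word2vec instead raises unigram counts to the 0.75 power before normalizing.
uniform = np.full(vocab_size, 1.0 / vocab_size)
sgns_step(target=3, context=17, noise_probs=uniform)
```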
References
2014
- (Levy & Goldberg, 2014) ⇒ Omer Levy, and Yoav Goldberg. (2014). “Neural Word Embedding As Implicit Matrix Factorization.” In: Advances in Neural Information Processing Systems.
- QUOTE: Recently, there has been a surge of work proposing to represent words as dense vectors, derived using various training methods inspired from neural-network language modeling [3, 9, 23, 21].
- (Goldberg & Levy, 2014) ⇒ Yoav Goldberg, and Omer Levy. (2014). “word2vec Explained: Deriving Mikolov Et Al.'s Negative-sampling Word-embedding Method.” In: arXiv preprint arXiv:1402.3722.
2013
- (Mikolov et al., 2013) ⇒ Tomáš Mikolov, Ilya Sutskever, Kai Chen, Greg S. Corrado, and Jeff Dean. (2013). “Distributed Representations of Words and Phrases and their Compositionality.” In: Advances in Neural Information Processing Systems, 26.