word2vec Algorithm
A word2vec Algorithm is a continuous dense distributional word model training algorithm associated with a word2vec-like system (a minimal training sketch is given after the list below).
- Context:
- It was initially developed by Mikolov et al. (2013) and subsequently analyzed and explained by Goldberg & Levy (2014).
- It can include a:
- …
- Example(s):
- TensorFlow Tutorial on Word2Vec: https://www.tensorflow.org/tutorials/text/word2vec
- Counter-Example(s):
- See: Bag-of-Words Representation, Word Context Vectors, Continuous BoW-based Word Embedding Algorithm.
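The sketch below illustrates the core training loop of skip-gram with negative sampling (SGNS) in NumPy. The toy corpus, hyperparameters, and the uniform negative sampler are illustrative assumptions; the original word2vec tool draws negatives from a smoothed unigram distribution and adds optimizations (frequent-word subsampling, learning-rate decay) omitted here.
```python
import numpy as np

# Toy corpus and vocabulary (illustrative assumption, not from the original tool).
corpus = [["the", "dog", "chased", "the", "cat"],
          ["the", "cat", "sat", "on", "the", "mat"]]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

dim, window, k, lr, epochs = 16, 2, 5, 0.05, 200
rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.1, (len(vocab), dim))  # input (word) vectors
C = rng.normal(0.0, 0.1, (len(vocab), dim))  # output (context) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for sent in corpus:
        for pos in range(len(sent)):
            w = idx[sent[pos]]
            lo, hi = max(0, pos - window), min(len(sent), pos + window + 1)
            for cpos in range(lo, hi):
                if cpos == pos:
                    continue
                # One positive context plus k uniform random negatives.
                # (Real word2vec samples negatives from a smoothed unigram
                # distribution and re-draws collisions with the true context.)
                pairs = [(idx[sent[cpos]], 1.0)]
                pairs += [(int(rng.integers(len(vocab))), 0.0) for _ in range(k)]
                for c, label in pairs:
                    # SGD step on the logistic loss for this (word, context) pair.
                    g = lr * (label - sigmoid(W[w] @ C[c]))
                    w_old = W[w].copy()
                    W[w] += g * C[c]
                    C[c] += g * w_old

# Nearest neighbours of "cat" by cosine similarity (top hit is "cat" itself).
v = W[idx["cat"]]
sims = (W @ v) / (np.linalg.norm(W, axis=1) * np.linalg.norm(v) + 1e-9)
print([vocab[i] for i in np.argsort(-sims)[:3]])
```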
References
2021
- (TensorFlow, 2021) ⇒ https://www.tensorflow.org/tutorials/text/word2vec Retrieved:2021-05-09.
- QUOTE: Word2Vec is not a singular algorithm, rather, it is a family of model architectures and optimizations that can be used to learn word embeddings from large datasets. Embeddings learned through Word2Vec have proven to be successful on a variety of downstream natural language processing tasks.
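As an illustration of that "family", the gensim library (one widely used implementation; the flag names below are gensim's, version 4.x, not part of word2vec itself) exposes the architecture and optimization choices as constructor arguments:
```python
# Toy usage of gensim's Word2Vec showing the architecture/optimization family.
from gensim.models import Word2Vec

sentences = [["the", "dog", "chased", "the", "cat"],
             ["the", "cat", "sat", "on", "the", "mat"]]

# Architecture: skip-gram (sg=1); optimization: negative sampling (negative=5).
sgns = Word2Vec(sentences, vector_size=100, window=5, min_count=1,
                sg=1, negative=5)

# Architecture: CBOW (sg=0); optimization: hierarchical softmax (hs=1, negative=0).
cbow_hs = Word2Vec(sentences, vector_size=100, window=5, min_count=1,
                   sg=0, hs=1, negative=0)

print(sgns.wv.most_similar("cat", topn=2))  # toy corpus, so neighbours are noisy
```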
2014a
- (Levy & Goldberg, 2014) ⇒ Omer Levy, and Yoav Goldberg. (2014). “Neural Word Embedding As Implicit Matrix Factorization.” In: Advances in Neural Information Processing Systems.
- QUOTE: We analyze skip-gram with negative-sampling (SGNS), a word embedding method introduced by Mikolov et al., and show that it is implicitly factorizing a word-context matrix, whose cells are the pointwise mutual information (PMI) of the respective word and context pairs, shifted by a global constant.
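In the paper's notation, this result says that at the optimum the dot product of a word vector and a context vector recovers the PMI shifted by log k, where k is the number of negative samples (a sketch of the stated identity):
```latex
\vec{w} \cdot \vec{c}
  \;=\; \operatorname{PMI}(w, c) - \log k
  \;=\; \log \frac{P(w, c)}{P(w)\,P(c)} - \log k
```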
2014b
- (Goldberg & Levy, 2014) ⇒ Yoav Goldberg, and Omer Levy. (2014). “word2vec Explained: Deriving Mikolov et al.'s Negative-Sampling Word-Embedding Method.” In: arXiv preprint arXiv:1402.3722.
- QUOTE: The word2vec software of Tomáš Mikolov and colleagues (this https URL) has gained a lot of traction lately, and provides state-of-the-art word embeddings. ...
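The negative-sampling objective that this note derives for a single word–context pair (w, c) can be sketched as follows, where σ(x) = 1/(1 + e^(−x)), k is the number of negative samples, and P_D denotes the empirical distribution from which negative contexts are drawn:
```latex
\log \sigma(\vec{w} \cdot \vec{c})
  \;+\; k \cdot \mathbb{E}_{c_N \sim P_D}\!\left[ \log \sigma(-\vec{w} \cdot \vec{c}_N) \right]
```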