Gaussian Word Embedding System
A Gaussian Word Embedding System is a word embedding system that is a Gaussian embedding system (it represents each word as a multivariate Gaussian distribution rather than as a single point vector).
- Context:
- It can implement a Gaussian Word Embedding Algorithm to solve a Gaussian Word Embedding Task.
- …
- Example(s):
- Counter-Example(s):
- See: Gaussian Process, Gaussian Vector Space.
References
2016
- (GitHub, 2016) ⇒ https://github.com/seomoz/word2gauss
- QUOTE: Python/Cython implementation of Luke Vilnis and Andrew McCallum Word Representations via Gaussian Embedding, ICLR 2015 that represents each word as a multivariate Gaussian. ...
... Instead of representing a word as a vector as in word2vec, word2gauss represents each word as a multivariate Gaussian. Assuming some dictionary of known tokens w[i], i = 0 .. N-1, each word is represented as a probability P[i], a K dimensional Gaussian parameterized by …
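The quote above describes the core idea: each token is mapped to a K-dimensional Gaussian (a mean vector plus a covariance, here taken as diagonal), and similarity between two words can be scored with the expected-likelihood (inner-product) kernel of their two Gaussians. The following is a minimal sketch of that representation, assuming diagonal covariances; the names (`make_word`, `expected_likelihood`) are illustrative and are not the actual word2gauss API.

```python
import numpy as np

K = 4  # embedding dimensionality (illustrative)
rng = np.random.default_rng(0)

def make_word():
    # Each word = K-dim Gaussian: mean vector mu, per-dimension variance sigma2.
    return {"mu": rng.normal(size=K), "sigma2": np.full(K, 0.5)}

def expected_likelihood(w1, w2):
    """Log of the expected-likelihood kernel between two diagonal Gaussians:
    log N(mu1 - mu2; 0, Sigma1 + Sigma2)."""
    var = w1["sigma2"] + w2["sigma2"]
    diff = w1["mu"] - w2["mu"]
    return -0.5 * np.sum(np.log(2 * np.pi * var) + diff ** 2 / var)

cat, dog = make_word(), make_word()
print(expected_likelihood(cat, dog))
```

The kernel is symmetric, and a word's self-similarity is maximal when variances are equal, since the mean-difference term vanishes.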
2015
- (Vilnis & McCallum, 2015) ⇒ Luke Vilnis, and Andrew McCallum. (2015). “Word Representations via Gaussian Embedding.” In: Proceedings of the International Conference on Learning Representations (ICLR-2015).
- QUOTE: We draw inspiration from this work to propose novel word embedding algorithms that embed words directly as Gaussian distributional potential functions in an infinite dimensional function space. This allows us to map word types not only to vectors but to soft regions in space, modeling uncertainty, inclusion, and entailment, as well as providing a rich geometry of the latent space.
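The quote above highlights that Gaussian embeddings can model inclusion and entailment via an asymmetric comparison between distributions; one standard choice for such a comparison is the KL divergence. The sketch below illustrates the asymmetry for diagonal Gaussians: a "specific" word with a tight region nested inside a "general" word's broad region has small KL in one direction and large KL in the other. The setup is a toy illustration, not the paper's training objective.

```python
import numpy as np

def kl_diag(mu1, s1, mu2, s2):
    """KL( N(mu1, diag(s1)) || N(mu2, diag(s2)) ) for diagonal Gaussians."""
    return 0.5 * np.sum(s1 / s2 + (mu2 - mu1) ** 2 / s2 - 1.0 + np.log(s2 / s1))

mu = np.zeros(3)
general = (mu, np.full(3, 4.0))    # large variance -> broad soft region
specific = (mu, np.full(3, 0.5))   # small variance -> tight soft region

print(kl_diag(*specific, *general))  # small: specific's mass fits inside general
print(kl_diag(*general, *specific))  # large: general's mass does not fit in specific
```

The direction with the smaller divergence suggests the entailment "specific ⊨ general", which is information a symmetric point-vector similarity cannot express.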