Gaussian Word Embedding Algorithm
A Gaussian Word Embedding Algorithm is a word embedding algorithm that is a Gaussian embedding algorithm (i.e., it represents each word as a Gaussian distribution rather than a point vector).
- Context:
- It can be implemented by a Gaussian Word Embedding System.
- …
- Example(s):
- the word embedding algorithm proposed in Vilnis & McCallum (2015).
- …
- Counter-Example(s):
- …
- See: Gaussian Process, Gaussian Vector Space.
References
2015
- (Vilnis & McCallum, 2015) ⇒ Luke Vilnis, and Andrew McCallum. (2015). “Word Representations via Gaussian Embedding.” In: Proceedings of the International Conference on Learning Representations (ICLR-2015).
- QUOTE: We draw inspiration from this work to propose novel word embedding algorithms that embed words directly as Gaussian distributional potential functions in an infinite dimensional function space. This allows us to map word types not only to vectors but to soft regions in space, modeling uncertainty, inclusion, and entailment, as well as providing a rich geometry of the latent space.
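The asymmetric geometry described in the quote can be sketched with diagonal-covariance Gaussians and the KL divergence, one of the energy functions used in Vilnis & McCallum (2015). The word pair, dimensions, and parameter values below are hypothetical illustrations, not the paper's trained embeddings: a general word gets a broad Gaussian, a specific word a narrow one, and the asymmetry of KL models entailment.

```python
import numpy as np

def kl_diag_gaussians(mu0, var0, mu1, var1):
    """KL( N(mu0, diag(var0)) || N(mu1, diag(var1)) ) for diagonal covariances."""
    d = mu0.shape[0]
    return 0.5 * (
        np.sum(var0 / var1)                    # trace term
        + np.sum((mu1 - mu0) ** 2 / var1)      # Mahalanobis term
        - d
        + np.sum(np.log(var1)) - np.sum(np.log(var0))  # log-determinant term
    )

# Hypothetical embeddings: a broad "animal" Gaussian and a narrower
# "dog" Gaussian placed near it (values are illustrative, not learned).
rng = np.random.default_rng(0)
dim = 50
mu_animal = rng.normal(size=dim)
var_animal = np.full(dim, 2.0)          # high variance: general concept
mu_dog = mu_animal + 0.1 * rng.normal(size=dim)
var_dog = np.full(dim, 0.5)             # low variance: specific concept

# Asymmetry as soft entailment: the specific word's mass sits inside the
# general word's region, so KL(dog || animal) < KL(animal || dog).
kl_specific_to_general = kl_diag_gaussians(mu_dog, var_dog, mu_animal, var_animal)
kl_general_to_specific = kl_diag_gaussians(mu_animal, var_animal, mu_dog, var_dog)
```

Using the negative KL divergence as the energy makes word similarity directional, which point-vector embeddings with cosine similarity cannot express.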