Gaussian Word Embedding Algorithm
A Gaussian Word Embedding Algorithm is a word embedding algorithm that is a Gaussian embedding algorithm, i.e. one that represents each word as a Gaussian distribution (a mean vector together with a covariance) rather than as a single point vector.
- Context:
- It can be implemented by a Gaussian Word Embedding System (an illustrative sketch is given below).
- …
- Example(s):
- the word embedding algorithm proposed in Vilnis & McCallum (2015).
- …
- Counter-Example(s):
- See: Gaussian Process, Gaussian Vector Space.
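
The following is a minimal illustrative sketch, not the authors' implementation, of how such an algorithm can represent words as diagonal Gaussians and score word pairs with two closed-form energies of the kind discussed in Vilnis & McCallum (2015): an asymmetric KL-divergence energy (suited to modeling inclusion/entailment) and a symmetric expected-likelihood (Gaussian inner-product) energy. All class and function names below (e.g., GaussianWordEmbedding, kl_energy) are hypothetical.

```python
import numpy as np

class GaussianWordEmbedding:
    """Each word is stored as a diagonal Gaussian N(mu, diag(sigma^2))."""
    def __init__(self, vocab, dim, rng=None):
        rng = rng or np.random.default_rng(0)
        self.index = {w: i for i, w in enumerate(vocab)}
        self.mu = rng.normal(scale=0.1, size=(len(vocab), dim))  # mean vectors
        self.log_sigma2 = np.zeros((len(vocab), dim))             # diagonal log-variances

    def params(self, word):
        i = self.index[word]
        return self.mu[i], np.exp(self.log_sigma2[i])

def kl_energy(mu_p, var_p, mu_q, var_q):
    """KL(N_p || N_q) for diagonal Gaussians; asymmetric, so it can express
    inclusion/entailment (a specific word's soft region sitting inside a broader one)."""
    return 0.5 * np.sum(var_p / var_q + (mu_q - mu_p) ** 2 / var_q
                        - 1.0 + np.log(var_q) - np.log(var_p))

def expected_likelihood(mu_p, var_p, mu_q, var_q):
    """Log of the Gaussian inner product: integral of N_p(x) N_q(x) dx
    = N(0; mu_p - mu_q, var_p + var_q); a symmetric similarity between soft regions."""
    var = var_p + var_q
    diff = mu_p - mu_q
    return -0.5 * np.sum(np.log(2 * np.pi * var) + diff ** 2 / var)

# Usage with two toy words (untrained parameters, for illustration only):
emb = GaussianWordEmbedding(["animal", "dog"], dim=5)
mu_a, var_a = emb.params("animal")
mu_d, var_d = emb.params("dog")
print(kl_energy(mu_d, var_d, mu_a, var_a))            # low value would suggest "dog" entails "animal"
print(expected_likelihood(mu_a, var_a, mu_d, var_d))  # symmetric similarity score
```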
References
2015
- (Vilnis & McCallum, 2015) ⇒ Luke Vilnis, and Andrew McCallum. (2015). “Word Representations via Gaussian Embedding.” In: Proceedings of the International Conference on Learning Representations (ICLR-2015). http://arxiv.org/pdf/1412.6623v1.pdf
- QUOTE: We draw inspiration from this work to propose novel word embedding algorithms that embed words directly as Gaussian distributional potential functions in an infinite dimensional function space. This allows us to map word types not only to vectors but to soft regions in space, modeling uncertainty, inclusion, and entailment, as well as providing a rich geometry of the latent space.
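
As a follow-up to the sketch above, the snippet below is a hedged illustration of how Gaussian word embeddings of this kind are commonly trained: a max-margin ranking objective that pushes the energy of an observed (word, context) pair above that of a negatively sampled pair. The helper name and the margin value are illustrative assumptions, not the paper's code.

```python
def ranking_loss(energy_pos, energy_neg, margin=1.0):
    """Hinge loss: the positive pair's energy should exceed the negative pair's by at least `margin`."""
    return max(0.0, margin - energy_pos + energy_neg)

# Example, reusing expected_likelihood from the sketch above as the energy:
# loss = ranking_loss(expected_likelihood(mu_w, var_w, mu_ctx, var_ctx),
#                     expected_likelihood(mu_w, var_w, mu_neg, var_neg))
```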