Gaussian Word Embedding Algorithm



=== 2015 ===
* ([[2015_WordRepresentationsviaGaussianE|Vilnis & McCallum, 2015]]) ⇒ [[Luke Vilnis]], and [[Andrew McCallum]]. ([[2015]]). “[http://arxiv.org/pdf/1412.6623v1.pdf Word Representations via Gaussian Embedding].” In: [[Proceedings of the International Conference on Learning Representation]]s ([[ICLR-2015]]).
** QUOTE: We draw inspiration from this work to propose novel [[word embedding algorithm]]s that embed words directly as [[Gaussian distributional potential function]]s in an [[infinite dimensional function]] [[space]]. </s> This allows us to map word types not only to [[vector]]s but to [[soft regions in space]], [[modeling uncertainty]], [[inclusion]], and [[entailment]], as well as providing a rich [[geometry]] of the [[latent space]]. </s>
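
The core mechanic described in the quote lends itself to a small illustration: each word is represented by a mean vector plus a covariance, and an asymmetric divergence between two word Gaussians can score soft inclusion/entailment. Below is a minimal Python sketch, assuming diagonal-covariance Gaussians and using KL divergence as the asymmetric score (as in Vilnis & McCallum, 2015); the word pair, means, and variances are invented toy values, not learned embeddings from the paper.

<pre>
import numpy as np

def kl_diag_gaussians(mu0, var0, mu1, var1):
    # KL(N0 || N1) for diagonal Gaussians. Asymmetric: it is small
    # when N0 fits inside N1, so it can score soft inclusion/entailment.
    return 0.5 * np.sum(var0 / var1 + (mu1 - mu0) ** 2 / var1
                        - 1.0 + np.log(var1 / var0))

# Hypothetical toy embeddings: a specific word gets a tight Gaussian,
# a more general word gets a broad one covering a larger soft region.
mu_specific, var_specific = np.array([1.0, 0.5]), np.array([0.1, 0.1])
mu_general,  var_general  = np.array([0.9, 0.6]), np.array([1.0, 1.0])

# Small (~1.4): the specific word's region sits inside the general word's.
print(kl_diag_gaussians(mu_specific, var_specific, mu_general, var_general))
# Large (~6.8): the reverse direction, so entailment does not hold.
print(kl_diag_gaussians(mu_general, var_general, mu_specific, var_specific))
</pre>

The asymmetry of KL divergence is what lets such an embedding express directional relations like entailment, which symmetric measures such as cosine similarity between point vectors cannot capture.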


