Distributional Word Vectorizing Function
A Distributional Word Vectorizing Function is a word vectorizing function that is a distributional text-item vectorizing function (which maps a word into a distributional word vector from a distributional word vector space).
- AKA: Lexical Distributed Representation.
- Context:
- It can be created by a Distributional Word Vectorizing Function Creation System (that solves a Distributional Word Vectorizing Function Creation Task).
- It can define a Distributional Word Vector Space.
- It can range from being a Continuous Distributional Word Vectorizing Function to being a Discrete Distributional Word Vectorizing Function.
- It can range from being a Dense Distributional Word Vectorizing Function to being a Sparse Distributional Word Vectorizing Function.
- …
- Example(s):
- a word2vec word vectorizing model (see the sketch after this outline).
- one created by SemanticVectors.
- …
- Counter-Example(s):
- a Distributional Phrase Embeddings Model.
- a Distributional Sentence Embeddings Model/Distributional Sentence Vectorizing Function.
- a Distributional Document Embeddings Model.
- a Statistical Language Model.
- a Word Classification Function.
- See: Word Embeddings, Vector Space, Latent Concept.
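The word2vec example above can be made concrete with a short sketch. The snippet below is a minimal illustration, assuming the gensim library as the word2vec implementation; the toy corpus, parameter values, and the vectorize_word helper are illustrative assumptions, not part of the concept's definition.

```python
# Minimal sketch of a distributional word vectorizing function, assuming gensim's
# word2vec implementation (any distributional trainer could play the same role).
from gensim.models import Word2Vec

# Toy corpus (illustrative only): each sentence is a list of word tokens.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "dog"],
]

# Train a dense, continuous distributional word vector space from local co-occurrence context.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, epochs=50)

def vectorize_word(word):
    """Distributional word vectorizing function: maps a word to its
    distributional word vector in the learned vector space."""
    return model.wv[word]

print(vectorize_word("cat").shape)        # (50,)
print(model.wv.similarity("cat", "dog"))  # distributional similarity between two words
```

In this sketch the trained model defines the distributional word vector space, and vectorize_word is the dense, continuous distributional word vectorizing function it induces.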
References
2015
- (Vilnis & McCallum, 2015) ⇒ Luke Vilnis, and Andrew McCallum. (2015). “Word Representations via Gaussian Embedding.” In: arXiv preprint arXiv:1412.6623 submitted to ICRL 2015.
- QUOTE: Current work in lexical distributed representations maps each word to a point vector in low-dimensional space. Mapping instead to a density provides many interesting advantages, including better capturing uncertainty about a representation and its relationships, expressing asymmetries more naturally than dot product or cosine similarity, and enabling more expressive parameterization of decision boundaries.
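The "density" representation in this quote can be illustrated with a small numerical sketch. The code below assumes toy diagonal-Gaussian parameters for two hypothetical words and uses the KL divergence as the asymmetric score the quote alludes to; it is not the authors' implementation.

```python
# Sketch of the Gaussian-embedding idea: each word is a density (mean + diagonal
# covariance) rather than a point vector, and KL divergence gives an asymmetric
# relatedness score, unlike dot product or cosine similarity.
import numpy as np

def kl_diag_gaussians(mu_p, var_p, mu_q, var_q):
    """KL(P || Q) for diagonal Gaussians P and Q; note KL(P||Q) != KL(Q||P)."""
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )

# Hypothetical parameters: a broad "animal" density and a narrower "cat" density.
mu_animal, var_animal = np.zeros(3), np.full(3, 2.0)
mu_cat,    var_cat    = np.array([0.3, -0.1, 0.2]), np.full(3, 0.5)

# The two directions differ, which is what lets densities encode asymmetric relations.
print(kl_diag_gaussians(mu_cat, var_cat, mu_animal, var_animal))
print(kl_diag_gaussians(mu_animal, var_animal, mu_cat, var_cat))
```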
2013
- https://code.google.com/p/semanticvectors/wiki/Ideas
- QUOTE: There are many reasons for investigating the relationships between semantic vector models and formal ontology models including taxonomies, conceptual graphs, Cyc, RDF, Wikipedia relationships, etc.