Vector Space Mapping Model (VSM)
A Vector Space Mapping Model (VSM) is a mapping model that represents each instance as a vector in a vector space.
- AKA: Vector-Space Representation.
- Context:
- It can be produced by a Vector Space Learning Task (solved by a vector space learning system).
- It can range from being a Binary Word Vector Space, to being an Integer Word Vector Space, to being a Continuous Word Vector Space.
- It can range from being a Semantic Vector Space-based Model to being a Syntactic Vector Space-based Model.
- It can range from being a High-Dimensional Vector Space Model to being a Low-Dimensional Vector Space Model.
- It can (often) include a Vector Space Mapping Function (to map an external object into the model); a minimal sketch follows this list.
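The sketch below illustrates one such vector space mapping function over a fixed toy vocabulary. The function name `to_word_vector`, the whitespace tokenizer, and the vocabulary are illustrative assumptions, not part of any standard API; the `mode` argument shows how the same mapping can yield a binary, integer, or continuous word vector space.

```python
from collections import Counter

def to_word_vector(text, vocabulary, mode="integer"):
    """Map a text item into a word vector space over a fixed vocabulary.

    mode="binary"     -> 0/1 term presence      (Binary Word Vector Space)
    mode="integer"    -> raw term counts        (Integer Word Vector Space)
    mode="continuous" -> relative term frequencies (Continuous Word Vector Space)
    """
    counts = Counter(text.lower().split())  # toy whitespace tokenizer (illustrative)
    if mode == "binary":
        return [1 if term in counts else 0 for term in vocabulary]
    if mode == "integer":
        return [counts[term] for term in vocabulary]
    if mode == "continuous":
        total = sum(counts.values()) or 1
        return [counts[term] / total for term in vocabulary]
    raise ValueError(f"unknown mode: {mode}")

vocabulary = ["vector", "space", "model", "word"]
print(to_word_vector("a vector space model maps a word to a vector", vocabulary, "integer"))
# -> [2, 1, 1, 1]
```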
- Example(s):
- a Text-Item Vector Space, such as a word vector space model for a 20 Newsgroups dataset (see the sketch after this list).
- a Visual-Item Vector Space.
- a DNA Vector Space.
- …
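As a concrete illustration of a Text-Item Vector Space, the following sketch builds a word vector space model for the 20 Newsgroups dataset with scikit-learn. It assumes scikit-learn is installed and downloads the corpus on first use; the chosen categories and `max_features` value are arbitrary.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer

# Load a small slice of the 20 Newsgroups corpus (downloads on first use).
newsgroups = fetch_20newsgroups(subset="train",
                                categories=["sci.space", "rec.autos"],
                                remove=("headers", "footers", "quotes"))

# Map each document into a continuous (tf-idf weighted) word vector space.
vectorizer = TfidfVectorizer(max_features=5000, stop_words="english")
doc_term_matrix = vectorizer.fit_transform(newsgroups.data)
print(doc_term_matrix.shape)   # (n_documents, n_vocabulary_terms)

# The fitted vectorizer acts as the vector space mapping function for
# external objects: new texts are projected into the learned space.
new_vector = vectorizer.transform(["the rocket engine failed on launch"])
```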
- Counter-Example(s):
- See: Vector Valued Function, Bag-of-Words Model.
References
2010
- (Turney & Pantel, 2010) ⇒ Peter D. Turney, and Patrick Pantel. (2010). “From Frequency to Meaning: Vector Space Models of Semantics.” In: Journal of Artificial Intelligence Research, 37(1).
- QUOTE: ... This intimate connection between the distributional hypothesis and VSMs is a strong motivation for taking a close look at VSMs. Not all uses of vectors and matrices count as vector space models. For the purposes of this survey, we take it as a defining property of VSMs that the values of the elements in a VSM must be derived from event frequencies, such as the number of times that a given word appears in a given context (see Section 2.6).
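A minimal sketch of the frequency-derived matrices the quote describes: it builds a word-by-context co-occurrence count matrix from a toy corpus, assuming a ±1-word context window and whitespace tokenization (both illustrative choices, not Turney & Pantel's specific construction).

```python
from collections import defaultdict

corpus = ["the cat sat on the mat", "the dog sat on the rug"]

# Count how often each word occurs next to each context word (+/- 1 window).
cooccurrence = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        for j in (i - 1, i + 1):
            if 0 <= j < len(tokens):
                cooccurrence[word][tokens[j]] += 1

# Each row of this word-by-context matrix is a vector whose element values
# are derived from event frequencies, the defining property quoted above.
print(dict(cooccurrence["sat"]))   # -> {'cat': 1, 'on': 2, 'dog': 1}
```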