Distributional Semantic Modeling Task
A Distributional Semantic Modeling Task is a data-driven semantic modeling task that uses co-occurrence statistics to create a distributional word vector function.
- AKA: Distributional Word Vectorizing Function Generation.
- Context:
- It can be solved by a Distributional Word Vector Model Training System (that implements a Distributional Word Vector Model Creation Algorithm).
- It can be solved by a Distributional Semantic Modeling System (that implements a Distributional Semantic Modeling Algorithm).
- It can range from being a Count-based Distributional Lexical Semantic Modeling Task to being a Word Embedding Model Training Task (see the count-based sketch after this list).
- …
- Example(s):
- “create a distributional model from a 20 newsgroups corpus (e.g. using word2vec).” (See the word2vec sketch after this list.)
- Counter-Example(s):
- See: Text-Item Vectoring Model Creation Task, Distributional Word Vector, Unsupervised Learning, Word-Space Model.
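The count-based end of the range above can be illustrated with a minimal sketch: build a word-by-word co-occurrence matrix over a toy corpus and treat each word's row of counts as its distributional word vector. The toy sentences, window size, and helper names below are illustrative assumptions, not part of the task definition.

```python
# Minimal count-based sketch: co-occurrence counts as distributional vectors.
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]
window = 2  # symmetric context window (illustrative choice)

tokenized = [sent.split() for sent in corpus]
vocab = sorted({w for sent in tokenized for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count how often each word pair co-occurs within the window.
counts = np.zeros((len(vocab), len(vocab)))
for sent in tokenized:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[index[w], index[sent[j]]] += 1

def vector(word):
    """Distributional word vector: the word's row of co-occurrence counts."""
    return counts[index[word]]

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

print(cosine(vector("cat"), vector("dog")))  # words with similar contexts score higher
print(cosine(vector("cat"), vector("mat")))
```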
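The word2vec example can be sketched as follows, assuming scikit-learn for fetching the 20 newsgroups corpus and gensim 4.x for word2vec; the training parameters are illustrative defaults rather than values prescribed by the task.

```python
# Hedged sketch: train a word2vec model on the 20 newsgroups corpus.
from sklearn.datasets import fetch_20newsgroups
from gensim.utils import simple_preprocess
from gensim.models import Word2Vec

# Fetch the corpus and tokenize each document into lowercase tokens.
newsgroups = fetch_20newsgroups(subset="train", remove=("headers", "footers", "quotes"))
sentences = [simple_preprocess(doc) for doc in newsgroups.data]

# Train a skip-gram word2vec model; the result is a distributional
# word vector function mapping each vocabulary word to a dense vector.
model = Word2Vec(sentences=sentences, vector_size=100, window=5,
                 min_count=5, sg=1, workers=4)

# Query the learned vector space.
print(model.wv["computer"][:5])               # first few dimensions of one vector
print(model.wv.most_similar("computer", topn=5))
```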
References
2010
- Magnus Sahlgren. https://www.sics.se/~mange/research.html
- QUOTE: My research is focused on how semantic knowledge is acquired and represented in man and machine. In particular, I study the distributional approach to semantic knowledge acquisition, in which semantic information is extracted from cooccurrence statistics. The underlying idea is that meanings are correlated with the distributional patterns of linguistic entities.