Hyperspace Analogue to Language Algorithm
A Hyperspace Analogue to Language (HAL) Algorithm is a word space modeling algorithm that represents each word as a high-dimensional vector of weighted co-occurrence counts, collected by sliding a fixed-width reading frame through a text corpus.
- See: LSI Algorithm.
References
2015
- http://en.wikipedia.org/wiki/Semantic_memory#Hyperspace_Analogue_to_Language_.28HAL.29
- The Hyperspace Analogue to Language (HAL) model[1][2] considers context only as the words that immediately surround a given word. HAL computes an N×N matrix, where N is the number of words in its lexicon, using a 10-word reading frame that moves incrementally through a corpus of text. As in SAM, any time two words are simultaneously in the frame, the association between them is increased; that is, the corresponding cell in the N×N matrix is incremented. The amount by which the association is incremented varies inversely with the distance between the two words in the frame (specifically, [math]\displaystyle{ \Delta = 11 - d }[/math], where [math]\displaystyle{ d }[/math] is the distance between the two words in the frame), so adjacent words receive a weight of 10 and words at opposite ends of the frame a weight of 1. As in LSA, the semantic similarity between two words is given by the cosine of the angle between their vectors (dimension reduction may be performed on this matrix as well). In HAL, then, two words are semantically related if they tend to appear with the same words. Note that this may hold true even when the words being compared never actually co-occur (e.g., "chicken" and "canary").
- ↑ Lund, K., Burgess, C. & Atchley, R. A. (1995). Semantic and associative priming in a high-dimensional semantic space. Cognitive Science Proceedings (LEA), 660-665.
- ↑ Lund, K. & Burgess, C. (1996). Producing high-dimensional semantic spaces from lexical co-occurrence. Behavior Research Methods, Instruments & Computers, 28(2), 203-208.
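- Below is a minimal Python sketch of the procedure described in the quoted passage: a 10-word reading frame slides through a tokenized corpus, each co-occurrence cell is incremented by 11 - d, and similarity is the cosine between word vectors. The function names and the toy corpus are illustrative, and the symmetric increment is a simplification (the full HAL model keeps separate counts for preceding and following context).

```python
from collections import defaultdict
import math

def hal_matrix(tokens, window=10):
    """Build a HAL-style co-occurrence matrix from a list of tokens.

    A reading frame of `window` words moves incrementally through the
    corpus; whenever two words fall inside the frame together, their
    cell is incremented by (window + 1) - d, i.e. 11 - d for a 10-word
    frame, where d is the distance between the two words.
    """
    counts = defaultdict(lambda: defaultdict(float))
    for i, word in enumerate(tokens):
        for d in range(1, window + 1):
            j = i + d
            if j >= len(tokens):
                break
            delta = (window + 1) - d  # association weight: 11 - d
            # Symmetric increment (a simplification: full HAL keeps
            # separate counts for preceding and following context).
            counts[word][tokens[j]] += delta
            counts[tokens[j]][word] += delta
    return counts

def cosine(u, v):
    """Cosine of the angle between two sparse word vectors."""
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Toy example: words used in similar contexts yield a high cosine.
tokens = "the chicken pecked the seed then the canary pecked the seed".split()
matrix = hal_matrix(tokens)
print(cosine(matrix["chicken"], matrix["canary"]))
```

- On a real corpus the matrix is built over the full vocabulary, and dimension reduction may be applied to the word vectors before computing cosines, as the quoted passage notes.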
2009
- (Recchia & Jones, 2009) ⇒ Gabriel Recchia, and Michael N. Jones. (2009). “More Data Trumps Smarter Algorithms: Comparing Pointwise Mutual Information with Latent Semantic Analysis.” In: Behavior research methods, 41(3).
- QUOTE: Explaining how semantic representations of words are derived from experience is a central task for high-dimensional semantic space models such as the hyperspace analogue to language (HAL) framework of Burgess and Lund (2000), latent semantic analysis (LSA; Landauer & Dumais, 1997), and other lexical co-occurrence models of semantic memory. Although the models differ considerably in the algorithms used, they are all fundamentally based on the principle that a word’s meaning can be induced by observing its statistical usage across a large sample of language.
2000
- (Burgess & Lund, 2000) ⇒ Curt Burgess, and Kevin Lund. (2000). “The Dynamics of Meaning in Memory.” In: Cognitive dynamics: Conceptual and representational change in humans and machines, 13.