Least Recently Used Access (LRUA) Memory
A Least Recently Used Access (LRUA) Memory is a Memory-Augmented Neural Network memory module that directs writes to either the least recently used or the most recently used memory location (address), while reads remain purely content-based.
- AKA: Least Recently Used Addressing Mechanism, Least Recently Accessed Addressing Scheme, Least Recently Used Memory, LRU Unit.
- Context:
- It was first developed by Santoro et al. (2016).
- …
- Example(s):
- Counter-Example(s):
- See: Least Recently Used (LRU) Algorithm, Dynamic Memory Network, Turing Machine, Programmable Computer, Auxiliary Memory, Gradient Descent, Long Short-Term Memory, Differentiable Neural Computer, Attention Mechanism.
References
2018
- (Gulcehre et al., 2018) ⇒ Caglar Gulcehre, Sarath Chandar, Kyunghyun Cho, and Yoshua Bengio. (2018). “Dynamic Neural Turing Machine with Continuous and Discrete Addressing Schemes.” In: Neural Computation, 30(4). ISSN 0899-7687. doi:10.1162/neco_a_01060
- QUOTE: We introduce a memory addressing operation that can learn to put more emphasis on the least recently used (LRU) memory (Santoro, Bartunov, Botvinick, Wierstra, & Lillicrap, 2016) locations. As Rae et al. (2016) and Santoro et al. (2016) observed, we find it easier to learn the write operations with the use of LRU addressing.
Figure 1: A graphical illustration of the proposed dynamic neural Turing machine with the recurrent-controller. The controller receives the fact as a continuous vector encoded by a recurrent neural network, computes the read and write weights for addressing the memory. If the D-NTM automatically detects that a query has been received, it returns an answer and terminates.
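The quote above describes biasing write addressing toward least recently used locations. Below is a minimal, hypothetical NumPy sketch of such a bias; the exact D-NTM formulation in Gulcehre et al. (2018) may differ, and the function name `lru_write_weights`, the `lru_gate` argument, and the `decay` constant are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def lru_write_weights(write_logits, usage, lru_gate, decay=0.9):
    """Hypothetical sketch of LRU-biased write addressing: recently written
    slots accumulate usage, and that usage penalises their logits so new
    writes favour least recently used slots. Names are illustrative."""
    w = softmax(write_logits - lru_gate * usage)   # penalise recently used slots
    usage = decay * usage + (1.0 - decay) * w      # decaying usage trace of writes
    return w, usage


# Usage example: repeated writes drift toward less recently used slots.
usage = np.zeros(6)
for step in range(3):
    logits = np.random.randn(6)
    w, usage = lru_write_weights(logits, usage, lru_gate=1.0)
```

The key design choice in this sketch is that a decaying trace of past write weights penalizes recently written slots before the softmax, redistributing write mass toward slots that have not been written to recently.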
2016
- (Santoro et al., 2016) ⇒ Adam Santoro, Sergey Bartunov, Matthew Botvinick, Daan Wierstra, and Timothy Lillicrap. (2016). “One-shot Learning with Memory-Augmented Neural Networks.” In: Proceedings of the Deep Learning Symposium (NIPS 2016). e-print arXiv:1605.06065
- QUOTE: In previous instantiations of the NTM (Graves et al., 2014), memories were addressed by both content and location. Location-based addressing was used to promote iterative steps, akin to running along a tape, as well as long-distance jumps across memory. This method was advantageous for sequence-based prediction tasks. However, this type of access is not optimal for tasks that emphasize a conjunctive coding of information independent of sequence. As such, writing to memory in our model involves the use of a newly designed access module called the Least Recently Used Access (LRUA) module.
The LRUA module is a pure content-based memory writer that writes memories to either the least used memory location or the most recently used memory location. This module emphasizes accurate encoding of relevant (i.e., recent) information, and pure content-based retrieval. New information is written into rarely-used locations, preserving recently encoded information, or it is written to the last used location, which can function as an update of the memory with newer, possibly more relevant information.
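As a concrete illustration of the write rule described in the quote above, here is a minimal NumPy sketch of LRUA-style addressing based on Santoro et al. (2016). The class name, shapes, and hyperparameters (such as the usage decay `gamma`) are illustrative, and the learned components (the key and the interpolation gate) are passed in as plain arguments rather than produced by a controller network.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class LRUAMemory:
    """Minimal sketch of Least Recently Used Access (LRUA) addressing,
    after Santoro et al. (2016). Names and shapes are illustrative."""

    def __init__(self, num_slots, slot_size, gamma=0.95, num_reads=1):
        self.memory = np.zeros((num_slots, slot_size))
        self.w_u = np.zeros(num_slots)         # usage weights
        self.w_r_prev = np.zeros(num_slots)    # previous read weights
        # Start with uniform least-used weights so the first write is not a no-op.
        self.w_lu_prev = np.ones(num_slots) / num_slots
        self.gamma = gamma                     # usage decay rate
        self.num_reads = num_reads

    def read(self, key):
        # Pure content-based read: cosine similarity followed by a softmax.
        norms = np.linalg.norm(self.memory, axis=1) * np.linalg.norm(key) + 1e-8
        w_r = softmax(self.memory @ key / norms)
        return w_r @ self.memory, w_r

    def write(self, key, gate):
        # Interpolate between the previous read weights (update the most
        # recently used slot) and the previous least-used weights (write to
        # a rarely used slot). `gate` plays the role of the learned sigmoid
        # gate in the paper.
        w_w = gate * self.w_r_prev + (1.0 - gate) * self.w_lu_prev
        self.memory += np.outer(w_w, key)

        # Decayed usage accumulates read and write weights; the least-used
        # weights put 1 on the num_reads slots with the smallest usage.
        _, w_r = self.read(key)
        self.w_u = self.gamma * self.w_u + w_r + w_w
        threshold = np.sort(self.w_u)[self.num_reads - 1]
        w_lu = (self.w_u <= threshold).astype(float)

        self.w_r_prev, self.w_lu_prev = w_r, w_lu


# Usage example: write a few random keys into an 8-slot memory.
mem = LRUAMemory(num_slots=8, slot_size=4)
for _ in range(3):
    mem.write(np.random.randn(4), gate=0.5)
```

Writes interpolate between the previous read weights (updating the most recently used slot) and the previous least-used weights (writing to a rarely used slot), which is the trade-off described in the quoted passage.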