Sparse Access Memory Neural Network (SAM-ANN)
A Sparse Access Memory Neural Network (SAM-ANN) is a Memory-Augmented Neural Network (MANN) built on Sparse Access Memory (SAM) cells, which restrict reads from and writes to the external memory to a sparse subset of memory words.
- Example(s):
- …
- Counter-Example(s):
- a Neural Turing Machine (NTM),
- a Neural Machine Translation (NMT) Network,
- a Hierarchical Attention Network,
- a Gated Convolutional Neural Network with Segment-level Attention Mechanism (SAM-GCNN),
- a Convolutional Neural Network with Segment-level Attention Mechanism (SAM-CNN),
- a Bidirectional Recurrent Neural Network with Attention Mechanism.
- See: Artificial Neural Network, Neural Natural Language Translation, Attention Mechanism, Deep Learning Neural Network, Speech Recognition, Document Classification.
References
2016
- (Rae et al., 2016) ⇒ Jack W. Rae, Jonathan J. Hunt, Tim Harley, Ivo Danihelka, Andrew Senior, Greg Wayne, Alex Graves, and Timothy P. Lillicrap. (2016). “Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes.” In: Proceedings of Advances in Neural Information Processing Systems 29 (NIPS 2016). ISBN:978-1-5108-3881-9. e-print arXiv:1610.09027
- QUOTE: In this paper, we present a MANN named SAM (sparse access memory). By thresholding memory modifications to a sparse subset, and using efficient data structures for content-based read operations, our model is optimal in space and time with respect to memory size, while retaining end-to-end gradient based optimization.
(...)
This paper introduces Sparse Access Memory (SAM), a new neural memory architecture with two innovations. Most importantly, all writes to and reads from external memory are constrained to a sparse subset of the memory words, providing similar functionality as the NTM, while allowing computational and memory efficient operation. Secondly, we introduce a sparse memory management scheme that tracks memory usage and finds unused blocks of memory for recording new information. For a memory containing [math]\displaystyle{ N }[/math] words, SAM executes a forward, backward step in [math]\displaystyle{ \Theta(\log\; N) }[/math] time, initializes in [math]\displaystyle{ \Theta(N) }[/math] space, and consumes [math]\displaystyle{ \Theta(1) }[/math] space per time step. Under some reasonable assumptions, SAM is asymptotically optimal in time and space complexity (Supplementary A).
Figure 5: A schematic of the memory efficient backpropagation through time. Each circle represents an instance of the SAM core at a given time step. The grey box marks the dense memory. Each core holds a reference to the single instance of the memory, and this is represented by the solid connecting line above each core. We see during the forward pass, the memory’s contents are modified sparsely, represented by the solid horizontal lines. Instead of caching the changing memory state, we store only the sparse modifications — represented by the dashed white boxes. During the backward pass, we “revert” the cached modifications to restore the memory to its prior state, which is crucial for correct gradient calculations.
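The sparse addressing described in the quoted passage can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the paper retrieves the top-k candidates with an approximate nearest-neighbour index in roughly logarithmic time, whereas the brute-force cosine similarity below merely stands in for that index, and the function names and the choice of k are illustrative assumptions.
```python
import numpy as np

def sparse_read(memory, query, k=4):
    """Content-based read restricted to the k most similar memory words.

    memory: (N, W) array of memory words; query: (W,) read key.
    SAM would find the k candidates via an approximate nearest-neighbour
    index; brute-force cosine similarity is used here only for clarity.
    """
    sims = memory @ query / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(query) + 1e-8)
    top_k = np.argpartition(-sims, k)[:k]       # indices of the k best matches
    weights = np.exp(sims[top_k])
    weights /= weights.sum()                    # softmax over the sparse subset only
    return weights @ memory[top_k], top_k, weights

def sparse_write(memory, indices, write_weights, erase, add):
    """Apply an erase/add update only to the selected sparse subset of rows."""
    for i, w in zip(indices, write_weights):
        memory[i] = memory[i] * (1.0 - w * erase) + w * add
    return memory
```
Because every read and write touches only k rows rather than all N, the per-step cost is independent of the memory size, which is the property the quoted complexity claims rely on.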
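The sparse memory management scheme ("tracks memory usage and finds unused blocks of memory for recording new information") might be approximated as below. The decay rule and the function name are assumptions for illustration; the paper's actual usage statistic and its efficient minimum-finding data structure differ.
```python
import numpy as np

def update_usage_and_allocate(usage, read_weights, write_weights, decay=0.99):
    """Decay a per-word usage trace, bump the words just read or written,
    and return the index of the least-used word as the slot for new data."""
    usage = decay * usage + read_weights + write_weights
    return usage, int(np.argmin(usage))
```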
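The caching scheme in the Figure 5 caption can be sketched as follows. This is an illustrative outline, not the paper's code: only the (row index, previous contents) pairs touched by each sparse write are stored on the forward pass, and on the backward pass those rows are written back to revert the single dense memory before that step's gradients are computed. The class name `SparseMemoryTape` is a hypothetical label.
```python
import numpy as np

class SparseMemoryTape:
    """Caches only the sparsely modified rows so the dense memory can be
    reverted step by step during backpropagation through time."""

    def __init__(self, memory):
        self.memory = memory      # single dense (N, W) memory shared across steps
        self.deltas = []          # per-step list of (row index, previous contents)

    def write(self, indices, new_rows):
        # Record the previous contents of the touched rows (constant space
        # per step for a bounded number of touched rows), then apply the update.
        self.deltas.append([(i, self.memory[i].copy()) for i in indices])
        for i, row in zip(indices, new_rows):
            self.memory[i] = row

    def revert_last(self):
        # Undo the most recent sparse write, restoring the memory to the
        # state it had before that time step (used on the backward pass).
        for i, old_row in reversed(self.deltas.pop()):
            self.memory[i] = old_row
```
On the backward pass one would call revert_last() once per time step, in reverse order, before computing that step's gradients against the restored memory state.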