2016 LongShortTermMemoryNetworksforMachineReading
- (Cheng et al., 2016) ⇒ Jianpeng Cheng, Li Dong, and Mirella Lapata. (2016). “Long Short-Term Memory-Networks for Machine Reading.” In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP 2016).
Subject Headings: Attention Mechanism.
Notes
Cited By
Quotes
Abstract
In this paper we address the question of how to render sequence-level networks better at handling structured input. We propose a machine reading simulator which processes text incrementally from left to right and performs shallow reasoning with memory and attention. The reader extends the Long Short-Term Memory architecture with a memory network in place of a single memory cell. This enables adaptive memory usage during recurrence with neural attention, offering a way to weakly induce relations among tokens. The system is initially designed to process a single sequence but we also demonstrate how to integrate it with an encoder-decoder architecture. Experiments on language modeling, sentiment analysis, and natural language inference show that our model matches or outperforms the state of the art.
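To make the abstract's recurrence concrete, below is a minimal NumPy sketch of one LSTMN step as the paper describes it: intra-attention scores are computed over the tapes of all previous hidden states and memory cells, and the resulting weighted summaries stand in for h_{t-1} and c_{t-1} in otherwise standard LSTM gating. All weight names, shapes, and the toy driver loop are illustrative assumptions, not the authors' code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def lstmn_step(x_t, H, C, h_tilde_prev, params):
    """One LSTMN step (illustrative sketch; parameter names are assumptions).

    H, C: arrays of shape (t, d) holding the tapes h_1..h_t and c_1..c_t.
    """
    Wh, Wx, Wht, v = params["Wh"], params["Wx"], params["Wht"], params["v"]
    # Intra-attention score for each previous position i:
    # a_i = v^T tanh(W_h h_i + W_x x_t + W_h~ h~_{t-1}).
    scores = np.array([v @ np.tanh(Wh @ h_i + Wx @ x_t + Wht @ h_tilde_prev)
                       for h_i in H])
    s = softmax(scores)
    # Adaptive summaries of the hidden and memory tapes.
    h_tilde = s @ H
    c_tilde = s @ C
    # Standard LSTM gates, with the attended summaries replacing
    # the single previous hidden state and memory cell.
    z = np.concatenate([h_tilde, x_t])
    i = sigmoid(params["Wi"] @ z)
    f = sigmoid(params["Wf"] @ z)
    o = sigmoid(params["Wo"] @ z)
    c_hat = np.tanh(params["Wc"] @ z)
    c_t = f * c_tilde + i * c_hat
    h_t = o * np.tanh(c_t)
    return h_t, c_t, h_tilde

# Toy driver: hidden size d, input size n, random weights.
d, n = 4, 3
rng = np.random.default_rng(0)
shapes = {"Wh": (d, d), "Wx": (d, n), "Wht": (d, d), "v": (d,),
          "Wi": (d, d + n), "Wf": (d, d + n),
          "Wo": (d, d + n), "Wc": (d, d + n)}
params = {k: rng.standard_normal(s) * 0.1 for k, s in shapes.items()}
# Seed the tapes with a zero row so attention is defined at the first step.
H, C, h_tilde = np.zeros((1, d)), np.zeros((1, d)), np.zeros(d)
for x_t in rng.standard_normal((5, n)):
    h_t, c_t, h_tilde = lstmn_step(x_t, H, C, h_tilde, params)
    H, C = np.vstack([H, h_t]), np.vstack([C, c_t])
```

Note that the tapes grow by one row per token, so attention at step t ranges over all earlier positions; the zero-row seeding at t = 1 is one simple convention for the empty-history case.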
References
| Author | title | year |
|---|---|---|
| Jianpeng Cheng, Li Dong, Mirella Lapata | Long Short-Term Memory-Networks for Machine Reading | 2016 |