2015 Learning to Transduce with Unbounded Memory
- (Grefenstette et al., 2015) ⇒ Edward Grefenstette, Karl Moritz Hermann, Mustafa Suleyman, and Phil Blunsom. (2015). “Learning to Transduce with Unbounded Memory.” In: Proceedings of 28th International Conference on Neural Information Processing Systems - Volume 2 (NIPS 2015).
Subject Headings: Attention Mechanism.
Notes
Cited By
- http://scholar.google.com/scholar?q=%222015%22+Learning+to+Transduce+with+Unbounded+Memory
- http://dl.acm.org/citation.cfm?id=2969442.2969444&preflayout=flat#citedby
Quotes
Abstract
Recently, strong results have been demonstrated by Deep Recurrent Neural Networks on natural language transduction problems. In this paper we explore the representational power of these models using synthetic grammars designed to exhibit phenomena similar to those found in real transduction problems such as machine translation. These experiments lead us to propose new memory-based recurrent networks that implement continuously differentiable analogues of traditional data structures such as Stacks, Queues, and DeQues. We show that these architectures exhibit superior generalisation performance to Deep RNNs and are often able to learn the underlying generating algorithms in our transduction experiments.
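The "continuously differentiable analogue of a Stack" the abstract refers to can be illustrated with a minimal sketch. This is an assumption-laden toy rendering of the paper's Neural Stack idea in NumPy (class name, method names, and the plain-loop implementation are all illustrative, not the authors' code): values are stored with fractional strengths, and scalar signals `d` (push) and `u` (pop) update those strengths smoothly instead of discretely.

```python
import numpy as np

class NeuralStack:
    """Toy sketch (not the authors' implementation) of a continuous
    stack: value vectors V with per-slot strengths s in [0, 1],
    updated by scalar push (d) and pop (u) signals."""

    def __init__(self, dim):
        self.V = np.zeros((0, dim))  # stored value vectors, bottom to top
        self.s = np.zeros(0)         # per-slot strengths

    def step(self, v, d, u):
        """One timestep: pop total strength u from the top down,
        push v with strength d, then return a soft read of the top."""
        # Pop: remove strength u, consuming topmost slots first.
        new_s = np.empty(len(self.s))
        for i in range(len(self.s)):
            above = self.s[i + 1:].sum()  # strength sitting above slot i
            new_s[i] = max(0.0, self.s[i] - max(0.0, u - above))
        # Push: append v with strength d.
        self.V = np.vstack([self.V, v])
        self.s = np.append(new_s, d)
        # Read: blend values from the top until total weight 1 is used.
        r = np.zeros(self.V.shape[1])
        for i in range(len(self.s)):
            above = self.s[i + 1:].sum()
            r += min(self.s[i], max(0.0, 1.0 - above)) * self.V[i]
        return r
```

With `d = u ∈ {0, 1}` this reduces to an ordinary stack (push `v1`, push `v2`, then pop reads back `v1`); with fractional signals every operation stays differentiable, which is what lets the controller RNN learn to use it by gradient descent.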
References
| Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|
| Edward Grefenstette, Karl Moritz Hermann, Phil Blunsom, Mustafa Suleyman | | 2015 | Learning to Transduce with Unbounded Memory | | | | | 2015 Learning to Transduce with Unbounded Memory | 2015 |