2016 AbstractiveTextSummarizationUsi

Subject Headings: Text Summarization; Natural Language Generation Task.

Notes

Pre-print(s) and Other Link(s):

(arXiv preprint) https://arxiv.org/abs/1602.06023

Cited By

Quotes

Abstract

In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora. We propose several novel models that address critical problems in summarization that are not adequately modeled by the basic architecture, such as modeling key-words, capturing the hierarchy of sentence-to-word structure, and emitting words that are rare or unseen at training time. Our work shows that many of our proposed models contribute to further improvement in performance. We also propose a new dataset consisting of multi-sentence summaries, and establish performance benchmarks for further research.

References

BibTeX

@inproceedings{2016_AbstractiveTextSummarizationUsi,
  author    = {Ramesh Nallapati and
               Bowen Zhou and
               Cicero Nogueira dos Santos and
               Caglar Gulcehre and
               Bing Xiang},
  editor    = {Yoav Goldberg and
               Stefan Riezler},
  title     = {Abstractive Text Summarization using Sequence-to-sequence RNNs and
               Beyond},
  booktitle = {Proceedings of the 20th SIGNLL Conference on Computational Natural
               Language Learning (CoNLL 2016), Berlin, Germany, August 11-12, 2016},
  pages     = {280--290},
  publisher = {ACL},
  year      = {2016},
  url       = {https://doi.org/10.18653/v1/k16-1028},
  doi       = {10.18653/v1/k16-1028},
}


Author: Caglar Gulcehre, Cicero Nogueira dos Santos, Bing Xiang, Bowen Zhou, Ramesh Nallapati
Title: Abstractive Text Summarization Using Sequence-to-sequence RNNs and Beyond
Year: 2016