Automated News Summarization Task
An Automated News Summarization Task is a domain-specific automated summarization task whose input documents are news items (e.g., news articles).
- Context:
- It can be solved by a News Summarization System (that implements a news summarization algorithm); a minimal baseline sketch is given after this list.
- …
- Example(s):
- a task instance from a News Summarization Benchmark (e.g., one based on the CNN/DailyMail or XSUM dataset).
- …
- Counter-Example(s):
- …
- See: Financial Summarization.
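The task can be made concrete with a deliberately simple baseline. The sketch below is an illustrative assumption, not any particular system's implementation: it applies the common lead-3 extractive heuristic to a news article, returning the article's first three sentences as the summary. The regex sentence splitter and the sample article text are hypothetical simplifications.

```python
# Minimal, illustrative lead-3 extractive baseline for news summarization.
# The regex-based sentence splitter is a simplifying assumption, not a
# production-grade sentence tokenizer.
import re


def lead_3_summary(article_text: str, num_sentences: int = 3) -> str:
    """Return the first `num_sentences` sentences of a news article as its summary."""
    # Naive segmentation: split after sentence-final punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", article_text.strip())
    return " ".join(sentences[:num_sentences])


if __name__ == "__main__":
    # Hypothetical toy news article used only for illustration.
    article = (
        "Two years after the marathon bombings, Boston marked One Boston Day. "
        "Citizens gathered downtown to honor the victims. "
        "Organizers encouraged acts of kindness across the city. "
        "Officials said the tradition would continue in future years."
    )
    print(lead_3_summary(article))
```

Stronger news summarization systems replace this heuristic with trained extractive or abstractive models, but lead-3 remains a standard point of comparison on news benchmarks such as CNN/DailyMail.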
References
2023
- (Zhang, Ladhak et al., 2023) ⇒ Tianyi Zhang, Faisal Ladhak, Esin Durmus, Percy Liang, Kathleen McKeown, and Tatsunori B. Hashimoto. (2023). “Benchmarking Large Language Models for News Summarization.” arXiv preprint arXiv:2301.13848
2022
- (Goyal et al., 2022) ⇒ Tanya Goyal, Junyi Jessy Li, and Greg Durrett. (2022). “News Summarization and Evaluation in the Era of GPT-3.” doi:10.48550/arXiv.2209.12356
- (Liang, Bommasani et al., 2022) ⇒ Percy Liang, Rishi Bommasani, Tony Lee, ... Sang Michael Xie, Shibani Santurkar, Surya Ganguli, Tatsunori Hashimoto, Thomas Icard, Tianyi Zhang, Vishrav Chaudhary, William Wang, Xuechen Li, Yifan Mai, Yuhui Zhang, and Yuta Koreeda. (2022). “Holistic Evaluation of Language Models.” doi:10.48550/arXiv.2211.09110
- QUOTE: Problem setting. We formulate text summarization as an unstructured sequence-to-sequence problem, where a document (e.g. a CNN news article) is the input and the LM is tasked with generating a summary that resembles the reference summary (e.g. the bullet point summary provided by CNN with their article). ...
... we select the CNN/DailyMail (Hermann et al., 2015a) and XSUM (Narayan et al., 2018) datasets, which are the most well-studied datasets in the literature on summarization faithfulness. This also ensures domain coverage of news-type data. Importantly, these datasets differ along a central axis studied in summarization: XSUM is a dataset with largely abstractive reference summaries (meaning the string overlap between the document and its summary in the dataset is relatively small on average), whereas CNN/DailyMail is a dataset with largely extractive reference summaries. ...
- Scenario: CNN/DailyMail
- Input: Two years ago, the storied Boston Marathon ended in terror and altered the lives of runners ... Many bombing survivors ... celebrating "One Boston Day," which was created to recognize acts of valor and to encourage kindness among Bostonians. ...
- Reference: Citizens gather to honor victims on One Boston Day, two years after the marathon bombings.
- Fig. 13. Example of summarization. An example instance for summarization from CNN/DailyMail. ...
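To make the extractive-versus-abstractive distinction in the quoted passage concrete, the sketch below computes a simple n-gram coverage statistic: the fraction of summary n-grams that also occur in the source document. This is one common way to operationalize the quoted "string overlap" and is offered only as an illustration; it is not necessarily the exact measure used by HELM or by the cited dataset papers.

```python
# Illustrative extractiveness measure: fraction of summary n-grams that also
# appear in the source document. High coverage suggests an extractive summary
# (CNN/DailyMail-style references); low coverage suggests an abstractive one
# (XSUM-style references). This is an assumption-level sketch, not HELM's metric.
from typing import List


def ngrams(tokens: List[str], n: int) -> set:
    """Return the set of n-grams (as tuples) in a token sequence."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}


def ngram_coverage(document: str, summary: str, n: int = 2) -> float:
    """Fraction of summary n-grams that also occur in the document (0.0 to 1.0)."""
    doc_ngrams = ngrams(document.lower().split(), n)
    sum_ngrams = ngrams(summary.lower().split(), n)
    if not sum_ngrams:
        return 0.0
    return len(sum_ngrams & doc_ngrams) / len(sum_ngrams)


if __name__ == "__main__":
    # Hypothetical document and summaries, used only to show the contrast.
    document = "Citizens gathered in Boston to honor victims two years after the marathon bombings"
    extractive_summary = "Citizens gathered in Boston to honor victims"
    abstractive_summary = "Bostonians marked the second anniversary of the attack"
    print(ngram_coverage(document, extractive_summary))   # high coverage -> extractive
    print(ngram_coverage(document, abstractive_summary))  # low coverage  -> abstractive
```

Under such a measure, CNN/DailyMail-style reference summaries tend to score high (largely extractive), while XSUM-style single-sentence summaries tend to score low (largely abstractive), matching the axis described in the quoted HELM problem setting.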
2002
- (Barzilay et al., 2002) ⇒ Regina Barzilay, Noemie Elhadad, and Kathleen R. McKeown. (2002). “Inferring Strategies for Sentence Ordering in Multidocument News Summarization.” In: Journal of Artificial Intelligence Research, 17.