Neural Summarization Algorithm
A Neural Summarization Algorithm is a text summarization algorithm that leverages neural network architectures.
- Context:
- It can be implemented by a Neural Summarization System (that can solve a neural summarization task).
- It can range from being a Neural Extractive Summarization Algorithm to being a Neural Abstractive Summarization Algorithm.
- It can utilize Deep Learning techniques to process and understand complex patterns in text data.
- ...
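The extractive end of this range can be illustrated with a minimal sketch. In a real neural extractive system the sentence representations would come from a learned encoder such as BERT; here a bag-of-words vector and a document-centroid score stand in for the neural scorer, purely for illustration (all function names are hypothetical):

```python
# Minimal extractive-summarization sketch. A real neural system would embed
# sentences with a learned encoder (e.g. BERT) and score them with a trained
# classifier; a bag-of-words centroid stands in here for illustration only.
import math
from collections import Counter

def vectorize(sentence):
    """Bag-of-words count vector (a stand-in for a neural sentence embedding)."""
    return Counter(sentence.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm = math.sqrt(sum(c * c for c in u.values())) * \
           math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def extractive_summary(sentences, k=2):
    """Select the k sentences most similar to the document centroid."""
    vectors = [vectorize(s) for s in sentences]
    centroid = Counter()
    for v in vectors:
        centroid.update(v)
    ranked = sorted(range(len(sentences)),
                    key=lambda i: cosine(vectors[i], centroid),
                    reverse=True)
    # Return the top-k sentences in their original document order.
    return [sentences[i] for i in sorted(ranked[:k])]

doc = [
    "Neural summarization models compress long documents into short summaries.",
    "Neural summarization models are trained on document and summary pairs.",
    "The weather was pleasant that day.",
]
print(extractive_summary(doc, k=2))
```

An abstractive algorithm would instead generate new sentences with a decoder rather than selecting existing ones; the selection step above is what distinguishes the extractive case.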
- Example(s):
- an LLM-based Text Summarization Algorithm.
- a BERT-based Summarization Algorithm (that uses a BERT-based model) such as (Liu & Lapata, 2019).
- an RNN-based Summarization Algorithm, ...
- an LSTM-based Summarization Algorithm, ...
- a CNN-based Summarization Algorithm (that uses a CNN-based model), ...
- ...
- Counter-Example(s):
- Rule-Based Summarization Algorithm, which relies on predefined rules rather than learning from data.
- Statistical Summarization Algorithm, which uses statistical methods without deep learning architectures.
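The rule-based counter-example can be made concrete with the classic lead-k heuristic, which selects the first k sentences by a fixed rule and involves no learning at all (a hypothetical sketch for contrast):

```python
# Rule-based counter-example: the lead-k heuristic. No parameters are learned
# from data; the rule "the opening sentences are the summary" is predefined.
def lead_k_summary(sentences, k=2):
    """Return the first k sentences as the summary (a fixed, non-neural rule)."""
    return sentences[:k]

doc = [
    "Neural summarization models compress long documents into short summaries.",
    "Neural summarization models are trained on document and summary pairs.",
    "The weather was pleasant that day.",
]
print(lead_k_summary(doc, k=2))
```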
- See: Text Summarization, Deep Learning in NLP, Machine Learning Algorithm, BERT, RNN, CNN.
References
2019
- (Liu & Lapata, 2019) ⇒ Yang Liu, and Mirella Lapata. (2019). “Text Summarization with Pretrained Encoders.” In: arXiv:1908.08345 [doi:10.48550/arXiv.1908.08345]
- QUOTE: Bidirectional Encoder Representations from Transformers (BERT) represents the latest incarnation of pretrained language models, which have recently advanced a wide range of natural language processing tasks. In this paper, the authors demonstrate how BERT can be applied in text summarization using a general framework for both extractive summarization and abstractive summarization models. They introduce a novel document-level encoder based on BERT for this purpose.