2020 LongformerTheLongDocumentTransformer
- (Beltagy et al., 2020) ⇒ Iz Beltagy, Matthew E. Peters, and Arman Cohan. (2020). “Longformer: The Long-Document Transformer.” In: arXiv preprint arXiv:2004.05150. doi:10.48550/arXiv.2004.05150
Subject Headings: Longformer Model.
Notes
Cited By
2022
- (Lin et al., 2022) ⇒ Tianyang Lin, Yuxin Wang, Xiangyang Liu, and Xipeng Qiu. (2022). “A Survey of Transformers.” AI Open.
- … Longformer (Beltagy et al., 2020) uses a combination of band attention and internal global-node attention. The global nodes are chosen to be the [CLS] token for classification and all question tokens for question-answering tasks. They also replace some of the band-attention heads in upper layers with dilated window attention to increase the receptive field without increasing computation.
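The attention pattern the survey describes composes three simple masks. A minimal NumPy sketch of that composition follows; the function name, parameters, and the example values (window size, global index) are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

def longformer_attention_mask(seq_len, window, dilation=1, global_idx=()):
    """Boolean mask where mask[i, j] = True means query i may attend to key j.

    Composes the three patterns named in the survey quote above:
    banded (sliding-window) attention, optional dilation, and
    symmetric global attention for a few designated tokens.
    """
    mask = np.zeros((seq_len, seq_len), dtype=bool)

    # Band attention: each token attends to `window` positions on each
    # side, stepping by `dilation` (dilation=1 is a plain sliding window;
    # dilation>1 widens the receptive field at the same cost).
    for i in range(seq_len):
        for k in range(-window, window + 1):
            j = i + k * dilation
            if 0 <= j < seq_len:
                mask[i, j] = True

    # Global-node attention: chosen tokens (e.g. [CLS], question tokens)
    # attend to every position, and every position attends back to them.
    for g in global_idx:
        mask[g, :] = True
        mask[:, g] = True
    return mask

# Example: a 16-token sequence, window of 2, with a global [CLS] at position 0.
m = longformer_attention_mask(16, window=2, global_idx=(0,))
```

Because only O(window) keys per query are unmasked (plus a constant number of global tokens), the number of attended pairs grows linearly in sequence length rather than quadratically.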
Quotes
Abstract
Transformer-based models are unable to process long sequences due to their self-attention operation, which scales quadratically with the sequence length. To address this limitation, we introduce the Longformer with an attention mechanism that scales linearly with sequence length, making it easy to process documents of thousands of tokens or longer. Longformer's attention mechanism is a drop-in replacement for the standard self-attention and combines a local windowed attention with a task motivated global attention. Following prior work on long-sequence transformers, we evaluate Longformer on character-level language modeling and achieve state-of-the-art results on text8 and enwik8. In contrast to most prior work, we also pretrain Longformer and finetune it on a variety of downstream tasks. Our pretrained Longformer consistently outperforms RoBERTa on long document tasks and sets new state-of-the-art results on WikiHop and TriviaQA. We finally introduce the Longformer-Encoder-Decoder (LED), a Longformer variant for supporting long document generative sequence-to-sequence tasks, and demonstrate its effectiveness on the arXiv summarization dataset.
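The “task motivated global attention” from the abstract is something the user specifies per input. A minimal usage sketch, assuming the Hugging Face `transformers` port of Longformer (the checkpoint name and `global_attention_mask` argument are that library's API, not part of the paper itself):

```python
import torch
from transformers import LongformerModel, LongformerTokenizerFast

# Pretrained Longformer as released via Hugging Face (assumed checkpoint name).
tok = LongformerTokenizerFast.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

inputs = tok("A very long document ...", return_tensors="pt")

# Local windowed attention is applied everywhere by default; mark the
# task-motivated global tokens (here just [CLS] at position 0) with 1s.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

out = model(**inputs, global_attention_mask=global_attention_mask)
print(out.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

For QA-style tasks one would instead set the mask to 1 over all question tokens, matching the global-node choice described in the Lin et al. (2022) quote above. The LED variant mentioned in the abstract follows the same pattern for its encoder; in the same library it is exposed, as far as we know, as `LEDForConditionalGeneration` with the `allenai/led-base-16384` checkpoint.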
References
| Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|
| Iz Beltagy; Matthew E. Peters; Arman Cohan | | 2020 | Longformer: The Long-Document Transformer | | arXiv preprint arXiv:2004.05150 | | 10.48550/arXiv.2004.05150 | | 2020 |