BERT-based Parallel Sequence Decoding with Adapters System
A BERT-based Parallel Sequence Decoding with Adapters System is a Linguistic Sequence Decoding System that incorporates a pre-trained BERT System into parallel sequence decoding through lightweight adapter modules, as proposed by Guo et al. (2020).
- Context:
- It can solve a BERT-based Parallel Sequence Decoding with Adapters Task by implementing a BERT-based Parallel Sequence Decoding with Adapters Algorithm, which inserts small trainable adapter layers into frozen pre-trained BERT models serving as the encoder and decoder of a parallel decoding framework (see the adapter sketch after the See list).
- Example(s):
- …
- Counter-Example(s):
- See: Transformer Network, Language Model, Natural Language Processing System, Graph Neural Network, Dense Relational Captioning System, Self-Attention Network, Gated Recurrent Unit, Long Short-Term Memory (LSTM) Network, RNN-Based Language Model, Backpropagation Through Time, Recurrent Neural Network.
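The core technique referenced above pairs frozen pre-trained BERT models with small trainable adapter layers. Below is a minimal sketch of a bottleneck adapter of the kind used in such systems, assuming PyTorch; the class name, dimensions, and activation are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    # Bottleneck adapter: down-project, nonlinearity, up-project,
    # with a residual connection around the whole block.
    def __init__(self, d_model: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        self.act = nn.GELU()

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # The residual path preserves the frozen BERT representation;
        # only the small down/up projections receive gradient updates.
        return hidden + self.up(self.act(self.down(hidden)))

# Usage on a dummy BERT-sized activation: (batch, seq_len, hidden).
x = torch.randn(2, 10, 768)
assert Adapter()(x).shape == x.shape
```

Roughly, in Guo et al.'s framework such adapters are inserted into the layers of frozen source-side and target-side BERT models, so that only the adapter parameters are trained for the sequence-to-sequence task.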
References
2020b
- (Guo et al., 2020) ⇒ Junliang Guo, Zhirui Zhang, Linli Xu, Hao-Ran Wei, Boxing Chen, and Enhong Chen (2020). "Incorporating BERT into Parallel Sequence Decoding with Adapters". In: Advances in Neural Information Processing Systems 33 (NeurIPS 2020).
- QUOTE: Parallel sequence decoding hugely reduces the inference latency by neglecting the conditional dependency between output tokens, based on novel decoding algorithms including non-autoregressive decoding (...), insertion-based decoding (...) and Mask-Predict (...)
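To make the decoding paradigm named in the quote concrete, the following is a minimal sketch of a Mask-Predict-style parallel decoding loop (Ghazvininejad et al., 2019), assuming PyTorch. The `model(src, tgt)` interface, the `mask_id` value, and the linear masking schedule are hypothetical assumptions for illustration; the sketch shows how all target positions are predicted in parallel and the least confident tokens are iteratively re-masked.

```python
import torch

def mask_predict(model, src, tgt_len, T=10, mask_id=103):
    # Hypothetical interface: model(src, tgt) returns per-position
    # log-probabilities over the vocabulary, shape (tgt_len, vocab).
    tgt = torch.full((tgt_len,), mask_id, dtype=torch.long)  # start fully masked
    probs = torch.zeros(tgt_len)
    for t in range(T):
        masked = tgt.eq(mask_id)
        logp = model(src, tgt)
        p, tok = logp.exp().max(dim=-1)        # parallel argmax per position
        tgt = torch.where(masked, tok, tgt)    # fill only the masked slots
        probs = torch.where(masked, p, probs)  # record their confidences
        n = int(tgt_len * (T - t - 1) / T)     # linearly decaying mask count
        if n == 0:
            break                              # nothing left to re-mask
        remask = probs.topk(n, largest=False).indices
        tgt[remask] = mask_id                  # re-mask least confident tokens
    return tgt
```

Because every position is predicted in one forward pass per iteration, latency scales with the fixed number of refinement steps T rather than with the output length, which is the speedup the quote attributes to parallel sequence decoding.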