Xu-Sigal Multiple Sequence Decoding Task
A Xu-Sigal Multiple Sequence Decoding Task is a Linguistic Sequence Decoding Task performed by Xu & Sigal (2020).
- Context:
- It can be solved by a Xu-Sigal Multiple Sequence Decoding System that implements a Xu-Sigal Multiple Sequence Decoding Algorithm.
- Example(s):
- the one described in Xu & Sigal (2020),
- ...
- …
- Counter-Example(s):
- See: Transformer Network, Language Model, Natural Language Processing Task, Graph Neural Network, Dense Relational Captioning Task, Self-Attention Network, Gated Recurrent Unit, Long Short-Term Memory (LSTM) Network, RNN-Based Language Model, Backpropagation Through Time, Recurrent Neural Network.
References
2020
- (Xu & Sigal, 2020) ⇒ Bicheng Xu, and Leonid Sigal (2020). "Consistent Multiple Sequence Decoding". In: arXiv:2004.00760.
- QUOTE: Sequence decoding has emerged as one of the fundamental building blocks for a large variety of computer vision problems. For example, it is a critical component in a range of visual-lingual architectures, for tasks such as image captioning (...) and question answering (...), as well as in generative models that tackle trajectory prediction or forecasting (...). Most existing methods assume a single sequence and implement neural decoding using recurrent architectures, e.g., LSTMs or GRUs; recent variants include models like BERT (...). However, in many scenarios, more than one sequence needs to be decoded at the same time. Common examples include trajectory forecasting in team sports (...) or autonomous driving (...), where multiple agents (players/cars) need to be predicted and the behavior of one agent may closely depend on the others.
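The quoted passage contrasts single-sequence recurrent decoding with the joint decoding of multiple interdependent sequences. The sketch below illustrates that general idea, not the paper's specific architecture (which uses a graph-based coupling): several agents share one GRU-style recurrent cell, and at each step every agent's input is a mean-pooled context over all agents' hidden states, a deliberately simple stand-in for the inter-agent interaction the paper models. All function and variable names here are illustrative assumptions, implemented in plain NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_cell(x, h, params):
    """One GRU step for a batch of agents: x, h have shape (n_agents, d_h)."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h @ Uz)          # update gate
    r = sigmoid(x @ Wr + h @ Ur)          # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)
    return (1 - z) * h + z * h_tilde

def init_params(d_in, d_h):
    """Random weights for the six GRU matrices (illustrative, untrained)."""
    shapes = [(d_in, d_h), (d_h, d_h)] * 3
    return tuple(rng.normal(0.0, 0.1, size=s) for s in shapes)

def decode_multi(h0, steps, params, readout_W):
    """Jointly decode all agents. At each step, every agent receives the
    mean of all current hidden states as input, so each decoded sequence
    can depend on the others (a toy form of consistent joint decoding)."""
    H = h0.copy()                          # (n_agents, d_h)
    outputs = []
    for _ in range(steps):
        context = H.mean(axis=0, keepdims=True)       # shared context
        x = np.repeat(context, H.shape[0], axis=0)    # broadcast to agents
        H = gru_cell(x, H, params)
        outputs.append(H @ readout_W)      # per-agent prediction
    return np.stack(outputs)               # (steps, n_agents, d_out)

n_agents, d_h, d_out, steps = 3, 8, 2, 5
params = init_params(d_h, d_h)
readout_W = rng.normal(0.0, 0.1, size=(d_h, d_out))
h0 = rng.normal(size=(n_agents, d_h))
traj = decode_multi(h0, steps, params, readout_W)
print(traj.shape)  # (5, 3, 2): steps x agents x output dims
```

Decoding all agents inside one loop, rather than running an independent recurrent decoder per agent, is what lets each step condition on the other agents' states; the paper replaces the mean-pooling here with a learned graph-attention coupling.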