Attention-based QA-LSTM
An Attention-based QA-LSTM is a QA-LSTM System that includes an Attention Mechanism.
- AKA: QA-LSTM with Attention.
- Context:
- …
- Example(s):
- Counter-Examples:
- See: Deep Learning, Attention Mechanism, Natural Language Processing, Question Answering System, Artificial Neural Network, Memory-based Neural Network, Memory-Augmented Neural Network, Convolutional Neural Network, Recurrent Neural Network, Long Short-Term Memory, Deep LSTM Reader, Attentive Reader System.
References
2016
- (Tan et al., 2016) ⇒ Ming Tan, Cicero dos Santos, Bing Xiang, and Bowen Zhou. (2016). “LSTM-based Deep Learning Models for Non-factoid Answer Selection.” In: Proceedings of ICLR 2016 Workshop. eprint arXiv:1511.04108
- QUOTE: Inspired by the work in (Hermann et al., 2015), we develop a very simple but efficient word-level attention on the basic model. Figure 3 shows the structure. Prior to the average or mean pooling, each biLSTM output vector will be multiplied by a softmax weight, which is determined by the question embedding from biLSTM.
Figure 3: QA-LSTM with attention
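The quoted passage can be made concrete as follows. Writing $h_a(t)$ for the answer-side biLSTM output vector at time step $t$ and $o_q$ for the pooled question embedding, the word-level attention forms a softmax weight for each $h_a(t)$ conditioned on $o_q$ and rescales the output before pooling. A minimal sketch of the computation, following the description in the quote (the parameter names $W_{am}$, $W_{qm}$, and $w_{ms}$ follow the notation of Tan et al., 2016):

$$m_{a,q}(t) = \tanh\big(W_{am}\, h_a(t) + W_{qm}\, o_q\big)$$
$$s_{a,q}(t) \propto \exp\big(w_{ms}^{\top} m_{a,q}(t)\big)$$
$$\widetilde{h}_a(t) = h_a(t)\, s_{a,q}(t)$$

The re-weighted vectors $\widetilde{h}_a(t)$ replace $h_a(t)$ in the average/mean pooling step, after which the question and answer representations are compared as in the basic QA-LSTM. A minimal PyTorch sketch of the same computation (hypothetical illustration; the class name, tensor shapes, and use of mean pooling are assumptions, not taken from the paper):

```python
import torch
import torch.nn as nn

class WordLevelAttention(nn.Module):
    """Multiplies each answer biLSTM output vector by a softmax weight
    determined by the question embedding, prior to mean pooling."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.W_am = nn.Linear(hidden_size, hidden_size, bias=False)
        self.W_qm = nn.Linear(hidden_size, hidden_size, bias=False)
        self.w_ms = nn.Linear(hidden_size, 1, bias=False)

    def forward(self, h_a: torch.Tensor, o_q: torch.Tensor) -> torch.Tensor:
        # h_a: (batch, answer_len, hidden) -- answer biLSTM outputs
        # o_q: (batch, hidden)             -- pooled question embedding
        m = torch.tanh(self.W_am(h_a) + self.W_qm(o_q).unsqueeze(1))
        s = torch.softmax(self.w_ms(m), dim=1)   # (batch, answer_len, 1)
        h_tilde = h_a * s                        # attention-weighted outputs
        return h_tilde.mean(dim=1)               # mean pooling, as in the quote
```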