QA-LSTM-CNN System
A QA-LSTM-CNN System is a QA-LSTM System that applies a CNN structure to the outputs of a biLSTM to compose question and answer representations.
- Context:
- It can solve a QA-LSTM-CNN Task by implementing a QA-LSTM-CNN Algorithm.
- Example(s):
- Counter-Examples:
- See: Deep Learning, Attention-based Mechanism, QA Dataset, Natural Language Processing, Question Answering System, Artificial Neural Network, Convolutional Neural Network, Recurrent Neural Network, Long Short-Term Memory.
References
2016
- (Tan et al., 2016) ⇒ Ming Tan, Cicero dos Santos, Bing Xiang, and Bowen Zhou. (2016). “LSTM-based Deep Learning Models for Non-factoid Answer Selection.” In: Proceedings of ICLR 2016 Workshop. eprint arXiv:1511.04108
- QUOTE: In the previous subsection, we generate the question and answer representations only by simple operations, such as max or mean pooling. In this subsection, we resort to a CNN structure built on the outputs of biLSTM, in order to give a more composite representation of questions and answers.
The structure of CNN in this work is similar to the one in (Feng et al., 2015), as shown in Figure 2. Unlike the traditional forward neural network, where each output is interactive with each input, the convolutional structure only imposes local interactions between the inputs within a filter size m.
Figure 2: QA-LSTM/CNN
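The quoted composition step can be sketched as follows: a filter bank of size m is convolved over the biLSTM output sequence, and max pooling over time yields a fixed-size representation. This is a minimal numpy illustration of that idea, not the authors' implementation; the shapes, the tanh activation, and the single output pooling are illustrative assumptions.

```python
import numpy as np

def qa_lstm_cnn_pool(H, W, b):
    """Convolve a filter bank W over biLSTM outputs H, then max-pool
    over time, as in the QA-LSTM/CNN composition of Tan et al. (2016).
    H: (T, 2d) biLSTM output sequence; W: (m, 2d, k) filter bank with
    filter size m and k output channels; b: (k,) bias. Shapes are
    illustrative assumptions."""
    T, _ = H.shape
    m, _, k = W.shape
    conv = np.empty((T - m + 1, k))
    for t in range(T - m + 1):
        # local interaction: only the m inputs inside the filter window
        window = H[t:t + m]
        conv[t] = np.tanh(np.tensordot(window, W, axes=([0, 1], [0, 1])) + b)
    # max pooling over time gives a fixed-size question/answer vector
    return conv.max(axis=0)

rng = np.random.default_rng(0)
H = rng.standard_normal((7, 4))    # T=7 timesteps, 2d=4 hidden units
W = rng.standard_normal((2, 4, 3)) # m=2, k=3 filters
b = np.zeros(3)
vec = qa_lstm_cnn_pool(H, W, b)
print(vec.shape)  # (3,)
```

The same pooled vector would be computed for both the question and the candidate answer, and the two compared (e.g., by cosine similarity) for answer selection.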