Exploiting Explicit Matching Knowledge with Long Short-Term Memory

Cited by: 0
Authors
Bao, Xinqi [1]
Wu, Yunfang [1]
Affiliation
[1] Peking Univ, Sch Elect Engn & Comp Sci, Key Lab Computat Linguist, Beijing, Peoples R China
Funding
National Natural Science Foundation of China; National High-Tech Research and Development Program of China (863 Program);
Keywords
Lexical matching knowledge; LSTM; Question answering;
DOI
10.1007/978-3-319-69005-6_26
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, neural network models have been widely applied to text-matching tasks such as community-based question answering (cQA). The strong generalization power of neural networks enables these methods to find texts on similar topics, but they often miss detailed matching information. However, as demonstrated by traditional methods, explicit lexical matching knowledge is important for effective answer retrieval. In this paper, we propose an ExMaLSTM model that incorporates explicit matching knowledge into a long short-term memory (LSTM) neural network. We extract explicit lexical matching features using prior knowledge and add them to the local representations of questions. We then summarize the overall matching status with a bi-directional LSTM. The final relevance score is computed by a gate network, which dynamically assigns appropriate weights to the explicit matching score and the implicit relevance score. We conduct extensive experiments on answer retrieval over a cQA dataset. The results show that our proposed ExMaLSTM model significantly outperforms both traditional methods and various state-of-the-art neural network models.
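The abstract does not specify the exact architecture, so the sketch below is only a rough, hypothetical PyTorch reading of it: per-token lexical matching features (here an arbitrary 4-dimensional vector) are concatenated with word embeddings, a bi-directional LSTM summarizes the question, and a sigmoid gate blends a precomputed explicit matching score with an implicit relevance score. All dimensions, the mean pooling step, and the gating formula are assumptions, not the paper's specification.

# Illustrative sketch only: hidden sizes, feature dimensions, pooling, and the
# gating formula are assumptions made for this example, not the paper's design.
import torch
import torch.nn as nn

class ExMaLSTMSketch(nn.Module):
    """Hypothetical reading of the abstract: word embeddings are concatenated
    with explicit lexical matching features, a bi-directional LSTM summarizes
    the sequence, and a gate network blends the explicit matching score with
    the implicit relevance score."""

    def __init__(self, vocab_size=10000, embed_dim=100, match_feat_dim=4, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Local representation = word embedding + explicit matching features (assumed concatenation).
        self.bilstm = nn.LSTM(embed_dim + match_feat_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.implicit_scorer = nn.Linear(2 * hidden_dim, 1)          # implicit relevance score
        self.gate = nn.Sequential(nn.Linear(2 * hidden_dim, 1), nn.Sigmoid())

    def forward(self, token_ids, match_feats, explicit_score):
        # token_ids: (batch, seq_len); match_feats: (batch, seq_len, match_feat_dim)
        # explicit_score: (batch, 1), a precomputed lexical matching score.
        x = torch.cat([self.embed(token_ids), match_feats], dim=-1)
        outputs, _ = self.bilstm(x)
        summary = outputs.mean(dim=1)          # assumed pooling over time steps
        implicit_score = self.implicit_scorer(summary)
        g = self.gate(summary)                 # dynamic weight between the two scores
        return g * explicit_score + (1.0 - g) * implicit_score

# Toy usage with random inputs.
model = ExMaLSTMSketch()
ids = torch.randint(0, 10000, (2, 12))
feats = torch.rand(2, 12, 4)
exp_score = torch.rand(2, 1)
print(model(ids, feats, exp_score).shape)      # torch.Size([2, 1])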
Pages: 306-317
Page count: 12