Attention-based skill translation models for expert finding

Cited by: 13
Authors: Fallahnejad, Zohreh [1]; Beigy, Hamid [1]
Affiliation: [1] Sharif Univ Technol, Dept Comp Engn, Tehran, Iran
Keywords: Expert finding; Semantic matching; Translation models; StackOverflow; Framework
DOI: 10.1016/j.eswa.2021.116433
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The growing popularity of community question answering websites is reflected in their rapidly expanding user bases. Many methods have been proposed to identify talented users in these communities, but most suffer from vocabulary mismatch. Translation approaches offer a solution to this problem. The present paper proposes two translation methods, both relying on the attention mechanism, for extracting more relevant translations. The methods use multi-label classifiers that take each question as input and predict the skills related to it. Through the attention mechanism, the model focuses on the parts of the input that matter and predicts the correct labels; the ultimate goal of these networks is to predict the skills related to questions. Word attention scores reveal how relevant a single word is to a particular skill, and from these scores we obtain more relevant translations for each skill. We then use these translations to bridge the lexical gap and improve expert retrieval results. Extensive experiments on two large sub-collections of the StackOverflow dataset demonstrate that the proposed methods outperform the best baseline method by up to a 14.11% MAP improvement.
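To make the mechanism described in the abstract concrete, the sketch below shows one plausible form of an attention-based multi-label skill classifier in PyTorch. The architecture (a bi-GRU encoder with one learned attention query per skill) and all dimensions are illustrative assumptions, not the authors' exact model; it only demonstrates how per-word attention scores can be read off alongside multi-label skill predictions.

```python
import torch
import torch.nn as nn


class AttentionSkillClassifier(nn.Module):
    """Multi-label skill classifier with per-skill word attention.

    Illustrative sketch only: the bi-GRU encoder, the one-query-per-skill
    attention, and all layer sizes are assumptions, not the paper's model.
    """

    def __init__(self, vocab_size: int, embed_dim: int = 128,
                 hidden_dim: int = 128, num_skills: int = 100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.GRU(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # One learned attention query per skill, so each skill can focus
        # on the question words most indicative of it.
        self.skill_queries = nn.Parameter(torch.randn(num_skills, 2 * hidden_dim))
        self.out = nn.Linear(2 * hidden_dim, 1)

    def forward(self, token_ids: torch.Tensor):
        # token_ids: (batch, seq_len) integer word ids
        h, _ = self.encoder(self.embedding(token_ids))       # (B, T, 2H)
        scores = torch.einsum('bth,sh->bst', h, self.skill_queries)
        attn = torch.softmax(scores, dim=-1)                 # (B, S, T): word weights per skill
        context = torch.einsum('bst,bth->bsh', attn, h)      # (B, S, 2H)
        probs = torch.sigmoid(self.out(context).squeeze(-1))  # (B, S): multi-label skill probs
        return probs, attn


# Toy usage: probs drive a multi-label (binary cross-entropy) loss during
# training; attn[b, s, t] says how relevant word t of question b is to skill s.
model = AttentionSkillClassifier(vocab_size=5000)
probs, attn = model(torch.randint(1, 5000, (2, 20)))
```

Aggregating these attention weights, for instance averaging a word's weight across all questions tagged with a skill, yields a word-to-skill relevance score that could serve as a translation probability p(word | skill) for bridging the lexical gap at retrieval time; this aggregation step is likewise an assumption about how such scores are turned into translations, not a detail confirmed by the abstract.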
Pages: 11
Related papers
50 records in total
  • [1] Skill Translation Models in Expert Finding
    Nobari, Arash Dargahi
    Gharebagh, Sajad Sotudeh
    Neshati, Mahmood
    [J]. SIGIR'17: PROCEEDINGS OF THE 40TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2017, : 1057 - 1060
  • [2] Quality-aware skill translation models for expert finding on StackOverflow
    Nobari, Arash Dargahi
    Neshati, Mahmood
    Gharebagh, Sajad Sotudeh
    [J]. INFORMATION SYSTEMS, 2020, 87
  • [3] SAST: A self-attention based method for skill translation in T-shaped expert finding
    Fallahnejad, Zohreh
    Beigy, Hamid
    [J]. INFORMATION SCIENCES, 2024, 680
  • [4] Neural Machine Translation Models with Attention-Based Dropout Layer
    Israr, Huma
    Khan, Safdar Abbas
    Tahir, Muhammad Ali
    Shahzad, Muhammad Khuram
    Ahmad, Muneer
    Zain, Jasni Mohamad
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 75 (02): 2981 - 3009
  • [5] Attention-Based Models for Speech Recognition
    Chorowski, Jan
    Bahdanau, Dzmitry
    Serdyuk, Dmitriy
    Cho, Kyunghyun
    Bengio, Yoshua
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [6] Recursive Annotations for Attention-Based Neural Machine Translation
    Ye, Shaolin
    Guo, Wu
    [J]. 2017 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2017, : 164 - 167
  • [7] A Survey on Attention-Based Models for Image Captioning
    Osman, Asmaa A. E.
    Shalaby, Mohamed A. Wahby
    Soliman, Mona M.
    Elsayed, Khaled M.
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (02) : 403 - 412
  • [8] Coherent Dialogue with Attention-Based Language Models
    Mei, Hongyuan
    Bansal, Mohit
    Walter, Matthew R.
    [J]. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 3252 - 3258
  • [9] Attention-Based Spatial Guidance for Image-to-Image Translation
    Lin, Yu
    Wang, Yigong
    Li, Yifan
    Gao, Yang
    Wang, Zhuoyi
    Khan, Latifur
    [J]. 2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2021), 2021, : 816 - 825
  • [10] An Effective Coverage Approach for Attention-based Neural Machine Translation
    Hoang-Quan Nguyen
    Thuan-Minh Nguyen
    Huy-Hien Vu
    Van-Vinh Nguyen
    Phuong-Thai Nguyen
    Thi-Nga-My Dao
    Kieu-Hue Tran
    Khac-Quy Dinh
    [J]. PROCEEDINGS OF 2019 6TH NATIONAL FOUNDATION FOR SCIENCE AND TECHNOLOGY DEVELOPMENT (NAFOSTED) CONFERENCE ON INFORMATION AND COMPUTER SCIENCE (NICS), 2019, : 240 - 245