Combining bidirectional long short-term memory and self-attention mechanism for code search

Times Cited: 1
Authors:
Cao, Ben [1 ,2 ]
Liu, Jianxun [1 ,2 ,3 ]
Affiliations:
[1] Hunan Univ Sci & Technol, Sch Comp Sci & Engn, Xiangtan, Peoples R China
[2] Hunan Univ Sci & Technol, Key Lab Serv Comp & Novel Software Technol, Xiangtan, Peoples R China
[3] Hunan Univ Sci & Technol, Sch Comp Sci & Engn, Xiangtan 411201, Hunan, Peoples R China
Source: Concurrency and Computation: Practice and Experience
Funding: National Natural Science Foundation of China
Keywords: BiLSTM; code search; deep learning; self-attention mechanisms; semantic similarity; INFORMATION; NETWORKS
DOI: 10.1002/cpe.7662
CLC Classification: TP31 [Computer Software]
Discipline Codes: 081202; 0835
Abstract
With the wide application of deep learning to code search, and especially of attention-based code search models, search accuracy has improved greatly. However, the attention mechanism captures only the pairwise attention weights between words in a code fragment; it does not consider the contextual semantic relationships among those words, which can help improve the accuracy of code search. To address this problem, this paper proposes a model that combines bidirectional long short-term memory and self-attention mechanisms for code search (CBLSAM-CS). The model first captures the contextual semantic relationship of each word in the code fragment with a bidirectional long short-term memory (BiLSTM) network, and then uses a self-attention mechanism to extract deep-level features of the sequence. To verify the effectiveness of the proposed model, it is compared experimentally with three baseline models, CODEnn, CARLCS-CNN, and SAN-CS, on a public dataset containing 18 million code fragments. The experimental results show that the proposed model achieves a mean reciprocal rank of 92.24% and a normalized discounted cumulative gain of 93.55%, outperforming all three baselines. This shows that the proposed CBLSAM-CS model can effectively improve the accuracy and efficiency of code search.
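To make the described pipeline concrete, below is a minimal PyTorch sketch of such an encoder. The layer sizes, the max-pooling step, and the names BiLSTMSelfAttentionEncoder and score are illustrative assumptions, not details taken from the paper: a BiLSTM first contextualizes every token, then scaled dot-product self-attention relates each position to all others before pooling to a single vector.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMSelfAttentionEncoder(nn.Module):
    # Hypothetical encoder: a BiLSTM captures each word's left-and-right
    # context, then self-attention extracts deeper sequence-level features.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        d = 2 * hidden_dim                        # forward + backward states
        self.q_proj = nn.Linear(d, d)
        self.k_proj = nn.Linear(d, d)
        self.v_proj = nn.Linear(d, d)

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        h, _ = self.bilstm(self.embed(tokens))    # (batch, seq, 2*hidden)
        q, k, v = self.q_proj(h), self.k_proj(h), self.v_proj(h)
        scores = q @ k.transpose(1, 2) / (k.size(-1) ** 0.5)
        weights = F.softmax(scores, dim=-1)       # pairwise attention weights
        ctx = weights @ v                         # contextualized features
        return ctx.max(dim=1).values              # max-pool into one vector

# Code fragments and queries would share this embedding space; ranking
# then uses cosine similarity between the two vectors.
def score(code_vec, query_vec):
    return F.cosine_similarity(code_vec, query_vec, dim=-1)

# Example: embed a batch of two token-id sequences of length five.
enc = BiLSTMSelfAttentionEncoder(vocab_size=10000)
vecs = enc(torch.randint(1, 10000, (2, 5)))       # -> shape (2, 256)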
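For reference, the two reported metrics can be computed as follows in the common code-search setting where each query has exactly one ground-truth snippet. This is a minimal sketch; the function names and the example ranks are hypothetical.

import math

def mean_reciprocal_rank(ranks):
    # ranks: 1-based rank of the ground-truth snippet for each query.
    return sum(1.0 / r for r in ranks) / len(ranks)

def ndcg_at_k(ranks, k=10):
    # With exactly one relevant snippet per query, DCG = 1 / log2(rank + 1)
    # and the ideal DCG is 1, so no further normalization is needed.
    return sum(1.0 / math.log2(r + 1) if r <= k else 0.0
               for r in ranks) / len(ranks)

ranks = [1, 2, 1, 4]    # hypothetical ranks from four example queries
print(f"MRR  = {mean_reciprocal_rank(ranks):.4f}")   # 0.6875
print(f"NDCG = {ndcg_at_k(ranks):.4f}")              # 0.7654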
Pages: 19
Related Papers (50 in total)
  • [1] Wang, Xiaojia; Gong, Wenqing; Zhu, Keyu; Yao, Lushi; Zhang, Shanshan; Xu, Weiqun; Guan, Yuxiang. Sequential Prediction of Glycosylated Hemoglobin Based on Long Short-Term Memory with Self-Attention Mechanism. International Journal of Computational Intelligence Systems, 2020, 13(1): 1578-1589.
  • [2] Dai, Zhi-Qiang; Li, Jie; Cao, Yang-Jie; Zhang, Yong-Xiang. SALSTM: segmented self-attention long short-term memory for long-term forecasting. Journal of Supercomputing, 2025, 81(1).
  • [3] Yang, Yongjie; Tu, Shanshan; Ali, Raja Hashim; Alasmary, Hisham; Waqas, Muhammad; Amjad, Muhammad Nouman. Intrusion Detection Based on Bidirectional Long Short-Term Memory with Attention Mechanism. CMC-Computers Materials & Continua, 2023, 74(1): 801-815.
  • [4] Yang, Wenlu; Zhang, Zhanqiang; Meng, Keqilao; Wang, Kuo; Wang, Rui. CEEMDAN-RIME-Bidirectional Long Short-Term Memory Short-Term Wind Speed Prediction for Wind Farms Incorporating Multi-Head Self-Attention Mechanism. Applied Sciences-Basel, 2024, 14(18).
  • [5] Liu, Qi; Wang, Jiaxing; Dai, Hualin; Ning, Liyuan; Nie, Peng. Bridge Structural Damage Identification Based on Parallel Multi-head Self-attention Mechanism and Bidirectional Long and Short-term Memory Network. Arabian Journal for Science and Engineering, 2024.
  • [6] Wang, Deguang; Liang, Ye; Ma, Hengrui; Xu, Fengqiang. Refined Answer Selection Method with Attentive Bidirectional Long Short-Term Memory Network and Self-Attention Mechanism for Intelligent Medical Service Robot. Applied Sciences-Basel, 2023, 13(5).
  • [7] Yan, Hua; Liu, Ming; Yang, Bin; Yang, Yang; Ni, Hu; Wang, Haoyu; Wang, Ying. Prediction of oil production based on multivariate analysis and self-attention mechanism integrated with long short-term memory. Petroleum Science and Technology, 2024.
  • [8] Wu, Jiajun; Su, Chun; Zhang, Yuru. Remaining useful life prediction based on double self-attention mechanism and long short-term memory network. Xi Tong Gong Cheng Yu Dian Zi Ji Shu/Systems Engineering and Electronics, 2024, 46(6): 1986-1994.
  • [9] Qin, Chaoyong; Qin, Dongling; Jiang, Qiuxian; Zhu, Bangzhu. Forecasting carbon price with attention mechanism and bidirectional long short-term memory network. Energy, 2024, 299.