Combining bidirectional long short-term memory and self-attention mechanism for code search

Cited: 1
Authors
Cao, Ben [1 ,2 ]
Liu, Jianxun [1 ,2 ,3 ]
Affiliations
[1] Hunan Univ Sci & Technol, Sch Comp Sci & Engn, Xiangtan, Peoples R China
[2] Hunan Univ Sci & Technol, Key Lab Serv Comp & Novel Software Technol, Xiangtan, Peoples R China
[3] Hunan Univ Sci & Technol, Sch Comp Sci & Engn, Xiangtan 411201, Hunan, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
BiLSTM; code search; deep learning; self-attention mechanisms; semantic similarity; INFORMATION; NETWORKS;
DOI
10.1002/cpe.7662
CLC number
TP31 [Computer Software];
Subject classification number
081202; 0835;
Abstract
With the wide application of deep learning to code search, and especially the emergence of attention-based code search models, search accuracy has improved greatly. However, the attention mechanism captures only the pairwise attention weights between words in a code fragment; it does not consider the contextual semantic relationships among those words, which can help improve code search accuracy. To address this problem, this paper proposes a model combining bidirectional long short-term memory and a self-attention mechanism for code search (CBLSAM-CS). The model first captures the contextual semantics of each word in the code fragment with a bidirectional long short-term memory (BiLSTM) network, and then uses a self-attention mechanism to extract deep-level features of the sequence. To verify the effectiveness of the proposed model, we compared it experimentally with three baseline models, CODEnn, CARLCS-CNN, and SAN-CS, on a public dataset containing 18 million code fragments. The experimental results show that the proposed model achieves 92.24% mean reciprocal rank (MRR) and 93.55% normalized discounted cumulative gain (NDCG), outperforming all three baselines. This demonstrates that the proposed CBLSAM-CS model can effectively improve the accuracy and efficiency of code search.
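The record only sketches the pipeline (a BiLSTM for context, then self-attention over its outputs), so the following is a minimal PyTorch sketch of such an encoder under stated assumptions: the two-tower code/query pairing, the max-pooling readout, and all layer sizes are illustrative choices for this family of retrieval models, not the paper's published architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMSelfAttentionEncoder(nn.Module):
    """Sketch of a BiLSTM + self-attention sequence encoder.

    Hyperparameters (embed_dim, hidden_dim, num_heads) are illustrative
    assumptions, not values taken from the paper.
    """

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_heads=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # BiLSTM captures each token's contextual relationship with its
        # left and right neighbours, as the abstract describes.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Self-attention over the BiLSTM outputs extracts deeper,
        # pairwise features across the whole sequence.
        self.self_attn = nn.MultiheadAttention(embed_dim=2 * hidden_dim,
                                               num_heads=num_heads,
                                               batch_first=True)

    def forward(self, token_ids, pad_mask=None):
        # token_ids: (batch, seq_len); pad_mask: bool, True where padded.
        x = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)                  # (batch, seq_len, 2*hidden_dim)
        a, _ = self.self_attn(h, h, h, key_padding_mask=pad_mask)
        if pad_mask is not None:
            # Exclude padded positions from the pooled representation.
            a = a.masked_fill(pad_mask.unsqueeze(-1), float('-inf'))
        # Max-pool over time to get one fixed-size vector per sequence.
        return a.max(dim=1).values             # (batch, 2*hidden_dim)


# Assumed usage: embed code fragments and natural-language queries into a
# shared space with two encoders and rank candidates by cosine similarity.
code_enc = BiLSTMSelfAttentionEncoder(vocab_size=10000)
query_enc = BiLSTMSelfAttentionEncoder(vocab_size=10000)
code_vec = code_enc(torch.randint(1, 10000, (2, 30)))
query_vec = query_enc(torch.randint(1, 10000, (2, 10)))
score = F.cosine_similarity(code_vec, query_vec)  # higher = better match
```

In this kind of setup each candidate code fragment is ranked against the query by the cosine score, and retrieval quality is then summarized with MRR and NDCG, the metrics reported in the abstract.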
Pages: 19
Related Papers
50 items in total
  • [21] Code Defect Detection Model with Multi-layer Bi-directional Long Short Term Memory based on Self-Attention Mechanism
    Hou, Cong
    Sun, Yue
    Li, Lin
    Chen, Wei
    Xu, Xiaotian
    PROCEEDINGS OF 2023 7TH INTERNATIONAL CONFERENCE ON ELECTRONIC INFORMATION TECHNOLOGY AND COMPUTER ENGINEERING, EITCE 2023, 2023, : 1656 - 1660
  • [22] Improving Ship Fuel Consumption and Carbon Intensity Prediction Accuracy Based on a Long Short-Term Memory Model with Self-Attention Mechanism
    Wang, Zhihuan
    Lu, Tianye
    Han, Yi
    Zhang, Chunchang
    Zeng, Xiangming
    Li, Wei
    APPLIED SCIENCES-BASEL, 2024, 14 (18):
  • [23] Self-attention based bidirectional long short-term memory-convolutional neural network classifier for the prediction of ischemic and non-ischemic cardiomyopathy
    Dubey, Kavita
    Agarwal, Anant
    Lathe, Astitwa Sarthak
    Kumar, Ranjeet
    Srivastava, Vishal
    LASER PHYSICS LETTERS, 2020, 17 (09)
  • [24] Attention Mechanism-Based Bidirectional Long Short-Term Memory for Cycling Activity Recognition Using Smartphones
    Nguyen, Van Sy
    Kim, Hyunseok
    Suh, Dongjun
    IEEE ACCESS, 2023, 11 : 136206 - 136218
  • [25] A Novel Bidirectional Long Short-Term Memory Network With Weighted Attention Mechanism for Industrial Soft Sensor Development
    Zhang, Miao
    Xu, Beike
    Jie, Jing
    Hou, Beiping
    Zhou, Le
    IEEE SENSORS JOURNAL, 2024, 24 (11) : 18546 - 18555
  • [26] Web service classification based on information gain theory and bidirectional long short-term memory with attention mechanism
    Zhang, Xiangping
    Liu, Jianxun
    Cao, Buqing
    Shi, Min
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2021, 33 (13):
  • [27] Sequential Recommendation Based on Long-Term and Short-Term User Behavior with Self-attention
    Wei, Xing
    Zuo, Xianglin
    Yang, Bo
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2019, PT I, 2019, 11775 : 72 - 83
  • [28] A forecast model of short-term wind speed based on the attention mechanism and long short-term memory
    Xing, Wang
    Qi-liang, Wu
    Gui-rong, Tan
    Dai-li, Qian
    Ke, Zhou
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (15) : 45603 - 45623
  • [30] Short-term and long-term memory self-attention network for segmentation of tumours in 3D medical images
    Wen, Mingwei
    Zhou, Quan
    Tao, Bo
    Shcherbakov, Pavel
    Xu, Yang
    Zhang, Xuming
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2023, 8 (04) : 1524 - 1537