Multilayer self-attention residual network for code search

Cited by: 0
Authors
Hu, Haize [1 ,2 ]
Liu, Jianxun [1 ,2 ,3 ]
Zhang, Xiangping [1 ,2 ]
Affiliations
[1] Hunan Univ Sci & Technol, Sch Comp Sci & Technol, Xiangtan, Peoples R China
[2] Hunan Univ Sci & Technol, Hunan Prov Key Lab Serv Comp & Novel Software Tech, Xiangtan, Peoples R China
[3] Hunan Univ Sci & Technol, Sch Comp Sci & Technol, Xiangtan 411100, Peoples R China
Source
Keywords
code search; machine learning; residual networks; self-attention; software development; FILTER; MODEL;
DOI
10.1002/cpe.7650
CLC number
TP31 [Computer software]
Discipline codes
081202; 0835
Abstract
Software developers usually search for existing code snippets in open-source code repositories in order to modify and reuse them. How to retrieve the right code snippet from an open-source repository quickly and accurately is therefore a focus of current software development research, and code search is one of the solutions. To improve the accuracy of source-code feature representation and of code search, this paper proposes a multilayer self-attention residual network-based code search model (MSARN-CS). The MSARN-CS model considers not only the weight of each word within a code sequence unit but also the embedding interactions between code sequence units. In addition, a residual structure is introduced to compensate for the information lost from the code sequences during model training. To verify the search effectiveness of MSARN-CS, it is compared with baseline models on extensive source code data. The experimental results show that MSARN-CS achieves better search results than the baselines. For Recall@1, the result of the MSARN-CS model was 9.547, which was 100.90%, 73.87%, 60.37%, and 2.55% better than CODEnn, CRLCS, SAN-CS-, and SAN-CS, respectively. For Recall@5, the results improved by 26.67%, 36.23%, 36.21%, and 1.63%, respectively, and for Recall@10 by 13.92%, 25.70%, 20.78%, and 2.23%, respectively. For mean reciprocal rank (MRR), the results improved by 52.89%, 76.17%, 63.38%, and 3.88%, respectively, and for normalized discounted cumulative gain (NDCG) by 54.22%, 60.55%, 50.28%, and 3.30%, respectively. The proposed MSARN-CS model can effectively improve the accuracy of code search and enhance developers' programming efficiency.
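The abstract's core mechanism is a self-attention layer whose output is added back to its input through a residual connection, so information from the original code-token embeddings is preserved during training. Below is a minimal sketch of that idea (not the authors' implementation; all names and the random projections are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_residual(X, Wq, Wk, Wv):
    """One self-attention block with a residual (skip) connection.

    X          : (seq_len, d) token embeddings of one code sequence
    Wq, Wk, Wv : (d, d) query/key/value projections (random here)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(X.shape[1]))  # token-to-token weights
    return X + scores @ V                            # residual: X + attention(X)

rng = np.random.default_rng(0)
d, n = 8, 5                       # embedding size, tokens in the snippet
X = rng.normal(size=(n, d))
W = [rng.normal(scale=0.1, size=(d, d)) for _ in range(3)]
out = self_attention_residual(X, *W)
print(out.shape)  # (5, 8) — same shape as the input, so blocks can be stacked
```

Because the output shape matches the input, several such blocks can be stacked to form the "multilayer" structure the title refers to, with each residual term carrying forward the original embedding information.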
Pages: 18
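The abstract reports Recall@k and mean reciprocal rank (MRR). As a hedged illustration (not the paper's evaluation code), both can be computed from the 1-based rank of the single relevant snippet for each query:

```python
def recall_at_k(ranks, k):
    """Fraction of queries whose relevant snippet appears in the top k."""
    return sum(r <= k for r in ranks) / len(ranks)

def mrr(ranks):
    """Mean reciprocal rank of the relevant snippet over all queries."""
    return sum(1.0 / r for r in ranks) / len(ranks)

ranks = [1, 3, 2, 11, 1]          # illustrative ranks for five queries
print(recall_at_k(ranks, 1))      # 0.4
print(recall_at_k(ranks, 10))     # 0.8
print(mrr(ranks))                 # (1 + 1/3 + 1/2 + 1/11 + 1) / 5
```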
Related papers
50 records
  • [1] A mutual embedded self-attention network model for code search
    Hu, Haize
    Liu, Jianxun
    Zhang, Xiangping
    Cao, Ben
    Cheng, Siqiang
    Long, Teng
    [J]. JOURNAL OF SYSTEMS AND SOFTWARE, 2023, 198
  • [2] Self-Attention Networks for Code Search
    Fang, Sen
    Tan, You-Shuai
    Zhang, Tao
    Liu, Yepang
    [J]. INFORMATION AND SOFTWARE TECHNOLOGY, 2021, 134
  • [3] Lightweight Self-Attention Residual Network for Hyperspectral Classification
    Xia, Jinbiao
    Cui, Ying
    Li, Wenshan
    Wang, Liguo
    Wang, Chao
    [J]. IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [4] Diversifying Search Results using Self-Attention Network
    Qin, Xubo
    Dou, Zhicheng
    Wen, Ji-Rong
    [J]. CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 1265 - 1274
  • [5] An effective dual self-attention residual network for seizure prediction
    Yang, Xinwu
    Zhao, Jiaqi
    Sun, Qi
    Lu, Jianbo
    Ma, Xu
    [J]. IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2021, 29 : 1604 - 1613
  • [6] Crowd counting method based on the self-attention residual network
    Liu, Yan-Bo
    Jia, Rui-Sheng
    Liu, Qing-Ming
    Zhang, Xing-Li
    Sun, Hong-Mei
    [J]. APPLIED INTELLIGENCE, 2021, 51 (01) : 427 - 440
  • [7] GLOW : Global Weighted Self-Attention Network for Web Search
    Shan, Xuan
    Liu, Chuanjie
    Xia, Yiqian
    Chen, Qi
    Zhang, Yusi
    Ding, Kaize
    Liang, Yaobo
    Luo, Angen
    Luo, Yuxiang
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 519 - 528
  • [8] Self-attention enhanced deep residual network for spatial image steganalysis
    Xie, Guoliang
    Ren, Jinchang
    Marshall, Stephen
    Zhao, Huimin
    Li, Rui
    Chen, Rongjun
    [J]. DIGITAL SIGNAL PROCESSING, 2023, 139