ALSI-Transformer: Transformer-Based Code Comment Generation With Aligned Lexical and Syntactic Information

Cited by: 1
Authors
Park, Youngmi [1 ]
Park, Ahjeong [1 ]
Kim, Chulyun [1 ]
Affiliation
[1] Sookmyung Womens Univ, Dept Informat Technol Engn, Seoul 04310, South Korea
Keywords
Codes; Source coding; Syntactics; Data mining; Transformers; Machine translation; Logic gates; Program comprehension; comment generation; natural language processing; deep learning
DOI
10.1109/ACCESS.2023.3268638
CLC number
TP [Automation technology, computer technology]
Subject classification number
0812
Abstract
Code comments explain the operational process of a computer program and increase the long-term productivity of programming tasks such as debugging and maintenance. Therefore, methods that automatically generate natural language comments from programming code are needed. With the development of deep learning, various successful models from the natural language processing domain have been applied to comment generation, and recent studies have improved performance by simultaneously using the lexical information of code tokens and the syntactic information obtained from the syntax tree. In this paper, to improve the accuracy of automatic comment generation, we introduce a novel syntactic sequence, the Code-Aligned Type sequence (CAT), which aligns the order and length of lexical and syntactic information, and we propose a new neural network model, the Aligned Lexical and Syntactic information Transformer (ALSI-Transformer), a transformer that encodes the aligned multi-modal information with convolution and embedding aggregation layers. Through in-depth experiments, we compare ALSI-Transformer with current baseline methods using standard machine translation metrics and demonstrate that the proposed method achieves state-of-the-art performance in code comment generation.
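The core alignment idea from the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the example tokens, type names, and the sum-based aggregation are assumptions for illustration (the paper describes convolution and embedding aggregation layers; a plain positional sum is the simplest such aggregation). The key property shown is that the CAT sequence has the same order and length as the token sequence, so the two embeddings can be combined position by position before the transformer encoder.

```python
import numpy as np

# Hypothetical example: each lexical token is paired with the syntax-tree
# node type it belongs to, so both sequences share order and length (CAT).
tokens = ["def", "add", "(", "a", ",", "b", ")", ":"]
cat = ["FunctionDef", "Name", "Punct", "Name", "Punct", "Name", "Punct", "Punct"]
assert len(tokens) == len(cat)  # the alignment property CAT guarantees

rng = np.random.default_rng(0)
d_model = 16
tok_vocab = {t: i for i, t in enumerate(sorted(set(tokens)))}
typ_vocab = {t: i for i, t in enumerate(sorted(set(cat)))}
tok_emb = rng.normal(size=(len(tok_vocab), d_model))  # lexical embeddings
typ_emb = rng.normal(size=(len(typ_vocab), d_model))  # syntactic embeddings

# Embedding aggregation: because the modalities are aligned position by
# position, their embeddings can simply be summed at each position
# (one plausible aggregation; the paper's layers may differ).
x = np.stack([tok_emb[tok_vocab[t]] + typ_emb[typ_vocab[c]]
              for t, c in zip(tokens, cat)])
print(x.shape)  # (8, 16) - one fused vector per aligned position
```

Without the alignment step, the raw syntax-tree traversal would be longer than the token sequence, and this positionwise fusion would not be possible.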
Pages: 39037-39047
Number of pages: 11