End-to-end log statement generation at block-level

Cited by: 0
Authors
Fu, Ying [1 ]
Yan, Meng [1 ]
He, Pinjia [2 ]
Liu, Chao [1 ]
Zhang, Xiaohong [1 ]
Yang, Dan [3 ]
Affiliations
[1] Chongqing Univ, Sch Big Data & Software Engn, Chongqing, Peoples R China
[2] Chinese Univ Hong Kong CUHK Shenzhen, Sch Data Sci, Shenzhen, Peoples R China
[3] Southwest Jiaotong Univ, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Log statement; End-to-end; Block-level; Deep learning;
DOI
10.1016/j.jss.2024.112146
CLC Number (Chinese Library Classification)
TP31 [Computer Software];
Discipline Code
081202; 0835;
Abstract
Logging is crucial in software development for addressing runtime issues but can pose challenges. Logging encompasses four essential sub-tasks: whether to log (Whether), where to log (Position), which log level to use (Level), and what information to log (Message). While existing approaches have performed well, they suffer from two limitations. First, they address only a subset of the logging sub-tasks. Second, most of them focus on generating single log statements at the class or method level, potentially overlooking multiple log statements within those scopes. To address these issues, we propose ELogger, which enables end-to-end log statement generation at the block level: it covers all four sub-tasks and can handle multiple log statements within different code blocks of a method. Evaluation results indicate that ELogger correctly predicts all four sub-tasks in 19.55% of cases. Compared to baselines that combine existing approaches for end-to-end log statement generation, ELogger demonstrates a significant improvement, with a 50.85% to 78.21% average increase. Additionally, ELogger correctly predicts whether to log in 71.68% of cases, two sub-tasks (Whether and Position) in 58.29% of cases, and three sub-tasks (Whether, Position, and Level) in 41.97% of cases, all of which outperform the baselines.
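For illustration only (this sketch is not taken from the paper, and the class and method names are hypothetical), the Java snippet below shows what the four sub-tasks mean in practice at block level: for each code block of a method, a generator must decide whether a log statement is needed (Whether), where inside the block to place it (Position), which severity to assign (Level), and what text and variables to include (Message).

    import java.util.logging.Level;
    import java.util.logging.Logger;

    // Hypothetical example class; it only illustrates block-level logging decisions.
    public class OrderProcessor {
        private static final Logger LOG = Logger.getLogger(OrderProcessor.class.getName());

        public void process(String orderId) {
            if (orderId == null || orderId.isEmpty()) {
                // Block 1 -- Whether: log; Position: start of the guard block;
                // Level: WARNING; Message: describes the invalid input and its value.
                LOG.log(Level.WARNING, "Invalid order id: {0}", orderId);
                return;
            }
            try {
                // Block 2 -- Whether: no log statement is generated for this block.
                submit(orderId);
            } catch (RuntimeException e) {
                // Block 3 -- Whether: log; Position: inside the catch block;
                // Level: SEVERE; Message: records the failed order and the exception.
                LOG.log(Level.SEVERE, "Failed to submit order " + orderId, e);
            }
        }

        private void submit(String orderId) {
            // Placeholder for the actual submission logic.
        }
    }

A method-level approach that produces a single statement per method would miss one of the two logging points above; handling such cases is the gap that block-level generation targets.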
Pages: 13
Related Papers
50 records in total
  • [1] Block-level dependency syntax based model for end-to-end aspect-based sentiment analysis
    Xiang, Yan
    Zhang, Jiqun
    Guo, Junjun
    NEURAL NETWORKS, 2023, 166 : 225 - 235
  • [2] End-to-end learned block-based image compression with block-level masked convolutions and asymptotic closed-loop training
    Electrical and Electronics Engineering, Middle East Technical University, Ankara 06800, Turkey
    MULTIMEDIA TOOLS AND APPLICATIONS
  • [3] Log-space polynomial end-to-end communication
    Kushilevitz, E
    Ostrovsky, R
    Rosen, A
    SIAM JOURNAL ON COMPUTING, 1998, 27 (06) : 1531 - 1549
  • [4] An End-to-end Log Management Framework for Distributed Systems
    He, Pinjia
    2017 IEEE 36TH INTERNATIONAL SYMPOSIUM ON RELIABLE DISTRIBUTED SYSTEMS (SRDS), 2017, : 266 - 267
  • [5] OneLog: towards end-to-end software log anomaly detection
    Hashemi, Shayan
    Mantyla, Mika
    AUTOMATED SOFTWARE ENGINEERING, 2024, 31 (02)
  • [6] An End-to-End Generative Architecture for Paraphrase Generation
    Yang, Qian
    Huo, Zhouyuan
    Shen, Dinghan
    Chen, Yong
    Wang, Wenlin
    Wang, Guoyin
    Carin, Lawrence
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3132 - 3142
  • [7] End-to-end Argument Generation System in Debating
    Sato, Misa
    Yanai, Kohsuke
    Yanase, Toshihiko
    Miyoshi, Toshinori
    Iwayama, Makoto
    Sun, Qinghua
    Niwa, Yoshiki
    PROCEEDINGS OF THE 53RD ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 7TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2015): SYSTEM DEMONSTRATIONS, 2015, : 109 - 114
  • [8] An end-to-end model for Chinese calligraphy generation
    Peichi Zhou
    Zipeng Zhao
    Kang Zhang
    Chen Li
    Changbo Wang
    Multimedia Tools and Applications, 2021, 80 : 6737 - 6754
  • [9] An end-to-end model for Chinese calligraphy generation
    Zhou, Peichi
    Zhao, Zipeng
    Zhang, Kang
    Li, Chen
    Wang, Changbo
    MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80 (05) : 6737 - 6754
  • [10] End-to-End Differentiable GANs for Text Generation
    Kumar, Sachin
    Tsvetkov, Yulia
    NEURIPS WORKSHOPS, 2020, 2020, 137 : 118 - 128