Towards Summarizing Code Snippets Using Pre-Trained Transformers

Cited by: 0
Authors
Mastropaolo, Antonio [1 ]
Tufano, Rosalia [1 ]
Ciniselli, Matteo [1 ]
Aghajani, Emad [1 ]
Pascarella, Luca [2 ]
Bavota, Gabriele [1 ]
Affiliations
[1] SEART @ Software Institute, Università della Svizzera Italiana, Lugano, Switzerland
[2] Center for Project-Based Learning, ETH Zurich, Zurich, Switzerland
Source
arXiv, 2024
DOI: not available
Related papers (50 in total; first 10 shown)
  • [1] Towards Summarizing Code Snippets Using Pre-Trained Transformers
    Mastropaolo, Antonio
    Ciniselli, Matteo
    Pascarella, Luca
    Tufano, Rosalia
    Aghajani, Emad
    Bavota, Gabriele
    Proceedings of the 32nd IEEE/ACM International Conference on Program Comprehension (ICPC 2024), 2024: 1-12
  • [2] Are Pre-trained Convolutions Better than Pre-trained Transformers?
    Tay, Yi
    Dehghani, Mostafa
    Gupta, Jai
    Aribandi, Vamsi
    Bahri, Dara
    Qin, Zhen
    Metzler, Donald
    59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021: 4349-4359
  • [3] Calibration of Pre-trained Transformers
    Desai, Shrey
    Durrett, Greg
    Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 295-302
  • [4] Pre-trained transformers: an empirical comparison
    Casola, Silvia
    Lauriola, Ivano
    Lavelli, Alberto
    Machine Learning with Applications, 2022, 9
  • [5] Emotion Recognition with Pre-Trained Transformers Using Multimodal Signals
    Vazquez-Rodriguez, Juan
    Lefebvre, Gregoire
    Cumin, Julien
    Crowley, James L.
    2022 10th International Conference on Affective Computing and Intelligent Interaction (ACII), 2022
  • [6] Towards a Comprehensive Understanding and Accurate Evaluation of Societal Biases in Pre-Trained Transformers
    Silva, Andrew
    Tambwekar, Pradyumna
    Gombolay, Matthew
    2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 2383-2389
  • [7] SCC-GPT: Source Code Classification Based on Generative Pre-Trained Transformers
    Alahmadi, Mohammad D.
    Alshangiti, Moayad
    Alsubhi, Jumana
    Mathematics, 2024, 12(13)
  • [8] Using Pre-Trained Models to Boost Code Review Automation
    Tufano, Rosalia
    Masiero, Simone
    Mastropaolo, Antonio
    Pascarella, Luca
    Poshyvanyk, Denys
    Bavota, Gabriele
    2022 ACM/IEEE 44th International Conference on Software Engineering (ICSE 2022), 2022: 2291-2302
  • [9] Predicting Terms in IS-A Relations with Pre-trained Transformers
    Nikishina, Irina
    Chernomorchenko, Polina
    Demidova, Anastasiia
    Panchenko, Alexander
    Biemann, Chris
    13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics (IJCNLP-AACL 2023), 2023: 134-148
  • [10] Efficient feature selection for pre-trained vision transformers
    Huang, Lan
    Zeng, Jia
    Yu, Mengqiang
    Ding, Weiping
    Bai, Xingyu
    Wang, Kangping
    Computer Vision and Image Understanding, 2025, 254