Towards a question answering assistant for software development using a transformer-based language model

Cited by: 0
|
Authors
Vale, Liliane do Nascimento [1 ,2 ]
Maia, Marcelo de Almeida [2 ]
Affiliations
[1] Fed Univ Catalao, Inst Biotechnol Comp Sci, Catalao, GO, Brazil
[2] Univ Fed Uberlandia, Fac Comp, Uberlandia, MG, Brazil
Keywords
DOI
10.1109/BotSE52550.2021.00016
CLC Classification Number
TP31 [Computer Software];
Subject Classification Code
081202; 0835;
Abstract
Question answering platforms, such as Stack Overflow, have substantially changed how developers search for solutions to their programming problems. The crowd knowledge available on such platforms has also been used to build software development tools. Recent advances in Natural Language Processing, specifically more powerful language models, have demonstrated the ability to improve text understanding and generation. In this context, we aim to investigate the factors that influence the application of such models to understanding source-code-related data and to producing more interactive and intelligent assistants for software development. In this preliminary study, we investigate in particular whether a how-to question filter and the level of context in the question affect the results of a transformer-based question answering model. Our findings suggest that fine-tuning models on a corpus of how-to questions can positively impact the model and that more contextualized questions induce more objective answers.
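The abstract describes two levers (a how-to question filter and the amount of context in the question) applied to a transformer-based question answering model. The following Python sketch illustrates the general idea with an extractive QA pipeline; it is not the authors' implementation, and the model checkpoint, the regular-expression filter, and the example question and answer texts are illustrative assumptions.

import re
from transformers import pipeline

# Hypothetical filter: keep only questions phrased as how-to requests.
HOW_TO_PATTERN = re.compile(r"^\s*how\s+(do|can|to|should)\b", re.IGNORECASE)

def is_how_to(title: str) -> bool:
    return bool(HOW_TO_PATTERN.search(title))

# Extractive QA reader; an off-the-shelf SQuAD-fine-tuned checkpoint is assumed here.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

# The same information need phrased with little and with much context.
question_low_context = "How do I read a file?"
question_high_context = ("How do I read a text file line by line in Python 3 "
                         "without loading it all into memory?")

# An answer post standing in for retrieved Stack Overflow content.
answer_post = (
    "In Python 3 you can iterate over the file object directly: "
    "with open('data.txt') as f: followed by for line in f: reads the file "
    "line by line and never loads the whole file into memory."
)

for q in (question_low_context, question_high_context):
    if is_how_to(q):
        result = qa(question=q, context=answer_post)
        print(f"{q}\n  -> {result['answer']} (score={result['score']:.2f})")

Comparing the extracted answers and confidence scores for the low- and high-context variants mirrors, in miniature, the kind of comparison the study describes.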
Pages: 39 - 42
Page count: 4