Domain Adaptive Code Completion via Language Models and Decoupled Domain Databases

Cited by: 1
Authors
Tang, Ze [1 ]
Ge, Jidong [1 ]
Liu, Shangqing [2 ]
Zhu, Tingwei [1 ]
Xu, Tongtong [3 ]
Huang, Liguo [4 ]
Luo, Bin [1 ]
Affiliations
[1] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing, Peoples R China
[2] Nanyang Technol Univ, Singapore, Singapore
[3] Huawei Software Engn Applicat Technol, Shenzhen, Peoples R China
[4] Southern Methodist Univ, Dept Comp Sci, Dallas, TX USA
Keywords
domain adaptive code completion; retrieval-augmented language model;
DOI
10.1109/ASE56229.2023.00076
Chinese Library Classification (CLC)
TP [automation technology; computer technology];
Discipline classification code
0812 ;
Abstract
Large Language Models (LLMs) have demonstrated remarkable performance in code completion. However, due to their lack of domain-specific knowledge, they may be suboptimal at completing code that requires intensive domain knowledge, for example, completing library names. Although several works have confirmed the effectiveness of fine-tuning techniques for adapting language models to code completion in specific domains, they are limited by the need to re-fine-tune the model constantly as the project iterates. To address this limitation, in this paper, we propose kNM-LM, a retrieval-augmented language model (R-LM) that integrates domain knowledge into language models without fine-tuning. Unlike previous techniques, our approach automatically adapts to different language models and domains. Specifically, it uses the in-domain code to build a retrieval-based database decoupled from the LM, and then combines the database with the LM through Bayesian inference to complete the code. Extensive experiments on intra-project and intra-scenario completion confirm that kNM-LM brings appreciable improvements over CodeGPT and UniXcoder. A deeper analysis of our tool, covering response speed, storage usage, completion of specific code types, and API invocation completion, confirms that kNM-LM delivers satisfactory performance, which makes it highly suitable for domain adaptive code completion. Furthermore, our approach requires no direct access to the language model's parameters; as a result, it can seamlessly integrate with black-box code completion models as a plugin that further enhances their performance.
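For intuition, here is a minimal sketch (in Python/NumPy) of the kNN-LM-style retrieval mechanism that kNM-LM builds on: each in-domain context is encoded into a key vector paired with the token that followed it, and at inference time the distribution recovered from the nearest stored contexts is combined with the LM's own next-token distribution. All names here (build_datastore, knn_next_token_probs, combine, encode) are illustrative assumptions rather than the paper's API, and the fixed interpolation weight in combine stands in for the per-step Bayesian combination that kNM-LM actually performs.

    # Sketch of a decoupled retrieval datastore combined with an LM's
    # next-token distribution, in the kNN-LM family that kNM-LM extends.
    import numpy as np

    def build_datastore(contexts, next_tokens, encode):
        """Pair each in-domain context's key vector with the token that
        followed it; the datastore lives outside the LM, so it can be
        rebuilt whenever the project iterates, with no fine-tuning."""
        keys = np.stack([encode(c) for c in contexts])   # (N, d) key vectors
        values = np.asarray(next_tokens)                 # (N,) token ids
        return keys, values

    def knn_next_token_probs(query, keys, values, vocab_size, k=8, temp=1.0):
        """Turn the k nearest stored contexts into a next-token
        distribution: closer neighbours get more probability mass."""
        dists = np.linalg.norm(keys - query, axis=1)     # L2 distances
        nearest = np.argsort(dists)[:k]
        weights = np.exp(-dists[nearest] / temp)
        weights /= weights.sum()
        probs = np.zeros(vocab_size)
        np.add.at(probs, values[nearest], weights)       # sum duplicate tokens
        return probs

    def combine(lm_probs, knn_probs, lam=0.25):
        """Fixed-weight interpolation, shown for simplicity; kNM-LM
        instead infers the mixing weight per step via Bayesian
        inference, so no lam has to be tuned per domain."""
        return lam * knn_probs + (1.0 - lam) * lm_probs

Because only the LM's output probabilities are needed to form lm_probs, the same combination applies unchanged to black-box completion models, which is what makes the plugin-style integration described above possible.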
Pages: 421-433
Number of pages: 13
Related papers
50 records in total
  • [41] AdaComplete: improve DL-based code completion method's domain adaptability
    Wang, Zejun
    Liu, Fang
    Hao, Yiyang
    Jin, Zhi
    AUTOMATED SOFTWARE ENGINEERING, 2023, 30 (01)
  • [42] Practicing Domain-Specific Languages: From Code to Models
    Gonnord, Laure
    Mosser, Sebastien
    21ST ACM/IEEE INTERNATIONAL CONFERENCE ON MODEL DRIVEN ENGINEERING LANGUAGES AND SYSTEMS: COMPANION PROCEEDINGS (MODELS-COMPANION '18), 2018, : 106 - 113
  • [43] Using Relationships for Matching Textual Domain Models with Existing Code
    Komondoor, Raghavan
    Bhattacharya, Indrajit
    D'Souza, Deepak
    Kale, Sachin
    2013 20TH WORKING CONFERENCE ON REVERSE ENGINEERING (WCRE), 2013, : 371 - 380
  • [44] An Optimization Approach for Matching Textual Domain Models with Existing Code
    Patil, Tejas
    Komondoor, Raghavan
    D'Souza, Deepak
    Bhattacharya, Indrajit
    32ND IEEE INTERNATIONAL CONFERENCE ON SOFTWARE MAINTENANCE AND EVOLUTION (ICSME 2016), 2016, : 134 - 144
  • [45] Domain-specific language models and lexicons for tagging
    Coden, AR
    Pakhomov, SV
    Ando, RKB
    Duffy, PH
    Chute, CG
    JOURNAL OF BIOMEDICAL INFORMATICS, 2005, 38 (06) : 422 - 430
  • [46] Unsupervised Domain Adaptation of Language Models for Reading Comprehension
    Nishida, Kosuke
    Nishida, Kyosuke
    Saito, Itsumi
    Asano, Hisako
    Tomita, Junji
    PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2020), 2020, : 5392 - 5399
  • [47] Conceptual language models for domain-specific retrieval
    Meij, Edgar
    Trieschnigg, Dolf
    de Rijke, Maarten
    Kraaij, Wessel
    INFORMATION PROCESSING & MANAGEMENT, 2010, 46 (04) : 448 - 469
  • [48] DOMAIN ROBUST, FAST, AND COMPACT NEURAL LANGUAGE MODELS
    Gerstenberger, Alexander
    Irie, Kazuki
    Golik, Pavel
    Beck, Eugen
    Ney, Hermann
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 7954 - 7958
  • [49] Domain Adaptation for Contrastive Audio-Language Models
    Deshmukh, Soham
    Singh, Rita
    Raj, Bhiksha
    INTERSPEECH 2024, 2024, : 1680 - 1684
  • [50] A Domain Specific Language for Extracting Models in Software Modernization
    Canovas Izquierdo, Javier Luis
    Garcia Molina, Jesus
    MODEL DRIVEN ARCHITECTURE - FOUNDATIONS AND APPLICATIONS, PROCEEDINGS, 2009, 5562 : 82 - 97