Large Language Models as a Service: optimisation strategies via Knowledge Space reduction

Cited by: 1
Authors
Panagoulias, Dimitrios P. [1 ]
Virvou, Maria [1 ]
Tsihrintzis, George A. [1 ]
Affiliations
[1] Univ Piraeus, Dept Informat, Piraeus 18534, Greece
Keywords
AI-empowered software engineering; Large Language Models; GPT; BERT;
DOI
10.1109/COINS61597.2024.10622117
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
In Large Language Models (LLMs), "reducing the domain space" for text generation refers to limiting the range of content used to generate responses. This work explores Sequential Language Model Integration (SLMI), which mirrors how knowledge is organized and distributed across different fields of expertise. SLMI is the technique of linking multiple LLMs in a systematic manner to create LLM-chains. In this paper, we use the term "Large Language Models as a Service" for the process of choosing and connecting LLMs with other services (invoking external machine learning processes or rule-based applications) to describe a complete system with specific agency. We outline the development and evaluation of an SLMI methodology for refining response accuracy and evaluate it in the domain of dermatology using medical knowledge paths, where gains in diagnostic accuracy were observed.
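The following is a minimal, illustrative sketch of the chaining idea described in the abstract: each stage restricts the knowledge space before the next LLM is invoked, and non-LLM services (rules or external ML processes) can be inserted into the same chain. The client, model prompts, and function names are hypothetical placeholders, not the authors' implementation.

```python
# Minimal SLMI-style chain sketch: LLM stages and rule-based stages are linked
# sequentially, with each stage consuming the previous stage's output.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Stage:
    name: str
    run: Callable[[str], str]  # maps upstream output to this stage's output


def call_llm(system_prompt: str, user_input: str) -> str:
    """Placeholder for any chat-completion call (e.g. a GPT endpoint)."""
    raise NotImplementedError("Wire this to an actual LLM client.")


def make_llm_stage(name: str, system_prompt: str) -> Stage:
    # The system prompt constrains the stage to a reduced knowledge space.
    return Stage(name=name, run=lambda text: call_llm(system_prompt, text))


def make_rule_stage(name: str, rule: Callable[[str], str]) -> Stage:
    # External rule-based or classical ML services chain the same way.
    return Stage(name=name, run=rule)


def run_chain(stages: List[Stage], user_query: str) -> str:
    output = user_query
    for stage in stages:
        output = stage.run(output)
    return output


# Hypothetical dermatology-style chain: triage -> specialist -> rule-based check.
chain = [
    make_llm_stage("triage", "Classify the query into a medical specialty only."),
    make_llm_stage("dermatology", "Answer strictly within dermatology knowledge."),
    make_rule_stage("guardrail", lambda t: t),  # e.g. validate against clinical rules
]
```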
Pages: 67-70
Page count: 4
Related papers
50 records in total
  • [1] Detoxifying Large Language Models via Knowledge Editing
    Wang, Mengru
    Zhang, Ningyu
    Xu, Ziwen
    Xi, Zekun
    Deng, Shumin
    Yao, Yunzhi
    Zhang, Qishen
    Yang, Linyi
    Wang, Jindong
    Chen, Huajun
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 3093 - 3118
  • [2] A comparative analysis of knowledge injection strategies for large language models in the domain
    Cadeddu, Andrea
    Chessa, Alessandro
    De Leo, Vincenzo
    Fenu, Gianni
    Motta, Enrico
    Osborne, Francesco
    Recupero, Diego Reforgiato
    Salatino, Angelo
    Secchi, Luca
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 133
  • [3] MONITORASSISTANT: Simplifying Cloud Service Monitoring via Large Language Models
    Yu, Zhaoyang
    Ma, Minghua
    Zhang, Chaoyun
    Qin, Si
    Kang, Yu
    Bansal, Chetan
    Rajmohan, Saravan
    Dang, Yingnong
    Pei, Changhua
    Pei, Dan
    Lin, Qingwei
    Zhang, Dongmei
    COMPANION PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON THE FOUNDATIONS OF SOFTWARE ENGINEERING, FSE COMPANION 2024, 2024, : 38 - 49
  • [4] Integrating chemistry knowledge in large language models via prompt engineering
    Liu, Hongxuan
    Yin, Haoyu
    Luo, Zhiyao
    Wang, Xiaonan
    SYNTHETIC AND SYSTEMS BIOTECHNOLOGY, 2025, 10 (01) : 23 - 38
  • [5] Incorporating Molecular Knowledge in Large Language Models via Multimodal Modeling
    Yang, Zekun
    Lv, Kun
    Shu, Jian
    Li, Zheng
    Xiao, Ping
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2025,
  • [6] Difficulty aware programming knowledge tracing via large language models
    Yang, Lina
    Sun, Xinjie
    Li, Hui
    Xu, Ran
    Wei, Xuqin
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [7] Fine-Grained Task Planning for Service Robots Based on Object Ontology Knowledge via Large Language Models
    Li, Xiaodong
    Tian, Guohui
    Cui, Yongcheng
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (08): : 6872 - 6879
  • [8] Knowledge Graph-Enhanced Large Language Models via Path Selection
    Liu, Haochen
    Wang, Song
    Zhu, Yaochen
    Dong, Yushun
    Li, Jundong
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: ACL 2024, 2024, : 6311 - 6321
  • [9] Quantifying Domain Knowledge in Large Language Models
    Sayenju, Sudhashree
    Aygun, Ramazan
    Franks, Bill
    Johnston, Sereres
    Lee, George
    Choi, Hansook
    Modgil, Girish
    2023 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI, 2023, : 193 - 194
  • [10] Knowledge management in organization and the large language models
    Zelenkov, Yu. A.
    ROSSIISKII ZHURNAL MENEDZHMENTA-RUSSIAN MANAGEMENT JOURNAL, 2024, 22 (03): : 573 - 601