Multi-step Iterative Automated Domain Modeling with Large Language Models

Cited by: 0
Authors
Yang, Yujing [1 ]
Chen, Boqi [1 ]
Chen, Kua [1 ]
Mussbacher, Gunter [1 ]
Varro, Daniel [1 ,2 ]
Affiliations
[1] McGill Univ, Montreal, PQ, Canada
[2] Linkoping Univ, Linkoping, Sweden
Keywords
domain modeling; large language models; few-shot learning; prompt engineering
DOI
10.1145/3652620.3687807
Chinese Library Classification
TP39 [Applications of Computers]
Subject Classification
081203; 0835
Abstract
Domain modeling, which represents the concepts and relationships in a problem domain, is an essential part of software engineering. As large language models (LLMs) have recently exhibited remarkable ability in language understanding and generation, many approaches have been proposed to automate domain modeling with LLMs. However, these approaches usually formulate all input information to the LLM in a single step. Our previous single-step approach missed many modeling elements and advanced patterns. This paper introduces a novel framework designed to enhance fully automated domain model generation. The proposed multi-step automated domain modeling approach extracts model elements (e.g., classes, attributes, and relationships) from problem descriptions. The approach includes instructions and human knowledge in each step and uses an iterative process to identify complex patterns, repeatedly extracting the pattern from various instances and then synthesizing these extractions into a summarized overview. Furthermore, the framework incorporates a self-reflection mechanism. This mechanism assesses each generated model element, offers self-feedback for necessary modifications or removals, and integrates the domain model with the generated self-feedback. The proposed approach is assessed in experiments, comparing it with a baseline single-step approach from our earlier work. Experiments demonstrate a significant improvement over our earlier work, with a 22.71% increase in the F1-score for identifying classes, a 75.18% increase for relationships, and a 10.39% improvement for identifying the player-role pattern, with comparable performance for attributes. Our approach, dataset, and evaluation provide valuable insights for future research in automated LLM-based domain modeling.
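The pipeline described in the abstract (per-element extraction steps followed by a self-reflection pass) can be sketched roughly as follows. This is an illustrative sketch only: the `call_llm` stub, the prompt wording, and the function names are hypothetical placeholders, not the authors' actual implementation.

```python
# Illustrative sketch of a multi-step domain-modeling pipeline with
# self-reflection, loosely following the abstract. The call_llm stub and
# all prompt wording are hypothetical, not taken from the paper.

def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM API call (e.g., a chat-completion request)."""
    return ""  # replace with an actual model call


def extract_step(description: str, element_kind: str) -> str:
    """One extraction step: classes, attributes, or relationships."""
    prompt = (
        f"From the problem description below, list every {element_kind} "
        f"needed in a domain model.\n\n{description}"
    )
    return call_llm(prompt)


def self_reflect(description: str, draft_model: str) -> str:
    """Ask the model to critique its own draft, then apply the feedback."""
    feedback = call_llm(
        "Review the draft domain model below against the description. "
        "Flag elements that should be modified or removed.\n\n"
        f"Description:\n{description}\n\nDraft:\n{draft_model}"
    )
    return call_llm(
        f"Revise the draft using this feedback.\n\n"
        f"Draft:\n{draft_model}\n\nFeedback:\n{feedback}"
    )


def model_domain(description: str) -> str:
    """Run each extraction step separately, then one self-reflection pass."""
    draft = "\n".join(
        extract_step(description, kind)
        for kind in ("class", "attribute", "relationship")
    )
    return self_reflect(description, draft)
```

The point of the multi-step structure is that each prompt carries only one extraction task plus task-specific instructions, rather than packing every requirement into a single prompt.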
Pages: 587-595
Number of pages: 9
Related Papers
50 records in total
  • [1] INFORM : Information eNtropy based multi-step reasoning FOR large language Models
    Zhou, Chuyue
    You, Wangjie
    Li, Juntao
    Ye, Jing
    Chen, Kehai
    Zhang, Min
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 3565 - 3576
  • [2] MindMap: Constructing Evidence Chains for Multi-Step Reasoning in Large Language Models
    Wu, Yangyu
    Han, Xu
    Song, Wei
    Cheng, Miaomiao
    Li, Fei
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 17, 2024, : 19270 - 19278
  • [3] Multi-step Transfer Learning in Natural Language Processing for the Health Domain
    Manaka, Thokozile
    Van Zyl, Terence
    Kar, Deepak
    Wade, Alisha
    NEURAL PROCESSING LETTERS, 2024, 56 (03)
  • [4] Enhancing Domain Modeling with Pre-trained Large Language Models: An Automated Assistant for Domain Modelers
    Prokop, Dominik
    Stenchlak, Stepan
    Skoda, Petr
    Klimek, Jakub
    Necasky, Martin
    CONCEPTUAL MODELING, ER 2024, 2025, 15238 : 235 - 253
  • [5] Iterative multi-step explicit camera calibration
    Batista, J
    Araujo, H
    de Almeida, AT
    SIXTH INTERNATIONAL CONFERENCE ON COMPUTER VISION, 1998, : 709 - 714
  • [6] Towards a Mechanistic Interpretation of Multi-Step Reasoning Capabilities of Language Models
    Hou, Yifan
    Li, Jiaoda
    Fei, Yu
    Stolfo, Alessandro
    Zhou, Wangchunshu
    Zeng, Guangtao
    Bosselut, Antoine
    Sachan, Mrinmaya
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 4902 - 4919
  • [7] Emotion Recognition in Conversation with Multi-step Prompting Using Large Language Model
    Hama, Kenta
    Otsuka, Atsushi
    Ishii, Ryo
    SOCIAL COMPUTING AND SOCIAL MEDIA, PT I, SCSM 2024, 2024, 14703 : 338 - 346
  • [8] A multi-step class of iterative methods for nonlinear systems
    Soleymani, Fazlollah
    Lotfi, Taher
    Bakhtiari, Parisa
    OPTIMIZATION LETTERS, 2014, 8 (03) : 1001 - 1015
  • [9] When are Direct Multi-step and Iterative Forecasts Identical?
    McElroy, Tucker
    JOURNAL OF FORECASTING, 2015, 34 (04) : 315 - 336