Interactive Design by Integrating a Large Pre-Trained Language Model and Building Information Modeling

Cited by: 0
Authors
Jang, Suhyung [1 ]
Lee, Ghang [2 ]
Affiliations
[1] Yonsei University, Department of Architecture and Architectural Engineering, Building Informatics Group, Seoul, South Korea
[2] Yonsei University, Department of Architecture and Architectural Engineering, Seoul, South Korea
Keywords: not listed
DOI: not available
Chinese Library Classification: TP39 (computer applications)
Discipline codes: 081203; 0835
Abstract
This study explores the potential of generative artificial intelligence (AI) models, specifically OpenAI's generative pre-trained transformer (GPT) series, integrated with building information modeling (BIM) tools as an interactive design assistant for architectural design. The research involves the development and implementation of three key components: (1) BIM2XML, a component that translates BIM data into extensible markup language (XML) format; (2) natural language-based architectural detailing through interaction with AI (NADIA), a component that refines the input design in XML by identifying designer intent, relevant objects, and their attributes, using pre-trained language models; and (3) XML2BIM, a component that loads the AI-generated XML data back into the BIM tool. The study validated the proposed approach through a case study on design detailing using the GPT series and Revit. The findings demonstrate the effectiveness of state-of-the-art language models in facilitating dynamic collaboration between architects and AI systems, and highlight the potential for further advancements.
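The abstract describes a round trip in which BIM data is serialized to XML, refined by a language model, and written back to the BIM tool. The sketch below illustrates the middle (NADIA-like) step only, assuming the `openai` Python package; the function name `nadia_refine`, the prompt wording, and the toy XML element are illustrative assumptions, not the paper's actual implementation or prompts, and the Revit-side BIM2XML/XML2BIM conversion is not reproduced here.

```python
# Minimal sketch of the XML -> language model -> XML refinement step
# described in the abstract. All names and prompts are hypothetical.
import xml.etree.ElementTree as ET
from openai import OpenAI  # assumes the openai package and an API key are configured

client = OpenAI()

def nadia_refine(design_xml: str, instruction: str, model: str = "gpt-4") -> str:
    """Send a BIM design serialized as XML to the model and return refined XML."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system",
             "content": ("You refine architectural designs expressed as XML. "
                         "Identify the designer's intent, the relevant objects "
                         "and their attributes, and return the updated XML only.")},
            {"role": "user",
             "content": f"Instruction: {instruction}\n\nDesign:\n{design_xml}"},
        ],
    )
    refined = response.choices[0].message.content
    ET.fromstring(refined)  # fail fast on malformed XML before any XML2BIM import
    return refined

if __name__ == "__main__":
    # Toy wall element standing in for real BIM2XML output from Revit.
    design = '<Wall id="w1"><Attr name="Height" value="3000"/></Wall>'
    print(nadia_refine(design, "Raise the wall height to 3600 mm."))
```

Validating the returned XML before re-importing it is one plausible safeguard, since a BIM tool would reject malformed markup; how the paper handles such failures is not stated in the abstract.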
Pages: 291-299 (9 pages)