CHATDESIGN: BOOTSTRAPPING GENERATIVE FLOOR PLAN DESIGN WITH PRE-TRAINED LARGE LANGUAGE MODELS

Cited by: 0
Authors
Li, Jinmin [1 ,2 ]
Luo, Yilu [1 ]
Lu, Shuai [1 ]
Zhang, Jingyun [2 ]
Wang, Jun [2 ]
Guo, Rizen [2 ]
Wang, Shaoming [2 ]
Affiliations
[1] Tsinghua Univ, Shenzhen Int Grad Sch, Beijing, Peoples R China
[2] Tencent, Wechat Lab 33, Shenzhen, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
floor plan generation; large language models; user interactions; automatic design; deep learning; pre-trained models;
DOI
None available
Chinese Library Classification
TU [Building Science];
Discipline Code
0813;
Abstract
Large language models (LLMs) have achieved remarkable success in various domains, revolutionizing tasks such as language translation, text generation, and question-answering. However, generating floor plan designs poses a unique challenge that demands the fulfilment of intricate spatial and relational constraints. In this paper, we propose ChatDesign, an innovative approach that leverages the power of pre-trained LLMs to generate floor plan designs from natural language descriptions, while incorporating iterative modifications based on user interaction. By processing user input text through a pre-trained LLM and utilizing a decoder, we can generate regression parameters and floor plans that are precisely tailored to satisfy the specific needs of the user. Our approach incorporates an iterative refinement process, optimizing the model output by considering the input text and previous results. Throughout these interactions, we employ several strategic techniques to ensure the generated design images align precisely with the user's requirements. The proposed approach is extensively evaluated through rigorous experiments, including user studies, demonstrating its feasibility and efficacy. The empirical results consistently demonstrate the superiority of our method over existing approaches, showcasing its ability to generate floor plans that rival those created by human designers. Our code will be available at https://github.com/THUKingmin/ChatDesign.
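The abstract describes a pipeline in which a pre-trained LLM encodes the user's text, a decoder regresses floor plan parameters, and refinement steps condition on the previous result. The following is a minimal, hypothetical Python sketch of that kind of pipeline; the encoder choice, the FloorPlanDecoder head, the (x, y, w, h) room parameterization, and the refinement-by-prompt scheme are all illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch: text -> pre-trained LLM embedding -> decoder that
# regresses per-room box parameters -> iterative refinement on user feedback.
# All model names and shapes are assumptions for illustration only.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

MAX_ROOMS = 8  # assumed cap on rooms per plan


class FloorPlanDecoder(nn.Module):
    """Maps the LLM's pooled text embedding to per-room (x, y, w, h) parameters."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(hidden_size, 512),
            nn.ReLU(),
            nn.Linear(512, MAX_ROOMS * 4),
        )

    def forward(self, text_embedding: torch.Tensor) -> torch.Tensor:
        return self.head(text_embedding).view(-1, MAX_ROOMS, 4)


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # placeholder encoder
encoder = AutoModel.from_pretrained("bert-base-uncased")
decoder = FloorPlanDecoder(encoder.config.hidden_size)


def generate(prompt: str, previous: torch.Tensor | None = None) -> torch.Tensor:
    """One generation / refinement step: encode the text (plus a textual
    summary of the previous layout, if any) and regress room boxes."""
    if previous is not None:
        prompt = prompt + " Previous layout: " + str(previous.tolist())
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        pooled = encoder(**inputs).last_hidden_state[:, 0]  # [CLS] embedding
    return decoder(pooled)


# Iterative interaction loop: each user message refines the prior layout.
layout = None
for user_text in ["Three bedrooms and a south-facing living room.",
                  "Make the kitchen larger."]:
    layout = generate(user_text, previous=layout)
```

In this sketch the refinement is handled by appending a textual summary of the previous layout to the prompt, which mirrors the abstract's idea of optimizing the output by considering both the input text and earlier results.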
Pages: 99-108
Number of pages: 10