A multi-source transfer learning model based on LSTM and domain adaptation for building energy prediction

Cited: 41
Authors
Lu, Huiming [1 ]
Wu, Jiazheng [1 ]
Ruan, Yingjun [1 ]
Qian, Fanyue [1 ]
Meng, Hua [1 ]
Gao, Yuan [2 ]
Xu, Tingting [1 ]
Affiliations
[1] Tongji Univ, Coll Mech & Energy Engn, Shanghai 200092, Peoples R China
[2] Univ Tokyo, Grad Sch Engn, Dept Architecture, Tokyo, Japan
Keywords
Building energy prediction; Transfer learning; Multi-source; Energy consumption; Domain adaptation; Load
DOI
10.1016/j.ijepes.2023.109024
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
Transfer learning can use knowledge learned from the operating data of other buildings to facilitate energy prediction for a target building. However, most current research focuses on transfer from a single source building of the same type or from the same region. A single source domain can produce domain shift, because its distribution is difficult to align with that of the target domain. To address this problem, this paper proposes a novel multi-source transfer learning energy prediction model based on long short-term memory (LSTM) networks and multi-kernel maximum mean discrepancy (MK-MMD) domain adaptation. The model was used for short-term energy prediction of different types of buildings lacking historical data, and dynamic time warping (DTW) was used to select the source domains. Multiple multi-source models and their corresponding single-source models were compared on a collection of buildings in the Higashida area of Fukuoka Prefecture, Japan. On these experimental datasets, DTW measured the similarity between building energy consumption datasets relatively accurately, and the multi-source transfer learning models achieved better average prediction performance than the single-source models, improving prediction accuracy by 6.88-15.37% in terms of mean absolute percentage error (MAPE).
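For readers who want a concrete picture of the MK-MMD adaptation term named in the abstract, the sketch below shows a generic multi-kernel (Gaussian) MMD loss in PyTorch. It illustrates the standard technique only, not the authors' implementation; the function names and the bandwidth set `sigmas` are assumptions.

```python
# A minimal sketch of a multi-kernel MMD loss: a sum of Gaussian kernels over
# several bandwidths. Generic illustration, not the authors' code; all names
# and the bandwidths in `sigmas` are illustrative assumptions.
import torch

def multi_kernel_matrix(x, y, sigmas):
    """Sum of Gaussian kernels k(x, y) evaluated at several bandwidths."""
    sq_dist = torch.cdist(x, y, p=2) ** 2          # pairwise squared distances, (n_x, n_y)
    k = torch.zeros_like(sq_dist)
    for sigma in sigmas:
        k = k + torch.exp(-sq_dist / (2.0 * sigma ** 2))
    return k

def mk_mmd(source_feats, target_feats, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Biased empirical MK-MMD^2 between batches of source and target features."""
    k_ss = multi_kernel_matrix(source_feats, source_feats, sigmas).mean()
    k_tt = multi_kernel_matrix(target_feats, target_feats, sigmas).mean()
    k_st = multi_kernel_matrix(source_feats, target_feats, sigmas).mean()
    return k_ss + k_tt - 2.0 * k_st

# Typical use during training: add the adaptation term to the regression loss,
# e.g. loss = mse(pred, y) + lam * mk_mmd(lstm_feats_src, lstm_feats_tgt).
```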
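The abstract also uses DTW to choose source buildings. A pure-NumPy sketch of the textbook DTW recurrence and a similarity ranking follows; `dtw_distance`, `rank_source_buildings`, and the candidate names are hypothetical, and the paper may use a constrained or normalized DTW variant.

```python
# Source-domain selection by DTW: rank candidate source buildings by the DTW
# distance between their load series and the target building's series.
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance between 1-D series."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def rank_source_buildings(target_series, candidates):
    """Return candidate building names sorted from most to least similar."""
    scores = {name: dtw_distance(target_series, series)
              for name, series in candidates.items()}
    return sorted(scores, key=scores.get)

# e.g. rank_source_buildings(target_load, {"office_A": load_a, "retail_B": load_b})
```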
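Finally, the reported 6.88-15.37% improvement is measured by MAPE; a textbook definition is sketched below. The paper may treat near-zero loads differently (e.g. filtering or clipping).

```python
# MAPE in its standard form; assumes y_true contains no zeros.
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)
```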
Pages: 18
Related Papers
50 records in total
  • [41] Redko, Ievgen; Habrard, Amaury; Sebban, Marc. On the analysis of adaptability in multi-source domain adaptation. MACHINE LEARNING, 2019, 108(8-9): 1635-1652.
  • [42] Zhao, Sicheng; Li, Bo; Yue, Xiangyu; Gu, Yang; Xu, Pengfei; Hu, Runbo; Chai, Hua; Keutzer, Kurt. Multi-source Domain Adaptation for Semantic Segmentation. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32.
  • [43] Zhang, Kun; Gong, Mingming; Schoelkopf, Bernhard. Multi-Source Domain Adaptation: A Causal View. PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015: 3150-3157.
  • [44] Amosy, Ohad; Chechik, Gal. Coupled Training for Multi-Source Domain Adaptation. 2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022: 1071-1080.
  • [45] Yao, Xingxu; Zhao, Sicheng; Xu, Pengfei; Yang, Jufeng. Multi-Source Domain Adaptation for Object Detection. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021: 3253-3262.
  • [47] Komatsu, Tatsuya; Matsui, Tomoko; Gao, Junbin. Multi-Source Domain Adaptation with Sinkhorn Barycenter. 29TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2021), 2021: 1371-1375.
  • [48] Xu, Minghao; Wang, Hang; Ni, Bingbing. Graphical Modeling for Multi-Source Domain Adaptation. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46(3): 1727-1741.
  • [49] Cui, Xia; Bollegala, Danushka. Multi-Source Attention for Unsupervised Domain Adaptation. 1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020: 873-883.
  • [50] Guo, Jiang; Shah, Darsh J.; Barzilay, Regina. Multi-Source Domain Adaptation with Mixture of Experts. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018: 4694-4703.