The influence of unmeasured occupancy disturbances on the performance of black-box thermal building models

Cited: 1
Authors
Christensen, Louise Raevdal Lund [1 ]
Broholt, Thea Hauge [1 ]
Knudsen, Michael Dahl [1 ]
Hedegaard, Rasmus Elbaek [1 ]
Petersen, Steffen [1 ]
Affiliations
[1] Aarhus Univ, Dept Engn, Inge Lehmanns Gade 10, DK-8000 Aarhus C, Denmark
Keywords
PREDICTIVE CONTROL; DEMAND RESPONSE
DOI
10.1051/e3sconf/202017202010
CLC Classification Number
TU [Building Science]
Subject Classification Code
0813
Abstract
Previous studies have identified a significant potential in using economic model predictive control (MPC) for space heating. This type of control requires a thermodynamic model of the controlled building that maps certain controllable inputs (heat power) and measured disturbances (ambient temperature and solar irradiation) to the controlled output variable (room temperature). Occupancy-related disturbances, such as people heat gains and venting through windows, are often ignored entirely or assumed to be fully known (measured) in these studies. However, this assumption is usually not fulfilled in practice, and the current simulation study investigated the consequences of violating it. The results indicate that the predictive performance (root mean square error) of a black-box state-space model is not significantly affected by ignoring people heat gains. In contrast, the predictive performance was significantly improved by including window opening status as a model input. The performance of black-box models for MPC of space heating could therefore benefit from inputs from sensors that track window opening.
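
The modelling setup described in the abstract lends itself to a compact illustration. The sketch below (Python, synthetic data; not the authors' actual model, dataset, or model order) fits a simple one-step-ahead ARX-type black-box model of room temperature from heat power, ambient temperature and solar irradiation, then refits it with window-opening status as an additional input and compares hold-out root mean square errors. All signals, coefficients and the model structure are illustrative assumptions.

```python
# Minimal sketch (synthetic data, illustrative only): a one-step-ahead ARX-type
# black-box model of room temperature, fitted with and without window-opening
# status as an extra input, then compared by hold-out RMSE.
import numpy as np

rng = np.random.default_rng(0)
n = 2000                                                   # number of time steps

# Synthetic inputs (normalised, hypothetical signals)
q_heat = rng.uniform(0, 1, n)                              # heat power
t_amb = 5 + 5 * np.sin(np.linspace(0, 20, n))              # ambient temperature [degC]
solar = np.clip(np.sin(np.linspace(0, 40, n)), 0, None)    # solar irradiation
window = (rng.uniform(0, 1, n) < 0.05).astype(float)       # window opening status (0/1)

# Simulate a "true" room temperature with extra venting losses when a window is open
t_room = np.empty(n)
t_room[0] = 21.0
for k in range(n - 1):
    t_room[k + 1] = (t_room[k]
                     + 0.05 * (t_amb[k] - t_room[k])        # envelope heat exchange
                     + 0.8 * q_heat[k]                      # heating gain
                     + 0.3 * solar[k]                       # solar gain
                     - 0.15 * window[k] * (t_room[k] - t_amb[k])  # venting loss
                     + rng.normal(0, 0.02))                 # process noise

def fit_and_rmse(inputs):
    """Fit a one-step-ahead ARX model by least squares; return hold-out RMSE."""
    X = np.column_stack([t_room[:-1]] + [u[:-1] for u in inputs] + [np.ones(n - 1)])
    y = t_room[1:]
    n_train = (n - 1) // 2                                  # first half for fitting
    theta, *_ = np.linalg.lstsq(X[:n_train], y[:n_train], rcond=None)
    resid = y[n_train:] - X[n_train:] @ theta               # second half for evaluation
    return float(np.sqrt(np.mean(resid ** 2)))

rmse_without = fit_and_rmse([q_heat, t_amb, solar])
rmse_with = fit_and_rmse([q_heat, t_amb, solar, window])
print(f"RMSE without window status: {rmse_without:.3f} K")
print(f"RMSE with window status:    {rmse_with:.3f} K")
```

On this synthetic data, the model that receives the window signal should yield a lower hold-out RMSE, mirroring the qualitative finding of the abstract; the magnitudes reported in the paper depend on the real building data and the model structure used there.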
Pages: 6