Using neural networks to model long-term dependencies in occupancy behavior

Cited by: 10
Authors
Kleinebrahm, Max [1 ]
Torriti, Jacopo [2 ]
McKenna, Russell [3 ,4 ]
Ardone, Armin [1 ]
Fichtner, Wolf [1 ]
Affiliations
[1] Karlsruhe Inst Technol, Chair Energy Econ, Hertzstr 16, D-76187 Karlsruhe, Germany
[2] Univ Reading, Sch Built Environm, POB 219, Reading RG6 6AY, Berks, England
[3] Tech Univ Denmark, DTU Management, DK-2800 Lyngby, Denmark
[4] Univ Aberdeen, Sch Engn, Chair Energy Transit, Aberdeen AB24 3FX, Scotland
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Activity modelling; Mobility behavior; Neural networks; Synthetic data; Energy demand; Time; Resolution
DOI
10.1016/j.enbuild.2021.110879
Chinese Library Classification (CLC) number
TU [Building science]
Subject classification code
0813
Abstract
Models simulating household energy demand based on different occupant and household types and their behavioral patterns have received increasing attention in recent years due to the need to better understand the fundamental characteristics that shape the demand side. Most of the models described in the literature are based on Time Use Survey data and Markov chains. Due to the nature of the underlying data and the Markov property, long-term dependencies in occupant behavior spanning several days cannot be adequately captured. An accurate mapping of long-term dependencies in behavior is of increasing importance, e.g. for determining the flexibility potential of individual households, which is urgently needed to compensate for supply-side fluctuations in renewable-based energy systems. The aim of this study is to bridge the gap between social practice theory, energy-related activity modelling and novel machine learning approaches. The weaknesses of existing approaches are addressed by combining time use survey data with mobility data, which provide information about individual mobility behavior over periods of one week. In social practice theory, emphasis is placed on the sequencing and repetition of practices over time. This suggests that practices have a memory. Transformer models based on the attention mechanism and long short-term memory (LSTM) based neural networks define the state of the art in the field of natural language processing (NLP) and are introduced in this paper for the first time for the generation of weekly activity profiles. In a first step, an autoregressive model is presented, which generates synthetic weekly mobility schedules of individual occupants and thereby captures long-term dependencies in mobility behavior. In a second step, an imputation model enriches the weekly mobility schedules with detailed information about energy-relevant at-home activities. The weekly activity profiles form the basis for multiple use cases, one of which is modelling consistent electricity, heat and mobility demand profiles of households. The approach developed provides the basis for making high-quality weekly activity data available to the general public without complex application procedures. (c) 2021 Elsevier B.V. All rights reserved.
Pages: 20
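
The first step described in the abstract, an autoregressive sequence model that generates synthetic weekly mobility schedules, can be illustrated with a minimal sketch. The following is not the authors' implementation: it assumes a 10-minute resolution (7 x 144 = 1008 time steps per week), a small hypothetical vocabulary of discrete activity/location states, and a single-layer LSTM in PyTorch; the paper's actual models also include Transformer variants, additional input features and a separate imputation step for at-home activities.

```python
import torch
import torch.nn as nn

# Hypothetical activity/location vocabulary (illustrative only).
STATES = ["home", "work", "education", "shopping", "leisure", "travel"]
N_STATES = len(STATES)
STEPS_PER_WEEK = 7 * 144  # one week at 10-minute resolution


class ActivityLSTM(nn.Module):
    """Autoregressive next-state model over a week-long activity sequence."""

    def __init__(self, n_states: int = N_STATES, emb_dim: int = 32, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(n_states, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_states)

    def forward(self, x, state=None):
        # x: (batch, seq_len) tensor of integer state indices
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state  # logits for the next state at every position

    @torch.no_grad()
    def sample_week(self, start_state: int = 0, temperature: float = 1.0):
        """Generate one synthetic weekly schedule, one 10-minute slot at a time."""
        self.eval()
        x = torch.tensor([[start_state]])
        state, schedule = None, [start_state]
        for _ in range(STEPS_PER_WEEK - 1):
            logits, state = self(x, state)
            probs = torch.softmax(logits[:, -1] / temperature, dim=-1)
            x = torch.multinomial(probs, num_samples=1)  # shape (1, 1)
            schedule.append(int(x))
        return [STATES[i] for i in schedule]


# Training sketch: minimise next-step cross-entropy on observed weekly schedules.
# batch: (batch_size, STEPS_PER_WEEK) tensor of observed state indices
# model = ActivityLSTM()
# logits, _ = model(batch[:, :-1])
# loss = nn.functional.cross_entropy(logits.reshape(-1, N_STATES), batch[:, 1:].reshape(-1))
```

Because the recurrent state is carried across the full week during both training and sampling, dependencies that span several days (e.g. recurring commuting patterns) can in principle be reproduced, which a first-order Markov chain over single time steps cannot do.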