USING DEEP LEARNING AND KNOWLEDGE TRANSFER TO DISAGGREGATE ENERGY CONSUMPTION

Cited by: 3
Authors
Teixeira, Rafael [1 ]
Antunes, Mario [2 ]
Gomes, Diogo [2 ]
Affiliations
[1] Univ Aveiro, Dept Elect Telecomunicacoes & Informat, Aveiro, Portugal
[2] Univ Aveiro, Inst Telecomunicacoes, Aveiro, Portugal
Keywords
NILM; NIALM; CNN; RNN; MLP; Transfer Learning;
DOI
10.1109/ICWAPR54887.2021.9736149
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
NILM, or Non-Intrusive Load Monitoring, is the task of disaggregating the energy consumed by a building into the energy consumed by its constituent appliances. With the increase in energy demand, governments began searching for ways to reduce energy waste on the demand side, and the deployment of smart meters was one of them. Their purpose is to give users information about the aggregate energy consumed in a household at any given time. Since smart meters already collect the aggregate readings, interest in NILM grew, as researchers could focus their attention on the disaggregation algorithm itself. Among disaggregation algorithms, deep learning models have shown remarkable results, surpassing the previous state-of-the-art models. With this in mind, this paper proposes three deep learning models: a convolutional neural network with residual blocks, a recurrent neural network, and a multilayer perceptron that uses the discrete wavelet transform as features. These models are trained on the UK-DALE and REFIT datasets and compared with the state-of-the-art models in NILMTK-Contrib. The models are evaluated on their generalization and transfer learning ability, as these are two critical factors for broad NILM deployment. Our models show competitive results, achieving lower errors than two of the three state-of-the-art models and approaching the performance of the third. The main advantage of our models is their ability to perform real-time disaggregation, whereas the best state-of-the-art model has a 30 min delay.
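The abstract mentions a multilayer perceptron fed with discrete wavelet transform features. As a minimal sketch of that feature-extraction step — assuming a Haar wavelet and three decomposition levels, neither of which is specified in this record — the multi-level DWT of an aggregate power window can be computed with plain NumPy:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Splits an even-length signal into approximation (low-pass)
    and detail (high-pass) coefficients.
    """
    x = np.asarray(signal, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # trend
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # fluctuations
    return a, d

def dwt_features(window, levels=3):
    """Stack multi-level Haar DWT coefficients into one feature vector.

    Keeps the detail coefficients of every level plus the final
    approximation, so the vector has the same length as the window.
    """
    feats = []
    a = np.asarray(window, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(d)
    feats.append(a)
    return np.concatenate(feats)

# Example: a 64-sample aggregate power window (watts)
window = 300 + 100 * np.sin(np.linspace(0, 4 * np.pi, 64))
features = dwt_features(window, levels=3)
# 32 + 16 + 8 detail coefficients + 8 approximation coefficients
```

Because the Haar transform is orthonormal, the feature vector preserves the window's energy; in a NILM pipeline such vectors would be the MLP's input, one per sliding window of the aggregate signal.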
Pages: 23-29 (7 pages)
Related Papers (50 items)
  • [31] Energy consumption prediction in water treatment plants using deep learning with data augmentation
    Harrou, Fouzi
    Dairi, Abdelkader
    Dorbane, Abdelhakim
    Sun, Ying
    [J]. RESULTS IN ENGINEERING, 2023, 20
  • [32] Prediction of Electric Buses Energy Consumption from Trip Parameters Using Deep Learning
    Pamula, Teresa
    Pamula, Danuta
    [J]. ENERGIES, 2022, 15 (05)
  • [33] Using Deep Learning for Alcohol Consumption Recognition
    Bernstein, Joseph P.
    Mendez, Brandon J.
    Sun, Peng
    Liu, Yang
    Shang, Yi
    [J]. 2017 14TH IEEE ANNUAL CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE (CCNC), 2017, : 1020 - 1021
  • [34] Transfer Learning for Leisure Centre Energy Consumption Prediction
    Banda, Paul
    Bhuiyan, Muhammed A.
    Zhang, Kevin
    Song, Andy
    [J]. COMPUTATIONAL SCIENCE - ICCS 2019, PT I, 2019, 11536 : 112 - 123
  • [35] A Building Energy Consumption Prediction Method Based on Integration of a Deep Neural Network and Transfer Reinforcement Learning
    Fu, Qiming
    Liu, QingSong
    Gao, Zhen
    Wu, Hongjie
    Fu, Baochuan
    Chen, Jianping
    [J]. INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2020, 34 (10)
  • [36] Using Transfer Learning for a Deep Learning Model Observer
    Murphy, W.
    Elangovan, P.
    Halling-Brown, M.
    Lewis, E.
    Young, K. C.
    Dance, D. R.
    Wells, K.
    [J]. MEDICAL IMAGING 2019: IMAGE PERCEPTION, OBSERVER PERFORMANCE, AND TECHNOLOGY ASSESSMENT, 2019, 10952
  • [38] Aggregate and disaggregate analysis on energy consumption and economic growth nexus in China
    Xuyi Liu
    [J]. Environmental Science and Pollution Research, 2018, 25 : 26512 - 26526
  • [39] Using Deep Learning to Replace Domain Knowledge
    Luebben, Christian
    Pahl, Marc-Oliver
    Khan, Mohammad Irfan
    [J]. 2020 IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS (ISCC), 2020, : 423 - 428
  • [40] Application of Deep Learning Model in Building Energy Consumption Prediction
    Wang, Yiqiong
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022