SAED: self-attentive energy disaggregation

Cited by: 16
Authors
Virtsionis-Gkalinikis, Nikolaos [1 ]
Nalmpantis, Christoforos [1 ]
Vrakas, Dimitris [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Sch Informat, Thessaloniki 54124, Greece
Keywords
Energy disaggregation; Non-intrusive load monitoring; Artificial neural networks; Self-attention
DOI
10.1007/s10994-021-06106-3
CLC number
TP18 [Theory of artificial intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The field of energy disaggregation deals with approximating the electric consumption of individual appliances using only the aggregate measurement of a mains meter. Recent research has employed deep neural networks, which outperform earlier methods based on Hidden Markov Models; however, deep learning models are computationally heavy and require large amounts of data. The main objective of this paper is to incorporate the attention mechanism into neural networks in order to reduce their computational complexity. Two versions of the attention mechanism are utilized, named Additive and Dot Attention. The experiments show that the two perform on par, with the Dot mechanism being slightly faster. The two self-attentive neural networks are compared against two state-of-the-art deep learning models for energy disaggregation. The results show that the proposed architecture achieves faster or equal training and inference times, with only a minor performance drop depending on the device and dataset.
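The abstract names two attention variants, Additive and Dot. As a rough illustration of the difference (a minimal PyTorch sketch, not the authors' SAED code; layer names, sizes, and the toy input are assumptions), Dot Attention scores each query-key pair with a scaled inner product, while Additive Attention scores it with a small learned feed-forward network:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DotAttention(nn.Module):
        # Scaled dot-product self-attention: softmax(Q K^T / sqrt(d)) V.
        def forward(self, q, k, v):
            scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5  # (B, T, T)
            return F.softmax(scores, dim=-1) @ v                  # (B, T, d)

    class AdditiveAttention(nn.Module):
        # Bahdanau-style additive attention: score(i, j) = v^T tanh(Wq q_i + Wk k_j).
        def __init__(self, d):
            super().__init__()
            self.wq = nn.Linear(d, d, bias=False)
            self.wk = nn.Linear(d, d, bias=False)
            self.v = nn.Linear(d, 1, bias=False)

        def forward(self, q, k, v):
            # Broadcast the projections so every (query, key) pair gets a score.
            s = torch.tanh(self.wq(q).unsqueeze(2) + self.wk(k).unsqueeze(1))
            scores = self.v(s).squeeze(-1)                        # (B, T, T)
            return F.softmax(scores, dim=-1) @ v                  # (B, T, d)

    # Toy usage: a batch of 4 aggregate-power windows, 64 time steps, 16 features.
    x = torch.randn(4, 64, 16)
    print(DotAttention()(x, x, x).shape)         # torch.Size([4, 64, 16])
    print(AdditiveAttention(16)(x, x, x).shape)  # torch.Size([4, 64, 16])

The Dot variant adds no learned parameters and reduces to two matrix products, which is consistent with the abstract's observation that it is slightly faster than the Additive variant.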
Pages: 4081-4100 (20 pages)
Related papers (50 in total)
  • [1] SAED: self-attentive energy disaggregation
    Virtsionis-Gkalinikis, Nikolaos; Nalmpantis, Christoforos; Vrakas, Dimitris
    Machine Learning, 2023, 112: 4081-4100
  • [2] Self-Attentive Associative Memory
    Le, Hung; Tran, Truyen; Venkatesh, Svetha
    International Conference on Machine Learning, Vol. 119, 2020
  • [3] Self-Attentive Sequential Recommendation
    Kang, Wang-Cheng; McAuley, Julian
    2018 IEEE International Conference on Data Mining (ICDM), 2018: 197-206
  • [4] On the Robustness of Self-Attentive Models
    Hsieh, Yu-Lun; Cheng, Minhao; Juan, Da-Cheng; Wei, Wei; Hsu, Wen-Lian; Hsieh, Cho-Jui
    57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 1520-1529
  • [5] Denoising Self-Attentive Sequential Recommendation
    Chen, Huiyuan; Lin, Yusan; Pan, Menghai; Wang, Lan; Yeh, Chin-Chia Michael; Li, Xiaoting; Zheng, Yan; Wang, Fei; Yang, Hao
    Proceedings of the 16th ACM Conference on Recommender Systems (RecSys 2022), 2022: 92-101
  • [6] A Self-Attentive Emotion Recognition Network
    Partaourides, Harris; Papadamou, Kostantinos; Kourtellis, Nicolas; Leontiades, Ilias; Chatzis, Sotirios
    2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2020: 7199-7203
  • [7] Lightweight Self-Attentive Sequential Recommendation
    Li, Yang; Chen, Tong; Zhang, Peng-Fei; Yin, Hongzhi
    Proceedings of the 30th ACM International Conference on Information & Knowledge Management (CIKM 2021), 2021: 967-977
  • [8] Constituency Parsing with a Self-Attentive Encoder
    Kitaev, Nikita; Klein, Dan
    Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL), Vol. 1, 2018: 2676-2686
  • [9] Self-attentive Biaffine Dependency Parsing
    Li, Ying; Li, Zhenghua; Zhang, Min; Wang, Rui; Li, Sheng; Si, Luo
    Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, 2019: 5067-5073
  • [10] Fast Self-Attentive Multimodal Retrieval
    Wehrmann, Jonatas; Lopes, Mauricio A.; More, Martin D.; Barros, Rodrigo C.
    2018 IEEE Winter Conference on Applications of Computer Vision (WACV 2018), 2018: 1871-1878