Multiagent-based secure energy management for multimedia grid communication using Q-learning

Cited by: 11
Authors
Kumari, Aparna [1 ]
Tanwar, Sudeep [1 ]
Affiliations
[1] Nirma Univ, Inst Technol, Dept Comp Sci & Engn, Ahmadabad, Gujarat, India
Keywords
Demand response management; Residential energy management; Reinforcement learning; Artificial intelligence; Q-learning; DEMAND RESPONSE; OPTIMIZATION; SYSTEMS; MODEL
DOI
10.1007/s11042-021-11491-x
CLC classification
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
In smart grid infrastructure, multimedia communication plays an important role in various applications, for instance, load monitoring, automatic smart meter reading, and energy management. Energy management has gained widespread popularity with increasing energy demand. The goal of this paper is to explore multiagent-based Reinforcement Learning (RL) for multicarrier (i.e., electricity and gas) Residential Energy Management (REM) systems with data security. It enables a separate Demand Response Program (DRP) for each energy carrier, accelerated by the computational capabilities of RL. This paper proposes Q-MSEM, a Q-learning-based Multiagent and Secure Energy Management scheme that solves REM problems using RL and the Ethereum Blockchain (EB) to reduce energy load and energy costs. Q-MSEM then uses an Ethereum Smart Contract (ESC) to address data security issues, with off-chain storage, i.e., the InterPlanetary File System (IPFS), to reduce data storage costs. Experimental results demonstrate the effectiveness of the proposed Q-MSEM scheme in terms of reductions in load, energy cost (15.82%), and data storage cost.
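To illustrate the kind of Q-learning update underlying such energy-management schemes, the sketch below trains a tabular agent on a toy residential scheduling task. It is a minimal illustration, not the Q-MSEM algorithm itself: the hourly price profile, comfort benefit, and state/action design are all assumptions made for this example.

```python
import random

# Toy residential energy scheduling via tabular Q-learning.
# State: hour of day. Action: 0 = defer appliance, 1 = run it now.
# Running yields a fixed comfort benefit minus the hour's electricity
# price, so the agent should learn to run during off-peak hours.
random.seed(0)

HOURS = 24
PRICE = [0.30 if 17 <= h <= 21 else 0.10 for h in range(HOURS)]  # peak vs off-peak (assumed)
COMFORT = 0.20                                                    # benefit of running (assumed)
ACTIONS = (0, 1)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

Q = [[0.0, 0.0] for _ in range(HOURS)]  # Q-table, one row per hour

def reward(hour, action):
    return COMFORT - PRICE[hour] if action == 1 else 0.0

def choose(hour):
    if random.random() < EPSILON:                        # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[hour][a])        # exploit

for episode in range(2000):
    for h in range(HOURS):
        a = choose(h)
        r = reward(h, a)
        nxt = (h + 1) % HOURS
        # Standard Q-learning temporal-difference update.
        Q[h][a] += ALPHA * (r + GAMMA * max(Q[nxt]) - Q[h][a])

greedy = [max(ACTIONS, key=lambda a: Q[h][a]) for h in range(HOURS)]
```

Because the hour advances regardless of the action taken, the greedy policy here reduces to maximizing immediate reward: the agent learns to run the appliance when the price is below the comfort benefit (off-peak) and defer during the peak window.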
Pages: 36645-36665
Number of pages: 21
Related papers
50 records total
  • [1] Multiagent-based secure energy management for multimedia grid communication using Q-learning
    Aparna Kumari
    Sudeep Tanwar
    Multimedia Tools and Applications, 2022, 81 : 36645 - 36665
  • [2] Boosting Communication Efficiency in Federated Learning for Multiagent-Based Multimicrogrid Energy Management
    He, Shangyang
    Li, Yuanzheng
    Li, Yang
    Shi, Yang
    Chung, Chi Yung
    Zeng, Zhigang
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,
  • [3] Genetic Based Fuzzy Q-Learning Energy Management for Smart Grid
    Li Xin
    Zang Chuanzhi
    Zeng Peng
    Yu Haibin
    PROCEEDINGS OF THE 31ST CHINESE CONTROL CONFERENCE, 2012, : 6924 - 6927
  • [4] Toward guidelines for modeling learning agents in multiagent-based simulation: Implications from Q-learning and sarsa agents
    Takadama, K
    Fujita, H
    MULTI-AGENT AND MULTI-AGENT-BASED SIMULATION, 2005, 3415 : 159 - 172
  • [5] Q-Learning Based Physical-Layer Secure Game Against Multiagent Attacks
    Xu, Yan
    Xia, Junjuan
    Wu, Huijun
    Fan, Liseng
    IEEE ACCESS, 2019, 7 : 49212 - 49222
  • [6] An Online Home Energy Management System using Q-Learning and Deep Q-Learning
    Izmitligil, Hasan
    Karamancioglu, Abdurrahman
    SUSTAINABLE COMPUTING-INFORMATICS & SYSTEMS, 2024, 43
  • [7] Fuzzy Q-Learning based Energy Management of Small Cells Powered by the Smart Grid
    Mendil, Mouhcine
    De Domenico, Antonio
    Heiries, Vincent
    Caire, Raphael
    Hadj-Said, Nouredine
    2016 IEEE 27TH ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR, AND MOBILE RADIO COMMUNICATIONS (PIMRC), 2016, : 1936 - 1941
  • [8] Multiagent-Based Hybrid Energy Management System for Microgrids
    Mao, Meiqin
    Jin, Peng
    Hatziargyriou, Nikos D.
    Chang, Liuchen
    IEEE TRANSACTIONS ON SUSTAINABLE ENERGY, 2014, 5 (03) : 938 - 946
  • [9] A Hybrid Multiagent Framework With Q-Learning for Power Grid Systems Restoration
    Ye, Dayong
    Zhang, Minjie
    Sutanto, Danny
    IEEE TRANSACTIONS ON POWER SYSTEMS, 2011, 26 (04) : 2434 - 2441
  • [10] Recurrent Deep Multiagent Q-Learning for Autonomous Brokers in Smart Grid
    Yang, Yaodong
    Hao, Jianye
    Sun, Mingyang
    Wang, Zan
    Fan, Changjie
    Strbac, Goran
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 569 - 575