ACES: Automatic Configuration of Energy Harvesting Sensors with Reinforcement Learning

Cited by: 28
Authors
Fraternali, Francesco [1 ]
Balaji, Bharathan [2 ]
Agarwal, Yuvraj [3 ]
Gupta, Rajesh K. [1 ]
Affiliations
[1] Univ Calif San Diego, Comp Sci & Engn, La Jolla, CA 92093 USA
[2] Amazon, 2250 7th Ave, Seattle, WA 98121 USA
[3] Carnegie Mellon Univ, Elect & Comp Engn, 5000 Forbes Ave, Pittsburgh, PA 15213 USA
Keywords
Internet of Things; automatic configuration; reinforcement learning; smart buildings; energy harvesting; battery-less; real deployment; IOT;
DOI
10.1145/3404191
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Many modern smart building applications are supported by wireless sensors to sense physical parameters, given the flexibility they offer and the reduced cost of deployment. However, most wireless sensors are powered by batteries today, and large deployments are inhibited by the requirement of periodic battery replacement. Energy harvesting sensors provide an attractive alternative, but they need to provide adequate quality of service to applications given the uncertainty of energy availability. We propose ACES, which uses reinforcement learning to maximize the sensing quality of energy harvesting sensors for periodic and event-driven indoor sensing with the available energy. Our custom-built sensor platform uses a supercapacitor to store energy and Bluetooth Low Energy to relay sensor data. Using simulations and real deployments, we use the collected data to continually adapt each node's sensing to changing environmental patterns, and we apply transfer learning to reduce training time in real deployments. In our 60-node deployment lasting 2 weeks, nodes stop operating only 0.1% of the time, and data collection is comparable to that of current battery-powered nodes. We show that ACES reduces the node duty-cycle period by an average of 33% compared to three prior reinforcement learning techniques while continuously learning environmental changes over time.
Pages: 31
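The abstract above describes using reinforcement learning to match a node's sensing rate to the energy it harvests. As a rough illustration only (not the authors' implementation), the sketch below shows how a tabular Q-learning agent could map a discretized supercapacitor voltage and hour of day to one of a few candidate sensing periods. The state bins, reward weights, toy energy model, and all names (PERIODS, discretize, simulate_step, etc.) are hypothetical assumptions made for this sketch.

import random
from collections import defaultdict

# Hypothetical action set: candidate sensing periods in seconds.
PERIODS = [10, 30, 60, 300]

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount factor, exploration rate

# Q-table: state -> one value per candidate period.
Q = defaultdict(lambda: [0.0] * len(PERIODS))

def discretize(voltage, hour):
    """Bucket supercapacitor voltage (V) and hour of day into a state id."""
    v_bin = min(int(voltage * 2), 10)  # 0.5 V bins, capped at 5 V
    return (v_bin, hour)

def choose_action(state):
    """Epsilon-greedy choice over the candidate sensing periods."""
    if random.random() < EPSILON:
        return random.randrange(len(PERIODS))
    values = Q[state]
    return values.index(max(values))

def reward(voltage, died, period):
    """Toy reward: favor frequent sensing, penalize a dead node."""
    if died:
        return -100.0
    return 1.0 / period + 0.1 * voltage

def update(state, action, r, next_state):
    """Standard one-step Q-learning update."""
    best_next = max(Q[next_state])
    Q[state][action] += ALPHA * (r + GAMMA * best_next - Q[state][action])

def simulate_step(voltage, hour, period):
    """Toy energy model: harvest more during the day, drain per sample taken."""
    harvest = 0.05 if 8 <= hour <= 18 else 0.005
    drain = (3600.0 / period) * 0.001  # shorter period -> more samples -> more drain
    voltage = max(0.0, min(5.0, voltage + harvest - drain))
    return voltage, voltage <= 0.5  # node counts as "dead" below 0.5 V

if __name__ == "__main__":
    voltage = 3.0
    for step in range(24 * 14):  # two simulated weeks, one decision per hour
        hour = step % 24
        state = discretize(voltage, hour)
        action = choose_action(state)
        voltage, died = simulate_step(voltage, hour, PERIODS[action])
        next_state = discretize(voltage, (step + 1) % 24)
        update(state, action, reward(voltage, died, PERIODS[action]), next_state)
    for h in (0, 12):
        values = Q[discretize(voltage, h)]
        print(f"hour {h:02d}: greedy sensing period = {PERIODS[values.index(max(values))]}s")

In a real deployment, simulate_step would be replaced by measurements reported by the node itself, and the Q-table could be seeded from a simulation-trained policy, in the spirit of the transfer learning the abstract mentions.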
Related Papers
50 results in total (10 shown)
  • [1] Scaling Configuration of Energy Harvesting Sensors with Reinforcement Learning
    Fraternali, Francesco
    Balaji, Bharathan
    Gupta, Rajesh
    PROCEEDINGS OF THE 2018 INTERNATIONAL WORKSHOP ON ENERGY HARVESTING & ENERGY-NEUTRAL SENSING SYSTEMS (ENSSYS '18), 2018, : 7 - 13
  • [2] Status Update for Correlated Energy Harvesting Sensors: A Deep Reinforcement Learning Approach
    Zhao, Nan
    Xu, Chao
    Zhang, Shun
    Xie, Yiping
    Wang, Xijun
    Sun, Hongguang
    2020 12TH INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS AND SIGNAL PROCESSING (WCSP), 2020, : 170 - 175
  • [3] Reinforcement Learning-Based Transmission Policies for Energy Harvesting Powered Sensors
    Seifullaev, Ruslan
    Knorn, Steffi
    Ahlen, Anders
    Hostettler, Roland
IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, 2024, 8 (04): 1564 - 1573
  • [4] Power Usage of Energy Harvesting Sensors with a Drone Sink: A Reinforcement Learning Based Approach
    Kusaladharma, S.
    Adve, R. S.
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [5] Automatic configuration of sensors and actuators through innate learning
    Choi, TA
    Yim, EA
    Arroyo, AA
    Doty, KL
    ROBOTICS 98, 1998, : 64 - 70
  • [6] Age-Aware Status Update Control for Energy Harvesting IoT Sensors via Reinforcement Learning
    Hatami, Mohammad
    Jahandideh, Mojtaba
    Leinonen, Markus
    Codreanu, Marian
    2020 IEEE 31ST ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR AND MOBILE RADIO COMMUNICATIONS (IEEE PIMRC), 2020,
  • [7] Accelerated Structure-Aware Reinforcement Learning for Delay-Sensitive Energy Harvesting Wireless Sensors
    Sharma, Nikhilesh
    Mastronarde, Nicholas
    Chakareski, Jacob
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 1409 - 1424
  • [8] Reinforcement Learning in MIMO Wireless Networks with Energy Harvesting
    Ayatollahi, Hoda
    Tapparello, Cristiano
    Heinzelman, Wendi
    2017 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2017,
  • [9] Reinforcement Learning Approaches for IoT Networks with Energy Harvesting
    Liu, Xiaolan
    Gao, Yue
    2019 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA (ICCC), 2019,
  • [10] Distributed Energy Cooperation for Energy Harvesting Nodes Using Reinforcement Learning
    Lin, Wei-Ting
    Lai, I-Wei
    Lee, Chia-Han
    2015 IEEE 26TH ANNUAL INTERNATIONAL SYMPOSIUM ON PERSONAL, INDOOR, AND MOBILE RADIO COMMUNICATIONS (PIMRC), 2015, : 1584 - 1588