Energy-Efficient Object Tracking Using Adaptive ROI Subsampling and Deep Reinforcement Learning

Cited by: 1
Authors
Katoch, Sameeksha [1 ]
Iqbal, Odrika [1 ]
Spanias, Andreas [1 ]
Jayasuriya, Suren [1 ]
Affiliations
[1] Arizona State Univ, Sch Elect Comp & Energy Engn, Tempe, AZ 85281 USA
Keywords
Image sensors; Energy efficiency; Object tracking; Kalman filters; Cameras; Visualization; Target tracking; Reinforcement learning; Energy optimization; Adaptive subsampling; ROI tracking; Compression; Algorithms; Networks; Model
DOI
10.1109/ACCESS.2023.3270776
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Recent innovations in ROI camera systems have opened up avenues for exploring energy optimization techniques such as adaptive subsampling. Image frame capture and read-out generally demand high power consumption. ROI camera systems make it possible to exploit the inverse relation between energy consumption and spatiotemporal pixel readout to optimize the power efficiency of the image sensor. To this end, we develop a reinforcement learning (RL) based adaptive subsampling framework that predicts ROI trajectories and reconfigures the image sensor on the fly for improved power efficiency of the image sensing pipeline. In our proposed framework, a pre-trained convolutional neural network (CNN) extracts rich visual features from incoming frames, and a long short-term memory (LSTM) network predicts the region of interest (ROI) and subsampling pattern for the subsequent image frame. Depending on the application and the difficulty of the object's motion trajectory, the user can apply either the predicted ROI or the coarse subsampling pattern to switch off pixels during sequential frame capture, thus saving energy. We validate the proposed method by adapting existing trackers to the adaptive subsampling framework and evaluating them as competing baselines. As a proof of concept, our method outperforms the baselines and achieves an average AUC score of 0.5090 across three benchmark datasets. We also characterize the energy-accuracy tradeoff of our method versus the baselines and show that our approach is best suited for applications that demand both high visual tracking precision and low power consumption. On the TB100 dataset, our method achieves the highest AUC score, 0.5113, among all competing algorithms, while requiring a medium-level power consumption of approximately 4 W under a generic energy model and an energy consumption of 1.9 mJ under a mobile-system energy model. Although other baselines perform better in terms of power consumption alone, they are ill-suited for applications that require considerable tracking precision, making our method the best candidate in terms of the power-accuracy tradeoff.
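The abstract above outlines the core pipeline: a pre-trained CNN extracts per-frame features and an LSTM predicts the ROI (and hence the subsampling pattern) for the next frame. The paper's exact network configuration is not reproduced in this record, so the sketch below is only a minimal PyTorch illustration of that idea under assumed choices: a ResNet-18 backbone (torchvision >= 0.13 weights API), a 256-unit LSTM, and an (x, y, w, h) ROI head normalized to [0, 1]. It is not the authors' implementation.

import torch
import torch.nn as nn
import torchvision.models as models

class ROIPredictor(nn.Module):
    """Illustrative CNN + LSTM ROI predictor (assumed sizes, not the paper's exact model)."""
    def __init__(self, hidden_size=256):
        super().__init__()
        # Pre-trained CNN feature extractor; frozen here purely for illustration.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # global-pooled 512-d output
        for p in self.features.parameters():
            p.requires_grad = False
        # LSTM aggregates the frame history; a linear head regresses the next ROI.
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden_size, batch_first=True)
        self.roi_head = nn.Linear(hidden_size, 4)  # (x, y, w, h), normalized to [0, 1]

    def forward(self, frames):
        # frames: (batch, time, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.features(frames.flatten(0, 1)).flatten(1)   # (b*t, 512)
        out, _ = self.lstm(feats.view(b, t, -1))                  # (b, t, hidden)
        return torch.sigmoid(self.roi_head(out[:, -1]))           # ROI for the next frame

if __name__ == "__main__":
    model = ROIPredictor().eval()
    clip = torch.rand(1, 4, 3, 224, 224)   # four past frames (dummy data)
    with torch.no_grad():
        print(model(clip))                  # a 1x4 tensor of normalized ROI coordinates

In an adaptive-subsampling deployment, the predicted ROI (or a coarser binary mask derived from it) would be written back to the sensor's readout configuration so that pixels outside the ROI are switched off for the next capture.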
Pages: 41995-42011
Number of pages: 17
Related papers
50 records in total
  • [1] Mohan, Divya; Katoch, Sameeksha; Jayasuriya, Suren; Turaga, Pavan; Spanias, Andreas. Adaptive Video Subsampling for Energy-Efficient Object Detection. Conference Record of the 2019 Fifty-Third Asilomar Conference on Signals, Systems & Computers, 2019: 103-107.
  • [2] Iqbal, Odrika; Siddiqui, Saquib; Martin, Joshua; Katoch, Sameeksha; Spanias, Andreas; Bliss, Daniel; Jayasuriya, Suren. Design and FPGA Implementation of an Adaptive Video Subsampling Algorithm for Energy-Efficient Single Object Tracking. 2020 IEEE International Conference on Image Processing (ICIP), 2020: 3065-3069.
  • [3] Demirel, B. U.; Chen, L.; Al Faruque, M. A. Data-Driven Energy-Efficient Adaptive Sampling Using Deep Reinforcement Learning. ACM Transactions on Computing for Healthcare, 2023, 4(3).
  • [4] Sun, Mingjie; Xiao, Jimin; Lim, Eng Gee; Xie, Yanchun; Feng, Jiashi. Adaptive ROI Generation for Video Object Segmentation Using Reinforcement Learning. Pattern Recognition, 2020, 106.
  • [5] Rezaei, Yoones; Lee, Stephen; Mosse, Daniel. Energy-Efficient Parking Analytics System Using Deep Reinforcement Learning. BuildSys'21: Proceedings of the 2021 ACM International Conference on Systems for Energy-Efficient Built Environments, 2021: 81-90.
  • [6] Ju, Hyungyu; Kim, Seungnyun; Kim, YoungJoon; Lee, Hyojin; Shim, Byonghyo. Energy-Efficient Ultra-Dense Network Using Deep Reinforcement Learning. Proceedings of the 21st IEEE International Workshop on Signal Processing Advances in Wireless Communications (IEEE SPAWC 2020), 2020.
  • [7] Shiri, Aidin; Prakash, Bharat; Mazumder, Arnab Neelim; Waytowich, Nicholas R.; Oates, Tim; Mohsenin, Tinoosh. An Energy-Efficient Hardware Accelerator for Hierarchical Deep Reinforcement Learning. 2021 IEEE 3rd International Conference on Artificial Intelligence Circuits and Systems (AICAS), 2021.
  • [8] Wang, Bin; Liu, Fagui; Lin, Weiwei. Energy-Efficient VM Scheduling Based on Deep Reinforcement Learning. Future Generation Computer Systems: The International Journal of eScience, 2021, 125: 616-628.
  • [9] Ashiquzzaman, Akm; Lee, Hyunmin; Um, Tai-Won; Kim, Jinsul. Energy-Efficient IoT Sensor Calibration With Deep Reinforcement Learning. IEEE Access, 2020, 8: 97045-97055.
  • [10] Ju, Hyungyu; Kim, Seungnyun; Kim, Youngjoon; Shim, Byonghyo. Energy-Efficient Ultra-Dense Network With Deep Reinforcement Learning. IEEE Transactions on Wireless Communications, 2022, 21(8): 6539-6552.