A Markov Decision Process Approach to Dynamic Power Management in a Cluster System

Cited by: 7
Authors
Okamura, Hiroyuki [1 ]
Miyata, Satoshi [1 ]
Dohi, Tadashi [1 ]
Affiliations
[1] Hiroshima Univ, Grad Sch Engn, Dept Informat Engn, Higashihiroshima 7398527, Japan
Source
IEEE ACCESS | 2015, Vol. 3
Keywords
Dynamic power management; power-aware control; Markov decision process; Markovian arrival process; ENERGY;
DOI
10.1109/ACCESS.2015.2508601
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Dynamic power management (DPM) plays a significant role in effectively reducing power consumption in both the design and operational phases of computer-based systems. It is well known that a state-dependent control policy, which monitors the energy states of each component or of the whole system, is efficient for power saving in server systems whose system state, such as transaction requests, can be completely observed. In this paper, we consider an optimal power-aware design for a cluster system and formulate the DPM problem by means of a Markov decision process. We derive the dynamic programming equation for the optimal control policy, which maximizes the expected reward per unit of electrical power, called the power effectiveness, and give a policy iteration algorithm that determines the optimal control policy sequentially. In numerical experiments, we show the optimal control policy for an example of a cluster system with two service nodes, where the arrival stream of transaction requests is described by a Markov modulated Poisson process. In addition, based on access data from an enterprise system, we examine the optimal power-aware control of the cluster system and its effectiveness.
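
The policy iteration procedure mentioned in the abstract can be illustrated with a small, self-contained sketch. The example below is not the authors' formulation: it uses a hypothetical four-state, two-action toy model with randomly generated transition probabilities and rewards, and it optimizes a standard discounted criterion rather than the paper's undiscounted power effectiveness (expected reward per unit of electrical power) under an MMPP arrival stream. It only shows the alternating policy-evaluation / policy-improvement structure that policy iteration relies on.

# Illustrative sketch (not the authors' method): policy iteration for a small
# finite MDP standing in for a two-node cluster with on/off power control.
# States, transitions, rewards, and the discounted criterion are hypothetical
# simplifications for demonstration purposes only.
import numpy as np

n_states = 4    # toy encoding of (number of active nodes, queue occupancy)
n_actions = 2   # 0 = keep current power mode, 1 = switch a node on/off
gamma = 0.95    # discount factor (illustrative; the paper's criterion is undiscounted)

rng = np.random.default_rng(0)

# Random but valid transition kernel P[a, s, s'] and reward table R[a, s].
P = rng.random((n_actions, n_states, n_states))
P /= P.sum(axis=2, keepdims=True)
R = rng.random((n_actions, n_states))

def policy_evaluation(policy):
    """Solve (I - gamma * P_pi) v = r_pi for the value of a deterministic policy."""
    P_pi = np.array([P[policy[s], s] for s in range(n_states)])
    r_pi = np.array([R[policy[s], s] for s in range(n_states)])
    return np.linalg.solve(np.eye(n_states) - gamma * P_pi, r_pi)

def policy_iteration():
    policy = np.zeros(n_states, dtype=int)
    while True:
        v = policy_evaluation(policy)
        # Greedy improvement: one-step lookahead over all actions in every state.
        q = R + gamma * (P @ v)          # shape (n_actions, n_states)
        new_policy = q.argmax(axis=0)
        if np.array_equal(new_policy, policy):
            return policy, v             # policy is stable, hence optimal
        policy = new_policy

if __name__ == "__main__":
    pi_star, v_star = policy_iteration()
    print("optimal policy:", pi_star)
    print("state values:  ", v_star)

Replacing the discounted criterion with the paper's ratio-type power-effectiveness objective would change the evaluation step (a ratio dynamic programming equation instead of a linear system), but the overall evaluate-then-improve loop remains the same.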
Pages: 3039 - 3047
Number of pages: 9
Related Papers (50 in total)
  • [1] Optimal Power-Aware Design in a Cluster System: Markov Decision Process Approach
    Okamura, Hiroyuki
    Miyata, Satoshi
    Dohi, Tadashi
    [J]. IEEE 12TH INT CONF UBIQUITOUS INTELLIGENCE & COMP/IEEE 12TH INT CONF ADV & TRUSTED COMP/IEEE 15TH INT CONF SCALABLE COMP & COMMUN/IEEE INT CONF CLOUD & BIG DATA COMP/IEEE INT CONF INTERNET PEOPLE AND ASSOCIATED SYMPOSIA/WORKSHOPS, 2015, : 527 - 532
  • [2] A semi-Markov Decision Process Based Dynamic Power Management for Mobile Devices
    Zhang, Mengxi
    Li, Yanjie
    Chen, Haoyao
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON REAL-TIME COMPUTING AND ROBOTICS (IEEE RCAR), 2016, : 249 - 254
  • [3] MODELING PRAWN PRODUCTION MANAGEMENT-SYSTEM - A DYNAMIC MARKOV DECISION APPROACH
    LEUNG, PS
    SHANG, YC
    [J]. AGRICULTURAL SYSTEMS, 1989, 29 (01) : 5 - 20
  • [4] Dynamic Virtual Machine Management via Approximate Markov Decision Process
    Han, Zhenhua
    Tan, Haisheng
    Chen, Guihai
    Wang, Rui
    Chen, Yifan
    Lau, Francis C. M.
    [J]. IEEE INFOCOM 2016 - THE 35TH ANNUAL IEEE INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATIONS, 2016,
  • [5] Dynamic power management with fuzzy decision support system
    Samadi, Mehrzad
    Afzali-Kusha, Ali
    [J]. IEICE ELECTRONICS EXPRESS, 2008, 5 (19): : 789 - 795
  • [6] A Partially Observable Markov Decision Process Approach to Residential Home Energy Management
    Hansen, Timothy M.
    Chong, Edwin K. P.
    Suryanarayanan, Siddharth
    Maciejewski, Anthony A.
    Siegel, Howard Jay
    [J]. IEEE TRANSACTIONS ON SMART GRID, 2018, 9 (02) : 1271 - 1281
  • [7] A Markov Decision Process to Enhance Power System Operation Resilience during Hurricanes
    Abdelmalak, Michael
    Benidris, Mohammed
    [J]. 2021 IEEE POWER & ENERGY SOCIETY GENERAL MEETING (PESGM), 2021,
  • [8] A Markov Decision Process to Enhance Power System Operation Resilience during Wildfires
    Abdelmalak, Michael
    Benidris, Mohammed
    [J]. 2021 IEEE INDUSTRY APPLICATIONS SOCIETY ANNUAL MEETING (IAS), 2021,
  • [9] MARKOV DECISION-PROCESS - FUZZY APPROACH
    KIM, CE
    [J]. COMPUTERS & INDUSTRIAL ENGINEERING, 1994, 27 (1-4) : 161 - 165
  • [10] Dynamic Routing in Stochastic Urban Air Mobility Networks: A Markov Decision Process Approach
    Wei, Qinshuang
    Yu, Yue
    Topcu, Ufuk
    [J]. IFAC PAPERSONLINE, 2023, 56 (02): : 10760 - 10767