A Markov Decision Process Approach to Dynamic Power Management in a Cluster System

Cited by: 7
Authors
Okamura, Hiroyuki [1 ]
Miyata, Satoshi [1 ]
Dohi, Tadashi [1 ]
Affiliations
[1] Hiroshima Univ, Grad Sch Engn, Dept Informat Engn, Higashihiroshima 7398527, Japan
Source
IEEE ACCESS | 2015, Vol. 3
Keywords
Dynamic power management; power-aware control; Markov decision process; Markovian arrival process; ENERGY;
DOI
10.1109/ACCESS.2015.2508601
CLC Number (Chinese Library Classification)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Dynamic power management (DPM) plays a significant role in reducing power consumption in both the design and operational phases of computer-based systems. It is well known that state-dependent control policies, which monitor the energy state of each component or of the whole system, are effective for power saving in server systems whose system state, such as the transaction request stream, can be completely observed. In this paper, we consider an optimal power-aware design for a cluster system and formulate the DPM problem as a Markov decision process. We derive the dynamic programming equation for the optimal control policy, which maximizes the expected reward per unit of electrical power (the power effectiveness), and give a policy iteration algorithm that determines the optimal control policy sequentially. In numerical experiments, we show the optimal control policy for an example cluster system with two service nodes, where the arrival stream of transaction requests is described by a Markov modulated Poisson process. In addition, using access data from an enterprise system, the optimal power-aware control for the cluster system and its effectiveness are examined.
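The abstract describes formulating DPM as a Markov decision process and solving it with a policy iteration algorithm. The sketch below is a minimal, self-contained illustration of policy iteration on a hypothetical three-state cluster model: every state, transition probability, reward, and cost value is an assumption made here for exposition, and the simple discounted reward-minus-power objective stands in for the paper's power-effectiveness (reward per unit power) criterion and MMPP arrival model rather than reproducing them.

```python
import numpy as np

# Toy policy-iteration sketch for a dynamic-power-management MDP.
# State s in {0, 1, 2} counts powered-on service nodes; actions are
# 0 = power one node down, 1 = keep, 2 = power one node up.
# All numbers below (transition probabilities, revenue, power cost,
# switching penalty, discount factor) are illustrative assumptions,
# not values from the paper.

N_STATES, N_ACTIONS, GAMMA = 3, 3, 0.95

def target(s, a):
    """State the action tries to move to."""
    return max(s - 1, 0) if a == 0 else (s if a == 1 else min(s + 1, N_STATES - 1))

# P[a, s, s']: the requested switch succeeds with probability 0.9 (assumed).
P = np.zeros((N_ACTIONS, N_STATES, N_STATES))
for a in range(N_ACTIONS):
    for s in range(N_STATES):
        P[a, s, target(s, a)] += 0.9
        P[a, s, s] += 0.1

# R[a, s]: revenue per active node minus power cost and a switching penalty.
REVENUE, POWER_COST, SWITCH_COST = 2.0, 1.2, 0.3
R = np.array([[REVENUE * s - POWER_COST * s - (SWITCH_COST if a != 1 else 0.0)
               for s in range(N_STATES)]
              for a in range(N_ACTIONS)])

def policy_iteration(P, R, gamma=GAMMA):
    policy = np.zeros(N_STATES, dtype=int)
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) v = r_pi exactly.
        P_pi = np.array([P[policy[s], s] for s in range(N_STATES)])
        r_pi = np.array([R[policy[s], s] for s in range(N_STATES)])
        v = np.linalg.solve(np.eye(N_STATES) - gamma * P_pi, r_pi)
        # Policy improvement: act greedily with respect to v.
        q = R + gamma * np.einsum("ast,t->as", P, v)
        new_policy = q.argmax(axis=0)
        if np.array_equal(new_policy, policy):
            return policy, v
        policy = new_policy

if __name__ == "__main__":
    pi, v = policy_iteration(P, R)
    print("greedy action per state:", pi)
    print("state values:", v)
```

Running the script prints a greedy node-count policy and state values for this toy model; the paper's actual algorithm additionally conditions on the phase of the Markovian arrival process when choosing how many nodes to keep powered.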
Pages: 3039-3047
Page count: 9