An autonomous architecture based on reinforcement deep neural network for resource allocation in cloud computing

Cited by: 3
Authors
Javaheri, Seyed Danial Alizadeh [1 ]
Ghaemi, Reza [2 ]
Naeen, Hossein Monshizadeh [1 ]
Affiliations
[1] Islamic Azad Univ, Dept Comp Engn, Neyshabur Branch, Neyshabur, Iran
[2] Islamic Azad Univ, Dept Comp Engn, Quchan Branch, Quchan, Iran
Keywords
Cloud computing; Resource allocation; Autonomous system; Deep reinforcement neural network; Fog; Management; IoT; Paradigm; Algorithms
DOI
10.1007/s00607-023-01220-7
Chinese Library Classification
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
Cloud computing technology has attracted the attention of many researchers. Because users expect requests to be executed quickly and with high quality of service, optimal resource allocation and task scheduling across virtual machines in cloud computing are of great importance. One of the key challenges cloud service providers face is managing resources effectively on the underlying physical infrastructure. This paper therefore proposes an autonomous resource-allocation system for fog-cloud computing infrastructure based on the Clipped Double Deep Q-Learning (CDDQL) algorithm and the meta-heuristic Particle Swarm Optimization (PSO). PSO is used to prioritize tasks, and CDDQL serves as the core of the autonomous system (Auto-CDDQL), assigning the required VM resources to the tasks. The proposed Auto-CDDQL is deployed in the fog layer and performs this process autonomously. Evaluation shows that, on the c-hilo dataset, the proposed Auto-CDDQL significantly improves makespan, response time, task completion, resource utilization, and energy consumption compared with the FCFS, RR, and PBTS methods.
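For illustration, the following is a minimal, self-contained sketch of the Clipped Double Q-Learning update that the CDDQL core builds on, applied to a toy task-to-VM assignment problem. It is not the authors' implementation: the paper uses deep networks together with PSO-based task prioritization, whereas the state encoding, reward model, and environment below are hypothetical placeholders chosen only to show how the clipped target (the minimum of two Q-value estimates) is formed to curb overestimation.

```python
# Illustrative sketch only (assumed toy setup, not the paper's CDDQL/PSO pipeline):
# tabular Clipped Double Q-Learning for assigning an incoming task to one of N_VMS VMs.
import numpy as np

rng = np.random.default_rng(0)

N_STATES = 8      # hypothetical discretised load/queue conditions
N_VMS = 4         # actions: which VM receives the next task
GAMMA, ALPHA, EPS = 0.95, 0.1, 0.1

Q1 = np.zeros((N_STATES, N_VMS))
Q2 = np.zeros((N_STATES, N_VMS))

def step(state, vm):
    """Hypothetical environment: placeholder reward and transition model."""
    reward = -abs(vm - state % N_VMS)        # assumed cost of a poor VM choice
    next_state = rng.integers(N_STATES)      # assumed random next condition
    return reward, next_state

state = rng.integers(N_STATES)
for _ in range(5000):
    # epsilon-greedy action from the averaged estimate
    if rng.random() < EPS:
        vm = rng.integers(N_VMS)
    else:
        vm = int(np.argmax(Q1[state] + Q2[state]))
    reward, next_state = step(state, vm)

    # Clipped Double Q-Learning target: greedy action chosen by Q1,
    # value taken as the minimum of the two estimates to reduce overestimation.
    a_star = int(np.argmax(Q1[next_state]))
    target = reward + GAMMA * min(Q1[next_state, a_star], Q2[next_state, a_star])

    # update one of the two estimators at random
    if rng.random() < 0.5:
        Q1[state, vm] += ALPHA * (target - Q1[state, vm])
    else:
        Q2[state, vm] += ALPHA * (target - Q2[state, vm])
    state = next_state

print("Greedy VM choice per state:", np.argmax(Q1 + Q2, axis=1))
```

In the deep variant described in the abstract, the two tables would be replaced by neural networks and the state by a learned representation of task and VM features; the clipped-minimum target, however, takes the same form.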
Pages: 371-403
Page count: 33