POLSTM: Poplar optimization-based long short term memory model for resource allocation in cloud environment

Cited: 0
Authors
Samuel, Prithi [1 ]
Vinothini, Arumugham [2 ]
Kanniappan, Jayashree [3 ]
Affiliations
[1] SRM Inst Sci & Technol, Sch Comp, Dept Computat Intelligence, Kattankulathur Campus, Chennai, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai 600127, Tamil Nadu, India
[3] Panimalar Engn Coll, Dept Artificial Intelligence & Data Sci, Chennai, India
Keywords
Internet of things; Mobile edge computing; Cloud; Optimization; Fog computing; Task scheduling; WORKLOAD ALLOCATION; INTERNET; ENERGY;
DOI
10.1016/j.comcom.2023.08.008
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Due to the evolution of the Internet of Things (IoT) paradigm, the number of devices is growing day by day, which gives rise to stringent communication requirements. The standard cloud computing model is not capable of efficiently hosting IoT tasks due to its high latency. Hence, this work utilizes mobile edge computing (MEC) and the fog concept to process input tasks independently of the cloud layer. In general, power-hungry and computation-intensive applications incur high computation costs and energy consumption, which become massive challenges for an IoT device. Motivated by previous research, we mainly investigate the resource allocation issues that arise in the MEC-enabled IoT-Fog-cloud architecture. We propose a Poplar Optimization Algorithm (POA) combined with an Attention 1DCNN-LSTM architecture (POA-A1DCNN-LSTM) to improve the total utility of the MEC servers by optimizing energy consumption and task delay. Initially, the POA algorithm is implemented for cluster head (optimal fog node) selection using node degree, node distance, and residual energy. Next, the A1DCNN-LSTM architecture is employed for task offloading by selecting the fog node with minimal task length and processing delay. The performance of the proposed method is validated in terms of average latency, user satisfaction, network lifetime, energy consumption, average time delay, and normalized system utility. The experimental results reveal that the proposed method attains better effectiveness on these metrics, achieving 1.5 ms, 0.65, 60 s, 0.45 J, 385 ms, and 0.98, respectively, compared to state-of-the-art methods. The experimental outcomes demonstrate the effectiveness of the proposed POA-A1DCNN-LSTM architecture in decreasing task completion delay and offering effective task scheduling and resource allocation performance in the MEC-enabled IoT-Fog-cloud network.
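As a reading aid only, the following Python sketch illustrates the cluster-head (optimal fog node) selection criterion summarized in the abstract, which ranks candidate nodes by node degree, node distance, and residual energy. The FogNode fields, the weighted-sum fitness, and the weights are illustrative assumptions and do not reproduce the paper's actual Poplar Optimization formulation.

from dataclasses import dataclass

@dataclass
class FogNode:
    node_id: int
    degree: int             # number of neighbouring IoT/fog nodes (assumed field)
    distance: float         # distance to the requesting devices (assumed field)
    residual_energy: float  # remaining energy budget (assumed field)

def fitness(node: FogNode, max_degree: int, max_distance: float, max_energy: float,
            w_deg: float = 0.3, w_dist: float = 0.3, w_energy: float = 0.4) -> float:
    # Hypothetical weighted-sum score: higher is better, preferring
    # well-connected, nearby, energy-rich nodes. Weights are illustrative.
    return (w_deg * node.degree / max_degree
            + w_dist * (1.0 - node.distance / max_distance)
            + w_energy * node.residual_energy / max_energy)

def select_cluster_head(nodes: list[FogNode]) -> FogNode:
    # Normalize each criterion against the candidate set and pick the best score.
    max_degree = max(n.degree for n in nodes)
    max_distance = max(n.distance for n in nodes)
    max_energy = max(n.residual_energy for n in nodes)
    return max(nodes, key=lambda n: fitness(n, max_degree, max_distance, max_energy))

if __name__ == "__main__":
    candidates = [
        FogNode(1, degree=5, distance=120.0, residual_energy=0.8),
        FogNode(2, degree=3, distance=60.0, residual_energy=0.9),
        FogNode(3, degree=7, distance=200.0, residual_energy=0.4),
    ]
    head = select_cluster_head(candidates)
    print(f"Selected cluster head: node {head.node_id}")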
Pages: 11-23
Page count: 13
Related Papers (50 total)
  • [41] A resource auction based allocation mechanism in the cloud computing environment
    Wang, Xingwei
    Sun, Jiajia
    Huang, Min
    Wu, Chuan
    Wang, Xueyi
    2012 IEEE 26TH INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM WORKSHOPS & PHD FORUM (IPDPSW), 2012, : 2111 - 2115
  • [42] Access control based resource allocation in cloud computing environment
    Wang J.
    Liu J.
    Zhang H.
Liu, Jinliang (836251714@qq.com), 1600, Femto Technique Co., Ltd. (19): 236 - 243
  • [43] Resource Allocation Based on Prospect Theory in Cloud Manufacturing Environment
    He, Wei
    Luan, Shichao
    Jia, Guozhu
    Zong, Hengshan
    2017 INTERNATIONAL CONFERENCE ON COMPUTER SYSTEMS, ELECTRONICS AND CONTROL (ICCSEC), 2017, : 1329 - 1332
  • [44] Application type based resource allocation strategy in cloud environment
    Peng, Jun-jie
    Zhi, Xiao-fei
    Xie, Xiao-lan
    MICROPROCESSORS AND MICROSYSTEMS, 2016, 47 : 385 - 391
  • [45] Statistical Model Checking-Based Evaluation and Optimization for Cloud Workflow Resource Allocation
    Chen, Mingsong
    Huang, Saijie
    Fu, Xin
    Liu, Xiao
    He, Jifeng
    IEEE TRANSACTIONS ON CLOUD COMPUTING, 2020, 8 (02) : 443 - 458
  • [46] Metaheuristic Optimization-Based Resource Allocation Technique for Cybertwin-Driven 6G on IoE Environment
    Jain, Deepak Kumar
    Tyagi, Sumarga Kumar Sah
    Neelakandan, Subramani
    Prakash, Mohan
    Natrayan, Lakshmaiya
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2022, 18 (07) : 4884 - 4892
  • [47] Resource Usage Prediction of Cloud Workloads using Deep Bidirectional Long Short Term Memory Networks
    Gupta, Shaifu
    Dinesh, Dileep Aroor
    2017 IEEE INTERNATIONAL CONFERENCE ON ADVANCED NETWORKS AND TELECOMMUNICATIONS SYSTEMS (ANTS), 2017,
  • [48] The optimization of virtual resource allocation in cloud computing based on RBPSO
    Wang, Xiaohui
    Gu, Haoran
    Yue, YuXian
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2020, 32 (16):
  • [49] Dynamic Optimization Long Short-Term Memory Model Based on Data Preprocessing for Short-Term Traffic Flow Prediction
    Zhang, Yang
    Xin, Dongrong
    IEEE ACCESS, 2020, 8 : 91510 - 91520
  • [50] A Secure and Fair Resource Allocation Model under Hybrid Cloud Environment
    Zhao, Lei
    Wang, Fu
    Fan, Kaikai
    2014 2ND INTERNATIONAL CONFERENCE ON SYSTEMS AND INFORMATICS (ICSAI), 2014, : 969 - 973