SRA-E-ABCO: terminal task offloading for cloud-edge-end environments

Cited by: 0
Authors
Shun Jiao
Haiyan Wang
Jian Luo
Affiliations
[1] Nanjing University of Posts and Telecommunications, School of Computer Science
[2] Jiangsu Key Laboratory of Big Data Security and Intelligent Processing
Source
Journal of Cloud Computing: Advances, Systems and Applications, 2024, 13 (01)
Keywords
Cloud-edge-end; Terminal device; Service reliability; Bee colony algorithm; Task offloading
DOI
Not available
Abstract
The rapid development of Internet technology, together with the emergence of intelligent applications, has raised the requirements for task offloading. In Cloud-Edge-End (CEE) environments, offloading the computing tasks of terminal devices to edge and cloud servers can effectively reduce system delay and alleviate network congestion. Designing a reliable task offloading strategy in CEE environments that meets users' requirements remains a challenging problem. To this end, a Service Reliability Analysis and Elite-Artificial Bee Colony Offloading model (SRA-E-ABCO) is presented for cloud-edge-end environments. Specifically, a Service Reliability Analysis (SRA) method is proposed to help predict the offloading necessity of terminal tasks and to analyze the attributes of terminal devices and edge nodes. An Elite Artificial Bee Colony Offloading (E-ABCO) method is also proposed, which optimizes the offloading strategy by combining elite populations with improved fitness formulas, position update formulas, and population initialization methods. Simulation results on real datasets validate the efficiency of the proposed scheme, which not only reduces task offloading delay but also lowers system overhead compared with baseline schemes.
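The abstract describes an elite-guided artificial bee colony search over per-task offloading decisions. As a rough illustration only, the sketch below implements a generic elite-biased ABC loop over placement decisions (local/edge/cloud). The delay model, fitness function, position-update rule, and all parameter values (SPEED, RATE, colony_size, elite_size, etc.) are illustrative assumptions, not the SRA or E-ABCO formulations from the paper.

```python
import random

# Hypothetical problem setup: N terminal tasks, each assigned to one of three
# execution sites (0 = local device, 1 = edge node, 2 = cloud server).
N_TASKS = 10
DATA_MB = [random.uniform(1, 20) for _ in range(N_TASKS)]  # task input sizes (MB)
WORKLOAD = [d * 0.5 for d in DATA_MB]                      # required compute (arbitrary units)

# Illustrative per-site parameters (not from the paper).
SPEED = {0: 1.0, 1: 5.0, 2: 20.0}  # relative processing speed
RATE = {1: 10.0, 2: 4.0}           # uplink rate in MB/s; local execution needs no transfer

def total_delay(assignment):
    """Simplified system delay: transmission delay plus computation delay per task."""
    delay = 0.0
    for i, site in enumerate(assignment):
        tx = 0.0 if site == 0 else DATA_MB[i] / RATE[site]
        delay += tx + WORKLOAD[i] / SPEED[site]
    return delay

def fitness(assignment):
    """Higher is better; a stand-in for the paper's improved fitness formula."""
    return 1.0 / (1.0 + total_delay(assignment))

def random_solution():
    return [random.choice([0, 1, 2]) for _ in range(N_TASKS)]

def neighbor(sol, guide):
    """Position update: nudge one decision toward an elite guide or a random site."""
    new = sol[:]
    k = random.randrange(N_TASKS)
    new[k] = guide[k] if random.random() < 0.5 else random.choice([0, 1, 2])
    return new

def elite_abc(colony_size=20, elite_size=3, limit=10, iterations=100):
    foods = [random_solution() for _ in range(colony_size)]
    trials = [0] * colony_size
    for _ in range(iterations):
        # Elite population: the current best solutions guide the search.
        elites = sorted(foods, key=fitness, reverse=True)[:elite_size]
        # Employed-bee phase: each food source explores near an elite guide.
        for i in range(colony_size):
            cand = neighbor(foods[i], random.choice(elites))
            if fitness(cand) > fitness(foods[i]):
                foods[i], trials[i] = cand, 0
            else:
                trials[i] += 1
        # Onlooker-bee phase: probabilistically reinforce good sources.
        probs = [fitness(f) for f in foods]
        for _ in range(colony_size):
            i = random.choices(range(colony_size), weights=probs)[0]
            cand = neighbor(foods[i], random.choice(elites))
            if fitness(cand) > fitness(foods[i]):
                foods[i], trials[i] = cand, 0
            else:
                trials[i] += 1
        # Scout phase: abandon exhausted sources and re-initialize them.
        for i in range(colony_size):
            if trials[i] > limit:
                foods[i], trials[i] = random_solution(), 0
    return max(foods, key=fitness)

best = elite_abc()
print("best assignment:", best, "| total delay:", round(total_delay(best), 2))
```

In this toy form, the elite bias simply steers position updates toward the current best solutions; the paper additionally modifies population initialization and the fitness and position-update formulas, and uses the SRA step to decide whether a task needs offloading at all.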
Related papers (50 records)
  • [1] SRA-E-ABCO: terminal task offloading for cloud-edge-end environments
    Jiao, Shun
    Wang, Haiyan
    Luo, Jian
    [J]. JOURNAL OF CLOUD COMPUTING-ADVANCES SYSTEMS AND APPLICATIONS, 2024, 13 (01)
  • [2] Task Offloading Strategy in Mobile Edge Computing Based on Cloud-Edge-End Cooperation
    Zhang W.
    Yu J.
    [J]. Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2023, 60 (02): 371 - 385
  • [3] Collaborative cloud-edge-end task offloading with task dependency based on deep reinforcement learning
    Tang, Tiantian
    Li, Chao
    Liu, Fagui
    [J]. COMPUTER COMMUNICATIONS, 2023, 209 : 78 - 90
  • [4] Emergency task offloading strategy based on cloud-edge-end collaboration for smart factories
    Qu, Xiaofeng
    Wang, Huiqiang
    [J]. COMPUTER NETWORKS, 2023, 234
  • [5] Blockchain-Secured Task Offloading and Resource Allocation for Cloud-Edge-End Cooperative Networks
    Fan, Wenhao
    [J]. IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (08) : 8092 - 8110
  • [6] DRL-Based Green Task Offloading for Content Distribution in NOMA-Enabled Cloud-Edge-End Cooperation Environments
    Fang, Chao
    Meng, Xiangheng
    Hu, Zhaoming
    Yang, Xiaoping
    Xu, Fangmin
    Li, Peng
    Dong, Mianxiong
    [J]. ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 6126 - 6131
  • [7] Collaborative Cloud-Edge-End Task Offloading in Mobile-Edge Computing Networks With Limited Communication Capability
    Kai, Caihong
    Zhou, Hao
    Yi, Yibo
    Huang, Wei
    [J]. IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2021, 7 (02) : 624 - 634
  • [8] Cloud-Edge-End Collaborative Task Offloading in Vehicular Edge Networks: A Multilayer Deep Reinforcement Learning Approach
    Wu, Jiaqi
    Tang, Ming
    Jiang, Changkun
    Gao, Lin
    Cao, Bin
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (22) : 36272 - 36290
  • [9] Time-Slotted Task Offloading and Resource Allocation for Cloud-Edge-End Cooperative Computing Networks
    Fan, Wenhao
    Liu, Xun
    Yuan, Hao
    Li, Nan
    Liu, Yuan'an
    [J]. IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (08) : 8225 - 8241
  • [10] Collaborative Cloud-Edge-End Task Offloading in NOMA-Enabled Mobile Edge Computing Using Deep Learning
    Du, RuiZhong
    Liu, Cui
    Gao, Yan
    Hao, PengNan
    Wang, ZiYuan
    [J]. JOURNAL OF GRID COMPUTING, 2022, 20 (02)