SRA-E-ABCO: terminal task offloading for cloud-edge-end environments

Cited by: 0
Authors
Shun Jiao
Haiyan Wang
Jian Luo
Affiliations
[1] Nanjing University of Posts and Telecommunications, School of Computer Science
[2] Jiangsu Key Laboratory of Big Data Security and Intelligent Processing
Keywords
Cloud-edge-end; Terminal device; Service reliability; Bee colony algorithm; Task offloading
DOI: not available
Abstract
The rapid development of Internet technology, together with the emergence of intelligent applications, has placed higher demands on task offloading. In Cloud-Edge-End (CEE) environments, offloading the computing tasks of terminal devices to edge and cloud servers can effectively reduce system delay and alleviate network congestion. Designing a reliable task offloading strategy in CEE environments that meets users' requirements is a challenging problem. To design an effective offloading strategy, a Service Reliability Analysis and Elite-Artificial Bee Colony Offloading model (SRA-E-ABCO) is presented for cloud-edge-end environments. Specifically, a Service Reliability Analysis (SRA) method is proposed to help predict whether terminal tasks need to be offloaded and to analyze the attributes of terminal devices and edge nodes. An Elite Artificial Bee Colony Offloading (E-ABCO) method is also proposed, which optimizes the offloading strategy by combining elite populations with improved fitness formulas, position update formulas, and population initialization methods. Simulation results on real datasets validate the efficient performance of the proposed scheme, which not only reduces task offloading delay but also optimizes system overhead in comparison with baseline schemes.
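The abstract describes E-ABCO as an artificial bee colony search over offloading decisions that retains an elite population and modifies the fitness, position-update, and initialization rules. The listing below is a minimal Python sketch of a generic elite-retaining ABC over a discrete offloading vector, not the paper's actual E-ABCO: the cost table, elite count, abandonment limit, and discrete position-update rule are illustrative assumptions introduced here for concreteness.

# Minimal sketch of an elite-retaining artificial bee colony (ABC) search for a
# discrete task-offloading plan. Illustrative only; the cost model, elite ratio,
# and update rule below are assumptions, not the paper's E-ABCO formulas.
# Each solution assigns every task to 0 = local, 1 = edge, 2 = cloud.
import random

NUM_TASKS   = 20    # hypothetical number of terminal tasks
NUM_FOODS   = 30    # food sources (candidate offloading plans)
ELITE_COUNT = 3     # elite solutions protected each cycle (assumed)
LIMIT       = 15    # abandonment limit before a scout replaces a source
CYCLES      = 200

# Hypothetical per-task delays: (local, edge, cloud)
COSTS = [(random.uniform(2, 6), random.uniform(1, 3), random.uniform(1.5, 4))
         for _ in range(NUM_TASKS)]

def cost(plan):
    # Total delay of an offloading plan under the assumed cost table.
    return sum(COSTS[i][choice] for i, choice in enumerate(plan))

def fitness(plan):
    # Standard ABC fitness transform: higher is better.
    return 1.0 / (1.0 + cost(plan))

def neighbor(plan, partner):
    # Discrete analogue of the ABC position update: one dimension is either
    # copied from a partner solution or randomly re-assigned.
    new = plan[:]
    d = random.randrange(NUM_TASKS)
    new[d] = partner[d] if random.random() < 0.5 else random.randrange(3)
    return new

foods = [[random.randrange(3) for _ in range(NUM_TASKS)] for _ in range(NUM_FOODS)]
trials = [0] * NUM_FOODS

for _ in range(CYCLES):
    # Employed bees: greedy local search around each food source.
    for i in range(NUM_FOODS):
        cand = neighbor(foods[i], foods[random.randrange(NUM_FOODS)])
        if fitness(cand) > fitness(foods[i]):
            foods[i], trials[i] = cand, 0
        else:
            trials[i] += 1

    # Onlooker bees: re-exploit sources with probability proportional to fitness.
    fits = [fitness(f) for f in foods]
    for _ in range(NUM_FOODS):
        i = random.choices(range(NUM_FOODS), weights=fits)[0]
        cand = neighbor(foods[i], foods[random.randrange(NUM_FOODS)])
        if fitness(cand) > fits[i]:
            foods[i], fits[i], trials[i] = cand, fitness(cand), 0

    # Elite retention (assumed mechanism): the best solutions are never abandoned.
    elite_idx = set(sorted(range(NUM_FOODS), key=lambda i: fits[i], reverse=True)[:ELITE_COUNT])

    # Scout bees: re-initialize exhausted, non-elite sources.
    for i in range(NUM_FOODS):
        if trials[i] > LIMIT and i not in elite_idx:
            foods[i] = [random.randrange(3) for _ in range(NUM_TASKS)]
            trials[i] = 0

best = max(foods, key=fitness)
print("best offloading plan:", best)
print("total delay:", round(cost(best), 2))

In this sketch the elite indices are simply exempt from scout re-initialization; the paper's actual elite population handling, fitness formula, and position update should be taken from the original text.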
Related papers (50 in total)
  • [21] Task Offloading for End-Edge-Cloud Orchestrated Computing in Mobile Networks
    Sun, Chuan
    Li, Hui
    Li, Xiuhua
    Wen, Junhao
    Xiong, Qingyu
    Wang, Xiaofei
    Leung, Victor C. M.
    2020 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC), 2020,
  • [22] Multi-information based cloud-edge-end collaborative computational tasks offloading for industrial IoT systems
    Wu, Xiaoge
    PHYSICAL COMMUNICATION, 2024, 66
  • [23] Cloud-Edge-End Collaborative Intelligent Service Computation Offloading: A Digital Twin Driven Edge Coalition Approach for Industrial IoT
    Li, Xiaohuan
    Chen, Bitao
    Fan, Junchuan
    Kang, Jiawen
    Ye, Jin
    Wang, Xun
    Niyato, Dusit
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2024, 21 (06): 6318 - 6330
  • [24] Cloud-edge-end workflow scheduling with multiple privacy levels
    Wang, Shuang
    Yuan, Zian
    Zhang, Xiaodong
    Wu, Jiawen
    Wang, Yamin
    JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2024, 189
  • [25] Research on Cloud-Edge-End Collaborative Computing Offloading Strategy in the Internet of Vehicles Based on the M-TSA Algorithm
    Xu, Qiliang
    Zhang, Guo
    Wang, Jianping
    SENSORS, 2023, 23 (10)
  • [26] A DRL-Driven Intelligent Optimization Strategy for Resource Allocation in Cloud-Edge-End Cooperation Environments
    Fang, Chao
    Zhang, Tianyi
    Huang, Jingjing
    Xu, Hang
    Hu, Zhaoming
    Yang, Yihui
    Wang, Zhuwei
    Zhou, Zequan
    Luo, Xiling
    SYMMETRY-BASEL, 2022, 14 (10):
  • [27] Task Offloading in Cloud-Edge Environments: A Deep-Reinforcement-Learning-Based Solution
    Wang, Suzhen
    Deng, Yongchen
    Hu, Zhongbo
    INTERNATIONAL JOURNAL OF DIGITAL CRIME AND FORENSICS, 2023, 15 (01)
  • [28] On the Optimality of Task Offloading in Mobile Edge Computing Environments
    Alghamdi, Ibrahim
    Anagnostopoulos, Christos
    Pezaros, Dimitrios P.
    2019 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2019,
  • [29] Task offloading for vehicular edge computing with edge-cloud cooperation
    Dai, Fei
    Liu, Guozhi
    Mo, Qi
    Xu, WeiHeng
    Huang, Bi
    WORLD WIDE WEB, 2022, 25: 1999 - 2017
  • [30] Correction to: Task offloading for vehicular edge computing with edge-cloud cooperation
    Dai, Fei
    Liu, Guozhi
    Mo, Qi
    Xu, WeiHeng
    Huang, Bi
    WORLD WIDE WEB, 2023, 26: 633 - 633