Multifactorial Evolutionary Algorithm Based on Diffusion Gradient Descent

Cited by: 8
Authors
Liu, Zhaobo [1 ]
Li, Guo [2 ]
Zhang, Haili [3 ]
Liang, Zhengping [2 ]
Zhu, Zexuan [4 ,5 ,6 ]
Affiliations
[1] Shenzhen Univ, Inst Adv Study, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[2] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[3] Shenzhen Polytech, Inst Appl Math, Shenzhen 518055, Peoples R China
[4] Shenzhen Univ, Natl Engn Lab Big Data Syst Comp Technol, Shenzhen 518060, Peoples R China
[5] Shenzhen Pengcheng Lab, Shenzhen 518055, Peoples R China
[6] BGI Shenzhen, Shenzhen 518083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Optimization; Convergence; Statistics; Sociology; Knowledge transfer; Costs; Convergence analysis; diffusion gradient descent (DGD); evolutionary multitasking (EMT); multifactorial evolutionary algorithm (MFEA); MULTITASKING; OPTIMIZATION; LMS;
DOI
10.1109/TCYB.2023.3270904
Chinese Library Classification code
TP [automation technology; computer technology];
Discipline classification code
0812;
Abstract
The multifactorial evolutionary algorithm (MFEA) is one of the most widely used evolutionary multitasking (EMT) algorithms. MFEA implements knowledge transfer among optimization tasks via crossover and mutation operators, and it obtains high-quality solutions more efficiently than single-task evolutionary algorithms. Despite the effectiveness of MFEA in solving difficult optimization problems, there has been no proof of population convergence and no theoretical explanation of how knowledge transfer improves algorithm performance. To fill this gap, we propose in this article a new MFEA based on diffusion gradient descent (DGD), namely MFEA-DGD. We prove the convergence of DGD for multiple similar tasks and demonstrate that the local convexity of some tasks can help other tasks escape from local optima via knowledge transfer. On this theoretical foundation, we design complementary crossover and mutation operators for the proposed MFEA-DGD. As a result, the evolving population is governed by a dynamic equation similar to that of DGD; that is, convergence is guaranteed and the benefit of knowledge transfer is explainable. In addition, a hyper-rectangular search strategy is introduced to allow MFEA-DGD to explore more underdeveloped areas in the unified search space of all tasks and in the subspace of each task. The proposed MFEA-DGD is verified experimentally on various multitask optimization problems, and the results demonstrate that MFEA-DGD converges faster to competitive results than state-of-the-art EMT algorithms. We also show how the experimental results can be interpreted in terms of the convexity of the different tasks.
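The diffusion gradient descent underlying the abstract's convergence argument can be illustrated with a minimal sketch. This is not code from the paper: the two quadratic tasks, the step size, and the doubly stochastic combination matrix below are illustrative assumptions. Each agent first takes a local gradient step ("adapt") and then averages intermediate iterates with its neighbor ("combine"), which plays the role of knowledge transfer between similar tasks.

```python
import numpy as np

def grad_f1(x):
    # task 1: quadratic with minimizer at x = 1.0 (assumed for illustration)
    return 2.0 * (x - 1.0)

def grad_f2(x):
    # task 2: similar quadratic with minimizer at x = 1.2
    return 2.0 * (x - 1.2)

# doubly stochastic combination matrix: each agent mixes its own
# intermediate iterate with its neighbor's (knowledge transfer)
A = np.array([[0.7, 0.3],
              [0.3, 0.7]])

x = np.array([5.0, -5.0])   # initial iterates of the two agents
mu = 0.1                    # step size

for _ in range(200):
    # adapt: each agent takes a local gradient step on its own task
    psi = x - mu * np.array([grad_f1(x[0]), grad_f2(x[1])])
    # combine: neighbors average their intermediate iterates
    x = A @ psi

# both agents settle near a consensus point between the two
# minimizers (around 1.1), illustrating how similar tasks pull
# each other toward a shared good region
```

Because the tasks have different minimizers, the fixed point carries a small bias toward the average of the two optima; for identical tasks the agents would converge exactly to the common minimizer, which is the regime the convergence proof addresses.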
Pages: 4267-4279
Page count: 13
Related Papers
50 records in total
  • [21] Ren J., Wang Q., Li W., Liu Y., Yi X. A prediction algorithm of collection efficiency based on gradient descent method. Hangkong Xuebao/Acta Aeronautica et Astronautica Sinica, 2023, 44(04).
  • [22] Fan R., Wang L., Liu M., Pitas I. A robust roll angle estimation algorithm based on gradient descent. 2019 27th European Signal Processing Conference (EUSIPCO), 2019.
  • [23] Zou X., Zhao X., Feng Y. An efficient medical image registration algorithm based on gradient descent. 2007 IEEE/ICME International Conference on Complex Medical Engineering, Vols 1-4, 2007: 636-639.
  • [24] Tian Y., Wang X., Sun Y., Wang D., Wu C., Bai W., Xia J., Du Q. Specular point calculation based on modified gradient descent algorithm. IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium, 2018: 1047-1050.
  • [25] Deng Y., Li X., He J., Liu Y., Liang W. A stochastic gradient descent algorithm based on adaptive differential privacy. Collaborative Computing: Networking, Applications and Worksharing, CollaborateCom 2022, Pt II, 2022, 461: 133-152.
  • [26] Mandic D.P. A generalized normalized gradient descent algorithm. IEEE Signal Processing Letters, 2004, 11(02): 115-118.
  • [27] Dong X., Zhou D.-X. Learning gradients by a gradient descent algorithm. Journal of Mathematical Analysis and Applications, 2008, 341(02): 1018-1027.
  • [28] Mercier Q., Poirion F., Desideri J.-A. A stochastic multiple gradient descent algorithm. European Journal of Operational Research, 2018, 271(03): 808-817.
  • [29] Phan Thi Hong Hanh, Pham Dinh Thanh, Huynh Thi Thanh Binh. Evolutionary algorithm and multifactorial evolutionary algorithm on clustered shortest-path tree problem. Information Sciences, 2021, 553: 280-304.
  • [30] Chai P., Cao L., Xu R., Zeng Y. A parametric segmented multifactorial evolutionary algorithm based on a three-phase analysis. Applied Intelligence, 2023, 53: 25605-25625.