Multifactorial Evolutionary Algorithm Based on Diffusion Gradient Descent

Cited by: 8
|
Authors
Liu, Zhaobo [1 ]
Li, Guo [2 ]
Zhang, Haili [3 ]
Liang, Zhengping [2 ]
Zhu, Zexuan [4 ,5 ,6 ]
Institutions
[1] Shenzhen Univ, Inst Adv Study, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[2] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[3] Shenzhen Polytech, Inst Appl Math, Shenzhen 518055, Peoples R China
[4] Shenzhen Univ, Natl Engn Lab Big Data Syst Comp Technol, Shenzhen 518060, Peoples R China
[5] Shenzhen Pengcheng Lab, Shenzhen 518055, Peoples R China
[6] BGI Shenzhen, Shenzhen 518083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Optimization; Convergence; Statistics; Sociology; Knowledge transfer; Costs; Convergence analysis; diffusion gradient descent (DGD); evolutionary multitasking (EMT); multifactorial evolutionary algorithm (MFEA); MULTITASKING; OPTIMIZATION; LMS;
DOI
10.1109/TCYB.2023.3270904
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
The multifactorial evolutionary algorithm (MFEA) is one of the most widely used evolutionary multitasking (EMT) algorithms. The MFEA implements knowledge transfer among optimization tasks via crossover and mutation operators, and it obtains high-quality solutions more efficiently than single-task evolutionary algorithms. Despite the effectiveness of MFEA in solving difficult optimization problems, there is no proof of population convergence or theoretical explanation of how knowledge transfer improves algorithm performance. To fill this gap, we propose a new MFEA based on diffusion gradient descent (DGD), namely MFEA-DGD, in this article. We prove the convergence of DGD for multiple similar tasks and demonstrate that the local convexity of some tasks can help other tasks escape from local optima via knowledge transfer. Based on this theoretical foundation, we design complementary crossover and mutation operators for the proposed MFEA-DGD. As a result, the evolving population is governed by a dynamic equation similar to that of DGD; that is, convergence is guaranteed, and the benefit of knowledge transfer is explainable. In addition, a hyper-rectangular search strategy is introduced to allow MFEA-DGD to explore underdeveloped areas in the unified express space of all tasks and the subspace of each task. The proposed MFEA-DGD is verified experimentally on various multitask optimization problems, and the results demonstrate that it converges faster to competitive results than state-of-the-art EMT algorithms. We also show the possibility of interpreting the experimental results based on the convexity of different tasks.
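The knowledge-transfer dynamics the abstract describes can be illustrated with a minimal sketch of diffusion gradient descent itself (not the authors' MFEA-DGD implementation). In the standard adapt-then-combine form of DGD, each task keeps its own iterate, takes a local gradient step, and then mixes its iterate with the others' through a combination matrix; the mixing is what lets a well-conditioned task pull a coupled task toward good regions. All function names, step sizes, and combination weights below are illustrative choices, not values from the paper.

```python
import numpy as np

# Illustrative sketch: adapt-then-combine diffusion gradient descent (DGD)
# on two similar quadratic tasks. The combination step models the
# knowledge transfer between tasks that MFEA-DGD's population dynamics
# are designed to mimic.

def grad(x, center):
    # Gradient of the quadratic task f(x) = 0.5 * ||x - center||^2
    return x - center

centers = [np.array([1.0, 1.0]), np.array([1.2, 0.8])]  # similar optima
x = [np.zeros(2), np.zeros(2)]   # each task's iterate
step = 0.1                       # local gradient step size (assumed)
# Doubly stochastic combination matrix: self-weight 0.8, transfer weight 0.2
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])

for _ in range(200):
    # Adapt: each task performs a local gradient descent step
    psi = [x[k] - step * grad(x[k], centers[k]) for k in range(2)]
    # Combine: diffuse the intermediate iterates across tasks
    x = [A[k, 0] * psi[0] + A[k, 1] * psi[1] for k in range(2)]

print(x[0], x[1])
```

Because the tasks' optima are close and the combination matrix is doubly stochastic, both iterates settle near a weighted blend of the two optima; with dissimilar tasks, the same coupling would instead bias each task away from its own optimum, which is why DGD convergence analyses assume task similarity.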
Pages: 4267 - 4279
Number of pages: 13
Related Papers
50 records total
  • [1] An RNA evolutionary algorithm based on gradient descent for function optimization
    Wu, Qiuxuan
    Zhao, Zikai
    Chen, Mingming
    Chi, Xiaoni
    Zhang, Botao
    Wang, Jian
    Zhilenkov, Anton A.
    Chepinskiy, Sergey A.
    JOURNAL OF COMPUTATIONAL DESIGN AND ENGINEERING, 2024, 11 (04) : 332 - 357
  • [2] A hybrid training algorithm based on gradient descent and evolutionary computation
    Yu Xue
    Yiling Tong
    Ferrante Neri
    Applied Intelligence, 2023, 53 : 21465 - 21482
  • [3] A hybrid training algorithm based on gradient descent and evolutionary computation
    Xue, Yu
    Tong, Yiling
    Neri, Ferrante
    APPLIED INTELLIGENCE, 2023, 53 (18) : 21465 - 21482
  • [4] Convergence behavior of diffusion stochastic gradient descent algorithm
    Barani, Fatemeh
    Savadi, Abdorreza
    Yazdi, Hadi Sadoghi
    SIGNAL PROCESSING, 2021, 183
  • [5] Self-regularized nonlinear diffusion algorithm based on levenberg gradient descent
    Lu, Lu
    Zheng, Zongsheng
    Champagne, Benoit
    Yang, Xiaomin
    Wu, Wei
    SIGNAL PROCESSING, 2019, 163: 107 - 114
  • [6] A multiobjective multifactorial evolutionary algorithm based on decomposition
    Yao S.-S.
    Dong Z.-M.
    Wang X.-P.
    Kongzhi yu Juece/Control and Decision, 2021, 36 (03): 637 - 644
  • [7] A gradient descent based algorithm for lp minimization
    Jiang, Shan
    Fang, Shu-Cherng
    Nie, Tiantian
    Xing, Wenxun
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2020, 283 (01) : 47 - 56
  • [8] Image registration algorithm based on gradient descent
    Zhao, Xinbo
    Zou, Xiaochun
    Zhang, Dinghua
    Zhang, Shunli
    Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University, 2007, 25 (05): 642 - 645
  • [9] A Multifactorial Evolutionary Algorithm Based on Model Knowledge Transfer
    Lu, Xuan
    Chen, Lei
    Liu, Hai-Lin
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT IV, KSEM 2023, 2023, 14120 : 430 - 441
  • [10] Algorithm for Data Balancing Based on Gradient Descent
    Mukhin, A. V.
    Kilbas, I. A.
    Paringer, R. A.
    Ilyasova, N. Yu.
    Kupriyanov, A. V.
    PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON ADVANCES IN SIGNAL PROCESSING AND ARTIFICIAL INTELLIGENCE, ASPAI' 2020, 2020, : 56 - 59