Multifactorial Evolutionary Algorithm Based on Diffusion Gradient Descent

Citations: 8
Authors
Liu, Zhaobo [1 ]
Li, Guo [2 ]
Zhang, Haili [3 ]
Liang, Zhengping [2 ]
Zhu, Zexuan [4 ,5 ,6 ]
Affiliations
[1] Shenzhen Univ, Inst Adv Study, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[2] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[3] Shenzhen Polytech, Inst Appl Math, Shenzhen 518055, Peoples R China
[4] Shenzhen Univ, Natl Engn Lab Big Data Syst Comp Technol, Shenzhen 518060, Peoples R China
[5] Shenzhen Pengcheng Lab, Shenzhen 518055, Peoples R China
[6] BGI Shenzhen, Shenzhen 518083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Optimization; Convergence; Statistics; Sociology; Knowledge transfer; Costs; Convergence analysis; diffusion gradient descent (DGD); evolutionary multitasking (EMT); multifactorial evolutionary algorithm (MFEA); MULTITASKING; OPTIMIZATION; LMS;
DOI
10.1109/TCYB.2023.3270904
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
The multifactorial evolutionary algorithm (MFEA) is one of the most widely used evolutionary multitasking (EMT) algorithms. The MFEA implements knowledge transfer among optimization tasks via crossover and mutation operators, and it obtains high-quality solutions more efficiently than single-task evolutionary algorithms. Despite the effectiveness of the MFEA in solving difficult optimization problems, there is no proof of population convergence and no theoretical explanation of how knowledge transfer improves algorithm performance. To fill this gap, this article proposes a new MFEA based on diffusion gradient descent (DGD), namely MFEA-DGD. We prove the convergence of DGD for multiple similar tasks and demonstrate that the local convexity of some tasks can help other tasks escape from local optima via knowledge transfer. Based on this theoretical foundation, we design complementary crossover and mutation operators for the proposed MFEA-DGD. As a result, the evolving population is governed by a dynamic equation similar to that of DGD; that is, convergence is guaranteed and the benefit of knowledge transfer is explainable. In addition, a hyper-rectangular search strategy is introduced to allow MFEA-DGD to explore underdeveloped areas in the unified express space of all tasks and in the subspace of each task. The proposed MFEA-DGD is verified experimentally on various multitask optimization problems, and the results demonstrate that it converges faster to competitive results than state-of-the-art EMT algorithms. We also show how the experimental results can be interpreted in terms of the convexity of the different tasks.
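The diffusion gradient descent that the abstract builds on is, in the diffusion-strategies literature suggested by the "LMS" keyword, an adapt-then-combine scheme: each task takes a local gradient step, then mixes its intermediate iterate with those of the other tasks. The following minimal Python sketch illustrates that update over two similar tasks; the function name dgd, the quadratic objectives, the step size mu, and the combination matrix A are hypothetical illustration choices, not the paper's actual operators or experimental setup.

```python
import numpy as np

def dgd(grads, A, w0, mu=0.05, iters=200):
    """Adapt-then-combine diffusion gradient descent (illustrative sketch).

    grads : list of K gradient functions, one per task
    A     : K x K doubly stochastic combination matrix; A[l, k] is the
            weight task k assigns to task l's intermediate iterate
    w0    : K x d array of initial solutions, one row per task
    """
    W = w0.copy()
    K = len(grads)
    for _ in range(iters):
        # Adapt: each task takes a local gradient step on its own objective.
        Psi = np.stack([W[k] - mu * grads[k](W[k]) for k in range(K)])
        # Combine: knowledge transfer by mixing the intermediate iterates,
        # w_k <- sum_l A[l, k] * psi_l.
        W = A.T @ Psi
    return W

# Toy example (assumed, not from the paper): two similar quadratic tasks
# with nearby optima at 1.0 and 1.2.
grads = [lambda w: 2.0 * (w - 1.0),
         lambda w: 2.0 * (w - 1.2)]
A = np.array([[0.7, 0.3],
              [0.3, 0.7]])  # symmetric, doubly stochastic
w0 = np.zeros((2, 1))
print(dgd(grads, A, w0))    # both iterates settle between 1.0 and 1.2
```

With a doubly stochastic A and nearby task optima, the combine step pulls each task's iterate toward a weighted consensus, which loosely mirrors the paper's claim that the local convexity of some tasks can help other tasks escape local optima through knowledge transfer.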
Pages: 4267-4279
Number of pages: 13
Related Papers
50 records in total
  • [31] On the diffusion approximation of nonconvex stochastic gradient descent
    Hu, Wenqing
    Li, Chris Junchi
    Li, Lei
    Liu, Jian-Guo
    ANNALS OF MATHEMATICAL SCIENCES AND APPLICATIONS, 2019, 4 (01) : 3 - 32
  • [32] Adaptive archive-based multifactorial evolutionary algorithm for constrained multitasking optimization
    Xing, Caixiao
    Gong, Wenyin
    Li, Shuijia
    APPLIED SOFT COMPUTING, 2023, 143
  • [33] A parametric segmented multifactorial evolutionary algorithm based on a three-phase analysis
    Chai, Peihua
    Cao, Langcai
    Xu, Ridong
    Zeng, Yifeng
    APPLIED INTELLIGENCE, 2023, 53 (21) : 25605 - 25625
  • [34] Two-Frame Phase Shift Extraction Algorithm Based on Gradient Descent Algorithm
    Zhang Shaofeng
    Du Hubing
    Guo Ruiqing
    He Zhouxuan
    LASER & OPTOELECTRONICS PROGRESS, 2020, 57 (13)
  • [35] A note on diffusion limits for stochastic gradient descent
    Lanconelli, Alberto
    Lauria, Christopher S. A.
    JOURNAL OF APPROXIMATION THEORY, 2025, 309
  • [36] Traffic Signal Timings Optimization Based on Genetic Algorithm and Gradient Descent
    Yadav, Alok
    Nuthong, Chaiwat
    2020 5TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION SYSTEMS (ICCCS 2020), 2020, : 670 - 674
  • [37] An improved stochastic gradient descent algorithm based on Renyi differential privacy
    Cheng, XianFu
    Yao, YanQing
    Zhang, Liying
    Liu, Ao
    Li, Zhoujun
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (12) : 10694 - 10714
  • [38] Iterative quantum algorithm for combinatorial optimization based on quantum gradient descent
    Yi, Xin
    Huo, Jia-Cheng
    Gao, Yong-Pan
    Fan, Ling
    Zhang, Ru
    Cao, Cong
    RESULTS IN PHYSICS, 2024, 56
  • [39] Collaborative Innovation of Poster Design and CAD Based on Gradient Descent Algorithm
    Liao S.
    Zeng Z.
    Computer-Aided Design and Applications, 2024, 21 (S21): : 53 - 67
  • [40] A Maximum Power Point Tracking Algorithm Based On Gradient Descent Method
    Zhang, Jianpo
    Wang, Tao
    Ran, Huijuan
    2009 IEEE POWER & ENERGY SOCIETY GENERAL MEETING, VOLS 1-8, 2009, : 394 - +