Improving evolutionary algorithms with information feedback model for large-scale many-objective optimization

Cited: 4
Authors
Wang, Yong [1 ]
Zhang, Qian [1 ]
Wang, Gai-Ge [1 ]
Affiliations
[1] Ocean Univ China, Sch Comp Sci & Technol, Qingdao 266100, Peoples R China
Keywords
Many-objective; Information feedback model; One-by-one selection; Evolutionary algorithms; Large-scale optimization; GENERATIONAL DISTANCE; MOEA/D; MECHANISM; SEARCH; DESIGN;
DOI
10.1007/s10489-022-03964-9
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Recently, many evolutionary algorithms have been proposed. Among them, the core of the many-objective evolutionary algorithm based on a one-by-one selection strategy is to select offspring individuals one at a time during environmental selection. However, it does not perform well on large-scale many-objective optimization problems. In addition, a large amount of meaningful information in the population of the previous iteration is discarded. The information feedback model is an effective strategy for reusing information from previous populations and integrating it into the update of the offspring. By embedding this model into the original algorithm, this paper proposes a series of many-objective evolutionary algorithms comprising six new variants. Experiments were carried out from three aspects. First, the original algorithm was compared with the six new algorithms on the same nine benchmark problems. The best-performing variants were then compared, from two aspects, with recent studies that also use the information feedback model. Finally, the best of them was compared with six state-of-the-art many-objective evolutionary algorithms. Non-parametric statistical tests were conducted to evaluate the different algorithms. On problems with up to 15 objectives and 1500 decision variables, the proposed algorithm achieved the best performance, indicating its strong competitiveness.
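A note on the mechanism described above: the information feedback model reuses individuals stored from earlier generations when updating each offspring. As a hedged illustration only, the Python sketch below blends an offspring produced by the base algorithm's variation operators with one individual kept from the previous generation, using fitness-proportional weights. The function name feedback_update, the use of a single stored generation, and the specific weighting convention (smaller fitness treated as better) are assumptions for illustration, not the exact formulation of the six algorithms proposed in the paper.

    import numpy as np

    def feedback_update(y_new, f_new, x_prev, f_prev):
        """Combine a newly generated individual with one retained from a
        previous generation (the basic information-feedback idea).

        y_new, x_prev : decision vectors (1-D NumPy arrays)
        f_new, f_prev : scalar fitness values used only for weighting
                        (assumption: smaller is better).
        """
        total = f_new + f_prev
        if total == 0:                  # guard against division by zero
            a1 = a2 = 0.5
        else:
            a1 = f_prev / total         # weight on the new individual
            a2 = f_new / total          # weight on the historical individual
        # Weighted blend: the individual with the better fitness contributes
        # the larger share to the combined offspring.
        return a1 * y_new + a2 * x_prev

    # Minimal usage example with made-up vectors and fitness values.
    rng = np.random.default_rng(0)
    y = rng.random(10)        # offspring from the base algorithm's operators
    x_old = rng.random(10)    # individual retained from the previous generation
    child = feedback_update(y, f_new=0.8, x_prev=x_old, f_prev=0.4)
    print(child)

Variants of this idea may reuse more than one historical individual or select them at random rather than by index; the sketch shows only the simplest single-individual case.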
Pages: 11439-11473
Number of pages: 35
Related Papers
50 records in total
  • [21] Evolutionary many-objective optimization
    Ishibuchi, Hisao
    Tsukamoto, Noritaka
    Nojima, Yusuke
    2008 3RD INTERNATIONAL WORKSHOP ON GENETIC AND EVOLVING FUZZY SYSTEMS, 2008, : 45 - 50
  • [22] Evolutionary Many-Objective Optimization
    Ishibuchi, Hisao
    Sato, Hiroyuki
    PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION (GECCO'19 COMPANION), 2019, : 614 - 661
  • [23] Preference-guided evolutionary algorithms for many-objective optimization
    Goulart, Fillipe
    Campelo, Felipe
    INFORMATION SCIENCES, 2016, 329 : 236 - 255
  • [24] Many-Objective Evolutionary Algorithms: A Survey
    Li, Bingdong
    Li, Jinlong
    Tang, Ke
    Yao, Xin
    ACM COMPUTING SURVEYS, 2015, 48 (01)
  • [25] A comparative study of many-objective optimizers on large-scale many-objective software clustering problems
    Amarjeet Prajapati
    Complex & Intelligent Systems, 2021, 7 : 1061 - 1077
  • [27] Distributed Parallel Particle Swarm Optimization for Multi-Objective and Many-Objective Large-Scale Optimization
    Cao, Bin
    Zhao, Jianwei
    Lv, Zhihan
    Liu, Xin
    Yang, Shan
    Kang, Xinyuan
    Kang, Kai
    IEEE ACCESS, 2017, 5 : 8214 - 8221
  • [28] Multipopulation-Based Differential Evolution for Large-Scale Many-Objective Optimization
    Zhang, Kai
    Shen, Chaonan
    Yen, Gary G.
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (12) : 7596 - 7608
  • [29] A universal large-scale many-objective optimization framework based on cultural learning
    Wang, Xia
    Ge, Hongwei
    Zhang, Naiqiang
    Hou, Yaqing
    Sun, Liang
    APPLIED SOFT COMPUTING, 2023, 145
  • [30] Many-objective firefly algorithm for solving large-scale sparse optimization problems
    Zhao, Jia
    Hu, Qiu-Min
    Xiao, Ren-Bin
    Pan, Zheng-Xiang
    Cui, Zhi-Hua
    Fan, Tang-Huai
    Kongzhi yu Juece/Control and Decision, 2024, 39 (12): : 3989 - 3996