An RNA evolutionary algorithm based on gradient descent for function optimization

Cited: 0
Authors
Wu, Qiuxuan [1 ,2 ,3 ]
Zhao, Zikai [1 ,2 ]
Chen, Mingming [1 ,2 ]
Chi, Xiaoni [4 ]
Zhang, Botao [1 ,2 ]
Wang, Jian [1 ,2 ]
Zhilenkov, Anton A. [3 ]
Chepinskiy, Sergey A. [5 ]
Affiliations
[1] Hangzhou Dianzi Univ, Int Joint Res Lab Autonomous Robot Syst, Hangzhou 310018, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Peoples R China
[3] St Petersburg State Marine Tech Univ, Inst Hydrodynam & Control Proc, St Petersburg 190121, Russia
[4] Hangzhou Vocat & Tech Sch, Geely Automotive Inst, Hangzhou 310018, Peoples R China
[5] ITMO Univ, Fac Control Syst & Robot, St Petersburg 197101, Russia
Keywords
RNA-inspired operations; adaptive gradient descent mutation operator; heuristic algorithm; function optimization; GENETIC ALGORITHM; PARAMETER-ESTIMATION;
DOI
10.1093/jcde/qwae068
CLC classification number
TP39 [Computer applications];
Discipline classification codes
081203 ; 0835 ;
Abstract
The optimization of numerical functions with multiple independent variables is a significant challenge with numerous practical applications in process control systems, data fitting, and engineering design. Although RNA genetic algorithms offer clear benefits in function optimization, including rapid convergence, they suffer from low accuracy and can easily become trapped in local optima. To address these issues, a new heuristic algorithm is proposed: a gradient descent-based RNA genetic algorithm. Specifically, adaptive moment estimation (Adam) is employed as a mutation operator to improve the local exploitation ability of the algorithm. Additionally, two new operators inspired by the inner-loop structure of RNA molecules are introduced: an inner-loop crossover operator and an inner-loop mutation operator. These operators enhance the global exploration ability of the algorithm in the early stages of evolution and enable it to escape from local optima. The algorithm consists of two stages: a pre-evolutionary stage that employs RNA genetic algorithms to identify individuals in the vicinity of the optimal region, and a post-evolutionary stage that applies an adaptive gradient descent mutation to further enhance solution quality. When compared with current advanced algorithms for solving function optimization problems, the Adam RNA Genetic Algorithm (Adam RNA-GA) produced better optimal solutions. In a comparison with RNA-GA and the Genetic Algorithm (GA) across 17 benchmark functions, Adam RNA-GA ranked first with an average rank of 1.58 according to the Friedman test. On the 29 functions of the CEC2017 suite, compared with heuristic algorithms such as the African Vulture Optimization Algorithm, Dung Beetle Optimization, the Whale Optimization Algorithm, and the Grey Wolf Optimizer, Adam RNA-GA again ranked first with an average rank of 1.724 according to the Friedman test.
Our algorithm not only achieves significant improvements over RNA-GA but also performs excellently among current advanced algorithms for function optimization, attaining high precision.
Graphical Abstract
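The adaptive gradient descent mutation described in the abstract can be sketched as a standard Adam update applied to a single individual after the pre-evolutionary stage. The function name, hyperparameter values, and step count below are illustrative assumptions, not the authors' implementation; only the Adam update rule itself is standard.

```python
import numpy as np

def adam_mutation(x, grad_fn, steps=500, lr=0.01,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """Refine one individual with an Adam-style gradient descent mutation.

    x        -- candidate solution (array-like)
    grad_fn  -- gradient of the objective at a point
    """
    x = np.asarray(x, dtype=float).copy()
    m = np.zeros_like(x)  # first-moment (mean) estimate
    v = np.zeros_like(x)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)      # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)      # bias-corrected second moment
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Example: refine a point on the sphere function f(x) = sum(x_i^2),
# whose gradient is 2x, toward the global optimum at the origin.
refined = adam_mutation([1.0, -2.0], grad_fn=lambda x: 2 * x)
```

In the two-stage scheme, such a step would be applied only in the post-evolutionary stage, to individuals the RNA genetic operators have already placed near the optimal region.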
Pages: 332 - 357
Page count: 26
Related Articles
50 records in total
  • [1] Multifactorial Evolutionary Algorithm Based on Diffusion Gradient Descent
    Liu, Zhaobo
    Li, Guo
    Zhang, Haili
    Liang, Zhengping
    Zhu, Zexuan
    IEEE TRANSACTIONS ON CYBERNETICS, 2024, 54 (07) : 4267 - 4279
  • [2] A hybrid training algorithm based on gradient descent and evolutionary computation
    Xue, Yu
    Tong, Yiling
    Neri, Ferrante
    APPLIED INTELLIGENCE, 2023, 53 (18) : 21465 - 21482
  • [3] Evolutionary Gradient Descent for Non-convex Optimization
    Xue, Ke
    Qian, Chao
    Xu, Ling
    Fei, Xudong
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 3221 - 3227
  • [4] An improved Adagrad gradient descent optimization algorithm
    Zhang, N.
    Lei, D.
    Zhao, J. F.
    2018 CHINESE AUTOMATION CONGRESS (CAC), 2018, : 2359 - 2362
  • [5] Traffic Signal Timings Optimization Based on Genetic Algorithm and Gradient Descent
    Yadav, Alok
    Nuthong, Chaiwat
    2020 5TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION SYSTEMS (ICCCS 2020), 2020, : 670 - 674
  • [6] Iterative quantum algorithm for combinatorial optimization based on quantum gradient descent
    Yi, Xin
    Huo, Jia-Cheng
    Gao, Yong-Pan
    Fan, Ling
    Zhang, Ru
    Cao, Cong
    RESULTS IN PHYSICS, 2024, 56
  • [7] BRDF modeling and optimization of a target surface based on the gradient descent algorithm
    Li, Yanhui
    Yang, Pengfei
    Bai, Lu
    Zhang, Zifei
    APPLIED OPTICS, 2023, 62 (36) : 9486 - 9492
  • [8] Optimization of the Heart Pump Geometry based on Multiple Gradient Descent Algorithm
    Iscan, Mehmet
    Kadipasaoglu, Kamuran
    2017 ELECTRIC ELECTRONICS, COMPUTER SCIENCE, BIOMEDICAL ENGINEERINGS' MEETING (EBBT), 2017,
  • [9] Hybrid FCM learning algorithm based on particle swarm optimization and gradient descent algorithm
    Chen, Jun
    Zhang, Yue
    Gao, Xudong
    16TH IEEE INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV 2020), 2020, : 801 - 806