An RNA evolutionary algorithm based on gradient descent for function optimization

Cited: 0
Authors
Wu, Qiuxuan [1 ,2 ,3 ]
Zhao, Zikai [1 ,2 ]
Chen, Mingming [1 ,2 ]
Chi, Xiaoni [4 ]
Zhang, Botao [1 ,2 ]
Wang, Jian [1 ,2 ]
Zhilenkov, Anton A. [3 ]
Chepinskiy, Sergey A. [5 ]
Affiliations
[1] Hangzhou Dianzi Univ, Int Joint Res Lab Autonomous Robot Syst, Hangzhou 310018, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Peoples R China
[3] St Petersburg State Marine Tech Univ, Inst Hydrodynam & Control Proc, St Petersburg 190121, Russia
[4] Hangzhou Vocat & Tech Sch, Geely Automotive Inst, Hangzhou 310018, Peoples R China
[5] ITMO Univ, Fac Control Syst & Robot, St Petersburg 197101, Russia
Keywords
RNA-inspired operations; adaptive gradient descent mutation operator; heuristic algorithm; function optimization; GENETIC ALGORITHM; PARAMETER-ESTIMATION;
DOI
10.1093/jcde/qwae068
CLC number
TP39 [Computer applications];
Discipline codes
081203; 0835;
Abstract
The optimization of numerical functions with multiple independent variables is a significant challenge with numerous practical applications in process control systems, data fitting, and engineering design. Although RNA genetic algorithms offer clear benefits in function optimization, including rapid convergence, they suffer from low accuracy and are easily trapped in local optima. To address these issues, a new heuristic algorithm is proposed: a gradient descent-based RNA genetic algorithm. Specifically, adaptive moment estimation (Adam) is employed as a mutation operator to improve the local exploitation ability of the algorithm. Additionally, two new operators inspired by the inner-loop structure of RNA molecules are introduced: an inner-loop crossover operator and an inner-loop mutation operator. These operators enhance the global exploration ability of the algorithm in the early stages of evolution and enable it to escape from local optima. The algorithm consists of two stages: a pre-evolutionary stage that employs RNA genetic operators to identify individuals in the vicinity of the optimal region, and a post-evolutionary stage that applies an adaptive gradient descent mutation to further improve solution quality. Compared with current advanced algorithms for function optimization, the Adam RNA genetic algorithm (Adam RNA-GA) produced better optimal solutions. In a comparison with RNA-GA and the genetic algorithm (GA) across 17 benchmark functions, Adam RNA-GA ranked first with an average rank of 1.58 according to the Friedman test. On the 29 functions of the CEC2017 suite, compared with heuristic algorithms such as the African Vulture Optimization Algorithm, Dung Beetle Optimization, the Whale Optimization Algorithm, and the Grey Wolf Optimizer, Adam RNA-GA again ranked first, with an average rank of 1.724 according to the Friedman test. The algorithm not only achieves significant improvements over RNA-GA but also performs strongly among current advanced algorithms, achieving high precision in function optimization.
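The record contains no code, so the following is a minimal Python sketch of the two-stage idea described in the abstract: a crude evolutionary pre-stage followed by an Adam-based mutation that refines the best individual. It is written under stated assumptions, not as the authors' implementation: the RNA-inspired inner-loop crossover and mutation operators are not reproduced (plain truncation selection with Gaussian mutation stands in for them), gradients are estimated by central differences since the abstract does not say how they are obtained, and all names (adam_mutation, two_stage_optimize, lr, pre_gens) are illustrative, not the paper's API.

```python
import numpy as np

def sphere(x):
    """Toy objective for demonstration (not one of the paper's benchmarks)."""
    return float(np.sum(x ** 2))

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference gradient estimate for a black-box objective."""
    g = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        g[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return g

def adam_mutation(f, x, steps=100, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """Post-evolutionary stage: refine one individual with Adam updates."""
    m = np.zeros_like(x)  # first-moment (mean of gradient) estimate
    v = np.zeros_like(x)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = numerical_gradient(f, x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)  # bias-corrected moment estimates
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

def two_stage_optimize(f, dim=5, pop_size=30, pre_gens=100, seed=0):
    """Pre-evolutionary stage (stand-in GA), then Adam refinement of the best."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    for _ in range(pre_gens):
        fitness = np.array([f(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]  # truncation selection
        children = parents + rng.normal(scale=0.3, size=parents.shape)
        pop = np.vstack([parents, children])
    best = min(pop, key=f)
    return adam_mutation(f, best)

x_star = two_stage_optimize(sphere)
print(x_star, sphere(x_star))
```

On a unimodal objective like the sphere, the Adam stage drives the pre-evolved best individual close to the optimum; in the paper's setting it plays the role of the adaptive gradient descent mutation applied after RNA-GA pre-evolution.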
Pages: 332 - 357
Page count: 26
Related papers (50 total)
  • [21] SPGD: Search Party Gradient Descent Algorithm, a Simple Gradient-Based Parallel Algorithm for Bound-Constrained Optimization
    Syed Shahul Hameed, A. S.
    Rajagopalan, Narendran
    MATHEMATICS, 2022, 10 (05)
  • [22] Adaptive gradient descent optimization algorithm with improved differential term
    Ge, Quan-Bo
    Zhang, Jian-Chao
    Yang, Qin-Min
    Li, Hong
    Kongzhi Lilun Yu Yingyong/Control Theory and Applications, 2022, 39 (04): : 623 - 632
  • [23] A Descent Generalized RMIL Spectral Gradient Algorithm for Optimization Problems
    Sulaiman, Ibrahim M.
    Kaelo, P.
    Khalid, Ruzelan
    Nawawi, Mohd Kamal M.
    INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS AND COMPUTER SCIENCE, 2024, 34 (02) : 225 - 233
  • [24] Improvement of SPGD by Gradient Descent Optimization Algorithm in Deep Learning
    Zhao, Qingsong
    Hao, Shiqi
    Wang, Yong
    Wang, Lei
    Lin, Zhi
    2022 ASIA COMMUNICATIONS AND PHOTONICS CONFERENCE, ACP, 2022, : 469 - 472
  • [25] Multiple-gradient descent algorithm (MGDA) for multiobjective optimization
    Desideri, Jean-Antoine
    COMPTES RENDUS MATHEMATIQUE, 2012, 350 (5-6) : 313 - 318
  • [26] RNA BASED EVOLUTIONARY OPTIMIZATION
    SCHUSTER, P
    ORIGINS OF LIFE AND EVOLUTION OF BIOSPHERES, 1993, 23 (5-6): : 373 - 391
  • [27] Improved evolutionary algorithm for global optimization based on a smooth function
    No.36 Research Institute, CETC, Jiaxing 314033, China
    Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 2008, (4): 865 - 870
  • [28] New evolutionary algorithm for function optimization
    Guo, Tao
    Kang, Li-shan
    Wuhan University Journal of Natural Sciences, 1999, 4 (04): : 409 - 414
  • [29] Simple Evolutionary Optimization Can Rival Stochastic Gradient Descent in Neural Networks
    Morse, Gregory
    Stanley, Kenneth O.
    GECCO'16: PROCEEDINGS OF THE 2016 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2016, : 477 - 484
  • [30] Research on the Quadrotor of AHRS based on Gradient Descent Algorithm
    Lin Feng
    He Liuzeng
    2018 EIGHTH INTERNATIONAL CONFERENCE ON INSTRUMENTATION AND MEASUREMENT, COMPUTER, COMMUNICATION AND CONTROL (IMCCC 2018), 2018, : 1831 - 1834