An RNA evolutionary algorithm based on gradient descent for function optimization

Cited by: 0
Authors
Wu, Qiuxuan [1 ,2 ,3 ]
Zhao, Zikai [1 ,2 ]
Chen, Mingming [1 ,2 ]
Chi, Xiaoni [4 ]
Zhang, Botao [1 ,2 ]
Wang, Jian [1 ,2 ]
Zhilenkov, Anton A. [3 ]
Chepinskiy, Sergey A. [5 ]
Affiliations
[1] Hangzhou Dianzi Univ, Int Joint Res Lab Autonomous Robot Syst, Hangzhou 310018, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Peoples R China
[3] St Petersburg State Marine Tech Univ, Inst Hydrodynam & Control Proc, St Petersburg 190121, Russia
[4] Hangzhou Vocat & Tech Sch, Geely Automotive Inst, Hangzhou 310018, Peoples R China
[5] ITMO Univ, Fac Control Syst & Robot, St Petersburg 197101, Russia
Keywords
RNA-inspired operations; adaptive gradient descent mutation operator; heuristic algorithm; function optimization; GENETIC ALGORITHM; PARAMETER-ESTIMATION;
DOI
10.1093/jcde/qwae068
Chinese Library Classification (CLC)
TP39 [Computer applications];
Discipline codes
081203; 0835;
Abstract
The optimization of numerical functions with multiple independent variables is a significant challenge with numerous practical applications in process control systems, data fitting, and engineering design. Although RNA genetic algorithms offer clear benefits in function optimization, including rapid convergence, they suffer from low accuracy and can easily become trapped in local optima. To address these issues, a new heuristic algorithm is proposed: a gradient descent-based RNA genetic algorithm. Specifically, adaptive moment estimation (Adam) is employed as a mutation operator to improve the algorithm's local exploitation ability. Additionally, two new operators inspired by the inner-loop structure of RNA molecules are introduced: an inner-loop crossover operator and an inner-loop mutation operator. These operators enhance the algorithm's global exploration in the early stages of evolution and enable it to escape from local optima. The algorithm consists of two stages: a pre-evolutionary stage that employs the RNA genetic algorithm to identify individuals in the vicinity of the optimal region, and a post-evolutionary stage that applies an adaptive gradient descent mutation to further improve solution quality. When compared with current advanced algorithms for function optimization, the Adam RNA genetic algorithm (Adam RNA-GA) produced better optimal solutions. In comparison with RNA-GA and the genetic algorithm (GA) across 17 benchmark functions, Adam RNA-GA ranked first with an average rank of 1.58 according to the Friedman test. On the 29 functions of the CEC2017 suite, compared with heuristic algorithms such as the African Vulture Optimization Algorithm, Dung Beetle Optimization, the Whale Optimization Algorithm, and the Grey Wolf Optimizer, Adam RNA-GA again ranked first with an average rank of 1.724 according to the Friedman test.
Our algorithm not only achieves significant improvements over RNA-GA but also performs excellently among current advanced algorithms for function optimization, attaining high precision.
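The core idea of the post-evolutionary stage — using Adam updates as a mutation operator that refines an individual near the optimal region — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the hyperparameter defaults are Adam's standard values rather than the paper's settings, and the gradient is estimated by central differences since benchmark objectives may lack analytic gradients.

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference estimate of the gradient of f at x
    (an assumption; the paper may use analytic gradients)."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def adam_mutation(f, x, steps=50, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """Mutate an individual x by descending f with Adam updates.
    Hyperparameters are illustrative defaults, not the paper's values."""
    m = np.zeros_like(x)  # first-moment estimate
    v = np.zeros_like(x)  # second-moment estimate
    for t in range(1, steps + 1):
        g = numerical_gradient(f, x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)      # bias correction
        v_hat = v / (1 - beta2**t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Refine a hypothetical pre-evolution individual on the sphere function
sphere = lambda x: float(np.sum(x**2))
parent = np.array([1.0, -2.0])
child = adam_mutation(sphere, parent)
```

In the full two-stage scheme, `adam_mutation` would replace or supplement the ordinary mutation operator once the RNA-GA population has converged near the optimal region, trading global exploration for fast local refinement.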
Pages: 332-357 (26 pages)