An RNA evolutionary algorithm based on gradient descent for function optimization

Cited by: 0
Authors
Wu, Qiuxuan [1 ,2 ,3 ]
Zhao, Zikai [1 ,2 ]
Chen, Mingming [1 ,2 ]
Chi, Xiaoni [4 ]
Zhang, Botao [1 ,2 ]
Wang, Jian [1 ,2 ]
Zhilenkov, Anton A. [3 ]
Chepinskiy, Sergey A. [5 ]
Affiliations
[1] Hangzhou Dianzi Univ, Int Joint Res Lab Autonomous Robot Syst, Hangzhou 310018, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Peoples R China
[3] St Petersburg State Marine Tech Univ, Inst Hydrodynam & Control Proc, St Petersburg 190121, Russia
[4] Hangzhou Vocat & Tech Sch, Geely Automotive Inst, Hangzhou 310018, Peoples R China
[5] ITMO Univ, Fac Control Syst & Robot, St Petersburg 197101, Russia
Keywords
RNA-inspired operations; adaptive gradient descent mutation operator; heuristic algorithm; function optimization; GENETIC ALGORITHM; PARAMETER-ESTIMATION;
DOI
10.1093/jcde/qwae068
CLC number
TP39 [Computer Applications]
Discipline codes
081203; 0835
Abstract
The optimization of numerical functions with multiple independent variables is a significant challenge with numerous practical applications in process control systems, data fitting, and engineering design. Although RNA genetic algorithms offer clear benefits in function optimization, including rapid convergence, they suffer from low accuracy and can easily become trapped in local optima. To address these issues, a new heuristic algorithm is proposed: a gradient descent-based RNA genetic algorithm. Specifically, adaptive moment estimation (Adam) is employed as a mutation operator to improve the local exploitation ability of the algorithm. Additionally, two new operators inspired by the inner-loop structure of RNA molecules are introduced: an inner-loop crossover operator and an inner-loop mutation operator. These operators enhance the global exploration ability of the algorithm in the early stages of evolution and enable it to escape from local optima. The algorithm consists of two stages: a pre-evolutionary stage that employs RNA genetic operators to drive individuals into the vicinity of the optimal region, and a post-evolutionary stage that applies an adaptive gradient descent mutation to further improve solution quality. Compared with current advanced algorithms for function optimization, the proposed Adam RNA genetic algorithm (Adam RNA-GA) produced better solutions. Against RNA-GA and the standard genetic algorithm (GA) on 17 benchmark functions, Adam RNA-GA ranked first with an average rank of 1.58 according to the Friedman test. On the 29 functions of the CEC2017 suite, compared with heuristic algorithms such as the African Vulture Optimization Algorithm, Dung Beetle Optimization, the Whale Optimization Algorithm, and the Grey Wolf Optimizer, Adam RNA-GA again ranked first, with an average rank of 1.724 according to the Friedman test. The algorithm not only achieves significant improvements over RNA-GA but also performs excellently among current advanced algorithms, attaining high precision in function optimization.
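The two-stage procedure described in the abstract lends itself to a compact sketch. The Python snippet below is a minimal illustration, not the authors' published implementation: a plain evolutionary loop stands in for the pre-evolutionary RNA-GA stage (the RNA-specific encoding and the inner-loop crossover/mutation operators are omitted), and Adam updates driven by a finite-difference gradient play the role of the post-evolutionary adaptive gradient descent mutation. All function names, the black-box gradient estimate, and the hyper-parameter values are assumptions chosen for illustration.

```python
# Illustrative sketch of a two-stage "evolve, then Adam-refine" optimizer.
# NOT the paper's code: the RNA encoding and inner-loop operators are omitted.
import numpy as np

def numerical_grad(f, x, eps=1e-6):
    """Central-difference gradient estimate; the objective is a black box."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2.0 * eps)
    return g

def adam_mutation(f, x, steps=50, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Post-evolutionary stage: refine one individual with Adam updates."""
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = numerical_grad(f, x)
        m = b1 * m + (1.0 - b1) * g        # first-moment estimate
        v = b2 * v + (1.0 - b2) * g * g    # second-moment estimate
        m_hat = m / (1.0 - b1 ** t)        # bias correction
        v_hat = v / (1.0 - b2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

def two_stage_optimize(f, dim, lo, hi, pop=40, gens=100, seed=0):
    """Stage 1: a plain (mu + lambda)-style evolutionary search standing in
    for the RNA-GA pre-evolutionary stage; Stage 2: Adam refinement."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(lo, hi, size=(pop, dim))
    for _ in range(gens):
        fit = np.array([f(ind) for ind in P])
        parents = P[np.argsort(fit)[: pop // 2]]   # keep the better half
        children = parents + rng.normal(0.0, 0.1 * (hi - lo), parents.shape)
        P = np.clip(np.vstack([parents, children]), lo, hi)
    best = min(P, key=f)                           # best individual found
    return adam_mutation(f, best)

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))       # toy test function
    x_star = two_stage_optimize(sphere, dim=5, lo=-5.0, hi=5.0)
    print(x_star, sphere(x_star))
```

The sketch only conveys the division of labor the abstract describes: the population search locates a promising region, after which the Adam mutation sharpens the best candidate to higher precision than mutation and crossover alone would reach.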
Pages: 332-357
Number of pages: 26
Related papers
50 records in total
  • [31] Hinge Classification Algorithm Based on Asynchronous Gradient Descent
    Yan, Xiaodan
    Zhang, Tianxin
    Cui, Baojiang
    Deng, Jiangdong
    ADVANCES ON BROAD-BAND WIRELESS COMPUTING, COMMUNICATION AND APPLICATIONS, BWCCA-2017, 2018, 12 : 459 - 468
  • [32] Tip-tilt adaptive correction based on stochastic parallel gradient descent optimization algorithm
    Ma, Huimin
    Zhang, Pengfei
    Zhang, Jinghui
    Qiao, Chunhong
    Fan, Chengyu
    OPTICAL DESIGN AND TESTING IV, 2010, 7849
  • [33] A Combined Training Algorithm for RBF Neural Network Based on Particle Swarm Optimization and Gradient Descent
    Xu, Ming
    Chen, Hao
    Duan, Liwei
    PROCEEDINGS OF 2020 IEEE 9TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE (DDCLS'20), 2020, : 702 - 706
  • [34] Gradient descent based optimization of transparent Mamdani systems
    Riid, A
    Rüstern, E
    NEURAL NETWORKS AND SOFT COMPUTING, 2003, : 545 - 550
  • [35] A Cost-based Optimizer for Gradient Descent Optimization
    Kaoudi, Zoi
    Quiane-Ruiz, Jorge-Arnulfo
    Thirumuruganathan, Saravanan
    Chawla, Sanjay
    Agrawal, Divy
    SIGMOD'17: PROCEEDINGS OF THE 2017 ACM INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA, 2017, : 977 - 992
  • [36] Comparison of the Stochastic Gradient Descent Based Optimization Techniques
    Yazan, Ersan
    Talu, M. Fatih
    2017 INTERNATIONAL ARTIFICIAL INTELLIGENCE AND DATA PROCESSING SYMPOSIUM (IDAP), 2017,
  • [37] A Stochastic Gradient Descent Algorithm for Antenna Tilt Optimization in Cellular Networks
    Liu, Yaxi
    Huangfu, Wei
    Zhang, Haijun
    Long, Keping
    2018 10TH INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS AND SIGNAL PROCESSING (WCSP), 2018,
  • [38] A New Conjugate Gradient Algorithm with Sufficient Descent Property for Unconstrained Optimization
    Wu, XiaoPing
    Liu, LiYing
    Xie, FengJie
    Li, YongFei
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2015, 2015
  • [39] Using the Stochastic Gradient Descent Optimization Algorithm on Estimating of Reactivity Ratios
    Fazakas-Anca, Iosif Sorin
    Modrea, Arina
    Vlase, Sorin
    MATERIALS, 2021, 14 (16)
  • [40] A gradient descent algorithm for LASSO
    Kim, Yongdai
    Kim, Yuwon
    Kim, Jinseog
    PREDICTION AND DISCOVERY, 2007, 443 : 73 - 82