A Neural Network Algorithm of Learning Rate Adaptive Optimization and Its Application in Emitter Recognition

Cited by: 0
Authors
Jiang, Jihong [1 ]
Gou, Yan [1 ,2 ]
Zhang, Wei [1 ,3 ]
Yang, Jian [4 ]
Gu, Jie [3 ]
Shao, Huaizong [1 ,4 ]
Affiliations
[1] Univ Elect Sci & Technol China, Chengdu 611731, Sichuan, Peoples R China
[2] Southwest China Inst Elect Technol, Chengdu 610036, Sichuan, Peoples R China
[3] Sci & Technol Elect Informat Control Lab, Chengdu 610036, Sichuan, Peoples R China
[4] Peng Cheng Lab, Shenzhen 519012, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Neural network; Learning rate; Algorithm optimization; Emitter recognition; Application;
DOI
10.1007/978-3-030-97124-3_29
CLC number
TP18 [Theory of artificial intelligence];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
The setting of the learning rate is critical in neural network training. A learning rate that is too low slows optimization and prolongs training, while one that is too high easily overshoots the optimum and makes the model difficult to converge. To address this problem, this paper analyzes two common learning rate strategies, the decaying learning rate and the adaptive learning rate, and, building on the Adam algorithm, proposes an adaptive learning rate algorithm that adjusts the rate according to the current and previous values of the loss function. The effectiveness of the algorithm is verified on real emitter signals. Experimental results show that, compared with the Adam algorithm, the number of training iterations is reduced by 45.5% and the recognition accuracy is increased by 3.6%, which effectively improves the learning speed and shortens the training time.
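The abstract describes the idea (adjusting the learning rate from the current and previous loss values on top of Adam) but not the exact update rule. The Python/NumPy sketch below illustrates one plausible reading under stated assumptions: the class name LossAdaptiveAdam, its adapt/step methods, and the growth/shrink factors 1.05 and 0.7 are illustrative choices, not the authors' algorithm.

```python
import numpy as np

class LossAdaptiveAdam:
    """Adam with a loss-trend heuristic for the global learning rate.

    A minimal sketch, assuming the rate is raised while the loss keeps
    falling and cut back when the loss rises; all hyperparameter values
    are illustrative, not taken from the paper.
    """

    def __init__(self, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8,
                 grow=1.05, shrink=0.7):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.grow, self.shrink = grow, shrink
        self.m = self.v = None
        self.t = 0
        self.prev_loss = None

    def adapt(self, loss):
        # Compare the current loss with the previous one: grow the rate
        # on improvement, shrink it on deterioration (likely overshoot).
        if self.prev_loss is not None:
            self.lr *= self.grow if loss < self.prev_loss else self.shrink
        self.prev_loss = loss

    def step(self, params, grads):
        # Standard Adam moment updates and bias correction, applied with
        # the currently adapted learning rate.
        if self.m is None:
            self.m = np.zeros_like(params)
            self.v = np.zeros_like(params)
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grads
        self.v = self.beta2 * self.v + (1 - self.beta2) * grads ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return params - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

# Toy usage: minimise f(w) = ||w||^2, adapting the rate once per "epoch".
opt = LossAdaptiveAdam()
w = np.array([3.0, -2.0])
for _ in range(200):
    loss = float(np.sum(w ** 2))
    opt.adapt(loss)           # adjust lr from the loss trend
    w = opt.step(w, 2.0 * w)  # gradient of ||w||^2 is 2w
```

The self-correcting design keeps the heuristic cheap: growth while the loss falls speeds up learning, and the shrink factor after a loss increase pulls the rate back before divergence, which matches the abstract's stated goal of fewer iterations without losing convergence.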
Pages: 390-402
Number of pages: 13
Related papers
50 records in total
  • [21] Application of Adaptive Whale Optimization Algorithm Based BP Neural Network in RSSI Positioning
    Duo Peng
    Mingshuo Liu
    Kun Xie
    [J]. Journal of Beijing Institute of Technology, 2024, 33 (06) - 529
  • [22] An improved artificial electric field algorithm and its application in neural network optimization
    Cheng, Jiatang
    Xu, Peizhen
    Xiong, Yan
    [J]. COMPUTERS & ELECTRICAL ENGINEERING, 2022, 101
  • [23] Backpropagation Neural Network with Adaptive Learning Rate for Classification
    Jullapak, Rujira
    Thammano, Arit
    [J]. ADVANCES IN NATURAL COMPUTATION, FUZZY SYSTEMS AND KNOWLEDGE DISCOVERY, ICNC-FSKD 2022, 2023, 153 : 493 - 499
  • [24] An Application Research on Adaptive Learning Algorithm in Image Recognition
    Jia, Hailong
    Zhu, Shanhong
    [J]. 2012 2ND INTERNATIONAL CONFERENCE ON APPLIED ROBOTICS FOR THE POWER INDUSTRY (CARPI), 2012, : 401 - 403
  • [25] A sequential learning algorithm of neural network and its application in crop variety selection
    Deng, C
    Zhang, R
    Li, SW
    Xiong, FL
    [J]. ARTIFICIAL INTELLIGENCE IN AGRICULTURE 1998, 1998, : 127 - 131
  • [26] Neural network learning algorithm based on fading Kalman filtering and its application
    Gao, Shesheng
    Yang, Yi
    Gao, Bingbing
    [J]. Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University, 2015, 33 (02): : 320 - 325
  • [27] Neural network learning algorithm based on direction of inner product and its application
    Ma, Xiaomin
    Yang, Yixian
    [J]. Beijing Youdian Xueyuan Xuebao/Journal of Beijing University of Posts And Telecommunications, 1998, 21 (04): : 43 - 47
  • [28] Neural Network Application for Emitter Identification
    Matuszewski, Jan
    Sikorska-Lukasiewicz, Katarzyna
    [J]. 2017 18TH INTERNATIONAL RADAR SYMPOSIUM (IRS), 2017,
  • [29] An adaptive recursive least square algorithm for feed forward neural network and its application
    Qing, Xi-hong
    Xu, Jun-yi
    Gu, Fen-hong
    Feng, Ai-mu
    Nin, Wei
    Tao, Hua-xue
    [J]. ADVANCED INTELLIGENT COMPUTING THEORIES AND APPLICATIONS, PROCEEDINGS: WITH ASPECTS OF ARTIFICIAL INTELLIGENCE, 2007, 4682 : 315 - 323
  • [30] The Improved Training Algorithm of Back Propagation Neural Network with Self-adaptive Learning Rate
    Li, Yong
    Fu, Yang
    Li, Hui
    Zhang, Si-Wen
    [J]. PROCEEDINGS OF THE 2009 INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND NATURAL COMPUTING, VOL I, 2009, : 73 - 76