An Adaptive Optimization Method Based on Learning Rate Schedule for Neural Networks

Cited: 2
Authors
Yi, Dokkyun [1 ]
Ji, Sangmin [2 ]
Park, Jieun [1 ]
Affiliations
[1] Daegu Univ, Seongsan Liberal Arts Coll, Kyungsan 38453, South Korea
[2] Chungnam Natl Univ, Dept Math, Coll Nat Sci, Daejeon 34134, South Korea
Source
APPLIED SCIENCES-BASEL | 2021, Vol. 11, Issue 02
Funding
National Research Foundation of Singapore;
Keywords
stochastic gradient methods; optimization; convergence; machine learning;
DOI
10.3390/app11020850
Chinese Library Classification
O6 [Chemistry];
Subject classification code
0703;
Abstract
Artificial intelligence (AI) is realized by optimizing a cost function constructed from the learning data. Updating the parameters of the cost function is the AI learning process (AI learning, for short). If learning succeeds, the cost function attains its global minimum. For learning to be complete, the parameters should stop changing once the cost function reaches that global minimum. The momentum method is a useful optimization method; however, it has difficulty stopping the parameter updates when the cost function reaches its global minimum (the non-stop problem). The proposed method builds on the momentum method: to resolve the non-stop problem, it incorporates the value of the cost function into the update. Consequently, as learning proceeds, this mechanism reduces the size of the parameter updates in proportion to the value of the cost function. We verify the method through a proof of convergence and through numerical comparisons with existing methods, confirming that learning proceeds well.
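To illustrate the mechanism the abstract describes, the following is a minimal sketch (in Python/NumPy, not the authors' code) of a momentum update whose step is damped by the current cost value. It assumes the cost's global minimum value is zero (as for a squared-error loss), so the damping factor vanishes exactly where the iterate should stop; the name cost_scaled_momentum and all hyperparameter values are illustrative.

    import numpy as np

    def cost_scaled_momentum(f, grad_f, w0, lr=0.1, beta=0.9, steps=200):
        # Illustrative sketch, not the paper's exact scheme: classical
        # momentum whose step is scaled by the current cost value f(w).
        # Assumes min f = 0, so the effective step shrinks to zero at the
        # minimum, counteracting the non-stop behavior of plain momentum.
        w = np.asarray(w0, dtype=float)
        v = np.zeros_like(w)
        for _ in range(steps):
            v = beta * v + grad_f(w)   # momentum accumulator
            w = w - lr * f(w) * v      # step damped by the cost value
        return w

    # Toy usage: f(w) = ||w||^2 / 2 has global minimum value 0 at w = 0.
    f = lambda w: 0.5 * float(np.dot(w, w))
    grad_f = lambda w: w
    print(cost_scaled_momentum(f, grad_f, w0=[2.0, -1.0]))  # approaches [0, 0]

The scaling acts as a cost-driven learning rate schedule: the step is large far from the minimum and shrinks automatically as the cost approaches zero.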
Pages: 1 - 11
Number of pages: 11
Related papers
50 records in total
  • [1] An Adaptive Learning Rate Schedule for SIGNSGD Optimizer in Neural Networks
    Wang, Kang
    Sun, Tao
    Dou, Yong
    [J]. NEURAL PROCESSING LETTERS, 2022, 54 (02) : 803 - 816
  • [2] A Novel Learning Rate Schedule in Optimization for Neural Networks and Its Convergence
    Park, Jieun
    Yi, Dokkyun
    Ji, Sangmin
    [J]. SYMMETRY-BASEL, 2020, 12 (04):
  • [3] Performance Enhancement of Adaptive Neural Networks Based on Learning Rate
    Zubair, Swaleha
    Singha, Anjani Kumar
    Pathak, Nitish
    Sharma, Neelam
    Urooj, Shabana
    Larguech, Samia Rabeh
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 74 (01): : 2005 - 2019
  • [4] The Optimization of Learning Rate for Neural Networks
    Huang, Weizhe
    Chen, Chi-Hua
    [J]. ASIA-PACIFIC JOURNAL OF CLINICAL ONCOLOGY, 2023, 19 : 17 - 17
  • [5] Appropriate Learning Rates of Adaptive Learning Rate Optimization Algorithms for Training Deep Neural Networks
    Iiduka, Hideaki
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (12) : 13250 - 13261
  • [6] A Differential Adaptive Learning Rate Method for Back-Propagation Neural Networks
    Iranmanesh, Saeid
    [J]. NN'09: PROCEEDINGS OF THE 10TH WSEAS INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, 2009, : 30 - 34
  • [7] ESOA Algorithm Based on Learning Rate Optimization in Convolutional Neural Networks
    Wei, Peiyang
    Shi, Xiaoyu
    Zhou, Jiesan
    [J]. 2023 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW 2023, 2023, : 435 - 439
  • [8] Adaptive Learning Rate for Unsupervised Learning of Deep Neural Networks
    Golovko, Vladimir
    Mikhno, Egor
    Kroschanka, Aliaksandr
    Chodyka, Marta
    Lichograj, Piotr
    [J]. 2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [9] The Effect of Adaptive Learning Rate on the Accuracy of Neural Networks
    Jepkoech, Jennifer
    Mugo, David Muchangi
    Kenduiywo, Benson K.
    Too, Edna Chebet
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (08) : 736 - 751