Adaptive Learning Rate for Unsupervised Learning of Deep Neural Networks

Cited by: 4
Authors
Golovko, Vladimir [1 ]
Mikhno, Egor [2 ]
Kroschanka, Aliaksandr [2 ]
Chodyka, Marta [1 ]
Lichograj, Piotr [1 ]
Affiliations
[1] John Paul II Univ Appl Sci Biala Podlaska, Dept Informat, Biala Podlaska, Poland
[2] Brest State Tech Univ, Dept Intelligent Informat Technol, Brest, Belarus
Keywords
adaptive training step; deep learning; unsupervised learning
DOI
10.1109/IJCNN54540.2023.10191642
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper proposes an approach for calculating an adaptive learning step for neural networks with the ReLU transfer function. The adaptive learning rate automatically chooses the step size that minimizes the objective function of the neural network. We give a theoretical justification for the proposed approach, which is based on the steepest descent method. The main contribution of this paper is a novel technique for adaptive learning rate calculation when the ReLU transfer function is used. Experiments on data compression datasets show that the proposed approach provides better generalization capability (test set accuracy) and permits the learning rate to be chosen automatically.
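To make the steepest-descent idea in the abstract concrete, here is a minimal sketch of adaptive step selection: at each iteration the step size is picked to minimize the loss along the negative gradient direction. This is only an illustration of the general principle, not the authors' ReLU-specific derivation; the toy ReLU regression model, the data, and the candidate step grid are assumptions introduced for this example.

```python
# Illustrative sketch (assumed setup, not the paper's exact method):
# adaptive learning rate via a one-dimensional line search along the
# negative gradient for a tiny single-layer ReLU regression model.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))              # toy inputs
W_true = rng.normal(size=(8, 4))
Y = np.maximum(X @ W_true, 0.0)           # toy targets through a ReLU layer

def loss(W):
    return 0.5 * np.mean((np.maximum(X @ W, 0.0) - Y) ** 2)

def grad(W):
    Z = X @ W
    A = np.maximum(Z, 0.0)
    delta = (A - Y) * (Z > 0.0)           # ReLU derivative is 0 or 1
    return X.T @ delta / X.shape[0]

W = rng.normal(size=(8, 4)) * 0.1
candidates = np.geomspace(1e-4, 1.0, 16)  # candidate step sizes (assumed grid)

for it in range(200):
    g = grad(W)
    # adaptive step: choose the candidate that minimizes the loss along -g
    losses = [loss(W - a * g) for a in candidates]
    alpha = candidates[int(np.argmin(losses))]
    W -= alpha * g

print(f"final loss: {loss(W):.6f}")
```

The line search stands in for the paper's closed-form step calculation; it shows how the step size adapts each iteration instead of being fixed by hand.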
Pages: 6