Optimal stochastic gradient descent algorithm for filtering

Cited by: 1
Authors
Turali, M. Yigit [1 ]
Koc, Ali T. [1 ]
Kozat, Suleyman S. [1 ]
Affiliations
[1] Bilkent Univ, Dept Elect & Elect Engn, TR-06800 Ankara, Turkiye
Keywords
Learning rate; Linear filtering; Optimization; Stochastic gradient descent; Prediction
DOI
10.1016/j.dsp.2024.104731
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes
0808; 0809
Abstract
Stochastic Gradient Descent (SGD) is a fundamental optimization technique in machine learning due to its efficiency on large-scale data. Unlike typical SGD applications, which rely on stochastic approximations, this work examines the convergence properties of SGD from a deterministic perspective. We address the crucial issue of learning rate selection, a common obstacle to optimizing SGD performance, particularly in complex environments. In contrast to traditional analyses, which often state convergence results in terms of statistical expectations (under assumptions that are frequently unjustified in practice), our approach introduces universally applicable learning rates. These rates guarantee that a model trained with SGD asymptotically matches the performance of the best linear filter, irrespective of the data sequence length and without any statistical assumptions on the data. By establishing learning rates that scale as mu = O(1/t), we offer a solution that sidesteps the need for prior knowledge of the data, a prevalent limitation in real-world applications. We thereby provide a robust framework for applying SGD across varied settings, with convergence guarantees that hold in both deterministic and stochastic scenarios without any underlying assumptions.
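The learning-rate schedule described in the abstract can be illustrated with a minimal sketch: SGD training of a linear filter on a streaming sequence, with step size decaying as mu = O(1/t). The data model, the schedule constant, and the offset in the denominator are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal sketch (assumptions labeled): recover an unknown linear filter
# w_true from streaming input/output pairs via SGD on the instantaneous
# squared error, using a mu = O(1/t) learning-rate schedule.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2])   # unknown filter to recover (assumed)
w = np.zeros(3)                        # SGD iterate, initialized at zero

for t in range(1, 10001):
    x = rng.standard_normal(3)         # input regressor at time t (assumed i.i.d.)
    d = x @ w_true                     # desired filter output
    e = d - x @ w                      # instantaneous prediction error
    mu = 1.0 / (t + 10)                # mu = O(1/t); the offset 10 keeps early steps stable
    w += mu * e * x                    # SGD update on the squared error e**2 / 2

print(np.round(w, 2))                  # iterate approaches w_true
```

The offset in `1.0 / (t + 10)` does not change the asymptotic O(1/t) scaling; it only bounds the first few step sizes so that `mu * ||x||**2` stays below the stability threshold early on.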
Pages: 6