Adjusted stochastic gradient descent for latent factor analysis

Cited by: 13
Authors
Li, Qing [1 ,2 ]
Xiong, Diwen [1 ]
Shang, Mingsheng [1 ]
Affiliations
[1] Chinese Acad Sci, Chongqing Inst Green & Intelligent Technol, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing 400714, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Coll Comp Sci & Technol, Chongqing 400065, Peoples R China
Keywords
Big data analysis; High-dimensional and incomplete matrix; Stochastic gradient descent; Latent factor analysis; Gradient adjustment; Adaptive model; Particle swarm optimization; Local optima; FACTOR MODEL; SPARSE MATRICES; NEURAL-NETWORKS; OPTIMIZATION
DOI
10.1016/j.ins.2021.12.065
CLC (Chinese Library Classification)
TP [Automation Technology; Computer Technology]
Discipline code
0812
Abstract
A high-dimensional and incomplete (HDI) matrix is a common form of big data in many industrial applications. A latent factor analysis (LFA) model optimized by the stochastic gradient descent (SGD) algorithm is often adopted to learn the abundant knowledge embedded in an HDI matrix. Despite its computational tractability and scalability, the regular SGD algorithm tends to get stuck in a local optimum when solving a bilinear problem such as LFA. To address this issue, this paper proposes an Adjusted Stochastic Gradient Descent (ASGD) algorithm for latent factor analysis, whose adjustment mechanism considers the bi-polar gradient directions during optimization; this mechanism is theoretically proven to be efficient at overstepping local saddle points and avoiding premature convergence. In addition, the model's hyper-parameters are made self-adaptive via the particle swarm optimization (PSO) algorithm, for higher practicality. Experimental results show that the proposed model outperforms other state-of-the-art approaches on six HDI matrices from industrial applications, especially in prediction accuracy for missing data. © 2021 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
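This record does not include the paper's exact update rule, so the sketch below is only an assumption-laden illustration in Python: SGD-based LFA trained on the observed entries of a sparse matrix, with the adjustment mechanism approximated by blending each new gradient with the previous one so that opposing ("bi-polar") directions damp the step. The function name `sgd_lfa`, the blending coefficient `beta`, and all defaults are hypothetical stand-ins, not the authors' ASGD.

```python
import numpy as np

def sgd_lfa(triples, m, n, rank=20, lr=0.01, reg=0.05, beta=0.5, epochs=50):
    """SGD-trained latent factor analysis on an HDI matrix (illustrative).

    triples: observed entries (i, j, r) of an m x n matrix; most entries
    are missing, so training touches only these. The "adjusted" step
    blends each gradient with the previous one for the same row/column,
    so opposing (bi-polar) directions damp the update -- a hypothetical
    stand-in for the paper's adjustment mechanism.
    """
    rng = np.random.default_rng(0)
    P = rng.standard_normal((m, rank)) * 0.1   # row (e.g. user) factors
    Q = rng.standard_normal((n, rank)) * 0.1   # column (e.g. item) factors
    prev_gp = np.zeros((m, rank))              # last gradient per row
    prev_gq = np.zeros((n, rank))              # last gradient per column
    for _ in range(epochs):
        for i, j, r in triples:
            err = r - P[i] @ Q[j]              # error on observed entry
            gp = -err * Q[j] + reg * P[i]      # instantaneous gradients
            gq = -err * P[i] + reg * Q[j]
            # blend current and previous gradients: sign disagreement
            # between them shrinks the effective step
            P[i] -= lr * (beta * gp + (1 - beta) * prev_gp[i])
            Q[j] -= lr * (beta * gq + (1 - beta) * prev_gq[j])
            prev_gp[i], prev_gq[j] = gp, gq
    return P, Q

if __name__ == "__main__":
    # toy 4 x 3 matrix with six observed entries
    obs = [(0, 0, 5.0), (0, 2, 3.0), (1, 1, 4.0),
           (2, 0, 1.0), (2, 2, 2.0), (3, 1, 5.0)]
    P, Q = sgd_lfa(obs, m=4, n=3, rank=2, epochs=200)
    print(P[0] @ Q[0])  # should approach the observed value 5.0
```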
Pages: 196-213 (18 pages)
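The abstract also states that the hyper-parameters are made self-adaptive via PSO. Below is a minimal sketch of standard PSO tuning, say, a learning rate and a regularization coefficient against a caller-supplied validation objective; the paper's self-adaptation scheme may be integrated into training rather than run as an outer loop, and `pso_tune` with its inertia/acceleration defaults is an assumption for illustration.

```python
import numpy as np

def pso_tune(objective, bounds, n_particles=10, iters=30, w=0.7, c1=1.5, c2=1.5):
    """Plain particle swarm optimization over a hyper-parameter box.

    objective: maps a parameter vector, e.g. [lr, reg], to a score to
    minimize, such as validation RMSE of the LFA model.
    bounds: one (low, high) pair per dimension.
    """
    rng = np.random.default_rng(1)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))  # positions
    v = np.zeros_like(x)                                 # velocities
    pbest = x.copy()                                     # personal bests
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()               # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                       # stay in the box
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

if __name__ == "__main__":
    # demo objective standing in for validation RMSE of the LFA model
    best = pso_tune(lambda p: (p[0] - 0.01) ** 2 + (p[1] - 0.02) ** 2,
                    bounds=[(1e-4, 1e-1), (1e-4, 1e-1)])
    print(best)  # converges near [0.01, 0.02]
```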