Accurate Latent Factor Analysis via Particle Swarm Optimizers

Cited by: 0
Authors
Chen, Jia [1 ]
Luo, Xin [2 ,3 ,4 ]
Zhou, MengChu [5 ]
Affiliations
[1] Beihang Univ, Sch Cyber Sci & Technol, Beijing 100191, Peoples R China
[2] Chinese Acad Sci, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing 400714, Peoples R China
[3] Chinese Acad Sci, Chongqing Engn Res Ctr Big Data Applicat Smart Ci, Chongqing Inst Green & Intelligent Technol, Chongqing 400714, Peoples R China
[4] Univ Chinese Acad Sci, Chongqing Sch, Chongqing 400714, Peoples R China
[5] New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
Funding
National Natural Science Foundation of China;
Keywords
Big Data; Latent Factor Analysis (LFA); Particle Swarm Optimization (PSO); High-dimensional and Sparse Matrix; Large-Scale Incomplete Data; Machine Learning; Missing Data Estimation; Industrial Application; OPTIMIZATION; STRATEGY;
DOI
10.1109/SMC52423.2021.9659218
CLC Classification Number
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
A stochastic-gradient-descent-based Latent Factor Analysis (LFA) model is highly efficient in representation learning for a High-Dimensional and Sparse (HiDS) matrix. Learning rate adaptation is vital to its efficiency, and such adaptation can be realized with an evolutionary computing algorithm. However, the resulting model tends to suffer from two issues: a) premature convergence of the swarm of learning rates, caused by the adopted evolution algorithm, and b) premature convergence of the LFA model itself, caused jointly by evolution-based learning rate adaptation and the underlying optimization algorithm. This paper addresses these issues by proposing a Hierarchical Particle-swarm-optimization-incorporated Latent factor analysis (HPL) model with a two-layered structure: the first layer pre-trains the desired latent factors with a position-transitional particle-swarm-optimization-based LFA model, and the second layer refines the latent factors with a newly proposed mini-batch particle swarm optimizer. With this design, an HPL model handles premature convergence well, as supported by positive experimental results on HiDS matrices from industrial applications.
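To make the abstract's core idea concrete, the following is a minimal sketch of evolution-based learning rate adaptation for an SGD-based LFA model: a small particle swarm searches over candidate learning rates, and the factors trained with the best rate are kept after each epoch. This is an assumption-laden illustration, not the two-layered HPL model itself (which additionally uses a position-transitional PSO pre-training layer and a mini-batch PSO refinement layer); all function names, swarm settings, and hyper-parameters are hypothetical.

# Minimal illustrative sketch (not the authors' HPL implementation):
# PSO-adapted learning rates for an SGD-based latent factor model on a
# sparse matrix. Swarm size, latent dimension, and all hyper-parameters
# are assumptions chosen for readability.
import numpy as np

rng = np.random.default_rng(0)

def sgd_epoch(P, Q, triples, lr, lam=0.05):
    # One SGD pass over the observed (row, column, value) triples.
    for u, i, r in triples:
        pu = P[u].copy()
        err = r - pu @ Q[i]
        P[u] += lr * (err * Q[i] - lam * P[u])
        Q[i] += lr * (err * pu - lam * Q[i])

def rmse(P, Q, triples):
    errs = [r - P[u] @ Q[i] for u, i, r in triples]
    return float(np.sqrt(np.mean(np.square(errs))))

# Toy HiDS-like data: a 50 x 40 matrix with 200 observed entries (~10%).
n_rows, n_cols, k = 50, 40, 5
obs = [(int(rng.integers(n_rows)), int(rng.integers(n_cols)), float(rng.uniform(1, 5)))
       for _ in range(200)]
train, valid = obs[:160], obs[160:]

P = rng.normal(scale=0.1, size=(n_rows, k))
Q = rng.normal(scale=0.1, size=(n_cols, k))

# PSO over scalar learning rates: each particle is one candidate rate.
n_particles, w, c1, c2 = 5, 0.7, 1.5, 1.5
pos = rng.uniform(1e-4, 0.05, n_particles)
vel = np.zeros(n_particles)
pbest_pos, pbest_fit = pos.copy(), np.full(n_particles, np.inf)
gbest_pos, gbest_fit = pos[0], np.inf

for epoch in range(20):
    for j in range(n_particles):
        # Fitness of particle j: validation RMSE after a tentative epoch
        # run with its candidate learning rate.
        P_try, Q_try = P.copy(), Q.copy()
        sgd_epoch(P_try, Q_try, train, lr=pos[j])
        fit = rmse(P_try, Q_try, valid)
        if fit < pbest_fit[j]:
            pbest_fit[j], pbest_pos[j] = fit, pos[j]
        if fit < gbest_fit:
            gbest_fit, gbest_pos = fit, pos[j]
            P, Q = P_try, Q_try  # keep the factors trained with the best rate
    # Standard PSO velocity/position update applied to the learning rates.
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    vel = w * vel + c1 * r1 * (pbest_pos - pos) + c2 * r2 * (gbest_pos - pos)
    pos = np.clip(pos + vel, 1e-5, 0.1)

print(f"best validation RMSE: {gbest_fit:.4f} at learning rate {gbest_pos:.4f}")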
Pages: 2930-2935
Page count: 6