Featured Hybrid Recommendation System Using Stochastic Gradient Descent

Cited: 5
Authors
Nguyen, Si Thin [1 ]
Kwak, Hyun Young [1 ]
Lee, Si Young [1 ]
Gim, Gwang Yong [2 ]
Affiliations
[1] Soongsil Univ, Grad Sch, Dept IT Policy & Management, Seoul, South Korea
[2] Soongsil Univ, Grad Sch, Dept Business Adm, Seoul, South Korea
Keywords
Recommendation system; stochastic gradient descent; matrix factorization; content-based; collaborative filtering; incremental learning
DOI
10.2991/ijndc.k.201218.004
CLC Classification Number
TP31 [Computer Software]
Subject Classification Codes
081202; 0835
Abstract
Besides cold-start and sparsity, the development of incremental algorithms has emerged as an interesting research direction for recommendation systems in real-data environments. While research on hybrid systems remains insufficient, owing to the complexity of combining sources from each single method such as content-based or collaborative filtering, stochastic gradient descent also exposes limitations in the optimization process for incremental learning. Motivated by these shortcomings, this study adapts a novel incremental algorithm for a featured hybrid system, combining the features of the content-based method with the robustness of matrix factorization in collaborative filtering. To evaluate the experiments, the authors also design an incremental evaluation approach for real data. The results support the hypothesis that the featured hybrid system is a feasible direction for future research, and that the proposed model achieves better results in both learning time and accuracy. (C) 2021 The Authors. Published by Atlantis Press B.V.
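The incremental matrix-factorization component that the abstract alludes to can be sketched as a per-event stochastic gradient descent update on the latent factors. This is a minimal illustration under common assumptions, not the authors' implementation; the function name `sgd_update`, the hyperparameters (`lr`, `reg`, latent dimension `k`), and the synthetic rating stream are all assumptions for the sketch.

```python
import numpy as np

def sgd_update(P, Q, u, i, r_ui, lr=0.01, reg=0.02):
    """One incremental SGD step on a single (user, item, rating) event.

    P: user latent factors, shape (n_users, k)  -- updated in place
    Q: item latent factors, shape (n_items, k)  -- updated in place
    """
    err = r_ui - P[u] @ Q[i]                  # prediction error for this event
    p_u = P[u].copy()                         # use the pre-update user vector for Q's step
    P[u] += lr * (err * Q[i] - reg * P[u])    # L2-regularized gradient step on user factors
    Q[i] += lr * (err * p_u - reg * Q[i])     # L2-regularized gradient step on item factors
    return err

# Incremental training: process ratings one at a time as they arrive,
# instead of refitting the model on the full matrix (hypothetical stream).
rng = np.random.default_rng(0)
k, n_users, n_items = 8, 100, 50
P = rng.normal(scale=0.1, size=(n_users, k))
Q = rng.normal(scale=0.1, size=(n_items, k))

stream = [(int(rng.integers(n_users)), int(rng.integers(n_items)),
           float(rng.uniform(1, 5))) for _ in range(2000)]
for u, i, r in stream:
    sgd_update(P, Q, u, i, r)
```

Updating the factors event by event is what makes the scheme incremental: no pass over the historical rating matrix is needed, so the model can track a live stream of user feedback.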
Pages: 25-32
Page count: 8
Related Papers
50 records total
  • [1] Featured Hybrid Recommendation System Using Stochastic Gradient Descent
    Si Thin Nguyen
    Hyun Young Kwak
    Si Young Lee
    Gwang Yong Gim
    [J]. International Journal of Networked and Distributed Computing, 2021, 9 : 25 - 32
  • [2] Implicit Stochastic Gradient Descent Method for Cross-Domain Recommendation System
    Vo, Nam D.
    Hong, Minsung
    Jung, Jason J.
    [J]. SENSORS, 2020, 20 (09)
  • [3] Stochastic gradient descent for hybrid quantum-classical optimization
    Sweke, Ryan
    Wilde, Frederik
    Meyer, Johannes Jakob
    Schuld, Maria
    Faehrmann, Paul K.
    Meynard-Piganeau, Barthelemy
    Eisert, Jens
    [J]. QUANTUM, 2020, 4
  • [4] Hybrid Approximate Gradient and Stochastic Descent for Falsification of Nonlinear Systems
    Yaghoubi, Shakiba
    Fainekos, Georgios
    [J]. 2017 AMERICAN CONTROL CONFERENCE (ACC), 2017, : 529 - 534
  • [5] Adaptive wavefront correction: a hybrid VLSI/optical system implementing parallel stochastic gradient descent
    Cohen, MH
    Vorontsov, M
    Carhart, G
    Cauwenberghs, G
    [J]. OPTICS IN ATMOSPHERIC PROPAGATION AND ADAPTIVE SYSTEMS III, 1999, 3866 : 176 - 182
  • [6] Brain Source Localization Using Stochastic Gradient Descent
    Al-Momani, Sajedah
    Mir, Hasan
    Al-Nashash, Hasan
    Al-Kaylani, Muhammad
    [J]. IEEE SENSORS JOURNAL, 2021, 21 (06) : 8375 - 8383
  • [7] Distributed Stochastic Gradient Descent Using LDGM Codes
    Horii, Shunsuke
    Yoshida, Takahiro
    Kobayashi, Manabu
    Matsushima, Toshiyasu
    [J]. 2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019, : 1417 - 1421
  • [8] LARGE SCALE RANKING USING STOCHASTIC GRADIENT DESCENT
    Tas, Engin
    [J]. COMPTES RENDUS DE L ACADEMIE BULGARE DES SCIENCES, 2022, 75 (10): : 1419 - 1427
  • [9] Using Stochastic Gradient Decent Algorithm For Incremental Matrix Factorization In Recommendation System
    Nguyen, Si-Thin
    Kwak, Hyun-Young
    Lee, Seok-Hee
    Gim, Gwang-Yong
    [J]. 2019 20TH IEEE/ACIS INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, ARTIFICIAL INTELLIGENCE, NETWORKING AND PARALLEL/DISTRIBUTED COMPUTING (SNPD), 2019, : 308 - 319
  • [10] Preconditioned Stochastic Gradient Descent
    Li, Xi-Lin
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (05) : 1454 - 1466