Attentional factorization machine with review-based user-item interaction for recommendation

Cited by: 0
Authors
Li, Zheng [1 ,2 ,3 ]
Jin, Di [1 ]
Yuan, Ke [1 ]
Affiliations
[1] Henan Univ, Coll Comp & Informat Engn, Kaifeng 475004, Henan, Peoples R China
[2] Henan Univ, Henan Engn Lab Spatial Informat Proc, Kaifeng 475004, Henan, Peoples R China
[3] Henan Univ, Henan Key Lab Big Data Anal & Proc, Kaifeng 475004, Henan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
NETWORK;
DOI
10.1038/s41598-023-40633-4
CLC classification
O [Mathematical sciences and chemistry]; P [Astronomy and earth sciences]; Q [Biological sciences]; N [General natural sciences];
Discipline codes
07; 0710; 09;
Abstract
In recommender systems, user reviews of items contain rich semantic information that can express users' preferences and item features. However, existing review-based recommendation methods either use static word-vector models or cannot effectively extract long-sequence features from reviews, which limits the expressiveness of user features. Furthermore, the impact of different or useless feature interactions between users and items on recommendation performance is ignored. Therefore, we propose an attentional factorization machine with review-based user-item interaction for recommendation (AFMRUI), which first leverages RoBERTa to obtain the embedding of each user/item review and combines bidirectional gated recurrent units with an attention network to highlight the more useful information in both user and item reviews. We then adopt an AFM to learn user-item feature interactions, which distinguishes the importance of different interactions and yields more accurate rating prediction, thereby improving recommendation. Finally, we evaluated the method on five real-world datasets. Experimental results demonstrate that the proposed AFMRUI outperforms state-of-the-art review-based methods on two commonly used evaluation metrics.
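The following is a minimal PyTorch sketch of the two components the abstract describes: a bidirectional-GRU-with-attention encoder over review token embeddings, and an AFM-style attention-weighted pooling over pairwise feature interactions. It is written under stated assumptions, not from the authors' code: the RoBERTa encoder is represented by a plain tensor of token embeddings, a single shared ReviewEncoder stands in for the separate user and item towers, and all names and layer sizes (ReviewEncoder, AFMInteraction, hidden_dim, attn_dim) are illustrative.

# Sketch only: layer sizes, names, and field layout are assumptions.
import torch
import torch.nn as nn


class ReviewEncoder(nn.Module):
    """Bidirectional GRU with additive attention over review token embeddings."""

    def __init__(self, embed_dim: int, hidden_dim: int):
        super().__init__()
        self.bigru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, embed_dim), e.g. frozen RoBERTa token embeddings
        states, _ = self.bigru(tokens)                  # (batch, seq_len, 2h)
        weights = torch.softmax(self.attn(states), 1)   # attention over tokens
        return (weights * states).sum(dim=1)            # (batch, 2h) review vector


class AFMInteraction(nn.Module):
    """Attention-weighted pooling over pairwise feature interactions (AFM-style)."""

    def __init__(self, dim: int, attn_dim: int = 32):
        super().__init__()
        self.attn = nn.Sequential(nn.Linear(dim, attn_dim), nn.ReLU(),
                                  nn.Linear(attn_dim, 1, bias=False))
        self.proj = nn.Linear(dim, 1, bias=False)

    def forward(self, fields: torch.Tensor) -> torch.Tensor:
        # fields: (batch, num_fields, dim) user/item feature vectors
        _, m, _ = fields.shape
        rows, cols = torch.triu_indices(m, m, offset=1)
        pairs = fields[:, rows, :] * fields[:, cols, :]   # element-wise products
        weights = torch.softmax(self.attn(pairs), dim=1)  # importance of each pair
        pooled = (weights * pairs).sum(dim=1)
        return self.proj(pooled).squeeze(-1)              # scalar prediction term


if __name__ == "__main__":
    encoder = ReviewEncoder(embed_dim=768, hidden_dim=32)  # 768 = RoBERTa-base
    afm = AFMInteraction(dim=64)                           # 64 = 2 * hidden_dim
    user_tokens = torch.randn(4, 50, 768)  # stand-in for RoBERTa output
    item_tokens = torch.randn(4, 50, 768)
    fields = torch.stack([encoder(user_tokens), encoder(item_tokens)], dim=1)
    print(afm(fields).shape)               # torch.Size([4])

In a full model this interaction term would typically be combined with global and per-feature bias terms before computing the rating-prediction loss; those pieces are omitted here for brevity.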
Pages: 17
Related articles
50 records in total
  • [1] Attentional factorization machine with review-based user–item interaction for recommendation
    Zheng Li
    Di Jin
    Ke Yuan
    [J]. Scientific Reports, 13
  • [2] Recommendation Based on Multimodal Information of User-Item Interactions
    Cai, Guoyong
    Chen, Nannan
    [J]. 2019 9TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND TECHNOLOGY (ICIST2019), 2019, : 288 - 293
  • [3] Dual disentanglement of user-item interaction for recommendation with causal embedding
    Wang, Chenyu
    Ye, Yawen
    Ma, Liyuan
    Li, Dun
    Zhuang, Lei
    [J]. INFORMATION PROCESSING & MANAGEMENT, 2023, 60 (05)
  • [4] User-Item Matching for Recommendation Fairness
    Dong, Qiang
    Xie, Shuang-Shuang
    Li, Wen-Jun
    [J]. IEEE ACCESS, 2021, 9 : 130389 - 130398
  • [5] Learning user-item paths for explainable recommendation
    Wang, Tongxuan
    Zheng, Xiaolong
    He, Saike
    Zhang, Zhu
    Wu, Desheng Dash
    [J]. IFAC PAPERSONLINE, 2020, 53 (05): : 436 - 440
  • [6] Learning Neighbor User Intention on User-Item Interaction Graphs for Better Sequential Recommendation
    Yu, Mei
    Zhu, Kun
    Zhao, Mankun
    Yu, Jian
    Xu, Tianyi
    Jin, Di
    Li, Xuewei
    Yu, Ruiguo
    [J]. ACM TRANSACTIONS ON THE WEB, 2024, 18 (02)
  • [7] Service Recommendation based on Attentional Factorization Machine
    Cao, Yingcheng
    Liu, Jianxun
    Shi, Min
    Cao, Buqing
    Chen, Ting
    Wen, Yiping
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON SERVICES COMPUTING (IEEE SCC 2019), 2019, : 189 - 196
  • [8] Sketching Dynamic User-Item Interactions for Online Item Recommendation
    Kitazawa, Takuya
    [J]. CHIIR'17: PROCEEDINGS OF THE 2017 CONFERENCE HUMAN INFORMATION INTERACTION AND RETRIEVAL, 2017, : 357 - 360
  • [9] Declarative User-Item Profiling Based Context-Aware Recommendation
    Lumbantoruan, Rosni
    Zhou, Xiangmin
    Ren, Yongli
    [J]. ADVANCED DATA MINING AND APPLICATIONS, 2020, 12447 : 413 - 427
  • [10] Improving Collaborative Recommendation via User-Item Subgroups
    Bu, Jiajun
    Shen, Xin
    Xu, Bin
    Chen, Chun
    He, Xiaofei
    Cai, Deng
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2016, 28 (09) : 2363 - 2375