An efficient co-Attention Neural Network for Social Recommendation

Citations: 10
Authors
Li, Munan [1 ]
Tei, Kenji [1 ]
Fukazawa, Yoshiaki [1 ]
Affiliations
[1] Waseda Univ, Dept Comp Sci, Tokyo, Japan
Keywords
Social Recommendation; co-Attention Neural Network; Network Embedding;
DOI
10.1145/3350546.3352498
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The recent boom in social networking services has prompted research on recommendation systems. The basic assumption behind these works is that "users' preferences are similar to or influenced by their friends". Although many studies have attempted to use social relations to enhance recommendation systems, they neglect the heterogeneous nature of online social networks and the fact that a user's trust in her friends varies across items. Motivated by the natural symmetry between the latent preference vectors of a user and her friends, we propose a new social recommendation method called ScAN (short for "co-Attention Neural Network for Social Recommendation"). ScAN is based on a co-attention neural network that learns the influence value between a user and her friends from the historical interactions between the user (and her friends) and an item. When the user interacts with different items, different attention weights are assigned to the user and her friends, and the user's new latent preference feature is obtained through an aggregation strategy. To enhance recommendation performance, a network embedding technique is used as a pre-training strategy to extract users' embeddings and incorporate them into the neural network model. Extensive experiments on three real-world datasets demonstrate that ScAN outperforms state-of-the-art baseline methods on the social recommendation task across all datasets.
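The abstract describes item-conditioned attention weights assigned to the user and her friends, followed by an aggregation step that yields the user's new latent preference feature. Below is a minimal PyTorch sketch of such a layer; the class name SocialCoAttention, the additive scoring function, and all dimensions are illustrative assumptions, not the paper's exact ScAN architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SocialCoAttention(nn.Module):
    """Item-conditioned attention over a user and her friends (illustrative
    sketch; layer names and scoring function are assumptions, not the
    published ScAN design)."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj_social = nn.Linear(dim, dim)  # projects user/friend embeddings
        self.proj_item = nn.Linear(dim, dim)    # projects the item embedding
        self.score = nn.Linear(dim, 1)          # scalar attention score per node

    def forward(self, user, friends, item):
        # user: (B, d), friends: (B, K, d), item: (B, d)
        social = torch.cat([user.unsqueeze(1), friends], dim=1)   # (B, K+1, d)
        # Additive attention conditioned on the item: weights differ per item.
        h = torch.tanh(self.proj_social(social)
                       + self.proj_item(item).unsqueeze(1))       # (B, K+1, d)
        alpha = F.softmax(self.score(h).squeeze(-1), dim=1)       # (B, K+1)
        # Aggregation: the weighted sum is the user's new latent preference.
        return torch.einsum("bk,bkd->bd", alpha, social)

# Usage: batch of 2 users, 5 friends each, 16-dim embeddings (e.g., taken
# from a network-embedding pre-training step, as the abstract suggests).
layer = SocialCoAttention(16)
u, f, v = torch.randn(2, 16), torch.randn(2, 5, 16), torch.randn(2, 16)
print(layer(u, f, v).shape)  # torch.Size([2, 16])
```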
Pages: 34-42
Number of pages: 9
Related Papers
(50 records in total)
  • [41] Multi-Channel Co-Attention Network for Visual Question Answering
    Tian, Weidong
    He, Bin
    Wang, Nanxun
    Zhao, Zhongqiu
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [42] Deep Co-Attention Network for Multi-View Subspace Learning
    Zheng, Lecheng
    Cheng, Yu
    Yang, Hongxia
    Cao, Nan
    He, Jingrui
    PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE 2021 (WWW 2021), 2021, : 1528 - 1539
  • [43] Co-attention dictionary network for weakly-supervised semantic segmentation
    Wan, Weitao
    Chen, Jiansheng
    Yang, Ming-Hsuan
    Ma, Huimin
    NEUROCOMPUTING, 2022, 486 : 272 - 285
  • [44] Co-Attention Graph Pooling for Efficient Pairwise Graph Interaction Learning
    Lee, Junhyun
    Kim, Bumsoo
    Jeon, Minji
    Kang, Jaewoo
    IEEE ACCESS, 2023, 11 : 78549 - 78560
  • [45] Deep Modular Co-Attention Shifting Network for Multimodal Sentiment Analysis
    Shi, Piao
    Hu, Min
    Shi, Xuefeng
    Ren, Fuji
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2024, 20 (04)
  • [46] CANet: Co-attention network for RGB-D semantic segmentation
    Zhou, Hao
    Qi, Lu
    Huang, Hai
    Yang, Xu
    Wan, Zhaoliang
    Wen, Xianglong
    PATTERN RECOGNITION, 2022, 124
  • [47] Hierarchical Co-Attention Selection Network for Interpretable Fake News Detection
    Ge, Xiaoyi
    Hao, Shuai
    Li, Yuxiao
    Wei, Bin
    Zhang, Mingshu
    BIG DATA AND COGNITIVE COMPUTING, 2022, 6 (03)
  • [48] Pyramid Co-Attention Compare Network for Few-Shot Segmentation
    Zhang, Defu
    Luo, Ronghua
    Chen, Xuebin
    Chen, Lingwei
    IEEE ACCESS, 2021, 9 : 137249 - 137259
  • [49] Spatiotemporal-Textual Co-Attention Network for Video Question Answering
    Zha, Zheng-Jun
    Liu, Jiawei
    Yang, Tianhao
    Zhang, Yongdong
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2019, 15 (02)
  • [50] Encoder Fusion Network with Co-Attention Embedding for Referring Image Segmentation
    Feng, Guang
    Hu, Zhiwei
    Zhang, Lihe
    Lu, Huchuan
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 15501 - 15510