BERTERS: Multimodal representation learning for expert recommendation system with transformers and graph embeddings

Cited by: 14
Authors
Nikzad-Khasmakhi, N. [1 ]
Balafar, M. A. [1 ]
Feizi-Derakhshi, M. Reza [1 ]
Motamed, Cina [2 ]
Affiliations
[1] Univ Tabriz, Dept Comp Engn, Tabriz, Iran
[2] Univ Orleans, Dept Comp Sci, Orleans, France
Keywords
Multimodal representation learning; Expert recommendation system; Transformer; Graph embedding;
D O I
10.1016/j.chaos.2021.111260
Chinese Library Classification
O1 [Mathematics];
Discipline codes
0701 ; 070101 ;
Abstract
An expert recommendation system suggests relevant experts on a particular topic based on three different scores: authority, text similarity, and reputation. Most previous studies compute these scores individually and join them with a linear combination strategy. In this paper, by contrast, we introduce a transfer-learning-based, multimodal approach, called BERTERS, that represents each expert candidate by a single vector that embodies these scores. BERTERS determines a representation for each candidate that captures the candidate's level of knowledge, popularity and influence, and history. BERTERS directly uses both transformers and graph embedding techniques to convert the content published by candidates, and the collaborative relationships between them, into low-dimensional vectors that express the candidates' text-similarity and authority scores. In addition, to enhance recommendation accuracy, BERTERS takes additional features into account as a reputation score. We conduct extensive experiments on multi-label classification, recommendation, and visualization tasks, and assess performance with four different classifiers, diverse train ratios, and various embedding sizes. In the classification task, BERTERS strengthens performance on the Micro-F1 and Macro-F1 metrics by 23.40% and 34.45% compared with single-modality-based methods. Furthermore, BERTERS achieves a gain of 9.12% over the baselines. The results also demonstrate BERTERS's ability to extend to a variety of domains, such as academic and CQA, to find experts. Since the proposed expert embeddings contain rich semantic and syntactic information about each candidate, BERTERS significantly improves over the baselines in all tasks. (c) 2021 Elsevier Ltd. All rights reserved.
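The multimodal fusion the abstract describes (a transformer-based content embedding, a graph embedding of the collaboration network, and scalar reputation features combined into a single expert vector) can be sketched as a simple concatenation. This is a minimal illustration only; the function name, embedding dimensions, and reputation values below are assumptions for demonstration, not the paper's actual implementation.

```python
import numpy as np

def fuse_expert_representation(text_emb, graph_emb, reputation_feats):
    """Concatenate per-modality vectors into one expert embedding.

    text_emb:         content embedding (text-similarity signal),
                      e.g. from a transformer such as BERT
    graph_emb:        embedding of the candidate in the collaboration
                      graph (authority signal), e.g. from node2vec
    reputation_feats: scalar reputation features (e.g. citation counts)
    """
    return np.concatenate([text_emb, graph_emb, reputation_feats])

# Toy example with illustrative dimensions: 768-d text vector,
# 128-d graph vector, and 2 hypothetical reputation features.
text_emb = np.random.rand(768)
graph_emb = np.random.rand(128)
reputation = np.array([42.0, 17.0])

expert_vec = fuse_expert_representation(text_emb, graph_emb, reputation)
print(expert_vec.shape)  # (898,)
```

In the paper's framing, this single vector replaces the linear combination of separately computed scores used by earlier systems; downstream tasks (classification, recommendation) then operate on the fused embedding directly.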
Pages: 19
Related papers
50 records in total
  • [21] Trip Reinforcement Recommendation with Graph-based Representation Learning
    Chen, Lei
    Cao, Jie
    Tao, Haicheng
    Wu, Jia
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2023, 17 (04)
  • [22] Graph Diffusion-Based Representation Learning for Sequential Recommendation
    Wang, Zhaobo
    Zhu, Yanmin
    Wang, Chunyang
    Zhao, Xuhao
    Li, Bo
    Yu, Jiadi
    Tang, Feilong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (12) : 8395 - 8407
  • [23] Few-shot learning with transformers via graph embeddings for molecular property prediction
    Torres, Luis H. M.
    Ribeiro, Bernardete
    Arrais, Joel P.
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 225
  • [24] GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph
    Yang, Junhan
    Liu, Zheng
    Xiao, Shitao
    Li, Chaozhuo
    Lian, Defu
    Agrawal, Sanjay
    Singh, Amit
    Sun, Guangzhong
    Xie, Xing
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [25] Multimodal representation learning for tourism recommendation with two-tower architecture
    Cui, Yuhang
    Liang, Shengbin
    Zhang, Yuying
    PLOS ONE, 2024, 19 (02):
  • [27] Explainable mutual fund recommendation system developed based on knowledge graph embeddings
    Hsu, Pei-Ying
    Chen, Chiao-Ting
    Chou, Chin
    Huang, Szu-Hao
    APPLIED INTELLIGENCE, 2022, 52 (09) : 10779 - 10804
  • [28] Word Representation Learning in Multimodal Pre-Trained Transformers: An Intrinsic Evaluation
    Pezzelle, Sandro
    Takmaz, Ece
    Fernandez, Raquel
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2021, 9 : 1563 - 1579
  • [29] Multimodal Representation Learning via Graph Isomorphism Network for Toxicity Multitask Learning
    Wang, Guishen
    Feng, Hui
    Du, Mengyan
    Feng, Yuncong
    Cao, Chen
    JOURNAL OF CHEMICAL INFORMATION AND MODELING, 2024, 64 (21) : 8322 - 8338
  • [30] Multimodal Movie Recommendation System Using Deep Learning
    Mu, Yongheng
    Wu, Yun
    MATHEMATICS, 2023, 11 (04)