Improving Transformer-based Sequential Conversational Recommendations through Knowledge Graph Embeddings

Cited by: 2
Authors
Petruzzelli, Alessandro [1 ]
Martina, Alessandro Francesco Maria [1 ]
Spillo, Giuseppe [1 ]
Musto, Cataldo [1 ]
de Gemmis, Marco [1 ]
Lops, Pasquale [1 ]
Semeraro, Giovanni [1 ]
Affiliations
[1] Univ Bari, Bari, Italy
Keywords
Conversational Recommendations; Transformers; Knowledge Graphs; Recommender Systems; Preferences; Critiques
DOI
10.1145/3627043.3659565
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Conversational Recommender Systems (CRS) have recently drawn attention due to their capacity to deliver personalized recommendations through multi-turn natural language interactions. In this paper, we follow this research line and introduce a Knowledge-Aware Sequential Conversational Recommender System (KASCRS) that exploits Transformers and knowledge graph embeddings to provide users with recommendations in a conversational setting. In particular, KASCRS predicts a suitable recommendation based on the elements mentioned in a conversation between a user and a CRS. To do this, we design a model that: (i) encodes each conversation as a sequence of entities mentioned in the dialogue (i.e., items and properties), and (ii) is trained on a cloze task, that is, it learns to predict the final element of the sequence (which corresponds to the item to be recommended) based on the information it has previously seen. The model has two main hallmarks. First, we exploit Transformers and self-attention to capture the sequential dependencies among the entities mentioned in the training dialogues, similarly to session-based recommender systems [25]. Second, we use knowledge graphs (KGs) to improve the quality of the representation of the elements mentioned in each sequence: we exploit knowledge graph embedding techniques to pre-train the representations of items and properties, and we feed the input layer of our architecture with the resulting embeddings. In this way, KASCRS integrates knowledge from the KG as well as the dependencies and co-occurrences emerging from conversational data, resulting in a more accurate representation of users and items. Our experiments confirm this intuition: KASCRS outperforms several state-of-the-art baselines on two different datasets.
Pages: 172-182
Number of pages: 11
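The abstract above describes the model at a high level: dialogue entities (items and properties) are embedded with vectors pre-trained by a knowledge graph embedding technique, passed through a self-attentive Transformer encoder, and the network is trained on a cloze objective to predict the masked final entity, i.e., the item to recommend. Below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation; the embedding size, layer counts, the TransE-style pre-trained matrix, and the reserved [PAD]/[MASK] ids are all illustrative assumptions.

# Minimal sketch of a knowledge-aware sequential recommender as described in the
# abstract (a BERT4Rec-style cloze objective over dialogue entities). This is an
# illustration under stated assumptions, not the paper's released code.
import torch
import torch.nn as nn

class KnowledgeAwareSeqRecommender(nn.Module):
    def __init__(self, num_entities, dim=128, n_heads=4, n_layers=2, max_len=50,
                 pretrained_kg_emb=None):
        super().__init__()
        # Entity vocabulary plus two reserved ids: 0 = [PAD], 1 = [MASK] (assumption).
        self.entity_emb = nn.Embedding(num_entities + 2, dim, padding_idx=0)
        if pretrained_kg_emb is not None:
            # Initialise entity rows with pre-trained KG embeddings (e.g. TransE-style),
            # shape (num_entities, dim), instead of random vectors.
            self.entity_emb.weight.data[2:] = torch.as_tensor(
                pretrained_kg_emb, dtype=torch.float32)
        self.pos_emb = nn.Embedding(max_len, dim)
        encoder_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads,
                                                   batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.out = nn.Linear(dim, num_entities + 2)  # scores over the entity vocabulary

    def forward(self, seq):
        # seq: (batch, seq_len) ids of entities mentioned in the dialogue, with the
        # position to predict set to the [MASK] id (1) and padding set to 0.
        positions = torch.arange(seq.size(1), device=seq.device).unsqueeze(0)
        x = self.entity_emb(seq) + self.pos_emb(positions)
        h = self.encoder(x, src_key_padding_mask=(seq == 0))
        return self.out(h)  # (batch, seq_len, vocab) logits for the cloze task

Training would then minimize cross-entropy only at the masked positions, and at inference time the logits at the final [MASK] position would be ranked to produce the recommendation list.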
Related Papers
50 records in total
  • [31] Enriching Transformer-Based Embeddings for Emotion Identification in an Agglutinative Language: Turkish
    Uymaz, Hande Aka
    Metin, Senem Kumova
    IT PROFESSIONAL, 2023, 25 (04) : 67 - 73
  • [32] Hybrid Semantics-Aware Recommendations Exploiting Knowledge Graph Embeddings
    Musto, Cataldo
    Basile, Pierpaolo
    Semeraro, Giovanni
    ADVANCES IN ARTIFICIAL INTELLIGENCE, AI*IA 2019, 2019, 11946 : 87 - 100
  • [33] InteractE: Improving Convolution-Based Knowledge Graph Embeddings by Increasing Feature Interactions
    Vashishth, Shikhar
    Sanyal, Soumya
    Nitin, Vikram
    Agrawal, Nilesh
    Talukdar, Partha
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 3009 - 3016
  • [34] Clinical trial recommendations using Semantics-Based inductive inference and knowledge graph embeddings
    Devarakonda, Murthy V.
    Mohanty, Smita
    Sunkishala, Raja Rao
    Mallampalli, Nag
    Liu, Xiong
    JOURNAL OF BIOMEDICAL INFORMATICS, 2024, 154
  • [35] Enriching Translation-Based Knowledge Graph Embeddings Through Continual Learning
    Song, Hyun-Je
    Park, Seong-Bae
    IEEE ACCESS, 2018, 6 : 60489 - 60497
  • [36] Conversational Question Answering over Knowledge Graphs with Transformer and Graph Attention Networks
    Kacupaj, Endri
    Plepi, Joan
    Singh, Kuldeep
    Thakkar, Harsh
    Lehmann, Jens
    Maleshkova, Maria
    16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021, : 850 - 862
  • [37] A Transformer-based Multi-Platform Sequential Estimation Fusion
    Zhai, Xupeng
    Yang, Yanbo
    Liu, Zhunga
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 144
  • [38] Mental Health Counseling From Conversational Content With Transformer-Based Machine Learning
    Imel, Zac E.
    Tanana, Michael J.
    Soma, Christina S.
    Hull, Thomas D.
    Pace, Brian T.
    Stanco, Sarah C.
    Creed, Torrey A.
    Moyers, Theresa B.
    Atkins, David C.
    JAMA NETWORK OPEN, 2024, 7 (01) : E2352590
  • [39] Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings
    Prakash, Prafull
    Shashidhar, Saurabh Kumar
    Zhao, Wenlong
    Rongali, Subendhu
    Khan, Haidar
    Kayser, Michael
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 4711 - 4717
  • [40] Transformer-Based Cross-Modal Recipe Embeddings with Large Batch Training
    Yang, Jing
    Chen, Junwen
    Yanai, Keiji
    MULTIMEDIA MODELING, MMM 2023, PT II, 2023, 13834 : 471 - 482