Multi-teacher knowledge distillation for debiasing recommendation with uniform data

Cited: 0
Authors
Yang, Xinxin [1 ]
Li, Xinwei [1 ]
Liu, Zhen [1 ]
Yuan, Yafan [1 ]
Wang, Yannan [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Comp Sci & Technol, Beijing 100044, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Debiasing recommendation; Knowledge distillation; Contrastive learning; Collaborative filtering;
DOI
10.1016/j.eswa.2025.126808
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Recent studies have highlighted the bias problem in recommender systems, which affects the learning of users' true preferences. One significant cause of bias is that the training data are missing not at random (MNAR). While existing approaches have demonstrated the usefulness of uniform data that are missing at random (MAR) for debiasing, current models lack a comprehensive exploration of the unbiased features within uniform data. Considering the value and limited size of uniform data, this paper proposes a multi-teacher knowledge distillation framework (UKDRec) to extract and transfer more unbiased information from uniform data. The proposed framework consists of two components: a label-based teacher model that leverages supervision signals and a feature-based teacher model that facilitates the transfer of comprehensive unbiased features. To effectively extract unbiased features, we introduce a contrastive learning strategy that combines the uniform data with control data. The framework is trained with a multi-task learning approach, which enhances the transfer of unbiased knowledge. Extensive experiments on real-world datasets demonstrate the superior debiasing performance of our approach compared with competitive baselines.
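Since the abstract only sketches the framework at a high level, the following is a minimal, illustrative PyTorch sketch of how a multi-task objective with a label-based teacher and a feature-based teacher might be combined. The function name, the loss weights alpha and beta, the temperature tau, and the use of an L2 feature-matching term as a stand-in for the paper's contrastive objective are all assumptions for illustration, not the authors' actual implementation.

```python
# Illustrative sketch only: names, weights, and losses below are assumptions,
# not the UKDRec implementation described in the paper.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, label_teacher_logits,
                          student_feat, feature_teacher_feat,
                          ratings, alpha=0.5, beta=0.5, tau=2.0):
    # Supervised task loss on the (biased) training interactions;
    # ratings are assumed to be binary implicit-feedback labels in {0, 1}.
    task_loss = F.binary_cross_entropy_with_logits(student_logits, ratings)

    # Label-based teacher: soften both prediction sets with temperature tau
    # and match the student to a teacher trained on the uniform (MAR) data.
    soft_teacher = torch.sigmoid(label_teacher_logits / tau)
    soft_student = torch.sigmoid(student_logits / tau)
    label_kd = F.binary_cross_entropy(soft_student, soft_teacher)

    # Feature-based teacher: align student representations with unbiased
    # features extracted from uniform data. A simple L2 distance is used
    # here as a stand-in for the paper's contrastive learning strategy.
    feat_kd = F.mse_loss(student_feat, feature_teacher_feat.detach())

    # Multi-task combination of the supervised and two distillation terms.
    return task_loss + alpha * label_kd + beta * feat_kd
```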
Pages: 13
Related papers
50 in total
  • [31] mKDNAD: A network flow anomaly detection method based on multi-teacher knowledge distillation
    Yang, Yang
    Liu, Dan
    2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 314 - 319
  • [32] Named Entity Recognition Method Based on Multi-Teacher Collaborative Cyclical Knowledge Distillation
    Jin, Chunqiao
    Yang, Shuangyuan
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 230 - 235
  • [33] A Multi-teacher Knowledge Distillation Framework for Distantly Supervised Relation Extraction with Flexible Temperature
    Fei, Hongxiao
    Tan, Yangying
    Huang, Wenti
    Long, Jun
    Huang, Jincai
    Yang, Liu
    WEB AND BIG DATA, PT II, APWEB-WAIM 2023, 2024, 14332 : 103 - 116
  • [34] Adaptive weighted multi-teacher distillation for efficient medical imaging segmentation with limited data
    Ben Loussaief, Eddardaa
    Rashwan, Hatem A.
    Ayad, Mohammed
    Khalid, Adnan
    Puig, Domenec
    KNOWLEDGE-BASED SYSTEMS, 2025, 315
  • [35] Continual Learning with Confidence-based Multi-teacher Knowledge Distillation for Neural Machine Translation
    Guo, Jiahua
    Liang, Yunlong
    Xu, Jinan
    2024 6TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING, ICNLP 2024, 2024, : 336 - 343
  • [36] Adaptive multi-teacher softened relational knowledge distillation framework for payload mismatch in image steganalysis
    Yu, Lifang
    Li, Yunwei
    Weng, Shaowei
    Tian, Huawei
    Liu, Jing
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2023, 95
  • [37] UNIC: Universal Classification Models via Multi-teacher Distillation
    Sariyildiz, Mert Bulent
    Weinzaepfel, Philippe
    Lucas, Thomas
    Larlus, Diane
    Kalantidis, Yannis
    COMPUTER VISION-ECCV 2024, PT IV, 2025, 15062 : 353 - 371
  • [38] Enhanced Accuracy and Robustness via Multi-teacher Adversarial Distillation
    Zhao, Shiji
    Yu, Jie
    Sun, Zhenlong
    Zhang, Bo
    Wei, Xingxing
    COMPUTER VISION - ECCV 2022, PT IV, 2022, 13664 : 585 - 602
  • [39] LGFA-MTKD: Enhancing Multi-Teacher Knowledge Distillation with Local and Global Frequency Attention
    Cheng, Xin
    Zhou, Jinjia
    INFORMATION, 2024, 15 (11)
  • [40] MT4MTL-KD: A Multi-Teacher Knowledge Distillation Framework for Triplet Recognition
    Gui, Shuangchun
    Wang, Zhenkun
    Chen, Jixiang
    Zhou, Xun
    Zhang, Chen
    Cao, Yi
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2024, 43 (04) : 1628 - 1639