Multi-teacher knowledge distillation for debiasing recommendation with uniform data

Citations: 0
Authors
Yang, Xinxin [1 ]
Li, Xinwei [1 ]
Liu, Zhen [1 ]
Yuan, Yafan [1 ]
Wang, Yannan [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Comp Sci & Technol, Beijing 100044, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Debiasing recommendation; Knowledge distillation; Contrastive learning; Collaborative filtering;
DOI
10.1016/j.eswa.2025.126808
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Recent studies have highlighted the bias problem in recommender systems, which hinders the learning of users' true preferences. One significant cause of bias is that the training data are missing not at random (MNAR). While existing approaches have demonstrated the usefulness of uniform data that are missing at random (MAR) for debiasing, current models lack a comprehensive exploration of the unbiased features within uniform data. Considering the value and limited size of uniform data, this paper proposes a multi-teacher knowledge distillation framework (UKDRec) to extract and transfer more unbiased information from uniform data. The proposed framework consists of two components: a label-based teacher model that leverages supervision signals and a feature-based teacher model that facilitates the transfer of comprehensive unbiased features. To extract unbiased features effectively, we introduce a contrastive learning strategy that combines the uniform data with control data. The framework is trained with a multi-task learning approach, which enhances the transfer of unbiased knowledge. Extensive experiments on real-world datasets demonstrate the superior debiasing performance of our approach compared with competitive baselines.
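The abstract outlines UKDRec's two-teacher design, but no implementation details appear in this record. Below is a minimal PyTorch sketch of what such a multi-teacher distillation objective could look like; the MFScorer backbone, the cosine-alignment stand-in for the paper's contrastive feature transfer, and the alpha/beta task weights are all illustrative assumptions, not the authors' actual method.

```python
# Minimal sketch (PyTorch) of a multi-teacher distillation objective in the
# spirit of UKDRec. Module names, the cosine feature-alignment term, and the
# alpha/beta weights are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MFScorer(nn.Module):
    """Plain matrix-factorization scorer, reused for student and teachers."""
    def __init__(self, n_users, n_items, dim=32):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.item = nn.Embedding(n_items, dim)

    def forward(self, u, i):
        # Dot product of user/item embeddings -> predicted preference logit.
        return (self.user(u) * self.item(i)).sum(-1)

def ukd_loss(student, label_teacher, feat_teacher, u, i, y,
             alpha=0.5, beta=0.5):
    """Multi-task objective: supervised loss on biased interactions, plus
    (a) label distillation from a teacher fitted on uniform (MAR) data and
    (b) feature distillation aligning student embeddings with the unbiased
        teacher's embeddings (cosine alignment stands in for the paper's
        contrastive strategy)."""
    pred = student(u, i)
    sup = F.binary_cross_entropy_with_logits(pred, y)      # supervision signal
    with torch.no_grad():                                  # teachers are frozen
        soft = torch.sigmoid(label_teacher(u, i))          # unbiased soft labels
        t_feat = feat_teacher.user(u)                      # unbiased user features
    label_kd = F.binary_cross_entropy_with_logits(pred, soft)
    feat_kd = 1 - F.cosine_similarity(student.user(u), t_feat).mean()
    return sup + alpha * label_kd + beta * feat_kd
```

In this sketch both teachers are frozen after being fitted on the small uniform/control data, so only the student, trained on the large biased log, receives gradients; alpha and beta play the role of hypothetical multi-task weights balancing label transfer against feature transfer.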
Pages: 13
Related Papers
(50 records in total)
  • [21] A Multi-Teacher Assisted Knowledge Distillation Approach for Enhanced Face Image Authentication
    Cheng, Tiancong
    Zhang, Ying
    Yin, Yifang
    Zimmermann, Roger
    Yu, Zhiwen
    Guo, Bin
    PROCEEDINGS OF THE 2023 ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2023, 2023, : 135 - 143
  • [22] DE-MKD: Decoupled Multi-Teacher Knowledge Distillation Based on Entropy
    Cheng, Xin
    Zhang, Zhiqiang
    Weng, Wei
    Yu, Wenxin
    Zhou, Jinjia
    MATHEMATICS, 2024, 12 (11)
  • [23] MULTI-TEACHER DISTILLATION FOR INCREMENTAL OBJECT DETECTION
    Jiang, Le
    Cheng, Hongqiang
    Ye, Xiaozhou
    Ouyang, Ye
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, ICASSP 2024, 2024, : 5520 - 5524
  • [24] Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector
    Shang, Ronghua
    Li, Wenzheng
    Zhu, Songling
    Jiao, Licheng
    Li, Yangyang
    NEURAL NETWORKS, 2023, 164 : 345 - 356
  • [25] Device adaptation free-KDA based on multi-teacher knowledge distillation
    Yang, Yafang
    Guo, Bin
    Liang, Yunji
    Zhao, Kaixing
    Yu, Zhiwen
JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2024, 15 (10) : 3603 - 3615
  • [26] Multi-teacher knowledge distillation for compressed video action recognition based on deep learning
    Wu, Meng-Chieh
    Chiu, Ching-Te
    JOURNAL OF SYSTEMS ARCHITECTURE, 2020, 103
  • [27] Bi-Level Orthogonal Multi-Teacher Distillation
    Gong, Shuyue
    Wen, Weigang
    ELECTRONICS, 2024, 13 (16)
  • [28] KDCRec: Knowledge Distillation for Counterfactual Recommendation via Uniform Data
    Liu, Dugang
    Cheng, Pengxiang
    Lin, Zinan
    Luo, Jinwei
    Dong, Zhenhua
    He, Xiuqiang
    Pan, Weike
    Ming, Zhong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (08) : 8143 - 8156
  • [29] MULTI-TEACHER KNOWLEDGE DISTILLATION FOR COMPRESSED VIDEO ACTION RECOGNITION ON DEEP NEURAL NETWORKS
    Wu, Meng-Chieh
    Chiu, Ching-Te
    Wu, Kun-Hsuan
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 2202 - 2206
  • [30] Visual emotion analysis using skill-based multi-teacher knowledge distillation
    Cladiere, Tristan
    Alata, Olivier
    Ducottet, Christophe
    Konik, Hubert
    Legrand, Anne-Claire
    PATTERN ANALYSIS AND APPLICATIONS, 2025, 28 (02)