Multi-teacher knowledge distillation for debiasing recommendation with uniform data

Citations: 0
Authors
Yang, Xinxin [1 ]
Li, Xinwei [1 ]
Liu, Zhen [1 ]
Yuan, Yafan [1 ]
Wang, Yannan [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Comp Sci & Technol, Beijing 100044, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Debiasing recommendation; Knowledge distillation; Contrastive learning; Collaborative filtering
DOI
10.1016/j.eswa.2025.126808
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Recent studies have highlighted the bias problem in recommender systems, which hinders the learning of users' true preferences. One major source of bias is that the training data are missing not at random (MNAR). While existing approaches have demonstrated the usefulness of uniform data that are missing at random (MAR) for debiasing, current models lack a comprehensive exploration of the unbiased features within uniform data. Given the value and limited size of uniform data, this paper proposes a multi-teacher knowledge distillation framework (UKDRec) to extract and transfer more unbiased information from uniform data. The proposed framework consists of two components: a label-based teacher model that leverages supervision signals and a feature-based teacher model that facilitates the transfer of comprehensive unbiased features. To extract unbiased features effectively, we introduce a contrastive learning strategy that combines the uniform data with control data. The framework is trained with a multi-task learning approach, which enhances the transfer of unbiased knowledge. Extensive experiments on real-world datasets demonstrate the superior debiasing performance of our approach compared to competitive baselines.
Pages: 13
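
To make the idea in the abstract concrete, the following is a minimal, illustrative PyTorch sketch of a student recommender distilled from two teachers: a label-based teacher that supplies soft labels learned from the small uniform (MAR) data, and a feature-based teacher whose embeddings stand in for unbiased features. The MFScorer module, the multi_teacher_loss function, and all loss weights are assumptions made for illustration; the actual UKDRec framework additionally uses a contrastive learning strategy with control data and multi-task training, which this sketch omits.

# Minimal illustrative sketch (not the authors' code): a student recommender
# distilled from two hypothetical teachers -- a label-based teacher trained on
# the small uniform (MAR) data and a feature-based teacher providing unbiased
# features. Module names, loss weights, and the temperature are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MFScorer(nn.Module):
    """Matrix-factorization scorer reused for the student and both teachers."""
    def __init__(self, n_users, n_items, dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)

    def features(self, users, items):
        # Element-wise user-item interaction vector (the "feature" being distilled).
        return self.user_emb(users) * self.item_emb(items)

    def forward(self, users, items):
        # Predicted preference logit.
        return self.features(users, items).sum(-1)

def multi_teacher_loss(student, label_teacher, feat_teacher,
                       users, items, labels,
                       w_label=0.5, w_feat=0.5, temperature=2.0):
    """Supervised loss on biased interactions plus two distillation terms."""
    logits = student(users, items)
    sup = F.binary_cross_entropy_with_logits(logits, labels)

    with torch.no_grad():
        soft_labels = torch.sigmoid(label_teacher(users, items) / temperature)
        teacher_feat = feat_teacher.features(users, items)

    # Label-based teacher: match the softened teacher predictions.
    kd_label = F.binary_cross_entropy_with_logits(logits / temperature, soft_labels)
    # Feature-based teacher: align student features with the teacher's.
    kd_feat = F.mse_loss(student.features(users, items), teacher_feat)

    return sup + w_label * kd_label + w_feat * kd_feat

# Toy usage with random interactions, just to show the shape of a training step.
n_users, n_items = 100, 200
student = MFScorer(n_users, n_items)
label_teacher = MFScorer(n_users, n_items)   # would be pretrained on uniform data
feat_teacher = MFScorer(n_users, n_items)    # would be pretrained with contrastive learning
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

users = torch.randint(0, n_users, (64,))
items = torch.randint(0, n_items, (64,))
labels = torch.randint(0, 2, (64,)).float()

loss = multi_teacher_loss(student, label_teacher, feat_teacher, users, items, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

In the paper's multi-task formulation the supervised and distillation objectives are trained jointly; the fixed weights above are placeholders for whatever weighting scheme the framework actually uses.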
Related papers
(50 records in total)
  • [41] Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation
    Li, Ruoyu
    Yun, Lijun
    Zhang, Mingxuan
    Yang, Yanchen
    Cheng, Feiyan
    SENSORS, 2023, 23 (22)
  • [42] A General Knowledge Distillation Framework for Counterfactual Recommendation via Uniform Data
    Liu, Dugang
    Cheng, Pengxiang
    Dong, Zhenhua
    He, Xiuqiang
    Pan, Weike
    Ming, Zhong
    PROCEEDINGS OF THE 43RD INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '20), 2020, : 831 - 840
  • [43] MTMS: Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension
    Zhao, Zhuo
    Xie, Zhiwen
    Zhou, Guangyou
    Huang, Jimmy Xiangji
    PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 1995 - 2005
  • [44] Multi-Teacher Distillation With Single Model for Neural Machine Translation
    Liang, Xiaobo
    Wu, Lijun
    Li, Juntao
    Qin, Tao
    Zhang, Min
    Liu, Tie-Yan
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2022, 30 : 992 - 1002
  • [45] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation
    Cao, Shengcao
    Li, Mengtian
    Hays, James
    Ramanan, Deva
    Wang, Yu-Xiong
    Gui, Liang-Yan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
  • [46] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System
    Yang, Ze
    Shou, Linjun
    Gong, Ming
    Lin, Wutao
    Jiang, Daxin
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 690 - 698
  • [47] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement
    Zhang, Tianchi
    Liu, Yuxuan
    Mase, Atsushi
    APPLIED SCIENCES-BASEL, 2024, 14 (02)
  • [48] Disentangled causal representation learning for debiasing recommendation with uniform data
    Yang, Xinxin
    Li, Xinwei
    Liu, Zhen
    Wang, Yannan
    Lu, Sibo
    Liu, Feng
    APPLIED INTELLIGENCE, 2024, 54 (08) : 6760 - 6775
  • [49] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks
    Pham, Cuong
    Hoang, Tuan
    Do, Thanh-Toan
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 6424 - 6432
  • [50] Dissolved oxygen prediction in the Taiwan Strait with the attention-based multi-teacher knowledge distillation model
    Chen, Lei
    Lin, Ye
    Guo, Minquan
    Lu, Wenfang
    Li, Xueding
    Zhang, Zhenchang
    OCEAN & COASTAL MANAGEMENT, 2025, 265