Mitigate Gender Bias Using Negative Multi-task Learning

Cited by: 2
Authors
Gao, Liyuan [1 ]
Zhan, Huixin [1 ]
Sheng, Victor S. [1 ]
Affiliations
[1] Texas Tech Univ, Comp Sci, 2500 Broadway, Lubbock, TX 79409 USA
Keywords
Gender bias; Selective privacy-preserving; Negative multi-task learning; Classification
DOI
10.1007/s11063-023-11368-0
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Deep learning models have showcased remarkable performance in natural language processing tasks. While much attention has been paid to improving utility, privacy leakage and social bias are two major concerns in trained models. In this paper, we address privacy protection and gender bias mitigation in classification models simultaneously. We first introduce a selective privacy-preserving method that obscures individuals' sensitive information by adding noise to word embeddings. We then propose a negative multi-task learning framework to mitigate gender bias, which involves a main task and a gender prediction task. The main task employs a positive loss constraint for utility assurance, while the gender prediction task utilizes a negative loss constraint to remove gender-specific features. We analyzed four existing word embeddings and evaluated them on sentiment analysis and medical text classification tasks within the proposed negative multi-task learning framework. For instance, RoBERTa achieves the best performance with an average accuracy of 95% for both negative and positive sentiment, with disparity scores of 1.1 and 1.6, respectively, and GloVe achieves the best average accuracy of 96.42% with a 0.28 disparity score on the medical task. Our experimental results indicate that our negative multi-task learning framework can effectively mitigate gender bias while maintaining model utility for both sentiment analysis and medical text classification.
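The two ideas in the abstract can be illustrated with a minimal sketch: Gaussian noise is added to word embeddings to obscure sensitive information, and the gender-prediction loss enters the joint objective with a negative sign so that gradient descent suppresses, rather than learns, gender-specific features. The function names, the noise distribution, and the weighting parameter `lam` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def privatize_embeddings(embeddings, noise_scale=0.1):
    """Selective privacy-preserving step (sketch): obscure sensitive
    information by adding Gaussian noise to the word embeddings.
    The Gaussian choice and scale are assumptions for illustration."""
    noise = rng.normal(0.0, noise_scale, size=embeddings.shape)
    return embeddings + noise

def negative_multitask_loss(main_loss, gender_loss, lam=0.5):
    """Negative multi-task objective (sketch): the main task keeps a
    positive loss constraint for utility, while the gender-prediction
    task is subtracted (negative loss constraint) so that minimizing
    the total objective removes gender-specific features."""
    return main_loss - lam * gender_loss
```

In a training loop, both losses would be computed from the (noised) embeddings and the combined objective minimized; a lower `gender_loss` is no longer rewarded, so the shared representation is pushed away from gender-predictive features.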
Pages: 11131-11146
Page count: 16