Deep Asymmetric Multi-task Feature Learning

Cited: 0
Authors
Lee, Hae Beom [1 ,2 ]
Yang, Eunho [2 ,3 ]
Hwang, Sung Ju [2 ,3 ]
Affiliations
[1] UNIST, Ulsan, South Korea
[2] AITRICS, Seoul, South Korea
[3] Korea Adv Inst Sci & Technol, Daejeon, South Korea
Funding
National Research Foundation of Singapore
Keywords: (none listed)
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We propose Deep Asymmetric Multitask Feature Learning (Deep-AMTFL) which can learn deep representations shared across multiple tasks while effectively preventing negative transfer that may happen in the feature sharing process. Specifically, we introduce an asymmetric autoencoder term that allows reliable predictors for the easy tasks to have high contribution to the feature learning while suppressing the influences of unreliable predictors for more difficult tasks. This allows the learning of less noisy representations, and enables unreliable predictors to exploit knowledge from the reliable predictors via the shared latent features. Such asymmetric knowledge transfer through shared features is also more scalable and efficient than inter-task asymmetric transfer. We validate our Deep-AMTFL model on multiple benchmark datasets for multitask learning and image classification, on which it significantly outperforms existing symmetric and asymmetric multitask learning models, by effectively preventing negative transfer in deep feature learning.
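The asymmetric autoencoder idea in the abstract can be sketched numerically: per-task losses are measured, reliable (low-loss) tasks are given a larger say in reconstructing the shared features, and noisy tasks are suppressed. A minimal NumPy sketch follows; the inverse-loss weighting `a = 1 / (1 + task_loss)` and all variable names are illustrative assumptions, not the paper's exact objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: N examples, shared feature dimension D, T tasks.
N, D, T = 100, 8, 3
Z = rng.normal(size=(N, D))        # shared latent features
W = rng.normal(size=(D, T))        # per-task linear predictors

# Task 2 gets much noisier labels, making it the "unreliable" task.
Y = Z @ W + rng.normal(scale=[0.1, 0.5, 2.0], size=(N, T))

preds = Z @ W                                   # task predictions
task_loss = ((preds - Y) ** 2).mean(axis=0)     # per-task empirical loss

# Asymmetric weights (illustrative): reliable, low-loss tasks contribute
# more to the feedback term; unreliable tasks are suppressed.
a = 1.0 / (1.0 + task_loss)

# Autoencoder-style feedback: reconstruct the shared features from the
# reliability-weighted task predictions, so knowledge flows back into the
# shared representation mainly from the easy, reliable tasks.
A = rng.normal(size=(T, D)) * 0.1               # reconstruction map (learned in practice)
recon = (preds * a) @ A
feedback_loss = ((Z - recon) ** 2).mean()

total = task_loss.sum() + 0.1 * feedback_loss   # combined objective value
```

In a full model, `Z`, `W`, and `A` would be trained jointly by gradient descent; the sketch only shows how loss-dependent weights make the feature feedback asymmetric across tasks.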
Pages: 9