Multi-task feature selection with sparse regularization to extract common and task-specific features

Cited by: 14
Authors
Zhang, Jiashuai [1 ]
Miao, Jianyu [2 ]
Zhao, Kun [3 ]
Tian, Yingjie [4 ,5 ,6 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
[2] Henan Univ Technol, Coll Informat Sci & Engn, Zhengzhou 450001, Henan, Peoples R China
[3] Beijing Wuzi Univ, Sch Logist, Beijing 101149, Peoples R China
[4] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[5] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[6] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Multi-task feature learning; Sparse regularization; Non-convex; ADMM; Variable selection; Regression; Classification
DOI
10.1016/j.neucom.2019.02.035
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Multi-task learning (MTL) exploits the relationships among tasks to improve the generalization performance of all tasks by learning them simultaneously. Multi-task sparse feature learning, formulated under the regularization framework, is one of the main approaches to MTL, so the choice of regularization term is crucial for multi-task sparse feature models. While most existing models use convex sparse regularization, the non-convex capped-ℓ1 regularization has been extended to MTL and shown to be a powerful sparsity-inducing term. In this paper, we propose a novel regularization term for multi-task learning by extending the non-convex ℓ1-2 regularization to the multi-task setting. The regularization term not only induces group sparsity to extract the common features shared by all tasks, but also learns task-specific features through the relaxation provided by its second term. Although the model formulation resembles one previously proposed for multi-class problems, this is the first extension of ℓ1-2 regularization to multi-task learning in which both common and task-specific features are extracted. A classical multi-task learning model, Multi-task Feature Selection (MTFS), can be viewed as a special case of the proposed model. Because of the complexity of the regularization, we approximate the original problem by a locally linear subproblem and solve this subproblem with the Alternating Direction Method of Multipliers (ADMM). Theoretical analysis establishes the convergence of the proposed algorithm, and its time complexity is given. Experimental results demonstrate the effectiveness of the proposed method. (C) 2019 Elsevier B.V. All rights reserved.
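For context, the LaTeX sketch below restates two penalties the abstract builds on: the known single-task ℓ1-2 penalty and the classical ℓ2,1 penalty used by MTFS on the coefficient matrix W ∈ R^(d×T) (rows indexed by features, columns by tasks). The exact multi-task ℓ1-2 regularizer of the paper is not reproduced in this record; the combined form on the last line is only an assumed illustration of how a subtracted second term can relax row-wise group sparsity.

    % Single-task l_{1-2} penalty (known form):
    R_{1\text{-}2}(w) = \lambda\bigl(\lVert w\rVert_1 - \lVert w\rVert_2\bigr)
    % Classical MTFS group-sparse (l_{2,1}) penalty over the rows W^{j} of W:
    R_{\mathrm{MTFS}}(W) = \lambda \sum_{j=1}^{d} \lVert W^{j}\rVert_2
    % Assumed multi-task l_{1-2}-style combination (illustration only, not the paper's exact form):
    R(W) = \lambda\Bigl(\sum_{j=1}^{d} \lVert W^{j}\rVert_2 - \lVert W\rVert_F\Bigr)

The Python sketch below is a minimal, hypothetical illustration under the same assumed penalty: the concave part (-λ‖W‖_F) is linearized at the current iterate (a DC-programming step, standing in for the paper's locally linear approximation) and the resulting convex subproblem is minimized with proximal-gradient steps rather than the paper's ADMM. The function name mtfs_l12 and all parameter values are illustrative, not from the source record.

# Minimal sketch (not the paper's implementation) of multi-task feature
# selection with an assumed lambda*(||W||_{2,1} - ||W||_F) penalty, solved by
# DCA-style linearization of the concave term plus proximal-gradient steps
# in place of the paper's ADMM subproblem solver.
import numpy as np

def row_soft_threshold(W, tau):
    """Proximal operator of tau * ||W||_{2,1}: shrink the l2 norm of each row."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return scale * W

def mtfs_l12(Xs, ys, lam=0.1, n_outer=20, n_inner=100):
    """Multi-task least squares with the assumed lam*(||W||_{2,1} - ||W||_F) penalty.

    Xs, ys: lists with one (n_t, d) design matrix and (n_t,) target vector per task.
    Returns the (d, T) coefficient matrix W (rows = features, columns = tasks).
    """
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))
    # Lipschitz constant of the smooth-loss gradient (largest per-task X^T X / n_t).
    L = max(np.linalg.norm(X.T @ X, 2) / X.shape[0] for X in Xs)
    step = 1.0 / L
    for _ in range(n_outer):
        # DCA step: linearize the concave part -lam*||W||_F at the current W.
        fro = np.linalg.norm(W)
        G_concave = -lam * (W / fro if fro > 0 else np.zeros_like(W))
        for _ in range(n_inner):
            # Gradient of the smooth part: per-task squared loss + linearized concave term.
            grad = np.zeros_like(W)
            for t, (X, y) in enumerate(zip(Xs, ys)):
                grad[:, t] = X.T @ (X @ W[:, t] - y) / X.shape[0]
            grad += G_concave
            # Proximal step on the convex l_{2,1} term (row-wise group shrinkage).
            W = row_soft_threshold(W - step * grad, step * lam)
    return W

if __name__ == "__main__":
    # Synthetic usage example: 5 features shared by all tasks plus one task-specific feature.
    rng = np.random.default_rng(0)
    d, T = 30, 3
    W_true = np.zeros((d, T))
    W_true[:5] = rng.normal(size=(5, T))   # common features
    W_true[5, 0] = 2.0                     # task-specific feature for task 0
    Xs = [rng.normal(size=(100, d)) for _ in range(T)]
    ys = [X @ W_true[:, t] + 0.1 * rng.normal(size=100) for t, X in enumerate(Xs)]
    W_hat = mtfs_l12(Xs, ys, lam=0.05)
    print("rows kept:", np.where(np.linalg.norm(W_hat, axis=1) > 1e-3)[0])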
Pages: 76-89
Page count: 14
Related papers
50 records in total
  • [31] Joint Structure Feature Exploration and Regularization for Multi-Task Graph Classification
    Pan, Shirui
    Wu, Jia
    Zhu, Xingquan
    Zhang, Chengqi
    Yu, Philip S.
    [J]. 2016 32ND IEEE INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE), 2016, : 1474 - 1475
  • [32] Task-specific Compression for Multi-task Language Models using Attribution-based Pruning
    Yang, Nakyeong
    Jang, Yunah
    Lee, Hwanhee
    Jung, Seohyeong
    Jung, Kyomin
    [J]. 17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 594 - 604
  • [33] Sparse Multi-Task Regression and Feature Selection to Identify Brain Imaging Predictors for Memory Performance
    Wang, Hua
    Nie, Feiping
    Huang, Heng
    Risacher, Shannon
    Ding, Chris
    Saykin, Andrew J.
    Shen, Li
    [J]. 2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2011, : 557 - 562
  • [34] SC-LSTM: Learning Task-Specific Representations in Multi-Task Learning for Sequence Labeling
    Lu, Peng
    Bai, Ting
    Langlais, Philippe
    [J]. 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 2396 - 2406
  • [35] TSMV: TASK-SPECIFIC MULTI-VIEW FEATURE LEARNING
    Zhang, Chengyue
    Han, Yahong
    [J]. 8TH INTERNATIONAL CONFERENCE ON INTERNET MULTIMEDIA COMPUTING AND SERVICE (ICIMCS2016), 2016, : 39 - 42
  • [36] Multi-task Optimisation for Multi-objective Feature Selection in Classification
    Lin, Jiabin
    Chen, Qi
    Xue, Bing
    Zhang, Mengjie
    [J]. PROCEEDINGS OF THE 2022 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2022, 2022, : 264 - 267
  • [37] Multi-subject brain decoding with multi-task feature selection
    Wang, Liye
    Tang, Xiaoying
    Liu, Weifeng
    Peng, Yuhua
    Gao, Tianxin
    Xu, Yong
    [J]. BIO-MEDICAL MATERIALS AND ENGINEERING, 2014, 24 (06) : 2987 - 2994
  • [38] Multi-task Joint Feature Selection for Multi-label Classification
    He Zhifen
    Yang Ming
    Liu Huidong
    [J]. CHINESE JOURNAL OF ELECTRONICS, 2015, 24 (02) : 281 - 287
  • [40] Surgery Duration Prediction Using Multi-Task Feature Selection
    Azriel, David
    Rinott, Yosef
    Tal, Orna
    Abbou, Benyamine
    Rappoport, Nadav
    [J]. IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2024, 28 (07) : 4216 - 4223