Multi-task feature selection with sparse regularization to extract common and task-specific features

Cited: 14
Authors
Zhang, Jiashuai [1 ]
Miao, Jianyu [2 ]
Zhao, Kun [3 ]
Tian, Yingjie [4 ,5 ,6 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
[2] Henan Univ Technol, Coll Informat Sci & Engn, Zhengzhou 450001, Henan, Peoples R China
[3] Beijing Wuzi Univ, Sch Logist, Beijing 101149, Peoples R China
[4] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[5] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[6] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-task feature learning; Sparse regularization; Non-convex; ADMM; VARIABLE SELECTION; REGRESSION; CLASSIFICATION;
DOI
10.1016/j.neucom.2019.02.035
CLC classification
TP18 [Theory of artificial intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-task learning (MTL) improves the generalization performance of all tasks by learning them simultaneously and exploiting the relationships among them. Multi-task sparse feature learning, formulated under the regularization framework, is one of the main approaches to MTL, so the choice of regularization term is crucial for multi-task sparse feature models. While most existing models use convex sparse regularization, the non-convex capped-l1 regularization has been extended to MTL and shown to be a powerful sparsity-inducing term. In this paper, we propose a novel regularization term for multi-task learning by extending the non-convex l1-2 regularization to the multi-task setting. The regularization term not only realizes group sparsity, extracting the common features shared by all tasks, but also learns task-specific features through the relaxation provided by its second term. Although the model formulation resembles one proposed for multi-class problems, we are the first to extend l1-2 regularization to multi-task learning so that both common and task-specific features can be extracted. The classical multi-task feature selection (MTFS) model can be viewed as a special case of our proposed model. Because of the complexity of the regularization, we approximate the original problem by a locally linear subproblem and solve it with the Alternating Direction Method of Multipliers (ADMM). Theoretical analysis establishes the convergence of the proposed algorithm, and its time complexity is given. Experimental results demonstrate the effectiveness of the proposed method. (C) 2019 Elsevier B.V. All rights reserved.
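The abstract notes that classical multi-task feature selection (MTFS, an l2,1-regularized model inducing row-wise group sparsity across tasks) is a special case of the proposed model, and that the subproblems are solved via ADMM. A minimal sketch of that MTFS special case, assuming a least-squares loss; the parameter choices (`lam`, `rho`) and the synthetic demo are illustrative, not taken from the paper:

```python
import numpy as np

def group_soft_threshold(V, tau):
    # Prox of tau * sum_j ||V[j, :]||_2: shrink each feature row toward zero,
    # zeroing whole rows (features dropped for ALL tasks at once).
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    return np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12)) * V

def mtfs_admm(Xs, ys, lam=0.5, rho=1.0, n_iter=200):
    """ADMM for the MTFS special case:
    min_W  sum_t 0.5*||X_t w_t - y_t||^2 + lam * sum_j ||W[j, :]||_2
    where column t of W is the weight vector of task t.
    """
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T)); Z = np.zeros((d, T)); U = np.zeros((d, T))
    # Pre-factorize (X_t^T X_t + rho I) once per task for the W-update.
    Ls = [np.linalg.cholesky(X.T @ X + rho * np.eye(d)) for X in Xs]
    Xty = [X.T @ y for X, y in zip(Xs, ys)]
    for _ in range(n_iter):
        for t in range(T):
            rhs = Xty[t] + rho * (Z[:, t] - U[:, t])
            W[:, t] = np.linalg.solve(Ls[t].T, np.linalg.solve(Ls[t], rhs))
        Z = group_soft_threshold(W + U, lam / rho)  # row-sparse consensus variable
        U += W - Z                                  # dual ascent step
    return Z

# Synthetic demo: only the first 3 of 20 features are shared by all tasks.
rng = np.random.default_rng(0)
d, T, n = 20, 3, 100
W_true = np.zeros((d, T))
W_true[:3] = rng.normal(size=(3, T))
Xs = [rng.normal(size=(n, d)) for _ in range(T)]
ys = [X @ W_true[:, t] + 0.01 * rng.normal(size=n) for t, X in enumerate(Xs)]
W_hat = mtfs_admm(Xs, ys, lam=2.0)
row_norms = np.linalg.norm(W_hat, axis=1)
```

The row-wise group soft-threshold is what produces the "common features shared by all tasks"; the paper's full model additionally subtracts a second norm term (the l1-2 relaxation) so that task-specific features survive, which this plain-MTFS sketch does not implement.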
Pages: 76-89
Page count: 14
Related papers
50 records
  • [1] Multi-task Sparse Gaussian Processes with Improved Multi-task Sparsity Regularization
    Zhu, Jiang
    Sun, Shiliang
    [J]. PATTERN RECOGNITION (CCPR 2014), PT I, 2014, 483 : 54 - 62
  • [2] Multi-Task Learning with Task-Specific Feature Filtering in Low-Data Condition
    Lee, Sang-woo
    Lee, Ryong
    Seo, Min-seok
    Park, Jong-chan
    Noh, Hyeon-cheol
    Ju, Jin-gi
    Jang, Rae-young
    Lee, Gun-woo
    Choi, Myung-seok
    Choi, Dong-geol
    [J]. ELECTRONICS, 2021, 10 (21)
  • [3] Efficient healthcare supply chain: A prioritized multi-task learning approach with task-specific regularization
    Kar, Soumyadipta
    Mohanty, Manas Kumar
    Thakurta, Parag Kumar Guha
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 133
  • [4] Wasserstein regularization for sparse multi-task regression
    Janati, Hicham
    Cuturi, Marco
    Gramfort, Alexandre
    [J]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [5] A Kernel Approach to Multi-Task Learning with Task-Specific Kernels
    Wu, Wei
    Li, Hang
    Hu, Yun-Hua
    Jin, Rong
    [J]. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2012, 27 (06) : 1289 - 1301
  • [6] DEEP MULTI-TASK AND TASK-SPECIFIC FEATURE LEARNING NETWORK FOR ROBUST SHAPE PRESERVED ORGAN SEGMENTATION
    Tan, Chaowei
    Zhao, Liang
    Yan, Zhennan
    Li, Kang
    Metaxas, Dimitris
    Zhan, Yiqiang
    [J]. 2018 IEEE 15TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI 2018), 2018, : 1221 - 1224
  • [7] Deep Task-specific Bottom Representation Network for Multi-Task Recommendation
    Liu, Qi
    Zhou, Zhilong
    Jiang, Gangwei
    Ge, Tiezheng
    Lian, Defu
    [J]. PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 1637 - 1646