Multi-task TSK fuzzy system modeling using inter-task correlation information

Cited: 30
|
Authors
Jiang, Yizhang [1 ,2 ]
Deng, Zhaohong [1 ]
Chung, Fu-Lai [2 ]
Wang, Shitong [1 ,2 ]
Affiliations
[1] Jiangnan Univ, Sch Digital Media, Wuxi, Jiangsu, Peoples R China
[2] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-task learning; Inter-task latent correlation; Fuzzy modeling; Takagi-Sugeno-Kang fuzzy system; STATISTICAL COMPARISONS; CLASSIFIERS; MACHINE; LOGIC;
DOI
10.1016/j.ins.2014.12.007
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline code
0812 ;
Abstract
Classical fuzzy system modeling methods have typically been developed for single-task settings, which does not match many practical applications where several related tasks must be modeled together. Although a multi-task problem can be decomposed into single-task sub-problems, modeling each task individually is not well suited to multi-task problems because it ignores the latent correlation between the tasks. To overcome this shortcoming, this paper proposes a multi-task Takagi-Sugeno-Kang (TSK) fuzzy system model based on the classical L2-norm TSK fuzzy system. The proposed model can not only exploit the task-specific information of each task but also make effective use of the inter-task latent correlation, yielding fuzzy systems with better generalization performance. Experiments on synthetic and real-world datasets demonstrate the applicability and distinctive performance of the proposed multi-task fuzzy system model in multi-task modeling scenarios. (C) 2014 Elsevier Inc. All rights reserved.
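The coupling idea described in the abstract can be illustrated with a toy sketch. This is an analogy, not the paper's exact algorithm: it uses multi-task linear regression whose tasks share information through a mean-coupling penalty, min over all w_k of sum_k ||X_k w_k - y_k||^2 + lam * sum_k ||w_k - w_bar||^2. The function name `fit_multitask`, the solver structure, and the regularization weight are illustrative assumptions; in a TSK fuzzy system the consequent parameters are likewise linear in the fuzzy-rule-transformed inputs, so a similar coupling term can tie the per-task consequents together.

```python
import numpy as np

def fit_multitask(Xs, ys, lam=1.0, n_iter=50):
    """Alternating solver: refit each task's weights given the current
    shared mean w_bar, which carries inter-task information."""
    d = Xs[0].shape[1]
    Ws = [np.zeros(d) for _ in Xs]
    for _ in range(n_iter):
        w_bar = np.mean(Ws, axis=0)
        for k, (X, y) in enumerate(zip(Xs, ys)):
            # Closed-form ridge-style solution pulled toward the shared mean.
            A = X.T @ X + lam * np.eye(d)
            Ws[k] = np.linalg.solve(A, X.T @ y + lam * w_bar)
    return Ws

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
Xs, ys = [], []
for k in range(3):
    X = rng.normal(size=(20, 2))
    # Related tasks: each deviates only slightly from a common weight vector.
    y = X @ (w_true + 0.1 * rng.normal(size=2)) + 0.05 * rng.normal(size=20)
    Xs.append(X)
    ys.append(y)

Ws = fit_multitask(Xs, ys, lam=5.0)
for k, w in enumerate(Ws):
    print(f"task {k}: w = {np.round(w, 2)}")
```

With a larger `lam` the task solutions are drawn closer to their mean, trading per-task fit for shared structure, which mirrors how the proposed model balances task-specific information against inter-task correlation.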
Pages: 512 - 533
Page count: 22
Related papers
50 records in total
  • [41] Information Diffusion Enhanced by Multi-Task Peer Prediction
    Ito, Kensuke
    Ohsawa, Shohei
    Tanaka, Hideyuki
    [J]. IIWAS2018: THE 20TH INTERNATIONAL CONFERENCE ON INFORMATION INTEGRATION AND WEB-BASED APPLICATIONS & SERVICES, 2018, : 94 - 102
  • [42] Multi-Task Clustering of Human Actions by Sharing Information
    Yan, Xiaoqiang
    Hu, Shizhe
    Ye, Yangdong
    [J]. 30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017, : 4049 - 4057
  • [43] Information networks fusion based on multi-task coordination
    Dong Li
    Derong Shen
    Yue Kou
    Tiezheng Nie
    [J]. Frontiers of Computer Science, 2021, 15
  • [44] An Information-Theoretic Approach for Multi-task Learning
    Yang, Pei
    Tan, Qi
    Xu, Hao
    Ding, Yehua
    [J]. ADVANCED DATA MINING AND APPLICATIONS, PROCEEDINGS, 2009, 5678 : 386 - 396
  • [45] Improving Inter-task Communication Performance on Multi-Core Packet Processing Platform
    Ma, Shicong
    Wang, Baosheng
    Zhang, Xiaozhe
    Gao, Xianming
    Liu, Zhongju
    [J]. 2015 8TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID), VOL 2, 2015, : 485 - 488
  • [46] A Multi-Task Learning Approach for Delayed Feedback Modeling
    Huangfu, Zhigang
    Zhang, Gong-Duo
    Wu, Zhengwei
    Wu, Qintong
    Zhang, Zhiqiang
    Gu, Lihong
    Zhou, Jun
    Gu, Jinjie
    [J]. COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 116 - 120
  • [47] Modeling Subjective Affect Annotations with Multi-Task Learning
    Hayat, Hassan
    Ventura, Carles
    Lapedriza, Agata
    [J]. SENSORS, 2022, 22 (14)
  • [48] A Multi-Task Learning Approach to Personalized Progression Modeling
    Ghalwash, Mohamed
    Dow, Daby
    [J]. 2020 8TH IEEE INTERNATIONAL CONFERENCE ON HEALTHCARE INFORMATICS (ICHI 2020), 2020, : 92 - 100
  • [49] Modeling disease progression via multi-task learning
    Zhou, Jiayu
    Liu, Jun
    Narayan, Vaibhav A.
    Ye, Jieping
    [J]. NEUROIMAGE, 2013, 78 : 233 - 248
  • [50] Multi-Task Learning with Language Modeling for Question Generation
    Zhou, Wenjie
    Zhang, Minghua
    Wu, Yunfang
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3394 - 3399