The multi-task transfer learning for multiple data streams with uncertain data

Cited by: 1
Authors
Liu, Bo [1 ]
Huang, Yongsheng [1 ]
Xiao, Yanshan [2 ]
Zheng, Zhiyu [1 ]
Sun, Peng [1 ]
Zhao, Shilei [1 ]
Li, Xiaokai [1 ]
Peng, Tiantian [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
[2] Guangdong Univ Technol, Sch Comp Sci, Guangzhou, Peoples R China
Keywords
Multiple data streams; Transfer learning; Multi-task learning; Uncertain data; CONCEPT DRIFT; PREDICTION; MACHINE;
DOI
10.1016/j.ins.2024.120314
Chinese Library Classification
TP [Automation technology, computer technology];
Discipline code
0812 ;
Abstract
In existing research on data streams, most problems are processed and studied on single data streams. In practice, however, multiple data streams exist, and the arriving data may contain noise, which makes the data uncertain in representation. In this paper, we propose multi-task transfer learning for multiple data streams with uncertain data (MTUMDS), which exploits the similarity among multiple data streams to carry out multi-task learning and thereby improves the classification ability of the data stream model. At the same time, transfer learning is applied to each individual data stream, transferring knowledge from the known classifiers of previous time windows to the classifier of the current target window. This mitigates the reduction in model fitness caused by concept drift. Then, to account for the noise and collection errors hidden in real data, boundary constraints are generated for each sample when building the SVM classifier, so that data uncertainty is handled explicitly. Extensive experiments on multiple data streams show that our approach achieves better performance and robustness than previous studies.
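The abstract stays at a high level, so the sketch below is only a rough illustration of the window-to-window transfer idea it describes: train one SVM per time window and let classifiers from earlier windows vote with accuracy-based weights, so stale knowledge is down-weighted under concept drift. It is not the MTUMDS formulation (the multi-task coupling across streams and the per-sample boundary constraints for uncertain data are not reproduced), and the class name, weighting scheme, and parameters are illustrative assumptions.

# Minimal sketch, assuming scikit-learn; not the authors' MTUMDS algorithm.
from sklearn.svm import SVC

class WindowedTransferSVM:
    def __init__(self, n_keep=3):
        self.n_keep = n_keep   # how many window classifiers to retain
        self.past = []         # list of (classifier, weight) pairs

    def fit_window(self, X, y):
        """Train an SVM on the current window and re-weight earlier ones."""
        # Re-weight retained classifiers by their accuracy on the new window,
        # so that concept drift down-weights outdated knowledge.
        self.past = [(c, c.score(X, y)) for c, _ in self.past]
        clf = SVC(kernel="rbf", gamma="scale")
        clf.fit(X, y)
        self.past.append((clf, 1.0))
        self.past = self.past[-self.n_keep:]   # keep only the most recent windows
        return self

    def predict(self, X):
        """Weighted vote over the retained classifiers' decision values."""
        scores = sum(w * c.decision_function(X) for c, w in self.past)
        return (scores > 0).astype(int)   # assumes binary labels encoded as 0/1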
Pages: 17
Related papers (50 in total)
  • [31] Multi-task learning for intelligent data processing in granular computing context
    Liu, Han
    Cocea, Mihaela
    Ding, Weili
    GRANULAR COMPUTING, 2018, 3 (03) : 257 - 273
  • [32] A Hierarchical Multi-Task Learning Framework for Semantic Annotation in Tabular Data
    Wu, Jie
    Hou, Mengshu
    ENTROPY, 2024, 26 (08)
  • [33] Bayesian Max-margin Multi-Task Learning with Data Augmentation
    Li, Chengtao
    Zhu, Jun
    Chen, Lianfei
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32 : 415 - 423
  • [34] Federated Multi-Task Learning with Non-Stationary Heterogeneous Data
    Zhang, Hongwei
    Tao, Meixia
    Shi, Yuanming
    Bi, Xiaoyan
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 4950 - 4955
  • [35] Federated Multi-task Learning with Hierarchical Attention for Sensor Data Analytics
    Chen, Yujing
    Ning, Yue
    Chai, Zheng
    Rangwala, Huzefa
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [36] Accelerated multi-task online learning algorithm for big data stream
    Li, Zhijie
    Li, Yuanxiang
    Wang, Feng
    Kuang, Li
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2015, 52 (11): : 2545 - 2554
  • [37] Double-coupling learning for multi-task data stream classification
    Shi, Yingzhong
    Li, Andong
    Deng, Zhaohong
    Yan, Qisheng
    Lou, Qiongdan
    Chen, Haoran
    Choi, Kup-Sze
    Wang, Shitong
    INFORMATION SCIENCES, 2022, 613 : 494 - 506
  • [38] Multi-Task Learning for Compositional Data via Sparse Network Lasso
    Okazaki, Akira
    Kawano, Shuichi
    ENTROPY, 2022, 24 (12)
  • [39] Multi-task learning for spatial events prediction from social data
    Eom, Sungkwang
    Oh, Byungkook
    Shin, Sangjin
    Lee, Kyong-Ho
    INFORMATION SCIENCES, 2021, 581 : 278 - 290
  • [40] Explainable Recommendation via Multi-Task Learning in Opinionated Text Data
    Wang, Nan
    Wang, Hongning
    Jia, Yiling
    Yin, Yue
    ACM/SIGIR PROCEEDINGS 2018, 2018, : 165 - 174