Fuzzy Shared Representation Learning for Multistream Classification

Cited by: 0
Authors
Yu, En [1]
Lu, Jie [1]
Zhang, Guangquan [1]
Affiliations
[1] Univ Technol Sydney, Australian Artificial Intelligence Inst AAII, Fac Engn & Informat Technol, Decis Syst & E Serv Intelligence Lab, Ultimo, NSW 2007, Australia
Funding
Australian Research Council
Keywords
Concept drift; Fuzzy systems; Adaptation models; Data models; Uncertainty; Task analysis; Monitoring; fuzzy systems; multistream classification; transfer learning; CONCEPT DRIFT DETECTION; MODEL;
DOI
10.1109/TFUZZ.2024.3423024
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Multistream classification aims to predict the target stream by transferring knowledge from labeled source streams amid nonstationary processes with concept drifts. While existing methods address label scarcity, covariate shift, and asynchronous concept drift, they focus solely on the original feature space, neglecting the influence of redundant or low-quality features with uncertainties. Therefore, the advancement of this task is still challenged by how to: 1) guarantee reliable joint representations of different streams; 2) handle uncertainty and maintain interpretability during knowledge transfer; and 3) track and adapt to the asynchronous drifts in each stream. To address these challenges, we propose an interpretable fuzzy shared representation learning (FSRL) method based on the Takagi-Sugeno-Kang (TSK) fuzzy system. Specifically, FSRL accomplishes the nonlinear transformation of individual streams by learning the fuzzy mapping with the antecedents of the TSK fuzzy system, thereby effectively preserving discriminative information for each original stream in an interpretable way. Then, a multistream joint distribution adaptation algorithm is proposed to optimize the consequent part of the TSK fuzzy system, which learns the final fuzzy shared representations for different streams. Hence, this method concurrently investigates both the commonalities across streams and the distinctive information within each stream. Following that, window-based and GMM-based online adaptation strategies are designed to address the asynchronous drifts over time. The former directly demonstrates the effectiveness of FSRL in knowledge transfer across multiple streams, while the GMM-based method offers an informed way to overcome the asynchronous drift problem by integrating drift detection and adaptation. Finally, extensive experiments on several synthetic and real-world benchmarks with concept drift demonstrate the proposed method's effectiveness and efficiency.
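To illustrate the antecedent (fuzzy-mapping) step the abstract describes, the sketch below computes normalized TSK rule firing strengths using Gaussian membership functions. This is a minimal illustrative sketch only: the function name, the choice of Gaussian memberships, and the toy rule centers and widths are assumptions for demonstration, not the paper's actual implementation.

```python
import numpy as np

def tsk_antecedent_mapping(X, centers, widths):
    """Map inputs to normalized TSK rule firing strengths.

    X:       (n_samples, n_features) input data
    centers: (n_rules, n_features) Gaussian membership centers
    widths:  (n_rules, n_features) Gaussian membership widths
    Returns: (n_samples, n_rules) firing strengths, each row summing to 1.
    """
    # Gaussian membership per feature; the rule-level firing strength is
    # the product over features, computed as a sum of log-memberships.
    diff = X[:, None, :] - centers[None, :, :]         # (n, R, d)
    log_mu = -0.5 * (diff / widths[None, :, :]) ** 2   # log memberships
    log_fire = log_mu.sum(axis=2)                      # (n, R)
    # Subtract the row max before exponentiating for numerical stability.
    fire = np.exp(log_fire - log_fire.max(axis=1, keepdims=True))
    return fire / fire.sum(axis=1, keepdims=True)      # normalize per sample

# Toy usage: 2 fuzzy rules over 2-D data (illustrative values).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
widths = np.ones((2, 2))
F = tsk_antecedent_mapping(X, centers, widths)
```

The resulting matrix `F` is the interpretable nonlinear transformation of a stream; in a TSK system, the consequent part would then combine these firing strengths with linear functions of the input to produce the shared representation.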
Pages: 5625-5637
Page count: 13
Related Papers
50 records total
  • [21] Ke, Kangyin; Yang, Meng. Multi-feature Shared and Specific Representation for Pattern Classification. PATTERN RECOGNITION AND COMPUTER VISION, PT III, 2018, 11258: 573-585
  • [22] Kretz, Adrian; Mester, Rudolf. A Shared Representation for Object Tracking and Classification using Siamese Networks. 2020 IEEE SOUTHWEST SYMPOSIUM ON IMAGE ANALYSIS AND INTERPRETATION (SSIAI 2020), 2020: 54-57
  • [23] Policastro, Claudio A.; Zuliani, Giovana; da Silva, Renato R.; Munhoz, Vitor R.; Romero, Roseli A. F. Hybrid Knowledge Representation Applied to the Learning of the Shared Attention. 2008 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-8, 2008: 1579+
  • [24] Uribe, Diego; Cuan, Enrique; Urquizo, Elisa. Representation Learning for Constructive Comments Classification. 2020 INTERNATIONAL CONFERENCE ON MECHATRONICS, ELECTRONICS AND AUTOMOTIVE ENGINEERING (ICMEAE 2020), 2020: 71-75
  • [25] Mehrkanoon, Siamak. Deep shared representation learning for weather elements forecasting. KNOWLEDGE-BASED SYSTEMS, 2019, 179: 120-128
  • [26] Mohsenvand, Mostafa 'Neo'; Izadi, Mohammad Rasool; Maes, Pattie. Contrastive Representation Learning for Electroencephalogram Classification. MACHINE LEARNING FOR HEALTH, VOL 136, 2020, 136: 238-253
  • [27] Peng, Chong; Liu, Yang; Zhang, Xin; Kang, Zhao; Chen, Yongyong; Chen, Chenglizhao; Cheng, Qiang. Learning discriminative representation for image classification. KNOWLEDGE-BASED SYSTEMS, 2021, 233
  • [28] Zang, Fei; Zhang, Jiangshe. Discriminative learning by sparse representation for classification. NEUROCOMPUTING, 2011, 74 (12-13): 2176-2183
  • [29] Meng, Xuying; Wang, Yequan; Ma, Runxin; Luo, Haitong; Li, Xiang; Zhang, Yujun. Packet Representation Learning for Traffic Classification. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022: 3546-3554
  • [30] Huang, Chen; Li, Yining; Loy, Chen Change; Tang, Xiaoou. Learning Deep Representation for Imbalanced Classification. 2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2016: 5375-5384