Prototypical Inception Network with Cross Branch Attention for Time Series Classification

Cited by: 3
Authors
Sun, Jingyu [1]
Takeuchi, Susumu [1]
Yamasaki, Ikuo [1]
Affiliations
[1] NTT Corp, NTT Software Innovat Ctr, Tokyo, Japan
Keywords
Time Series Classification; Prototypical neural network; Inception Time; Machine learning; Few-shot learning
DOI
10.1109/IJCNN52387.2021.9533440
CLC classification number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The explosive growth of time series data, together with the prospect of classifying it automatically, has created opportunities for advanced analysis and machine cognition in many domains. Hundreds of Time Series Classification (TSC) algorithms have been proposed over the last decade, and most of them require large quantities of labeled training data to achieve good precision. In practice, however, large-scale supervised training datasets are rarely available. We therefore propose PIN-BA (Prototypical Inception Network with Cross Branch Attention), a few-shot deep learning framework for time series classification with only limited training data. We use a CNN (Convolutional Neural Network) with branches of different receptive windows to capture features at different time-window scales, and we design a cross-branch attention scheme, based on prototypical networks, that emphasizes the most informative features during classification. Experiments were conducted on the well-known UCR time series datasets. The results demonstrate that, under few-shot training scenarios, our framework outperforms other TSC models such as 1NN-DTW and InceptionTime.
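The abstract's core idea — convolutional branches with different receptive windows, class prototypes computed from few labeled examples, and attention that reweights branches during classification — can be illustrated with a minimal sketch. Everything below (the single random filter per branch, the standard-deviation-based attention weights, all function names) is an illustrative assumption, not the paper's actual PIN-BA implementation.

```python
import numpy as np

def conv1d_pool(x, kernel):
    # valid 1-D convolution followed by global average pooling
    n = len(x) - len(kernel) + 1
    out = np.array([np.dot(x[i:i + len(kernel)], kernel) for i in range(n)])
    return out.mean()

def branch_features(x, kernel_sizes=(3, 5, 9), seed=0):
    # one random filter per branch; each branch has a different
    # receptive-window size, mimicking Inception-style multi-scale branches
    rng = np.random.default_rng(seed)
    return np.array([conv1d_pool(x, rng.standard_normal(k)) for k in kernel_sizes])

def prototypes(support_x, support_y):
    # prototypical network step: a class prototype is the mean of the
    # branch feature vectors of that class's (few) support examples
    classes = sorted(set(support_y))
    return {c: np.mean([branch_features(x)
                        for x, y in zip(support_x, support_y) if y == c], axis=0)
            for c in classes}

def classify(query, protos):
    # cross-branch attention (simplified here): weight each branch by how
    # much the class prototypes spread out on it, i.e. how discriminative
    # that time-window scale is, then classify by attention-weighted
    # squared distance to the nearest prototype
    P = np.stack(list(protos.values()))           # (n_classes, n_branches)
    spread = P.std(axis=0)
    attn = np.exp(spread) / np.exp(spread).sum()  # softmax over branches
    f = branch_features(query)
    dists = {c: np.sum(attn * (f - p) ** 2) for c, p in protos.items()}
    return min(dists, key=dists.get)
```

In the paper the branches are trained Inception-style convolution stacks and the attention is learned; here both are replaced by fixed, hand-rolled stand-ins so the prototype/attention control flow stays visible in a few lines.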
Pages: 7
Related papers
50 records in total
  • [1] TapNet: Multivariate Time Series Classification with Attentional Prototypical Network
    Zhang, Xuchao
    Gao, Yifeng
    Lin, Jessica
    Lu, Chang-Tien
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 6845 - 6852
  • [2] Satellite Image Time-Series Classification with Inception-Enhanced Temporal Attention Encoder
    Zhang, Zheng
    Zhang, Weixiong
    Meng, Yu
    Zhao, Zhitao
    Tang, Ping
    Li, Hongyi
    [J]. REMOTE SENSING, 2024, 16 (23)
  • [3] CATodyNet: Cross-attention temporal dynamic graph neural network for multivariate time series classification
    Gui, Haoyu
    Li, Guanjun
    Tang, Xianghong
    Lu, Jianguang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2024, 300
  • [4] Demo Abstract: Lightweight Attention Network for Time Series Classification on Edge
    Mukhopadhyay, Shalini
    Dey, Swarnava
    Pal, Arpan
    Ashwin, S.
    [J]. PROCEEDINGS OF THE 21ST ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS, SENSYS 2023, 2023, : 484 - 485
  • [5] Prototypical Network with Residual Attention for Modulation Classification of Wireless Communication Signals
    Zang, Bo
    Gou, Xiaopeng
    Zhu, Zhigang
    Long, Lulan
    Zhang, Haotian
    [J]. ELECTRONICS, 2023, 12 (24)
  • [6] Multi-scale Attention Convolutional Neural Network for time series classification
    Chen, Wei
    Shi, Ke
    [J]. NEURAL NETWORKS, 2021, 136 : 126 - 140
  • [7] Cross-attention multi-branch network for fundus diseases classification using SLO images
    Xie, Hai
    Zeng, Xianlu
    Lei, Haijun
    Du, Jie
    Wang, Jiantao
    Zhang, Guoming
    Cao, Jiuwen
    Wang, Tianfu
    Lei, Baiying
    [J]. MEDICAL IMAGE ANALYSIS, 2021, 71
  • [8] Rethinking attention mechanism in time series classification
    Zhao, Bowen
    Xing, Huanlai
    Wang, Xinhan
    Song, Fuhong
    Xiao, Zhiwen
    [J]. INFORMATION SCIENCES, 2023, 627 : 97 - 114
  • [9] DA-Net: Dual-attention network for multivariate time series classification
    Chen, Rongjun
    Yan, Xuanhui
    Wang, Shiping
    Xiao, Guobao
    [J]. INFORMATION SCIENCES, 2022, 610 : 472 - 487
  • [10] INSTINCT: Inception-based Symbolic Time Intervals series classification
    Harel, Omer David
    Moskovitch, Robert
    [J]. INFORMATION SCIENCES, 2023, 642