Look into the LITE in deep learning for time series classification

Cited by: 0
Authors
Ismail-Fawaz, Ali [1]
Devanne, Maxime [1]
Berretti, Stefano [2]
Weber, Jonathan [1]
Forestier, Germain [1,3]
Affiliations
[1] Univ Haute Alsace, IRIMAS, Mulhouse, France
[2] Univ Florence, MICC, Florence, Italy
[3] Monash Univ, DSAI, Melbourne, Australia
Keywords
Time series classification; Deep learning; Convolutional neural networks; DepthWise separable convolutions;
DOI
10.1007/s41060-024-00708-5
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Deep learning models have been shown to be a powerful solution for Time Series Classification (TSC). State-of-the-art architectures, while producing promising results on the UCR and the UEA archives, present a high number of trainable parameters. This can lead to long training times with high CO2 emissions and power consumption, as well as a possible increase in the number of FLoating-point Operations Per Second (FLOPS). In this paper, we present a new architecture for TSC, the Light Inception with boosTing tEchnique (LITE), with only 2.34% of the number of parameters of the state-of-the-art InceptionTime model, while preserving performance. This architecture, with only 9,814 trainable parameters thanks to the usage of DepthWise Separable Convolutions (DWSC), is boosted by three techniques: multiplexing, custom filters, and dilated convolution. The LITE architecture, trained on the UCR archive, is 2.78 times faster than InceptionTime and consumes 2.79 times less CO2 and power, while achieving an average accuracy of 84.62% compared to 84.91% for InceptionTime. To evaluate the performance of the proposed architecture on multivariate time series data, we adapt LITE to handle multivariate time series; we call this version LITEMV. To bring theory into application, we also conducted experiments using LITEMV on multivariate time series representing human rehabilitation movements, showing that LITEMV is not only the most efficient model but also the best performing for this application on the Kimore dataset, a skeleton-based human rehabilitation exercises dataset. Moreover, to address the interpretability of LITEMV, we present a study using Class Activation Maps to understand the classification decisions taken by the model during evaluation.
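The abstract names the three techniques used to boost the DWSC backbone: multiplexed convolutions with several kernel sizes, hand-crafted (non-trainable) custom filters, and dilated convolutions, feeding a Global Average Pooling head that also makes Class Activation Maps possible. Below is a minimal, illustrative TensorFlow/Keras sketch of how such a block could be assembled; the kernel sizes, filter counts and the fixed "increasing trend" filter are assumptions for illustration and do not reproduce the authors' exact LITE configuration.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def increasing_trend_filter(length=8):
    # Hand-crafted, non-trainable filter (an assumption for illustration):
    # responds to an increasing pattern (-1s followed by +1s).
    # Shape: (kernel_size, in_channels, out_channels).
    w = np.ones((length, 1, 1), dtype="float32")
    w[: length // 2] = -1.0
    return w

def build_lite_like(input_length, n_classes, n_filters=32):
    inp = layers.Input(shape=(input_length, 1))  # univariate series

    # Multiplexing: parallel standard convolutions with different kernel sizes.
    branches = [
        layers.Conv1D(n_filters // 4, k, padding="same", use_bias=False)(inp)
        for k in (40, 20, 10)
    ]

    # Custom-filter branch: the kernel is fixed (trainable=False).
    branches.append(
        layers.Conv1D(
            1, 8, padding="same", use_bias=False, trainable=False,
            kernel_initializer=tf.constant_initializer(increasing_trend_filter(8)),
        )(inp)
    )

    x = layers.Concatenate()(branches)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("relu")(x)

    # Dilated DepthWise Separable Convolutions keep the parameter count low.
    for dilation in (1, 2, 4):
        x = layers.SeparableConv1D(
            n_filters, 20, padding="same", dilation_rate=dilation, use_bias=False
        )(x)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)

    # Global Average Pooling + softmax head; GAP also enables Class Activation Maps.
    gap = layers.GlobalAveragePooling1D()(x)
    out = layers.Dense(n_classes, activation="softmax")(gap)
    return tf.keras.Model(inp, out)

model = build_lite_like(input_length=500, n_classes=5)
model.summary()  # the trainable-parameter count stays in the thousands

With this illustrative configuration, the separable convolutions dominate the budget yet contribute only a few thousand trainable weights, which is the mechanism behind the low parameter count reported in the abstract.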
Pages: 21
Related papers
50 records in total
  • [31] FilterNet: A Many-to-Many Deep Learning Architecture for Time Series Classification
    Chambers, Robert D.
    Yoder, Nathanael C.
    SENSORS, 2020, 20 (09)
  • [32] The Time Series Classification of Discrete-Time Chaotic Systems Using Deep Learning Approaches
    Akmese, Omer Faruk
    Emin, Berkay
    Alaca, Yusuf
    Karaca, Yeliz
    Akgul, Akif
    MATHEMATICS, 2024, 12 (19)
  • [33] Compressed Learning for Time Series Classification
    Lee, Yuh-Jye
    Pao, Hsing-Kuo
    Shih, Shueh-Han
    Lin, Jing-Yao
    Chen, Xin-Rong
    2016 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2016, : 923 - 930
  • [34] Transfer learning for time series classification
    Fawaz, Hassan Ismail
    Forestier, Germain
    Weber, Jonathan
    Idoumghar, Lhassane
    Muller, Pierre-Alain
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018, : 1367 - 1376
  • [35] Deep learning based classification of time series of chaotic systems over graphic images
    Uzun, Süleyman
    Kaçar, Sezgin
    Arıcıoğlu, Burak
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (03) : 8413 - 8437
  • [36] Fault Detection Utilizing Deep Learning for Long-Sequence Time Series Classification
    Hu, Yunwei
    Sethi, Guneet
    2022 68TH ANNUAL RELIABILITY AND MAINTAINABILITY SYMPOSIUM (RAMS 2022), 2022
  • [37] Classification of Indian Classical Music With Time-Series Matching Deep Learning Approach
    Sharma, Akhilesh Kumar
    Aggarwal, Gaurav
    Bhardwaj, Sachit
    Chakrabarti, Prasun
    Chakrabarti, Tulika
    Abawajy, Jemal H.
    Bhattacharyya, Siddhartha
    Mishra, Richa
    Das, Anirban
    Mahdin, Hairulnizam
    IEEE ACCESS, 2021, 9 : 102041 - 102052
  • [38] Multivariate Time Series Early Classification with Interpretability Using Deep Learning and Attention Mechanism
    Hsu, En-Yu
    Liu, Chien-Liang
    Tseng, Vincent S.
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2019, PT III, 2019, 11441 : 541 - 553
  • [39] Convolutional- and Deep Learning-Based Techniques for Time Series Ordinal Classification
    Ayllon-Gavilan, Rafael
    Guijo-Rubio, David
    Gutierrez, Pedro Antonio
    Bagnall, Anthony
    Hervas-Martinez, Cesar
    IEEE TRANSACTIONS ON CYBERNETICS, 2025, 55 (02) : 537 - 549