Look into the LITE in deep learning for time series classification

Cited by: 0
Authors
Ismail-Fawaz, Ali [1]
Devanne, Maxime [1]
Berretti, Stefano [2]
Weber, Jonathan [1]
Forestier, Germain [1,3]
Affiliations
[1] Univ Haute Alsace, IRIMAS, Mulhouse, France
[2] Univ Florence, MICC, Florence, Italy
[3] Monash Univ, DSAI, Melbourne, Australia
Keywords
Time series classification; Deep learning; Convolutional neural networks; DepthWise separable convolutions
DOI
10.1007/s41060-024-00708-5
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Deep learning models have been shown to be a powerful solution for Time Series Classification (TSC). State-of-the-art architectures, while producing promising results on the UCR and the UEA archives, present a high number of trainable parameters. This can lead to long training times, high CO2 emissions and power consumption, and a possible increase in the number of FLoating-point Operations Per Second (FLOPS). In this paper, we present a new architecture for TSC, the Light Inception with boosTing tEchnique (LITE), with only 2.34% of the number of parameters of the state-of-the-art InceptionTime model, while preserving performance. This architecture, with only 9,814 trainable parameters due to the usage of DepthWise Separable Convolutions (DWSC), is boosted by three techniques: multiplexing, custom filters, and dilated convolution. The LITE architecture, trained on the UCR archive, is 2.78 times faster than InceptionTime and consumes 2.79 times less CO2 and power, while achieving an average accuracy of 84.62% compared to 84.91% with InceptionTime. To evaluate the performance of the proposed architecture on multivariate time series data, we adapt LITE to handle multivariate time series; we call this version LITEMV.
To bring theory into application, we also conducted experiments using LITEMV on multivariate time series representing human rehabilitation movements, showing that LITEMV is not only the most efficient model but also the best performing for this application on Kimore, a skeleton-based human rehabilitation exercises dataset. Moreover, to address the interpretability of LITEMV, we present a study using Class Activation Maps to understand the classification decisions taken by the model during evaluation.
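The parameter savings claimed in the abstract come from replacing standard convolutions with DepthWise Separable Convolutions (DWSC), which factor one dense convolution into a per-channel (depthwise) filter followed by a 1x1 (pointwise) channel mixer. The sketch below counts parameters for both variants of a 1D convolution layer; the channel sizes and kernel length are illustrative assumptions, not the actual LITE hyperparameters.

```python
def conv1d_params(c_in, c_out, k, bias=True):
    # Standard 1D convolution: each of the c_out output channels
    # has its own (k x c_in) kernel.
    return k * c_in * c_out + (c_out if bias else 0)

def dwsc1d_params(c_in, c_out, k, bias=True):
    # Depthwise step: one length-k filter per input channel.
    depthwise = k * c_in + (c_in if bias else 0)
    # Pointwise step: a 1x1 convolution that mixes channels.
    pointwise = c_in * c_out + (c_out if bias else 0)
    return depthwise + pointwise

# Illustrative layer: 32 -> 32 channels with a long kernel (k=40).
std = conv1d_params(32, 32, 40)   # 40992
sep = dwsc1d_params(32, 32, 40)   # 2368
print(std, sep, round(std / sep, 1))  # roughly a 17x reduction
```

The reduction factor grows with the kernel length and channel count, which is why a DWSC-based network can stay in the low thousands of parameters while covering long receptive fields.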
Pages: 21