Look into the LITE in deep learning for time series classification

Cited by: 0
Authors
Ismail-Fawaz, Ali [1 ]
Devanne, Maxime [1 ]
Berretti, Stefano [2 ]
Weber, Jonathan [1 ]
Forestier, Germain [1 ,3 ]
Affiliations
[1] Univ Haute Alsace, IRIMAS, Mulhouse, France
[2] Univ Florence, MICC, Florence, Italy
[3] Monash Univ, DSAI, Melbourne, Australia
Keywords
Time series classification; Deep learning; Convolutional neural networks; DepthWise separable convolutions;
DOI
10.1007/s41060-024-00708-5
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Deep learning models have been shown to be a powerful solution for Time Series Classification (TSC). State-of-the-art architectures, while producing promising results on the UCR and UEA archives, present a high number of trainable parameters. This can lead to long training times with high CO2 emissions and power consumption, and a possible increase in the number of FLoating-point Operations (FLOPs). In this paper, we present a new architecture for TSC, the Light Inception with boosTing tEchnique (LITE), with only 2.34% of the number of parameters of the state-of-the-art InceptionTime model, while preserving performance. This architecture, with only 9814 trainable parameters due to the usage of DepthWise Separable Convolutions (DWSC), is boosted by three techniques: multiplexing, custom filters, and dilated convolution. The LITE architecture, trained on the UCR archive, is 2.78 times faster than InceptionTime and consumes 2.79 times less CO2 and power, while achieving an average accuracy of 84.62% compared to 84.91% with InceptionTime. To evaluate the performance of the proposed architecture on multivariate time series data, we adapt LITE to handle multivariate time series; we call this version LITEMV.
To bring theory into application, we also conducted experiments using LITEMV on multivariate time series representing human rehabilitation movements, showing that LITEMV is not only the most efficient model but also the best performing for this application on the Kimore dataset, a skeleton-based human rehabilitation exercise dataset. Moreover, to address the interpretability of LITEMV, we present a study using Class Activation Maps to understand the classification decisions taken by the model during evaluation.
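The abstract's efficiency claim rests on replacing standard convolutions with DepthWise Separable Convolutions. As a minimal illustration of why this shrinks the parameter count, the sketch below compares parameter counts for a standard 1D convolution against a depthwise separable one; the channel and kernel sizes are hypothetical examples, not the exact LITE layer configuration.

```python
def conv1d_params(kernel, c_in, c_out, bias=True):
    """Trainable parameters of a standard 1D convolution."""
    return kernel * c_in * c_out + (c_out if bias else 0)

def dwsc1d_params(kernel, c_in, c_out, bias=True):
    """Trainable parameters of a depthwise separable 1D convolution:
    a per-channel depthwise filter followed by a 1x1 pointwise mixing."""
    depthwise = kernel * c_in + (c_in if bias else 0)
    pointwise = c_in * c_out + (c_out if bias else 0)
    return depthwise + pointwise

# Hypothetical layer: kernel size 40, 64 input channels, 32 output channels.
std = conv1d_params(40, 64, 32)   # 40*64*32 + 32  = 81952
dw  = dwsc1d_params(40, 64, 32)   # (2560+64) + (2048+32) = 4704
print(std, dw, round(std / dw, 1))  # -> 81952 4704 17.4
```

For long kernels and many channels the ratio approaches the kernel size, which is how an Inception-style network can drop to a few thousand parameters while keeping wide receptive fields.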
Pages: 21
Related papers (50 items total)
  • [21] Robust IoT time series classification with data compression and deep learning
    Azar, Joseph
    Makhoul, Abdallah
    Couturier, Raphael
    Demerjian, Jacques
    NEUROCOMPUTING, 2020, 398 : 222 - 234
  • [22] Towards Backdoor Attack on Deep Learning based Time Series Classification
    Ding, Daizong
    Zhang, Mi
    Huang, Yuanmin
    Pan, Xudong
    Feng, Fuli
    Jiang, Erling
    Yang, Min
    2022 IEEE 38TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2022), 2022, : 1274 - 1287
  • [23] Working memory load recognition with deep learning time series classification
    Pang, Richong
    Sang, Haojun
    Yi, Li
    Gao, Chenyang
    Xu, Hongkai
    Wei, Yanzhao
    Zhang, Lei
    Sun, Jinyan
    BIOMEDICAL OPTICS EXPRESS, 2024, 15 (05): : 2780 - 2797
  • [24] Time Series Analysis for Encrypted Traffic Classification: A Deep Learning Approach
    Vu, Ly
    Thuy, Hoang V.
    Quang Uy Nguyen
    Ngoc, Tran N.
    Nguyen, Diep N.
    Dinh Thai Hoang
    Dutkiewicz, Eryk
    2018 18TH INTERNATIONAL SYMPOSIUM ON COMMUNICATIONS AND INFORMATION TECHNOLOGIES (ISCIT), 2018, : 121 - 126
  • [25] A Hybrid Deep Representation Learning Model for Time Series Classification and Prediction
    Guo, Yang
    Wu, Zhenyu
    Ji, Yang
    2017 3RD INTERNATIONAL CONFERENCE ON BIG DATA COMPUTING AND COMMUNICATIONS (BIGCOM), 2017, : 226 - 231
  • [26] Time series classification: nearest neighbor versus deep learning models
    Weiwei Jiang
    SN Applied Sciences, 2020, 2
  • [27] Deep Transfer Learning for Time Series Data Based on Sensor Modality Classification
    Li, Frederic
    Shirahama, Kimiaki
    Nisar, Muhammad Adeel
    Huang, Xinyu
    Grzegorzek, Marcin
    SENSORS, 2020, 20 (15) : 1 - 25
  • [28] Trend Prediction Classification for High Frequency Bitcoin Time Series with Deep Learning
    Shintate, Takuya
    Pichl, Lukas
    JOURNAL OF RISK AND FINANCIAL MANAGEMENT, 2019, 12 (01)
  • [29] Quantitative and Qualitative Analysis of Time-Series Classification Using Deep Learning
    Ebrahim, Saba Ale
    Poshtan, Javad
    Jamali, Seyedh Mahboobeh
    Ebrahim, Nader Ale
    IEEE ACCESS, 2020, 8 : 90202 - 90215
  • [30] DuPLO: A DUal view Point deep Learning architecture for time series classificatiOn
    Interdonato, Roberto
    Ienco, Dino
    Gaetano, Raffaele
    Ose, Kenji
    ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, 2019, 149 : 91 - 104