FastEE: Fast Ensembles of Elastic Distances for time series classification

Cited by: 0
Authors
Chang Wei Tan
François Petitjean
Geoffrey I. Webb
Affiliations
[1] Monash University, Faculty of Information Technology, 25 Exhibition Walk
Keywords
Time series classification; Scalable; Similarity measures; Ensembles
DOI: not available
Abstract
In recent years, many new ensemble-based time series classification (TSC) algorithms have been proposed, each significantly more accurate than its predecessors. The Hierarchical Vote Collective of Transformation-based Ensembles (HIVE-COTE) is currently the most accurate TSC algorithm when assessed on the UCR repository. It is a meta-ensemble of 5 state-of-the-art ensemble-based classifiers. The time complexity of HIVE-COTE, particularly for training, is prohibitive for most datasets, so there is a critical need to speed up the classifiers that compose it. This paper focuses on speeding up one of its components: the Ensembles of Elastic Distances (EE), the classifier that leverages decades of research into time-dedicated similarity measures. Training EE can be prohibitive for many datasets; for example, it takes a month on the ElectricDevices dataset with 9000 instances. This is because EE needs to cross-validate the hyper-parameters used by the 11 similarity measures it encompasses. This work proposes Fast Ensembles of Elastic Distances (FastEE) to train EE faster, in two versions. The exact version makes it possible to train EE 10 times faster. The approximate version is 40 times faster than EE without significantly impacting classification accuracy. This translates to being able to train EE on ElectricDevices in 13 h.
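To illustrate why training EE is so costly, the sketch below shows the kind of hyper-parameter search the abstract describes: leave-one-out cross-validation of a 1-NN classifier to tune the warping-window parameter of a single elastic measure (DTW). EE repeats a search of this form for each of its 11 measures, which is what FastEE accelerates. This is a minimal illustration, not the authors' FastEE code; the function names (dtw, loocv_accuracy, tune_window), the candidate grid, and the toy data are all illustrative assumptions.

```python
# Minimal sketch of the per-measure hyper-parameter search inside EE,
# shown for one measure only (DTW with a Sakoe-Chiba warping window).
# Not the authors' implementation; names and data are illustrative.
import numpy as np

def dtw(a, b, window):
    """DTW distance between 1-D series a and b, constrained to a warping window."""
    n, m = len(a), len(b)
    w = max(window, abs(n - m))                 # band must cover the length difference
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return np.sqrt(cost[n, m])

def loocv_accuracy(X, y, window):
    """Leave-one-out accuracy of 1-NN under DTW with the given window."""
    correct = 0
    for i in range(len(X)):
        dists = [dtw(X[i], X[j], window) if j != i else np.inf for j in range(len(X))]
        if y[int(np.argmin(dists))] == y[i]:
            correct += 1
    return correct / len(X)

def tune_window(X, y, candidates):
    """Pick the window maximising LOOCV accuracy: O(|candidates| * n^2) DTW computations."""
    return max(candidates, key=lambda w: loocv_accuracy(X, y, w))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = [rng.standard_normal(50) for _ in range(20)]   # toy training series
    y = np.array([i % 2 for i in range(20)])            # toy class labels
    best_w = tune_window(X, y, candidates=range(0, 11, 2))
    print("selected warping window:", best_w)
```

Because every candidate parameter value requires a full leave-one-out pass over all training pairs, the cost grows quadratically with the number of training instances, which is why datasets such as ElectricDevices (9000 instances) take so long with the naive search.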
Pages: 231 - 272
Number of pages: 41