Time Series Classification with Shallow Learning Shepard Interpolation Neural Networks

Cited by: 2
Authors
Smith, Kaleb E. [1 ]
Williams, Phillip [2 ]
Affiliations
[1] Florida Inst Technol, Melbourne, FL 32901 USA
[2] Univ Ottawa, Ottawa, ON, Canada
Source
Keywords
DOI
10.1007/978-3-319-94211-7_36
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification
081104; 0812; 0835; 1405
Abstract
Time series classification (TSC) is a long-standing machine learning problem with countless proposed algorithms spanning a multitude of fields. Whole-series, interval, shapelet, dictionary-based, and model-based methods are all past approaches to solving TSC. More recently, deep learning approaches have tried to carry the success of neural network (NN) architectures in image classification over to TSC. Deep learning, however, typically requires vast amounts of training data and computational power to produce meaningful results. But what if there were a network inspired not by a biological brain, but by mathematics proven in theory? Better yet, what if that network were not as computationally expensive as deep learning networks, which can have billions of parameters and need a surplus of training data? This is exactly what Shepard Interpolation Neural Networks (SINN) provide: a shallow learning approach that needs only minimal training samples and is founded on a statistical interpolation technique. These networks learn metric features that can be mathematically explained and understood. In this paper, we apply the novel SINN architecture to a popular benchmark TSC data set, achieving state-of-the-art accuracy on several of its test sets while remaining competitive with other established algorithms. We also demonstrate that when training data is scarce, the SINN outperforms deep learning algorithms.
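The sketch below illustrates classical Shepard interpolation (inverse-distance weighting), the statistical technique the abstract names as the foundation of the SINN. It is a minimal NumPy example on assumed toy data; the function name shepard_interpolate, the power parameter p, and the sample points are illustrative assumptions and do not reproduce the authors' network architecture or training procedure.

import numpy as np

def shepard_interpolate(query, anchors, values, p=2.0, eps=1e-8):
    # Pairwise Euclidean distances between query points and known anchors, shape (m, n).
    dists = np.linalg.norm(query[:, None, :] - anchors[None, :, :], axis=-1)
    # Inverse-distance weights; eps avoids division by zero when a query hits an anchor exactly.
    weights = 1.0 / (dists ** p + eps)
    # Shepard estimate: normalized weighted average of the anchor values.
    return (weights @ values) / weights.sum(axis=1)

# Toy usage (assumed data): estimate a 1-D signal from five known samples.
anchors = np.linspace(0.0, 1.0, 5)[:, None]          # (5, 1) known sample locations
values = np.sin(2 * np.pi * anchors[:, 0])           # (5,)   known sample values
query = np.random.rand(20, 1)                        # (20, 1) points to interpolate
print(shepard_interpolate(query, anchors, values))   # (20,)  interpolated estimates

In a SINN, weights of this inverse-distance form are attached to learned anchor points rather than fixed data samples; that reading is an inference from the abstract, not a detail given in this record.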
Pages: 329-338
Page count: 10
Related Papers (50 records in total)
  • [1] Shepard Interpolation Neural Networks with K-Means: A Shallow Learning Method for Time Series Classification
    Smith, Kaleb E.
    Williams, Phillip
    Bryan, Kaylen J.
    Solomon, Mitchell
    Ble, Max
    Haber, Rana
    2018 International Joint Conference on Neural Networks (IJCNN), 2018
  • [2] Deep Convolutional-Shepard Interpolation Neural Networks for Image Classification Tasks
    Smith, Kaleb E.
    Williams, Phillip
    Chaiya, Tatsanee
    Ble, Max
    Image Analysis and Recognition (ICIAR 2018), 2018, 10882: 185-192
  • [3] Convolutional neural networks for time series classification
    Zhao, Bendong
    Lu, Huanzhang
    Chen, Shangfeng
    Liu, Junliang
    Wu, Dongya
    Journal of Systems Engineering and Electronics, 2017, 28(1): 162-169
  • [4] Convolutional Neural Networks for Time Series Classification
    Zebik, Mariusz
    Korytkowski, Marcin
    Angryk, Rafal
    Scherer, Rafal
    Artificial Intelligence and Soft Computing (ICAISC 2017), Part II, 2017, 10246: 635-642
  • [5] Recurrent neural networks for time series classification
    Hüsken, M.
    Stagge, P.
    Neurocomputing, 2003, 50: 223-235
  • [6] Learning and predicting time series by neural networks
    Freking, Ansgar
    Kinzel, Wolfgang
    Kanter, Ido
    Physical Review E, 2002, 65(5): 050903
  • [7] Predictive modular neural networks for time series classification
    Kehagias, A.
    Petridis, V.
    Neural Networks, 1997, 10(1): 31-49
  • [8] Combining contextual neural networks for time series classification
    Kamara, Amadu Fullah
    Chen, Enhong
    Liu, Qi
    Pan, Zhen
    Neurocomputing, 2020, 384: 57-66