Learning Entropy: On Shannon vs. Machine-Learning-Based Information in Time Series

Cited by: 0
Authors
Bukovsky, Ivo [1 ]
Budik, Ondrej [1 ]
Affiliations
[1] Univ South Bohemia Ceske Budejovice, Fac Sci, Dept Comp Sci, Ceske Budejovice, Czech Republic
Keywords
Novelty detection; Learning Entropy; Adaptive filter; Least mean square; Multichannel EEG; Approximate entropy
DOI
10.1007/978-3-031-14343-4_38
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
The paper discusses learning-based information (L) and Learning Entropy (LE) in contrast to classical Shannon probabilistic information (I) and probabilistic entropy (H). It is shown that L corresponds to the recently introduced Approximate Individual Sample-point Learning Entropy (AISLE). For data series, LE should then be defined as the mean value of L, which brings it into proper accordance with Shannon's concept of entropy H. The distinction of L from I is explained through real-time anomaly detection of individual time-series data points (states). First, the principal distinction of the information concepts I vs. L is demonstrated with respect to the data-governing law, which L considers explicitly (while I does not). Second, it is shown that L can potentially be applied to much shorter datasets than I, because the learning system is pre-trained and can generalize from a smaller dataset. Then, floating-window trajectories of the covariance matrix norm, the trajectory of the approximate variance fractal dimension, and especially the windowed Shannon entropy trajectory are compared to LE on multichannel EEG featuring an epileptic seizure. The results on real time series show that L, i.e., AISLE, can be a useful counterpart to Shannon entropy, also allowing for a more detailed search for anomaly onsets (change points).
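The contrast described in the abstract can be sketched in code: a per-sample, learning-based score derived from how strongly an adaptive predictor must update its weights, compared against a sliding-window Shannon entropy estimate. The following Python snippet is a minimal illustrative sketch only, not the authors' AISLE implementation or EEG data; the LMS predictor, filter length, learning rate, function names, and the synthetic change-point signal are all assumptions made for demonstration.

```python
import numpy as np

def learning_based_score(x, n_taps=8, mu=0.01):
    """L-like per-sample score: mean absolute LMS weight increment.

    The idea (illustrative, not the exact AISLE formula): the harder the
    pre-adapted predictor has to learn at a sample, the more novel it is.
    """
    w = np.zeros(n_taps)
    scores = np.zeros(len(x))
    for k in range(n_taps, len(x)):
        u = x[k - n_taps:k][::-1]        # regressor of past samples
        e = x[k] - w @ u                 # one-step prediction error
        dw = mu * e * u                  # LMS weight update
        w += dw
        scores[k] = np.mean(np.abs(dw))  # learning effort at sample k
    return scores

def windowed_shannon_entropy(x, win=128, bins=16):
    """Sliding-window Shannon entropy H (bits) from a histogram estimate."""
    h = np.full(len(x), np.nan)
    for k in range(win, len(x)):
        counts, _ = np.histogram(x[k - win:k], bins=bins)
        p = counts[counts > 0] / win
        h[k] = -np.sum(p * np.log2(p))
    return h

# Synthetic example: the governing law of the signal changes at k = 1200.
rng = np.random.default_rng(0)
n = 2000
x = np.sin(0.05 * np.arange(n)) + 0.05 * rng.standard_normal(n)
x[1200:] += 0.8 * np.sin(0.3 * np.arange(n - 1200))

L = learning_based_score(x)          # reacts sample by sample to the new law
H = windowed_shannon_entropy(x)      # reacts only once the window fills with new data
print("peak L at sample:", int(np.argmax(L)),
      "| largest H change at sample:", int(np.nanargmax(np.abs(np.diff(H)))))
```

In this toy setting, the L-like score reflects the data-governing law learned by the adaptive filter and can flag the onset near the individual change-point sample, whereas the windowed entropy responds only after the distribution within the window has shifted, which is the qualitative distinction the abstract draws between L and I (or H).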
Pages: 402 - 415
Number of pages: 14