Visualising deep network time-series representations

Cited by: 1
Authors
Leporowski, Blazej [1 ]
Iosifidis, Alexandros [1 ]
Affiliations
[1] Aarhus Univ, Dept Elect & Comp Engn, Aarhus, Denmark
Source
Neural Computing and Applications, 2021, Vol. 33, Issue 23
Keywords
Time-series visualisation; Neural network representations visualisation; Financial data; Limit order book;
DOI
10.1007/s00521-021-06244-8
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Despite the popularisation of machine learning models, more often than not they still operate as black boxes, offering no insight into what happens inside the model. A few methods exist that visualise and explain why a model has made a certain prediction. Those methods, however, visualise the link between a model's input and output without showing how the model learns to represent the training data as a whole. In this paper, a method that addresses this issue is proposed, with a focus on visualising multi-dimensional time-series data. Experiments on a high-frequency stock market dataset show that the method provides fast and discernible visualisations. Large datasets can be visualised quickly and on a single plot, which makes it easy for a user to compare the learned representations of the data. The developed method combines known techniques to provide insight into the inner workings of time-series classification models.
Pages: 16489-16498
Page count: 10
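
The paper itself is not reproduced in this record, but the general idea described in the abstract, collecting the internal representations that a trained time-series classifier produces for a whole dataset and drawing them on a single plot, can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the 1D-CNN backbone, the 64-dimensional embedding, and the use of t-SNE from scikit-learn are assumptions made for the example.

```python
# Hypothetical sketch: visualise the learned representations of a time-series
# classifier by projecting an intermediate layer's activations to 2D.
# The architecture and the t-SNE projection are assumptions, not the paper's exact method.
import torch
import torch.nn as nn
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt


class TSClassifier(nn.Module):
    """Small 1D-CNN classifier for multi-dimensional time series (placeholder architecture)."""

    def __init__(self, n_features: int, n_classes: int):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),      # -> (batch, 64) embedding
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):            # x: (batch, n_features, time_steps)
        z = self.backbone(x)         # learned representation of the whole series
        return self.head(z), z


def plot_representations(model, loader, device="cpu"):
    """Collect embeddings for an entire dataset and show them on one 2D scatter plot."""
    model.eval()
    embeddings, labels = [], []
    with torch.no_grad():
        for x, y in loader:
            _, z = model(x.to(device))
            embeddings.append(z.cpu())
            labels.append(y)
    z = torch.cat(embeddings).numpy()
    y = torch.cat(labels).numpy()
    # Project the high-dimensional representations to 2D for plotting.
    z2d = TSNE(n_components=2, init="pca", random_state=0).fit_transform(z)
    plt.scatter(z2d[:, 0], z2d[:, 1], c=y, s=4, cmap="tab10")
    plt.title("Learned time-series representations (t-SNE projection)")
    plt.show()
```

Any dimensionality-reduction step (PCA, t-SNE, UMAP) could be substituted for the projection; the point of the sketch is only that the representations of a large dataset end up on one plot where classes can be compared visually.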
Related papers
50 records in total
  • [1] Leporowski B., Iosifidis A. Visualising deep network time-series representations. Neural Computing and Applications, 2021, 33: 16489-16498.
  • [2] Hu R., Tang Z.-R., Song X., Luo J., Wu E. Q., Chang S. Ensemble echo network with deep architecture for time-series modeling. Neural Computing and Applications, 2021, 33: 4997-5010.
  • [3] Hu R., Tang Z.-R., Song X., Luo J., Wu E. Q., Chang S. Ensemble echo network with deep architecture for time-series modeling. Neural Computing and Applications, 2021, 33(10): 4997-5010.
  • [4] Tzirakis P., Nicolaou M. A., Schuller B., Zafeiriou S. Time-series clustering with jointly learning deep representations, clusters and temporal boundaries. 2019 14th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2019), 2019: 438-442.
  • [5] Rossi R. A., Ahmed N. K., Park N. On graph time-series representations for temporal networks. Companion of the World Wide Web Conference (WWW 2023), 2023: 14-18.
  • [6] Pourahmadi M. ARMA approximations and representations of a stationary time-series. Sankhya: The Indian Journal of Statistics, Series B, 1992, 54: 235-241.
  • [7] Ryabko D. Time-series information and unsupervised learning of representations. IEEE Transactions on Information Theory, 2020, 66(3): 1702-1713.
  • [8] Yang C., Zhang J., Chang Y., Zou J., Liu Z., Fan S. A novel deep parallel time-series relation network for fault diagnosis. IEEE Transactions on Instrumentation and Measurement, 2023, 72.
  • [9] Yeomans J., Thwaites S., Robertson W. S. P., Booth D., Ng B., Thewlis D. Simulating time-series data for improved deep neural network performance. IEEE Access, 2019, 7: 131248-131255.
  • [10] Alqahtani A., Ali M., Xie X., Jones M. W. Deep time-series clustering: a review. Electronics, 2021, 10(23).