Time irreversibility of financial time series based on higher moments and multiscale Kullback-Leibler divergence

Cited by: 8
Authors:
Li, Jinyang [1 ]
Shang, Pengjian [1 ]
Affiliations:
[1] Beijing Jiaotong Univ, Sch Sci, Dept Math, Beijing 100044, Peoples R China
Keywords:
Irreversibility; Financial time series; High moments; Multiscale Kullback-Leibler divergence; ENTROPY;
DOI:
10.1016/j.physa.2018.02.099
Chinese Library Classification:
O4 [Physics];
Discipline code:
0702;
Abstract:
Irreversibility is an important property of time series. In this paper, we propose the higher-moments and multiscale Kullback-Leibler divergence to analyze time series irreversibility. The higher-moments Kullback-Leibler divergence (HMKLD) amplifies irreversibility and makes its variation more pronounced, so that many time series whose irreversibility is otherwise hard to detect also reveal these variations. We test and verify this method on simulated data and financial stock data, and find that the HMKLD of stock data grows in a fluctuating manner. The multiscale Kullback-Leibler divergence (MKLD), by contrast, behaves in a highly complex way in dynamical systems, which makes the underlying laws of the simulated and stock systems difficult to uncover. Whereas the conventional multiscale entropy method uses a non-overlapping coarse-graining process, we apply a different coarse-graining process and obtain a surprising discovery: at scales 4 and 5 the entropies are nearly identical, which demonstrates that MKLD is effective at displaying the characteristics of time series irreversibility. (C) 2018 Elsevier B.V. All rights reserved.
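The general idea behind the abstract can be illustrated with a minimal sketch, which is not the authors' implementation: it measures irreversibility as the Kullback-Leibler divergence between the empirical pattern distributions of a series and its time reversal, together with the conventional non-overlapping coarse-graining used in multiscale entropy (the paper itself applies a different coarse-graining). The ordinal-pattern symbolization, pattern length `m=3`, epsilon smoothing, and all function names here are illustrative assumptions.

```python
from collections import Counter
import math

def coarse_grain(series, scale):
    # Conventional non-overlapping coarse-graining: average consecutive
    # windows of length `scale` (the multiscale-entropy baseline).
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def pattern_probs(series, m=3):
    # Empirical distribution of ordinal patterns of length m
    # (one common way to symbolize a series; an assumption here).
    counts = Counter()
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

def kl_irreversibility(series, m=3, eps=1e-12):
    # KL divergence between the pattern distributions of the forward
    # series and its time reversal; near 0 for a reversible series.
    # `eps` smooths patterns absent from one of the two distributions.
    p = pattern_probs(series, m)
    q = pattern_probs(series[::-1], m)
    support = set(p) | set(q)
    return sum(p.get(s, eps) * math.log(p.get(s, eps) / q.get(s, eps))
               for s in support)

# A strictly monotone series is maximally irreversible under this
# symbolization, while an alternating series is reversible.
print(kl_irreversibility(list(range(50))))        # large positive value
print(kl_irreversibility([0, 1, 0, 1, 0, 1, 0, 1]))  # approximately 0
```

Multiscale analysis then repeats the divergence computation on `coarse_grain(series, scale)` for each scale of interest; the higher-moments variant of the paper would additionally weight the statistics by higher-order moments of the series, which this sketch omits.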
Pages: 248-255 (8 pages)