Chaos forgets and remembers: Measuring information creation, destruction, and storage

Cited by: 14
Authors
James, Ryan G. [1]
Burke, Korana [1]
Crutchfield, James P. [1,2]
Affiliations
[1] Univ Calif Davis, Ctr Complex Sci, Dept Phys, Davis, CA 95616 USA
[2] Santa Fe Inst, Santa Fe, NM 87501 USA
Keywords
Chaos; Entropy rate; Bound information; Shannon information measures; Information diagram; Discrete-time maps;
DOI
10.1016/j.physleta.2014.05.014
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
The hallmark of deterministic chaos is that it creates information, at a rate given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system's intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: a portion of the created information, the ephemeral information, is forgotten, and a portion, the bound information, is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute. (C) 2014 Elsevier B.V. All rights reserved.
Pages: 2124-2127
Page count: 4