Generalized (c,d)-Entropy and Aging Random Walks
Figure 1. Entropies parametrized in the (c, d)-plane, with their associated distribution functions. Boltzmann–Gibbs–Shannon (BGS) entropy corresponds to (1, 1), Tsallis entropy to (c, 0) and entropies for stretched exponentials to (1, d > 0). Entropies leading to distribution functions with compact support belong to equivalence class (1, 0). Figure from [3].

Figure 2. An example of an auto-correlated random walk that persistently walks in the same direction for ∝ n^(1−α) steps (α = 0.5).

Figure 3. Comparison of the first three even moments, ⟨x²(N)⟩, ⟨x⁴(N)⟩ and ⟨x⁶(N)⟩, and the average number of direction-reversal decisions, k₋(N), with 1 ≤ N ≤ 50,000, for the auto-correlated random walk (blue lines) and aging random walks (red dashed lines) for α = 0.2, 0.5 and 0.8.

Figure 4. The maximal number of direction-reversal decisions in random walks in entropic classes (c, d = 0) with 0 < c < 1, for the values λ = 1.1, 1.2 and 1.3.

Figure 5. In the three top panes, the second moment, ⟨x²(N)⟩, is shown for ν = 0.2, 0.5 and 0.8, for λ = 1.1 (black), 1.2 (red) and 1.3 (green). The blue dotted and dashed lines indicate the functions N² and N, respectively. A cross-over from ⟨x²(N)⟩ ∼ N to ⟨x²(N)⟩ ∼ N² (free motion) is clearly visible for ν = 0.5 and 0.8. The three bottom panes show the average number of direction-reversal decisions, k₋(N). Simulations were performed in the range 1 ≤ N ≤ 50,000. For ν → 1, the crossover happens at smaller N, for all values of λ.
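The auto-correlated walk of Figure 2 is straightforward to simulate. The sketch below is our own minimal reconstruction, not the authors' code: the persistence rule (a fresh ±1 direction held for roughly n^(1−α) steps at age n) is an assumption chosen so that the number of direction decisions grows like N^α, consistent with the k₋(N) curves described in Figures 3 and 5; all function names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def autocorrelated_walk(N, alpha):
    """Hypothetical sketch of an auto-correlated walk: at total step count n,
    the walker draws a fresh direction (+1 or -1) and keeps it for a run of
    ~ n^(1 - alpha) steps, so persistence grows as the walk ages."""
    x, n, decisions = 0, 0, 0
    while n < N:
        direction = rng.choice((-1, 1))   # new direction decision
        decisions += 1
        run = max(1, int((n + 1) ** (1.0 - alpha)))   # persistence length grows with age
        for _ in range(min(run, N - n)):
            x += direction
            n += 1
    return x, decisions

for N in (1000, 10000):
    x, k = autocorrelated_walk(N, 0.5)
    print(N, k)   # for alpha = 0.5, k grows roughly like 2*sqrt(N)
```

With this rule the number of reversal decisions is sub-linear in N, which is the aging behavior the figures illustrate.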
Abstract
1. Introduction: Mini-Review of (c,d)-Entropy
- Khinchin’s first axiom states that for a system with W potential outcomes (states), each of which is given by a probability, p_i ≥ 0, with ∑_{i=1}^{W} p_i = 1, the entropy, S(p_1, …, p_W), as a measure of uncertainty about the system must take its maximum for the equi-distribution, p_i = 1/W, for all i.
- Khinchin’s second axiom (missing in [4]) states that any entropy should remain invariant under adding zero-probability states to the system, i.e., S(p_1, …, p_W) = S(p_1, …, p_W, 0).
- Khinchin’s third axiom (separability axiom) finally makes a statement on the composition of two finite probabilistic systems, A and B. If the systems are independent of each other, entropy should be additive, meaning that the entropy of the combined system, S(A + B), should be the sum of the entropies of the individual systems, S(A + B) = S(A) + S(B). If the two systems are dependent on each other, the entropy of the combined system, i.e., the information given by the realization of the two finite schemes, A and B, S(A + B), is equal to the information gained by a realization of system A, S(A), plus the mathematical expectation of the information gained by a realization of system B after the realization of system A, S(A + B) = S(A) + S(B|A).
- Khinchin’s fourth axiom is the requirement that entropy is a continuous function of all its arguments, p_i, and does not depend on anything else.
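For the BGS case, the second axiom (expansibility) and the independent-systems part of the third axiom (additivity) can be verified directly; a minimal numerical sketch (function names are ours):

```python
import numpy as np

def shannon_entropy(p):
    """BGS entropy S = -sum_i p_i ln p_i, skipping zero-probability states."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

pA = np.array([0.5, 0.3, 0.2])
pB = np.array([0.6, 0.4])

# Second axiom: adding a zero-probability state leaves the entropy unchanged.
assert np.isclose(shannon_entropy(pA), shannon_entropy(np.append(pA, 0.0)))

# Third axiom (independent systems): the joint distribution is the outer
# product p_i * q_j, and the entropy is additive, S(A+B) = S(A) + S(B).
pAB = np.outer(pA, pB).ravel()
assert np.isclose(shannon_entropy(pAB), shannon_entropy(pA) + shannon_entropy(pB))
```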
| Entropy | c | d | Reference |
|---|---|---|---|
| S_{c,d} (general form, Equation (3)) | c | d | |
| S_BGS = ∑_i p_i ln(1/p_i) | 1 | 1 | [5] |
| S_q = (1 − ∑_i p_i^q)/(q − 1), (q < 1) | c = q < 1 | 0 | [6] |
| S_κ = −∑_i p_i (p_i^κ − p_i^{−κ})/(2κ), (0 < κ ≤ 1) | c = 1 − κ | 0 | [8] |
| S_q = (1 − ∑_i p_i^q)/(q − 1), (q > 1) | 1 | 0 | [6] |
| S_b (Curado–Nobre), (b > 0) | 1 | 0 | [9] |
| S_E (Tsekouras–Tsallis) | 1 | 0 | [10] |
| S_η (Anteneodo–Plastino, stretched exponentials), (η > 0) | 1 | 1/η | [7] |
| S_γ | 1 | 1/γ | [12,13] |
| S_β = ∑_i p_i^β ln(1/p_i) (Shafee) | c = β | 1 | [14] |
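The class index c can be read off numerically from the asymptotic scaling relation S(λW)/S(W) → λ^(1−c) at the equi-distribution, which underlies the classification. A short sketch (helper names are ours), using the Tsallis and BGS rows of the table:

```python
import numpy as np

def tsallis_equi(W, q):
    """Tsallis entropy at the equi-distribution p_i = 1/W:
    S_q = (1 - W^(1-q)) / (q - 1)."""
    return (1.0 - W ** (1.0 - q)) / (q - 1.0)

def estimate_c(S, W=1e8, lam=2.0):
    """Estimate c from S(lam*W)/S(W) -> lam^(1-c) at large W."""
    ratio = S(lam * W) / S(W)
    return 1.0 - np.log(ratio) / np.log(lam)

# Tsallis entropy with q = 0.7 lies in class (c, d) = (q, 0):
print(estimate_c(lambda W: tsallis_equi(W, 0.7)))   # close to 0.7

# BGS entropy S = ln W gives c = 1 (approached slowly, via log corrections):
print(estimate_c(np.log))
```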
- SK1: The requirement that S depends continuously on p implies that g is a continuous function.
- SK2: The requirement that the entropy is maximal for the equi-distribution p_i = 1/W (for all i) implies that g is a concave function.
- SK3: The requirement that adding a zero-probability state to a system, with p_{W+1} = 0, does not change the entropy implies that g(0) = 0.
- SK4 (separability axiom): The entropy of a system, composed of sub-systems A and B, equals the entropy of A plus the expectation value of the entropy of B, conditional on A. Note that this also corresponds exactly to Markovian processes.
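In trace form, S = ∑_i g(p_i), conditions SK2 and SK3 become conditions on g alone. A minimal numerical check for the Tsallis kernel g(x) = (x − x^q)/(q − 1) (the function name is ours):

```python
import numpy as np

def g_tsallis(x, q):
    """Trace-form kernel of Tsallis entropy: S_q = sum_i g(p_i)."""
    return (x - x**q) / (q - 1.0)

q = 0.5

# SK3: g(0) = 0, so zero-probability states do not contribute to S.
assert np.isclose(g_tsallis(0.0, q), 0.0)

# SK2: g is concave on (0, 1), so S is maximal at the equi-distribution.
# Check that the second finite difference is negative on a uniform grid.
xs = np.linspace(1e-6, 1.0, 1000)
assert np.all(np.diff(g_tsallis(xs, q), n=2) < 0)
```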
- Boltzmann–Gibbs entropy belongs to the (c, d) = (1, 1) class. One gets from Equation (3) the familiar form, S_BGS = ∑_i p_i ln(1/p_i).
- Tsallis entropy belongs to the (c, d) = (q, 0) class. From Equation (3) and a suitable choice of constants (see below), we get S_q = (1 − ∑_i p_i^q)/(q − 1).
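The Tsallis form reduces to the BGS form in the limit q → 1, consistent with the two classes meeting at (1, 1); a quick numerical check (helper names are ours):

```python
import numpy as np

def tsallis(p, q):
    """S_q = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def bgs(p):
    """S_BGS = -sum_i p_i ln p_i."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

p = np.array([0.5, 0.25, 0.125, 0.125])
for q in (1.1, 1.01, 1.001):
    print(q, tsallis(p, q))   # approaches bgs(p) = 1.75*ln(2) ≈ 1.213 as q -> 1
```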
1.1. Distribution Functions
Special Cases of Distribution Functions
1.2. How to Determine the Exponents, c and d
1.3. A Note on Rényi-type Entropies
2. Aging Random Walks
2.1. Accelerating and Auto-Correlated Random Walks
2.2. Generalization to Aging (Path-Dependent) Random Walks
2.3. General Classes of Aging Random Walks
3. Conclusions
Conflicts of Interest
References
- Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. Europhys. Lett. 2011, 93, 20006. [Google Scholar] [CrossRef]
- Hanel, R.; Thurner, S. When do generalized entropies apply? How phase space volume determines entropy. Europhys. Lett. 2011, 96, 50003. [Google Scholar] [CrossRef]
- Thurner, S.; Hanel, R. What Do Generalized Entropies Look Like? An Axiomatic Approach for Complex, Non-Ergodic Systems. In Recent Advances in Generalized Information Measures and Statistics; Kowalski, A.M., Rossignoli, R., Curado, E.M.F., Eds.; Bentham Science eBook: Sharjah, United Arab Emirates, 2013; in press. [Google Scholar]
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656. [Google Scholar]
- Khinchin, A.I. Mathematical Foundations of Information Theory; Dover Publications: Mineola, NY, USA, 1957. [Google Scholar]
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
- Anteneodo, C.; Plastino, A.R. Maximum entropy approach to stretched exponential probability distributions. J. Phys. A: Math. Gen. 1999, 32, 1089–1097. [Google Scholar] [CrossRef]
- Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125. [Google Scholar] [CrossRef]
- Curado, E.M.F.; Nobre, F.D. On the stability of analytic entropic forms. Physica A 2004, 335, 94–106. [Google Scholar] [CrossRef]
- Tsekouras, G.A.; Tsallis, C. Generalized entropy arising from a distribution of q indices. Phys. Rev. E 2005, 71, 046144. [Google Scholar] [CrossRef]
- Hanel, R.; Thurner, S. Generalized Boltzmann factors and the maximum entropy principle: Entropies for complex systems. Physica A 2007, 380, 109–114. [Google Scholar] [CrossRef]
- Ubriaco, M.R. Entropies based on fractional calculus. Phys. Lett. A 2009, 373, 2516–2519. [Google Scholar] [CrossRef]
- Tsallis, C. Introduction to Nonextensive Statistical Mechanics; Springer: New York, NY, USA, 2009. [Google Scholar]
- Shafee, F. Lambert function and a new non-extensive form of entropy. IMA J. Appl. Math. 2007, 72, 785–800. [Google Scholar] [CrossRef]
- Hanel, R.; Thurner, S.; Gell-Mann, M. Generalized entropies and the transformation group of superstatistics. Proc. Natl. Acad. Sci. USA 2011, 108, 6390–6394. [Google Scholar] [CrossRef]
- Hanel, R.; Thurner, S.; Gell-Mann, M. Generalized entropies and logarithms and their duality relations. Proc. Natl. Acad. Sci. USA 2012, 109, 19151–19154. [Google Scholar] [CrossRef] [PubMed]
- Lesche, B. Instabilities of Rényi entropies. J. Stat. Phys. 1982, 27, 419–422. [Google Scholar] [CrossRef]
- Abe, S. Stability of Tsallis entropy and instabilities of Rényi and normalized Tsallis entropies. Phys. Rev. E 2002, 66, 046134. [Google Scholar] [CrossRef]
- Jizba, P.; Arimitsu, T. Observability of Rényi's entropy. Phys. Rev. E 2004, 69, 026128. [Google Scholar] [CrossRef]
- Kaniadakis, G.; Scarfone, A.M. Lesche stability of κ-entropy. Physica A 2004, 340, 102–109. [Google Scholar] [CrossRef]
- Hanel, R.; Thurner, S.; Tsallis, C. On the robustness of q-expectation values and Rényi entropy. Europhys. Lett. 2009, 85, 20005. [Google Scholar] [CrossRef]
- Tsallis, C.; Gell-Mann, M.; Sato, Y. Asymptotically scale-invariant occupancy of phase space makes the entropy Sq extensive. Proc. Natl. Acad. Sci. USA 2005, 102, 15377–15382. [Google Scholar] [CrossRef] [PubMed]
© 2013 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).
Share and Cite
Hanel, R.; Thurner, S. Generalized (c,d)-Entropy and Aging Random Walks. Entropy 2013, 15, 5324-5337. https://doi.org/10.3390/e15125324