Information Entropy of Chaos

  • Tutorial


Introduction


There are many publications on Habr that deal with the concept of entropy; here are just a few of them [1–5]. They were received positively and aroused considerable interest among readers. It is enough to quote the definition of entropy given by the author of [1]: "entropy is how much information you do not know about the system." There are also plenty of publications on Habr about the phenomenon of chaos [6–9]. However, the relationship between entropy and chaos was not considered in either group of publications.

This is due to the fact that different areas of knowledge distinguish different kinds of chaos:

  • informational;
  • thermodynamic;
  • differential;
  • cultural.

Describing measures of chaos with due regard for their specifics, even within just one of these areas, is quite difficult.

Trying to simplify the task as much as possible, I decided to examine the relationship between information entropy and chaos by comparing the regions of transition from order to chaos on bifurcation diagrams of point maps with the same regions on plots of the entropy coefficient.

What came of this, you will learn below the cut.

The mechanisms of transition from order to chaos


An analysis of the mechanisms of transition from order to chaos in real systems and in various models has revealed the universality of a relatively small number of scenarios of the transition to chaos. The transition to chaos can be represented by a bifurcation diagram (the term "bifurcation" denotes a qualitative restructuring of a system accompanied by the emergence of a new mode of its behavior).

The entry of a system into an unpredictable regime is described by a cascade of bifurcations following one after another. The cascade first leads to a choice between two solutions, then four, and so on; through this successive doubling of the number of possible values the system ends up oscillating in a chaotic, turbulent regime.

We consider period-doubling bifurcations and the appearance of chaos in point maps. A map is a function that expresses the next value of the system variable in terms of the previous one:

$x_{n+1} = f(x_n) = \lambda x_n (1 - x_n)$

We will also consider a second commonly used function:

$x_{n+1} = f(x_n) = \lambda \cdot x_n \cdot (1 - x_n^2)$

Point maps are used to study objects in discrete rather than continuous time. On passing to a map, the dimension of the system under study may decrease.

As the external parameter $\lambda$ is varied, point maps exhibit rather complicated behavior, which becomes chaotic for sufficiently large $\lambda$. Chaos here means a very fast divergence of trajectories in phase space.
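To get a feel for this divergence, here is a minimal sketch of my own (not one of the article's listings): two trajectories of the map $x_{n+1} = \lambda x_n (1 - x_n)$ at $\lambda = 4$, started $10^{-6}$ apart, quickly end up in completely different places.

# A minimal sketch (not from the original listings): sensitivity to initial conditions.
# Two trajectories of the logistic map at lambda = 4, started 1e-6 apart.
lam = 4.0
x, y = 0.3, 0.3 + 1e-6      # two close initial conditions (the values are arbitrary)
for n in range(1, 51):
    x = lam * x * (1 - x)
    y = lam * y * (1 - y)
    if n % 10 == 0:
        print(n, abs(x - y))   # the distance grows to order 1 within a few dozen steps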

A bifurcation is a qualitative restructuring of the picture of motion. The values of the control parameter at which bifurcations occur are called critical, or bifurcation, values.

To build the diagrams, we will use the following two listings:

No. 1. For the function $x_{n+1} = f(x_n) = \lambda x_n (1 - x_n)$:

Program listing
# -*- coding: utf8 -*-
import matplotlib.pyplot as plt
from numpy import arange

def f(a, x0):
    xf = (a - 1) / a              # fixed point of the map (not used for plotting)
    def ff(x):                    # logistic function
        return a * x * (1 - x)
    def fl(x):                    # identity line y = x
        return x
    x = x0; y = 0; Y = []; X = []
    for i in arange(1, 1000, 1):  # cobweb: successive iterations of the map
        X.append(x)
        Y.append(y)
        y = ff(x)
        X.append(x)
        Y.append(y)
        x = y
    plt.title('Diagram of the logistic function\n'
              r'$x_{n+1}=\lambda \cdot x_{n}\cdot (1-x_{n})$ at $\lambda$=%s and x0=%s' % (a, x0))
    plt.plot(X, Y, 'r')
    x1 = arange(0, 1, 0.001)
    y1 = [ff(x) for x in x1]
    y2 = [fl(x) for x in x1]
    plt.plot(x1, y1, 'b')
    plt.plot(x1, y2, 'g')
    plt.grid(True)
    plt.show()
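
Listing No. 1 only defines the function f(a, x0); to actually draw a diagram it has to be called with specific parameters. For example, the calls below (a usage sketch of my own) correspond to the (λ, x0) pairs used in the text:

# Example calls of f(a, x0) from listing No. 1 (each opens a separate plot window)
f(0.95, 0.47)   # 0 < lambda < 1: single stable fixed point x* = 0
f(2.5, 0.7)     # 1 < lambda < 3: stable fixed point x1* = 1 - 1/lambda
f(3.43, 0.7)    # 3 < lambda < 3.45: 2-cycle
f(3.55, 0.2)    # 3.45 < lambda < 4: 4-cycle and further doublings
f(4.0, 0.3)     # lambda = 4: chaos (the value of x0 here is arbitrary)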

No. 2. For the function $x_{n+1} = f(x_n) = \lambda \cdot x_n \cdot (1 - x_n^2)$:

Program listing
# -*- coding: utf8 -*-
import matplotlib.pyplot as plt
from numpy import arange

def f(a, x0):
    xf = ((a - 1) / a) ** 0.5     # fixed point of the map (not used for plotting)
    def ff(x):                    # logistic function of the second kind
        return a * x * (1 - x ** 2)
    def fl(x):                    # identity line y = x
        return x
    x = x0; y = 0; Y = []; X = []
    for i in arange(1, 1000, 1):  # cobweb: successive iterations of the map
        X.append(x)
        Y.append(y)
        y = ff(x)
        X.append(x)
        Y.append(y)
        x = y
    plt.title('Diagram of the logistic function\n'
              r'$x_{n+1}=\lambda \cdot x_{n}\cdot (1-x_{n}^{2})$ at $\lambda$=%s and x0=%s' % (a, x0))
    plt.plot(X, Y, 'r')
    x1 = arange(0, 1, 0.001)
    y1 = [ff(x) for x in x1]
    y2 = [fl(x) for x in x1]
    plt.plot(x1, y1, 'b')
    plt.plot(x1, y2, 'g')
    plt.grid(True)
    plt.show()
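
Similarly, listing No. 2 defines f(a, x0) for the cubic map; the calls below (a usage sketch of my own) reproduce the cases discussed further on:

# Example calls of f(a, x0) from listing No. 2
f(0.5, 0.2)      # 0 < lambda <= 1: single stable fixed point x* = 0
f(1.8, 0.55)     # 1 < lambda <= 1.998: stable point x1* = sqrt(1 - 1/lambda)
f(2.2, 0.2)      # 1.99 < lambda <= 2.235: 2-cycle
f(2.287, 0.2)    # 2.235 < lambda <= 2.598: cascade of doublings
f(2.59, 0.2)     # lambda = 2.59: chaos (the value of x0 here is arbitrary)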


To assess how the form of the logistic function affects the critical values of $\lambda$, consider diagrams for the function $x_{n+1} = f(x_n) = \lambda x_n (1 - x_n)$; for this we will use listing No. 1.

At $0 < \lambda < 1$, for $\lambda = 0.95$ and $x_0 = 0.47$, we get the diagram:



In this case, the map has a single fixed point $x^* = 0$, which is stable.

At $1 < \lambda < 3$, for $\lambda = 2.5$ and $x_0 = 0.7$, we obtain the diagram:



On the interval [0, 1] one more stable fixed point appears: $x_1^* = 1 - 1/\lambda$.

At $1 < \lambda < 3$, for $\lambda = 2.5$ and $x_0 = 0.01$, we get the diagram:



The fixed point $x^* = 0$ loses stability.

At $3 < \lambda < 3.45$, for $\lambda = 3.43$ and $x_0 = 0.7$, we get the diagram:



The map undergoes a bifurcation: the fixed point $x_1^*$ becomes unstable, and a 2-cycle appears instead.

At $3.45 < \lambda < 4.0$, for $\lambda = 3.55$ and $x_0 = 0.2$, we get the diagram:



As the parameter $\lambda$ passes through the value $\lambda \approx 3.45$, the 2-cycle turns into a 4-cycle, and so on.
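
This doubling is easy to verify numerically. The sketch below (my own illustration, not one of the article's listings) discards a long transient and then counts how many distinct values the orbit keeps visiting; the parameter values 2.8, 3.4 and 3.5 are chosen because they are known to give cycles of period 1, 2 and 4.

# A minimal sketch (not from the original listings): attractor period of the map
# x_{n+1} = lam*x*(1-x), estimated by counting distinct orbit values after a transient.
def period(lam, x0=0.5, transient=2000, sample=64, digits=6):
    x = x0
    for _ in range(transient):      # let the orbit settle onto the attractor
        x = lam * x * (1 - x)
    seen = set()
    for _ in range(sample):         # collect the values the orbit keeps visiting
        x = lam * x * (1 - x)
        seen.add(round(x, digits))
    return len(seen)

print(period(2.8))   # 1 -- stable fixed point
print(period(3.4))   # 2 -- 2-cycle
print(period(3.5))   # 4 -- 4-cycle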

At the limiting value $\lambda = 4$ the system has unstable cycles of all possible periods:



To assess the influence of the form of the logistic function on the critical values of $\lambda$, consider diagrams for the function $x_{n+1} = f(x_n) = \lambda \cdot x_n \cdot (1 - x_n^2)$; for this we will use listing No. 2.

At $0 < \lambda \le 1.0$, for $\lambda = 0.5$ and $x_0 = 0.2$:



The map has a single fixed point $x^* = 0$, which is stable.

At $1 < \lambda \le 1.998\ldots$, for $\lambda = 1.8$ and $x_0 = 0.55$:



The point $x^* = 0$ loses stability, and a new stable point $x_1^*$ appears.

At $1.99 < \lambda \le 2.235\ldots$, for $\lambda = 2.2$ and $x_0 = 0.2$:



A period-doubling bifurcation occurs, and a 2-cycle appears. A further increase of $\lambda$ leads to a cascade of period-doubling bifurcations.

At $2.235 < \lambda \le 2.5980\ldots$, for $\lambda = 2.287$ and $x_0 = 0.2$:



The increase of $\lambda$ has led to a cascade of period-doubling bifurcations.

At $\lambda = 2.59$ the system has unstable cycles of all possible periods:



As the diagrams show, with an increase in the order of the logistic function the range of variation of $\lambda$ narrows.

Using the diagrams we have traced the path from order to chaos, setting values of $\lambda$ for the different logistic functions. It remains to answer the question: how can chaos be measured? For some of the kinds of chaos listed at the beginning of the article the answer is known: entropy is a measure of chaos. This answer fully applies to information chaos; however, which entropy should be used here, and how it relates to the numerical values of $\lambda$ already considered, is the question I will try to answer in the next part of the article.

Information entropy and entropy coefficient


We will consider the binary information entropy of independent random events $x$ with $n$ possible states distributed with probabilities $p_i\ (i = 1, \ldots, n)$. The binary information entropy is calculated by the formula:

$H(x) = -\sum_{i=1}^{n} p_i \cdot \log_{2}(p_i)$

This quantity is also called the mean entropy of a message. The quantity $H_i = -\log_{2}(p_i)$ is called the partial entropy, characterizing only the $i$-th state. In general, the base of the logarithm in the definition of entropy can be any number greater than 1; its choice determines the unit in which entropy is measured.
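
For example (a small sketch of my own), for a fair coin with $p_1 = p_2 = 0.5$ the formula gives exactly 1 bit, and the entropy drops as the distribution becomes less uniform:

# A minimal sketch (not from the original listings): binary information entropy
from numpy import log2

def H(p):
    # H(x) = -sum p_i * log2(p_i); terms with p_i = 0 contribute nothing
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(H([0.5, 0.5]))    # 1.0 bit  (fair coin)
print(H([0.9, 0.1]))    # ~0.469 bits
print(H([0.25]*4))      # 2.0 bits (four equally likely states)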

We will use decimal logarithms, in which entropy and information are measured in decimal units (dits). The amount of information will be calculated correctly when, for example, the variables $X$ and $\Delta$ are substituted into the corresponding entropy expressions in arbitrary, but necessarily identical, units. Indeed:

$q = H(x) - H(\Delta) = \log_{10}(X_2 - X_1) - \log_{10}(2\Delta) = \log_{10}\left(\frac{X_2 - X_1}{2\Delta}\right)$

where $X$ and $\Delta$ must be expressed in the same units.
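
For instance, if a quantity may lie anywhere in a range $X_2 - X_1 = 10$ V and the measurement uncertainty is $\Delta = 0.05$ V (the numbers are purely illustrative), then

$q = \log_{10}\left(\frac{10}{2 \cdot 0.05}\right) = \log_{10}(100) = 2,$

that is, the measurement distinguishes $10^2 = 100$ gradations of the quantity.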

An estimate of the entropy value of a random variable from experimental data is obtained from a histogram using the following relation:

$\Delta_e = \frac{1}{2}\, 10^{H(x)} = \frac{d}{2} \prod_{i=1}^{m} \left(\frac{n}{n_i}\right)^{\frac{n_i}{n}} = \frac{d n}{2}\, 10^{-\frac{1}{n} \sum_{i=1}^{m} n_i \log_{10}(n_i)}$

where: $d$ is the width of each histogram bin; $m$ is the number of bins; $n$ is the total number of data points; $n_i$ is the number of data points in the $i$-th bin.

The entropy coefficient is determined from the ratio:

$k_e = \frac{\Delta_e}{\sigma}$

where $\sigma$ is the standard deviation.
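
Before applying these formulas to the logistic map, it is useful to check them on data with a known answer. The sketch below (my own, built on the same histogram-based estimate as the listings that follow) computes $\Delta_e$ and $k_e$ for a sample from a normal distribution; for a Gaussian the entropy coefficient is known to be $k_e = \sqrt{\pi e / 2} \approx 2.07$.

# A minimal sketch (not from the original listings): Delta_e and k_e from a histogram,
# checked on Gaussian data, for which k_e should come out close to 2.07.
from numpy import histogram, log10, std
from numpy.random import normal

def entropy_coefficient(a, bins=100):
    counts, edges = histogram(a, bins=bins)
    n = len(a)                         # total number of points
    d = edges[1] - edges[0]            # bin width d
    # Delta_e = (d*n/2) * 10**(-(1/n) * sum(n_i * log10(n_i)))
    delta_e = 0.5 * d * n * 10 ** (-sum(w * log10(w) for w in counts if w > 0) / n)
    return delta_e / std(a)            # k_e = Delta_e / sigma

print(entropy_coefficient(normal(0.0, 1.0, 100000)))   # approximately 2.07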

Information entropy as a measure of chaos


To analyze the phenomena of information chaos using the entropy coefficient, we first build a branching (bifurcation) diagram for the function $x_{n+1} = f(x_n) = \lambda x_n (1 - x_n)$, marking on it the transition regions obtained while constructing the histograms:

Branch diagram
import matplotlib.pyplot as plt
from numpy import arange

N=1000
y=[0.5]                                 # initial value x0 = 0.5
for r in arange(3.58,3.9,0.0001):
    for n in arange(1,N,1):             # iterate the map x_{n+1} = r*x_n*(1-x_n)
        y.append(round(r*y[n-1]*(1-y[n-1]),4))
    y=y[N-250:N]                        # keep only 250 late iterates (transient discarded)
    x=[r]*250
    plt.plot(x,y, color='black', linestyle=' ', marker='.', markersize=1)
plt.figure(1)
plt.title(r"Branching diagram for 3.58 <= $\lambda$ <= 3.9")
plt.xlabel(r"$\lambda$")
plt.ylabel("x")
plt.axvline(x=3.63,color='black',linestyle='--')
plt.axvline(x=3.74,color='black',linestyle='--')
plt.axvline(x=3.83,color='black',linestyle='--')
plt.axvline(x=3.9,color='black',linestyle='--')
plt.show()


We get:



Now we plot the entropy coefficient over the same range of $\lambda$:

The graph for the entropy coefficient
import matplotlib.pyplot as plt
from numpy import arange, histogram, log10, std

data_k=[]
m='auto'                                  # number of histogram bins ('auto' on the first pass)
lambdas=arange(3.58,3.9,0.0001)
for p in lambdas:
    a=[0.5]                               # trajectory of the map, starting from x0 = 0.5
    for i in arange(1,1001,1):
        a.append(p*a[i-1]*(1-a[i-1]))     # x_{n+1} = lambda*x_n*(1-x_n)
    n=len(a)
    z=histogram(a, bins=m)
    if type(m) is str:
        m=len(z[0])                       # freeze the bin count after the first histogram
    y=z[0]                                # bin counts n_i
    d=z[1][1]-z[1][0]                     # bin width d
    # Delta_e = (d*n/2)*10**(-(1/n)*sum(n_i*log10(n_i)))
    h=0.5*d*n*10**(-sum([w*log10(w) for w in y if w!=0])/n)
    ke=round(h/std(a),3)                  # entropy coefficient k_e = Delta_e / sigma
    data_k.append(ke)
plt.title(r"Entropy coefficient ke for 3.58 <= $\lambda$ <= 3.9")
plt.plot(lambdas,data_k)
plt.xlabel(r"$\lambda$")
plt.ylabel("ke")
plt.axvline(x=3.63,color='black',linestyle='--')
plt.axvline(x=3.74,color='black',linestyle='--')
plt.axvline(x=3.83,color='black',linestyle='--')
plt.axvline(x=3.9,color='black',linestyle='--')
plt.grid()
plt.show()


We get:



Comparing the diagram and the graph, we see that the same regions are picked out identically on the branching diagram and on the entropy-coefficient plot for the function $x_{n+1} = f(x_n) = \lambda x_n (1 - x_n)$.

For further analysis of information chaos using the entropy coefficient, we build a branching diagram for the logistic function $x_{n+1} = f(x_n) = \lambda \cdot x_n \cdot (1 - x_n^2)$, again marking the transition regions:

Branch diagram
import matplotlib.pyplot as plt
from numpy import arange

N=1000
y=[0.5]                                 # initial value x0 = 0.5
for r in arange(2.25,2.56,0.0001):
    for n in arange(1,N,1):             # iterate the map x_{n+1} = r*x_n*(1-x_n**2)
        y.append(round(r*y[n-1]*(1-(y[n-1])**2),4))
    y=y[N-250:N]                        # keep only 250 late iterates (transient discarded)
    x=[r]*250
    plt.plot(x,y, color='black', linestyle=' ', marker='.', markersize=1)
plt.figure(1)
plt.title(r"Branching diagram for 2.25 <= $\lambda$ <= 2.56")
plt.xlabel(r"$\lambda$")
plt.ylabel("y")
plt.axvline(x=2.34,color='black',linestyle='--')
plt.axvline(x=2.39,color='black',linestyle='--')
plt.axvline(x=2.45,color='black',linestyle='--')
plt.axvline(x=2.49,color='black',linestyle='--')
plt.axvline(x=2.56,color='black',linestyle='--')
plt.show()


We get:



Now we plot the entropy coefficient over the same range of $\lambda$:

Entropy coefficient graph
import matplotlib.pyplot as plt
from numpy import arange, histogram, log10, std

data_k=[]
m='auto'                                  # number of histogram bins ('auto' on the first pass)
lambdas=arange(2.25,2.56,0.0001)
for p in lambdas:
    a=[0.5]                               # trajectory of the map, starting from x0 = 0.5
    for i in arange(1,1001,1):
        a.append(p*a[i-1]*(1-a[i-1]**2))  # x_{n+1} = lambda*x_n*(1-x_n**2)
    n=len(a)
    z=histogram(a, bins=m)
    if type(m) is str:
        m=len(z[0])                       # freeze the bin count after the first histogram
    y=z[0]                                # bin counts n_i
    d=z[1][1]-z[1][0]                     # bin width d
    # Delta_e = (d*n/2)*10**(-(1/n)*sum(n_i*log10(n_i)))
    h=0.5*d*n*10**(-sum([w*log10(w) for w in y if w!=0])/n)
    ke=round(h/std(a),3)                  # entropy coefficient k_e = Delta_e / sigma
    data_k.append(ke)
plt.figure(2)
plt.title(r"Entropy coefficient ke for 2.25 <= $\lambda$ <= 2.56")
plt.plot(lambdas,data_k)
plt.xlabel(r"$\lambda$")
plt.ylabel("ke")
plt.axvline(x=2.34,color='black',linestyle='--')
plt.axvline(x=2.39,color='black',linestyle='--')
plt.axvline(x=2.45,color='black',linestyle='--')
plt.axvline(x=2.49,color='black',linestyle='--')
plt.axvline(x=2.56,color='black',linestyle='--')
plt.grid()
plt.show()


We get:



Comparing the diagram and the graph, we again see that the regions are picked out identically on the branching diagram and on the entropy-coefficient plot for the function $x_{n+1} = f(x_n) = \lambda \cdot x_n \cdot (1 - x_n^2)$.

Conclusions:


The article solves a tutorial problem: is information entropy a measure of chaos? Using Python, the answer turns out to be affirmative: the regions of transition to chaos on the branching diagrams coincide with the regions picked out by the entropy coefficient.

References


  1. Entropy? It's simple!
  2. Introduction to the concept of entropy and its many faces.
  3. Entropy and decision trees.
  4. A treatise on entropy.
  5. Entropy and WinRAR.
  6. Mathematical models of chaos.
  7. A bit about chaos and how to create it.
  8. A critical look at the Lorenz attractor.
  9. FPGA chaos generators.
