1.
Phys Rev E ; 106(3-1): 034127, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36266815

ABSTRACT

We propose two different approaches for introducing the information temperature of binary Nth-order Markov chains. The first is based on comparing Markov sequences with equilibrium Ising chains at given temperatures. The second uses the occurrence probabilities of finite-length subsequences of symbols, which determine their entropies. The derivative of the entropy with respect to the energy gives the information temperature, measured on the scale of the introduced energy. For the case of a nearest-neighbor spin-symbol interaction, both approaches give similar results; however, the method based on the correspondence between N-step Markov and Ising chains becomes very cumbersome for N>3. We also introduce the information temperature for weakly correlated one-parametric Markov chains and present results for stepwise and power-law memory functions. An application of the developed method to the information temperature of some literary texts is given.
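The entropy side of the second approach can be illustrated with a minimal sketch (not the paper's code; the chain parameters and helper names are assumptions): estimate block entropies of a binary first-order Markov chain from the occurrence probabilities of its finite-length subsequences, then take the per-symbol conditional entropy as their difference.

```python
import math
import random
from collections import Counter

def markov_chain(n, p_flip, seed=0):
    """Binary first-order Markov chain: repeat the previous symbol,
    flipping it with probability p_flip."""
    rng = random.Random(seed)
    seq = [0]
    for _ in range(n - 1):
        seq.append(seq[-1] ^ int(rng.random() < p_flip))
    return seq

def block_entropy(seq, L):
    """Shannon entropy (bits) of length-L subsequences, estimated from
    their occurrence probabilities in the sequence."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

seq = markov_chain(100_000, p_flip=0.2)
# Conditional entropy h_2 = H_2 - H_1; for this chain the exact value is
# the binary entropy of p_flip, roughly 0.722 bits.
h2 = block_entropy(seq, 2) - block_entropy(seq, 1)
```

Higher-order chains would use longer blocks in the same estimator; the information temperature then follows from how such entropies vary with the introduced energy scale.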

2.
Phys Rev E ; 102(2-1): 022119, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32942436

ABSTRACT

Considering symbolic and numerical random sequences within the additive Markov chain approach, we establish a relation between their correlation functions and conditional entropies. We express the entropy by means of two-point probability distribution functions and then evaluate it for a numerical random chain in terms of the correlation function. We show that this approximation gives a satisfactory result only for special types of random sequences; in the general case, the conditional entropy of numerical sequences obtained in the two-point distribution function approach is lower. We derive the conditional entropy of the additive Markov chain as a sum involving the Kullback-Leibler mutual information and give an example of a random sequence with an exactly zero correlation function but nonzero correlations.
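The correlation-function side of this relation is easy to probe numerically. A minimal sketch (helper names are assumptions, not the authors' code) estimates the normalized two-point correlation function C(r) of a binary first-order Markov chain, for which the exact value is (1 - 2*p_flip)**r:

```python
import random

def markov_chain(n, p_flip, seed=0):
    """Binary first-order Markov chain: flip the previous symbol with
    probability p_flip."""
    rng = random.Random(seed)
    seq = [0]
    for _ in range(n - 1):
        seq.append(seq[-1] ^ int(rng.random() < p_flip))
    return seq

def correlation(seq, r):
    """Normalized two-point correlation function
    C(r) = (<x_i x_{i+r}> - <x>^2) / (<x^2> - <x>^2)."""
    n = len(seq) - r
    mean = sum(seq) / len(seq)
    var = sum((x - mean) ** 2 for x in seq) / len(seq)
    cov = sum((seq[i] - mean) * (seq[i + r] - mean) for i in range(n)) / n
    return cov / var

seq = markov_chain(100_000, p_flip=0.2)
# For this chain C(r) = (1 - 2*p_flip)**r, so C(1) ~ 0.6 and C(2) ~ 0.36.
c1, c2 = correlation(seq, 1), correlation(seq, 2)
```

An entropy estimate built from C(r) alone is the two-point approximation the abstract discusses; the abstract's zero-correlation counterexample shows why it can miss higher-order structure.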
