1.
Entropy (Basel) ; 24(9), 2022 Sep 11.
Article in English | MEDLINE | ID: mdl-36141168

ABSTRACT

We compare and contrast three different, but complementary views of "structure" and "pattern" in spatial processes. For definiteness and analytical clarity, we apply all three approaches to the simplest class of spatial processes: one-dimensional Ising spin systems with finite-range interactions. These noncritical systems are well suited for this study since the change in structure as a function of system parameters is more subtle than that found in critical systems where, at a phase transition, many observables diverge, thereby making the detection of change in structure obvious. This survey demonstrates that the measures of pattern from information theory and computational mechanics differ from known thermodynamic and statistical mechanical functions. Moreover, they capture important structural features that are otherwise missed. In particular, a type of mutual information called the excess entropy, an information-theoretic measure of memory, serves to detect ordered, low-entropy-density patterns. It is superior in several respects to other functions used to probe structure, such as magnetization and structure factors. ϵ-Machines, the main objects of computational mechanics, are seen to be the most direct approach to revealing the (group and semigroup) symmetries possessed by the spatial patterns and to estimating the minimum amount of memory required to reproduce the configuration ensemble, a quantity known as the statistical complexity. Finally, we argue that the information-theoretic and computational mechanical analyses of spatial patterns capture the intrinsic computational capabilities embedded in spin systems: how they store, transmit, and manipulate configurational information to produce spatial structure.
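As a concrete companion to this abstract, here is a minimal sketch (not code from the paper) that estimates the entropy rate and excess entropy of the zero-field nearest-neighbor chain it studies. It relies on the standard fact that such a chain at inverse temperature beta and coupling J is equivalent to a two-state Markov chain with P(s_{i+1} = s_i) = e^{beta J} / (2 cosh(beta J)); the parameter values, function names, and block lengths are illustrative assumptions.

    import math
    import random
    from collections import Counter

    def sample_ising_chain(n, beta, j, seed=0):
        # Zero-field 1D nearest-neighbor Ising chain, sampled as the
        # equivalent two-state Markov chain:
        # P(s_{i+1} = s_i) = e^{beta*j} / (2*cosh(beta*j)).
        rng = random.Random(seed)
        p_same = math.exp(beta * j) / (2.0 * math.cosh(beta * j))
        spins = [rng.choice((-1, 1))]
        for _ in range(n - 1):
            spins.append(spins[-1] if rng.random() < p_same else -spins[-1])
        return spins

    def block_entropy(spins, length):
        # Empirical Shannon entropy (in bits) of length-`length` blocks.
        counts = Counter(tuple(spins[i:i + length])
                         for i in range(len(spins) - length + 1))
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values())

    spins = sample_ising_chain(200_000, beta=1.0, j=1.0)
    H = [0.0] + [block_entropy(spins, ell) for ell in range(1, 9)]
    h = [H[ell] - H[ell - 1] for ell in range(1, 9)]  # h(L) = H(L) - H(L-1)
    h_mu = h[-1]                      # crude estimate of the entropy rate
    E = sum(h_L - h_mu for h_L in h)  # excess entropy: summed convergence gap
    print(f"h_mu ~ {h_mu:.4f} bits/spin, E ~ {E:.4f} bits")

Because the chain is order-1 Markov, the conditional block entropies settle after L = 2, so the estimated E here reduces to H(1) - h_mu: the memory stored in a single spin's correlation with its past.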

2.
Chaos ; 21(3): 037114, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21974677

ABSTRACT

We calculate the local contributions to the Shannon entropy and excess entropy and use these information-theoretic measures as quantitative probes of the order arising from quenched disorder in the diluted Ising antiferromagnet on a triangular lattice. When one sublattice is sufficiently diluted, the system undergoes a temperature-driven phase transition, with the other two sublattices developing magnetizations of equal magnitude and opposite sign as the system is cooled.(1) The diluted sublattice has no net magnetization but exhibits spin glass ordering. The distribution of local entropies broadens dramatically at low temperatures, indicating that the system's total entropy is not shared equally across the lattice. The entropy contributions from some regions exhibit local reentrance, although the entropy of the system as a whole decreases monotonically, as expected. The average excess entropy shows a sharp peak at the critical temperature, indicating that it is sensitive to the structural changes that accompany the spin glass ordering.
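One standard way to make "local contributions to the Shannon entropy" precise is the chain-rule decomposition sketched below; whether the paper uses exactly this site ordering is an assumption here. Scanning the N lattice sites in a fixed order and attributing to site i the uncertainty of its spin given all previously scanned spins,

    H[s_1, \ldots, s_N] \;=\; \sum_{i=1}^{N} H[\, s_i \mid s_1, \ldots, s_{i-1} \,],

the per-site terms sum exactly to the total entropy, so the reported broadening of the local-entropy distribution is a spread among these terms rather than any change in their sum.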

3.
Chaos ; 18(4): 043106, 2008 Dec.
Article in English | MEDLINE | ID: mdl-19123616

ABSTRACT

Intrinsic computation refers to how dynamical systems store, structure, and transform historical and spatial information. By graphing a measure of structural complexity against a measure of randomness, complexity-entropy diagrams display the different kinds of intrinsic computation across an entire class of systems. Here, we use complexity-entropy diagrams to analyze intrinsic computation in a broad array of deterministic nonlinear and linear stochastic processes, including maps of the interval, cellular automata, Ising spin systems in one and two dimensions, Markov chains, and probabilistic minimal finite-state machines. Since complexity-entropy diagrams are a function only of observed configurations, they can be used to compare systems without reference to system coordinates or parameters. In special cases, complexity-entropy diagrams have long been known to reveal that high degrees of information processing are associated with phase transitions in the underlying process space, the so-called "edge of chaos." Generally, though, complexity-entropy diagrams differ substantially in character, demonstrating a genuine diversity of distinct kinds of intrinsic computation.
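For the Markov-chain case mentioned in the abstract, both coordinates of a complexity-entropy diagram can be computed in closed form: for an order-1 Markov process, H(L) = H(1) + (L - 1) h_mu, so the excess entropy reduces to E = H(1) - h_mu. The sketch below (illustrative names and grid, not from the paper) sweeps the two transition probabilities of a binary Markov chain and emits (h_mu, E) points.

    import math

    def h2(p):
        # Binary Shannon entropy in bits.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def markov_point(a, b):
        # Entropy rate h_mu and excess entropy E for the binary Markov
        # chain with P(1|0) = a and P(0|1) = b; its stationary
        # distribution is pi = (b, a) / (a + b).
        pi0 = b / (a + b)
        h_mu = pi0 * h2(a) + (1 - pi0) * h2(b)
        E = h2(pi0) - h_mu  # exact for order-1 Markov processes
        return h_mu, E

    # Sweep a parameter grid; each (h_mu, E) pair is one point in the
    # complexity-entropy plane.
    points = [markov_point(a / 20, b / 20)
              for a in range(1, 20) for b in range(1, 20)]
    for h_mu, E in points[:5]:
        print(f"h_mu = {h_mu:.3f}, E = {E:.3f}")

Plotting E against h_mu over the sweep traces out the region of the complexity-entropy plane that this process class can occupy, which is exactly the kind of comparison the diagrams are built for.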


Subject(s)
Algorithms; Computer Simulation; Information Storage and Retrieval/methods; Models, Statistical; Nonlinear Dynamics; Entropy
4.
Phys Rev E Stat Nonlin Soft Matter Phys ; 67(5 Pt 1): 051104, 2003 May.
Article in English | MEDLINE | ID: mdl-12786131

ABSTRACT

We develop information-theoretic measures of spatial structure and pattern in more than one dimension. As is well known, the entropy density of a two-dimensional configuration can be efficiently and accurately estimated via a converging sequence of conditional entropies. We show that the manner in which these conditional entropies converge to their asymptotic value serves as a measure of global correlation and structure for spatial systems in any dimension. We compare and contrast entropy convergence with mutual-information and structure-factor techniques for quantifying and detecting spatial structure.
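Schematically, and with generic notation since the paper defines its own neighborhood templates, the construction is

    h_\mu \;=\; \lim_{n \to \infty} h_\mu(n), \qquad h_\mu(n) \;\equiv\; H[\, s \mid \mathcal{T}_n \,],

where \mathcal{T}_n is a size-n template of previously scanned neighboring sites, and the speed of convergence is summarized by

    E \;\equiv\; \sum_{n=1}^{\infty} \left[\, h_\mu(n) - h_\mu \,\right].

Slow convergence (large E) signals long-range correlation and structure; for one-dimensional sequences this E reduces to the usual excess entropy.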

5.
Chaos ; 13(1): 25-54, 2003 Mar.
Article in English | MEDLINE | ID: mdl-12675408

ABSTRACT

We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information-theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of apparent memory stored in a source and the amounts of information that must be extracted from observations of a source in order for it to be optimally predicted and for an observer to synchronize to it. To measure the difficulty of synchronization, we define the transient information and prove that, for Markov processes, it is related to the total uncertainty experienced while synchronizing to a process. One consequence of ignoring a process's structural properties is that the missed regularities are converted to apparent randomness. We demonstrate that this problem arises particularly for settings where one has access only to short measurement sequences. Numerically and analytically, we determine the Shannon entropy growth curve, and related quantities, for a range of stochastic and deterministic processes. We conclude by looking at the relationships between a process's entropy convergence behavior and its underlying computational structure.
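In the notation standard in this literature (my reconstruction from the abstract), with H(L) the Shannon entropy of length-L blocks, the successive differences of the entropy growth curve play the role of derivatives:

    h_\mu(L) \;\equiv\; H(L) - H(L-1), \qquad h_\mu \;=\; \lim_{L \to \infty} h_\mu(L).

The excess entropy accumulates the finite-L overestimates of randomness,

    E \;=\; \sum_{L=1}^{\infty} \left[\, h_\mu(L) - h_\mu \,\right],

and the transient information sums the remaining gap between the entropy growth curve and its asymptote E + h_\mu L:

    T \;\equiv\; \sum_{L=0}^{\infty} \left[\, E + h_\mu L - H(L) \,\right].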


Subject(s)
Nonlinear Dynamics; Markov Chains; Models, Statistical; Models, Theoretical; Stochastic Processes; Thermodynamics