Results 1 - 5 of 5
1.
Phys Rev E ; 108(5-1): 054126, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38115447

ABSTRACT

Thermodynamic uncertainty relations (TURs) express a fundamental lower bound on the precision (inverse scaled variance) of any thermodynamic charge (e.g., work or heat) by functionals of the average entropy production. Relying on purely variational arguments, we significantly extend TUR inequalities by incorporating and analyzing the impact of higher statistical cumulants of the entropy production itself within the general framework of time-symmetrically controlled computation. We derive an exact expression for the charge that achieves the minimum scaled variance, for which the TUR bound tightens to an equality that we name the thermodynamic uncertainty theorem (TUT). Importantly, both the minimum-scaled-variance charge and the TUT are functionals of the stochastic entropy production, thus retaining the impact of its higher moments. In particular, our results show that, beyond the average, the higher moments of the entropy-production distribution have a significant effect on any charge's precision. This is made explicit via a thorough numerical analysis of "swap" and "reset" computations that quantitatively compares the TUT against previous generalized TURs.
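The baseline inequality this abstract generalizes can be verified analytically in the simplest setting: the standard current TUR, Var(X)/⟨X⟩² ≥ 2k_B/Σ, for a biased continuous-time random walk. The sketch below is illustrative only and is not the paper's generalized bound; the function name `tur_check` and the jump rates are hypothetical choices.

```python
import math

def tur_check(k_plus, k_minus, t=1.0):
    """Standard current TUR for a biased continuous-time random walk
    with forward/backward jump rates k_plus, k_minus over time t.

    Net displacement X: mean (k+ - k-) t, variance (k+ + k-) t.
    Total entropy production (units of k_B): sigma = t (k+ - k-) ln(k+/k-).
    TUR: Var(X)/<X>^2 >= 2 / sigma.
    """
    mean = (k_plus - k_minus) * t
    var = (k_plus + k_minus) * t
    sigma = t * (k_plus - k_minus) * math.log(k_plus / k_minus)
    scaled_var = var / mean**2   # inverse precision of the current
    bound = 2.0 / sigma          # TUR lower bound
    return scaled_var, bound

for kp, km in [(2.0, 1.0), (5.0, 0.5), (1.1, 1.0)]:
    scaled_var, bound = tur_check(kp, km)
    assert scaled_var >= bound
    print(f"k+={kp}, k-={km}: Var/<X>^2 = {scaled_var:.4f} >= 2/sigma = {bound:.4f}")
```

Note how the bound approaches equality as k_plus → k_minus (near equilibrium), which is exactly the regime where higher cumulants of the entropy production, the paper's focus, start to matter least.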

2.
J Stat Phys ; 187(2): 17, 2022.
Article in English | MEDLINE | ID: mdl-35400756

ABSTRACT

Landauer's Principle states that the energy cost of information processing must exceed the product of the temperature, Boltzmann's constant, and the change in Shannon entropy of the information-bearing degrees of freedom. However, this lower bound is achievable only for quasistatic, near-equilibrium computations; that is, only over infinite time. In practice, information processing takes place in finite time, resulting in dissipation and potentially unreliable logical outcomes. For overdamped Langevin dynamics, we show that counterdiabatic potentials can be crafted to guide systems rapidly and accurately along desired computational paths, providing shortcuts that allow for the precise design of finite-time computations. Such shortcuts require additional work, beyond Landauer's bound, that is irretrievably dissipated into the environment. We show that this dissipated work is proportional to the computation rate as well as the square of the information-storing system's length scale. As a paradigmatic example, we design shortcuts to create, erase, and transfer a bit of information metastably stored in a double-well potential. Though dissipated work generally increases with operation fidelity, we show that it is possible to compute with perfect fidelity in finite time with finite work. We also show that the robustness of information storage affects an operation's energetic cost; specifically, the dissipated work scales as the information lifetime of the bistable system. Our analysis exposes a rich and nuanced relationship between work, speed, size of the information-bearing degrees of freedom, storage robustness, and the difference between initial and final informational statistics.
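The quasistatic bound stated in the first sentence is easy to evaluate numerically: W ≥ k_B T (H_initial − H_final), with H the Shannon entropy of the stored bit. This minimal sketch computes only that lower bound, not the paper's finite-time counterdiabatic corrections; the function name `landauer_cost` and the bias parameters are hypothetical.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_cost(p_initial, p_final, temperature):
    """Landauer lower bound (J) on the work to take a stored bit from
    bias p_initial to bias p_final: W >= k_B T (H_i - H_f), H in nats."""
    def H(p):  # binary Shannon entropy in nats
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log(p) - (1.0 - p) * math.log(1.0 - p)
    return K_B * temperature * (H(p_initial) - H(p_final))

# Full erasure of an unbiased bit at room temperature:
w = landauer_cost(0.5, 0.0, 300.0)
print(f"{w:.3e} J")  # k_B T ln 2 at T = 300 K
```

Any finite-time protocol of the kind the paper designs dissipates strictly more than this, with the excess growing linearly in the computation rate.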

3.
Phys Rev Lett ; 118(22): 220602, 2017 Jun 02.
Article in English | MEDLINE | ID: mdl-28621996

ABSTRACT

A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the information-processing second law (IPSL): the physical entropy of the Universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? We identify a minimal, and thus inescapable, transient dissipation of physical information processing, which is not captured by asymptotic rates, but is critical to adaptive thermodynamic processes such as those found in biological systems. As a component of this transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information-processing task. Applying these results to producing structured patterns from a structureless information reservoir, we show that "retrodictive" generators achieve the minimal costs. The results establish the thermodynamic toll imposed by a physical system's structure as it comes to optimally transduce information.
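The asymptotic-rate IPSL is stated in terms of Shannon entropy rates of the information reservoir. As a minimal illustration (not the paper's transient analysis), the sketch below computes the entropy rate of a stationary Markov chain and contrasts an unstructured (fair-coin) reservoir with a structured one, here the standard Golden Mean process; the helper name `entropy_rate` is a hypothetical choice.

```python
import numpy as np

def entropy_rate(T):
    """Shannon entropy rate (bits/symbol) of a stationary Markov chain
    with row-stochastic transition matrix T."""
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi /= pi.sum()  # stationary distribution (Perron eigenvector)
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(T > 0, np.log2(np.where(T > 0, T, 1.0)), 0.0)
    return -np.sum(pi[:, None] * T * logs)

# Structureless reservoir (fair coin) vs. structured Golden Mean process
h_in = entropy_rate(np.array([[0.5, 0.5], [0.5, 0.5]]))   # 1 bit/symbol
h_out = entropy_rate(np.array([[0.5, 0.5], [1.0, 0.0]]))  # 2/3 bit/symbol
print(h_in, h_out)
```

In the IPSL accounting, generating the structured pattern from the structureless reservoir costs at least k_B T ln 2 times the per-symbol entropy-rate drop (here 1 − 2/3 bit); the paper's point is that transient and substrate-dependent terms add to this asymptotic figure.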

4.
Phys Rev E ; 95(1-1): 012152, 2017 Jan.
Article in English | MEDLINE | ID: mdl-28208508

ABSTRACT

Information engines can use structured environments as a resource to generate work by randomizing ordered inputs and leveraging the increased Shannon entropy to transfer energy from a thermal reservoir to a work reservoir. We give a broadly applicable expression for the work production of an information engine, generally modeled as a memoryful channel that communicates inputs to outputs as it interacts with an evolving environment. The expression establishes that an information engine must have more than one memory state in order to leverage input-environment correlations. To emphasize this functioning, we designed an information engine powered solely by temporal correlations and not by statistical biases, as employed by previous engines. Key to this is the engine's ability to synchronize: the engine automatically returns to a desired dynamical phase when thrown into an unwanted, dissipative phase by corruptions in the input, that is, by unanticipated environmental fluctuations. This self-correcting mechanism is robust up to a critical level of corruption, beyond which the system fails to act as an engine. We give explicit analytical expressions for both work and critical corruption level and summarize engine performance via a thermodynamic-function phase diagram over engine control parameters. The results reveal a thermodynamic mechanism based on nonergodicity that underlies error correction as it operates to support resilient engineered and biological systems.
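The distinction the abstract draws, temporal correlations versus statistical biases, can be made concrete with block-entropy estimates: a perfectly alternating input has zero single-symbol bias (1 bit of single-symbol entropy) yet an entropy rate near zero, so only an engine with memory can exploit it. This sketch is an illustration of that distinction, not the paper's engine model; `block_entropy` and the block lengths are hypothetical choices.

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the length-L blocks occurring in seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Alternating input: no single-symbol bias, purely temporal order
seq = [i % 2 for i in range(10_000)]
H1 = block_entropy(seq, 1)                              # 1 bit: unbiased symbols
h_est = block_entropy(seq, 9) - block_entropy(seq, 8)   # entropy-rate estimate, ~0

# A single-memory-state engine sees only H1 and can extract nothing;
# an engine synchronized to the phase can randomize the ordered input
# and tap up to k_B T ln 2 * (1 - h) per symbol.
print(H1, h_est)
```

Corrupting the input raises the entropy rate toward 1 bit, shrinking the extractable work, which is the qualitative route to the critical corruption level the paper computes exactly.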

5.
Phys Rev Lett ; 116(19): 190601, 2016 May 13.
Article in English | MEDLINE | ID: mdl-27232011

ABSTRACT

We introduce a deterministic chaotic system, the Szilard map, that encapsulates the measurement, control, and erasure protocol by which Maxwellian demons extract work from a heat reservoir. Implementing the demon's control function in a dynamical embodiment, our construction symmetrizes the demon and the thermodynamic system, allowing one to explore their functionality and recover the fundamental trade-off between the thermodynamic costs of dissipation due to measurement and those due to erasure. The map's degree of chaos, captured by the Kolmogorov-Sinai entropy, is the rate of energy extraction from the heat bath. Moreover, an engine's statistical complexity quantifies the minimum necessary system memory for it to function. In this way, dynamical instability in the control protocol plays an essential and constructive role in intelligent thermodynamic systems.
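The identification of the Kolmogorov-Sinai entropy with the energy-extraction rate can be illustrated numerically on a stand-in chaotic map (not the paper's Szilard map construction): for the fully chaotic logistic map the Lyapunov exponent, which equals the KS entropy for this one-dimensional map, is ln 2 per iteration, matching the familiar k_B T ln 2 of work per Szilard cycle. The function name and parameters below are hypothetical.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def lyapunov_logistic(x0=0.1234, n=200_000, transient=1_000):
    """Estimate the Lyapunov exponent (= KS entropy, nats/iteration)
    of the fully chaotic logistic map x -> 4 x (1 - x)."""
    x = x0
    for _ in range(transient):      # discard transient
        x = 4.0 * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(4.0 - 8.0 * x))  # ln |f'(x)|
        x = 4.0 * x * (1.0 - x)
    return acc / n

h_ks = lyapunov_logistic()       # ~ ln 2 nats per iteration
work_rate = K_B * 300.0 * h_ks   # J per iteration at T = 300 K
print(h_ks, work_rate)
```

Under this reading, one positive nat of dynamical instability per cycle corresponds to k_B T of heat converted to work, which is the constructive role of chaos the abstract highlights.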
