1.
Entropy (Basel) ; 26(3)2024 Feb 27.
Article in English | MEDLINE | ID: mdl-38539714

ABSTRACT

We developed a macroscopic description of evolutionary dynamics by following the temporal dynamics of the total Shannon entropy of sequences, denoted by S, and the average Hamming distance between them, denoted by H. We argue that a biological system can persist in a so-called quasi-equilibrium state for an extended period, characterized by strong correlations between S and H, before undergoing a phase transition to another quasi-equilibrium state. To demonstrate these results, we conducted a statistical analysis of SARS-CoV-2 sequence data from the United Kingdom between March 2020 and December 2023. From a purely theoretical perspective, this allowed us to systematically study various types of phase transitions described by a discontinuous change in the thermodynamic parameters. From a more practical point of view, the analysis can serve, for example, as an early warning system for pandemics.
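
To make S and H concrete: given an alignment of sequences, the total Shannon entropy can be taken as the sum of per-site entropies, and H as the average over all pairs of sequences. Below is a minimal Python sketch; the per-site definition of S and the unordered-pair averaging are our assumptions, since the abstract does not spell out the estimators:

```python
import math
from collections import Counter
from itertools import combinations

def total_shannon_entropy(seqs):
    """Sum of per-site Shannon entropies (bits) over aligned, equal-length sequences."""
    n = len(seqs)
    S = 0.0
    for site in zip(*seqs):                      # iterate over alignment columns
        for count in Counter(site).values():
            p = count / n
            S -= p * math.log2(p)
    return S

def mean_hamming_distance(seqs):
    """Average Hamming distance over all unordered pairs of sequences."""
    pairs = list(combinations(seqs, 2))
    return sum(sum(a != b for a, b in zip(s, t)) for s, t in pairs) / len(pairs)

# Toy alignment standing in for a real set of genome sequences:
seqs = ["ACGTAC", "ACGTTC", "AGGTAC", "ACCTAC"]
print(total_shannon_entropy(seqs), mean_hamming_distance(seqs))
```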

2.
Proc Natl Acad Sci U S A ; 119(6)2022 02 08.
Article in English | MEDLINE | ID: mdl-35131858

ABSTRACT

We outline a phenomenological theory of evolution and of the origin of life by combining the formalism of classical thermodynamics with a statistical description of learning. The maximum entropy principle, constrained by the requirement that the loss function be minimized, is employed to derive a canonical ensemble of organisms (a population), the corresponding partition function (the macroscopic counterpart of fitness), and free energy (the macroscopic counterpart of additive fitness). We further define the biological counterpart of temperature (evolutionary temperature) as a measure of the stochasticity of the evolutionary process, and the counterpart of chemical potential (evolutionary potential) as the amount of evolutionary work required to add a new trainable variable (such as an additional gene) to the evolving system. We then develop a phenomenological approach to the description of evolution that involves modeling the grand potential as a function of the evolutionary temperature and evolutionary potential. We demonstrate how this approach can be used to study the "ideal mutation" model of evolution and its generalizations. Finally, we show that, within this thermodynamic framework, major transitions in evolution, such as the transition from an ensemble of molecules to an ensemble of organisms, that is, the origin of life, can be modeled as a special case of bona fide physical phase transitions associated with the emergence of a new type of grand canonical ensemble and a corresponding new level of description.


Subject(s)
Origin of Life; Thermodynamics; Biological Evolution; Entropy; Models, Biological; Mutation/genetics; Temperature
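
For orientation, the quantities this abstract maps onto fitness are the standard Gibbs forms from equilibrium thermodynamics. The display below is a textbook expression, not reproduced from the paper; $U_i$ (the loss of state $i$) and $T$ (the evolutionary temperature) follow the correspondences stated in the abstract:

```latex
% Standard maximum-entropy (Gibbs) forms; the abstract states the mappings
% Z <-> fitness and F <-> additive fitness but does not display the formulas.
P_i = \frac{e^{-U_i/T}}{Z}, \qquad
Z = \sum_i e^{-U_i/T}, \qquad
F = -T \ln Z
```
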
3.
Proc Natl Acad Sci U S A ; 119(6)2022 02 08.
Article in English | MEDLINE | ID: mdl-35121666

ABSTRACT

We apply the theory of learning to physically renormalizable systems in an attempt to outline a theory of biological evolution, including the origin of life, as multilevel learning. We formulate seven fundamental principles of evolution that appear to be necessary and sufficient to render a universe observable, and we show that they entail the major features of biological evolution, including replication and natural selection. These cornerstone phenomena of biology emerge from fundamental features of learning dynamics, such as the existence of a loss function that is minimized during learning. We then sketch the theory of evolution using the mathematical framework of neural networks, which allows for detailed analysis of evolutionary phenomena. To demonstrate the potential of the proposed theoretical framework, we derive a generalized version of the Central Dogma of molecular biology by analyzing the flow of information during learning (backpropagation) and prediction of the environment (forward propagation) by evolving organisms. More complex evolutionary phenomena, such as major transitions in evolution (in particular, the origin of life), have to be analyzed in the thermodynamic limit, which is described in detail in the companion paper by Vanchurin et al. [V. Vanchurin, Y. I. Wolf, E. V. Koonin, M. I. Katsnelson, Proc. Natl. Acad. Sci. U.S.A. 119, 10.1073/pnas.2120042119 (2022)].


Subject(s)
Biological Evolution; Learning; Models, Biological; Selection, Genetic/genetics; Thermodynamics
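
The two information flows contrasted in this abstract, prediction of the environment (forward propagation) and learning from the loss (backpropagation), can be illustrated with the simplest possible trainable system. Below is a hedged Python sketch; the linear model, learning rate, and synthetic data are illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 4))            # "environment" presented to the organism
w_true = rng.normal(size=(4, 1))
y = x @ w_true                           # outcomes the organism must predict

w = np.zeros((4, 1))                     # trainable variables (the "genome")
lr = 0.1
for _ in range(200):
    y_hat = x @ w                        # forward propagation: predicting the environment
    grad = x.T @ (y_hat - y) / len(x)    # backpropagation: information flows from loss to weights
    w -= lr * grad                       # learning minimizes the loss function

print(float(np.mean((x @ w - y) ** 2)))  # residual loss, near zero after learning
```
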
4.
Entropy (Basel) ; 24(1)2021 Dec 21.
Article in English | MEDLINE | ID: mdl-35052033

ABSTRACT

A neural network is a dynamical system described by two different types of degrees of freedom: fast-changing non-trainable variables (e.g., the states of neurons) and slow-changing trainable variables (e.g., weights and biases). We show that the non-equilibrium dynamics of trainable variables can be described by the Madelung equations, if the number of neurons is fixed, and by the Schrödinger equation, if the learning system is capable of adjusting its own parameters, such as the number of neurons, step size, and mini-batch size. We argue that Lorentz symmetries and curved space-time can emerge from the interplay between stochastic entropy production and entropy destruction due to learning. We show that the non-equilibrium dynamics of non-trainable variables can be described by the geodesic equation (in the emergent space-time) for localized states of neurons, and by the Einstein equations (with a cosmological constant) for the entire network. We conclude that the quantum description of trainable variables and the gravitational description of non-trainable variables are dual in the sense that they provide alternative macroscopic descriptions of the same learning system, defined microscopically as a neural network.
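
For reference, the Madelung equations invoked here are the hydrodynamic form of the Schrödinger equation; the display below is standard textbook physics, quoted for context rather than taken from the paper. Writing $\psi = \sqrt{\rho}\, e^{iS/\hbar}$:

```latex
% Continuity equation and quantum Hamilton-Jacobi equation (standard physics):
\partial_t \rho + \nabla \cdot \left( \rho \, \frac{\nabla S}{m} \right) = 0, \qquad
\partial_t S + \frac{(\nabla S)^2}{2m} + V
  - \frac{\hbar^2}{2m} \, \frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}} = 0
```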

5.
Entropy (Basel) ; 22(11)2020 Oct 26.
Article in English | MEDLINE | ID: mdl-33286978

ABSTRACT

We discuss the possibility that the entire universe, on its most fundamental level, is a neural network. We identify two different types of dynamical degrees of freedom: "trainable" variables (e.g., the bias vector or weight matrix) and "hidden" variables (e.g., the state vector of neurons). We first consider the stochastic evolution of the trainable variables to argue that near equilibrium their dynamics are well approximated by the Madelung equations (with free energy representing the phase) and further away from equilibrium by the Hamilton-Jacobi equations (with free energy representing Hamilton's principal function). This shows that the trainable variables can indeed exhibit classical and quantum behaviors, with the state vector of neurons representing the hidden variables. We then study the stochastic evolution of the hidden variables by considering $D$ non-interacting subsystems with average state vectors $\bar{x}_1, \dots, \bar{x}_D$ and an overall average state vector $\bar{x}_0$. In the limit when the weight matrix is a permutation matrix, the dynamics of $\bar{x}_\mu$ can be described in terms of relativistic strings in an emergent $D+1$ dimensional Minkowski space-time. If the subsystems are minimally interacting, with interactions described by a metric tensor, then the emergent space-time becomes curved. We argue that the entropy production in such a system is a local function of the metric tensor, which should be determined by the symmetries of the Onsager tensor. It turns out that a very simple and highly symmetric Onsager tensor leads to entropy production described by the Einstein-Hilbert term. This shows that the learning dynamics of a neural network can indeed exhibit approximate behaviors described by both quantum mechanics and general relativity. We also discuss the possibility that the two descriptions are holographic duals of each other.
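
For reference, the Einstein-Hilbert term that the abstract says describes the entropy production is the standard gravitational action; the display below uses standard general-relativity notation in the emergent $D+1$ dimensions and is not reproduced from the paper:

```latex
% Einstein-Hilbert action (standard notation; R is the Ricci scalar,
% g the determinant of the metric tensor, G Newton's constant):
S_{EH} = \frac{1}{16\pi G} \int R \, \sqrt{-g} \; d^{D+1}x
```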
