Results 1 - 13 of 13
1.
Chaos ; 31(9): 093123, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34598470

ABSTRACT

A quantitative evaluation of the contribution of individual units in producing the collective behavior of a complex network can allow us to understand the potential damage to the structural integrity due to the failure of local nodes. Given a time series for each unit, a natural way to do this is to find the information flowing from the unit of concern to the rest of the network. In this study, we show that this flow can be rigorously derived in the setting of a continuous-time dynamical system. With a linear assumption, a maximum likelihood estimator can be obtained, allowing us to estimate it easily. As expected, this "cumulative information flow" does not equal the sum of the information flows to the other individual units, reflecting the collective phenomenon that a group is not simply the addition of its individual members. For the purpose of demonstration and validation, we have examined a network made of Stuart-Landau oscillators. Depending on the topology, the computed information flow may differ. In some situations, the most crucial nodes for the network are not the hubs, i.e., nodes with high degrees; they may have low degrees and, if depressed or attacked, will cause the failure of the entire network. This study can help diagnose neural network problems, control epidemic diseases, trace city traffic bottlenecks, identify the potential cause of power grid failure (e.g., the 2003 great power outage that darkened much of North America), build robust computer networks, and so forth.
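For readers who want to generate the kind of benchmark series described here, below is a minimal sketch, assuming a standard diffusively coupled Stuart-Landau network; the parameter names (mu, omega, eps) and the star-network example are illustrative assumptions, not the authors' code.

```python
# Minimal sketch, not the paper's implementation: diffusively coupled
# Stuart-Landau oscillators on an adjacency matrix A, Euler-integrated.
import numpy as np

def simulate_sl_network(A, mu=0.5, omega=1.0, eps=0.1, dt=0.01, n_steps=20000, seed=0):
    """Return complex time series out[t, j] for each oscillator j."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    z = 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))  # small random start
    out = np.empty((n_steps, n), dtype=complex)
    for t in range(n_steps):
        coupling = eps * (A @ z - A.sum(axis=1) * z)                  # diffusive coupling
        z = z + dt * ((mu + 1j * omega - np.abs(z) ** 2) * z + coupling)
        out[t] = z
    return out

# Example: a 5-node star network whose hub is node 0.
A = np.zeros((5, 5))
A[0, 1:] = A[1:, 0] = 1.0
series = simulate_sl_network(A).real   # real parts as the per-unit time series
```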


Subjects
Neural Networks, Computer; North America; Probability
2.
Sci Rep ; 11(1): 17860, 2021 Sep 09.
Article in English | MEDLINE | ID: mdl-34504151

ABSTRACT

The 2014-2015 "Monster"/"Super" El Niño failed to be predicted one year earlier due to the growing importance of a new type of El Niño, El Niño Modoki, which reportedly has much lower forecast skill with the classical models. In this study, we show that, as of today, this new El Niño can actually be mostly predicted at a lead time of more than 10 years. This is achieved by tracing the predictability source with an information flow-based causality analysis, which has been rigorously established from first principles during the past 16 years (e.g., Liang in Phys Rev E 94:052201, 2016). We show that the information flowing from the solar activity 45 years earlier to the sea surface temperature results in a causal structure resembling the El Niño Modoki mode. Based on this, a multidimensional system is constructed out of the sunspot number series with time delays of 22-50 years. The first 25 principal components are then taken as the predictors, which, through causal AI based on the Liang-Kleeman information flow, reproduce the events thus far rather accurately 12 years in advance.
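As a rough illustration of the predictor construction described above (a sketch under assumptions, not the authors' pipeline): build a delay-embedded matrix from an annual sunspot-number series with lags of 22-50 years and retain the first 25 principal components. The array name `sunspots` and the helper below are hypothetical.

```python
# Sketch: delay-embed an annual sunspot series with lags of 22-50 years and
# keep the leading principal components as predictors.
import numpy as np
from sklearn.decomposition import PCA

def delayed_predictors(sunspots, lags=range(22, 51), n_components=25):
    max_lag = max(lags)
    # Row t holds the sunspot values lagged by 22..50 years relative to year max_lag + t.
    X = np.column_stack([sunspots[max_lag - lag: len(sunspots) - lag] for lag in lags])
    return PCA(n_components=n_components).fit_transform(X)   # shape: (N - max_lag, 25)
```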

3.
Entropy (Basel) ; 23(6)2021 May 28.
Article in English | MEDLINE | ID: mdl-34071323

ABSTRACT

Causality analysis is an important problem lying at the heart of science, and is of particular importance in data science and machine learning. An endeavor during the past 16 years to view causality as a real physical notion and formulate it from first principles, however, seems to have gone unnoticed. This study introduces this line of work to the community, with a long-due generalization of the information flow-based bivariate time series causal inference to multivariate series, based on recent advances in theoretical development. The resulting formula is transparent and can be implemented as a computationally very efficient algorithm for applications. It can be normalized and tested for statistical significance. Different from the previous work along this line, where only information flows are estimated, here an algorithm is also implemented to quantify the influence of a unit on itself. While this poses a challenge in some causal inferences, here it comes naturally, and hence the identification of self-loops in a causal graph is fulfilled automatically as the causalities along edges are inferred. To demonstrate the power of the approach, two applications in extreme situations are presented. The first is a network of multivariate processes buried in heavy noise (with the noise-to-signal ratio exceeding 100), and the second is a network of nearly synchronized chaotic oscillators. In both graphs, confounding processes exist. While reconstructing these causal graphs from the given series seems challenging, a straightforward application of the algorithm immediately reveals the desideratum. In particular, the confounding processes are accurately differentiated. Considering the surge of interest in the community, this study is very timely.
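Below is a minimal sketch of what a linear multivariate information-flow estimator of this kind can look like, under the assumed construction of regressing the forward-difference derivative of each series on all series through the sample covariance matrix and scaling the j-th coefficient by C_ij / C_ii. It is an assumption-laden illustration rather than the paper's algorithm, and its diagonal entries are not the self-influence quantity discussed above.

```python
# Sketch (assumed construction, not the authors' code) of a linear multivariate
# information-flow estimator: T[i, j] approximates the flow from series j to series i.
import numpy as np

def information_flow_matrix(X, dt=1.0):
    """X: array of shape (n_steps, n_series)."""
    dX = (X[1:] - X[:-1]) / dt            # Euler forward-difference derivative
    Xs = X[:-1]                           # samples aligned with the derivative
    n = Xs.shape[1]
    C = np.cov(Xs, rowvar=False)          # sample covariance matrix C_kl
    T = np.zeros((n, n))
    for i in range(n):
        c_di = np.array([np.cov(Xs[:, k], dX[:, i])[0, 1] for k in range(n)])
        a = np.linalg.solve(C, c_di)      # linear drift coefficients for equation i
        T[i] = a * C[i] / C[i, i]         # flow rates j -> i, scaled by C_ij / C_ii
    return T
```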

4.
Entropy (Basel) ; 23(3)2021 Mar 07.
Article in English | MEDLINE | ID: mdl-33799929

ABSTRACT

Recently, it has been shown that the information flow and causality between two time series can be inferred in a rigorous and quantitative sense, and, moreover, the resulting causality can be normalized. A corollary that follows is that, in the linear limit, causation implies correlation, while correlation does not imply causation. Now suppose there is an event A taking a harmonic form (sine/cosine), and it generates through some process another event B so that B always lags A by a phase of π/2. Here the causality is obvious, yet the computed correlation is zero. This apparent contradiction is rooted in the fact that a harmonic system always leaves a single point on the Poincaré section; it does not add information. That is to say, although the absolute information flow from A to B is zero, i.e., T_{A→B} = 0, the total information increase of B is also zero, so the normalized T_{A→B}, denoted τ_{A→B}, takes the indeterminate form 0/0. By slightly perturbing the system with some noise, solving a stochastic differential equation, and letting the perturbation go to zero, it can be shown that τ_{A→B} approaches 100%, just as one would expect.
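A quick numerical check of the zero-correlation part of this argument (a toy sketch, independent of the paper's derivation): B is completely determined by A, yet their Pearson correlation over an integer number of periods vanishes.

```python
# B lags A by pi/2; over full periods the Pearson correlation is ~0.
import numpy as np

t = np.linspace(0.0, 20.0 * np.pi, 20000, endpoint=False)  # exactly 10 full periods
A = np.sin(t)
B = np.sin(t - np.pi / 2.0)      # = -cos(t), a quarter-period lag behind A
print(np.corrcoef(A, B)[0, 1])   # ~ 0 to numerical precision
```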

5.
Entropy (Basel) ; 24(1)2021 Dec 21.
Article in English | MEDLINE | ID: mdl-35052029

ABSTRACT

Information flow provides a natural measure for the causal interaction between dynamical events. This study extends our previous rigorous formalism of componentwise information flow to the bulk information flow between two complex subsystems of a large-dimensional parental system. Analytical formulas have been obtained in closed form. Under a Gaussian assumption, their maximum likelihood estimators have also been obtained. These formulas have been validated using different subsystems with preset relations, and they yield causalities just as expected. In contrast, the commonly used proxies for the characterization of subsystems, such as averages and principal components, generally do not work correctly. This study can help diagnose the emergence of patterns in complex systems and is expected to have applications in many real-world problems in different disciplines such as climate science, fluid dynamics, neuroscience, and financial economics.

6.
Entropy (Basel) ; 21(2)2019 Feb 05.
Article in English | MEDLINE | ID: mdl-33266864

ABSTRACT

A fundamental problem regarding the storm-jet stream interaction in the extratropical atmosphere is how energy and information are exchanged between scales. While energy transfer has been extensively investigated, the latter has been mostly overlooked, mainly due to a lack of appropriate theory and methodology. Using a recently established rigorous formalism of information flow, this study attempts to examine the problem in the setting of a three-dimensional quasi-geostrophic zonal jet, with storms excited by a set of optimal perturbation modes. We choose for this study a period when the self-sustained oscillation is in quasi-equilibrium, and when the energetics mimic the mid-latitude atmospheric circulation, where available potential energy is cascaded downward to smaller scales and kinetic energy is inversely transferred upward toward larger scales. By inverting a three-dimensional elliptic differential operator, the model is first converted into a low-dimensional dynamical system whose components correspond to different time scales. The information exchange between the scales is then computed through ensemble prediction. For this particular problem, the resulting cross-scale information flow is mostly from smaller scales to larger scales. That is to say, during this period, this model extratropical atmosphere is dominated by bottom-up causation, much as collective patterns emerge out of independent entities and macroscopic thermodynamic properties evolve from random molecular motions. This study makes a first step toward an important field in understanding the eddy-mean flow interaction in weather and climate phenomena such as atmospheric blocking, storm tracks, and the North Atlantic Oscillation, to name a few.

7.
Chaos ; 28(7): 075311, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30070535

ABSTRACT

Recently, a rigorous formalism has been established for information flow and causality within dynamical systems with respect to Shannon entropy. In this study, we re-establish the formalism with respect to relative entropy, or Kullback-Leibler divergence, a well-accepted measure of predictability because of its appealing properties such as invariance upon nonlinear transformation and consistency with the second law of thermodynamics. Different from previous studies (which yield consistent results only for 2D systems), the resulting information flow, say T, is precisely the same as that with respect to Shannon entropy for systems of arbitrary dimensionality, except for a minus sign (reflecting the opposite notions of predictability versus uncertainty). As before, T possesses a property called the principle of nil causality, a fact that classical formalisms fail to verify in many situations. Besides, it proves to be invariant upon nonlinear transformation, indicating that the so-obtained information flow should be an intrinsic physical property. This formalism has been validated with the stochastic gradient system, a nonlinear system that admits an analytical equilibrium solution of the Boltzmann type.

8.
Phys Rev E ; 94(5-1): 052201, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27967120

ABSTRACT

Information flow, or information transfer, a widely applicable general physics notion, can be rigorously derived from first principles, rather than axiomatically proposed as an ansatz. Its logical association with causality is firmly rooted in the dynamical system that lies beneath. The principle of nil causality, which reads that an event is not causal to another if the evolution of the latter is independent of the former, and which transfer entropy analysis and the Granger causality test fail to verify in many situations, turns out to be a proven theorem here. Established in this study are the information flows among the components of time-discrete mappings and time-continuous dynamical systems, both deterministic and stochastic. They have been obtained explicitly in closed form and put to application with benchmark systems such as the Kaplan-Yorke map, Rössler system, baker transformation, Hénon map, and stochastic potential flow. Besides unraveling the causal relations as expected from the respective systems, some of the applications show that the information flow structure underlying a complex trajectory pattern can be tractable. For linear systems, the resulting remarkably concise formula asserts analytically that causation implies correlation, while correlation does not imply causation, providing a mathematical basis for the long-standing philosophical debate over causation versus correlation.

9.
Sci Rep ; 6: 21691, 2016 Feb 22.
Article in English | MEDLINE | ID: mdl-26900086

ABSTRACT

We use a newly developed technique based on the information flow concept to investigate the causal structure between the global radiative forcing and the annual global mean surface temperature anomalies (GMTA) since 1850. Our study unambiguously shows a one-way causality between the total greenhouse gases and GMTA. Specifically, it is confirmed that the former, especially CO2, are the main causal drivers of the recent warming. A significant but smaller information flow comes from aerosol direct and indirect forcing and, on short time periods, volcanic forcings. In contrast, the causality contribution from natural forcings (solar irradiance and volcanic forcing) to the long-term trend is not significant. The spatially explicit analysis reveals that the anthropogenic forcing fingerprint varies significantly by region in both hemispheres. On paleoclimate time scales, however, the cause-effect direction is reversed: temperature changes cause subsequent CO2/CH4 changes.

10.
Article in English | MEDLINE | ID: mdl-26382363

ABSTRACT

Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
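Read schematically from the abstract (this is an interpretation, not a transcription of the paper's equations), the normalization splits the marginal entropy balance of the flow recipient, here series 1, into the information flow from series 2, a phase-space stretching term, and a noise term, and divides by the sum of their magnitudes:

```latex
\frac{dH_1}{dt} = T_{2\to 1} + \frac{dH_1^{*}}{dt} + \frac{dH_1^{\mathrm{noise}}}{dt},
\qquad
\tau_{2\to 1} = \frac{T_{2\to 1}}
{\,\lvert T_{2\to 1}\rvert + \bigl\lvert \tfrac{dH_1^{*}}{dt} \bigr\rvert + \bigl\lvert \tfrac{dH_1^{\mathrm{noise}}}{dt} \bigr\rvert\,}.
```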

11.
Phys Rev E Stat Nonlin Soft Matter Phys ; 90(5-1): 052150, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25493782

ABSTRACT

Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely, information flow, we solve an inverse problem and give this important and challenging question, which is of interest in a wide variety of disciplines, a positive answer. Here causality is measured by the time rate of information flowing from one series to the other. The resulting formula is tight in form, involving only commonly used statistics, namely, sample covariances; an immediate corollary is that causation implies correlation, but correlation does not imply causation. It has been validated with touchstone linear and nonlinear series, purportedly generated with one-way causality that evades the traditional approaches. It has also been applied successfully to the investigation of real-world problems; an example presented here is the cause-and-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD), which have been linked to hazards in far-flung regions of the globe. In general, the two modes are mutually causal, but the causality is asymmetric: El Niño tends to stabilize IOD, while IOD functions to make El Niño more uncertain. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean.
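The abstract states that the formula involves only sample covariances; a minimal sketch of such a bivariate estimator, under the assumption that the derivative of the receiving series is approximated by an Euler forward difference, is given below (the function name and the exact expression are this sketch's, not a verbatim transcription of the paper). Swapping the two arguments gives the flow in the reverse direction, so the asymmetry discussed above can be read off directly from the pair of estimates.

```python
# Sketch of a bivariate information-flow estimate from x2 to x1, built only
# from sample covariances; an assumed reconstruction, not the paper's code.
import numpy as np

def bivariate_information_flow(x1, x2, dt=1.0):
    d1 = (x1[1:] - x1[:-1]) / dt                 # Euler forward difference of x1
    x1, x2 = x1[:-1], x2[:-1]                    # align with the derivative samples
    C = np.cov(np.vstack([x1, x2]))
    C11, C12, C22 = C[0, 0], C[0, 1], C[1, 1]
    C1d1 = np.cov(x1, d1)[0, 1]                  # cov(x1, dx1/dt)
    C2d1 = np.cov(x2, d1)[0, 1]                  # cov(x2, dx1/dt)
    return (C11 * C12 * C2d1 - C12**2 * C1d1) / (C11**2 * C22 - C11 * C12**2)
```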

12.
Phys Rev E Stat Nonlin Soft Matter Phys ; 78(3 Pt 1): 031113, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18850999

ABSTRACT

Information flow, or information transfer, is an important concept in general physics and dynamical systems with applications in a wide variety of scientific disciplines. In this study, we show that a rigorous formalism can be established in the context of a generic stochastic dynamical system. An explicit formula has been obtained for the resulting transfer measure, which possesses a property of transfer asymmetry and, if the stochastic perturbation to the receiving component does not rely on the giving component, has the same form as that for the corresponding deterministic system. This formula is further illustrated and validated with a two-dimensional Langevin equation. A remarkable observation is that, for two highly correlated time series, there could be no information transfer from one series, say x_2, to the other (x_1). That is to say, the evolution of x_1 may have nothing to do with x_2, even though x_1 and x_2 are highly correlated. Information flow analysis thus extends traditional correlation analysis and/or mutual information analysis by providing a quantitative measure of causality between dynamical events.
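A toy illustration of the correlated-but-not-causal situation described above (an assumed example, not the one in the paper): in the 2-D linear Langevin system below, x1 evolves with no reference to x2, while x2 is driven by x1, so the two series come out clearly correlated even though, by the principle of nil causality, no information should flow from x2 to x1.

```python
# Euler-Maruyama simulation of a 2-D linear Langevin system; x1 is autonomous,
# x2 is driven by x1. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps, sigma = 0.01, 200_000, 0.3
x1, x2 = 0.0, 0.0
X = np.empty((n_steps, 2))
for t in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal(2)
    x1 += -x1 * dt + sigma * dW[0]               # independent of x2
    x2 += (-x2 + 2.0 * x1) * dt + sigma * dW[1]  # driven by x1
    X[t] = x1, x2

print(np.corrcoef(X.T)[0, 1])   # clearly nonzero (about 0.58 for this setup)
```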

13.
Phys Rev Lett ; 95(24): 244101, 2005 Dec 09.
Article in English | MEDLINE | ID: mdl-16384382

ABSTRACT

We present a rigorous formalism of information transfer for systems with dynamics fully known. This follows from an accurate classification of the mechanisms for the entropy change of one component into a self-evolution plus a transfer from the other component. The formalism applies to both continuous flows and discrete maps. The resulting transfer measure possesses a property of asymmetry and is qualitatively consistent with the classical measures. It is further validated with the baker transformation and the Hénon map.
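For completeness, here is a minimal generator for one of the benchmark systems named above, the Hénon map with its standard parameters; the function name is this note's, and the validation itself is in the paper.

```python
# Standard Henon map iteration: x_{n+1} = 1 - a*x_n^2 + y_n, y_{n+1} = b*x_n.
import numpy as np

def henon_series(n_steps=10000, a=1.4, b=0.3, x0=0.1, y0=0.1):
    x, y = x0, y0
    out = np.empty((n_steps, 2))
    for t in range(n_steps):
        x, y = 1.0 - a * x * x + y, b * x
        out[t] = x, y
    return out
```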
