Results 1 - 20 of 58
1.
Phys Rev E ; 101(4-1): 042302, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32422803

ABSTRACT

We study experimentally the properties of the flow of mechanical vibration-driven vehicles confined in two chambers connected through a narrow opening. We report that the density of particles around the opening exhibits critical behavior and scaling properties. By mapping this density onto a financial market price, we document that the main stylized facts observed in financial systems have their counterparts in the mechanical system. The experimental model accurately reproduces financial properties such as scaling of the price fluctuations, volatility clustering, and multiscaling.
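Volatility clustering, one of the stylized facts mentioned above, can be checked with a simple autocorrelation diagnostic: absolute returns stay correlated in time while raw returns do not. The sketch below is illustrative only; the `autocorr` helper and the toy regime-switching series are assumptions, not taken from the paper.

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of the series x at the given positive lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

# Toy return series with volatility clustering: the volatility switches
# slowly between a calm and a turbulent regime (hypothetical parameters).
rng = np.random.default_rng(0)
sigma = np.repeat(rng.choice([0.5, 2.0], size=200), 50)
returns = sigma * rng.standard_normal(sigma.size)

# Diagnostic: raw returns are nearly uncorrelated, while absolute returns
# (a proxy for volatility) show persistent positive correlation.
print(autocorr(returns, 1), autocorr(np.abs(returns), 1))
```

The same two numbers computed on the chamber-density signal versus its absolute increments would exhibit the analogous contrast reported in the abstract.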

2.
Nat Commun ; 11(1): 9, 2020 Jan 07.
Article in English | MEDLINE | ID: mdl-31911596

ABSTRACT

The flow behavior of soft materials below the yield stress can be rich and is not fully understood. Here, we report shear-stress-induced reorganization of three-dimensional solid-like soft materials formed by closely packed nematic domains of surfactant micelles, and of a repulsive Wigner glass formed by anisotropic clay nano-discs with ionic interactions. The creep response of both systems below the yield stress produces angular velocity fluctuations of the shearing plate with large temporal burst-like events, resembling the foreshock-aftershock sequences seen in seismic records of ground motion during earthquakes. We find that the statistical properties of the quake events inside such a burst map onto the scaling relations for the magnitude and frequency distributions of earthquakes, given by the Gutenberg-Richter and Omori laws, and follow a power-law distribution of inter-occurrence waiting times. In situ polarized optical microscopy reveals that during these events the system self-organizes into a much stronger solid-like state.

3.
R Soc Open Sci ; 6(7): 180643, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31417685

ABSTRACT

We present a detailed bubble analysis of the Bitcoin to US Dollar price dynamics from January 2012 to February 2018. We introduce a robust automatic peak detection method that classifies price time series into periods of uninterrupted market growth (drawups) and regimes of uninterrupted market decrease (drawdowns). In combination with the Lagrange Regularization Method for detecting the beginning of a new market regime, we identify three major peaks and 10 additional smaller peaks that punctuated the dynamics of the Bitcoin price during the analysed period. We support this classification into long and short bubbles with a number of quantitative metrics and graphs that illuminate the main socio-economic drivers behind the ascent of Bitcoin over this period. We then analyse in detail the growing risks associated with the three long bubbles using the Log-Periodic Power-Law Singularity (LPPLS) model, based on the LPPLS Confidence Indicators, defined as the fraction of qualified fits of the LPPLS model over multiple time windows. Furthermore, for various fictitious 'present' times t2 before the crashes, we employ a clustering method to group the predicted critical times tc of the LPPLS fits over different time scales, where tc is the most probable time for the ending of the bubble. Each cluster is proposed as a plausible scenario for the subsequent Bitcoin price evolution. We present these predictions for the three long bubbles and for the four short bubbles that our time scale of analysis was able to resolve. Overall, our predictive scheme provides useful information to warn of an imminent crash risk.
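The drawup/drawdown classification described above can be sketched with an epsilon-threshold rule: a drawup ends once the log-price falls more than epsilon below its running maximum, and symmetrically for drawdowns. This is a minimal illustration under assumed conventions (the function name and the fixed tolerance `epsilon` are hypothetical), not the paper's Lagrange-regularized procedure, which selects regime boundaries adaptively.

```python
import numpy as np

def classify_regimes(prices, epsilon):
    """Split a price series (length >= 2) into alternating drawup/drawdown
    segments. A drawup ends when the log-price drops by more than `epsilon`
    from its running maximum; a drawdown ends symmetrically.
    Returns a list of (start_index, end_index, 'up'|'down') tuples."""
    log_p = np.log(np.asarray(prices, dtype=float))
    segments = []
    start = 0
    direction = 'up' if log_p[1] >= log_p[0] else 'down'
    extreme, extreme_i = log_p[0], 0        # running max (up) or min (down)
    for i in range(1, len(log_p)):
        if direction == 'up':
            if log_p[i] > extreme:
                extreme, extreme_i = log_p[i], i
            elif extreme - log_p[i] > epsilon:   # drop beyond tolerance: drawup ends
                segments.append((start, extreme_i, 'up'))
                start, direction = extreme_i, 'down'
                extreme, extreme_i = log_p[i], i
        else:
            if log_p[i] < extreme:
                extreme, extreme_i = log_p[i], i
            elif log_p[i] - extreme > epsilon:   # rise beyond tolerance: drawdown ends
                segments.append((start, extreme_i, 'down'))
                start, direction = extreme_i, 'up'
                extreme, extreme_i = log_p[i], i
    segments.append((start, extreme_i, direction))
    return segments
```

With daily data, epsilon would typically be tied to the scale of daily volatility; larger values merge small wiggles into longer drawups and drawdowns.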

4.
Philos Trans A Math Phys Eng Sci ; 374(2058)2016 Jan 13.
Article in English | MEDLINE | ID: mdl-26621989

ABSTRACT

A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

5.
Article in English | MEDLINE | ID: mdl-25974543

ABSTRACT

We show that the log-periodic power law singularity (LPPLS) model, a mathematical embodiment of positive feedbacks between agents and of their hierarchical dynamical organization, has significant predictive power in financial markets. We find that LPPLS-based strategies significantly outperform randomized ones and are robust with respect to a large selection of assets and time periods. The dynamics of prices thus markedly deviate from randomness in certain pockets of predictability that can be associated with bubble market regimes. Our hybrid approach, marrying finance with trading strategies and critical phenomena with the LPPLS model, demonstrates that targeting information related to phase transitions enables the forecasting of the financial bubbles and crashes that punctuate the dynamics of prices.
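The LPPLS model referred to here (and in entry 3) fits the log-price with a power law singularity decorated by log-periodic oscillations. A minimal sketch of the fit function in its standard form follows; parameter names follow the common convention, and the calibration machinery (nonlinear least squares over qualified windows) is omitted.

```python
import numpy as np

def lppls(t, tc, m, omega, A, B, C, phi):
    """Standard LPPLS fit function for the log-price:
        ln p(t) = A + B (tc - t)^m + C (tc - t)^m cos(omega * ln(tc - t) - phi)
    valid for t < tc, where tc is the critical (most probable crash) time,
    0 < m < 1 the power law exponent, and omega the log-periodic frequency."""
    dt = tc - np.asarray(t, dtype=float)
    return A + dt ** m * (B + C * np.cos(omega * np.log(dt) - phi))
```

In a calibration, the linear parameters A, B, C are typically solved analytically for each candidate (tc, m, omega, phi), and a fit is "qualified" only if it meets filter conditions on the parameters.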

6.
Article in English | MEDLINE | ID: mdl-24580169

ABSTRACT

For any branching process, we demonstrate that the typical total number rmp(ντ) of events triggered over all generations within any sufficiently large time window τ exhibits, at criticality, a superlinear dependence rmp(ντ) ∼ (ντ)^γ (with γ>1) on the total number ντ of immigrants arriving at the Poisson rate ν. In branching processes in which immigrants (or sources) are characterized by fertilities distributed according to an asymptotic power-law tail with tail exponent 1<γ⩽2, the exponent of the superlinear law for rmp(ντ) is identical to the exponent γ of the distribution of fertilities. For γ>2, and for standard branching processes without a power-law distribution of fertilities, rmp(ντ) ∼ (ντ)^2. This scaling law replaces and tames the divergence ντ/(1-n) of the mean total number R̄t(τ) of events as the branching ratio n (defined as the average number of first-generation triggered events per source) tends to 1. The derivation uses the formalism of generating probability functions. The corresponding prediction is confirmed by numerical calculations, and a heuristic derivation illuminates its underlying mechanism. We also show that R̄t(τ) is always linear in ντ, even at criticality (n=1). Our results thus illustrate the fundamental difference between the mean total number, which is controlled by a few extremely rare realizations, and the typical behavior represented by rmp(ντ).
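The distinction drawn above between the mean and the typical number of triggered events can be explored with a direct Galton-Watson simulation: each immigrant starts an independent cascade with Poisson offspring of mean n (the branching ratio). This is a minimal sketch with constant fertility, not the paper's generating-function derivation, and the function name is an assumption.

```python
import numpy as np

def total_progeny(num_immigrants, n, rng):
    """Total number of events (immigrants plus all triggered generations)
    in a Galton-Watson branching process where every event independently
    triggers Poisson(n) first-generation children, n being the branching ratio."""
    total = 0
    frontier = num_immigrants          # events in the current generation
    while frontier > 0:
        total += frontier
        # each current event triggers Poisson(n) children in the next generation
        frontier = int(rng.poisson(n, size=frontier).sum())
    return total
```

For n < 1 the mean total is num_immigrants/(1-n); as n approaches 1 this mean diverges, driven by rare huge cascades, while typical realizations remain far smaller, which is exactly the gap between R̄t(τ) and rmp(ντ) quantified in the abstract.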

7.
Article in English | MEDLINE | ID: mdl-24032916

ABSTRACT

We present a method to estimate the multifractal spectrum of point distributions. The method incorporates two motivated criteria (barycentric pivot point selection and nonoverlapping coverage) in order to reduce edge effects, improve precision, and reduce computation time. Implementation of the method on synthetic benchmarks demonstrates the superior performance of the proposed method compared with existing alternatives routinely used in the literature. Finally, we use the method to estimate the multifractal properties of the widely studied growth process of diffusion-limited aggregation (DLA) and compare our results with recent and earlier studies. Our tests support the conclusion of a genuine but weak multifractality of the central core of DLA clusters, with D(q) decreasing from 1.75±0.01 for q = -10 to 1.65±0.01 for q = +10.
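For context, the generalized dimensions D(q) behind such a multifractal spectrum can be estimated with plain box counting, regressing the log of the partition function against the log of the box size. This baseline sketch deliberately omits the paper's improvements (barycentric pivots and nonoverlapping coverage); the function name is an assumption.

```python
import numpy as np

def generalized_dimension(points, q, epsilons):
    """Box-counting estimate of D(q) (q != 1) for points in the unit square:
        D(q) = lim_{eps->0} [1/(q-1)] * log(sum_i p_i^q) / log(eps),
    where p_i is the fraction of points in box i of side eps.
    Returns the slope of the log-log regression over the given box sizes."""
    points = np.asarray(points, dtype=float)
    logs_eps, logs_part = [], []
    for eps in epsilons:
        boxes = np.floor(points / eps).astype(int)          # box index per point
        _, counts = np.unique(boxes, axis=0, return_counts=True)
        p = counts / len(points)                            # box occupation measure
        logs_eps.append(np.log(eps))
        logs_part.append(np.log((p ** q).sum()) / (q - 1))
    slope, _ = np.polyfit(logs_eps, logs_part, 1)
    return slope
```

Edge effects and finite sampling bias this naive estimator, which is what motivates the improved criteria of the paper.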

8.
Article in English | MEDLINE | ID: mdl-23496576

ABSTRACT

We study the statistical properties of recurrence times in the self-excited Hawkes conditional Poisson process, the simplest extension of the Poisson process that takes into account how past events influence the occurrence of future events. Specifically, we analyze the impact of the power law distribution of fertilities with exponent α, where the fertility of an event is the number of first-generation events it triggers, on the probability distribution function (PDF) f(τ) of the recurrence times τ between successive events. The other input of the model is an exponential law quantifying the PDF of waiting times between an event and its first-generation triggered events, whose characteristic time scale is taken as our time unit. At short time scales, we discover two intermediate power law asymptotics, f(τ) ~ τ^(-(2-α)) for τ << τ_c and f(τ) ~ τ^(-α) for τ_c << τ << 1, where τ_c is associated with the self-excited cascades of triggered events. For 1 << τ << 1/ν, we find a constant plateau f(τ) ≈ const, while at long times, 1/ν

Subject(s)
Algorithms , Fertility , Models, Statistical , Statistical Distributions , Computer Simulation , Humans
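A self-excited Hawkes process with an exponential waiting-time kernel, as studied above, can be simulated with Ogata's thinning algorithm; the recurrence-time PDF f(τ) is then estimated from the differences of the event times. This is a minimal sketch with constant fertility (branching ratio alpha), not the paper's power law distributed fertilities; the function name and parameterization are assumptions.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, rng):
    """Ogata thinning for a Hawkes process with intensity
        lambda(t) = mu + alpha * beta * sum_{t_i < t} exp(-beta * (t - t_i)),
    so alpha is the branching ratio (mean first-generation offspring per event)
    and 1/beta the characteristic waiting time. Returns the event times."""
    events, t, exc = [], 0.0, 0.0              # exc = excitation part of the intensity
    while True:
        lam_bar = mu + exc                     # valid bound: intensity only decays until next event
        w = rng.exponential(1.0 / lam_bar)     # candidate waiting time at rate lam_bar
        t += w
        if t > t_max:
            return np.array(events)
        exc *= np.exp(-beta * w)               # kernel decay over the waiting time
        if rng.random() * lam_bar <= mu + exc: # thinning: accept with prob lambda(t)/lam_bar
            events.append(t)
            exc += alpha * beta                # intensity jump at an accepted event
```

A histogram of `np.diff(simulate_hawkes(...))` on log-spaced bins then exposes the short-time power law regimes and the long-time plateau discussed in the abstract.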
9.
Phys Rev E Stat Nonlin Soft Matter Phys ; 83(5 Pt 2): 056101, 2011 May.
Article in English | MEDLINE | ID: mdl-21728599

ABSTRACT

The dynamics of technological, economic and social phenomena is controlled by how humans organize their daily tasks in response to both endogenous and exogenous stimulations. Queueing theory is believed to provide a generic answer to account for the often observed power-law distributions of waiting times before a task is fulfilled. However, the general validity of the power law and the nature of other regimes remain unsettled. Using anonymized data collected by Google at the World Wide Web level, we identify the existence of several additional regimes characterizing the time required for a population of Internet users to execute a given task after receiving a message. Depending on the under- or over-utilization of time by the population of users and the strength of their response to perturbations, the pure power law is found to be coextensive with an exponential regime (tasks are performed without too much delay) and with a crossover to an asymptotic plateau (some tasks are never performed).

10.
Phys Rev E Stat Nonlin Soft Matter Phys ; 81(1 Pt 2): 016108, 2010 Jan.
Article in English | MEDLINE | ID: mdl-20365433

ABSTRACT

Empirical analyses show that after the update of a browser, the publication of a software vulnerability, or the discovery of a cyber worm, the fraction of computers still using the older browser or software version, or not yet patched, or exhibiting worm activity decays as a power law ~1/t^alpha with 0
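A decay exponent alpha in a power law ~1/t^alpha of the kind reported above is commonly estimated by least squares in log-log coordinates. The sketch below is the simplest such estimator (for noisy tail data, maximum-likelihood estimators are preferable); the function name is an assumption.

```python
import numpy as np

def fit_decay_exponent(t, f):
    """Estimate alpha in f(t) ~ C / t^alpha by linear regression of
    log f against log t; the slope of the fit is -alpha."""
    slope, _ = np.polyfit(np.log(t), np.log(f), 1)
    return -slope
```

On empirical decay curves one would restrict the fit to the scaling range and weight the points, since early times and the noisy tail both distort the slope.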

11.
Phys Rev E Stat Nonlin Soft Matter Phys ; 79(6 Pt 1): 061110, 2009 Jun.
Article in English | MEDLINE | ID: mdl-19658476

ABSTRACT

Many time series in the natural and social sciences can be seen as resulting from an interplay between exogenous influences and an endogenous organization. We use a simple epidemic-type aftershock model of events occurring sequentially, in which future events are influenced (partially triggered) by past events, to ask how well one can disentangle the exogenous events from the endogenous ones. We apply both model-dependent and model-independent stochastic declustering methods to reconstruct the tree of ancestry and to estimate key parameters. In contrast with previously reported positive results, we have to conclude that declustered catalogs are rather unreliable for the synthetic catalogs that we have investigated, which contain on the order of thousands of events, typical of realistic applications. The estimated rates of exogenous events suffer from large errors. The branching ratio n, quantifying the fraction of events that have been triggered by previous events, is also badly estimated in general from declustered catalogs. We find, however, that the errors tend to be smaller, and perhaps acceptable in some cases, for small triggering efficiencies and branching ratios. The high level of randomness together with the long memory makes the stochastic reconstruction of trees of ancestry and the estimation of the key parameters perhaps intrinsically unreliable for long-memory processes. For shorter memories (larger "bare" Omori exponent), the results improve significantly.

12.
Phys Rev Lett ; 101(21): 218701, 2008 Nov 21.
Article in English | MEDLINE | ID: mdl-19113459

ABSTRACT

Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.


Subject(s)
Models, Statistical , Models, Theoretical , Software , Stochastic Processes
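Zipf's law can be checked on project-size data such as the above with a rank-size regression: sort sizes in decreasing order and fit log size against log rank; an exponent near 1 indicates Zipf's law. This is an illustrative estimator under assumed conventions, not necessarily the one used in the paper.

```python
import numpy as np

def zipf_exponent(sizes):
    """Fit the rank-size relation s(r) ~ r^(-zeta) by regressing
    log size on log rank; Zipf's law corresponds to zeta ≈ 1."""
    s = np.sort(np.asarray(sizes, dtype=float))[::-1]   # sizes in decreasing order
    ranks = np.arange(1, len(s) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(s), 1)
    return -slope
```

For heavy-tailed data, the equivalent statement is a power law distribution of sizes with tail exponent 1/zeta, so zeta ≈ 1 matches a tail exponent of about 1 for the size distribution.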
13.
Phys Rev E Stat Nonlin Soft Matter Phys ; 77(6 Pt 2): 066109, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18643338

ABSTRACT

We develop an efficient numerical scheme to solve accurately the set of nonlinear integral equations derived previously in [A. Saichev and D. Sornette, J. Geophys. Res. 112, B04313 (2007)], which describes the distribution of interevent times in the framework of a general model of earthquake clustering with long memory. Detailed comparisons between the linear and nonlinear versions of the theory and direct synthetic catalogs show that the nonlinear theory provides an excellent fit to the synthetic catalogs, while there are significant biases resulting from the use of the linear approximation. We then address the suggestions proposed by some authors to use the empirical distribution of interevent times to obtain a better determination of the so-called clustering parameter. Our theory and tests against synthetic and empirical catalogs find a rather dramatic lack of power for the distribution of interevent times to distinguish between quite different sets of parameters, casting doubt on the usefulness of this statistic for the specific purpose of identifying the clustering parameter.

14.
Proc Natl Acad Sci U S A ; 104(16): 6562-7, 2007 Apr 17.
Article in English | MEDLINE | ID: mdl-17420476

ABSTRACT

Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. We propose to formulate the validation of a given model as an iterative construction process that mimics the often implicit process occurring in the minds of scientists. We offer a formal representation of the progressive build-up of trust in the model. Thus, we replace static claims on the impossibility of validating a given model by a dynamic process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the methodology first with the maturation of quantum mechanics as the arguably best established physics theory and then with several concrete examples drawn from some of our primary scientific interests: a cellular automaton model for earthquakes, a multifractal random walk model for financial time series, an anomalous diffusion model for solar radiation transport in the cloudy atmosphere, and a computational fluid dynamics code for the Richtmyer-Meshkov instability.


Subject(s)
Algorithms , Models, Theoretical , Computer Simulation , Disasters , Quantum Theory
15.
Phys Rev Lett ; 97(7): 078501, 2006 Aug 18.
Article in English | MEDLINE | ID: mdl-17026277

ABSTRACT

We propose a simple theory for the "universal" scaling law previously reported for the distributions of waiting times between earthquakes. It is based on a widely used benchmark model of seismicity, which simply assumes no difference in the physics of foreshocks, mainshocks, and aftershocks. Our theoretical calculations provide good fits to the data and show that the universality is only approximate. We conclude that the distributions of interevent times do not reveal more information than what is already known from the Gutenberg-Richter and Omori power laws. Our results reinforce the view that the triggering of earthquakes by other earthquakes is a key physical mechanism for understanding seismicity.

16.
Phys Rev E Stat Nonlin Soft Matter Phys ; 74(1 Pt 1): 011111, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16907064

ABSTRACT

We find that multifractal scaling is a robust property of a large class of continuous stochastic processes constructed as exponentials of long-memory processes. The long memory is characterized by a power law kernel with tail exponent phi + 1/2, where phi > 0. This generalizes previous studies performed only with phi = 0 (with a truncation at an integral scale), by showing that multifractality holds over a remarkably large range of dimensionless scales for phi > 0. The intermittency multifractal coefficient can be tuned continuously as a function of the deviation phi from 1/2 and of another parameter sigma^2 embodying information on the short-range amplitude of the memory kernel, the ultraviolet cutoff ("viscous") scale, and the variance of the white-noise innovations. In these processes, both a viscous scale and an integral scale naturally appear, bracketing the "inertial" scaling regime. We exhibit a surprisingly good collapse of the multifractal spectra zeta(q) onto a universal scaling function, which enables us to derive high-order multifractal exponents from the small-order values and also to obtain a given multifractal spectrum zeta(q) from different combinations of phi and sigma^2.

17.
Phys Rev E Stat Nonlin Soft Matter Phys ; 72(5 Pt 2): 056122, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16383703

ABSTRACT

Motivated by its potential application to earthquake statistics, as well as by its intrinsic interest in the theory of branching processes, we study the exactly self-similar branching process introduced recently by Vere-Jones. This model extends the ETAS class of conditional self-excited branching point processes of triggered seismicity by removing the problematic need for a minimum (as well as maximum) earthquake size. To make the theory convergent without the usual ultraviolet and infrared cutoffs, the distribution of magnitudes m' of the first-generation daughters of a mother of magnitude m has two branches: m' < m with exponent beta - d and m' > m with exponent beta + d, where beta and d are two positive parameters. We investigate the condition and nature of the subcritical, critical, and supercritical regimes in this model and in an extended version interpolating smoothly between several models. We predict that the distribution of magnitudes of events triggered by a mother of magnitude m over all generations also has two branches, m' < m and m' > m, with renormalized exponents beta - h and beta + h, where h = d * sqrt(1 - s) and s is the fraction of triggered events. This corresponds to a renormalization of the exponent d into h by the hierarchy of successive generations of triggered events. For a significant part of the parameter space, the distribution of magnitudes over a full catalog, summed over an average steady flow of spontaneous sources (immigrants), reproduces the distribution of the spontaneous sources with a single branch and is blind to the exponents beta and d of the distribution of triggered events. Since the distribution of earthquake magnitudes is usually obtained from catalogs including many sequences, we conclude that the two branches of the distribution of aftershocks are not directly observable and that the model is compatible with real seismic catalogs.
In summary, the exactly self-similar Vere-Jones model provides an attractive new approach to modeling triggered seismicity, which alleviates delicate questions on the role of magnitude cutoffs in other, non-self-similar models. The new prediction of two branches in the distribution of aftershock magnitudes could be tested with recently introduced stochastic reconstruction methods tailored to disentangle the different triggered sequences.

18.
Phys Rev E Stat Nonlin Soft Matter Phys ; 72(5 Pt 2): 056124, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16383705

ABSTRACT

We formulate the problem of probabilistic prediction of global failure in the simplest possible model based on site percolation and in one of the simplest models of time-dependent rupture, a hierarchical fiber bundle model. We show that conditioning the predictions on the knowledge of the current degree of damage (occupancy density p, or the number and size of cracks) and on some information on the largest cluster significantly improves the prediction accuracy, in particular by allowing one to identify those realizations which have anomalously small or large clusters (cracks). We quantify the prediction gains using two measures: the relative specific information gain (the variation of entropy obtained by adding new information) and the root mean square of the prediction errors over a large ensemble of realizations. The bulk of our simulations have been obtained with the two-dimensional site percolation model on a lattice of size L x L = 20 x 20, and the results hold true for other lattice sizes. For the hierarchical fiber bundle model, conditioning the measures of damage on information about the location and size of the largest crack significantly extends the critical region and the prediction skill. These examples illustrate how ongoing damage can be used as a revelation of both the realization-dependent preexisting heterogeneity and the damage scenario undertaken by each specific sample.
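Conditioning on the largest cluster, as done above, requires identifying it in each realization; on a small L x L site-percolation lattice this is a simple breadth-first search over occupied nearest neighbors. A minimal sketch follows (the prediction measures themselves are not reproduced, and the function name is an assumption).

```python
import numpy as np
from collections import deque

def largest_cluster_size(grid):
    """Size of the largest 4-connected cluster of occupied sites
    in a boolean L1 x L2 occupancy grid (site percolation)."""
    grid = np.asarray(grid, dtype=bool)
    seen = np.zeros_like(grid)
    best = 0
    L1, L2 = grid.shape
    for i in range(L1):
        for j in range(L2):
            if grid[i, j] and not seen[i, j]:
                size, queue = 0, deque([(i, j)])   # BFS over one cluster
                seen[i, j] = True
                while queue:
                    x, y = queue.popleft()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < L1 and 0 <= ny < L2 and grid[nx, ny] and not seen[nx, ny]:
                            seen[nx, ny] = True
                            queue.append((nx, ny))
                best = max(best, size)
    return best
```

Running this over an ensemble of random occupancy grids at fixed density p gives the distribution of largest-cluster sizes on which the conditional predictions can be built.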

19.
Phys Rev E Stat Nonlin Soft Matter Phys ; 71(5 Pt 2): 056127, 2005 May.
Article in English | MEDLINE | ID: mdl-16089622

ABSTRACT

Using the epidemic-type aftershock sequence (ETAS) branching model of triggered seismicity, we apply the formalism of generating probability functions to calculate exactly the average difference between the magnitude of a mainshock and the magnitude of its largest aftershock over all generations. This average magnitude difference is found empirically to be independent of the mainshock magnitude and equal to 1.2, a universal behavior known as Båth's law. Our theory shows that Båth's law holds only sufficiently close to the critical regime of the ETAS branching process. Allowing for error bars of +/- 0.1 around Båth's constant value of 1.2, our exact analytical treatment of Båth's law provides new constraints on the productivity exponent alpha and the branching ratio n: 0.9 <~ alpha <= 1. We propose a method for measuring alpha based on the predicted renormalization of the Gutenberg-Richter distribution of the magnitudes of the largest aftershock. We also introduce the "second Båth law for foreshocks": the probability that a main earthquake turns out to be a foreshock does not depend on its magnitude rho.
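The independence of the mainshock-largest-aftershock magnitude difference from the mainshock magnitude can be illustrated with a stripped-down Monte Carlo: Gutenberg-Richter magnitudes and a productivity law for the aftershock count, with alpha = b. This one-generation sketch is only an illustration of the mechanism (the paper treats all generations of the ETAS model exactly, and the constant obtained here depends on the assumed k and b, so it need not equal 1.2); the function name and parameters are assumptions.

```python
import numpy as np

def avg_bath_difference(m_main, b, alpha, k, n_cat, rng, m0=0.0):
    """Monte Carlo average of (mainshock magnitude - largest aftershock magnitude).
    Aftershock count ~ Poisson(k * 10^(alpha * (m_main - m0))) (productivity law);
    aftershock magnitudes are i.i.d. Gutenberg-Richter above m0:
    P(M > m) = 10^(-b * (m - m0))."""
    diffs = []
    for _ in range(n_cat):
        n_aft = rng.poisson(k * 10.0 ** (alpha * (m_main - m0)))
        if n_aft == 0:
            continue                          # no aftershocks in this catalog
        u = 1.0 - rng.random(n_aft)           # uniform in (0, 1], avoids log10(0)
        mags = m0 - np.log10(u) / b           # inverse-CDF sampling of the GR law
        diffs.append(m_main - mags.max())
    return float(np.mean(diffs))
```

When alpha = b, the typical largest aftershock grows with the mainshock magnitude at exactly the rate that cancels the dependence, so the average difference is flat in m_main, which is the Båth-like behavior.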

20.
Phys Rev E Stat Nonlin Soft Matter Phys ; 72(1 Pt 2): 016112, 2005 Jul.
Article in English | MEDLINE | ID: mdl-16090041

ABSTRACT

We present an extensive study of the foreshock and aftershock signatures accompanying peaks of book sales. The time series of book sales are derived from the ranking system of Amazon.com. We present two independent ways of classifying peaks, one based on the acceleration pattern of sales and the other based on the exponent of the relaxation. They are found to be consistent and reveal the coexistence of two types of sales peaks: exogenous peaks occur abruptly and are followed by a power law relaxation, while endogenous sales peaks occur after a progressively accelerating power law growth followed by an approximately symmetrical power law relaxation which is slower than for exogenous peaks. We develop a simple epidemic model of buyers connected within a network of acquaintances which propagates rumors and opinions on books. The comparison between the predictions of the model and the empirical data confirms the validity of the model and suggests in addition that social networks have evolved to converge very close to criticality (here in the sense of critical branching processes of opinion spreading). We test in detail the evidence for a power law distribution of book sales and confirm a previous indirect study suggesting that the fraction of books (density distribution) P(S) with sales S is a power law P(S) ~ 1/S^(1+mu) with mu ≈ 2.
