1.
Proc Natl Acad Sci U S A ; 121(9): e2309624121, 2024 Feb 27.
Article in English | MEDLINE | ID: mdl-38381782

ABSTRACT

We propose Multiscale Flow, a generative Normalizing Flow that creates samples and models the field-level likelihood of two-dimensional cosmological data such as weak lensing. Multiscale Flow uses a hierarchical decomposition of cosmological fields in a wavelet basis and then models the different wavelet components separately as Normalizing Flows. The log-likelihood of the original cosmological field is recovered by summing the log-likelihoods of the individual wavelet terms. This decomposition allows us to separate the information from different scales and to identify distribution shifts in the data, such as unknown scale-dependent systematics. The resulting likelihood analysis can not only identify these types of systematics, but can also be made optimal, in the sense that Multiscale Flow can learn the full likelihood at the field level without any dimensionality reduction. We apply Multiscale Flow to weak lensing mock datasets for cosmological inference and show that it significantly outperforms traditional summary statistics such as the power spectrum and peak counts, as well as machine learning-based summary statistics such as the scattering transform and convolutional neural networks. We further show that Multiscale Flow is able to identify distribution shifts not present in the training data, such as baryonic effects. Finally, we demonstrate that Multiscale Flow can be used to generate realistic samples of weak lensing data.
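To make the likelihood bookkeeping concrete, here is a minimal sketch of the wavelet decomposition and per-scale summation, using an orthonormal Haar basis and an i.i.d. Gaussian stand-in (GaussianStandIn, a hypothetical placeholder) in place of the trained Normalizing Flows; the actual wavelet choice and flow architecture follow the paper, not this sketch.

```python
import numpy as np

def haar2d_step(x):
    # One level of an orthonormal 2D Haar transform: split a field into a
    # half-resolution coarse field plus three detail (wavelet) components.
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    coarse = (a + b + c + d) / 2.0
    details = ((a - b + c - d) / 2.0,
               (a + b - c - d) / 2.0,
               (a - b - c + d) / 2.0)
    return coarse, details

class GaussianStandIn:
    # Hypothetical placeholder for a trained Normalizing Flow: an i.i.d.
    # standard Gaussian log-density, so the sketch runs end to end.
    def log_prob(self, x):
        x = np.asarray(x)
        return -0.5 * np.sum(x**2) - 0.5 * x.size * np.log(2 * np.pi)

def multiscale_log_likelihood(field, detail_flows, coarse_flow):
    # log p(field) = sum over scales of log p(details) + log p(coarsest field).
    # The orthonormal Haar transform has unit Jacobian, so no extra term appears.
    logp, x = 0.0, field
    for flow in detail_flows:
        x, details = haar2d_step(x)
        logp += sum(flow.log_prob(d) for d in details)
    return logp + coarse_flow.log_prob(x)

field = np.random.randn(64, 64)  # stand-in for a weak lensing convergence map
flows = [GaussianStandIn() for _ in range(3)]
print(multiscale_log_likelihood(field, flows, GaussianStandIn()))
```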

2.
Entropy (Basel) ; 24(10)2022 Sep 21.
Article in English | MEDLINE | ID: mdl-37420349

ABSTRACT

In many hypothesis testing applications, we have mixed priors: well-motivated informative priors for some parameters but not for others. The Bayesian methodology uses the Bayes factor and is helpful for the informative priors, as it incorporates Occam's razor via the multiplicity or trials factor of the look-elsewhere effect. However, if the prior is not known completely, the frequentist hypothesis test via the false-positive rate is a better approach, as it is less sensitive to the prior choice. We argue that when only partial prior information is available, it is best to combine the two methodologies by using the Bayes factor as a test statistic in the frequentist analysis. We show that the standard frequentist maximum likelihood-ratio test statistic corresponds to the Bayes factor with a non-informative Jeffreys prior. We also show that mixed priors increase the statistical power of frequentist analyses over the maximum likelihood test statistic. We develop an analytic formalism that does not require expensive simulations and generalizes Wilks' theorem beyond its usual regime of validity. In specific limits, the formalism reproduces existing expressions, such as the p-value of linear models and periodograms. We apply the formalism to an example of exoplanet transits, where the multiplicity can exceed 10^7, and show that our analytic expressions reproduce the p-values derived from numerical simulations. We offer a statistical-mechanics interpretation of the formalism: counting states in the continuous parameter space, with the uncertainty volume as the quantum of state, both the p-value and the Bayes factor can be expressed as a competition between energy and entropy.
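As a concrete illustration of the Bayes factor used as a frequentist test statistic, here is a hedged toy: a single-template amplitude search in Gaussian noise, where the marginalization over a Gaussian amplitude prior is analytic and the false-positive rate is calibrated on noise-only simulations. The template, prior width, and injected amplitude are invented for the example; in the limit of an uninformative prior the statistic reduces to the maximum likelihood-ratio d²/(2F), matching the Jeffreys-prior correspondence stated above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy search: data x = A*s + noise, noise ~ N(0, 1). We test A = 0 against
# A != 0 using the Bayes factor with prior A ~ N(0, sigma_A^2) as the test
# statistic. All numbers are illustrative.
n, sigma_A = 100, 2.0
s = np.sin(np.linspace(0, 4 * np.pi, n))   # hypothetical signal template
F = s @ s                                  # Fisher information for A (sigma = 1)

def log_bayes_factor(x):
    d = s @ x                              # matched-filter projection
    # Analytic marginalization over A for a Gaussian prior:
    return 0.5 * d**2 * sigma_A**2 / (1 + F * sigma_A**2) \
           - 0.5 * np.log(1 + F * sigma_A**2)

def max_log_likelihood_ratio(x):
    return 0.5 * (s @ x)**2 / F            # Wilks: 2*llr ~ chi2(1) under H0

# Calibrate the false-positive rate of the Bayes-factor statistic on
# noise-only simulations, as the paper advocates.
noise = rng.standard_normal((100_000, n))
lnB0 = np.array([log_bayes_factor(x) for x in noise])
observed = rng.standard_normal(n) + 0.8 * s
p_value = np.mean(lnB0 >= log_bayes_factor(observed))
print(f"p-value of observed ln B: {p_value:.4f}")
```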

3.
Nat Commun ; 12(1): 2729, 2021 May 12.
Article in English | MEDLINE | ID: mdl-33980836

ABSTRACT

Estimating rates of COVID-19 infection and associated mortality is challenging due to uncertainties in case ascertainment. We perform a counterfactual time series analysis of overall mortality data from towns in Italy, comparing the population mortality in 2020 with previous years, to estimate mortality from COVID-19. We find that the number of COVID-19 deaths in Italy in 2020 until September 9 was 59,000-62,000, compared to the official count of 36,000. The proportion of the population that died was 0.29% in the most affected region, Lombardy, and 0.57% in the most affected province, Bergamo. Combining reported test-positive rates from Italy with estimates of infection fatality rates from the Diamond Princess cruise ship, we estimate the infection rate as 29% (95% confidence interval 15-52%) in Lombardy and 72% (95% confidence interval 36-100%) in Bergamo.


Subject(s)
COVID-19/mortality , SARS-CoV-2/isolation & purification , Adolescent , Adult , Aged , Aged, 80 and over , COVID-19/epidemiology , COVID-19/virology , Child , Child, Preschool , Humans , Infant , Infant, Newborn , Italy/epidemiology , Middle Aged , Pandemics , SARS-CoV-2/physiology , Survival Rate , Young Adult
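The counterfactual analysis above can be sketched in a few lines: fit the expected mortality from the same calendar weeks of previous years and attribute the 2020 excess to COVID-19. All numbers below are synthetic; the paper's per-town analysis treats trends, age structure, and reporting far more carefully.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic weekly death counts for one town, through early September.
weeks = np.arange(1, 37)
seasonal = 100 + 20 * np.cos(2 * np.pi * weeks / 52)
history = np.array([rng.poisson(seasonal) for _ in range(5)])  # "2015-2019"
counterfactual = history.mean(axis=0)          # expected deaths absent the pandemic
counterfactual_err = history.std(axis=0, ddof=1) / np.sqrt(len(history))

# "2020": baseline plus an epidemic wave peaking around week 12.
epidemic = rng.poisson(seasonal + 80 * np.exp(-0.5 * ((weeks - 12) / 3)**2))

excess = epidemic - counterfactual
# Combine Poisson noise in 2020 with the counterfactual uncertainty.
err = np.sqrt(epidemic.sum() + (counterfactual_err**2).sum())
print(f"excess deaths: {excess.sum():.0f} +/- {err:.0f}")
```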
4.
Proc Natl Acad Sci U S A ; 118(16)2021 Apr 20.
Article in English | MEDLINE | ID: mdl-33853944

ABSTRACT

The goal of generative models is to learn the intricate relations within the data in order to create new simulated data, but current approaches fail in very high dimensions. When the true data-generating process is based on physical processes, these impose symmetries and constraints, and the generative model can be created by learning an effective description of the underlying physics, which enables scaling to very high dimensions. In this work, we propose Lagrangian deep learning (LDL) for this purpose, applying it to learn the outputs of cosmological hydrodynamical simulations. The model uses layers of Lagrangian displacements of the particles describing the observables to learn the effective physical laws. The displacements are modeled as the gradient of an effective potential, which explicitly satisfies translational and rotational invariance. The total number of learned parameters is only of order 10, and they can be viewed as effective theory parameters. We combine the fast particle-mesh N-body solver FastPM with LDL and apply it to a wide range of cosmological outputs, from dark matter to stellar maps, gas density, and temperature. The computational cost of LDL is nearly four orders of magnitude lower than that of the full hydrodynamical simulations, yet it outperforms them at the same resolution. We achieve this with only of order 10 layers from the initial conditions to the final output, in contrast to typical cosmological simulations with thousands of time steps. This opens up the possibility of analyzing cosmological observations entirely within this framework, without the need for large dark-matter simulations.
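A minimal sketch of one Lagrangian displacement layer may help fix ideas: paint particles to a grid, source an effective potential from a power of the local density through an isotropic Fourier-space filter, and move the particles along its gradient. The filter form and every parameter name and value here (alpha, gamma, kl, ks, n) are assumptions for illustration, not the paper's trained model.

```python
import numpy as np

def displacement_layer(positions, box, ngrid, alpha=0.1, gamma=1.0,
                       kl=0.1, ks=5.0, n=1.0):
    # Paint particles to a grid (nearest grid point, for brevity).
    idx = np.floor(positions / box * ngrid).astype(int) % ngrid
    delta = np.zeros((ngrid,) * 3)
    np.add.at(delta, tuple(idx.T), 1.0)
    delta = delta / delta.mean() - 1.0

    # Source term: a power of the local density.
    f = np.clip(1.0 + delta, 0.0, None) ** gamma

    # Effective potential with a band-pass Green's function; the filter
    # depends only on |k|, so translational and rotational invariance hold.
    k = 2 * np.pi * np.fft.fftfreq(ngrid, d=box / ngrid)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kk = np.sqrt(kx**2 + ky**2 + kz**2)
    kk[0, 0, 0] = 1.0  # avoid division by zero for the DC mode
    green = np.exp(-(kl / kk) ** n - (kk / ks) ** n) / kk**2
    phi_k = green * np.fft.fftn(f)

    # Displacement = alpha * grad(potential), read off at each particle's cell.
    grad = [np.real(np.fft.ifftn(1j * ki * phi_k)) for ki in (kx, ky, kz)]
    disp = np.stack(grad, axis=-1)
    return (positions + alpha * disp[tuple(idx.T)]) % box

# Example: displace 10^4 random particles in a 100 Mpc/h box on a 32^3 grid.
pos = np.random.rand(10_000, 3) * 100.0
print(displacement_layer(pos, box=100.0, ngrid=32).shape)
```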

5.
Phys Rev Lett ; 121(14): 141101, 2018 Oct 05.
Article in English | MEDLINE | ID: mdl-30339429

ABSTRACT

The nature of dark matter (DM) remains unknown despite very precise knowledge of its abundance in the Universe. An alternative to new elementary particles postulates that DM is made of macroscopic compact halo objects (MACHOs), such as black holes formed in the very early Universe. Stellar-mass primordial black holes (PBHs) are subject to less robust constraints than other mass ranges and might be connected to the gravitational-wave signals detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO). New methods are therefore necessary to constrain the viability of compact objects as a DM candidate. Here we report bounds on the abundance of compact objects from gravitational lensing of type Ia supernovae (SNe). Current SNe data sets constrain compact objects to represent less than 35.2% (Joint Lightcurve Analysis) and 37.2% (Union 2.1) of the total matter content in the Universe, at 95% confidence level. The results are valid for masses larger than ∼0.01 M_{⊙} (solar masses), limited by the size of SNe relative to the lens Einstein radius. We demonstrate the mass range of the constraints by computing magnification probabilities for realistic SNe sizes and different values of the PBH mass. Our bounds are sensitive to the total abundance of compact objects with M ≳ 0.01 M_{⊙} and are complementary to other observational tests. These results are robust against cosmological parameters, outlier rejection, correlated noise, and selection bias. PBHs and other MACHOs are therefore ruled out as the dominant form of DM for objects associated with LIGO gravitational-wave detections. These bounds constrain early-Universe models that predict stellar-mass PBH production and strengthen the case for lighter forms of DM, including new elementary particles.
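The ∼0.01 M_{⊙} validity limit comes from comparing the Einstein radius to the SN photosphere projected into the lens plane; the toy calculation below reproduces the scaling. The redshifts, cosmology, and ~10^15 cm source size are illustrative assumptions, not the paper's detailed magnification-probability computation.

```python
import numpy as np
from scipy.integrate import quad

G = 6.674e-11; c = 3.0e8; Msun = 1.989e30; Mpc = 3.086e22
H0 = 70 * 1e3 / Mpc; Om = 0.3          # assumed flat LCDM

def comoving(z):
    # Comoving distance in meters for flat LCDM.
    return quad(lambda zz: c / (H0 * np.sqrt(Om * (1 + zz)**3 + 1 - Om)), 0, z)[0]

def angular_diameter(z1, z2=None):
    if z2 is None:
        return comoving(z1) / (1 + z1)
    return (comoving(z2) - comoving(z1)) / (1 + z2)   # flat universe

zs, zl = 0.5, 0.25                      # illustrative source and lens redshifts
Dl, Ds, Dls = angular_diameter(zl), angular_diameter(zs), angular_diameter(zl, zs)

def einstein_radius(M_sun):             # physical radius in the lens plane, meters
    return np.sqrt(4 * G * M_sun * Msun / c**2 * Dl * Dls / Ds)

r_sn = 1e15 * 1e-2                      # ~10^15 cm photosphere, in meters
r_sn_lens = r_sn * Dl / Ds              # source size projected into the lens plane
for M in (1e-3, 1e-2, 1e-1, 1.0):
    print(f"M = {M:5.0e} Msun: R_E = {einstein_radius(M):.2e} m, "
          f"projected SN size = {r_sn_lens:.2e} m")
```

Below ~0.01 M_{⊙} the projected source size exceeds the Einstein radius and the magnification washes out, which is where the bound loses sensitivity.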

6.
Phys Rev Lett ; 121(10): 101301, 2018 Sep 07.
Article in English | MEDLINE | ID: mdl-30240255

ABSTRACT

We develop a new method to constrain primordial non-Gaussianity of the local kind using unclustered tracers of the large-scale structure. We show that, in the limit of low noise, zero-bias tracers yield a large improvement over standard methods, mostly due to vanishing sampling variance. We propose a simple technique to construct such a tracer using environmental information obtained from the original sample, and we validate the method with N-body simulations. Our results indicate that σ_{f_{NL}^{loc}} ≃ 1 can be reached using only information on a single tracer of sufficiently high number density.
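A toy version of the construction: split the sample into two sub-populations with (hypothetical) biases b1 and b2 and weight them so the net bias cancels. The combined field then carries only noise, which is what makes the sampling variance vanish; in the paper the split uses environmental information, and the f_NL signal survives through the scale-dependent part of the bias.

```python
import numpy as np

rng = np.random.default_rng(2)

nmodes, b1, b2 = 50_000, 2.0, 0.5
delta_m = rng.standard_normal(nmodes)        # matter modes (unit variance)
eps1 = 0.05 * rng.standard_normal(nmodes)    # shot noise, low by assumption
eps2 = 0.05 * rng.standard_normal(nmodes)
d1 = b1 * delta_m + eps1
d2 = b2 * delta_m + eps2

w = b2 / b1                                  # choose w so that w*b1 - b2 = 0
zero_bias = w * d1 - d2                      # matter term cancels exactly
print("var of single tracer    :", d1.var())         # b^2 x sampling variance
print("var of zero-bias tracer :", zero_bias.var())  # only noise remains
```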

7.
Nature ; 482(7386): 475-7, 2012 Feb 22.
Article in English | MEDLINE | ID: mdl-22358831
8.
Nature ; 464(7286): 256-8, 2010 Mar 11.
Article in English | MEDLINE | ID: mdl-20220843

ABSTRACT

Although general relativity underlies modern cosmology, its applicability on cosmological length scales has yet to be stringently tested. Such a test has recently been proposed, using a quantity, E_G, that combines measures of large-scale gravitational lensing, galaxy clustering and structure growth rate. The combination is insensitive to 'galaxy bias' (the difference between the clustering of visible galaxies and invisible dark matter) and is thus robust to the uncertainty in this parameter. Modified theories of gravity generally predict values of E_G different from the general relativistic prediction because, in these theories, the 'gravitational slip' (the difference between the two potentials that describe perturbations in the gravitational metric) is non-zero, which leads to changes in the growth of structure and the strength of the gravitational lensing effect. Here we report that E_G = 0.39 ± 0.06 on length scales of tens of megaparsecs, in agreement with the general relativistic prediction of E_G ≈ 0.4. The measured value excludes a model within the tensor-vector-scalar gravity theory, which modifies both Newtonian and Einstein gravity. However, the relatively large uncertainty still permits models within f(R) theory, which is an extension of general relativity. A fivefold decrease in uncertainty is needed to rule out these models.
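For reference, the general relativistic prediction quoted above follows from E_G = Ω_m,0 / f(z), with the growth rate approximated by f(z) ≈ Ω_m(z)^0.55. The redshift and Ω_m,0 below are illustrative choices, not values taken from the paper.

```python
# GR prediction for E_G at a representative redshift.
Om0, z = 0.26, 0.32                                   # assumed values
Omz = Om0 * (1 + z)**3 / (Om0 * (1 + z)**3 + 1 - Om0) # Omega_m(z) in flat LCDM
f = Omz**0.55                                         # growth-rate approximation
print(f"E_G(GR) = {Om0 / f:.2f}")                     # ~0.4, vs measured 0.39 +/- 0.06
```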

9.
Phys Rev Lett ; 103(9): 091303, 2009 Aug 28.
Article in English | MEDLINE | ID: mdl-19792780

ABSTRACT

Galaxy surveys are one of the most powerful means to extract cosmological information, and for a given volume the attainable precision is determined by the galaxy shot noise σ_n² relative to the power spectrum P. It is generally assumed that shot noise is white and given by the inverse of the mean number density n̄. In this Letter we argue that one may considerably improve upon this due to mass and momentum conservation. We explore this idea with N-body simulations by weighting central halo galaxies by halo mass and find that the resulting shot noise can be reduced dramatically relative to expectations, with a factor of 10-30 suppression at n̄ = 4×10⁻³ (h/Mpc)³. These results open up new opportunities to extract cosmological information in galaxy surveys and may have important consequences for the planning of future redshift surveys.
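One way to measure the stochasticity described above is to project the weighted halo field onto the matter field mode by mode and call the residual power the shot noise. Here is a hedged toy of that estimator, with the residual put in by hand; the suppression itself only appears with real N-body halos, where mass weighting exploits the conservation laws.

```python
import numpy as np

rng = np.random.default_rng(3)

# sigma_n^2 = <|d_w|^2> - <d_w d_m>^2 / <|d_m|^2>: the power left over after
# projecting out the part of the weighted field correlated with matter.
nmodes = 100_000
delta_m = rng.standard_normal(nmodes)
b, shot = 1.8, 0.3                        # illustrative bias and residual
delta_w = b * delta_m + np.sqrt(shot) * rng.standard_normal(nmodes)

P_ww = np.mean(delta_w**2)
P_wm = np.mean(delta_w * delta_m)
P_mm = np.mean(delta_m**2)
sigma_n2 = P_ww - P_wm**2 / P_mm          # cross-correlation shot-noise estimator
print(f"recovered shot noise: {sigma_n2:.3f} (true: {shot})")
```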

10.
Phys Rev Lett ; 102(2): 021302, 2009 Jan 16.
Article in English | MEDLINE | ID: mdl-19257263

ABSTRACT

Recent work has emphasized the possibility of probing non-Gaussianity of the local type by measuring the power spectrum of highly biased tracers of large-scale structure on very large scales. This method is limited by cosmic variance, by the finite number of structures on the largest scales, and by the partial degeneracy with other cosmological parameters that can mimic the same effect. We propose an alternative method based on the fact that on large scales, halos are linearly biased, but not stochastic, tracers of dark matter: by correlating a highly biased tracer of large-scale structure against an unbiased tracer, one eliminates the cosmic variance error, which can lead to a significant increase in signal-to-noise. For an ideal survey out to z ≈ 2, the error reduction can be as large as a factor of 7, which should guarantee a detection of non-Gaussianity from an all-sky survey of this type.
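A small Monte Carlo (with invented noise levels) shows the cancellation at work: the bias of the heavy tracer, which is where the scale-dependent f_NL signature lives, is estimated far more precisely from the ratio against an unbiased tracer than from its auto-power spectrum, because both tracers share the same matter realization.

```python
import numpy as np

rng = np.random.default_rng(4)

nmodes, ntrial, b, noise = 50, 5_000, 3.0, 0.05
b_ratio, b_auto = [], []
for _ in range(ntrial):
    dm = rng.standard_normal(nmodes)               # one realization of matter
    dh = b * dm + noise * rng.standard_normal(nmodes)   # biased tracer
    du = dm + noise * rng.standard_normal(nmodes)       # unbiased tracer
    b_ratio.append(np.sum(dh * du) / np.sum(du**2))     # cross-based estimate
    b_auto.append(np.sqrt(np.mean(dh**2)))              # auto-power, <dm^2>=1 known
print(f"sigma(b) from ratio     : {np.std(b_ratio):.3f}")
print(f"sigma(b) from auto-power: {np.std(b_auto):.3f}")  # cosmic-variance limited
```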

11.
Phys Rev Lett ; 97(19): 191303, 2006 Nov 10.
Article in English | MEDLINE | ID: mdl-17155611

ABSTRACT

We use the Ly-α forest power spectrum measured by the Sloan Digital Sky Survey and high-resolution spectroscopic observations, in combination with cosmic microwave background and galaxy clustering constraints, to place limits on a sterile neutrino as a dark matter candidate in the warm dark matter scenario. Such a neutrino would be created in the early Universe through mixing with an active neutrino and would suppress structure on scales smaller than its free-streaming scale. We ran a series of high-resolution hydrodynamic simulations with varying neutrino masses to describe the effect of a sterile neutrino on the Ly-α forest power spectrum. We find a mass limit of m_s > 13 keV at 95% C.L. (9 keV at 99.9%), which is above the upper limit allowed by X-ray constraints, excluding this candidate from being all of the dark matter in this model.
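The free-streaming suppression that drives the limit can be sketched with the thermal-relic warm dark matter transfer function of Viel et al. (2005); note this is a stand-in for the sterile-neutrino case, which requires an additional mass mapping not reproduced here.

```python
import numpy as np

def wdm_transfer(k, m_keV, omega_wdm=0.25, h=0.7, nu=1.12):
    # Viel et al. (2005) fitting formula for a thermal relic:
    # T(k) = [1 + (alpha*k)^(2*nu)]^(-5/nu), alpha in Mpc/h.
    alpha = 0.049 * m_keV**-1.11 * (omega_wdm / 0.25)**0.11 * (h / 0.7)**1.22
    return (1 + (alpha * k)**(2 * nu))**(-5 / nu)

k = np.logspace(-1, 2, 7)                 # wavenumbers in h/Mpc
for m in (0.5, 2.0, 10.0):
    # T^2 gives the suppression of the linear power spectrum.
    print(f"m = {m:4.1f} keV:", np.round(wdm_transfer(k, m)**2, 3))
```

Lighter masses free-stream further and erase more small-scale power, which is what the Ly-α forest flux power spectrum constrains.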
