Results 1 - 20 of 20
1.
J Hazard Mater ; 474: 134638, 2024 Aug 05.
Article in English | MEDLINE | ID: mdl-38838529

ABSTRACT

Parameterization of dry deposition is key for modelling atmospheric transport and deposition of radioactive particles. Still, very simple parameterizations are often encountered in radioactive preparedness models such as the SNAP model (Severe Nuclear Accident Program) of the Norwegian Meteorological Institute. SNAP presently uses a constant dry deposition velocity (0.2 cm/s) that neglects aerodynamic and surface resistances. Therefore, two new dry deposition schemes (the Emerson scheme and the EMEP (European Monitoring and Evaluation Programme) scheme) have been implemented in SNAP to evaluate the benefits of including aerodynamic and surface resistances with respect to model prediction skill. The three dry deposition schemes are evaluated using 137Cs total deposition from soil samples (n = 540) collected within a 60 km radial zone around the Chernobyl Nuclear Power Plant (ChNPP) during the months after the accident. The present study capitalizes on high-resolution meteorological data (2.5 km horizontal resolution), a detailed land-use data set with 273 sub-classes, and the hitherto most comprehensive source term description for the Chernobyl accident. Based on our findings, it is recommended to replace the present simple SNAP scheme with the Emerson or EMEP dry deposition scheme.
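As a rough illustration of what this abstract contrasts, the sketch below compares a constant deposition velocity with one common resistance-analogy form; it is not the SNAP, Emerson or EMEP code, and the resistance and settling values are assumptions chosen only for the example.

```python
# Minimal sketch (assumed parameter values): constant dry-deposition velocity vs. a
# resistance-analogy form v_d = v_s + 1/(R_a + R_b), where R_a is the aerodynamic
# resistance, R_b the quasi-laminar surface resistance and v_s gravitational settling.

def vd_constant(cm_per_s: float = 0.2) -> float:
    """Constant scheme (m/s), as presently used in SNAP."""
    return cm_per_s / 100.0

def vd_resistance(ra: float, rb: float, vs: float = 1.0e-4) -> float:
    """Resistance-analogy scheme (m/s): settling plus the two resistances in series."""
    return vs + 1.0 / (ra + rb)

if __name__ == "__main__":
    print(vd_constant())                        # 0.002 m/s
    print(vd_resistance(ra=50.0, rb=300.0))     # ~0.003 m/s for these illustrative resistances
```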

2.
J Hazard Mater ; 451: 131156, 2023 Jun 05.
Article in English | MEDLINE | ID: mdl-36893593

ABSTRACT

Releases of radionuclides to the atmosphere occasionally occur with no warning, with first observation at radioactivity monitoring stations. The Chernobyl accident of 1986 was first detected at Forsmark, Sweden, long before the official announcement by the Soviet Union, and the release of ruthenium-106 detected across Europe in 2017 still has no official release location. The current study details a method based on footprint analysis of an atmospheric dispersion model to locate the source of an atmospheric release. The method was applied to the European Tracer EXperiment (ETEX) of 1994 for validation, and to the ruthenium observations of autumn 2017 to determine likely release locations and time characteristics of that release. The method can readily utilise an ensemble of numerical weather prediction data, which improves the localisation results over deterministic weather data alone by taking meteorological uncertainties into account. In applying the method to the ETEX scenario, the most likely release location improved from a distance of 113 km from the true release location when using deterministic meteorology to a distance of 63 km when using ensemble meteorology, although such improvements may be scenario dependent. The method was constructed to be robust with respect to the choices of model parameters and measurement uncertainties. The localisation method can help decision makers enact countermeasures to protect the environment against the effects of radioactivity when observations are available from environmental radioactivity monitoring networks.
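The sketch below illustrates, under assumed data structures, how a footprint-based score over candidate release cells might be formed and why averaging it over an ensemble of meteorological realisations helps; it is not the authors' algorithm, and the function and variable names are hypothetical.

```python
# Hedged sketch of footprint-based localisation (illustrative, not the published method).
import numpy as np

def localisation_score(footprints: np.ndarray, detected: np.ndarray) -> np.ndarray:
    """
    footprints : (n_obs, ny, nx) source-receptor sensitivity of each observation to a
                 unit release from each grid cell (e.g. from backward model runs)
    detected   : (n_obs,) boolean, True where activity was measured above background
    Returns a (ny, nx) score; higher values mark more plausible release locations.
    """
    hits = (footprints[detected] > 0).sum(axis=0)      # cells able to explain the detections
    misses = (footprints[~detected] > 0).sum(axis=0)   # cells that would predict false alarms
    return hits - misses

def ensemble_score(member_footprints: list, detected: np.ndarray) -> np.ndarray:
    # Averaging over meteorological ensemble members folds wind uncertainty into the score.
    return np.mean([localisation_score(f, detected) for f in member_footprints], axis=0)
```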

3.
J Environ Radioact ; 246: 106836, 2022 May.
Article in English | MEDLINE | ID: mdl-35151962

ABSTRACT

Environmental air sampling is one of the principal monitoring technologies employed for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). By combining the analysis of environmental samples with Atmospheric Transport and Dispersion Modelling (ATDM), and using a Bayesian source reconstruction algorithm, an estimate of the release location, duration, and quantity can be computed. Bayesian source reconstruction uses an uncertainty distribution of the input parameters, or priors, in a statistical framework to produce posterior probability estimates of the event parameters. The quality of the event reconstruction directly depends on the accuracy of the prior uncertainty distribution. With many of the input parameters, the selection of the uncertainty distribution is not difficult. However, with environmental samples, there is one component of the uncertainty at the interface between sample measurements and the ATDM that has been overlooked. Typically, a much smaller volume or quantity of material is sampled from the much larger domain represented in the ATDM. By examining the response of a dense network of radionuclide detectors on the West Coast of Canada during the passage of the Fukushima debris plume, an initial estimate of this uncertainty was determined to be between 20% and 30% depending on sample integration time.
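As a small illustration of where such an uncertainty term could enter a Bayesian reconstruction, the sketch below adds a fractional "representativeness" error in quadrature to the measurement error in a Gaussian log-likelihood; the form and the 25% default are assumptions for the example, not the operational algorithm.

```python
# Hedged sketch: Gaussian log-likelihood with an extra sampling-representativeness term.
import numpy as np

def log_likelihood(y_obs, y_model, sigma_meas, rep_frac=0.25):
    """y_obs, y_model, sigma_meas: arrays of observed/modelled activity and measurement error;
    rep_frac: fractional uncertainty (~0.2-0.3 per the study) applied to the modelled value."""
    sigma2 = sigma_meas**2 + (rep_frac * np.abs(y_model))**2
    return -0.5 * np.sum((y_obs - y_model)**2 / sigma2 + np.log(2.0 * np.pi * sigma2))
```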


Subject(s)
Air Pollutants, Radioactive , Radiation Monitoring , Air Pollutants, Radioactive/analysis , Bayes Theorem , Radioisotopes/analysis , Uncertainty
4.
Sci Total Environ ; 806(Pt 1): 150128, 2022 Feb 01.
Article in English | MEDLINE | ID: mdl-34583084

ABSTRACT

Atmospheric dispersion models are crucial for nuclear risk assessment and emergency response systems since they rapidly predict air concentrations and deposition of released radionuclides, providing a basis for dose estimations and countermeasure strategies. Atmospheric dispersion models are associated with relatively large and often unknown uncertainties that are mostly attributed to meteorology, source terms and parametrisation of the dispersion model. By developing methods that can provide reliable uncertainty ranges for model outputs, decision makers have an improved basis for handling nuclear emergency situations. In the present work, model skill of the Severe Nuclear Accident Programme (SNAP) model was quantified by employing an ensemble method in which 51 meteorological realisations from a numerical weather prediction model were combined with 9 source term descriptions for the accidental 137Cs releases from the Fukushima Daiichi Nuclear Power Plant during 14th-17th March 2011. The meteorological forecast was compared to observations of wind speed from 30 meteorological stations. The 459 dispersion realisations were compared with hourly observations of activity concentrations from 100 air filter stations. Exclusive use of deterministic meteorology resulted in most members of the dispersion ensemble showing too low concentration values; however, this was mitigated by applying ensemble meteorology. Ensemble predictions, including both the meteorological and source term ensembles, show an overall higher prediction skill compared to individual meteorology and source term runs, with the true predictive rate increasing from 30%-50% to 70%-90% and the positive predictive rate decreasing from 75%-80% to 65%-75%. Skill scores and other ensemble indicators also showed improvements from using ensembles of source terms and meteorology. The present study of the Fukushima accident gives strong indications that ensemble predictions improve the basis for decision making in the early phase after a nuclear accident, which emphasises the importance of including ensemble prediction in nuclear preparedness tools of the future.
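For readers unfamiliar with the categorical scores quoted above, the sketch below computes them from a 2x2 contingency table of threshold exceedances; the exact definitions used in the paper are assumed here to be the standard hit rate and precision.

```python
# Minimal sketch (assumed standard definitions) of the two rates quoted in the abstract.
def rates(tp: int, fp: int, fn: int, tn: int):
    true_predictive_rate = tp / (tp + fn)       # observed exceedances that were predicted
    positive_predictive_rate = tp / (tp + fp)   # predicted exceedances that verified
    return true_predictive_rate, positive_predictive_rate

# Example: rates(40, 15, 10, 200) -> (0.80, ~0.73)
```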


Subject(s)
Air Pollutants, Radioactive , Fukushima Nuclear Accident , Radiation Monitoring , Air Pollutants, Radioactive/analysis , Cesium Radioisotopes/analysis , Japan , Nuclear Power Plants , Uncertainty
5.
Eur Biophys J ; 50(6): 915-926, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34009404

ABSTRACT

Double-dispersion impedance models are important for the accurate fitting of spectral impedance measurements in Electrical Impedance Spectroscopy (EIS). While the Cole-Cole model is the most widely known, it is possible to define double-dispersion Cole-Davidson and Havriliak-Negami models as well. In this work, we show that more freedom can be exercised when these three models are combined together and that this combination can be done in various forms. Experimental results using a two-stage optimization algorithm applied on the suggested models are provided.
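A compact way to see the "various forms" of combination is to write each dispersion as a Havriliak-Negami term, which reduces to Cole-Cole or Cole-Davidson for special exponent values; the sketch below uses an assumed notation, not the authors' formulation, and the parameter values are illustrative.

```python
# Hedged sketch: double-dispersion impedance built from two Havriliak-Negami terms,
#   Z(w) = R_inf + dR / (1 + (j*w*tau)^alpha)^beta
# with beta = 1 giving Cole-Cole and alpha = 1 giving Cole-Davidson behaviour.
import numpy as np

def hn_term(omega, dR, tau, alpha, beta):
    return dR / (1.0 + (1j * omega * tau) ** alpha) ** beta

def double_dispersion(omega, R_inf, disp1, disp2):
    return R_inf + hn_term(omega, *disp1) + hn_term(omega, *disp2)

omega = 2 * np.pi * np.logspace(2, 7, 200)                 # 100 Hz - 10 MHz
Z = double_dispersion(omega, 50.0,
                      (200.0, 1e-4, 0.8, 1.0),             # Cole-Cole-like term
                      (500.0, 1e-6, 1.0, 0.6))             # Cole-Davidson-like term
```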


Subject(s)
Algorithms , Dielectric Spectroscopy , Electric Impedance
6.
Appl Radiat Isot ; 166: 109383, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32942086

ABSTRACT

The current investigation presents a comprehensive program called KIANA to assess and analyze the environmental effects of releases of radioactive materials from the stacks of nuclear installations. Bushehr Nuclear Power Plant Unit One (BNPP-1) is modeled using the KIANA software, and the impact on members of the public of radioactive materials released from the BNPP-1 stack through the airborne pathway is evaluated during normal operation as well as in accident conditions, including Design Basis Accidents (DBA) and Beyond Design Basis Accidents (BDBA). To verify and validate the KIANA software, its results are compared with those of DOZA_M, ESTE, PC-CREAM 98 and RECASS Express, and with the values of the total effective dose monitored by the local detectors of BNPP-1. KIANA is based on the Gaussian diffusion model and written in the C# programming language. In the current research, the total effective dose received by a member of the public from the radioactive plume passage through the airborne pathway is calculated for the normal operating condition of BNPP-1. Moreover, the total effective dose in the case of primary-to-secondary leakage inside the steam generators, the total effective dose in the case of a Large Break Loss of Coolant Accident (LBLOCA) and a Small Break Loss of Coolant Accident (SBLOCA), and the equivalent dose to the thyroid gland for an infant group (1-8 years old) in the case of an LBLOCA in DBA conditions are evaluated. Finally, the absorbed dose to the whole body of adults at the initial stage after a BDBA, the absorbed dose to the thyroid gland for an infant group (1-8 years old) at the initial stage after a BDBA, and the total effective dose in the first year after the accident in the case of a BDBA are assessed. The results of the KIANA software are in good agreement with those of the DOZA_M, ESTE, PC-CREAM 98 and RECASS Express computer programs and with the total effective dose values monitored by the local detectors of BNPP-1. The developed software can simultaneously calculate the concentration and the radionuclide dose received through all exposure pathways, such as airborne, foodstuff, marine, soil, animal, and vegetation pathways, without restriction in normal and accident conditions.
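For orientation, the sketch below is the textbook Gaussian plume relation that such codes build on; it is not the KIANA implementation (which is written in C# and adds dose pathways), and the symbols are the usual ones.

```python
# Hedged sketch of the standard Gaussian plume concentration with ground reflection.
import numpy as np

def gaussian_plume(Q, u, sy, sz, y, z, H):
    """Q: release rate (Bq/s); u: wind speed (m/s); sy, sz: dispersion parameters (m);
    y: crosswind distance (m); z: receptor height (m); H: effective release height (m)."""
    lateral = np.exp(-y**2 / (2.0 * sy**2))
    vertical = np.exp(-(z - H)**2 / (2.0 * sz**2)) + np.exp(-(z + H)**2 / (2.0 * sz**2))
    return Q / (2.0 * np.pi * u * sy * sz) * lateral * vertical   # Bq/m^3
```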

7.
Sci Total Environ ; 748: 141211, 2020 Dec 15.
Article in English | MEDLINE | ID: mdl-32814285

ABSTRACT

Apart from the aerodynamic performance (efficiency and safety), the wake behind an on-road vehicle substantially influences the tailpipe pollutant dispersion (environment). Remote sensing is the most practicable measure for large-scale emission control. Its reliability, however, is largely dictated by how well the complicated vehicular flows and instrumentation constraints are tackled. Specifically, the broad range of motion scales and the short sampling duration (less than 1 s) are the most prominent ones. Their impact on remote sensing has not been studied. Large-eddy simulation (LES) is thus employed in this paper to look into the dynamics and the plume dispersion behind an on-road heavy-duty truck at speed U∞, so as to elucidate the transport mechanism, examine the sampling uncertainty and develop remedial measures. A major recirculation of size comparable to the truck height h is induced collectively by the roof-level prevailing flows, side entrainment and underbody wall jet. The tailpipe is enclosed by dividing streamlines, so the plume is carried back toward the truck right after emission. The recirculation augments the pollutant mixing, resulting in a more homogeneous pollutant distribution together with a rather high fluctuating concentration (over 20% of the time-averaged concentration). The plume ascends mildly before being purged out of the major recirculation to the far field by turbulence, leading to a huge reduction in pollutant concentration (an order of magnitude) outside the near wake. In the far field, the plume is higher than the tailpipe and disperses in a conventional Gaussian manner. Under this circumstance, a sampling duration for remote sensing longer than h/U∞ would be prone to underestimating the tailpipe emission.
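A back-of-envelope reading of the h/U∞ criterion, with assumed numbers that are not taken from the paper:

```python
# Illustrative only: for an assumed truck height and highway speed, the wake time scale
# h/U_inf is a fraction of a second, so averaging a remote-sensing sample much longer than
# this mixes in diluted far-wake air and biases the inferred emission low.
truck_height_h = 4.0          # m (assumed)
speed_U_inf = 25.0            # m/s, about 90 km/h (assumed)
print(truck_height_h / speed_U_inf)   # ~0.16 s
```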

8.
Environ Sci Pollut Res Int ; 27(12): 13384-13395, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32020451

ABSTRACT

Atmospheric dispersion model (ADM) simulations are increasingly used as management tools in air pollution monitoring programs, even in the absence of proper validation. Biomonitors can provide important information for ADM validation, but an open question is their temporal frame of application, particularly when native organisms are used. In this study, we tested two alternative ADM simulations of the total suspended particulate (TSP) released by a coal power station against the element content of two native lichens collected at 40 sites, complemented by soil samples. The ADM simulations differed in their time references: the 6-month period preceding lichen sampling, approximately corresponding to the estimated age of the samples (Mod. A), and the whole year 2005, representative of the local average conditions and used in the plant authorization processes (Mod. B). A generalized regression model analysis clearly showed that the Cr, Pb and V content of lichen samples was spatially associated with the outcomes of Mod. A, but not with Mod. B. Interestingly, the Cr content of lichen samples consistently correlated with the TSP concentration predicted by Mod. A along two transects placed downwind of the coal power station. This result was corroborated by air particulate matter sampling, which showed that air Cr concentrations increased during the operative period of the source. Overall, our results suggest that lichen bioaccumulation data can proficiently be used to validate ADM simulations if the exposure time of the biological samples is consistent with the temporal domain of the ADM simulations.


Subject(s)
Air Pollutants/analysis , Air Pollution , Lichens , Coal , Environmental Monitoring , Italy
9.
Sci Total Environ ; 710: 136245, 2020 Mar 25.
Article in English | MEDLINE | ID: mdl-31918187

ABSTRACT

This manuscript focuses on the hierarchical complexity of space-time deterministic and stochastic dynamical systems used to study pollution dispersion behavior. Considering the current environmental scope and the need to understand the evolution of various types of pollutants of high concern, several suitable mathematical models are examined. To study the pollution phenomenon at hand, we employ lumped linear or nonlinear structures, discussed directly in terms of the relevant equations. To some extent, the researcher knows by intuition which model is more complex (or more suitable) than others, so the basic concepts are supported with linked references. Hence, the structural complexity features of the dynamical systems are discussed in detail. A continuous dynamical system can be discretized, and a complexity measure can be obtained from the associated time series. There is also a research gap in complexity theory, which generally deals with the behavior (solutions of the representing differential equations) of a system. Taking all this into account to cover the remaining gaps in the literature, we review a family of classical models used to describe pollution and bacterial dispersion in the environment. From this review, we assign a qualitative complexity measure to each modeling paradigm by taking into account the underlying space on which the model is defined and the key issue of the related differentiability. For instance, a lumped linear set of differential equations is relatively simple with respect to its nonlinear counterpart because the former lives in the three-dimensional (3-D) real space R3, where the notion of differentiability arises naturally, whereas the latter requires translating that notion to a manifold by means of differential geometry. Going further, we reflect on this issue for random systems, where the notion of differentiability is transformed into an integral equivalence by means of Itô's lemma, and so on for more exotic modeling perspectives. Moreover, the study presents a qualitative measure of complexity in terms of the underlying sets and the feasibility of differentiability.
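As one concrete (and purely illustrative) instance of the deterministic/stochastic distinction discussed above, a lumped linear dispersion model and its Itô counterpart can be written as follows; the specific equations are not taken from the paper.

```latex
% Illustrative only: lumped linear deterministic model vs. its stochastic analogue,
% where the ordinary chain rule is replaced by Ito's lemma.
\[
  \frac{d\mathbf{C}}{dt} = A\,\mathbf{C} + \mathbf{u}(t)
  \qquad \text{vs.} \qquad
  d\mathbf{C}_t = \bigl(A\,\mathbf{C}_t + \mathbf{u}(t)\bigr)\,dt + \Sigma\, d\mathbf{W}_t .
\]
```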

10.
Environ Int ; 134: 105261, 2020 01.
Article in English | MEDLINE | ID: mdl-31704563

ABSTRACT

Odors have received increasing attention among atmospheric pollutants. Indeed, odor emissions are a common source of complaints, affecting the quality of life of humans and animals. Odor is a property of a mixture of different volatile chemical species (sulfur, nitrogen, and volatile organic compounds) capable of stimulating the olfactory sense sufficiently to trigger a sensation of odor. The impact of odors on the surrounding areas depends on different factors, such as the amount of odors emitted from the site, the distance from the site, weather conditions and topography, as well as the odor sensitivity and tolerance of the neighborhood. Due to the complexity of the odor issue, the aim of this review was to give an overview of: (i) techniques (sensorial and analytical) that can be used for quantitative and qualitative characterization; (ii) air dispersion models applied for the evaluation of the spatial and temporal distribution of atmospheric pollutants in terms of concentration in air and/or deposition in the studied domain; (iii) major sources of odor nuisance (waste and livestock); and (iv) mitigation actions against odor impact. Among the sensorial techniques, dynamic olfactometry, field inspection, and recording by residents were considered; among the analytical methodologies, gas chromatography-mass spectrometry, identification of specific compounds, and the electronic nose. Both kinds of techniques evaluate the odor concentration. To account for the effective impact of odors on the population, air dispersion models are used instead. They can provide estimates of odor levels in both current and future emission scenarios, and they can be useful for estimating the efficiency of mitigation strategies. Most odor control strategies involve measures oriented to prevent, control dispersion, minimize the nuisance, or remove the odorants from emissions, such as adequate process design, buffer zones, odor covers, and treatment technologies.


Subject(s)
Odorants/analysis , Animals , Environmental Pollutants , Gas Chromatography-Mass Spectrometry , Humans , Quality of Life
11.
Biochem Pharmacol ; 169: 113596, 2019 11.
Article in English | MEDLINE | ID: mdl-31398312

ABSTRACT

The liver is the most important drug-metabolizing organ, endowed with a plethora of metabolizing enzymes and transporters that facilitate drug entry and removal via metabolism and/or biliary excretion. For this reason, much focus surrounds the development of clearance concepts, which are based on normalizing the rate of removal to the input or arterial concentration. Some authors have recently claimed that doing so implies one specific model of hepatic elimination, namely, the widely used well-stirred or venous equilibration model (WSM). This commentary challenges that claim and aims to provide a comprehensive discussion not only of the WSM but also of other currently applied hepatic clearance models - the parallel tube model (PTM), the dispersion model (DM), the zonal liver model (ZLM), and the heterogeneous capillary transit time model of Goresky and co-workers (GM). The WSM, PTM, and DM differ in their assumed patterns of internal blood flow - bulk, plug, and dispersive flow, respectively - which render different degrees of mixing within the liver, characterized by the magnitude of the dispersion number (DN), and result in different implications concerning the (unbound) substrate concentration in the liver (CuH). Early models assumed perfusion rate-limited distribution and have since been modified to include membrane-limited transport. The recent developments associated with these misconceptions and the sensitivity of the models are addressed here. Since the WSM has been and will likely remain widely used, the pros and cons of this model relative to physiological reality are further discussed.
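For reference, the textbook forms of two of the models named above are sketched here; these are standard expressions, not code from the commentary, and Q_H, fu and CLint denote hepatic blood flow, unbound fraction and intrinsic clearance.

```python
# Standard textbook forms (sketch): hepatic clearance under the well-stirred model (WSM)
# and the parallel tube model (PTM).
import math

def cl_well_stirred(Q_H: float, fu: float, CLint: float) -> float:
    return Q_H * fu * CLint / (Q_H + fu * CLint)

def cl_parallel_tube(Q_H: float, fu: float, CLint: float) -> float:
    return Q_H * (1.0 - math.exp(-fu * CLint / Q_H))

# The two converge for low-extraction drugs and diverge most when fu*CLint is large relative to Q_H.
```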


Subject(s)
Hepatobiliary Elimination/physiology , Hepatocytes/metabolism , Liver/metabolism , Models, Biological , Animals , Humans , Metabolic Clearance Rate , Pharmaceutical Preparations/metabolism , Protein Binding , Rats , Tissue Distribution
12.
Sci Total Environ ; 659: 973-982, 2019 Apr 01.
Article in English | MEDLINE | ID: mdl-31096427

ABSTRACT

BACKGROUND: Geothermal power plants for the production of electricity are currently active in Mt. Amiata, Italy. The present study aimed to investigate the association between chronic low-level exposure to H2S and health outcomes, using a residential cohort study design. METHODS: Spatial variability of exposure to chronic levels of H2S was evaluated using dispersion modelling. Cohorts included people residing in six municipalities of the geothermal district from 01/01/1998 to 31/12/2016. Residence addresses were georeferenced and each subject was matched with H2S exposure metrics and socio-economic status available at census tract level. Mortality and hospital discharge data for neoplasms and diseases of the respiratory, central nervous and cardiovascular systems were taken from administrative health databases. Cox proportional hazard models were used to test the association between H2S exposure and outcomes, with age as the temporal axis and adjusting for gender, socio-economic status and calendar period. RESULTS: The residential cohort was composed of 33,804 subjects for a total of 391,002 person-years. Analyses reported risk increases associated with high exposure to H2S for respiratory diseases (HR = 1.12 95%CI: 1.00-1.25 for mortality data; HR = 1.02 95%CI: 0.98-1.06 for morbidity data), COPD and disorders of the peripheral nervous system. Neoplasms were negatively associated with increased H2S exposure. CONCLUSIONS: The most consistent findings were reported for respiratory diseases. Associations with increased H2S exposure were coherent in both mortality and hospitalization analyses, for both genders, with evidence of exposure-related trends. No positive associations were found for cancer or cardiovascular diseases.
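A minimal sketch of the type of model described in the methods is given below, assuming the lifelines package and hypothetical column names; it is not the authors' analysis code.

```python
# Hedged sketch: Cox proportional-hazards regression of an outcome on an H2S exposure class,
# adjusted for sex, socio-economic status and calendar period (column names are hypothetical).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")   # hypothetical file, one row per subject
cph = CoxPHFitter()
cph.fit(df[["followup_years", "event", "h2s_high", "sex", "ses", "period"]],
        duration_col="followup_years", event_col="event")
cph.print_summary()              # hazard ratios with 95% confidence intervals
```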


Subject(s)
Air Pollutants/adverse effects , Environmental Exposure/analysis , Hydrogen Sulfide/adverse effects , Adolescent , Adult , Aged , Aged, 80 and over , Cardiovascular Diseases/chemically induced , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/mortality , Central Nervous System Diseases/chemically induced , Central Nervous System Diseases/epidemiology , Central Nervous System Diseases/mortality , Child , Child, Preschool , Cohort Studies , Female , Humans , Infant , Infant, Newborn , Italy/epidemiology , Male , Middle Aged , Neoplasms/chemically induced , Neoplasms/epidemiology , Neoplasms/mortality , Power Plants , Respiration Disorders/chemically induced , Respiration Disorders/epidemiology , Respiration Disorders/mortality , Young Adult
13.
J Biomed Opt ; 23(3): 1-9, 2018 03.
Article in English | MEDLINE | ID: mdl-29595017

ABSTRACT

A practical algorithm for estimating the wavelength-dependent refractive index (RI) of a turbid sample in the spatial frequency domain with the aid of Kramers-Kronig (KK) relations is presented. In it, phase-shifted sinusoidal patterns (structured illumination) are serially projected at a high spatial frequency onto the sample surface (mouse scalp) at different near-infrared wavelengths while a camera mounted normally to the sample surface captures the reflected diffuse light. In the offline analysis pipeline, recorded images at each wavelength are converted to spatial absorption maps by a logarithmic function, and once the absorption coefficient information is obtained, the imaginary part (k) of the complex RI (CRI), based on Maxwell's equations, can be calculated. Using the data represented by k, the real part of the CRI (n) is then resolved by KK analysis. The wavelength dependence of n(λ) is then fitted separately using four standard dispersion models: Cornu, Cauchy, Conrady, and Sellmeier. In addition, a three-dimensional surface-profile distribution of n is provided based on phase profilometry principles and a phase-unwrapping-based phase-derivative-variance algorithm. Experimental results demonstrate the capability of the proposed idea for determination of a biological sample's RI value.
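As an example of the final fitting step, the sketch below fits the Cauchy dispersion model, one of the four named above, to a handful of hypothetical n(λ) values; the wavelengths and refractive indices are made up for illustration.

```python
# Hedged sketch: least-squares fit of the Cauchy model n(lambda) = A + B/lambda^2 + C/lambda^4.
import numpy as np
from scipy.optimize import curve_fit

def cauchy(lam_um, A, B, C):
    return A + B / lam_um**2 + C / lam_um**4

lam = np.array([0.78, 0.85, 0.94, 1.06])          # hypothetical NIR wavelengths (um)
n_kk = np.array([1.402, 1.399, 1.396, 1.394])     # hypothetical KK-derived real parts
popt, _ = curve_fit(cauchy, lam, n_kk, p0=(1.39, 3e-3, 0.0))
```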


Subject(s)
Refractometry/methods , Spectroscopy, Near-Infrared/methods , Algorithms , Animals , Brain/diagnostic imaging , Male , Mice , Mice, Inbred C57BL , Phantoms, Imaging , Scalp/diagnostic imaging , Surface Properties
14.
Sci Total Environ ; 610-611: 175-190, 2018 Jan 01.
Article in English | MEDLINE | ID: mdl-28803195

ABSTRACT

Gaussian dispersion models are widely used to simulate the concentrations and deposition fluxes of pollutants emitted by source areas. Very often, the calculation time limits the number of sources and receptors, and the geometry of the sources must be simple and without holes. This paper presents CAREA, a new GIS-based Gaussian model for complex source areas. CAREA is coded in the Python language and is largely based on a simplified formulation of the very popular and recognized AERMOD model. The model allows users to define, in a GIS environment, thousands of gridded or scattered receptors and thousands of complex sources with hundreds of vertices and holes. CAREA computes ground-level, or near-ground-level, concentrations and dry deposition fluxes of pollutants. The input/output and the runs of the model can be managed entirely in a GIS environment (e.g. inside a GIS project). The paper presents the CAREA formulation and its application to very complex test cases. The tests show that the processing times are satisfactory and that the definition of sources and receptors and the output retrieval are quite easy in a GIS environment. CAREA and AERMOD are compared using simple and reproducible test cases. The comparison shows that CAREA satisfactorily reproduces AERMOD simulations and is considerably faster than AERMOD.
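One common way to treat complex area sources with holes, sketched below using shapely for the geometry test, is to rasterise the polygon into sub-sources and sum their plume contributions; this illustrates the general approach only and is not the CAREA code.

```python
# Hedged sketch: rasterise a (possibly holed) source polygon into point sub-sources.
import numpy as np
from shapely.geometry import Polygon, Point

def rasterise(polygon: Polygon, dx: float):
    xmin, ymin, xmax, ymax = polygon.bounds
    xs = np.arange(xmin + dx / 2.0, xmax, dx)
    ys = np.arange(ymin + dx / 2.0, ymax, dx)
    return [(x, y) for x in xs for y in ys if polygon.contains(Point(x, y))]

# Each (x, y) then acts as a point source of strength Q_total * dx**2 / polygon.area,
# and the receptor concentration is the sum of the Gaussian contributions of all sub-sources.
```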

15.
Microarrays (Basel) ; 6(1)2017 Feb 10.
Article in English | MEDLINE | ID: mdl-28208652

ABSTRACT

The traditional approach with microarray data has been to apply transformations that approximately normalize them, with the drawback of losing the original scale. The alternative standpoint taken here is to search for models that fit the data, characterized by the presence of negative values, while preserving their scale; one advantage of this strategy is that it facilitates a direct interpretation of the results. A new family of distributions named gpower-normal, indexed by p∈R, is introduced, and it is proven that these variables become normal or truncated normal when a suitable gpower transformation is applied. Expressions are given for moments and quantiles in terms of the truncated normal density. This new family can be used to model asymmetric data that include non-positive values, as required for microarray analysis. Moreover, it is proven that the gpower-normal family is a special case of pseudo-dispersion models, inheriting all the good properties of these models, such as asymptotic normality for small variances. A combined maximum likelihood method is proposed to estimate the model parameters, and it is applied to microarray and contamination data. R codes are available from the authors upon request.

16.
Environ Technol ; 38(5): 639-651, 2017 Mar.
Article in English | MEDLINE | ID: mdl-27348460

ABSTRACT

Researchers have shown that most dispersion models, including the regulatory models recommended by the United States Environmental Protection Agency (AERMOD and CALPUFF), are not able to predict well under complex situations. This article presents a novel evaluation of the propagation of errors in the lateral dispersion coefficient of AERMOD, with emphasis on the estimation of averaging times under 10 min. The sources of uncertainty evaluated were the parameterization of lateral dispersion ([Formula: see text]), the standard deviation of the lateral wind speed ([Formula: see text]) and the processing of obstacle effects. The model's performance was tested in two field tracer experiments: Round Hill II and Uttenweiller. The results show that error propagation from the estimate of [Formula: see text] directly affects the determination of [Formula: see text], especially under the conditions of the Round Hill II experiment. As averaging times are reduced, errors arise in the parameterization of [Formula: see text], even after observational assimilation of [Formula: see text], exposing errors in the Lagrangian time scale parameterization. The assessment of the model in the presence of obstacles shows that implementing a plume rise model enhancement algorithm can improve the performance of the AERMOD model. However, these improvements are small when the obstacles have a complex geometry, as at Uttenweiller.
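To see why errors in the lateral wind fluctuation propagate directly into the lateral spread, recall the classical Taylor relation that such parameterisations approximate; the sketch below uses that textbook form, not AERMOD's actual formulation.

```python
# Hedged sketch (Taylor 1921 relation, not the AERMOD parameterisation):
#   sigma_y = sigma_v * T_L * sqrt(2 * (t/T_L - 1 + exp(-t/T_L)))
# so a relative error in sigma_v maps one-to-one into sigma_y, while errors in the
# Lagrangian time scale T_L matter mostly at travel times t >> T_L.
import numpy as np

def sigma_y_taylor(sigma_v, t, T_L):
    """sigma_v: lateral wind std. dev. (m/s); t: travel time (s); T_L: Lagrangian time scale (s)."""
    x = t / T_L
    return sigma_v * T_L * np.sqrt(2.0 * (x - 1.0 + np.exp(-x)))
```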


Subject(s)
Air Pollutants/analysis , Algorithms , Environmental Monitoring/statistics & numerical data , Models, Theoretical , Odorants/analysis , Sulfur Dioxide/analysis
17.
Stat Methods Med Res ; 26(2): 880-897, 2017 04.
Article in English | MEDLINE | ID: mdl-25491718

ABSTRACT

Often in biomedical research, we deal with continuous (clustered) proportion responses ranging between zero and one that quantify the disease status of the cluster units. Interestingly, the study population might also consist of relatively disease-free as well as highly diseased subjects, contributing proportion values in the interval [0, 1]. Regression on a variety of parametric densities with support lying in (0, 1), such as beta regression, can assess important covariate effects; however, these densities are deemed inappropriate due to the presence of zeros and/or ones. To circumvent this, we introduce a class of general proportion densities and further augment the probabilities of zero and one to this general proportion density, controlling for the clustering. Our approach is Bayesian and presents a computationally convenient framework amenable to available freeware. Bayesian case-deletion influence diagnostics based on q-divergence measures are automatic from the Markov chain Monte Carlo output. The methodology is illustrated using both simulation studies and application to a real dataset from a clinical periodontology study.
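A minimal sketch of the zero-one augmentation idea follows, with a beta density standing in for the general proportion density and all parameter names assumed; it is not the authors' Bayesian implementation.

```python
# Hedged sketch: log-likelihood of a zero-one augmented proportion model (beta continuous part).
import numpy as np
from scipy.stats import beta

def zoa_loglik(y, p0, p1, a, b):
    """y in [0, 1]; p0, p1: probabilities of exactly 0 / exactly 1; (a, b): beta parameters."""
    y = np.asarray(y, dtype=float)
    at0, at1 = (y == 0.0), (y == 1.0)
    inside = ~(at0 | at1)
    return (at0.sum() * np.log(p0)
            + at1.sum() * np.log(p1)
            + inside.sum() * np.log(1.0 - p0 - p1)
            + beta.logpdf(y[inside], a, b).sum())
```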


Subject(s)
Models, Statistical , Bayes Theorem , Biostatistics/methods , Cluster Analysis , Computer Simulation , Data Interpretation, Statistical , Databases, Factual/statistics & numerical data , Humans , Markov Chains , Monte Carlo Method , Periodontal Diseases/diagnosis , Proportional Hazards Models , Regression Analysis , Software
18.
Environ Monit Assess ; 188(9): 516, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27521001

ABSTRACT

In this study, SO2 concentrations from a gas refinery located in complex terrain were calculated with the steady-state AERMOD model and the non-steady-state CALPUFF model. First, SO2 concentrations emitted from 16 refinery stacks were obtained by field measurements at nine receptors in four seasons, and the performance of both models was evaluated. The simulated ambient SO2 concentrations produced by each model were then compared with the observed concentrations, and the model results were compared with each other. The evaluation of the two models' ability to simulate SO2 concentrations was based on statistical analysis and Q-Q plots. The review of statistical parameters and Q-Q plots showed that the performance of both models in simulating the SO2 concentration in the region can be considered acceptable. The results showed that the composite ratio between simulated and observed values at the various receptors for all four averaging times is 0.72 for AERMOD, whereas CALPUFF's ratio is 0.89. However, under the complex topographic conditions, CALPUFF offers better agreement with the observed concentrations.
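The Q-Q comparison mentioned above can be sketched as follows; the variable names are hypothetical and this is only the generic unpaired quantile plot, not the authors' scripts.

```python
# Hedged sketch: unpaired Q-Q comparison of simulated vs. observed SO2 concentrations.
import numpy as np
import matplotlib.pyplot as plt

def qq_compare(observed, simulated, label):
    q = np.linspace(0.02, 0.98, 49)
    plt.plot(np.quantile(observed, q), np.quantile(simulated, q), "o", label=label)

# qq_compare(obs_so2, aermod_so2, "AERMOD")
# qq_compare(obs_so2, calpuff_so2, "CALPUFF")
# plt.axline((0, 0), slope=1.0)   # 1:1 reference; points below it indicate under-prediction
# plt.xlabel("Observed"); plt.ylabel("Simulated"); plt.legend(); plt.show()
```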


Subject(s)
Air Pollutants/analysis , Models, Theoretical , Sulfur Dioxide/analysis , Environmental Monitoring/methods , Extraction and Processing Industry , Petroleum
19.
Am J Phys Anthropol ; 155(4): 546-58, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25209335

ABSTRACT

The history of human occupation in Brazil dates to at least 14 kyr BP, and the country has the largest record of early human remains from the continent. Despite the importance and richness of Brazilian human skeletal collections, the biological relationships between groups and their implications for knowledge about human dispersion in the country have not been properly explored. Here, we present a comprehensive assessment of the morphological affinities of human groups from East-Central, Coastal, Northeast, and South Brazil from distinct periods and test for the best dispersion scenarios to explain the observed diversity across time. Our results, based on multivariate assessments of shape and goodness of fit tests of dispersion and adaptation models, favor the idea that Brazil experienced at least two large dispersion waves. The first dispersive event brought the morphological pattern that characterize Late Pleistocene groups continent-wide and that persisted among East-Central Brazil groups until recently. Within the area covered by our samples, the second wave was probably restricted to the coast and is associated with a distinct morphological pattern. Inland and coastal populations apparently did not interact significantly during the Holocene, as there is no clear signal of admixture between groups sharing the two morphological patterns. However, these results cannot be extended to the interior part of the country (Amazonia and Central Brazil), given the lack of skeletal samples in these regions.


Subject(s)
Fossils , Human Migration/history , Models, Biological , Skull/anatomy & histology , Anthropology, Physical , Brazil , Cephalometry , History, Ancient , Humans , Multivariate Analysis
20.
Rev. luna azul ; (34): 195-213, Jan.-Jun. 2012. illus., tables
Article in Spanish | LILACS | ID: lil-659390

ABSTRACT

Particulate matter is one of the most studied atmospheric pollutants in the world. It is defined as the set of solid and/or liquid particles (except pure water) present in suspension in the atmosphere (Mészáros, 1999), which originate from a wide variety of natural or anthropogenic sources and possess a broad range of morphological, physical, chemical and thermodynamic properties. The presence of this pollutant in the atmosphere causes a variety of impacts on vegetation, materials and humans, among them the reduction of visibility in the atmosphere caused by the absorption and scattering of light (Chen, Ying & Kleeman, 2009). Furthermore, the presence of particulate matter is associated with an increased risk of death from cardiopulmonary causes in samples of adults (Pope, 2004). Besides measuring the concentration of this pollutant, it is necessary to assess its behavior in space and time, associating it with meteorological phenomena, chemical composition and origin, in order to guide control strategies and allow follow-up by the environmental authorities concerned. This article presents a literature review of the impacts associated with particles present in the atmosphere, the equipment for monitoring them, complementary technological applications and control technologies.


Subject(s)
Humans , Air Pollution , Pollutants Dispersion , Particulate Matter