Results 1 - 16 of 16
1.
Water Res ; 176: 115729, 2020 Jun 01.
Article in English | MEDLINE | ID: mdl-32240845

ABSTRACT

Recreational water quality guidelines protect the public from health risks associated with water recreation by helping to prevent unacceptable concentrations of pathogenic organisms in ambient water. However, illness risk is associated with both the concentration of pathogens in the water and the degree of contact with those pathogens. Different recreational activities can result in different levels of contact with ambient water containing water-borne pathogens. We conducted a systematic literature review and meta-analysis to evaluate risks of illness associated with different recreational activities and different levels of contact with ambient surface waters. We screened 8,618 potentially relevant studies for quantitative measures of risk using inclusion/exclusion criteria established in advance. We categorized recreational activities as swimming, sports-related contact, minimal contact, and sand contact. We combined relative risks using a random-effects meta-analysis for adverse health outcome categories representing gastrointestinal illness; respiratory illness; skin illness; eye, ear, nose, and throat illness; and cold/flu illness. We identified 92 studies meeting our inclusion criteria. Pooled relative risk estimates indicate significant elevation of gastrointestinal illness for swimming (2.19, 95% CI: 1.82, 2.63) and sports-related contact (2.69, 95% CI: 1.04, 6.92), and nonsignificant elevation of gastrointestinal illness for minimal contact (1.27, 95% CI: 0.74, 2.16). We also found significant elevation of respiratory illness for swimming (1.78, 95% CI: 1.38, 2.29) and sports-related contact (1.49, 95% CI: 1.00, 2.24), and no elevation of respiratory illness for minimal contact (0.90, 95% CI: 0.71, 1.14). This study suggests that the degree of water contact associated with different types of recreational activities is an important characteristic of the exposure pathway when assessing illness risk from recreation in ambient surface waters.


Subject(s)
Swimming Pools; Water Microbiology; Recreation; Risk Assessment; Swimming; Water Quality
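To make the pooling step concrete, the following is a minimal sketch of DerSimonian-Laird random-effects meta-analysis of relative risks, the general technique named in the abstract above. The study-level relative risks and confidence intervals are invented for illustration and are not data from the review.

```python
import numpy as np

# Hypothetical study-level relative risks and 95% CIs (illustrative only,
# not the actual studies pooled in the review).
rr = np.array([2.0, 1.6, 3.1, 2.4])
ci_low = np.array([1.2, 0.9, 1.8, 1.5])
ci_high = np.array([3.3, 2.8, 5.3, 3.8])

# Work on the log scale; SE recovered from the width of the 95% CI.
y = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1 / se**2                                  # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of the between-study variance tau^2.
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects weights and pooled relative risk with 95% CI.
w_re = 1 / (se**2 + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled RR = {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96 * se_re):.2f}, {np.exp(y_re + 1.96 * se_re):.2f})")
```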
2.
Environ Sci Technol ; 53(22): 13382-13389, 2019 Nov 19.
Article in English | MEDLINE | ID: mdl-31577425

ABSTRACT

Increasing interest in recycling water for potable purposes makes understanding the risks associated with potential acute microbial hazards important. We compared risks from de facto reuse, indirect potable reuse (IPR), and direct potable reuse (DPR) scenarios using a previously published quantitative microbial risk assessment methodology and literature review results. The de facto reuse simulation results are compared to a Cryptosporidium spp. database collected for the Long Term 2 Enhanced Surface Water Treatment Rule's Information Collection Rule (ICR) and to a literature review of norovirus (NoV) densities in ambient surface waters. The de facto simulation results with a treated wastewater effluent contribution of 1% in surface waters and a residence time of 30 days most closely match the ICR dataset. The de facto simulations also suggest that using NoV monitoring data from surface waters may overestimate microbial risks, compared to NoV data from raw sewage coupled with wastewater treatment reduction estimates. The predicted risks from IPR and DPR are consistently lower than those for the de facto reuse scenarios, assuming the advanced water treatment facilities (AWTFs) are operating within design specifications. These analyses provide insight into the microbial risks associated with various potable reuse scenarios and highlight the need to carefully consider drinking water treatment choices when wastewater effluent is a component of any drinking water supply.


Subject(s)
Drinking Water; Water Purification; Humans; Recycling; Wastewater; Water Supply
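As a rough illustration of the kind of quantitative microbial risk assessment compared above, here is a hedged Monte Carlo sketch for a de facto reuse scenario: Cryptosporidium in raw sewage, a 1% effluent contribution to the source water, generic treatment log reductions, and an exponential dose-response model. All parameter values below are assumptions for the sketch, not inputs or results from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000                                    # Monte Carlo iterations

# Illustrative inputs (not the paper's values).
raw_sewage = 10 ** rng.normal(1.0, 0.5, n)     # Cryptosporidium oocysts/L in raw sewage
effluent_fraction = 0.01                       # 1% treated effluent in the surface water
wwtp_log_removal = 2.0                         # removal across wastewater treatment
dwtp_log_removal = rng.uniform(4.0, 5.5, n)    # removal across drinking water treatment
ingestion_L = 1.0                              # litres of drinking water consumed per day
r = 0.09                                       # exponential dose-response parameter (assumed)

source_water = raw_sewage * 10 ** -wwtp_log_removal * effluent_fraction
finished_water = source_water * 10 ** -dwtp_log_removal
dose = finished_water * ingestion_L

daily_risk = 1 - np.exp(-r * dose)             # exponential dose-response model
annual_risk = 1 - (1 - daily_risk) ** 365

print(f"median annual infection risk: {np.median(annual_risk):.2e}")
print(f"95th percentile:              {np.percentile(annual_risk, 95):.2e}")
```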
3.
Water Res ; 153: 263-273, 2019 04 15.
Article in English | MEDLINE | ID: mdl-30735956

ABSTRACT

Coliphage have been proposed as indicators of fecal contamination in recreational waters because they better reflect the persistence of pathogenic viruses in the environment and through wastewater treatment than traditional fecal indicator bacteria. Herein, we conducted a systematic literature search of peer-reviewed publications to identify density data for somatic coliphage (SC) and male-specific coliphage (MSC) in raw wastewater and ambient waters. The literature review inclusion criteria included scope, study quality, and data availability. A non-parametric two-stage bootstrap analysis was used to estimate the coliphage distributions in raw wastewater and to account for geographic region and season. Additionally, two statistical methodologies were explored for developing coliphage density distributions in ambient waters, to account for the nondetects in the datasets. In raw wastewater, the analysis resulted in seasonal density distributions of SC (mean 6.5 log10 plaque-forming units (PFU)/L; 95% confidence interval (CI): 6.2-6.8) and MSC (mean 5.9 log10 PFU/L; 95% CI: 5.5-6.1). In ambient waters, 49% of MSC samples were nondetects, compared with less than 5% for SC. Overall distributional estimates of ambient coliphage densities were statistically higher for SC than for MSC (mean 3.4 and 1.0 log10 PFU/L, respectively). Distributions of coliphage in raw wastewater and ambient water will be useful for future microbial risk assessments.


Subject(s)
Viruses; Wastewater; Coliphages; Feces; Humans; Male; Water Microbiology
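The two-stage bootstrap named above can be sketched as resampling studies first and observations within each sampled study second. The toy study groups and log10 PFU/L values below are assumptions, not the paper's dataset, and the nondetect handling explored in the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: log10 PFU/L observations grouped by study (illustrative values).
studies = {
    "A": np.array([6.2, 6.5, 6.8, 6.4]),
    "B": np.array([6.0, 6.3, 6.1]),
    "C": np.array([6.7, 6.9, 6.6, 7.0, 6.8]),
}

def two_stage_bootstrap_mean(data, n_boot=5000):
    """Resample studies first (stage 1), then observations within each sampled study (stage 2)."""
    names = list(data)
    means = np.empty(n_boot)
    for b in range(n_boot):
        sampled = rng.choice(names, size=len(names), replace=True)          # stage 1
        obs = np.concatenate([
            rng.choice(data[s], size=len(data[s]), replace=True)            # stage 2
            for s in sampled
        ])
        means[b] = obs.mean()
    return means

boot = two_stage_bootstrap_mean(studies)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mean = {boot.mean():.2f} log10 PFU/L (95% CI {lo:.2f}-{hi:.2f})")
```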
4.
Environ Int ; 122: 168-184, 2019 01.
Article in English | MEDLINE | ID: mdl-30473382

ABSTRACT

The objective of this paper is to explain how to apply, interpret, and present the results of a new instrument to assess the risk of bias (RoB) in non-randomized studies (NRS) dealing with effects of environmental exposures on health outcomes. This instrument is modeled on the Risk Of Bias In Non-randomized Studies of Interventions (ROBINS-I) instrument. The RoB instrument for NRS of exposures assesses RoB through a standardized comparison with a randomized target experiment, rather than through an approach directed by study design. We provide specific guidance for the integral steps of developing a research question and target experiment, distinguishing issues of indirectness from RoB, making individual-study judgments, and performing and interpreting sensitivity analyses for RoB judgments across a body of evidence. We also present an approach for integrating the RoB assessments within the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) framework to assess the certainty of the evidence in a systematic review. Finally, we guide the reader through an overall assessment to support the rating of all domains that determine the certainty of a body of evidence using the GRADE approach.


Subject(s)
Bias; Environmental Exposure/analysis; Research Design/standards; Risk Assessment/standards; Humans; Random Allocation
5.
Environ Int ; 120: 382-387, 2018 11.
Article in English | MEDLINE | ID: mdl-30125855

ABSTRACT

Assessing the risk of bias (RoB) of individual studies is a critical part of determining the certainty of a body of evidence from non-randomized studies (NRS) that evaluate potential health effects of environmental exposures. The recently released RoB in NRS of Interventions (ROBINS-I) instrument has undergone careful development for health interventions. Using the fundamental design of ROBINS-I, which includes evaluating RoB against an ideal target trial, we explored developing a version of the instrument to evaluate RoB in exposure studies. During three sequential rounds of assessment, two or three raters (evaluators) independently applied ROBINS-I to studies from two systematic reviews and one case-study protocol that evaluated the relationship between environmental exposures and health outcomes. Feedback from raters, methodologists, and topic-specific experts informed important modifications to tailor the instrument to exposure studies. We identified the following areas of distinction for the modified instrument: terminology, formulation of the ideal target randomized experiment, guidance for cross-sectional studies and exposure assessment (both quality of the measurement method and concern for potential exposure misclassification), and evaluation of issues related to study sensitivity. Using the target experiment approach substantially affects how environmental and occupational health studies are considered in the Grading of Recommendations Assessment, Development and Evaluation (GRADE) evidence-synthesis framework.


Subject(s)
Bias; Environmental Exposure; Cross-Sectional Studies; Humans; Randomized Controlled Trials as Topic
6.
Curr Environ Health Rep ; 5(2): 283-292, 2018 06.
Article in English | MEDLINE | ID: mdl-29721701

ABSTRACT

PURPOSE OF REVIEW: With the increasing interest in recycling water for potable reuse purposes, it is important to understand the microbial risks associated with potable reuse. This review focuses on potable reuse systems that use high-level treatment and de facto reuse scenarios that include a quantifiable wastewater effluent component. RECENT FINDINGS: In this article, we summarize the published human health studies related to potable reuse, including both epidemiology studies and quantitative microbial risk assessments (QMRA). Overall, there have been relatively few health-based studies evaluating the microbial risks associated with potable reuse. Several microbial risk assessments focused on risks associated with unplanned (or de facto) reuse, while others evaluated planned potable reuse, such as indirect potable reuse (IPR) or direct potable reuse (DPR). The reported QMRA-based risks for planned potable reuse varied substantially, indicating there is a need for risk assessors to use consistent input parameters and transparent assumptions, so that risk results are easily translated across studies. However, the current results overall indicate that predicted risks associated with planned potable reuse scenarios may be lower than those for de facto reuse scenarios. Overall, there is a clear need to carefully consider water treatment train choices when wastewater is a component of the drinking water supply (whether de facto, IPR, or DPR). More data from full-scale water treatment facilities would be helpful to quantify levels of viruses in raw sewage and reductions across unit treatment processes for both culturable and molecular detection methods.


Subject(s)
Drinking Water/microbiology; Recycling; Wastewater/microbiology; Water Purification/methods; Water Supply/methods; Humans; Risk Assessment; Sewage/microbiology
7.
Water Res ; 128: 286-292, 2018 01 01.
Article in English | MEDLINE | ID: mdl-29107913

ABSTRACT

Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risks estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects.


Subject(s)
Risk Assessment/methods; Sewage/microbiology; Water Purification/standards; Benchmarking; Humans; Norovirus; Uncertainty; Viruses; Wastewater
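A simplified sketch of how a required virus log reduction can be backed out of an annual risk benchmark, in the spirit of the analysis above: compute the daily infection risk from the post-treatment dose with an exponential dose-response model, convert it to an annual risk, and scan candidate log reduction values. The raw wastewater density, ingestion volume, dose-response parameter, and benchmark below are assumptions, so the resulting value differs from the paper's 14-log finding.

```python
import numpy as np

# Illustrative inputs (assumed for the sketch, not the paper's values).
raw_virus = 10 ** 7.0      # virus genome copies per litre in raw wastewater
ingestion_L = 2.0          # litres of finished water consumed per day
k = 0.1                    # exponential dose-response parameter (assumed)
benchmark = 1e-4           # annual infection risk benchmark

def annual_risk(log_reduction):
    conc = raw_virus * 10 ** -log_reduction        # viruses/L after the treatment train
    daily = 1 - np.exp(-k * conc * ingestion_L)    # exponential dose-response
    return 1 - (1 - daily) ** 365

# Scan candidate log reduction values and report the smallest that meets the benchmark.
for lrv in np.arange(8.0, 16.5, 0.5):
    if annual_risk(lrv) <= benchmark:
        print(f"smallest log reduction meeting the benchmark: {lrv}")
        break
```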
8.
Water Res ; 111: 366-374, 2017 03 15.
Article in English | MEDLINE | ID: mdl-28110140

ABSTRACT

Human noroviruses (NoV) are a leading cause of recreational waterborne illnesses and are responsible for the majority of virus-associated gastrointestinal illnesses in the United States. We conducted a systematic review of published, peer-reviewed literature to identify NoV density data in wastewater influent and to provide an approach for developing pathogen density distributions using the NoV data. Literature review inclusion criteria included scope, study quality, and data availability. A non-parametric bootstrap statistical model was used to estimate the NoV distribution in wastewater influent. The approach accounts for heterogeneity in study-specific distribution curves, sampling locations, and sampling seasons, and provides a comprehensive representation of the data. Study results illustrate that pooling all of the available NoV data in a meta-analysis provides a more comprehensive understanding of the technical literature than could be gained from individual studies. The studies included in this analysis indicate a high density of NoV in wastewater influent (overall mean = 4.6 log10 genome copies (GC)/liter (L)), with a higher density of NoV genogroup (G) II (overall mean = 4.9 log10 GC/L) than of GI (overall mean = 4.4 log10 GC/L). The bootstrapping approach was also used to account for differences in seasonal and geographical occurrences of NoV GI and GII. The methods presented are reproducible and can be used to develop QMRA-ready density distributions for other viral pathogens in wastewater influent, effluent, and ambient waters. To our knowledge, our results are the first to quantitatively characterize these seasonal and geographic differences, which could be particularly useful for future risk assessments.


Subject(s)
Norovirus/genetics; Sewage/virology; Genotype; Humans; Wastewater/virology; Water Purification
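A minimal sketch of a season-balanced non-parametric bootstrap of influent NoV density, loosely following the pooling idea described above; the seasonal log10 GC/L values are invented, and the study-level weighting used in the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy NoV GII influent data (log10 GC/L) keyed by season (illustrative values).
seasons = {
    "winter": np.array([5.3, 5.1, 5.6, 5.4, 5.0]),
    "spring": np.array([4.9, 4.7, 5.1]),
    "summer": np.array([4.2, 4.5, 4.3, 4.4]),
    "fall":   np.array([4.8, 5.0, 4.6]),
}

def seasonal_bootstrap(data, n_boot=5000):
    """Bootstrap the mean within each season, then average the seasonal means
    so that heavily sampled seasons do not dominate the pooled estimate."""
    pooled = np.empty(n_boot)
    for b in range(n_boot):
        seasonal_means = [
            rng.choice(v, size=len(v), replace=True).mean() for v in data.values()
        ]
        pooled[b] = np.mean(seasonal_means)
    return pooled

boot = seasonal_bootstrap(seasons)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"season-balanced mean = {boot.mean():.2f} log10 GC/L (95% CI {lo:.2f}-{hi:.2f})")
```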
9.
J Expo Sci Environ Epidemiol ; 27(3): 235-243, 2017 05.
Article in English | MEDLINE | ID: mdl-27901016

ABSTRACT

Increased risks of lung and bladder cancer have been observed in populations exposed to high levels of inorganic arsenic. However, studies at lower exposures (i.e., less than 100 µg/L in water) have shown inconsistent results. We therefore conducted an ecological analysis of the association between historical drinking water arsenic concentrations and lung and bladder cancer incidence in U.S. counties. We used drinking water arsenic concentrations measured by the U.S. Geological Survey and state agencies in the 1980s and 1990s as proxies for historical exposures in counties where public groundwater systems and private wells are important sources of drinking water. Relationships between arsenic levels and cancer incidence in 2006-2010 were explored by Poisson regression analyses, adjusted for groundwater dependence and important demographic covariates. The median and 95th percentile county mean arsenic concentrations were 1.5 and 15.4 µg/L, respectively. Water arsenic concentrations were significantly and positively associated with female and male bladder cancer and with female lung cancer. Our findings support an association between low water arsenic concentrations and lung and bladder cancer incidence in the United States. However, the limitations of the ecological study design suggest caution in interpreting these results.


Subject(s)
Arsenic/adverse effects; Drinking Water/adverse effects; Lung Neoplasms/chemically induced; Lung Neoplasms/epidemiology; Urinary Bladder Neoplasms/chemically induced; Urinary Bladder Neoplasms/epidemiology; Aged; Aged, 80 and over; Arsenic/analysis; Databases, Factual; Drinking Water/analysis; Drinking Water/chemistry; Environmental Exposure/adverse effects; Environmental Exposure/analysis; Environmental Monitoring; Female; Humans; Incidence; Male; Middle Aged; Regression Analysis; SEER Program; Sex Distribution; United States/epidemiology
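The county-level model described above can be sketched as a Poisson regression of cancer counts on mean water arsenic with a log-population offset. The simulated data, covariates, and coefficients below are placeholders, not the study's dataset or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)

# Simulated county-level data (illustrative; not the study's dataset).
n = 200
df = pd.DataFrame({
    "arsenic_ugL": rng.lognormal(mean=0.5, sigma=1.0, size=n),  # mean water arsenic, µg/L
    "pct_groundwater": rng.uniform(0, 1, n),
    "pct_over_65": rng.uniform(0.10, 0.25, n),
    "population": rng.integers(5_000, 500_000, n),
})
true_rate = 0.0004 * np.exp(0.01 * df["arsenic_ugL"] + 0.5 * df["pct_over_65"])
df["cases"] = rng.poisson(true_rate * df["population"])

# Poisson regression of incident cases on water arsenic with a log-population
# offset and county-level covariates, analogous in spirit to the ecological model.
model = smf.glm(
    "cases ~ arsenic_ugL + pct_groundwater + pct_over_65",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["population"]),
).fit()
print(model.summary().tables[1])
```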
10.
Water Res ; 66: 254-264, 2014 Dec 01.
Article in English | MEDLINE | ID: mdl-25222329

ABSTRACT

We simulate the influence of multiple sources of enterococci (ENT), as faecal indicator bacteria (FIB), in recreational water bodies on potential human health risk by considering waters impacted by human and animal sources, human and non-pathogenic sources, and animal and non-pathogenic sources. We illustrate that risks vary with the proportion of culturable ENT in water bodies derived from these sources and estimate corresponding ENT densities that yield the same level of health protection that the recreational water quality criteria in the United States seek (the benchmark risk). The benchmark risk is based on epidemiological studies conducted in water bodies predominantly impacted by human faecal sources. The key result is that risks from mixed sources are driven predominantly by the proportion of the contamination source with the greatest ability to cause human infection (potency), not necessarily by the greatest source(s) of FIB. Predicted risks from exposures to mixtures comprising approximately 30% ENT from human sources were up to 50% lower than the risks expected from purely human sources when contamination is recent and ENT levels are at the current water quality criteria level (35 CFU per 100 mL). For human/non-pathogenic, human/gull, human/pig, and human/chicken faecal mixtures with a relatively low human contribution, the predicted culturable enterococci densities that correspond to the benchmark risk are substantially greater than the current water quality criteria values. These findings are important because they highlight the potential applicability of site-specific water quality criteria for waters that are predominantly unimpacted by human sources.


Subject(s)
Bacteria; Feces/microbiology; Water Microbiology; Water Quality; Animals; Enterococcus; Environmental Monitoring; Escherichia coli O157; Gastrointestinal Diseases/microbiology; Humans; Probability; Risk Assessment; Swine; United States; Water Pollutants/analysis; Water Pollution; Water Supply
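A toy illustration of the mixture logic described above: if risk scales roughly linearly with each source's share of the culturable enterococci, the mixture risk is a potency-weighted sum, and the enterococci density matching a fixed benchmark scales inversely with that weighted potency. The per-source risks below are placeholders, not the paper's QMRA outputs.

```python
# Toy illustration of mixed-source risk (placeholder numbers, not the paper's QMRA outputs).

# Illness risk per swimming event when enterococci at 35 CFU/100 mL come
# entirely from a single source ("potency" of that source's contamination).
risk_at_35 = {"human": 0.036, "gull": 0.004, "pig": 0.020, "non_pathogenic": 0.0}

benchmark_risk = 0.036            # benchmark illness risk underlying the criteria

def mixture_risk(proportions, enterococci=35.0):
    """Risk when each source contributes a share of the culturable enterococci,
    assuming risk scales linearly with that source's share of the density."""
    return sum(
        share * risk_at_35[src] * (enterococci / 35.0)
        for src, share in proportions.items()
    )

mix = {"human": 0.3, "gull": 0.7}                 # 30% human, 70% gull enterococci
r35 = mixture_risk(mix)
print(f"risk at 35 CFU/100 mL for this mixture: {r35:.4f}")

# Enterococci density giving the benchmark risk under the linear approximation.
print(f"benchmark-equivalent density: {35.0 * benchmark_risk / r35:.0f} CFU/100 mL")
```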
11.
ISPRS Int J Geoinf ; 3(2): 713-731, 2014 Jun.
Article in English | MEDLINE | ID: mdl-36405525

ABSTRACT

In July 2002, lightning strikes ignited over 250 fires in Quebec, Canada, destroying over one million hectares of forest. The smoke plume generated by the fires had a major impact on air quality across the east coast of the U.S. Using data from the Medicare National Claims History File and the U.S. Environmental Protection Agency (EPA) national air pollution monitoring network, we evaluated the health impact of smoke exposure on 5.9 million elderly people (ages 65+) in the Medicare population in 81 counties in 11 northeastern and Mid-Atlantic states of the U.S. We estimated differences in exposure to ambient concentrations of PM2.5 (airborne particulate matter with aerodynamic diameter ≤2.5 µm) and in hospitalizations for cardiovascular, pulmonary, and injury outcomes before and during the smoke episode. We found associated increases of 49.6% (95% confidence interval (CI): 29.8, 72.3) and 64.9% (95% CI: 44.3, 88.5) in the rates of hospitalization for respiratory and cardiovascular diagnoses, respectively, when the smoke plume was present compared with before the smoke plume arrived. Our study suggests that rapid increases in PM2.5 concentrations resulting from wildfire smoke can affect the health of elderly populations thousands of kilometers from the fires.
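A small sketch of the before/during rate comparison underlying estimates like those above: a rate ratio from admission counts and person-time, with an approximate Poisson confidence interval. The counts are hypothetical, not the Medicare data.

```python
import numpy as np

# Hypothetical hospitalization counts and person-days before vs. during a smoke
# episode (illustrative; not the Medicare counts used in the study).
before = {"admissions": 1200, "person_days": 5_900_000 * 7}
during = {"admissions": 1850, "person_days": 5_900_000 * 7}

rate_before = before["admissions"] / before["person_days"]
rate_during = during["admissions"] / during["person_days"]
rr = rate_during / rate_before

# Approximate 95% CI for the rate ratio from the Poisson variance of log(RR).
se_log_rr = np.sqrt(1 / before["admissions"] + 1 / during["admissions"])
lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)

print(f"rate ratio {rr:.2f}, i.e. a {100 * (rr - 1):.1f}% increase "
      f"(95% CI {100 * (lo - 1):.1f}% to {100 * (hi - 1):.1f}%)")
```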

12.
Environ Sci Process Impacts ; 15(4): 721-9, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23450296

ABSTRACT

Using exhaled breath condensate (EBC) as a biological medium for analysis of biomarkers of exposure may facilitate the understanding of inhalation exposures. In this study, we present method validation for the collection of EBC and the analysis of metals in EBC. The collection method was designed for use in a small-scale longitudinal study, with the goal of improving reproducibility while maintaining economic feasibility. We incorporated an RTube with additional components as an assembly and trained subjects to breathe into the apparatus. EBC was collected repeatedly (10 times within 7 days) from 8 healthy adult subjects with no known elevated exposures to Mn, Cr, Ni, or Cd, and analyzed for these metals via ICP-MS. Method detection limits were obtained by mimicking the process of EBC collection with ultrapure water; 46-62% of samples fell below the method detection limit. EBC metal concentrations were statistically significantly associated (p < 0.05) with room temperature and relative humidity during collection, as well as with the gender of the subject. The geometric mean EBC metal concentrations in our unexposed subjects were 0.57 µg Mn per L, 0.25 µg Cr per L, 0.87 µg Ni per L, and 0.14 µg Cd per L. The overall standard deviation was greater than the mean estimate, and the major source of variability in EBC metal concentrations was fluctuation in subjects' measurements over time rather than differences between subjects. These results suggest that measurement and control of EBC collection and analytical parameters are critical to the interpretation of EBC metal measurements. In particular, rigorous estimation of method detection limits for metals in EBC provides a more thorough evaluation of accuracy.


Subject(s)
Breath Tests/instrumentation; Cadmium/analysis; Chromium/analysis; Manganese/analysis; Nickel/analysis; Adult; Equipment Design; Female; Humans; Limit of Detection; Male; Reproducibility of Results
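Assuming the common replicate-based approach to method detection limits (a one-sided 99th-percentile Student's t multiplier times the standard deviation of replicate low-level measurements), a detection-limit estimate of the kind emphasized above can be sketched as follows; the blank values are invented.

```python
import numpy as np
from scipy import stats

# Seven replicate "blank" measurements collected by running ultrapure water
# through the EBC collection apparatus (illustrative values, µg/L).
blank_mn = np.array([0.12, 0.18, 0.09, 0.15, 0.11, 0.20, 0.14])

n = len(blank_mn)
t99 = stats.t.ppf(0.99, df=n - 1)        # one-sided 99th-percentile Student's t
mdl = t99 * blank_mn.std(ddof=1)         # replicate-based method detection limit

print(f"estimated MDL for Mn in EBC: {mdl:.3f} µg/L")
```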
13.
Environ Res ; 118: 137-44, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22749113

ABSTRACT

Previous epidemiological studies provide conflicting evidence as to whether environmental perchlorate exposure can affect levels of circulating thyroid hormones in the general population. We investigated the statistical relationships between biomarkers of perchlorate exposure and serum thyroid hormone levels in 2007-2008 National Health and Nutrition Examination Survey (NHANES) subjects. Generalized additive mixed models (GAMMs) were developed to estimate the relationships between T3 and T4 levels and creatinine-adjusted urinary perchlorate excretion. The models included covariates related to gender, age, ethnicity, income, smoking status, prescription medications, and biomarkers of exposure to other goitrogenic ions and phthalate ester metabolites. Where necessary, relationships between hormone levels and covariates were represented as nonlinear smoothed terms. The effect of the hypothalamic-pituitary-thyroid (HPT) axis on serum hormone levels was taken into account by including a term for thyroid-stimulating hormone (TSH) in the models. Regression coefficients for perchlorate were significant and negative in GAMMs predicting total T4 and free T3 levels in males, in females, and in the entire cohort when phthalate ester biomarkers and other covariates were included. Coefficients for perchlorate were also significant and negative in regressions predicting free T4 levels in males and in the entire study population. The consistency of these results suggests that HPT axis controls do not completely compensate for small changes in thyroid hormone levels associated with perchlorate and phthalate ester exposures.


Subject(s)
Biomarkers/blood; Environmental Exposure; Perchlorates/toxicity; Thyroid Hormones/blood; Adult; Female; Humans; Male; Middle Aged; Nutrition Surveys
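A hedged sketch of an additive model with a linear perchlorate term and penalized spline smooths, loosely in the spirit of the GAMMs described above (the mixed-model and survey-design aspects are omitted). The simulated variables and coefficients are assumptions, not NHANES data.

```python
import numpy as np
import pandas as pd
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(13)

# Simulated NHANES-like data (illustrative; not the 2007-2008 survey records).
n = 1500
data = pd.DataFrame({
    "log_perchlorate": rng.normal(1.0, 0.6, n),  # log creatinine-adjusted urinary perchlorate
    "age": rng.uniform(20, 80, n),
    "log_tsh": rng.normal(0.4, 0.5, n),
    "male": rng.integers(0, 2, n),
})
data["total_t4"] = (
    8.0
    - 0.15 * data["log_perchlorate"]
    - 0.01 * (data["age"] - 50)
    - 0.30 * data["log_tsh"]
    + rng.normal(0, 0.8, n)
)

# Additive model: linear terms for perchlorate and gender, penalized B-spline
# smooths for age and TSH.
smooths = BSplines(data[["age", "log_tsh"]], df=[6, 6], degree=[3, 3])
gam = GLMGam.from_formula(
    "total_t4 ~ log_perchlorate + male", data=data, smoother=smooths, alpha=[1.0, 1.0]
).fit()
print(gam.summary())
```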
14.
Men Masc ; 14(5), 2011 Nov 01.
Article in English | MEDLINE | ID: mdl-24187483

ABSTRACT

Data were drawn from 845 males in the National Survey of Adolescent Males who were initially aged 15-17 and were followed up 2.5 and 4.5 years later, into their early twenties. Mixed-effects regression models (MRM) and semiparametric trajectory analyses (STA) modeled patterns of change in masculinity attitudes at the individual and group levels, guided by gender intensification theory and cognitive-developmental theory. Overall, men's masculinity attitudes became significantly less traditional between middle adolescence and early adulthood. In MRM analyses using time-varying covariates, maintaining paternal coresidence and continuing to have first sex in uncommitted heterosexual relationships were significantly associated with masculinity attitudes remaining relatively traditional. The STA modeling identified three distinct patterns of change in masculinity attitudes. A traditional-liberalizing trajectory of masculinity attitudes was most prevalent, followed by traditional-stable and nontraditional-stable trajectories. Implications for gender intensification and cognitive-developmental approaches to masculinity attitudes are discussed.

15.
Epidemiology ; 19(2): 209-16, 2008 Mar.
Article in English | MEDLINE | ID: mdl-18223484

ABSTRACT

BACKGROUND: The American Cancer Society study and the Harvard Six Cities study are 2 landmark cohort studies for estimating the chronic effects of fine particulate air pollution (PM2.5) on mortality. Using Medicare data, we assessed the association of PM2.5 with mortality for the same locations included in these studies. METHODS: We estimated the chronic effects of PM2.5 on mortality for the period 2000-2002 using mortality data for cohorts of Medicare participants and average PM2.5 levels from monitors in the same counties included in the 2 studies. We estimated mortality risk associated with air pollution adjusting for individual-level (age and sex) and area-level covariates (education, income level, poverty, and employment). We controlled for potential confounding by cigarette smoking by including standardized mortality ratios for lung cancer and chronic obstructive pulmonary disease. RESULTS: Using the Medicare data, we estimated that a 10 µg/m³ increase in the yearly average PM2.5 concentration is associated with 10.9% (95% confidence interval = 9.0-12.8) and with 20.8% (14.8-27.1) increases in all-cause mortality for the American Cancer Society and Harvard Six Cities study counties, respectively. The estimates are somewhat higher than those reported by the original investigators. CONCLUSION: Although Medicare data lack information on some potential confounding factors, we estimated risks similar to those in the previously published reports, which incorporated more extensive information on individual-level confounders. We propose that the Medicare files can be used to construct on-going cohorts for tracking the risk of air pollution over time.


Subject(s)
Air Pollutants/adverse effects; Mortality; Particulate Matter/adverse effects; Aged; Aged, 80 and over; American Cancer Society; Cohort Studies; Confounding Factors, Epidemiologic; Environmental Monitoring; Epidemiological Monitoring; Female; Geography; Humans; Male; Medicare; Particle Size; Regression Analysis; United States/epidemiology
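For readers translating coefficients into the reported effect sizes, the percent increase per 10 µg/m³ quoted above is obtained from a log-linear rate coefficient as shown below; the coefficient and standard error here are illustrative, not the study's fitted values.

```python
import numpy as np

# Converting a log-linear mortality rate coefficient into the "% increase per
# 10 µg/m³" form quoted in the abstract (illustrative values).
beta_per_ugm3 = 0.0104          # log rate ratio per 1 µg/m³ PM2.5
se_per_ugm3 = 0.0009

delta = 10.0                    # report effects per 10 µg/m³ increment
pct = 100 * (np.exp(beta_per_ugm3 * delta) - 1)
lo = 100 * (np.exp((beta_per_ugm3 - 1.96 * se_per_ugm3) * delta) - 1)
hi = 100 * (np.exp((beta_per_ugm3 + 1.96 * se_per_ugm3) * delta) - 1)
print(f"{pct:.1f}% increase per 10 µg/m³ (95% CI {lo:.1f}%, {hi:.1f}%)")
```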
16.
J Toxicol Environ Health A ; 68(13-14): 1191-205, 2005.
Article in English | MEDLINE | ID: mdl-16024498

ABSTRACT

Time-series and cohort studies of air pollution and human health have greatly advanced our understanding of the effects of air pollution since the earliest studies. The availability of large national databases and progress in computational tools and statistical methods have made possible the estimation of national average pollution effects and the exploration of potential sources of heterogeneity in those effects across countries or regions. Interpretation of the findings needs to account for several challenges, including confounding and the resolution of seemingly conflicting results from time-series and cohort studies. This article presents an overview of the time-series and cohort study approaches for estimating the relative risk of mortality from particulate air pollution and discusses the statistical issues and challenges inherent in each. We also discuss policy-relevant summaries in air pollution epidemiology, approaches for estimating the impact of particulate matter on mortality from time-series and cohort studies, and research opportunities under the National Medicare Cohort Study (NMCS).


Subject(s)
Air Pollution/adverse effects; Cohort Studies; Longitudinal Studies; Mortality; Humans; Public Policy; Research Design; United States