Results 1 - 20 of 50
1.
Water Res ; 264: 122216, 2024 Oct 15.
Article in English | MEDLINE | ID: mdl-39146850

ABSTRACT

In light of increasingly diverse greywater reuse applications, this study proposes risk-based log-removal targets (LRTs) to aid the selection of treatment trains for greywater recycling at different collection scales, including appliance-scale reuse of individual greywater streams. An epidemiology-based model was used to simulate the concentrations of prevalent and treatment-resistant reference pathogens (protozoa: Giardia and Cryptosporidium spp.; bacteria: Salmonella and Campylobacter spp.; viruses: rotavirus, norovirus, adenovirus, and Coxsackievirus B5) in the greywater streams for collection scales of 5, 100, and 1000 people. Using quantitative microbial risk assessment (QMRA), we calculated LRTs to meet a health benchmark of 10⁻⁴ infections per person per year over 10,000 Monte Carlo iterations. LRTs were highest for norovirus at the 5-person scale and for adenovirus at the 100- and 1000-person scales. Example treatment trains were designed to meet the 95% quantiles of the LRTs. Treatment trains consisted of an aerated membrane bioreactor, chlorination, and, if required, UV disinfection. In most cases, rotavirus, norovirus, adenovirus and Cryptosporidium spp. determined the overall treatment train requirements. Norovirus was most often critical for dimensioning the chlorination (concentration × time values), and adenovirus determined the required UV dose. Smaller collection scales did not generally allow for simpler treatment trains because of the high LRTs associated with viruses, with the exception of recirculating washing machines and handwashing stations. Similarly, treating greywater sources individually resulted in lower LRTs, but the lower required LRTs nevertheless did not generally allow for simpler treatment trains. For instance, LRTs for a recirculating washing machine were around 3 log units lower than LRTs for indoor reuse of combined greywater (1000-person scale), but both scenarios necessitated treatment with a membrane bioreactor, chlorination and UV disinfection. However, simpler treatment trains may be feasible for small-scale and appliance-scale reuse if: (i) less conservative health benchmarks are used for household-based systems, considering the reduced relative importance of treated greywater in pathogen transmission within households, and (ii) higher log-removal values (LRVs) can be validated for unit processes, enabling simpler treatment trains for a larger number of appliance-scale reuse systems.
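A minimal sketch of the risk-based LRT calculation described above, assuming an exponential dose-response model and illustrative raw-greywater concentrations, ingestion volumes, and dose-response parameter (none of these values are taken from the study):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                        # Monte Carlo iterations, as in the study

# Illustrative assumptions (not the paper's values)
log10_conc = rng.normal(3.0, 1.0, n)              # raw greywater pathogen conc., log10 org/L
volume_l = rng.lognormal(np.log(0.001), 0.5, n)   # ingested volume per exposure event, L
events_per_year = 365
r = 0.2                                           # exponential dose-response parameter (assumed)
annual_benchmark = 1e-4                           # tolerable infections per person per year

# Tolerable per-event risk, and the dose that produces it
per_event_risk = 1 - (1 - annual_benchmark) ** (1 / events_per_year)
tolerable_dose = -np.log(1 - per_event_risk) / r  # invert P = 1 - exp(-r * dose)

# Log-removal target: reduction needed from the raw dose down to the tolerable dose
raw_dose = 10 ** log10_conc * volume_l
lrt = np.maximum(np.log10(raw_dose / tolerable_dose), 0)

print(f"95th-percentile LRT: {np.percentile(lrt, 95):.1f} log10 units")
```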


Subject(s)
Recycling , Water Purification , Water Microbiology , Waste Disposal, Fluid/methods , Cryptosporidium/isolation & purification , Giardia/isolation & purification , Disinfection/methods
2.
J Environ Sci (China) ; 146: 186-197, 2024 Dec.
Article in English | MEDLINE | ID: mdl-38969447

ABSTRACT

As an important means of addressing water shortages, reclaimed water has been widely used for landscape water supply. However, with the emergence of large-scale epidemic diseases such as SARS, avian influenza and COVID-19 in recent years, people are increasingly concerned about the public health safety of reclaimed water discharged into landscape water, especially the pathogenic microorganisms it carries. In this study, the water quality and microorganisms of the Old Summer Palace, a landscape water body with reclaimed water as its only replenishment source, were tracked through long-term dynamic monitoring, and the health risks of indicator microorganisms were analyzed using quantitative microbial risk assessment (QMRA). The concentrations of the indicator microorganisms Enterococcus (ENT), Escherichia coli (EC) and fecal coliforms (FC) generally showed an upward trend along the direction of water flow and increased by more than 0.6 log at the end of the flow path. Concentrations of indicator microorganisms were higher in summer and autumn than in spring, and were positively correlated with COD. Further analysis suggested that the increased concentrations of indicator microorganisms also led to increased health risks, which were more than 30% higher in other areas of the park than in the water inlet area and therefore require special attention. In addition, the (water) surface operation exposure pathway carried much higher health risks than the other pathways, and people in related occupations are advised to take precautions to reduce these risks.


Subject(s)
Water Microbiology , Risk Assessment , Water Quality , Escherichia coli/isolation & purification , Water Supply , Environmental Monitoring , Enterococcus/isolation & purification , Humans
3.
Water Res ; 259: 121852, 2024 Aug 01.
Article in English | MEDLINE | ID: mdl-38889662

ABSTRACT

The purpose of this study was to evaluate the performance of HF183 Bacteroides for estimating pathogen exposures during recreational water activities. We compared the use of Bacteroides-based exposure assessment to exposure assessment that relied on pathogen measurements. We considered two types of recreational water sites: those impacted by combined sewer overflows (CSOs) and those not impacted by CSOs. Samples from CSO-impacted and non-CSO-impacted urban creeks were analysed by quantitative polymerase chain reaction (qPCR) for HF183 Bacteroides and eight human gastrointestinal pathogens. Exposure assessment was conducted two ways for each type of site (CSO-impacted vs. non-CSO impacted): 1) by estimating pathogen concentrations from HF183 Bacteroides concentrations using published ratios of HF183 to pathogens in sewage and 2) by estimating pathogen concentrations from qPCR measurements. QMRA (quantitative microbial risk assessment) was then conducted for swimming, wading, and fishing exposures. Overall, mean risk estimates varied from 0.27 to 53 illnesses per 1,000 recreators depending on exposure assessment, site, activity, and norovirus dose-response model. HF183-based exposure assessment identified CSO-impacted sites as higher risk, and the recommended HF183 risk-based threshold of 525 genomic copies per 100 mL was generally protective of public health at the CSO-impacted sites but was not as protective at the non-CSO-impacted sites. In the context of our urban watershed, HF183-based exposure assessment over- and under-estimated risk relative to exposure assessment based on pathogen measurements, and the etiology of predicted pathogen-specific illnesses differed significantly. Across all sites, the HF183 model overestimated risk for norovirus, adenovirus, and Campylobacter jejuni, and it underestimated risk for E. coli and Cryptosporidium. To our knowledge, this study is the first to directly compare health risk estimates using HF183 and empirical pathogen measurements from the same waterways. Our work highlights the importance of site-specific hazard identification and exposure assessment to decide whether HF183 is applicable for monitoring risk.
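A sketch of the HF183-based exposure-assessment step described above; the HF183-to-norovirus ratio, the measured HF183 concentration, and the swimming ingestion volume below are placeholders, not the study's values:

```python
# Illustrative HF183-based exposure assessment (placeholder values throughout)

hf183_per_100ml = 525.0          # measured HF183 in surface water, gene copies per 100 mL
hf183_to_noro_ratio = 1e3        # hypothetical ratio of HF183 to norovirus in raw sewage

# Estimated pathogen concentration in the water sample
noro_per_100ml = hf183_per_100ml / hf183_to_noro_ratio
noro_per_ml = noro_per_100ml / 100

# Exposure: assumed swimming ingestion volume per event (mL)
ingested_ml = 32.0
dose = noro_per_ml * ingested_ml
print(f"Estimated norovirus dose per swim: {dose:.3g} genome copies")
```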


Subject(s)
Bacteroides , Recreation , Water Microbiology , Risk Assessment , Bacteroides/isolation & purification , Bacteroides/genetics , Humans , Cities , Norovirus , Sewage/microbiology , Environmental Monitoring/methods
4.
Sci Total Environ ; 932: 172667, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38677423

ABSTRACT

Urban rivers provide an excellent opportunity for water recreation. This study probabilistically assessed health risks associated with water recreation in urban rivers in the Bitan Scenic Area, Taiwan, by employing quantitative microbial risk assessment and disability-adjusted life years (DALYs). Moreover, the effects of urbanization on the health risks of river recreation induced by waterborne pathogenic Escherichia coli (E. coli) were investigated. First, data on river E. coli levels were collected in both the Bitan Scenic Area and the upstream river section, and model parameters were obtained through a questionnaire administered to river recreationists. Monte Carlo simulation was then employed to address parameter uncertainty. Finally, DALYs were calculated to quantify the cumulative effects in terms of years of life lost and years lived with disability. The results indicated that the 90% confidence intervals for the disease burden (DB) were 0.2-74.1 × 10⁻⁶, 0.01-94.0 × 10⁻⁶, and 0.3-128.9 × 10⁻⁶ DALY per person per year (pppy) for canoeing, swimming, and fishing, respectively, in the Bitan Scenic Area. Furthermore, urbanization near the Bitan Scenic Area approximately doubled the DB risks to river recreationists relative to the upstream rural areas. At the 95th percentile, the DB risks exceeded the tolerances recommended by the World Health Organization (1 × 10⁻⁶) and the U.S. Environmental Protection Agency (1 × 10⁻⁴). The findings suggest that the simultaneous implementation of effluent sewer systems and best management practices can reduce health risks to river recreationists by at least half, bringing DALY levels below 1 × 10⁻⁴ or even 1 × 10⁻⁵ pppy.
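A sketch of how a disease burden in DALY per person per year (pppy) is assembled from an annual infection probability; the probability of illness given infection and the DALY loss per case are assumed, illustrative values rather than the study's parameters:

```python
# Illustrative disease-burden calculation (assumed values, not from the paper)
p_inf_annual = 5e-4        # annual probability of infection from recreation
p_ill_given_inf = 0.3      # probability of illness given infection
daly_per_case = 1e-2       # DALYs lost per illness case (years)

disease_burden_pppy = p_inf_annual * p_ill_given_inf * daly_per_case
print(f"Disease burden: {disease_burden_pppy:.2e} DALY pppy")
print("Exceeds WHO 1e-6 benchmark" if disease_burden_pppy > 1e-6 else "Within WHO 1e-6 benchmark")
```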


Subject(s)
Escherichia coli , Recreation , Rivers , Urbanization , Risk Assessment , Rivers/microbiology , Humans , Taiwan/epidemiology , Escherichia coli/isolation & purification , Disability-Adjusted Life Years , Water Microbiology , Quality-Adjusted Life Years
5.
Appl Environ Microbiol ; 90(3): e0162923, 2024 03 20.
Article in English | MEDLINE | ID: mdl-38335112

ABSTRACT

We used quantitative microbial risk assessment to estimate ingestion risk for intI1, erm(B), sul1, tet(A), tet(W), and tet(X) in private wells contaminated by human and/or livestock feces. Genes were quantified with five human-specific and six bovine-specific microbial source-tracking (MST) markers in 138 well-water samples from a rural Wisconsin county. Daily ingestion risk (probability of swallowing ≥1 gene) was based on daily water consumption and a Poisson exposure model. Calculations were stratified by MST source and soil depth over the aquifer where wells were drilled. Relative ingestion risk was estimated using wells with no MST detections and >6.1 m soil depth as a referent category. Daily ingestion risk varied from 0 to 8.8 × 10⁻¹ by gene and fecal source (i.e., human or bovine). The estimated number of residents ingesting target genes from private wells varied from 910 (tet(A)) to 1,500 (intI1 and tet(X)) per day out of 12,000 total. Relative risk of tet(A) ingestion was significantly higher in wells with MST markers detected, including wells with ≤6.1 m soil depth contaminated by bovine markers (2.2 [90% CI: 1.1-4.7]), wells with >6.1 m soil depth contaminated by bovine markers (1.8 [1.002-3.9]), and wells with ≤6.1 m soil depth contaminated by bovine and human markers simultaneously (3.1 [1.7-6.5]). Antibiotic resistance genes (ARGs) were not necessarily present in viable microorganisms, and ingestion is not directly associated with infection. However, results illustrate relative contributions of human and livestock fecal sources to ARG exposure and highlight rural groundwater as a significant point of exposure. IMPORTANCE: Antibiotic resistance is a global public health challenge with well-known environmental dimensions, but quantitative analyses of the roles played by various natural environments in transmission of antibiotic resistance are lacking, particularly for drinking water. This study assesses risk of ingestion for several antibiotic resistance genes (ARGs) and the class 1 integron gene (intI1) in drinking water from private wells in a rural area of northeast Wisconsin, United States. Results allow comparison of drinking water as an exposure route for antibiotic resistance relative to other routes like food and recreational water. They also enable a comparison of the importance of human versus livestock fecal sources in the study area. Our study demonstrates the previously unrecognized importance of untreated rural drinking water as an exposure route for antibiotic resistance and identifies bovine fecal material as an important exposure factor in the study setting.
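The Poisson exposure model named above reduces to a short calculation; a sketch with an assumed gene concentration and daily water consumption (not the study's measured distributions):

```python
import numpy as np

# Assumed inputs (illustrative, not the study's measurements)
gene_conc_per_l = 0.5        # target gene copies per litre of well water
daily_intake_l = 1.2         # litres of untreated well water consumed per day

# Poisson exposure: probability of swallowing at least one gene copy in a day
expected_copies = gene_conc_per_l * daily_intake_l
p_ingest_at_least_one = 1 - np.exp(-expected_copies)
print(f"Daily ingestion risk: {p_ingest_at_least_one:.2f}")
```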


Subject(s)
Anti-Bacterial Agents , Drinking Water , Animals , Humans , Cattle , Anti-Bacterial Agents/pharmacology , Genes, Bacterial , Livestock , Feces , Soil , Risk Assessment , Drug Resistance, Microbial/genetics , Eating
6.
J Environ Manage ; 354: 120331, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38368808

ABSTRACT

Pathogens are ubiquitously detected in various natural and engineered water systems, posing potential threats to public health. However, it remains unclear which human-accessible waters are hotspots for pathogens, how pathogens are transmitted to these waters, and what level of health risk is associated with pathogens in these environments. This review summarizes the contamination levels of pathogens in the five water systems accessible to humans (natural water, drinking water, recreational water, wastewater, and reclaimed water). We then describe the pathways, influencing factors and simulation models of pathogen transmission and survival. Further, we compare the health risk levels of various pathogens through quantitative microbial risk assessment (QMRA) and assess the limitations of water-associated QMRA applications. Pathogen levels in wastewater are consistently higher than in other water systems, with no significant variation for Cryptosporidium spp. among the five water systems. Hydraulic conditions primarily govern the transmission of pathogens into human-accessible waters, while environmental factors such as temperature affect pathogen survival. The median and mean computed public health risk levels posed by pathogens consistently surpass safety thresholds, particularly in recreational waters. Despite the highest pathogen levels being found in wastewater, the calculated health risk there is significantly lower than in other water systems. Beyond pathogen concentration, variables such as exposure mode, extent, and frequency are also crucial factors influencing public health risk in water systems. This review provides valuable insights for more accurate assessment and comprehensive management of public health risk in human-accessible water environments.


Subject(s)
Cryptosporidiosis , Cryptosporidium , Drinking Water , Humans , Wastewater , Computer Simulation , Risk Assessment , Water Microbiology
7.
Water Res ; 253: 121197, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38341968

ABSTRACT

The membrane bioreactor (MBR) process generally offers better wastewater treatment than conventional activated sludge (CAS) treatment. However, the difference in their efficacy of virus reduction remains unknown. To investigate this, we monitored virus concentrations before and after MBR and CAS processes over 2 years. Concentrations of norovirus genotypes I and II (NoV GI and GII), aichivirus (AiV), F-specific RNA phage genotypes I, II, and III (GI-, GII-, and GIII-FRNAPHs), and pepper mild mottle virus (PMMoV) were measured by a quantitative polymerase chain reaction (qPCR) method at two municipal wastewater treatment plants (WWTPs A and B) in Japan. Virus concentration datasets containing left-censored data were estimated using both maximum likelihood estimation (MLE) and robust regression on order statistics (rROS). PMMoV was the most prevalent virus at both WWTPs, with median concentrations of 7.5 to 8.8 log10 copies/L before treatment. Log10 removal values (LRVs) of all viruses, based on the means and standard deviations of concentrations before and after treatment, were consistently higher for MBR than for CAS. We used NoV GII as a model pathogen in a quantitative microbial risk assessment of the treated water and estimated the additional reductions required after the MBR and CAS processes to meet the guideline of 10⁻⁶ DALYs per person per year (pppy) for safe wastewater reuse.
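A sketch of the log10 removal value (LRV) calculation and of the additional reduction needed to reach a target effluent concentration, using placeholder norovirus GII concentrations rather than the reported WWTP data:

```python
import numpy as np

# Placeholder concentrations (log10 copies/L), not the measured WWTP data
log10_influent = np.array([6.2, 5.8, 6.5, 6.0])
log10_effluent_mbr = np.array([2.1, 1.8, 2.4, 2.0])

lrv = log10_influent - log10_effluent_mbr          # per-sample LRV
print(f"Mean LRV (MBR): {lrv.mean():.1f} log10")

# Additional reduction needed to reach an assumed reuse target of 0 log10 copies/L
target_log10 = 0.0
additional_lrv = max(log10_effluent_mbr.mean() - target_log10, 0)
print(f"Additional reduction required: {additional_lrv:.1f} log10")
```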


Subject(s)
Viruses , Water Purification , Sewage , Wastewater , Bioreactors , Water Purification/methods , Waste Disposal, Fluid/methods
8.
Sensors (Basel) ; 23(24)2023 Dec 07.
Article in English | MEDLINE | ID: mdl-38139510

ABSTRACT

To effectively balance enforced guidance and regulation during a pandemic, and the need to limit infection transmission, against the necessity for public transportation services to remain safe and operational, it is imperative to understand and monitor environmental conditions and typical behavioural patterns within such spaces. This paper explores the feasibility of social distancing on public transport and the use of advanced computer vision techniques to measure it accurately. A low-cost depth-sensing system was deployed on a public bus as a means to approximate social-distancing measures and to study passenger habits in relation to social distancing. The results indicate that social distancing on this form of public transport is unlikely for an individual beyond a 28% occupancy threshold, with an 89% chance of being within 1-2 m of at least one other passenger and a 57% chance of being within less than one metre of another passenger at any one point in time. Passenger preference for seating is also analysed, which clearly demonstrates that typical passengers prefer ease of access, comfort, and seats with a view over maximising social distancing. With a highly detailed and comprehensive set of acquired data and accurate measurement capability, the employed equipment and processing methodology also prove to be a robust approach for this application.
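A sketch of the proximity statistics reported above: given depth-derived passenger positions on the bus floor plane, compute pairwise distances and the share of passengers within 1 m, or within 1-2 m, of someone else. The positions below are invented for illustration:

```python
import numpy as np

# Invented passenger positions on the bus floor plane (x, y in metres)
positions = np.array([[1.0, 0.5], [1.8, 0.5], [3.2, 1.4], [4.0, 0.6], [4.6, 1.5]])

# Pairwise Euclidean distances between passengers
diff = positions[:, None, :] - positions[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))
np.fill_diagonal(dist, np.inf)                 # ignore self-distances

nearest = dist.min(axis=1)                     # each passenger's nearest neighbour
print(f"Within 1 m of another passenger:   {(nearest < 1.0).mean():.0%}")
print(f"Within 1-2 m of another passenger: {((nearest >= 1.0) & (nearest < 2.0)).mean():.0%}")
```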


Subject(s)
Physical Distancing , Transportation , Transportation/methods , Pandemics/prevention & control
9.
J Anim Sci Technol ; 65(5): 1024-1039, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37969349

ABSTRACT

In this study, we performed a quantitative microbial risk assessment (QMRA) of Salmonella from the consumption of eggs prepared by different cooking methods (dry-heat, moist-heat, and raw consumption). Egg samples (n = 201) from retail markets were analyzed for the presence of Salmonella. In addition, temperature and time were investigated during egg transit, storage, and display. A predictive model was developed to characterize the kinetic behavior of Salmonella in eggs, and data on egg consumption and consumption frequency were collected. Finally, the data were simulated to estimate egg-related foodborne illness. Salmonella was not found in any of the 201 egg samples; thus, the estimated initial contamination level was -4.0 log CFU/g. With R² values of 0.898 and 0.922, the constructed predictive models were adequate for describing the fate of Salmonella in eggs throughout distribution and storage. Eggs were consumed raw (1.5%, 39.2 g), dry-heated (57.5%, 43.0 g), and moist-heated (41%, 36.1 g). The probability of foodborne Salmonella illness from the consumption of cooked eggs was estimated to be 6.8 × 10⁻¹⁰. The corresponding probability when no cooking method was applied was 1.9 × 10⁻⁷, indicating that Salmonella risk can be reduced by cooking. Therefore, the risk of Salmonella infection through consumption of eggs after cooking might be low in South Korea.

10.
Microorganisms ; 11(10)2023 Sep 27.
Article in English | MEDLINE | ID: mdl-37894073

ABSTRACT

Cows are known carriers of Cryptosporidium parvum (C. parvum), a protozoan parasite that can cause the gastrointestinal illness cryptosporidiosis in humans. Despite this potential exposure, dairy farmers tend to wear personal protective equipment (PPE) to protect the milk from contamination rather than to protect themselves from zoonotic diseases such as cryptosporidiosis. In this study, cow feces were collected from individual cattle on dairy farms and analyzed for C. parvum using qPCR. Quantitative microbial risk assessment (QMRA) was used to determine the risk of cryptosporidiosis to dairy farmers with and without handwashing and PPE (gloves and masks). The annualized risk of cryptosporidiosis to dairy farmers was 29.08% but was reduced significantly by each of the three interventions. Among the individual interventions, glove use provided the greatest reduction in risk, bringing the annual risk of cryptosporidiosis to 4.82%. Implementing regular handwashing together with the use of gloves and a mask brought the annual risk of cryptosporidiosis to 1.29%. This study provides evidence that handwashing and PPE use can significantly reduce the risk of cryptosporidiosis to farmers and are worth implementing despite potential barriers such as discomfort and cost.
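A sketch of how a per-exposure infection risk can be annualized and how intervention-related dose reductions enter the calculation; the dose, dose-response parameter, exposure frequency, and log-reduction credits below are assumptions, not the study's fitted values:

```python
import numpy as np

# Assumed exposure and dose-response inputs (illustrative, not the study's values)
dose_oocysts = 0.1                # ingested C. parvum oocysts per exposure event
r = 0.005                         # exponential dose-response parameter (assumed)
exposures_per_year = 250          # working days with cattle contact (assumed)

def annual_risk(dose, log10_reduction=0.0):
    """Annualize a per-event exponential dose-response risk after a log10 dose reduction."""
    p_event = 1 - np.exp(-r * dose / 10 ** log10_reduction)
    return 1 - (1 - p_event) ** exposures_per_year

print(f"No intervention:                     {annual_risk(dose_oocysts):.2%}")
print(f"Gloves (assumed 1-log reduction):    {annual_risk(dose_oocysts, 1.0):.2%}")
print(f"Gloves + mask + handwashing (2-log): {annual_risk(dose_oocysts, 2.0):.2%}")
```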

11.
J Hazard Mater ; 458: 132058, 2023 09 15.
Article in English | MEDLINE | ID: mdl-37459761

ABSTRACT

Antibiotic-resistant Enterobacteriaceae pose a significant threat to public health. However, limited studies have evaluated the health risks associated with exposure to antibiotic-resistant bacteria (ARB), especially in natural environments. While quantitative microbial risk assessment (QMRA) assesses microbial risks in terms of the probability of infection, it does not account for the severity of health outcomes. In this study, a QMRA-DALY model was developed to integrate QMRA with the health burden, expressed in disability-adjusted life years (DALYs), of infections caused by ARB. The model considers uncertainties in the probability of infection and the health burden assessment using Monte Carlo simulations. The study collected antimicrobial resistance (AMR) surveillance data from surface waters with different land uses. Results revealed water bodies with agricultural land use to be the main AMR hotspots, with the highest additional health burden observed for infections caused by meropenem-resistant E. coli (∆DALY = 0.0105 DALY/event) compared to antibiotic-susceptible E. coli. The estimated ∆DALY for antibiotic-resistant K. pneumoniae was lower than for antibiotic-resistant E. coli (highest ∆DALY = 0.00048 DALY/event). The study highlights the need for better evaluation of the AMR-associated health burden and for effective measures to mitigate the risks associated with antibiotic-resistant bacteria in natural environments.


Subject(s)
Enterobacteriaceae , Escherichia coli , Quality-Adjusted Life Years , Angiotensin Receptor Antagonists , Disability-Adjusted Life Years , Angiotensin-Converting Enzyme Inhibitors , Risk Assessment , Anti-Bacterial Agents
12.
Int J Food Microbiol ; 403: 110302, 2023 Oct 16.
Article in English | MEDLINE | ID: mdl-37392608

ABSTRACT

EFSA's Panel on Biological Hazards (BIOHAZ Panel) deals with questions on biological hazards relating to food safety and food-borne diseases. This covers food-borne zoonoses, transmissible spongiform encephalopathies, antimicrobial resistance, food microbiology, food hygiene, animal by-products, and associated waste management issues. The scientific assessments are diverse, and frequently the development of new methodological approaches is required to deal with a mandate. Among the many risk factors, product characteristics (pH, water activity, etc.) and the time and temperature of processing and storage along the food supply chain are highly relevant for assessing biological risks. Therefore, predictive microbiology becomes an essential element of the assessments. Uncertainty analysis is incorporated in all BIOHAZ scientific assessments to meet the general requirement for transparency. Assessments should clearly and unambiguously state what sources of uncertainty have been identified and their impact on the conclusions of the assessment. Four recent BIOHAZ Scientific Opinions are presented to illustrate the use of predictive modelling and quantitative microbial risk assessment principles in regulatory science. The Scientific Opinion on the guidance on date marking and related food information gives a general overview of the use of predictive microbiology for shelf-life assessment. The Scientific Opinion on the efficacy and safety of high-pressure processing of food provides an example of inactivation modelling and compliance with performance criteria. The Scientific Opinion on the use of the so-called 'superchilling' technique for the transport of fresh fishery products illustrates the combination of heat transfer and microbial growth modelling. Finally, the Scientific Opinion on delayed post-mortem inspection in ungulates shows how variability and uncertainty were quantitatively embedded in assessing the probability of Salmonella detection on carcasses, via stochastic modelling and expert knowledge elicitation.


Subject(s)
Food Microbiology , Foodborne Diseases , Animals , Zoonoses , Food Safety , Risk Assessment/methods
13.
Appl Environ Microbiol ; 89(7): e0012823, 2023 07 26.
Article in English | MEDLINE | ID: mdl-37310232

ABSTRACT

Essential food workers experience elevated risks of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection due to prolonged occupational exposures in food production and processing areas, shared transportation (car or bus), and employer-provided shared housing. Our goal was to quantify the daily cumulative risk of SARS-CoV-2 infection for healthy susceptible produce workers and to evaluate the relative reduction in risk attributable to food industry interventions and vaccination. We simulated daily SARS-CoV-2 exposures of indoor and outdoor produce workers through six linked quantitative microbial risk assessment (QMRA) model scenarios. For each scenario, the infectious viral dose emitted by a symptomatic worker was calculated across aerosol, droplet, and fomite-mediated transmission pathways. Standard industry interventions (2-m physical distancing, handwashing, surface disinfection, universal masking, ventilation) were simulated to assess relative risk reductions from baseline risk (no interventions, 1-m distance). Implementation of industry interventions reduced an indoor worker's relative infection risk by 98.0% (0.020; 95% uncertainty interval [UI], 0.005 to 0.104) from baseline risk (1.00; 95% UI, 0.995 to 1.00) and an outdoor worker's relative infection risk by 94.5% (0.027; 95% UI, 0.013 to 0.055) from baseline risk (0.487; 95% UI, 0.257 to 0.825). Integrating these interventions with two-dose mRNA vaccinations (86 to 99% efficacy), representing a worker's protective immunity to infection, reduced the relative infection risk from baseline for indoor workers by 99.9% (0.001; 95% UI, 0.0002 to 0.005) and outdoor workers by 99.6% (0.002; 95% UI, 0.0003 to 0.005). Consistent implementation of combined industry interventions, paired with vaccination, effectively mitigates the elevated risks from occupationally acquired SARS-CoV-2 infection faced by produce workers. IMPORTANCE: This is the first study to estimate the daily risk of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection across a variety of indoor and outdoor environmental settings relevant to food workers (e.g., shared transportation [car or bus], enclosed produce processing facility and accompanying breakroom, outdoor produce harvesting field, shared housing facility) through a linked quantitative microbial risk assessment framework. Our model has demonstrated that the elevated daily SARS-CoV-2 infection risk experienced by indoor and outdoor produce workers can be reduced below 1% when vaccinations (optimal vaccine efficacy, 86 to 99%) are implemented with recommended infection control strategies (e.g., handwashing, surface disinfection, universal masking, physical distancing, and increased ventilation). Our novel findings provide scenario-specific infection risk estimates that can be utilized by food industry managers to target high-risk scenarios with effective infection mitigation strategies, which was informed through more realistic and context-driven modeling estimates of the infection risk faced by essential food workers daily. Bundled interventions, particularly if they include vaccination, yield significant reductions (>99%) in daily SARS-CoV-2 infection risk for essential food workers in enclosed and open-air environments.
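The headline percentages can be reproduced from the risk values quoted above with a simple relative-reduction calculation; a sketch using the indoor-worker numbers from the abstract:

```python
def relative_reduction(risk_baseline: float, risk_intervention: float) -> float:
    """Fractional reduction in infection risk relative to the baseline scenario."""
    return 1 - risk_intervention / risk_baseline

# Values quoted in the abstract (indoor produce worker)
baseline = 1.00                  # daily infection risk, no interventions, 1-m distance
with_controls = 0.020            # standard industry interventions in place
with_controls_and_vaccine = 0.001

print(f"Industry interventions:         {relative_reduction(baseline, with_controls):.1%}")
print(f"Interventions plus vaccination: {relative_reduction(baseline, with_controls_and_vaccine):.1%}")
```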


Subject(s)
COVID-19 , Occupational Exposure , Humans , SARS-CoV-2 , COVID-19/prevention & control , Respiratory Aerosols and Droplets , Occupational Exposure/prevention & control , Infection Control
14.
Sci Total Environ ; 889: 164282, 2023 Sep 01.
Article in English | MEDLINE | ID: mdl-37209746

ABSTRACT

There is no reference to microbiological water quality in the European Union's Water Framework Directive, as adapted into English law, and consequently microbial water quality is not routinely monitored in English rivers, except for two recently designated bathing water sites. To address this knowledge gap, we developed an innovative monitoring approach for quantitative assessment of combined sewer overflow (CSO) impacts on the bacteriology of receiving rivers. Our approach combines conventional and environmental DNA (eDNA) based methods to generate multiple lines of evidence for assessing risks to public health. We demonstrated this approach by investigating spatiotemporal variation in the bacteriology of the Ouseburn in northeast England for different weather conditions in the summer and early autumn of 2021 across eight sampling locations comprising rural, urban, and recreational land use settings. We characterized pollution source attributes by collecting sewage from treatment works and CSO discharge at the peak of a storm event. CSO discharge was characterized by log10 values per 100 mL (average ± stdev) of 5.12 ± 0.03 and 4.90 ± 0.03 for faecal coliforms and faecal streptococci, and 6.00 ± 0.11 and 7.78 ± 0.04 for the rodA and HF183 genetic markers of E. coli and human host-associated Bacteroides, respectively, indicating about 5% sewage content. SourceTracker analysis of sequencing data attributed 72-77% of bacteria in the downstream section of the river during a storm event to CSO discharge sources, versus only 4-6% to rural upstream sources. Data from sixteen summer sampling events in a public park exceeded various guideline values for recreational water quality. Quantitative microbial risk assessment (QMRA) predicted a median and 95th percentile risk of 0.03 and 0.39, respectively, of contracting a bacterial gastrointestinal disease when wading and splashing around in the Ouseburn. We show clearly why microbial water quality should be monitored where rivers flow through public parks, irrespective of their bathing water designation.


Subject(s)
Bacteriology , DNA, Environmental , Humans , Escherichia coli , Environmental Monitoring/methods , Sewage/microbiology , Public Health , Bacteria/genetics , Water Microbiology
15.
Foods ; 12(4)2023 Feb 13.
Article in English | MEDLINE | ID: mdl-36832871

ABSTRACT

This study estimated the risk of hepatitis A virus (HAV) foodborne illness outbreaks through the consumption of fermented clams in South Korea. HAV prevalence in fermented clams was obtained from the Ministry of Food and Drug Safety report (2019). Fermented clam samples (2 g) were inoculated with HAV and stored at -20 to 25 °C. Based on the HAV titer (determined using a plaque assay) in fermented clams during storage, the Baranyi predictive models provided by ComBase were applied to describe the kinetic behavior of HAV in fermented clams. The estimated initial HAV contamination level was -3.7 log PFU/g. The developed predictive models revealed that, as temperature increased, the number of HAV plaques decreased. The Beta-Poisson model was chosen to describe the HAV dose-response relationship, and the simulation revealed a probability of 6.56 × 10⁻¹¹ per person per day of contracting HAV foodborne illness by eating fermented clams. However, when only regular consumers of fermented clams were assumed as the population, the probability of HAV foodborne illness increased to 8.11 × 10⁻⁸ per person per day. These results suggest that, while the likelihood of HAV foodborne illness from consuming fermented clams is low across the country, regular consumers should be aware of the possibility of foodborne illness.
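A sketch of the Beta-Poisson dose-response step; the α and β parameters below are placeholders (the abstract does not report the fitted HAV values), while the contamination level (-3.7 log PFU/g) and the raw serving size (39.2 g) are taken from the abstract:

```python
def beta_poisson(dose, alpha, beta):
    """Approximate Beta-Poisson probability of illness for a given ingested dose."""
    return 1 - (1 + dose / beta) ** (-alpha)

# Placeholder dose-response parameters (illustrative only)
alpha, beta = 0.2, 1e3

# Dose per raw serving: estimated contamination level x serving size from the abstract
dose_pfu = 10 ** (-3.7) * 39.2

p_ill_per_serving = beta_poisson(dose_pfu, alpha, beta)
print(f"Probability of illness per raw serving: {p_ill_per_serving:.2e}")
```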

16.
Antibiotics (Basel) ; 11(10)2022 Oct 05.
Article in English | MEDLINE | ID: mdl-36290013

ABSTRACT

The occurrence of Staphylococcus aureus (S. aureus) and methicillin-resistant S. aureus (MRSA) in a sub-catchment of the Yodo River basin, a representative drinking water source in Japan, was investigated. A chromogenic enzyme-substrate medium, with and without antimicrobials, was used for the culture-based detection of viable S. aureus and MRSA. The contributions of S. aureus and MRSA from wastewater to the rivers were estimated based on a mass flux analysis, and quantitative microbial risk assessment (QMRA) was further conducted for S. aureus and MRSA in the river environment. The mean abundance of S. aureus and MRSA was 31 and 29 CFU/mL in hospital effluent, 124 and 117 CFU/mL in sewage treatment plant (STP) influent, 16 and 13 CFU/mL in STP effluent, and 8 and 9 CFU/mL in river water, respectively. The contribution of the pollution load derived from the target STP effluent to river water ranged from 2% to 25%. The QMRA showed that, to achieve the established health benchmarks, the drinking water treatment process would need to achieve 1.7 log10 and 2.9 log10 inactivation in terms of the infection risk and disability-adjusted life year (DALY) indexes, respectively. These findings highlight the link between the medical environment and aquatic environments and the importance of environmental risk management for antimicrobial-resistant bacteria.
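A sketch of how a required log10 inactivation for drinking water treatment can be back-calculated from a source-water concentration and an annual infection-risk benchmark; the dose-response parameter, consumption volume, and benchmark below are assumptions, not the study's inputs (only the ~8 CFU/mL river-water level comes from the abstract):

```python
import numpy as np

# Assumed inputs (illustrative)
source_cfu_per_ml = 8.0          # S. aureus in river source water (abstract reports ~8 CFU/mL)
daily_intake_ml = 1000.0         # unheated tap water consumed per day (assumed)
r = 1e-7                         # assumed exponential dose-response parameter
annual_benchmark = 1e-4          # tolerable annual infection risk (assumed)

# Tolerable daily dose implied by the annual benchmark
p_daily = 1 - (1 - annual_benchmark) ** (1 / 365)
tolerable_dose = -np.log(1 - p_daily) / r

# Required log10 inactivation across the treatment train
raw_daily_dose = source_cfu_per_ml * daily_intake_ml
required_log_inactivation = max(np.log10(raw_daily_dose / tolerable_dose), 0)
print(f"Required inactivation: {required_log_inactivation:.1f} log10")
```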

17.
Environ Monit Assess ; 194(11): 842, 2022 Sep 29.
Article in English | MEDLINE | ID: mdl-36175694

ABSTRACT

Legionella infection occurs when a susceptible host inhales aerosols containing the bacteria. Monitoring and assessment of Legionella in the environment and in hospital water distribution systems are therefore critical, given the vulnerable patient population. However, the health risks of Legionella in these environments have not been adequately evaluated. In this study, we performed a quantitative health risk assessment of Legionella for hospitalized patients in selected hospitals in Tehran, using two exposure scenarios: showers and toilet faucets. Legionella was identified in 38 (38%) of 100 samples collected from toilet faucets and showers in 8 hospitals. The information gathered was used for quantitative microbial risk assessment (QMRA). The microbial load transmitted by inhalation was calculated from the concentration of Legionella in water; other exposure parameters (inhalation rate and exposure time) were obtained from previous studies and the median length of hospital stay (3.6 days). The exponential model (γ = 0.06) was used to estimate the risk of infection due to Legionella pneumophila (L. pneumophila) inhalation for each exposure event. For the mean Legionella concentration obtained (10³ CFU/L), the risk of infection was in the range of 0.23-2.3 per 10,000 hospitalized patients for toilet faucets and 3.5-21.9 per 10,000 for showers. The results were compared with the tolerable infection risk levels determined by the US EPA and WHO; the risk values exceeded the WHO values for waterborne pathogens in hospitals in both exposure scenarios. Thus, our QMRA results based on monitoring data showed that, despite hospitals using treated water from urban distribution networks, 38% of the samples were contaminated with Legionella, and faucets and showers can be sources of Legionella transmission. To protect the health of hospitalized patients, the risk of Legionella infection should therefore be taken into account.
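A sketch of the per-event exponential dose-response used above (γ = 0.06 and the mean concentration of 10³ CFU/L are quoted in the abstract); the volume of water inhaled as aerosol per shower event is an assumption:

```python
import numpy as np

gamma = 0.06                    # exponential dose-response parameter quoted in the abstract
conc_cfu_per_l = 1e3            # mean Legionella concentration (10^3 CFU/L, from the abstract)

# Assumed exposure: volume of water inhaled as aerosol during one shower event
aerosolized_water_l = 1e-5      # 0.01 mL per event (illustrative)

dose = conc_cfu_per_l * aerosolized_water_l
p_infection_per_event = 1 - np.exp(-gamma * dose)
print(f"Infection risk per shower event: {p_infection_per_event:.2e}")
```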


Subject(s)
Legionella pneumophila , Environmental Monitoring , Hospitals , Humans , Iran/epidemiology , Water
18.
Environ Manage ; 70(4): 633-649, 2022 10.
Article in English | MEDLINE | ID: mdl-35543727

ABSTRACT

Worldwide, Low Impact Developments (LIDs) are used for sustainable stormwater management; however, both the stormwater and the LIDs themselves carry microbial pathogens. The widespread development of LIDs is likely to increase human exposure to pathogens and the risk of infection, leading to unexpected disease outbreaks in urban communities. The risk of infection from exposure to LIDs has been assessed via Quantitative Microbial Risk Assessment (QMRA) during the operation of these infrastructures, but no effort has been made to evaluate these risks during the planning phase of an LID treatment train in urban communities. We developed a new integrated "Regression-QMRA" method by examining the relationship between pathogen concentrations and environmental variables. Applying this methodology to a planned LID train shows that the predicted disease burden of diarrhea is highest for Campylobacter (16.902 DALYs/1000 persons/yr) during landscape irrigation and playing on the LID train, followed by Giardia, Cryptosporidium, and Norovirus. These results illustrate that the risk of microbial infection can be predicted during the planning phase of an LID treatment train. Such predictions are of great value to municipalities and decision-makers in making informed decisions and ensuring risk-based planning of stormwater systems before their development.


Subject(s)
Cryptosporidiosis , Cryptosporidium , Cryptosporidiosis/epidemiology , Humans , Public Health , Risk Assessment/methods , Water Microbiology
19.
Environ Sci Technol ; 56(10): 6315-6324, 2022 05 17.
Article in English | MEDLINE | ID: mdl-35507527

ABSTRACT

Infection risk from waterborne pathogens can be estimated via quantitative microbial risk assessment (QMRA) and forms an important consideration in the management of public groundwater systems. However, few groundwater QMRAs use site-specific hazard identification and exposure assessment, so prevailing risks in these systems remain poorly defined. We estimated the infection risk for 9 waterborne pathogens based on a 2-year pathogen occurrence study in which 964 water samples were collected from 145 public wells throughout Minnesota, USA. Annual risk across all nine pathogens combined was 3.3 × 10⁻¹ (95% CI: 2.3 × 10⁻¹ to 4.2 × 10⁻¹), 3.9 × 10⁻² (2.3 × 10⁻² to 5.4 × 10⁻²), and 1.2 × 10⁻¹ (2.6 × 10⁻² to 2.7 × 10⁻¹) infections person⁻¹ year⁻¹ for noncommunity, nondisinfecting community, and disinfecting community wells, respectively. Risk estimates exceeded the U.S. benchmark of 10⁻⁴ infections person⁻¹ year⁻¹ in 59% of well-years, indicating that the risk was widespread. While the annual risk for all pathogens combined was relatively high, the average daily doses for individual pathogens were low, indicating that significant risk results from sporadic pathogen exposure. Cryptosporidium dominated annual risk, so improved identification of wells susceptible to Cryptosporidium contamination may be important for risk mitigation.
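A sketch of how per-pathogen daily risks combine into an annual all-pathogen risk, assuming independence between pathogens and across exposure days; the daily risk values below are placeholders, not the study's estimates:

```python
import numpy as np

# Placeholder mean daily infection risks for one well type (per pathogen, per day)
daily_risks = {
    "Cryptosporidium": 8e-4,
    "Giardia": 5e-5,
    "Norovirus": 2e-5,
}

# Assuming independence across pathogens and 365 exposure days per year
p_no_infection_day = np.prod([1 - p for p in daily_risks.values()])
annual_risk_all = 1 - p_no_infection_day ** 365
print(f"Annual risk, all pathogens combined: {annual_risk_all:.2e} infections person^-1 year^-1")
```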


Subject(s)
Cryptosporidiosis , Cryptosporidium , Viruses , Bacteria , Humans , Minnesota , Risk Assessment , Water Microbiology , Water Supply , Water Wells
20.
Water Res ; 216: 118304, 2022 Jun 01.
Article in English | MEDLINE | ID: mdl-35325820

ABSTRACT

Water spray facilities are widely used in public places for sprinkling or beautifying the environment. However, the potential health risk posed by water aerosols increasingly calls for attention. In this study, the spatial distribution of water aerosols was investigated through the molecular sieve adsorption method and predicted by a discrete phase model (DPM). On this basis, the health risk posed by Legionella pneumophila in specific spray scenarios was evaluated by quantitative microbial risk assessment (QMRA). The results showed that the original droplet size can be described by the Rosin-Rammler distribution (R² > 0.99). The spatial distribution of water aerosols produced from a nozzle spray can be well predicted by the DPM. The concentration of water aerosols showed a sharp decline within 5 m from the nozzle and did not differ significantly among the various spray scenarios within 5 m (p > 0.05); however, the differences were significant beyond 5 m (p < 0.05). Furthermore, a safe contact distance exceeding 8 m is proposed for spray scenarios, considering a risk threshold of 0.0001. Sensitivity analysis identified the concentration of Legionella pneumophila in water aerosols as the critical factor affecting the health risk.
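A sketch of the Rosin-Rammler droplet-size distribution named above; the characteristic diameter and spread exponent are placeholders, not the study's fitted parameters:

```python
import numpy as np

def rosin_rammler_cdf(d_um, d_char_um, spread_n):
    """Cumulative mass fraction of droplets with diameter <= d (Rosin-Rammler form)."""
    return 1 - np.exp(-(d_um / d_char_um) ** spread_n)

# Placeholder parameters (illustrative, not the study's fit)
d_char, n = 150.0, 2.5          # characteristic diameter (um) and spread exponent

for d in (50, 100, 150, 300):
    print(f"Fraction of spray mass in droplets <= {d} um: {rosin_rammler_cdf(d, d_char, n):.2f}")
```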


Subject(s)
Legionella pneumophila , Aerosols/analysis , Computer Simulation , Water , Water Microbiology