Results 1 - 14 of 14
1.
Pathogens ; 12(10)2023 Oct 03.
Article in English | MEDLINE | ID: mdl-37887732

ABSTRACT

For the microbiological safety of drinking water, disinfection methods are used to remove or inactivate microorganisms. Chlorine and chlorine dioxide are often used as disinfectants in drinking water treatment plants (DWTPs). We investigated the effectiveness of these chemicals in inactivating echovirus 30 (E30), simian rotavirus 11 (RV SA11), and human adenovirus type 2 (HAdV2) in purified water from a DWTP. Within two minutes of contact, chlorine dioxide inactivated E30 by 4-log10 and RV SA11 by 3-log10, while HAdV2 could no longer be detected; chlorine reduced E30 by 3-log10, RV SA11 by 2-3-log10, and HAdV2 by 3-4-log10. However, viral genomes could be detected for up to 2 h using qPCR. The CT concept, based on the combination of disinfectant concentration and contact time, is problematic over such a short initial phase. The high concentrations of disinfectant needed to neutralize organic matter may have a strong immediate effect on virus viability. This may lead to underestimation of disinfection efficacy and overdosing of disinfectants in water with organic contamination. These results are useful for selecting disinfection systems for the reuse of treated wastewater and in the risk assessment of water treatment processes using chlorine and chlorine dioxide.
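A log10 reduction value and a CT value are simple to compute; a minimal sketch with illustrative figures (not the study's data):

```python
import math

def log10_reduction(n0, nt):
    """Log10 reduction from initial titre n0 to surviving titre nt."""
    return math.log10(n0 / nt)

def ct_value(concentration_mg_l, contact_time_min):
    """CT value: disinfectant residual (mg/L) x contact time (min)."""
    return concentration_mg_l * contact_time_min

# Illustrative figures: a titre dropping from 1e6 to 1e2 infectious
# units is a 4-log10 (99.99%) inactivation.
lr = log10_reduction(1e6, 1e2)      # 4.0
surviving = 10 ** -lr               # fraction still infectious, 1e-4
ct = ct_value(0.5, 2)               # 0.5 mg/L for 2 min -> CT = 1.0 mg*min/L
```

The abstract's point is that within the first two minutes most of the inactivation has already happened, so a CT value computed over that window poorly characterizes the dose actually delivered.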

2.
Article in English | MEDLINE | ID: mdl-35886521

ABSTRACT

Irradiation with ultraviolet (UV) light at 254 nm is effective in inactivating a wide range of human pathogens. In Sweden, a UV dose of 400 J/m2 is often used for the treatment of drinking water. To investigate its effect on virus inactivation, enteric viruses with different genomic organizations were irradiated with three UV doses (400, 600, and 1000 J/m2), after which their viability in cell culture was examined. Adenovirus type 2 (double-stranded DNA), simian rotavirus 11 (double-stranded RNA), and echovirus 30 (single-stranded RNA) were suspended in tap water and pumped into a laboratory-scale Aquada 1 UV reactor. Echovirus 30 was reduced by 3.6-log10 at a UV dose of 400 J/m2. Simian rotavirus 11 and adenovirus type 2 were more UV-resistant, with only a 1-log10 reduction at 400 J/m2, and needed 600 J/m2 for 2.9-log10 and 3.1-log10 reductions, respectively. There was no significant further increase in the reduction of viral viability at higher UV doses, which may indicate the presence of UV-resistant viruses. These results show that UV doses higher than those usually used in Swedish drinking water treatment plants should be considered, in combination with other barriers, to disinfect the water when there is a risk of fecal contamination.
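Under the commonly assumed log-linear UV inactivation kinetics, log10 reduction scales linearly with fluence; a sketch using the echovirus 30 figure from the abstract, which also shows why the observed plateau (tailing) at higher doses deviates from the model:

```python
def uv_rate_constant(dose_j_m2, log10_reduction):
    """Fluence-based rate constant k (log10 per J/m2), assuming
    log-linear kinetics: reduction = k * dose."""
    return log10_reduction / dose_j_m2

def predicted_reduction(k, dose_j_m2):
    """Log10 reduction predicted by the log-linear model."""
    return k * dose_j_m2

# From the abstract: echovirus 30 showed a 3.6-log10 reduction at 400 J/m2.
k_echo = uv_rate_constant(400, 3.6)            # 0.009 log10 per J/m2
extrapolated = predicted_reduction(k_echo, 600)  # model predicts 5.4-log10
# The study saw no significant further reduction at 600-1000 J/m2,
# i.e. tailing: a resistant subpopulation breaks the log-linear assumption.
```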


Subject(s)
Drinking Water , Enterovirus , Rotavirus , Water Purification , Adenoviridae/genetics , Disinfection/methods , Humans , Sweden , Ultraviolet Rays , Virus Inactivation/radiation effects , Water Purification/methods
3.
Sci Total Environ ; 831: 154874, 2022 Jul 20.
Article in English | MEDLINE | ID: mdl-35358515

ABSTRACT

Deficiencies in drinking water distribution networks, such as cross-connections, may lead to contamination of the drinking water and pose a serious health risk to consumers. Cross-connections and backflows are considered among the most severe public health risks in distribution networks. The aim of this paper was to provide a framework for estimating the risk of infection from cross-connection and backflow events. Campylobacter, norovirus, and Cryptosporidium were chosen as reference pathogens for this study. The theoretical framework was constructed based on the fault tree analysis methodology. Nationally aggregated cross-connection incident data were used to calculate the probability of a contamination event occurring in Swedish networks. Three risk cases were evaluated: endemic, elevated, and extreme. Quantitative microbial risk assessment (QMRA) was used to assess the daily risk of infection for average national estimates. The framework was also evaluated using local data from the Gothenburg network. The daily risk of infection from cross-connection and backflow events in Swedish networks was generally above the acceptable target level of 10^-6 for all reference pathogens and modelled cases; the exception was the Gothenburg system, where the risk was lower than 10^-7. An outbreak case study was used to validate the framework results: contaminant transport in the network was simulated using hydraulic modelling (EPANET), and risk estimates were calculated using QMRA. The outbreak simulation predicted between 97 and 148 symptomatic infections, while the epidemiological survey conducted during the outbreak reported 179 cases of illness. The fault tree analysis framework was thus successfully validated, though the Gothenburg example shows that local data are still needed for well-performing systems.
The framework can help inform microbial risk assessments for drinking water suppliers, especially those with limited resources and expertise in this area.
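The QMRA step maps an ingested pathogen dose to a daily probability of infection. A minimal sketch using the single-hit exponential dose-response model; the dose and the r parameter below are illustrative assumptions, not the study's calibrated values:

```python
import math

def daily_infection_risk(dose, r):
    """Single-hit exponential dose-response: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical inputs: an ingested dose of 0.001 organisms/day and a
# dose-response parameter r = 0.059 (assumed for illustration only).
risk = daily_infection_risk(dose=0.001, r=0.059)

# Compare with the 10^-6 daily acceptable-risk target used in the paper.
meets_target = risk <= 1e-6   # False here: this risk exceeds the target
```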


Subject(s)
Cryptosporidiosis , Cryptosporidium , Drinking Water , Giardia , Humans , Risk Assessment/methods , Water Microbiology , Water Supply
4.
Appl Environ Microbiol ; 86(24)2020 11 24.
Article in English | MEDLINE | ID: mdl-33036988

ABSTRACT

Influent wastewater and effluent wastewater at the Rya treatment plant in Gothenburg, Sweden, were continuously monitored for enteric viruses by quantitative PCR (qPCR) for 1 year. Viruses in effluent wastewater were also identified by next-generation sequencing (NGS) in samples collected during spring, early summer, and winter. Samples of incoming wastewater were collected every second week. Seasonal variations in viral concentrations in incoming wastewater were found for norovirus GII, sapovirus, rotavirus, parechovirus, and astrovirus. Norovirus GI and GIV and Aichi virus were present in various amounts during most weeks throughout the year, while hepatitis A virus, enterovirus, and adenovirus were identified less frequently. Fluctuations in viral concentrations in incoming wastewater were related to the number of diagnosed patients. The viruses were also detected in treated wastewater, albeit with a 3- to 6-log10 reduction in concentration. Seven different hepatitis E virus (HEV) strains were identified in the effluents. Five of these strains belonged to genotype 3 and have been isolated in Sweden from swine, wild boars, and humans, and in drinking water. The other two strains were divergent and had not been identified previously; they were similar to strains infecting rats and humans. Surveillance of enteric viruses in wastewater is a tool for early detection and follow-up of gastroenteritis outbreaks in society and for the identification of new viruses that can cause infection in humans. IMPORTANCE: Both influent wastewater and treated wastewater at a wastewater treatment plant (WWTP) contain a high variety of human viral pathogens, with seasonal variability when followed for 1 year. The peaks in the amounts of 11 different viruses in the inlet wastewater preceded the peaks in the numbers of diagnosed patients by 2 to 4 weeks.
The treatment of wastewater reduced viral concentrations by 3 to 6 log10. Despite this treatment, up to 5 log10 virus particles per liter were released into the surrounding river. Hepatitis E virus (HEV) strains previously identified in drinking water, and two new strains similar to those infecting rats and humans, were identified in the treated wastewater released from the WWTP.
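The 3- to 6-log10 reduction figures follow from comparing paired influent and effluent qPCR concentrations; a minimal sketch with illustrative concentrations, not the plant's measured values:

```python
import math

def treatment_log_reduction(influent_gc_per_l, effluent_gc_per_l):
    """Log10 reduction across the plant from paired qPCR
    concentrations (genome copies per litre)."""
    return math.log10(influent_gc_per_l / effluent_gc_per_l)

# Hypothetical concentrations consistent with the reported range:
# 1e8 gc/L in the influent and 1e5 gc/L in the effluent give a
# 3-log10 reduction, yet 1e5 gc/L is still discharged to the river.
lr_plant = treatment_log_reduction(1e8, 1e5)   # 3.0
```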


Subject(s)
Metagenome , Viruses/isolation & purification , Wastewater/virology , Metagenomics , Real-Time Polymerase Chain Reaction , Seasons , Virus Physiological Phenomena , Viruses/classification , Viruses/genetics
5.
Water Res ; 168: 115141, 2020 Jan 01.
Article in English | MEDLINE | ID: mdl-31590036

ABSTRACT

In this study, next-generation sequencing was used to explore the virome in 20 L up to 10,000 L of water from different purification steps at two Swedish drinking water treatment plants (DWTPs), and in tap water. One DWTP used ultrafiltration (UF) with 20 nm pores; the other used UV light treatment after conventional treatment of the water. Viruses belonging to 26 different families were detected in raw water, in which 6-9 times more sequence reads were found for phages than for known environmental, plant, or vertebrate viruses. The total number of viral reads was reduced by more than 4-log10 after UF and by 3-log10 over UV treatment. However, for some viruses the reduction was lower, e.g. 3.5-log10 after UF for hepatitis E virus (HEV), which was also detected in tap water, with sequences similar to those in raw water and after treatment. This indicates that HEV had passed through the treatment and entered the supply network, although the viability of the viruses is unknown. In tap water, 10-130 International Units of HEV RNA/mL were identified, which is a comparably low amount of virus. The risk of becoming infected through consumption of tap water is probably negligible, but needs to be investigated. The HEV strains in the waters belonged to subtypes HEV3a and HEV3c/i, which are associated with infections of unknown source among humans in Sweden. Neither of these subtypes is common among pigs or wild boar, the major reservoirs of HEV, indicating that water may play a role in transmitting this virus. The results indicate that monitoring of small fecal-orally transmitted viruses in DWTPs may be warranted, especially during community outbreaks, to prevent potential transmission by tap water.
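To see why a 3.5-log10 reduction for HEV matters against a >4-log10 overall reduction, compare the surviving fractions; a small sketch:

```python
def surviving_fraction(log10_reduction):
    """Fraction of material remaining after a given log10 reduction."""
    return 10 ** -log10_reduction

overall = surviving_fraction(4.0)   # 0.01% of total viral reads remain
hev = surviving_fraction(3.5)       # ~0.032% of HEV reads remain
ratio = hev / overall               # ~3.2x more HEV survives, relatively
```

A half-log difference in reduction therefore translates into roughly a threefold enrichment of HEV relative to the bulk virome after the barrier.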


Subject(s)
Hepatitis E virus , Swine Diseases , Viruses , Animals , Genotype , Humans , Phylogeny , RNA, Viral , Sweden , Swine
6.
Sci Total Environ ; 706: 135680, 2020 Mar 01.
Article in English | MEDLINE | ID: mdl-31784151

ABSTRACT

A wide range of organic micropollutants (n = 163) representing several compound categories (pharmaceuticals, pesticides, per- and polyfluorinated alkyl substances, flame retardants, phthalates, food additives, drugs, and benzodiazepines) were analysed in water samples from the Göta Älv river (Sweden's second-largest source water). The sampling also included raw water and finished drinking water from seven drinking water treatment plants, plus a more detailed sampling at one of the treatment plants after six granulated active carbon filters of varying operational ages. In total, 27 organic micropollutants were detected, with individual concentrations ranging from sub-ng L-1 levels to 54 ng L-1. The impact of human activities along the flow path was reflected in increased concentrations downstream along the river, with total concentrations ranging from 65 ng L-1 at the start of the river to 120 ng L-1 at the last sampling point. The removal efficiency was significantly (p = 0.014; one-sided t-test) higher in treatment plants that employed granulated active carbon filters (n = 4; average 60%) or artificial infiltration (n = 1; 65%) than in those that used a more conventional treatment strategy (n = 2; 38%). The removal was also strongly affected by the operational age of the carbon filters. A filter with an operational age of 12 months, with a recent addition of ~10% new material, showed an average removal efficiency of 92%, while a 25-month-old filter averaged 76%, and an even lower 34% was observed for a 71-month-old filter. Breakthrough in the carbon filters occurred in the order of dissolved organic carbon, then per- and polyfluorinated alkyl substances, and then other organic micropollutants. The addition of fresh granulated active carbon seemed to improve the removal of hydrophobic organic compounds, particularly dissolved organic carbon and per- and polyfluorinated alkyl substances.
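The removal efficiencies quoted above are percent reductions across a treatment step; a sketch with hypothetical influent/effluent concentrations chosen to mirror the reported 92% vs 34% contrast between fresh and aged filters:

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a micropollutant across a treatment step."""
    return 100.0 * (1.0 - c_out / c_in)

# Hypothetical concentrations (ng/L), illustrating the reported trend
# that fresh GAC removes far more than heavily aged GAC.
fresh_filter = removal_efficiency(50.0, 4.0)    # 92% removal
aged_filter = removal_efficiency(50.0, 33.0)    # ~34% removal
```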


Subject(s)
Water Purification , Carbon , Drinking Water , Sweden , Water Pollutants, Chemical
7.
PLoS One ; 9(5): e98546, 2014.
Article in English | MEDLINE | ID: mdl-24874010

ABSTRACT

BACKGROUND: The river Göta Älv is a source of freshwater for 0.7 million Swedes. The river is subject to contamination from sewer system discharges and runoff from agricultural land. Climate models project an increase in precipitation and heavy rainfall in this region. This study aimed to determine how daily rainfall drives variation in indicators of pathogen loads, to increase knowledge of variations in river water quality, and to discuss the implications for risk management. METHODS: Data covering 7 years of daily monitoring of river water turbidity and concentrations of E. coli, Clostridium, and coliforms were obtained, and their short-term variations in relation to precipitation were analyzed with time series regression and non-linear distributed lag models. We studied how precipitation effects varied with season and compared different weather stations for predictive ability. RESULTS: Generally, the lowest raw water quality occurs 2 days after rainfall, with poor raw water quality continuing for several more days. A rainfall event of >15 mm/24 h (local 95th percentile) was associated with a three-fold higher concentration of E. coli and 30% higher turbidity levels (lag 2). Rainfall was associated with exponential increases in concentrations of indicator bacteria, while the effect on turbidity attenuated with very heavy rainfall. Clear associations were also observed between consecutive days of wet weather and decreased water quality. The precipitation effect on levels of indicator bacteria was significant in all seasons. CONCLUSIONS: Rainfall elevates microbial risks year-round in this river and freshwater source and acts as the main driver of varying water quality. Heavy rainfall appears to be a better predictor of fecal pollution than water turbidity.
An increase in wet weather and extreme events with climate change will lower river water quality even further, posing greater challenges for drinking water producers and suggesting a need for better control of pollution sources.
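The three-fold increase at the local 95th-percentile rainfall is consistent with the log-linear (exponential) effect structure used in such time series regressions; a sketch that back-calculates the implied coefficient (illustrative only, not the study's fitted model):

```python
import math

def fold_increase(beta_per_mm, rain_mm):
    """Log-linear regression effect: concentration ratio relative to
    dry weather is exp(beta * rainfall)."""
    return math.exp(beta_per_mm * rain_mm)

# Back-calculation from the abstract's headline figure: a three-fold
# E. coli increase at 15 mm/24 h implies beta = ln(3)/15 per mm.
beta = math.log(3) / 15
ratio_at_15mm = fold_increase(beta, 15)   # 3.0 by construction
ratio_at_30mm = fold_increase(beta, 30)   # 9.0 if the effect stayed log-linear
```

The exponential form explains why consecutive wet days compound so strongly, and why turbidity (whose effect attenuated at very heavy rainfall) tracks fecal pollution less well than rainfall itself.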


Subject(s)
Rain , Rivers , Water Microbiology , Water Pollution , Geography , Humans , Nonlinear Dynamics , Seasons , Sweden , Weather
8.
Water Res ; 47(13): 4474-84, 2013 Sep 01.
Article in English | MEDLINE | ID: mdl-23764597

ABSTRACT

There are relatively few studies on the association between disturbances in drinking water services and symptoms of gastrointestinal (GI) illness. Health Call Centre data concerning GI illness may be a useful source of information. This study investigates whether there is an increased frequency of contacts with the Health Call Centre (HCC) concerning gastrointestinal symptoms at times when there is a risk of impaired water quality due to disturbances at water works or in the distribution network. The study was conducted in Gothenburg, a Swedish city of 0.5 million inhabitants with a surface water source of drinking water and two water works. All HCC contacts due to GI symptoms (diarrhoea, vomiting, or abdominal pain) were recorded for a three-year period, along with sex, age, and geocoded location of residence. The numbers of contacts with the HCC in the affected geographical areas were recorded during eight periods of disturbances at the water works (e.g. short stops of chlorine dosing), six periods of large disturbances in the distribution network (e.g. pumping station failures or pipe breaks with major consequences), and 818 pipe break and leak repairs over the three-year period. For each period of disturbance, the observed number of calls was compared with the number of calls during a control period without disturbances in the same geographical area. In total, about 55,000 calls to the HCC due to GI symptoms were recorded over the three-year period, 35 per 1000 inhabitants per year, but the rate was much higher (>200) for children <3 years of age. There was no statistically significant increase in calls due to GI illness during or after disturbances at the water works or in the distribution network. Our results indicate that GI symptoms due to disturbances at water works or in the distribution network are rare. The number of serious failures was, however, limited, and further studies are needed to assess the risk of GI illness in such cases.
The technique of using geocoded HCC data together with geocoded records of disturbances in the drinking water network proved feasible.


Subject(s)
Drinking Water , Gastrointestinal Diseases/epidemiology , Health Services/statistics & numerical data , Water Purification , Water Supply , Child, Preschool , Humans , Sweden/epidemiology
9.
J Water Health ; 10(3): 358-70, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22960480

ABSTRACT

The faecal contamination of drinking water sources can lead to waterborne disease outbreaks. To estimate the potential risk of waterborne infections caused by faecal contamination of drinking water sources, knowledge of the pathogen concentrations in raw water is required. We suggest a novel approach to estimate pathogen concentrations in a drinking water source by using microbial source tracking data and fate and transport modelling. First, the pathogen (norovirus, Cryptosporidium, Escherichia coli O157:H7) concentrations in faecal contamination sources around the drinking water source Lake Rådasjön in Sweden were estimated for endemic and epidemic conditions using measured concentrations of faecal indicators (E. coli and Bacteroidales genetic markers). The fate and transport of pathogens within the lake were then simulated using a three-dimensional coupled hydrodynamic and microbiological model. This approach provided information on the contribution of different contamination sources to the pathogen concentrations at the water intake of a drinking water treatment plant. It addresses the limitations of monitoring and provides data for quantitative microbial risk assessment (QMRA) and risk management in the context of faecal contamination of surface drinking water sources.


Subject(s)
Environmental Monitoring , Models, Theoretical , Water Microbiology , Water Supply/analysis , Computer Simulation , Feces/microbiology , Fresh Water/microbiology , Fresh Water/parasitology , Humans , Sweden , Time Factors , Water , Water Movements , Water Pollutants , Water Pollution/prevention & control
10.
Water Res ; 46(7): 2149-58, 2012 May 01.
Article in English | MEDLINE | ID: mdl-22348998

ABSTRACT

Lifetime distribution functions and current network age data can be combined to provide an assessment of the future replacement needs for drinking water distribution networks. Reliable lifetime predictions are limited by a lack of understanding of deterioration processes for different pipe materials under varied conditions. An alternative approach is the use of real historical data for replacement over an extended time series. In this paper, future replacement needs are predicted through historical data representing more than one hundred years of drinking water pipe replacement in Gothenburg, Sweden. The verified data fits well with commonly used lifetime distribution curves. Predictions for the future are discussed in the context of path dependence theory.
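Replacement forecasting of this kind typically combines the network's age profile with a lifetime distribution such as the Weibull; a sketch with assumed parameters, not the values fitted to the Gothenburg data:

```python
import math

def weibull_cdf(age_years, shape, scale):
    """Fraction of pipes expected to be replaced by a given age under
    a Weibull lifetime distribution."""
    return 1.0 - math.exp(-((age_years / scale) ** shape))

# Hypothetical parameters for illustration: scale ~80 years, shape ~2.5.
replaced_by_80 = weibull_cdf(80, 2.5, 80.0)   # ~63% replaced by age 80
median_life = 80.0 * math.log(2) ** (1 / 2.5)  # ~69 years
```

Summing this fraction over each installation-year cohort in the network's age profile yields the projected replacement volume per future year.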


Subject(s)
Drinking Water , Models, Economic , Water Supply/economics , Water Supply/statistics & numerical data , Forecasting/methods , Sweden
11.
Environ Sci Technol ; 46(2): 892-900, 2012 Jan 17.
Article in English | MEDLINE | ID: mdl-22148545

ABSTRACT

The implementation of microbial fecal source tracking (MST) methods in drinking water management is limited by the lack of knowledge on the transport and decay of host-specific genetic markers in water sources. To address these limitations, the decay and transport of human (BacH) and ruminant (BacR) fecal Bacteroidales 16S rRNA genetic markers in a drinking water source (Lake Rådasjön in Sweden) were simulated using a microbiological model coupled to a three-dimensional hydrodynamic model. The microbiological model was calibrated using data from outdoor microcosm trials performed in March, August, and November 2010 to determine the decay of BacH and BacR markers in relation to traditional fecal indicators. The microcosm trials indicated that the persistence of BacH and BacR in the microcosms was not significantly different from the persistence of traditional fecal indicators. The modeling of BacH and BacR transport within the lake illustrated that the highest levels of genetic markers at the raw water intakes were associated with human fecal sources (on-site sewers and emergency sewer overflow). This novel modeling approach improves the interpretation of MST data, especially when fecal pollution from the same host group is released into the water source from different sites in the catchment.
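Marker decay in microcosm trials of this kind is commonly summarized with a first-order rate constant and the derived T90 (time to a 1-log10 reduction); a sketch using invented counts, not the study's measurements:

```python
import math

def decay_rate(c0, ct, days):
    """First-order decay rate k (per day) from start and end counts
    of a genetic marker in a microcosm."""
    return math.log(c0 / ct) / days

def t90(k):
    """Time (days) for a 1-log10, i.e. 90%, reduction."""
    return math.log(10) / k

# Hypothetical microcosm counts: 1e5 -> 1e3 gene copies over 10 days.
k = decay_rate(1e5, 1e3, 10)   # ~0.46 per day
t90_days = t90(k)              # 5 days to lose 90% of the marker
```

Comparing T90 values between BacH/BacR and traditional indicators is one way to express the "not significantly different persistence" finding quantitatively.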


Subject(s)
Bacteria/classification , Bacteria/genetics , Rivers/microbiology , Water Microbiology , Water Supply/standards , Animals , Environmental Monitoring/methods , Feces/microbiology , Genetic Markers , Humans , Light , Models, Biological , Seasons , Time Factors , Water Pollutants
12.
Water Res ; 45(1): 241-53, 2011 Jan.
Article in English | MEDLINE | ID: mdl-20943244

ABSTRACT

Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems, and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantifying water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach that combines risk assessment with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). It comprises the following main parts: (1) quantification of the risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to risk reduction, the probability of not reaching water safety targets, and cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit while accounting for costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems, in order to avoid sub-optimisation of the resources available for risk reduction.
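Step (2) ranks alternatives by cost per unit of risk reduction; a minimal sketch with invented alternatives, costs, and risk-reduction figures (nothing here comes from the paper):

```python
def cost_effectiveness_ratio(annual_cost, risk_reduction):
    """Cost per unit of risk reduction; a lower ratio is better."""
    return annual_cost / risk_reduction

# Hypothetical alternatives: (annual cost in kSEK, expected infections
# avoided per year). Names and numbers are illustrative assumptions.
alternatives = {
    "UV upgrade": (900, 30.0),        # 30 kSEK per avoided infection
    "new GAC filters": (1500, 40.0),  # 37.5 kSEK per avoided infection
}
ranked = sorted(
    alternatives,
    key=lambda name: cost_effectiveness_ratio(*alternatives[name]),
)
best = ranked[0]   # "UV upgrade" wins on cost-effectiveness here
```

In the paper's full approach these risk-reduction figures would come from the probabilistic fault tree model, carrying their uncertainties into the CEA rather than using point values as above.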


Subject(s)
Cost-Benefit Analysis , Risk Reduction Behavior , Water Supply/analysis
13.
Water Res ; 43(6): 1641-53, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19157488

ABSTRACT

Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
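A fault tree's top-event probability can be estimated with Monte Carlo simulation, as in the paper; the sketch below handles only the simplest case, an OR gate over independent basic events with assumed probabilities:

```python
import random

def or_gate_failure_prob(p_events, n_trials=100_000, seed=1):
    """Monte Carlo estimate of P(top event) for an OR gate: the system
    fails in a trial if any basic event occurs."""
    rng = random.Random(seed)
    failures = sum(
        any(rng.random() < p for p in p_events) for _ in range(n_trials)
    )
    return failures / n_trials

# Hypothetical basic-event probabilities, e.g. per-day failure of
# source, treatment, and distribution barriers (assumed values).
p_top = or_gate_failure_prob([0.01, 0.02, 0.005])
# Analytical value for comparison: 1 - (0.99 * 0.98 * 0.995) ~ 0.0347
```

A real source-to-tap tree nests AND/OR gates and distinguishes quantity from quality failures; sampling the basic-event probabilities from distributions instead of point values propagates the uncertainty the paper describes.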


Subject(s)
Water Purification/standards , Water Supply/standards , Environmental Monitoring/methods , Models, Biological , Monte Carlo Method , Probability , Reproducibility of Results , Risk Assessment , Sweden , Water Microbiology/standards
14.
Scand J Infect Dis ; 39(4): 323-31, 2007.
Article in English | MEDLINE | ID: mdl-17454896

ABSTRACT

A large community outbreak of norovirus (NV) gastrointestinal infection occurred in Västra Götaland County, Sweden, in August 2004, following attendance at recreational lakes. A frequency age-matched case-control study of persons who had attended these lakes was undertaken to identify risk factors; 163 cases and 329 controls were included. The analysis indicates that getting water in the mouth while swimming (OR = 4.7; 95% CI 1.1-20.2), attendance at the main swimming area at Delsjön Lake (OR = 25.5; 95% CI 2.5-263.8), taking water home from a freshwater spring near Delsjön Lake (OR = 17.3; 95% CI 2.7-110.7), and swimming less than 20 m from shore (OR = 13.4; 95% CI 2.0-90.2) were significant risk factors. The probable vehicle was local contamination of the lake water (especially at the main swimming area). The source of contamination could not be determined.
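The odds ratios and confidence intervals reported above follow from standard 2x2-table calculations; a sketch using the Woolf (log-OR) interval and invented counts, since the abstract gives only the ORs and CIs:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts for one exposure (illustration only):
or_, lower, upper = odds_ratio_ci(40, 30, 20, 70)
```

Note the wide intervals in the abstract (e.g. 2.5-263.8) reflect small cell counts: the Woolf standard error blows up when any cell of the table is small.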


Subject(s)
Caliciviridae Infections/epidemiology , Disease Outbreaks , Fresh Water/virology , Gastroenteritis/epidemiology , Norovirus , Swimming , Adolescent , Adult , Case-Control Studies , Child , Child, Preschool , Female , Gastroenteritis/virology , Health Surveys , Humans , Infant , Male , Recreation , Risk Factors , Sweden/epidemiology