ABSTRACT
The correlations between SARS-CoV-2 RNA levels in wastewater from 12 wastewater treatment plants and new COVID-19 cases in the corresponding sewersheds of 10 communities were studied over 17 months. The analysis from the longest continuous surveillance reported to date revealed that SARS-CoV-2 RNA levels correlated well with temporal changes in COVID-19 cases in each community. The strongest correlation was found during the third wave (r = 0.97) based on population-weighted SARS-CoV-2 RNA levels in wastewater. Different correlations (r from 0.51 to 0.86) were observed in communities of various sizes. The population of the sewershed had no observed effect on the strength of the correlation. Fluctuations of SARS-CoV-2 RNA levels in wastewater mirrored increases and decreases of COVID-19 cases in the corresponding community. Because viral shedding to sewers from all infected individuals is captured, wastewater-based surveillance provides an unbiased and non-discriminatory estimate of COVID-19 prevalence compared with clinical testing, which was subject to test-seeking behaviors and policy changes. Wastewater-based surveillance of SARS-CoV-2 reflects the temporal trend of COVID-19 disease burden and is an effective supplementary monitoring tool once the number of COVID-19 cases reaches the detection threshold of SARS-CoV-2 RNA in wastewater at treatment facilities serving populations of various sizes.
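A minimal sketch of the population-weighted aggregation and Pearson correlation described above, assuming a tidy table with hypothetical column names (week, population, n1_copies_per_ml, new_cases); it illustrates the calculation, not the authors' actual pipeline.

```python
# Minimal sketch: population-weighted SARS-CoV-2 RNA signal vs. new COVID-19 cases.
# Column names and data layout are assumptions for illustration only.
import pandas as pd
from scipy.stats import pearsonr

def weekly_weighted_signal(df: pd.DataFrame) -> pd.Series:
    """Population-weighted mean RNA level per sampling week across all plants."""
    def weighted_mean(g):
        return (g["n1_copies_per_ml"] * g["population"]).sum() / g["population"].sum()
    return df.groupby("week").apply(weighted_mean)

def correlate_with_cases(df: pd.DataFrame) -> float:
    """Pearson r between the weighted wastewater signal and weekly new cases."""
    signal = weekly_weighted_signal(df)
    cases = df.groupby("week")["new_cases"].sum()
    aligned = pd.concat([signal.rename("signal"), cases], axis=1).dropna()
    r, _p = pearsonr(aligned["signal"], aligned["new_cases"])
    return r

# Example usage with hypothetical data:
# df = pd.read_csv("wastewater_weekly.csv")
# print(f"Pearson r = {correlate_with_cases(df):.2f}")
```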
ABSTRACT
Using a uniquely large set of test results from 1,842 samples collected at 12 wastewater treatment plants (WWTPs) over 14 months spanning low to high COVID-19 prevalence, the sensitivity of RT-qPCR detection of SARS-CoV-2 RNA in wastewater corresponding to the contributing communities was computed using Probit analysis. This study determined the number of new COVID-19 cases per 100,000 population required to detect SARS-CoV-2 RNA in wastewater at defined probabilities and provides an evidence-based framework for wastewater-based epidemiology (WBE) surveillance. Input data were positive and negative test results for SARS-CoV-2 RNA in wastewater samples and the corresponding new COVID-19 case rates per 100,000 population served by each WWTP. The analyses determined that the RT-qPCR-based SARS-CoV-2 RNA detection threshold at 50%, 80% and 99% probability required a median of 8 (range: 4-19), 18 (9-43), and 38 (17-97) new COVID-19 cases per 100,000, respectively. In other words, positive detection at 50%, 80% and 99% probability corresponded on average to new-case rates of 0.01%, 0.02%, and 0.04% of the population. This study improves understanding of the performance of WBE-based SARS-CoV-2 RNA detection using a large dataset and a prolonged study period. Estimating the community-level COVID-19 burden that would result in a positive detection of SARS-CoV-2 in wastewater is critical to support WBE as a supplementary warning/monitoring system for COVID-19 prevention and control.
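A minimal sketch of the Probit dose-response analysis described above, assuming binary wastewater detection results paired with new-case rates and a log10 dose scale; the scale and variable names are illustrative assumptions, not the study's exact model specification.

```python
# Minimal sketch of a Probit dose-response fit: probability of a positive wastewater
# RT-qPCR result as a function of new COVID-19 cases per 100,000. The log10 dose
# scale is an assumption; case rates of zero would need an offset or exclusion.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def fit_probit(case_rate: np.ndarray, detected: np.ndarray):
    """detected: 1/0 wastewater results; case_rate: new cases per 100,000 (> 0)."""
    X = sm.add_constant(np.log10(case_rate))   # assumed log10 dose scale
    return sm.Probit(detected, X).fit(disp=False)

def cases_at_probability(result, p: float) -> float:
    """Invert the fitted probit curve to the case rate giving detection probability p."""
    b0, b1 = result.params
    return 10 ** ((norm.ppf(p) - b0) / b1)

# Example usage with hypothetical data:
# res = fit_probit(case_rate, detected)
# for p in (0.50, 0.80, 0.99):
#     print(p, round(cases_at_probability(res, p), 1), "cases/100,000")
```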
Subject(s)
COVID-19 , Humans , COVID-19/epidemiology , SARS-CoV-2/genetics , Wastewater/analysis , RNA, Viral/genetics , RNA, Viral/analysis , Alberta/epidemiology
ABSTRACT
Wastewater-based surveillance (WBS) data normalization is an analyte measurement correction that addresses variations resulting from dilution of fecal discharge by non-sanitary sewage, stormwater or groundwater infiltration. No consensus exists on what WBS normalization parameters result in the strongest correlations and lead time between SARS-CoV-2 WBS data and COVID-19 cases. This study compared flow, population size and biomarker normalization impacts on the correlations and lead times for ten communities in twelve sewersheds in Alberta (Canada) between September 2020 and October 2021 (n = 1024) to determine if normalization by Pepper Mild Mottle Virus (PMMoV) provides any advantages compared to other normalization parameters (e.g., flow, reported and dynamic population sizes, BOD, TSS, NH3, TP). PMMoV concentrations (GC/mL) corresponded with plant influent flows and were highest in the urban centres. SARS-CoV-2 target genes E, N1 and N2 were all negatively associated with wastewater influent pH, while PMMoV was positively associated with temperature. Pooled data analysis showed that normalization increased ρ-values by almost 0.1; the increase was largest for ammonia, TKN and TP, followed by PMMoV. Normalization by other parameters weakened associations. None of the differences were statistically significant. Site-specific correlations showed that normalization of SARS-CoV-2 data by PMMoV only improved correlations significantly in two of the twelve systems, neither of which was a large sewershed or a combined sewer system. In five systems, normalization by traditional wastewater strength parameters and dynamic population estimates improved correlations. Lead time ranged between 1 and 4 days in both pooled and site-specific comparisons. We recommend that WBS researchers and health departments: a) Investigate WWTP influent properties (e.g., pH) in the WBS planning phase and use at least two parallel approaches for normalization only if shown to provide value; b) Explore normalization by wastewater strength parameters and dynamic population size estimates further; and c) Evaluate purchasing an influent flow meter in small communities to support long-term WBS efforts and WWTP management.
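A minimal sketch of PMMoV normalization followed by a lagged Spearman correlation screen for lead time, assuming daily, date-indexed data with hypothetical column names; it illustrates the general approach rather than the study's exact statistical workflow.

```python
# Minimal sketch: normalize SARS-CoV-2 N1 by PMMoV and screen lead times with
# lagged Spearman correlations against reported cases. Column names are assumed.
import pandas as pd
from scipy.stats import spearmanr

def lagged_spearman(df: pd.DataFrame, max_lead_days: int = 7) -> pd.Series:
    """Spearman rho between PMMoV-normalized N1 and cases shifted by 0..max_lead_days."""
    norm_signal = df["n1_gc_per_ml"] / df["pmmov_gc_per_ml"]
    rhos = {}
    for lead in range(max_lead_days + 1):
        shifted_cases = df["new_cases"].shift(-lead)   # wastewater leading cases by `lead` days
        pair = pd.concat([norm_signal, shifted_cases], axis=1).dropna()
        rho, _p = spearmanr(pair.iloc[:, 0], pair.iloc[:, 1])
        rhos[lead] = rho
    return pd.Series(rhos, name="spearman_rho")

# Example usage with hypothetical daily data indexed by date:
# rhos = lagged_spearman(df)
# print("best lead time (days):", rhos.idxmax())
```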
Subject(s)
COVID-19 , Humans , SARS-CoV-2 , Alberta , Lead , Wastewater-Based Epidemiological Monitoring
ABSTRACT
The land application of digestate from anaerobic digestion (AD) is considered a significant route for transmitting antibiotic resistance genes (ARGs) and mobile genetic elements (MGEs) to ecosystems. To date, efforts towards understanding the complex non-linear interactions between AD operating parameters and ARG/MGE abundances rely on experimental investigations due to a lack of mechanistic models. Herein, three different machine learning (ML) algorithms, Random Forest (RF), eXtreme Gradient Boosting (XGBoost), and Artificial Neural Network (ANN), were compared for their predictive capacities in simulating ARG/MGE abundance changes during AD. The models were trained and cross-validated using experimental data collected from 33 published studies. The comparison of model performance using coefficients of determination (R2) and root mean squared errors (RMSE) indicated that ANN was more reliable than RF and XGBoost. The mode of operation (batch/semi-continuous), co-digestion of food waste and sewage sludge, and residence time were identified as the three most critical features in predicting ARG/MGE abundance changes. Moreover, the trained ANN model could simulate non-linear interactions between operational parameters and ARG/MGE abundance changes that could be interpreted intuitively based on existing knowledge. Overall, this study demonstrates that machine learning can enable reliable predictive models that provide a holistic optimization tool for mitigating the ARG/MGE transmission potential of AD.
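A minimal sketch of the model comparison described above using cross-validated R2 and RMSE; scikit-learn's GradientBoostingRegressor and MLPRegressor stand in here for the XGBoost and ANN implementations, and all hyperparameters are placeholders rather than the study's tuned settings.

```python
# Minimal sketch comparing RF, XGBoost-style gradient boosting, and an ANN with
# k-fold cross-validation on R2 and RMSE for ARG/MGE abundance-change prediction.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_validate, KFold

def compare_models(X: np.ndarray, y: np.ndarray, seed: int = 0) -> dict:
    """Return mean cross-validated R2 and RMSE for each candidate model."""
    models = {
        "RF": RandomForestRegressor(n_estimators=300, random_state=seed),
        "GBM": GradientBoostingRegressor(random_state=seed),
        "ANN": make_pipeline(StandardScaler(),
                             MLPRegressor(hidden_layer_sizes=(32, 16),
                                          max_iter=5000, random_state=seed)),
    }
    cv = KFold(n_splits=5, shuffle=True, random_state=seed)
    scores = {}
    for name, model in models.items():
        out = cross_validate(model, X, y, cv=cv,
                             scoring=("r2", "neg_root_mean_squared_error"))
        scores[name] = {"R2": out["test_r2"].mean(),
                        "RMSE": -out["test_neg_root_mean_squared_error"].mean()}
    return scores
```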
Subject(s)
Anti-Bacterial Agents , Refuse Disposal , Algorithms , Anaerobiosis , Anti-Bacterial Agents/pharmacology , Drug Resistance, Microbial/genetics , Ecosystem , Food , Genes, Bacterial , Machine Learning , Sewage
ABSTRACT
Coastal waters, surface waters, and groundwater are impacted by wastewater and stormwater discharges, as well as agricultural flows containing animal waste and nutrients. A One Water approach posits that components of the water system have overlapping and interactive impacts on other aspects of the system, for which a comprehensive approach to water management is needed to further inform public health decisions. Current frameworks for monitoring wastewater effluent and recreational surface waters include the measurement of fecal indicator bacteria. Although viral pathogens are likely to be transported further and can survive longer than bacterial pathogens, virus monitoring is not required for recreational waters. A scientific consensus is emerging that the use of bacterial indicators alone does not account for or represent the health risks associated with viral pathogens due to the differences in the fate and transport of bacterial versus viral pathogens in wastewater treatment, surface water, and groundwater. Furthermore, it is likely that the public health risk associated with these waterborne pathogens is variable and diverse. For example, under drought conditions, effluents of urban water systems can comprise most of the dry weather flow in downstream waters, which are often used as sources of drinking water. This de facto reuse could increase viral risk for the end users of this water. A One Water approach will aid in protecting the health of the public from waterborne pathogens, regardless of where those pathogens entered the water system. In this review, we assert that monitoring for fecal indicator viruses can complement the monitoring of bacterial indicators, thereby improving public health protections. Bacteriophages have the strongest research foundation and correlation with viral pathogens, along with some predictive power for risk to human health. Methods for detecting and quantifying coliphages are briefly summarized, as are challenges in the implementation of testing. Key knowledge gaps and research priorities are discussed so that the potential value and limitations of coliphage monitoring can be better addressed and understood.
Subject(s)
Drinking Water , Water Purification , Animals , Coliphages , Environmental Monitoring , Feces/microbiology , Water Microbiology
ABSTRACT
Here, we report data from the principal component analysis (PCA) and clustering analysis of a low-temperature thermal hydrolysis process (THP) for enhancing the anaerobic digestion (AD) of sludge in wastewater treatment plants (WWTPs) with primary sludge fermentation (Azizi et al., 2021). PCA was used to pinpoint the influence of different THP schemes on the solubilization of macromolecular compounds after low-temperature THP and on their relative performance in enhancing methane potential in AD. We established two experimental setups with a total of 18 treatment conditions (three exposure times of 30, 60, and 90 min at three temperatures of 50, 70 and 90 °C), compared against untreated control samples. Scheme 1 comprised THP of a 1:1 (by volume) mixture of fermented primary sludge (FPS) and thickened waste activated sludge (TWAS), while Scheme 2 comprised THP of TWAS only. The factors included in the PCA were the variations in macromolecular compounds and other solubilization metrics: the levels of carbohydrates, lipids and proteins, and the solubilization of chemical oxygen demand (COD) and volatile suspended solids (VSS). The evaluation also considered changes in volatile fatty acids (VFAs) and total ammonia nitrogen (TAN) with respect to time and temperature. The PCA classified the THP conditions based on the differences and alterations that occurred after treatment. The indices of the PCA assessments differed according to the factors of concern and the focus of each individual assessment. In every individual PCA assessment, the contribution to the total variance was calculated and expressed through the distribution along the first two principal component axes, PC1 and PC2. The differences in PC distributions across the PCA examinations describe the relative influence of the THP schemes and identify the variables driving the largest differences among THP conditions. These comparative differences support further investigation of the efficiency of THP conditions and their performance categories.
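A minimal sketch of the PCA step described above, assuming one row per treatment condition and hypothetical feature names for the solubilization metrics; it reports the variance captured by PC1 and PC2 but is not the authors' exact analysis.

```python
# Minimal sketch: standardize solubilization metrics for the THP conditions and
# controls, then report the variance explained by the first two principal components.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

FEATURES = ["carbohydrates", "proteins", "lipids", "scod",
            "vss_solubilization", "vfa", "tan"]   # assumed metric names

def run_pca(df: pd.DataFrame, n_components: int = 2):
    """Return PC scores and the fraction of variance explained by each component."""
    X = StandardScaler().fit_transform(df[FEATURES])
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)
    scores_df = pd.DataFrame(scores, index=df.index,
                             columns=[f"PC{i + 1}" for i in range(n_components)])
    return scores_df, pca.explained_variance_ratio_

# Example usage with hypothetical data (one row per THP condition):
# scores, var = run_pca(df)
# print("variance explained by PC1, PC2:", var)
```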
ABSTRACT
The objective of this study was to evaluate the impacts of leachate co-treatment at a full-scale municipal WWTP by comparing plant performance at varying levels of leachate contributions and hydraulic loadings. The leachate BOD:COD ratio was 0.08 ± 0.07, indicating a stabilized, old matrix, and concentrations of zinc, iron, aluminum, chloride and sulfate were 0.174, 38, 1.47, 1803 and 119.1 mg/L, respectively. The average volumetric leachate ratio (VLR%) was approximately 0.01%, corresponding to a daily volume of 30 m³, but reached a maximum of 270 m³ (VLR% = 0.1%) and fluctuated on a daily basis. A cluster analysis revealed five VLR% groupings that were used for subsequent analyses: no leachate, 0 < Low ≤ 0.001, 0.001 < Medium ≤ 0.02, 0.02 < High ≤ 0.05, and 0.05 < Very high ≤ 0.2. Treated effluent concentrations of TKN, ammonia, fecal coliforms (FC), E. coli (EC), TSS and TP followed a trend in which effluent quality improved at low and medium VLR% compared to no leachate addition, but deteriorated at high and very high VLR%. Treated effluent UVT% and EC were not statistically significantly different at varying VLR%, but FC was. Plant hydraulic loading had a significant impact on removal rates. Ammonia removal and nitrite concentrations improved under high-flow conditions, while TP, BOD and cBOD removals deteriorated. Finally, VLR%, leachate COD, TKN, ammonia, chloride and arsenic had significant relationships with plant performance. Thus, for leachate of comparable age and strength, VLR% should not exceed low to medium contributions (between 0 and 0.02%) during co-treatment at this WWTP.
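A minimal sketch of how VLR% groupings could be derived by clustering, assuming daily VLR% values and a five-cluster k-means; the study's exact clustering method is not specified in the abstract, so this is illustrative only.

```python
# Minimal sketch: cluster daily volumetric leachate ratio (VLR%) values into
# groupings and summarize the range covered by each group. Days with no leachate
# (VLR% = 0) could be split out as their own group before clustering.
import pandas as pd
from sklearn.cluster import KMeans

def vlr_groupings(vlr_percent: pd.Series, n_clusters: int = 5) -> pd.DataFrame:
    """Cluster VLR% values and return the min, max and count for each group."""
    x = vlr_percent.to_numpy().reshape(-1, 1)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(x)
    out = pd.DataFrame({"vlr_percent": vlr_percent, "cluster": labels})
    return (out.groupby("cluster")["vlr_percent"]
               .agg(["min", "max", "count"])
               .sort_values("min"))

# Example usage with hypothetical daily data:
# print(vlr_groupings(df["vlr_percent"]))
```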
Subject(s)
Water Pollutants, Chemical , Water Purification , Ammonia , Bioreactors , Escherichia coli , Nitrogen/analysis , Waste Disposal, Fluid , Water Pollutants, Chemical/analysis
ABSTRACT
Wastewater surveillance of SARS-CoV-2 has become a promising tool to estimate population-level changes in community infections and the prevalence of COVID-19 disease. Although many studies have reported the detection and quantification of SARS-CoV-2 in wastewater, remarkable variation remains in the methodology. In this study, we validated a molecular testing method that concentrates viruses from wastewater by ultrafiltration and detects SARS-CoV-2 using a one-step RT-qPCR assay. The following parameters were optimized by quantifying SARS-CoV-2 or a spiked human coronavirus strain 229E (hCoV-229E): sample storage conditions, wastewater pH, RNA extraction, and the RT-qPCR assay. Wastewater samples stored at 4 °C after collection showed significantly enhanced detection of SARS-CoV-2, with cycle threshold (Ct) values approximately 2-3 lower than samples stored at -20 °C. Pre-adjusting the wastewater pH to 9.6 to aid virus desorption, followed by readjustment to neutral after solids removal, significantly increased the recovery of spiked hCoV-229E. Of the five commercially available RNA isolation kits evaluated, the MagMAX-96 viral RNA isolation kit showed the best recovery of hCoV-229E (50.1 ± 20.1%). Compared with two-step RT-qPCR, one-step RT-qPCR improved the sensitivity of SARS-CoV-2 detection. Salmon DNA was included to monitor PCR inhibition, and pepper mild mottle virus (PMMoV), a fecal indicator indigenous to wastewater, was used to normalize SARS-CoV-2 levels. Our method for molecular detection of SARS-CoV-2 in wastewater provides a useful tool for public health surveillance of COVID-19.
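A minimal sketch of standard-curve quantification and PMMoV normalization as described above; the slope and intercept values are placeholder assumptions, not the assay's actual calibration.

```python
# Minimal sketch: convert a Ct value to copies via a log-linear standard curve,
# then express the SARS-CoV-2 signal relative to PMMoV. Calibration values are
# illustrative assumptions only.
def copies_from_ct(ct: float, slope: float = -3.3, intercept: float = 38.0) -> float:
    """Ct = slope * log10(copies) + intercept, solved for copies per reaction."""
    return 10 ** ((ct - intercept) / slope)

def pmmov_normalized(sars_copies: float, pmmov_copies: float) -> float:
    """SARS-CoV-2 signal expressed relative to the PMMoV fecal-strength indicator."""
    return sars_copies / pmmov_copies

# Example: Ct of 35 for N1 and 22 for PMMoV measured in the same sample.
# n1 = copies_from_ct(35.0)
# pmmov = copies_from_ct(22.0)
# print(pmmov_normalized(n1, pmmov))
```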
Subject(s)
COVID-19 , SARS-CoV-2 , Humans , RNA, Viral , Wastewater-Based Epidemiological Monitoring
ABSTRACT
Current wastewater worker guidance from the United States Environmental Protection Agency (USEPA) aligns with the Centers for Disease Control and Prevention (CDC) and the Occupational Safety and Health Administration (OSHA) recommendations and states that no additional specific protections against SARS-CoV-2, the virus that causes COVID-19 infections, are recommended for employees involved in wastewater management operations with residuals, sludge, and biosolids at water resource recovery facilities. The USEPA guidance references a document from 2002 that summarizes practices required for protection of workers handling Class B biosolids to minimize exposure to pathogens including viruses. While there is no documented evidence that residuals or biosolids of any treatment level contain infectious SARS-CoV-2 or are a source of transmission of this current pandemic strain of coronavirus, this review summarizes and examines whether the provided federal guidance is sufficient to protect workers in view of currently available data on SARS-CoV-2 persistence and transmission. No currently available epidemiological data establish a direct link between wastewater sludge or biosolids and risk of infection from SARS-CoV-2. Despite shedding of the RNA of the virus in feces, there is no evidence supporting the presence or transmission of infectious SARS-CoV-2 through the wastewater system or in biosolids. In addition, this review presents previous epidemiologic data related to other non-enveloped viruses. Overall, the risk of exposure to SARS-CoV-2, or any pathogen, decreases with increasing treatment measures. As a result, the highest risk of exposure is related to spreading and handling untreated feces or stool, followed by untreated municipal sludge and then Class B biosolids, while the lowest risk is associated with spreading or handling Class A biosolids. This review reinforces federal recommendations and the importance of vigilance in applying occupational risk mitigation measures to protect public and occupational health.
Subject(s)
COVID-19 , Occupational Health , Biosolids , Humans , Pandemics , SARS-CoV-2 , United States
ABSTRACT
As the numbers of COVID-19 cases grew globally, the severe shortages of health care respiratory protective equipment impacted the ability of water resource recovery facilities (WRRFs) to acquire N95 masks for worker protection. While the Occupational Safety and Health Administration (OSHA) encourages WRRFs to conduct job safety assessments to mitigate risks from bioaerosols, it does not provide clear guidance on respiratory protection requirements, leaving the use of N95 masks across the industry non-standardized and difficult to justify. Strategies need to be developed to cope with shortages during pandemics, and these should take into consideration a WRRF's size and disinfection equipment available. Our objective is to provide an overview of respiratory protection-related practices recommended for health care professionals that apply to WRRFs (e.g., elimination, substitution, extended use, reuse, disinfection). Reviewed N95 mask disinfection strategies included using hydrogen peroxide, autoclaving, moist heat, dry heat, ultraviolet germicidal irradiation (UVGI), ethylene oxide, chlorine and ethanol. Of these, dry heat, autoclaving and UVGI present the most promise for WRRFs, with UVGI being limited to larger utilities. We recommend that WRRFs work closely with disinfection technology manufacturers, mask providers, health and safety staff and inspectors to develop suitable programs to cope with N95 mask shortages during pandemics.
Subject(s)
COVID-19 , Pandemics , Equipment Reuse , Humans , N95 Respirators , SARS-CoV-2 , Water Resources
ABSTRACT
While researchers have acknowledged the potential role of environmental scientists, engineers, and industrial hygienists during this pandemic, the role of the water utility professional is often overlooked. The wastewater sector is critical to public health protection and employs collection and treatment system workers who perform tasks with high potential for exposure to biological agents. While various technical guidance documents and reports initially provided direction to the water sector, the rapidly growing body of research publications necessitates continual review of these papers and ongoing data synthesis. This paper presents the latest findings and highlights their implications from a water and wastewater utility operation and management perspective. PRACTITIONER POINTS: Extrapolation from SARS-CoV-1 and MERS-CoV, as well as other surrogates, has helped predict SARS-CoV-2 behavior and inform risk management. Data from treated wastewater effluent suggest that current processes are sufficient for SARS-CoV-2 control. Scientific evidence supports the possibility of fecal-oral transmission of SARS-CoV-2. Limited evidence supports the potential survival of infective SARS-CoV-2 on surfaces and in aerosols and the efficacy of control measures at reducing transmission. Protective practices and PPE can protect workers from SARS-CoV-2 and other pathogens found in wastewater.
Subject(s)
COVID-19 , SARS-CoV-2 , Humans , Pandemics , Water
ABSTRACT
Biological nutrient removal (BNR) is highly reliant on maintaining a heterogeneous, balanced, and metabolically active microbial community that can adapt to the fluctuating composition of influent wastewater and the prevailing environmental conditions. Maintaining this balance can be challenging in municipal wastewater systems that sporadically receive wastewater from industrial facilities due to the impact of heavy metals and other contaminants on the microbial ecology of the activated sludge. A thorough understanding of the impacts of heavy metals on activated sludge and of practical monitoring options is needed to support decision-making at the wastewater utility level. This paper is divided into two parts. In the first part, the review explains what happens when heavy metals interact with activated sludge systems by highlighting biosorption and bioaccumulation processes, and when an activated sludge system switches from bioaccumulation to toxic shock. Here, it also summarizes the impacts of heavy metal exposure on plant performance. In the second part, the review summarizes practical approaches that can be used at the plant, outside the realm of traditional toxicological bioassay testing, to determine the possible impacts of influent heavy metal concentrations on the BNR process. These approaches include the following: monitoring operational parameters for major shifts; respirometry; microscopy; ATP; chemical analyses of heavy metals with a focus on synergistic impacts and inhibitory limits; and other novel approaches, such as EPS chemical analyses, molecular techniques, and quorum sensing.
Subject(s)
Metals, Heavy/analysis , Microbiota , Environmental Monitoring , Sewage , /analysis
ABSTRACT
The aim of this study was to compare the antibiotic susceptibility of eighty Escherichia coli isolates from vegetables and from food products of animal origin in Tunisia, and to characterize their antibiotic resistance genes and in vitro biofilm-forming capacity. Antimicrobial susceptibilities were determined, and genes associated with antibiotic resistance were investigated by PCR. Biofilm formation was tested using four different methods: the microtiter plate, MTT-staining, XTT-staining, and Congo Red Agar assays. High antibiotic resistance rates were observed for amoxicillin (68.7%), amoxicillin/clavulanic acid (73.7%), gentamicin (68.7%), kanamycin (66.2%), nalidixic acid (36.2%), streptomycin (68.7%) and tetracycline (35%). The majority of isolates were multidrug resistant and biofilm producers. MTT testing showed that vegetable isolates were significantly stronger biofilm producers than isolates from foods of animal origin. This study showed that E. coli isolates from food products are reservoirs of antibiotic resistance genes and have a high propensity to produce biofilm.
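A minimal sketch of classifying biofilm producers from microtiter-plate optical densities using one widely used cut-off convention (negative-control mean plus three standard deviations); the study's own scoring criteria are not given in the abstract, so this is illustrative only.

```python
# Minimal sketch: qualitative biofilm classification from microtiter-plate OD readings,
# using a cut-off (ODc) derived from negative-control wells. The specific category
# multipliers follow a common convention and are an assumption here.
import statistics

def classify_biofilm(od_isolate: float, od_controls: list[float]) -> str:
    """Return a qualitative biofilm category for one isolate."""
    odc = statistics.mean(od_controls) + 3 * statistics.stdev(od_controls)
    if od_isolate <= odc:
        return "non-producer"
    if od_isolate <= 2 * odc:
        return "weak"
    if od_isolate <= 4 * odc:
        return "moderate"
    return "strong"

# Example with hypothetical readings:
# print(classify_biofilm(0.45, [0.08, 0.09, 0.10, 0.11]))
```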
Subject(s)
Escherichia coli , Vegetables , Animals , Biofilms , Drug Resistance, Microbial , Escherichia coli/genetics , Humans , Microbial Sensitivity Tests , Tunisia
ABSTRACT
Association of the water- and foodborne pathogen Campylobacter jejuni with free-living Acanthamoeba spp. trophozoites enhances C. jejuni survival and resistance to biocides and starvation. When facing less than optimal environmental conditions, however, the Acanthamoeba spp. host can temporarily transform from trophozoite to cyst and back to trophozoite, calling the survival of the internalized symbiont and resulting public health risk into question. Studies investigating internalized C. jejuni survival after A. castellanii trophozoite transformation have neither been able to detect its presence inside the Acanthamoeba cyst after encystation nor to confirm its presence upon excystation of trophozoites through culture-based techniques. The purpose of this study was to detect C. jejuni and Mycobacterium avium recovered from A. polyphaga trophozoites after co-culture and induction of trophozoite encystation using three different encystation methods (Neff's medium, McMillen's medium and refrigeration), as well as after cyst excystation. Internalized M. avium was used as a positive control, since studies have consistently detected the organism after co-culture and after host excystation. Concentrations of C. jejuni in A. polyphaga trophozoites were 4.5 × 10⁵ CFU/ml, but it was not detected by PCR or culture post-encystation. This supports the hypothesis that C. jejuni may be digested during encystation of the amoebae. M. avium was recovered at a mean concentration of 1.9 × 10⁴ from co-cultured trophozoites and 4.4 × 10¹ CFU/ml after excystation. The results also suggest that M. avium recovery post-excystation was statistically significantly different based on which encystation method was used, ranging from 1.3 × 10¹ for Neff's medium to 5.4 × 10¹ CFU/ml for refrigeration. No M. avium was recovered from A. polyphaga cysts when trophozoites were encysted by McMillen's medium. Since C. jejuni internalized in cysts would be more likely to survive harsh environmental conditions and disinfection, a better understanding of potential symbioses between free-living amoebae and campylobacters in drinking water distribution systems and food processing environments is needed to protect public health. Future co-culture experiments examining survival of internalized C. jejuni should carefully consider the encystation media used, and include molecular detection tools to falsify the hypothesis that C. jejuni may be present in a viable but not culturable state.
Subject(s)
Acanthamoeba/microbiology , Campylobacter jejuni/physiology , Mycobacterium avium/physiology , Acanthamoeba/genetics , Acanthamoeba/growth & development , Bacterial Load , Coculture Techniques , Culture Media/chemistry , DNA, Protozoan/isolation & purification , Nucleic Acid Amplification Techniques , Refrigeration , Symbiosis , Trophozoites
ABSTRACT
BACKGROUND: Evidence implicates textiles in health care as potential reservoirs of pathogens. No similar data exist for the wastewater treatment plant (WWTP) industry. We investigated if coveralls worn by WWTP workers could present occupational infection risk. METHODS: We enumerated heterotrophic plate counts (HPCs), total coliforms, Escherichia coli, Pseudomonas aeruginosa, Staphylococcus aureus, methicillin-resistant Staphylococcus aureus, Clostridium difficile, and Acinetobacter spp. on coverall swatches experimentally contaminated with raw, primary, secondary, and final effluent. Contaminated swatches were examined by culture-based methods after laundering, tumble-drying, and storing. RESULTS: Concentrations of microorganisms and efficacy of decontamination differed depending on the contaminating wastewater matrix and the organism. Laundering was an effective decontamination method for coveralls contaminated with all microorganisms, except HPCs. Tumble-drying resulted in statistically significant decreases for HPCs, P. aeruginosa, and Acinetobacter. Increases in contamination after laundering were seen in Acinetobacter spp., in P. aeruginosa when coverall swatches were contaminated with raw and final effluent, and in HPCs when contaminated with secondary effluent. DISCUSSION: Results suggest that laundering alone at 60°C for 25 minutes as per ASTM Standard F1449 may not always be an efficient means of controlling microorganisms on coveralls. CONCLUSIONS: Clearer guidelines are needed to better protect WWTP workers.
Subject(s)
Decontamination/methods , Laundering/methods , Textiles/microbiology , /microbiology , Bacteria/growth & development , Colony Count, Microbial/methods , Humans
ABSTRACT
This study examined the distribution of antibiotic resistant Escherichia coli and E. coli O157 isolated from water, sediment and biofilms in an intensive agricultural watershed (Elk Creek, British Columbia) between 2005 and 2007. It also examined physical and chemical water parameters associated with antibiotic resistance. Broth microdilution techniques were used to determine minimum inhibitory concentrations (MICs) of ampicillin, cefotaxime, ciprofloxacin, nalidixic acid, streptomycin and tetracycline for the recovered E. coli (n=214) and E. coli O157 (n=27) isolates. Both E. coli and E. coli O157 isolates showed the highest frequencies of resistance to tetracycline, ampicillin, streptomycin and nalidixic acid. For E. coli, the highest frequency of resistance was observed at the most agriculturally impacted site, while the lowest frequency of resistance was found at the headwaters. Sediment and river rock biofilms were the most likely to be associated with resistant E. coli, while water was the least likely. While seasonality (wet versus dry) had no relationship with resistance frequency, the length of biofilm colonization of the substratum in the aquatic environment affected resistance frequency only for nalidixic acid and tetracycline. Multivariate logistic regressions showed that water depth, nutrient concentrations, temperature, dissolved oxygen and salinity had statistically significant associations with the frequency of E. coli resistance to nalidixic acid, streptomycin, ampicillin and tetracycline. The results indicate that antibiotic resistant E. coli and E. coli O157 were prevalent in an agricultural stream. Since E. coli is adept at horizontal gene transfer and prevalent in biofilms and sediment, where ample opportunities for genetic exchange with potential environmental pathogens present themselves, resistant isolates may present a risk to ecosystem, wildlife and public health.
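A minimal sketch of the multivariate logistic regression described above, assuming one row per isolate with a binary resistance outcome and hypothetical covariate column names; it is not the study's exact model.

```python
# Minimal sketch: logistic regression linking water-quality covariates to the
# probability that an E. coli isolate is resistant to a given antibiotic.
# Column names are assumptions for illustration.
import pandas as pd
import statsmodels.formula.api as smf

def fit_resistance_model(df: pd.DataFrame, antibiotic: str):
    """df must contain a 0/1 column `resistant_<antibiotic>` plus the covariates below."""
    formula = (f"resistant_{antibiotic} ~ water_depth + nutrients + temperature "
               "+ dissolved_oxygen + salinity")
    return smf.logit(formula, data=df).fit(disp=False)

# Example usage with hypothetical data:
# res = fit_resistance_model(isolates, "tetracycline")
# print(res.summary())
```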
Subject(s)
Agriculture , Biofilms , Drug Resistance, Microbial , Escherichia coli/isolation & purification , Geologic Sediments/microbiology , Water Microbiology , British Columbia , Escherichia coli/drug effects , Microbial Sensitivity Tests
ABSTRACT
In vitro and animal studies report that some persistent organic pollutants (POPs) trigger the secretion of proinflammatory cytokines. Whether POP exposure is associated with a dysregulation of cytokine response remains to be investigated in humans. We studied the strength of association between plasma POP levels and circulating cytokines as immune activation markers. Plasma levels of fourteen POPs and thirteen cytokines were measured in 39 Caucasians from a comparator sample in Québec City (Canada) and 72 First Nations individuals from two northern communities of Ontario (Canada). Caucasians showed significantly higher levels of organochlorine insecticides (β-HCH, p,p'-DDE and HCB) compared to First Nations. Conversely, First Nations showed higher levels of Mirex, Aroclor 1260, PCB 153, PCB 170, PCB 180 and PCB 187 compared to Caucasians. While there was no difference in levels of IL-4, IL-6, IL-10 and IL-22 between groups, First Nations had significantly greater average levels of IFNγ, IL-1β, IL-2, IL-5, IL-8, IL-12p70, IL-17A, TNFα and TNFβ compared to Caucasians. Among candidate predictor variables (age, body mass index, insulin resistance and POP levels), high levels of PCBs were the only predictor accounting for a small but significant portion of the observed variance (approximately 7%) in cytokine levels. Overall, a weak but significant association was detected between persistent organochlorine pollutant exposure and elevated cytokine levels. This finding adds to existing evidence that environmental pollution is related to inflammation, a common feature of several metabolic disorders that are especially prevalent in Canada's remote First Nations communities.
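A minimal sketch of the kind of regression implied above, estimating how much additional variance in a cytokine is explained when a PCB term is added to age, BMI and insulin resistance; column names are hypothetical and this is not the authors' exact model.

```python
# Minimal sketch: compare nested linear models to estimate the variance in a
# cytokine explained by adding a PCB exposure term. Column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

def added_variance_from_pcb(df: pd.DataFrame, cytokine: str) -> float:
    """Difference in R-squared between models with and without the PCB term."""
    base = smf.ols(f"{cytokine} ~ age + bmi + homa_ir", data=df).fit()
    full = smf.ols(f"{cytokine} ~ age + bmi + homa_ir + pcb_sum", data=df).fit()
    return full.rsquared - base.rsquared

# Example usage with hypothetical data:
# print(added_variance_from_pcb(df, "tnf_alpha"))   # e.g. ~0.07 for a ~7% effect
```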