Results 1 - 20 of 87
1.
Sci Total Environ ; 945: 173997, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-38879034

ABSTRACT

The demonstration of enteric virus removal for indirect potable reuse of advanced purified water is necessary to ensure safe water reclamation practices. This study evaluated the efficacy of soil treatment in reducing concentrations of pepper mild mottle virus (PMMoV), hepatitis A virus (HAV), and norovirus (NoV) gene markers using bench-scale unsaturated soil columns. Three different infiltration rates were evaluated to determine their impact on viral gene marker removal. The concentrations of viral markers in the column influent and effluent samples were measured by RNA extraction followed by RT-qPCR, and log reduction values (LRVs) were calculated to quantify removal effectiveness across the columns. The LRVs achieved for PMMoV were 2.80 ± 0.36, 2.91 ± 0.48, and 2.72 ± 0.32 for infiltration rates of 4.9 mm/h, 9.4 mm/h, and 14.0 mm/h, respectively. A one-way ANOVA indicated no statistically significant differences in LRVs among the infiltration rates (p-value = 0.329). All HAV measurements were below the detection limit in both the influent and effluent of the soil columns. While NoV GI and GII markers were measurable in the soil column influent, they were removed to below the detection limit in the effluent. Using half the limit of detection (LoD) for effluent values enabled estimation of log removals, which were calculated as 1.42 ± 0.07, 1.64 ± 0.29, and 1.74 ± 0.18 for NoV GI and 1.14 ± 0.19, 1.58 ± 0.21, and 1.87 ± 0.41 for NoV GII at infiltration rates of 4.9 mm/h, 9.4 mm/h, and 14.0 mm/h. These results highlight the efficacy of soil treatment in reducing virus gene marker concentrations at various infiltration rates and indicate that spreading basins employed for reclaimed water recharge to groundwater aquifers are an effective method for reducing viral contaminants in indirect potable reuse systems.
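The LRV calculation described above is simple to reproduce. The following is a minimal sketch, using illustrative concentrations rather than the study's data, of computing LRVs as log10(influent/effluent) with non-detect effluent samples substituted at half the limit of detection:

```python
import numpy as np

def log_reduction(influent, effluent, lod):
    """LRV = log10(C_influent / C_effluent); non-detects (None) are
    substituted with half the limit of detection (LoD)."""
    inf = np.asarray(influent, dtype=float)
    eff = np.array([lod / 2.0 if c is None else c for c in effluent], dtype=float)
    return np.log10(inf / eff)

# Hypothetical gene-marker concentrations (gene copies/mL), not study data.
influent = [1.2e4, 9.8e3, 1.5e4]
effluent = [None, None, None]  # all below the detection limit in the effluent
lrvs = log_reduction(influent, effluent, lod=50.0)
print(f"LRV = {lrvs.mean():.2f} +/- {lrvs.std():.2f}")  # mean and SD, as reported per infiltration rate
```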


Subject(s)
Groundwater , Soil , Groundwater/virology , Groundwater/chemistry , Water Purification/methods , Norovirus/genetics , Norovirus/isolation & purification , Tobamovirus/isolation & purification , Tobamovirus/genetics , Soil Microbiology , Hepatitis A virus/isolation & purification , Hepatitis A virus/genetics
2.
Heliyon ; 10(8): e29462, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38638959

ABSTRACT

This research evaluated the relationship between daily new Coronavirus Disease 2019 (COVID-19) cases and Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) concentrations in wastewater, and then examined the effects of differential SARS-CoV-2 shedding loads across various COVID-19 outbreaks. Linear regression analyses were used to examine the lead time of the SARS-CoV-2 signal in wastewater relative to new COVID-19 clinical cases. During the Delta wave, no lead time was evident, highlighting the limited predictive capability of wastewater monitoring during this phase. However, significant lead times were observed during the Omicron wave, potentially attributable to testing capacity overload and subsequent case reporting delays, or to changes in shedding patterns. During the post-Omicron wave (February 23 to May 19, 2022), no lead time was discernible, whereas following the lifting of the COVID-19 state of emergency (May 30, 2022 to May 30, 2023), the correlation coefficient increased, demonstrating the potential of wastewater surveillance as an early warning system. Subsequently, we explored virus shedding into wastewater through feces, operationalized as the ratio of SARS-CoV-2 concentrations to daily new COVID-19 cases. This ratio varied significantly across the Delta, Omicron, other-variant, and post-state-of-emergency phases, with the Kruskal-Wallis H test confirming a significant difference in medians across these stages (P < 0.0001). Despite its promise, wastewater surveillance of COVID-19 prevalence presents several challenges, including virus shedding variability, data interpretation complexity, the impact of environmental factors on viral degradation, and the lack of standardized testing procedures. Overall, our findings offer insights into the correlation between COVID-19 cases and wastewater viral concentrations and into potential variation in SARS-CoV-2 shedding across pandemic phases, and they underscore the promise and limitations of wastewater surveillance as an early warning system for disease prevalence trends.
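The two statistical steps described here, estimating the wastewater signal's lead time and comparing shedding ratios across waves, can be sketched as follows. This is a minimal illustration on synthetic data; the series, lag range, and wave boundaries are assumptions, not the study's:

```python
import numpy as np
from scipy import stats

# Synthetic daily wastewater concentrations and new cases (illustrative only).
rng = np.random.default_rng(0)
conc = rng.lognormal(10, 0.4, 120)
cases = np.roll(conc, 5) * 1e-6 + rng.normal(0, 0.002, 120)  # cases trail by ~5 days

def best_lead_time(conc, cases, max_lag=14):
    """Regress cases on the wastewater signal at each candidate lag and
    return the lag with the best linear fit (highest R^2)."""
    r2 = {}
    for lag in range(max_lag + 1):
        fit = stats.linregress(conc[:len(conc) - lag], cases[lag:])
        r2[lag] = fit.rvalue ** 2
    return max(r2, key=r2.get)

print("estimated lead time (days):", best_lead_time(conc, cases))

# Shedding ratio = wastewater concentration / daily new cases, compared
# across hypothetical wave windows with the Kruskal-Wallis H test.
delta, omicron, post = (conc[:40] / cases[:40],
                        conc[40:80] / cases[40:80],
                        conc[80:] / cases[80:])
print(stats.kruskal(delta, omicron, post))
```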

3.
Water Environ Res ; 96(4): e11015, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38599573

ABSTRACT

The recent SARS-CoV-2 outbreak yielded substantial data regarding virus fate and prevalence at water reclamation facilities (WRFs), identifying influential factors such as natural decay, adsorption, light, pH, salinity, and antagonistic microorganisms. However, no studies have quantified the impact of these factors at full-scale WRFs. Using a mass balance approach based on sludge and wastewater genetic marker loading estimates, we assessed the impact of natural decay and other fate mechanisms on genetic marker removal during water reclamation. Results indicated negligible removal of genetic markers during preliminary and primary treatment (P/PT) (primary effluent (PE) p-value: 0.267; P/PT accumulation p-value: 0.904; thickened primary sludge (TPS) p-value: 0.076), indicating no contribution of natural decay and other fate mechanisms toward removal in P/PT. By comparison, adsorption and decomposition were found to be the dominant pathway for genetic marker removal (thickened waste activated sludge (TWAS) log loading 9.75 log10 GC/day); however, no estimation of log genetic marker accumulation could be carried out due to high detections in TWAS. PRACTITIONER POINTS: The mass balance approach suggested that the contribution of natural decay and other fate mechanisms to virus removal during wastewater treatment is negligible compared with adsorption and decomposition in P/PT (p-value: 0.904). During P/PT, a higher viral load remained in the PE (14.16 log10 GC/day) compared with TPS (13.83 log10 GC/day); however, no statistical difference was observed (p-value: 0.280), indicating that adsorption/decomposition most probably did not occur. In secondary treatment (ST), viral genetic markers in TWAS were consistently detected (13.41 log10 GC/day) compared with secondary effluent (SE), indicating that longer HRT and the potential presence of extracellular polymeric substance-containing enriched biomass enabled adsorption/decomposition. Estimations of total solids and volatile solids for TPS and TWAS indicated that adsorption affinity differed between solids sampling locations (p-value: <0.0001).
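The loading estimates underpinning this mass balance reduce to load = concentration × flow, reported on a log10 GC/day basis. A minimal sketch with hypothetical concentrations and flows, not the facility's measurements:

```python
import math

def log10_daily_load(conc_gc_per_L, flow_L_per_day):
    """Genetic-marker load as log10(gene copies/day) = log10(C * Q)."""
    return math.log10(conc_gc_per_L * flow_L_per_day)

# Hypothetical stream concentrations (GC/L) and flows (L/day).
pe   = log10_daily_load(2.0e5, 7.5e7)  # primary effluent
tps  = log10_daily_load(8.0e6, 9.0e5)  # thickened primary sludge
twas = log10_daily_load(5.0e7, 5.0e5)  # thickened waste activated sludge

# A mass balance attributes any gap between incoming and outgoing loads to
# natural decay and other fate mechanisms; a near-zero gap suggests such
# mechanisms are negligible, as the study concluded for P/PT.
print(f"PE {pe:.2f}, TPS {tps:.2f}, TWAS {twas:.2f} log10 GC/day")
```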


Subject(s)
COVID-19 , Water Purification , Humans , Sewage/chemistry , SARS-CoV-2/genetics , Genetic Markers , Water , Extracellular Polymeric Substance Matrix , Waste Disposal, Fluid/methods
4.
Water Environ Res ; 96(2): e10990, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38291828

ABSTRACT

The study evaluated the removal efficacy of per- and polyfluoroalkyl substances (PFAS) across various advanced water treatment (AWT) processes in a field-scale AWT train using secondary effluent samples from a full-scale water reclamation facility (WRF). Samples collected from April to October 2020 revealed perfluoroalkyl carboxylic acids (PFCAs) as the dominant PFAS compounds in the WRF secondary effluent, with PFPeA having the highest average concentration and perfluoroalkane sulfonic acids (PFSAs) present in notably lower amounts. Temporal fluctuations in total PFAS concentrations peaked in September 2020, which may reflect seasonality in PFAS discharges related to applications such as AFFFs and pesticides. Among the AWT processes, the coagulation-flocculation-clarification-filtration system showed no notable PFAS reduction, while ozonation resulted in elevated PFBS and PFBA concentrations. Biological activated carbon (BAC) filtration effectively removed long-chain PFAS such as PFOS and PFHxS but showed increased concentrations of short-chain PFAS post-treatment. Granular activated carbon (GAC) filtration was the most effective treatment, reducing all PFSAs below the detection limits and significantly decreasing most PFCAs, though short-chain PFCAs persisted. UV treatment did not remove short-chain PFCAs such as PFBA, PFPeA, and PFHxA. The findings highlight the efficacy of AWT processes such as GAC in PFAS reduction for potable reuse, but they also underscore the challenge presented by short-chain PFAS, emphasizing the need for tailored treatment strategies. PRACTITIONER POINTS: Secondary effluents showed higher concentrations of PFCAs compared to PFSAs. Advanced water treatment effectively removes long-chain PFAS but not short-chain PFAS. Ozonation may contribute to the formation of short-chain PFAS. BAC is less effective on short-chain PFAS, requiring further GAC treatment.


Subject(s)
Fluorocarbons , Ozone , Water Pollutants, Chemical , Water Purification , Charcoal , Water Pollutants, Chemical/analysis , Water Purification/methods , Fluorocarbons/analysis
5.
Sci Total Environ ; 912: 169637, 2024 Feb 20.
Article in English | MEDLINE | ID: mdl-38157893

ABSTRACT

This research investigated the removal of contaminants of emerging concern (CECs) and characterized the microbial community across an advanced water treatment (AWT) train consisting of Coagulation/Flocculation/Clarification/Granular Media Filtration (CFCGMF), Ozone-Biological Activated Carbon Filtration (O3/BAC), Granular Activated Carbon filtration, Ultraviolet Disinfection, and Cartridge Filtration (GAC/UV/CF). The AWT train successfully met its CEC and bulk organics removal goals. The microbial community at each treatment step was characterized by 16S rRNA sequencing on the Illumina MiSeq platform of DNA extracted from liquid and solid (treatment media) samples taken along the treatment train. Differences in microbial community structure were observed. The number of dominant operational taxonomic units (OTUs) decreased along the treatment train, and each treatment step impacted the microbial community composition downstream of that unit process. These results provide insights into microbial ecology in advanced water treatment systems, which is influenced and shaped by each treatment step, the interactions within the microbial community, and its potential metabolic contribution to CEC degradation.


Subject(s)
Drinking Water , Ozone , Water Pollutants, Chemical , Water Purification , Charcoal/chemistry , RNA, Ribosomal, 16S , Water Pollutants, Chemical/chemistry , Water Purification/methods , Filtration/methods , Ozone/chemistry
6.
Water Res ; 244: 120474, 2023 Oct 01.
Article in English | MEDLINE | ID: mdl-37611358

ABSTRACT

We investigated the short-term (first post-fire precipitation) and long-term (11-month) impacts of the Caldor and Mosquito Fires (2021 and 2022) on water quality, dissolved organic matter, and disinfection byproduct (DBP) precursors in burned and adjacent unburned watersheds. Both burned watersheds experienced water quality degradation compared to their paired unburned watersheds, including increases in dissolved organic carbon (DOC), dissolved organic nitrogen (DON), and DBP precursors from precipitation events. DBP precursor concentrations during storm events were greater in the Caldor Fire's burned watershed than in the unburned watershed: precursors of trihalomethanes (THMs), haloacetic acids (HAAs), haloacetonitriles (HANs), and haloacetamides (HAMs) were greater by 533 µg/L, 1,231 µg/L, 64 µg/L, and 58 µg/L, respectively. The burned watershed of the Mosquito Fire also had greater median concentrations of THM (44 µg/L), HAA (37 µg/L), HAN (7 µg/L), and HAM (13 µg/L) precursors compared to the unburned watershed during a storm immediately following the fire. Initial flushes from both burned watersheds formed greater concentrations of the more toxic DBPs, such as HANs and HAMs. The Caldor Fire burn area experienced a rain-on-snow event shortly after the fire, which produced the greatest water quality degradation of all seasons, precipitation events, and watersheds studied. Over the long term, statistical analysis revealed that DOC and DON values in the burned watershed of the Caldor Fire remained higher than in the unburned control (by 0.98 mg C/L and 0.028 mg N/L, respectively). These short- and long-term findings indicate that wildfires present potential treatment challenges for public water systems beyond the two studied here.


Subject(s)
Disinfectants , Water Pollutants, Chemical , Water Purification , Wildfires , Disinfection , Rivers , Nitrogen/analysis , Dissolved Organic Matter , Trihalomethanes/analysis , Water Pollutants, Chemical/analysis , Disinfectants/analysis
7.
Chemosphere ; 337: 139384, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37414300

ABSTRACT

With the recent focus on using advanced water treatment processes for water reuse, interest is growing in using enhanced coagulation to remove dissolved chemical species. Up to 85% of the nitrogen in wastewater effluent is dissolved organic nitrogen (DON), but there is a knowledge gap regarding its removal during coagulation, which can be influenced by DON characteristics. To address this issue, tertiary-treated wastewater samples were analyzed before and after coagulation with polyaluminum chloride and ferric chloride. Samples were size-fractionated into four molecular weight fractions (0.45 µm, 0.1 µm, 10 kDa, and 3 kDa) using vacuum filtration and ultrafiltration. Each fraction was coagulated separately to assess DON removal during enhanced coagulation. The size-fractionated samples were also separated into hydrophilic and hydrophobic fractions using C18 solid phase extraction disks. Fluorescence excitation-emission matrices were used to investigate the characteristics of the dissolved organic matter contributing to DON during the coagulation process. The results showed that DON compounds smaller than 3 kDa constituted the majority of total DON. Coagulation removed more than 80% of DON from the 0.45 µm-0.1 µm and 0.1 µm-10 kDa fractions, but less than 20% from the 10 kDa-3 kDa and <3 kDa fractions. Coagulation of pre-filtered samples removed 19% and 25% of the <3 kDa DON fraction using polyaluminum chloride and ferric chloride, respectively. In all molecular weight fractions, hydrophilic DON compounds were dominant (>90%), and enhanced coagulation was not effective in removing them. Low-molecular-weight fractions respond poorly to enhanced coagulation due to their hydrophilic nature. Enhanced coagulation effectively removes humic acid-like substances but poorly removes proteinaceous compounds such as tyrosine and tryptophan. These findings provide insights into DON behavior during coagulation and the factors affecting its removal, potentially improving wastewater treatment strategies.


Subject(s)
Water Pollutants, Chemical , Water Purification , Wastewater , Dissolved Organic Matter , Nitrogen/analysis , Water Pollutants, Chemical/analysis , Water Purification/methods
8.
One Health ; 16: 100536, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37041760

ABSTRACT

Detection of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) viral genome in wastewater has proven useful for tracking trends in virus prevalence within the community. The surveillance also provides precise and early detection of new and circulating variants, which aids the response to viral outbreaks. Site-specific monitoring of SARS-CoV-2 variants provides valuable information on the prevalence of new or emerging variants in the community. We sequenced the genomic RNA of viruses present in wastewater samples and analyzed the prevalence of SARS-CoV-2 variants as well as other respiratory viruses over a period of one year to account for seasonal variations. The samples were collected from the Reno-Sparks metropolitan area weekly between November 2021 and November 2022. Samples were analyzed to quantify SARS-CoV-2 genomic copies and to identify variants. This study confirmed that wastewater monitoring of SARS-CoV-2 variants can be used for community surveillance and early detection of circulating variants, and it supports wastewater-based epidemiology (WBE) as a complement to clinical respiratory virus testing in healthcare response efforts. Our study showed the persistence of SARS-CoV-2 throughout the year, in contrast to the seasonal presence of other respiratory viruses, reflecting SARS-CoV-2's broad genetic diversity and capacity to persist and infect susceptible hosts. Through secondary analysis, we further identified antimicrobial resistance (AMR) genes in the same wastewater samples and found WBE to be a feasible tool for community AMR detection and monitoring.

9.
Sci Total Environ ; 881: 163516, 2023 Jul 10.
Article in English | MEDLINE | ID: mdl-37059138

ABSTRACT

Soil aquifer treatment (SAT) systems are known to further remove contaminants from wastewater effluent applied through infiltration into the ground. Dissolved organic nitrogen (DON) in the effluent, a precursor of nitrogenous disinfection by-products (DBPs) such as N-nitrosodimethylamine (NDMA), is of great concern for subsequent use of the groundwater infiltrated into the aquifer. In this study, the vadose zone of a SAT system was simulated using 1 m laboratory soil columns operated under unsaturated conditions. The final effluent of a water reclamation facility (WRF) was applied to these columns to investigate the removal of N species, with a focus on DON, as well as NDMA precursors. DON removal reached up to 99 % (average 68 %) and was accompanied by a 52 % nitrate increase, suggesting the occurrence of ammonification and nitrification through the soil columns. Around 62 % of total DON removal occurred within a travel distance of <10 cm, consistent with higher adenosine triphosphate (ATP) concentrations at the top of the column due to greater oxygen and organic matter availability. Total dissolved N removal dropped drastically to 4.5 % in the same column without microbial growth, which highlights the importance of biodegradation. The columns removed 56 % of the fluorescent dissolved organic matter (FDOM) and up to 92 % of NDMA precursors from an initial concentration of 89.5 ng/L, possibly due to the removal of DON fractions. The results demonstrate the capability of the vadose zone to further treat DON and other organic matter before the water reaches groundwater through infiltration or is discharged indirectly to surface water. Differences in applied water quality and site-specific oxic conditions in SAT systems could lead to variable removal efficiencies.

10.
Sci Total Environ ; 877: 162864, 2023 Jun 15.
Article in English | MEDLINE | ID: mdl-36931510

ABSTRACT

Most wastewater treatment facilities that satisfy stricter discharge restrictions for nutrients remove dissolved inorganic nitrogen (DIN) species efficiently, leaving dissolved organic nitrogen (DON) to make up a higher proportion (up to 85 %) of total nitrogen (TN) in the effluent. Discharged DON promotes algae growth in receiving water bodies and is a growing concern in effluent potable reuse applications given its potential to form hazardous nitrogenous disinfection byproducts (N-DBPs). Enhanced coagulation is an established process in the advanced water treatment train for most potable reuse applications. However, no information has so far been collected at the pilot scale on DON removal efficiency and the process implications of enhanced coagulation under real conditions. This study comprehensively evaluated DON removal from the effluent of the Truckee Meadows Water Reclamation Facility (TMWRF) by enhanced coagulation over the course of 11 months at the pilot scale. Three different coagulants (aluminum sulfate (alum), polyaluminum chloride (PACl), and ferric chloride (FC)) and a cationic polymer coagulant aid (Clarifloc) were used. Optimum doses and ideal pH for each coagulant and the polymer were determined by jar tests and applied at the pilot scale. Alum (24 mg/L) resulted in highly variable DON removal (6 % - 40 %; 21 % on average), which was enhanced by the addition of polymer, leading to 32 % DON removal on average. PACl (40 mg/L) and FC (100 mg/L) resulted in more consistent DON removal (45 % and 57 % on average, respectively); however, polymer addition provided minimal enhancement for these coagulants. Overall, enhanced coagulation effectively reduced DON in the tertiary effluent at the pilot scale. The treatment showed auxiliary benefits, including dissolved organic carbon (DOC) and orthophosphate removal.

11.
Environ Pollut ; 315: 120367, 2022 Dec 15.
Article in English | MEDLINE | ID: mdl-36240970

ABSTRACT

A model was developed to simulate the pH-dependent speciation and fate of ionizable pharmaceutical and personal care products (iPPCPs) in soils, and their plant uptake, during the application of reclaimed wastewater to agricultural soils. The simulation showed that pH plays an important role in regulating the plant uptake of iPPCPs, using ibuprofen (IBU; carboxylic group), triclosan (TCS; phenolic group), and fluoxetine (FXT; amine group) as model compounds. It took 89-487 days for the various iPPCPs to reach steady-state concentrations in soil and plant tissues. The simulated steady-state concentrations of iPPCPs in plant tissues at pH 9 were 2.2-2.3, 2.5-2.6, and 1.07-1.08 times those at pH 5 for IBU, TCS, and FXT, respectively. Assuming sorption only for the neutral species led to miscalculation of iPPCP concentrations in plant tissues by up to one and a half orders of magnitude. Efflux of the compounds from soil, lettuce leaves, and soybean pods was driven primarily by degradation in soil and by dilution due to plant tissue growth. Overall, the results demonstrate the importance of considering pH and speciation of iPPCPs when simulating their fate in the soil-plant system and their plant uptake.
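The speciation at the core of such a model follows from the Henderson-Hasselbalch relationship. Below is a minimal sketch of the neutral-fraction calculation for a monoprotic acid and a monoprotic base; the pKa values are approximate literature figures assumed for illustration, not parameters taken from the paper:

```python
def neutral_fraction_acid(pH, pKa):
    """Fraction of a monoprotic acid present as the neutral (HA) species."""
    return 1.0 / (1.0 + 10 ** (pH - pKa))

def neutral_fraction_base(pH, pKa):
    """Fraction of a monoprotic base present as the neutral (B) species."""
    return 1.0 / (1.0 + 10 ** (pKa - pH))

# Assumed approximate pKa values: ibuprofen ~4.9 (acid), fluoxetine ~10.1 (base).
for pH in (5, 7, 9):
    print(f"pH {pH}: IBU neutral {neutral_fraction_acid(pH, 4.9):.3f}, "
          f"FXT neutral {neutral_fraction_base(pH, 10.1):.3f}")
```

At pH 9, almost none of the ibuprofen remains neutral, which is why a sorption model limited to neutral species misestimates uptake so badly.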


Subject(s)
Cosmetics , Soil Pollutants , Triclosan , Soil/chemistry , Soil Pollutants/analysis , Cosmetics/analysis , Triclosan/analysis , Pharmaceutical Preparations , Hydrogen-Ion Concentration
12.
Sci Rep ; 12(1): 16141, 2022 09 27.
Article in English | MEDLINE | ID: mdl-36167869

ABSTRACT

Detection of SARS-CoV-2 viral load in wastewater has been highly informative in estimating the approximate number of infected individuals in the surrounding communities. Recent developments in wastewater monitoring to determine community prevalence of COVID-19 extend further into identifying SARS-CoV-2 variants, including those being monitored for enhanced transmissibility. We sequenced genomic RNA derived from wastewater to determine the coronavirus variants circulating in the communities. Wastewater samples were collected from the Truckee Meadows Water Reclamation Facility (TMWRF) from November 2020 to June 2021. SARS-CoV-2 variants detected in wastewater were compared with the variants detected in infected individuals' clinical specimens (nasal/nasopharyngeal swabs) during the same period and were found to be conclusively in agreement. Therefore, wastewater monitoring for SARS-CoV-2 variants in the community is a feasible strategy, both as a complementary tool to clinical specimen testing and in the latter's absence.


Subject(s)
COVID-19 , SARS-CoV-2 , COVID-19/diagnosis , COVID-19/epidemiology , Humans , RNA , RNA, Viral/genetics , SARS-CoV-2/genetics , Wastewater
13.
Article in English | MEDLINE | ID: mdl-36078831

ABSTRACT

This paper presents a new, innovative technological approach, in line with Circular Economy principles, to the effective management of sludge generated during municipal wastewater treatment and subsequently used for biogas production. The approach allows optimal, functional, and controlled cascade-type biotechnological and thermal conversion of the carbon compounds present in sewage sludge, then in the solid digestate residues (after biogas production), and finally in the ash structure (after incineration). Purposefully dosed nanostructural additives make it possible to produce a useful solid product from the ash, especially for cyclic adsorption and slow release of nutrients (N, P, K) in the soil. The idea is targeted at achieving an innovative conversion cycle under a Circular Economy framework, based in particular on an energy carrier (methane biogas) and direct energy production. The functionalized combustion by-products can be advantageous in agriculture. The use of ash from combustion of sewage sludge after anaerobic fermentation, with nanostructural additives (halloysite, kaolinite), as an adsorbent of selected agriculturally important nutrient ions (Na+, K+, NO3-, SO42-, PO43-, Cl-) was verified at laboratory scale. Tests were carried out both for pure ash and for ash derived from combustion with the purposeful addition of kaolinite or halloysite. The equilibrium conditions for adsorption of nitrate, potassium, sodium, phosphate(V), sulphate(VI), and chloride ions from aqueous solutions onto the three adsorbent structures were determined. The results were interpreted theoretically using adsorption isotherm models (Langmuir, Freundlich, Temkin, Jovanovic). The most spectacular and clearly favorable effect of the nanostructural additives, dosed during sludge combustion to form sorption surfaces under high-temperature conditions, was identified for sorption-based separation of phosphate(V) ions: an increase from 1.13% to 61.24% with the addition of kaolinite, and up to 76.19% with the addition of halloysite.
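Fitting the named isotherm models to equilibrium data is a standard nonlinear regression exercise. A minimal sketch for the Langmuir and Freundlich forms (the Temkin and Jovanovic models fit the same way), using hypothetical equilibrium data rather than the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    """Freundlich isotherm: qe = KF * Ce^(1/n)."""
    return KF * Ce ** (1.0 / n)

# Hypothetical equilibrium data: Ce (mg/L in solution), qe (mg/g adsorbed).
Ce = np.array([1.0, 5.0, 10.0, 25.0, 50.0, 100.0])
qe = np.array([2.1, 7.9, 12.5, 19.8, 24.6, 27.3])

for model, p0 in ((langmuir, [30.0, 0.05]), (freundlich, [2.0, 2.0])):
    popt, _ = curve_fit(model, Ce, qe, p0=p0)
    ss_res = np.sum((qe - model(Ce, *popt)) ** 2)
    ss_tot = np.sum((qe - qe.mean()) ** 2)
    print(model.__name__, popt, "R2 =", 1 - ss_res / ss_tot)
```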


Subject(s)
Biofuels , Sewage , Adsorption , Clay , Digestion , Kaolin , Nutrients , Phosphates , Sewage/chemistry
14.
Water Environ Res ; 94(5): e10726, 2022 May.
Article in English | MEDLINE | ID: mdl-35621226

ABSTRACT

Ozone-biological activated carbon (ozone-BAC)-based technologies are emerging as an appealing option for potable reuse systems; however, uncertainty remains regarding the reduction of waterborne pathogens. Common log reduction requirements have been modeled after the California Division of Drinking Water's 12-10-10 log reduction value (LRV) framework for enteric virus, Cryptosporidium, and Giardia, respectively. The objective of this research was to investigate the pathogen LRVs achievable in ozone-BAC-based treatment systems and to assess the applicability of drinking water pathogen guidelines to potable reuse applications. A pilot-scale ozone-BAC-based treatment train was operated at two water reclamation facilities in Reno, Nevada, USA. Virus, Cryptosporidium, Giardia, and bacterial indicators were monitored across individual and combined treatment processes. The pathogen barriers investigated included conventional filtration, ozonation, and ultraviolet disinfection. Based on sampling and treatment validation strategies, the three pathogen barriers can provide minimum LRVs of 13-9-9.5 for virus, Giardia, and Cryptosporidium. Secondary biological treatment can provide additional pathogen LRVs with site-specific sampling. The present study addresses regulatory uncertainties associated with ozone-BAC pathogen reduction. PRACTITIONER POINTS: Ozone-biological activated carbon-based advanced treatment can meet pathogen LRV requirements with a minimum of three pathogen barriers. Drinking water pathogen reduction guidelines were successfully applied to potable reuse applications and verified against operational criteria. The low occurrence of pathogens requires surrogate and indicator analyses and a variety of monitoring techniques to verify pathogen log reduction.
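The LRV framework is additive bookkeeping: each validated barrier earns a per-pathogen credit, and the credits must sum to the 12-10-10 (or comparable) target. A minimal sketch with hypothetical credit values, not the validated values from this study:

```python
# Hypothetical per-barrier LRV credits (illustrative, not validated values).
credits = {
    "conventional filtration": {"virus": 1.0, "giardia": 2.5, "crypto": 2.5},
    "ozonation":               {"virus": 6.0, "giardia": 0.5, "crypto": 1.0},
    "UV disinfection":         {"virus": 6.0, "giardia": 6.0, "crypto": 6.0},
}
targets = {"virus": 12, "giardia": 10, "crypto": 10}  # 12-10-10 LRV framework

for pathogen, target in targets.items():
    total = sum(barrier[pathogen] for barrier in credits.values())
    # A shortfall flags the need for additional credits, e.g., from
    # site-specific validation of secondary biological treatment.
    print(f"{pathogen}: {total:.1f} log achieved vs {target} required ->",
          "OK" if total >= target else "shortfall")
```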


Subject(s)
Cryptosporidiosis , Cryptosporidium , Drinking Water , Ozone , Water Purification , Charcoal , Giardia , Humans , Water Purification/methods
15.
Appl Microbiol Biotechnol ; 106(7): 2763-2773, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35294588

ABSTRACT

This study investigated microbial community structure and composition across two treatment steps used in advanced water reclamation for potable reuse applications, namely Coagulation/Flocculation/Clarification/Granular Media Filtration (CFCGMF) and Ozone-Biological Activated Carbon filtration (O3/BAC). The study examined the richness, variations, and similarities of the microorganisms at each treatment step to better understand the role of ecology and its dynamics in unit process performance and in the microbial community developed within it. The bacterial microbiomes at each treatment step were independently characterized using 16S metagenomic sequencing. Across both treatment steps, a total of 3801 species were detected, of which 38% were identified at CFCGMF and 98% at O3/BAC. The most abundant phyla in both treatment steps were Proteobacteria, Bacteroidetes, Actinobacteria, and Firmicutes. The identified species were classified by lifestyle preference as free-living (59%) or attached (22%), indicating relatively low richness but higher diversity in the BAC media. At the taxonomic class level, Betaproteobacteria was predominant in both processes. Additionally, eight genera were identified as potential bacterial pathogens present in both process effluents: Aeromonas, Clostridium, Enterobacter, Escherichia, Flavobacterium, Legionella, Mycobacterium, and Pseudomonas. CFCGMF effluent yielded fewer pathogenic bacteria than the ozone and BAC filter effluents of the O3/BAC process unit; their relative abundance accounted for about 2% at CFCGMF and 8% at O3/BAC. Detailed characterization of microbial communities is crucial for interpreting the mechanisms and synergies between process performance and microorganisms, and for identifying the needs and best practices that ensure public health protection. Key points • Microbial communities of two treatment processes are characterized using 16S rRNA sequencing. • Organisms that tolerate ozone and form biofilms define the microbial community in subsequent biofilters. • Potentially pathogenic bacteria are detected at relatively low abundances in the treated water.


Subject(s)
Drinking Water , Microbiota , Ozone , Water Purification , Bacteria/genetics , Drinking Water/microbiology , RNA, Ribosomal, 16S/genetics
16.
Res Sq ; 2022 Mar 17.
Article in English | MEDLINE | ID: mdl-35313589

ABSTRACT

Detection of SARS-CoV-2 viral load in wastewater has been highly informative in estimating the approximate number of infected individuals in the surrounding communities. Recent developments in wastewater monitoring to determine community prevalence of COVID-19 extend further into identifying SARS-CoV-2 variants, including those being monitored for enhanced transmissibility. We sequenced genomic RNA derived from wastewater to determine the coronavirus variants circulating in the communities. Wastewater samples collected from the Truckee Meadows Water Reclamation Facility (TMWRF) from November 2020 to June 2021 were analyzed for SARS-CoV-2 variants and compared with the variants detected in the clinical specimens (nasal/nasopharyngeal swabs) of infected individuals during the same period. The comparison was found to be conclusively in agreement. Therefore, wastewater monitoring for SARS-CoV-2 variants in the community is a feasible strategy, both as a complementary tool to clinical specimen testing and in the latter's absence.

17.
Sci Total Environ ; 817: 152958, 2022 Apr 15.
Article in English | MEDLINE | ID: mdl-35016937

ABSTRACT

In this study, wastewater-based surveillance was carried out to establish the correlation between SARS-CoV-2 viral RNA concentrations in wastewater and the incidence of coronavirus disease 2019 (COVID-19) from clinical testing. The influent wastewater of three major water reclamation facilities (WRFs) in Northern Nevada, serving a population of 390,750, was monitored for the SARS-CoV-2 viral RNA gene markers N1 and N2 from June 2020 through September 2021. A total of 614 samples were collected and analyzed. SARS-CoV-2 concentrations in wastewater were observed to peak twice during the study period. A moderate correlation between COVID-19 incidence data from clinical testing and SARS-CoV-2 viral RNA concentrations in wastewater was observed (Spearman r = 0.533). This correlation improved when using weekly averages of SARS-CoV-2 marker concentrations and clinical case data (Spearman r = 0.790), presumably by mitigating the inherent variability of the environmental dataset and the effects of clinical testing artifacts (e.g., reporting lags). The research also demonstrated the value of wastewater-based surveillance as an early warning signal for trends in COVID-19 incidence: reported clinical cases showed a stronger correlation with SARS-CoV-2 wastewater monitoring data when they were estimated to lag 7 days behind the wastewater data. The results aided local decision makers in developing strategies to manage COVID-19 in the region and provide a framework for how wastewater-based surveillance can be applied across localities to enhance public health monitoring of the ongoing pandemic.
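The lag and averaging analyses described here can be reproduced with standard tools. A minimal sketch on synthetic daily series; the data, lag range, and scaling are illustrative assumptions, not the study's:

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

# Synthetic daily series; in the study these were N1/N2 concentrations
# and reported clinical cases.
rng = np.random.default_rng(1)
days = pd.date_range("2020-06-01", periods=200, freq="D")
ww = pd.Series(rng.lognormal(8, 0.5, 200), index=days)
cases = ww.shift(7).bfill() * rng.uniform(0.9, 1.1, 200)  # cases trail by ~7 days

# Find the case-reporting lag that maximizes Spearman's rho.
best = max(range(15),
           key=lambda lag: spearmanr(ww.iloc[:len(ww) - lag],
                                     cases.shift(-lag).dropna())[0])
print("best lag (days):", best)

# Weekly averaging, which the study found improved the correlation.
rho_daily, _ = spearmanr(ww, cases)
rho_weekly, _ = spearmanr(ww.resample("W").mean(), cases.resample("W").mean())
print(f"daily rho = {rho_daily:.2f}, weekly rho = {rho_weekly:.2f}")
```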


Subject(s)
COVID-19 , Wastewater , COVID-19/epidemiology , Genetic Markers , Humans , RNA, Viral , SARS-CoV-2/genetics
18.
Sci Total Environ ; 807(Pt 3): 151053, 2022 Feb 10.
Article in English | MEDLINE | ID: mdl-34673065

ABSTRACT

The State of Nevada, USA Administrative Code requires a 12-log enteric virus reduction/inactivation, 10-log Giardia cyst reduction, and 10-log Cryptosporidium oocyst reduction, measured from raw wastewater to potable reuse water, for Category A+ reclaimed water suitable for indirect potable reuse (IPR). Accurately demonstrating log10 reduction values (LRVs) through secondary biological treatment prior to an advanced water treatment train enables redundancy and resiliency for IPR projects while maintaining a high level of public confidence. LRVs for Cryptosporidium and Giardia resulting from secondary biological treatment are not fully established because performance varies widely across the different types of secondary biological treatment processes employed in water reclamation. A one-year investigation of two full-scale northern Nevada water reclamation facilities (WRFs) (≤4 mgd; 1.5 × 10⁷ L/day) was conducted to monitor Cryptosporidium oocysts and Giardia cysts in untreated wastewater and secondary effluent. This study aimed to establish secondary treatment LRVs, monitor WRF performance, and correlate performance with protozoan reduction. California's IPR regulations, after which Nevada's IPR regulations were modeled, were based on maximum concentrations of 5 log cysts/L of Giardia and 4 log oocysts/L of Cryptosporidium. The recovery-corrected Giardia and Cryptosporidium concentrations measured in untreated influent (20 samples at each WRF) were below 5 log cysts/L at the 99th percentile (maximum 4.4 log cysts/L) and below 4 log oocysts/L (maximum 2.7 log oocysts/L), respectively. Both WRFs produced secondary effluent consistently better than federal and State of Nevada requirements and performed within the operating envelope of other secondary facilities. Given the results, a minimum conservative estimate (at the 5th percentile) for well-operated secondary activated sludge treatment plants of 0.5 LRV credit for Cryptosporidium and 2.0 LRV for Giardia appears warranted. These minimum LRVs are consistent with a conservative review of the available literature.
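The percentile logic behind these estimates is straightforward: per-sample LRVs are computed from paired influent/effluent concentrations, and the conservative credit is read off at the 5th percentile. A minimal sketch on hypothetical recovery-corrected data, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical recovery-corrected Giardia concentrations (cysts/L), 20 paired samples.
influent = rng.lognormal(np.log(2e3), 0.8, 20)
effluent = rng.lognormal(np.log(10), 0.8, 20)

lrv = np.log10(influent / effluent)

# Conservative LRV credit at the 5th percentile; influent screened at the
# 99th percentile against the 5-log cysts/L regulatory basis.
print("5th percentile LRV:", np.percentile(lrv, 5).round(2))
print("99th percentile influent (log cysts/L):",
      np.percentile(np.log10(influent), 99).round(2))
```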


Subject(s)
Cryptosporidium , Giardia/isolation & purification , Water Purification , Cryptosporidium/isolation & purification , Nevada , Oocysts/isolation & purification , Wastewater
19.
Sci Total Environ ; 805: 150390, 2022 Jan 20.
Article in English | MEDLINE | ID: mdl-34818797

ABSTRACT

The response to disease outbreaks, such as SARS-CoV-2, can be constrained by a limited ability to measure disease prevalence early at a localized level. Wastewater-based epidemiology is a powerful tool for identifying disease spread from pooled community sewer networks or at the influent to wastewater treatment plants. However, this approach is often not applied at a granular level that permits detection of local hot spots. This study examines the spatial patterns of SARS-CoV-2 in sewage through a spatial sampling strategy across neighborhood-scale sewershed catchments. Sampling was conducted across the Reno-Sparks metropolitan area from November to mid-December of 2020. The research used local spatial autocorrelation tests to identify the evolution of statistically significant neighborhood hot spots: some sewershed sub-catchments led waves of infection, while adjacent neighborhoods lagged, with viral RNA concentrations increasing over subsequent dates. Correlations between the sub-catchments over the sampling period were also characterized using principal component analysis. Results identified distinct time series patterns, with sewersheds in the urban center, outlying suburban areas, and outlying urbanized districts generally following unique trends over the sampling period. Several demographic parameters were identified as having important gradients across these areas, namely population density, poverty levels, household income, and age. These results provide a more strategic approach to identifying disease outbreaks at the neighborhood level and characterize how sampling site selection could be designed based on the spatial and demographic characteristics of neighborhoods.
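The principal component step groups sub-catchments by the shape of their concentration time series. A minimal sketch on a synthetic catchment-by-date matrix (the hot-spot analysis itself used local spatial autocorrelation tests, which require spatial weights and are omitted here); all names and values are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic matrix: rows = sewershed sub-catchments, columns = sampling dates,
# values = SARS-CoV-2 RNA concentrations (log10 GC/L).
rng = np.random.default_rng(3)
trend = np.sin(np.linspace(0, 3, 12))  # shared epidemic trend across dates
X = np.vstack([trend + rng.normal(0, s, 12) for s in (0.1, 0.1, 0.5, 0.5, 0.9)])

# Standardize each date across catchments, then project catchments onto
# principal components; catchments with similar scores follow similar
# temporal patterns (e.g., urban core vs. outlying districts).
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for i, (pc1, pc2) in enumerate(scores):
    print(f"catchment {i}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")
```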


Subject(s)
COVID-19 , Water Purification , Humans , SARS-CoV-2 , Wastewater , Wastewater-Based Epidemiological Monitoring
20.
Water Environ Res ; 93(12): 2998-3010, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34606145

ABSTRACT

Dewatering of anaerobically digested (AD) sludge containing waste activated sludge (WAS) from enhanced biological phosphorus removal (EBPR) poses numerous challenges, including poor dewaterability, struvite scale formation, and recycling of high N and P levels in the sludge liquor back to the treatment process. A full-scale water reclamation facility was investigated to mitigate these problems by experimenting with direct dewatering of EBPR WAS, bypassing the AD step. The investigation tested various blends of AD primary sludge with undigested thickened WAS to improve dewatering performance and reduce overall operational costs. Direct thickened WAS dewatering has had many positive impacts, including enhanced sludge cake solids concentration, reduced chemical use for facility operations, reduced struvite scaling, reduced biogas conditioning media servicing, elimination of the need for centrate treatment, recovered capacity in existing unit operations including the anaerobic digesters, and elimination of several proposed capital improvement projects that were previously deemed necessary. Although bypassing WAS around AD reduced total biogas production, the specific gas yield increased enough to meet all of the facility's biogas demands and minimize excess gas flaring. Overall biosolids production mass increased, raising transportation costs for disposal and causing notable odors, both of which are currently being investigated. PRACTITIONER POINTS: Direct WAS dewatering bypassing anaerobic digestion yields operational and process benefits in an EBPR water resource recovery facility. Dewatered cake solids were increased compared with combined primary sludge and WAS anaerobic digestion and dewatering. Nutrient loads in sludge processing return streams and operational costs are reduced by direct WAS dewatering.


Subject(s)
Sewage , Waste Disposal, Fluid , Cost-Benefit Analysis , Struvite , Water Resources