1.
Water Res ; 47(3): 1421-32, 2013 Mar 01.
Article in English | MEDLINE | ID: mdl-23290124

ABSTRACT

The reuse of domestic greywater has become common in Australia, especially during periods of extreme drought. Greywater is typically used in a raw, untreated form, primarily for landscape irrigation, but more than a quarter of greywater users irrigate vegetable gardens with the water, despite government advice against this practice. Greywater can be contaminated with enteric pathogens and may therefore pose a health risk if irrigated produce is consumed raw. A quantitative microbial risk assessment (QMRA) model was constructed to estimate the norovirus disease burden associated with consumption of greywater-irrigated lettuce. The annual disease burdens (95th percentile; DALYs per person) attributed to greywater irrigation ranged from 2 × 10⁻⁸ to 5 × 10⁻⁴, depending on the source of greywater and whether produce was washed within households. Accounting for the prevalence of produce-washing behaviours across Melbourne, the model predicted annual disease burdens ranging from 4 × 10⁻⁹ for bathroom water use only to 3 × 10⁻⁶ for laundry water use only; accounting for the proportionate use of each greywater type, the annual disease burden was 2 × 10⁻⁶. We recommend the preferential use of bathroom water over laundry water where possible, as this would reduce the annual burden of disease to align with the current Australian recycled water guidelines, which recommend a threshold of 10⁻⁶ DALYs per person. Other exposure pathways should also be considered, particularly given the high secondary attack rate of norovirus: the estimated norovirus disease burden associated with greywater irrigation of vegetables is highly likely to be negligible relative to that from household contact with an infected individual.
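
A minimal Monte Carlo sketch of the kind of QMRA calculation described above is shown below. All parameter values (greywater virus concentration, water retained on lettuce, consumption frequency, dose-response form, DALYs per case) are illustrative assumptions, not numbers or models taken from the paper.

```python
import math
import random

random.seed(1)

# All values below are illustrative assumptions, not parameters from the paper.
N_SIM = 20_000
WATER_ON_LETTUCE_ML_PER_G = 0.0108   # irrigation water retained on lettuce (mL/g), assumed
LETTUCE_G_PER_SERVE = 50             # grams of lettuce eaten per serve, assumed
SERVES_PER_YEAR = 70                 # consumption frequency, assumed
WASH_LOG_REDUCTION = 1.0             # assumed 1-log virus reduction if produce is washed
P_ILL_GIVEN_INF = 0.7                # probability of illness given infection, assumed
DALY_PER_CASE = 9e-4                 # DALYs per case of norovirus illness, assumed

def sample_virus_per_litre() -> float:
    """Norovirus concentration in greywater (copies/L), assumed log-normal."""
    return 10 ** random.gauss(1.0, 1.0)

def p_infection(dose: float) -> float:
    """Illustrative saturating dose-response curve (not the model used in the paper)."""
    return 0.72 * (1.0 - math.exp(-dose / 1106.0))

def annual_daly(wash_produce: bool) -> float:
    """Approximate DALYs per person per year for one exposure scenario."""
    burden = 0.0
    for _ in range(SERVES_PER_YEAR):
        dose = (sample_virus_per_litre() / 1000.0) * WATER_ON_LETTUCE_ML_PER_G * LETTUCE_G_PER_SERVE
        if wash_produce:
            dose *= 10 ** -WASH_LOG_REDUCTION
        burden += p_infection(dose) * P_ILL_GIVEN_INF * DALY_PER_CASE
    return burden

results = sorted(annual_daly(wash_produce=True) for _ in range(N_SIM))
print(f"95th percentile annual burden: {results[int(0.95 * N_SIM)]:.2e} DALYs per person")
```

Changing the assumed concentration distribution or `WASH_LOG_REDUCTION` mimics the scenarios compared in the abstract (bathroom versus laundry greywater, washed versus unwashed produce).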


Subject(s)
Agricultural Irrigation , Lactuca , Australia , Models, Statistical , Norovirus , Water Pollution
2.
BMC Med Res Methodol ; 9: 72, 2009 Nov 09.
Article in English | MEDLINE | ID: mdl-19900290

ABSTRACT

BACKGROUND: Quantitative Microbial Risk Assessment (QMRA), a modelling approach, is used to assess health risks. Inputs into the QMRA process include data that characterise the intensity, frequency and duration of exposure to risk(s). Data gaps for water exposure assessment include the duration and frequency of urban non-potable (non-drinking) water use. The primary objective of this study was to compare household water usage results obtained using two data collection tools, a computer assisted telephone interview (CATI) and a 7-day water activity diary, in order to assess the effect of different methodological survey approaches on derived exposure estimates. Costs and logistical aspects of each data collection tool were also examined. METHODS: A total of 232 households in an Australian dual reticulation scheme (where households are supplied with two grades of water through separate pipe networks) were surveyed about their water usage using both a CATI and a 7-day diary. Householders were questioned about their use of recycled water for toilet flushing, garden watering and other outdoor activities, and about their water use in the laundry. Agreement between reported CATI and diary water usage responses was assessed. RESULTS: The level of agreement between CATI and diary responses was greater for more frequent water-related activities (with the exception of toilet flushing) and for activities where standard durations or settings were employed. In addition, the unit cost of diary administration was greater than that of the CATI, excluding the initial selection and recruitment steps. CONCLUSION: This study showed that it is possible to coordinate diary completion 'remotely', provided that adequate instructions are given and the diary recording forms are well designed. In addition, good diary return rates can be achieved using a monetary incentive, and the diary format allows for collective recording, rather than an individual's estimation, of household water usage. Accordingly, there is merit in further exploring the use of diaries for the collection of water usage information, either in combination with a mail-out for recruitment or, potentially, with Internet-based recruitment in the future (as household Internet uptake increases).
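
The abstract does not state which agreement statistic was used. The sketch below uses raw agreement and Cohen's kappa as one common choice for categorical responses, applied to made-up paired CATI/diary answers rather than the study's data.

```python
from collections import Counter

# Hypothetical paired responses (reported garden-watering frequency category);
# values are made up for illustration, not taken from the study.
cati  = ["daily", "weekly", "weekly", "never", "daily", "weekly", "never", "daily"]
diary = ["daily", "weekly", "daily",  "never", "daily", "never",  "never", "daily"]

def cohen_kappa(a, b):
    """Chance-corrected agreement between two instruments on categorical responses."""
    n = len(a)
    categories = set(a) | set(b)
    observed = sum(x == y for x, y in zip(a, b)) / n
    counts_a, counts_b = Counter(a), Counter(b)
    expected = sum(counts_a[c] / n * counts_b[c] / n for c in categories)
    return (observed - expected) / (1 - expected)

raw = sum(x == y for x, y in zip(cati, diary)) / len(cati)
print(f"raw agreement: {raw:.2f}")
print(f"Cohen's kappa: {cohen_kappa(cati, diary):.2f}")
```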


Subject(s)
Data Collection/methods , Records , Surveys and Questionnaires , Telephone , Water Supply/statistics & numerical data , Australia , Humans
4.
J Clin Microbiol ; 46(7): 2252-62, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18448696

ABSTRACT

In the present study, we analyzed genetic variation in Cryptosporidium species from humans (n = 62) with clinical cryptosporidiosis in South Australia. Sequence variation was assessed in regions within the small subunit of nuclear rRNA (p-SSU), the 70-kDa heat shock protein (p-hsp70), and the 60-kDa glycoprotein (p-gp60) genes by employing single-strand conformation polymorphism analysis and sequencing. Based on the analyses of p-SSU and p-hsp70, Cryptosporidium hominis (n = 38) and Cryptosporidium parvum (n = 24) were identified. The analysis of p-gp60 revealed eight distinct subgenotypes, classified as C. hominis IaA17R1 (n = 3), IbA9G3R2 (n = 14), IbA10G2R2 (n = 20), and IfA12G1R1 (n = 1), as well as C. parvum IIaA18G3R1 (n = 15), IIaA20G3R1 (n = 6), IIaA22G4R1 (n = 2), and IIcA5G3R2 (n = 1). Subgenotypes IaA17R1 and IIaA22G4R1 are new. Of the six other subgenotypes, IbA10G2R2, IIaA18G3R1, IIaA20G3R1, and IIcA5G3R2 were reported previously from the state of Victoria. This is the fourth record in Australia of C. parvum subgenotype IIaA18G3R1 from humans, which, to date, has been isolated only from cattle in other countries. This subgenotype might be a significant contributor to sporadic human cryptosporidiosis and may indicate a greater zoonotic contribution to the infection of humans in the area of study. Comparative analyses revealed, for the first time, the differences in the genetic makeup of Cryptosporidium populations between two relatively close, major metropolitan cities.
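
The gp60 subgenotype names quoted above (e.g. IIaA18G3R1) encode a subtype family followed by trinucleotide-repeat counts. The small parser below illustrates the widely used naming convention (A = TCA repeats, G = TCG repeats, T = TCT repeats, R = copies of a secondary repeat); it is an illustrative sketch, not part of the study's methods.

```python
import re

# family prefix, then A<n>, with optional G<n>, T<n> and R<n> blocks
GP60_PATTERN = re.compile(r"^(I{1,2}[a-z])A(\d+)(?:G(\d+))?(?:T(\d+))?(?:R(\d+))?$")

def parse_gp60(name: str) -> dict:
    """Split a gp60 subgenotype name into its family and repeat counts."""
    m = GP60_PATTERN.match(name)
    if not m:
        raise ValueError(f"unrecognised gp60 name: {name}")
    family, a, g, t, r = m.groups()
    return {
        "family": family,                 # e.g. Ib (C. hominis), IIa or IIc (C. parvum)
        "TCA_repeats": int(a),
        "TCG_repeats": int(g) if g else 0,
        "TCT_repeats": int(t) if t else 0,
        "R_repeats": int(r) if r else 0,
    }

for name in ["IaA17R1", "IbA10G2R2", "IIaA18G3R1", "IIcA5G3R2"]:
    print(name, parse_gp60(name))
```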


Subject(s)
Cryptosporidiosis/parasitology , Cryptosporidium/classification , Cryptosporidium/isolation & purification , DNA, Protozoan/genetics , Polymorphism, Genetic , Animals , Base Sequence , Cluster Analysis , Cryptosporidium/genetics , DNA, Protozoan/chemistry , Genotype , HSP70 Heat-Shock Proteins/genetics , Humans , Membrane Glycoproteins/genetics , Molecular Epidemiology , Molecular Sequence Data , Mutation , Nucleic Acid Hybridization/methods , Protozoan Proteins/genetics , RNA, Ribosomal, 18S/genetics , Sequence Alignment , Sequence Analysis, DNA , South Australia
5.
J Gastroenterol Hepatol ; 20(11): 1685-90, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16246186

ABSTRACT

BACKGROUND AND AIM: Many individuals with gastrointestinal symptoms do not seek medical attention, so little is known about the pathogens involved in most cases of community gastroenteritis. We aimed to identify the pathogens responsible for community gastroenteritis and to examine the associated symptoms. METHODS: In a prospective study of 2811 subjects over 15 months, fecal pathogens were examined following highly credible gastroenteritis (HCG) events. The population consisted of family units, each with at least two children (≤15 years old) and two adults. Fecal samples were tested for a range of bacterial, viral and protozoal pathogens. Gastroenteric episode duration and symptoms such as vomiting, nausea and diarrhea were measured. RESULTS: One or more pathogens were identified in 198 of a total of 791 specimens collected. The pathogens detected most often were Norovirus (10.7%), pathogenic E. coli (6.7%), Campylobacter spp. (3.0%) and Giardia sp. (2.5%). Children were more prone than adults to infection with all the pathogens tested, except E. coli. Children infected with Campylobacter were 8.3 times more likely (95% CI: 2.7-25.4) to have a longer duration of diarrhea than children with Norovirus (P < 0.001). Similarly, children infected with E. coli had more persistent diarrhea than those with Norovirus (OR = 3.5; 95% CI: 1.3-9.5; P = 0.02). Norovirus infection in children was associated with greater persistence of vomiting than infection with Campylobacter (P = 0.005) or E. coli (P = 0.03), or than episodes in which no pathogen was identified (P = 0.004). Adults usually vomited for fewer days than children, while the duration of diarrhea was similar to that in children. CONCLUSIONS: Many of the pathogens responsible for cases of gastroenteritis in the Australian community are likely to go undetected by current surveillance systems and routine clinical practice.
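
The abstract reports odds ratios with 95% confidence intervals (e.g. OR = 3.5; 95% CI: 1.3-9.5). A standard log-scale (Wald) calculation for a 2x2 table is sketched below, using hypothetical counts that are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and log-scale (Wald) 95% CI for a 2x2 table:
        exposed:   a with outcome, b without
        unexposed: c with outcome, d without
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts (not the study's data): prolonged diarrhea vs not,
# in children with E. coli infection versus children with Norovirus.
or_, lo, hi = odds_ratio_ci(a=14, b=10, c=12, d=30)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```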


Subject(s)
Family Health , Gastroenteritis/microbiology , Gastroenteritis/parasitology , Adolescent , Adult , Australia/epidemiology , Caliciviridae Infections/epidemiology , Campylobacter Infections/epidemiology , Child , Cohort Studies , Escherichia coli Infections/epidemiology , Feces/microbiology , Feces/parasitology , Gastroenteritis/epidemiology , Giardiasis/epidemiology , Humans , Norovirus , Prospective Studies , Randomized Controlled Trials as Topic
6.
Mol Cell Probes ; 19(6): 394-9, 2005 Dec.
Article in English | MEDLINE | ID: mdl-16169706

ABSTRACT

Cryptosporidium oocyst DNA samples (n=80) from humans with cryptosporidiosis in Australia and the UK were characterized genetically and categorized by capillary electrophoretic (CE) analysis of part of the small subunit gene (pSSU; approximately 300 bp) and the second internal transcribed spacer (pITS-2; approximately 230 bp) of nuclear ribosomal DNA. The amplicons were heat denatured and subjected to capillary electrophoresis in LPA matrix (Amersham) in a MegaBACE™ 1000 system (Amersham). The chromatograms captured were stored electronically and then analysed using MegaBACE™ Fragment Profiler software. Using reference DNA control samples representing Cryptosporidium hominis and Cryptosporidium parvum, particular peaks in the profiles were defined for their specific identification and differentiation. The two species could be readily differentiated based on their pSSU profiles, the peak differences being associated with a nucleotide difference of <1.7%. While no variation was detectable in the pSSU profiles within each species, the pITS-2 chromatograms displayed significant intraspecific variability in peak positions. For the 80 samples subjected to CE analysis of the pITS-2, four distinct genetic variants (genotypes) were detected within C. hominis and seven within C. parvum. Based on CE analysis of either pSSU or pITS-2 amplicons, both species could readily be detected in 'mixed' samples. This CE method is time- and cost-effective, and may find applicability as a tool for the high-throughput analysis of oocyst DNA samples in epidemiological surveys and for the monitoring of cryptosporidiosis outbreaks.
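
The species differentiation above is anchored to a pSSU nucleotide difference of <1.7%. The sketch below shows one straightforward way to compute a pairwise percent difference over aligned sequences; the fragments used here are toy examples, not real pSSU data.

```python
def percent_difference(seq1: str, seq2: str) -> float:
    """Pairwise nucleotide difference (%) over aligned positions, ignoring gap columns."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to the same length")
    compared = diffs = 0
    for x, y in zip(seq1.upper(), seq2.upper()):
        if x == "-" or y == "-":
            continue
        compared += 1
        diffs += x != y
    return 100.0 * diffs / compared

# Toy aligned fragments for illustration only (not real pSSU sequences).
hominis_like = "ACGTACGTAAGGCTTACGGT"
parvum_like  = "ACGTACGTAAGGCTTACGCT"
print(f"{percent_difference(hominis_like, parvum_like):.1f}% nucleotide difference")
```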


Subject(s)
Cryptosporidium parvum/genetics , Cryptosporidium/genetics , DNA, Ribosomal Spacer/analysis , Electrophoresis, Capillary/methods , Polymorphism, Restriction Fragment Length , Animals , Cost-Benefit Analysis , DNA, Ribosomal/analysis , DNA, Ribosomal Spacer/chemistry , Genetic Variation , Humans , Sequence Analysis, DNA/methods
7.
J Gastroenterol Hepatol ; 20(9): 1390-4, 2005 Sep.
Article in English | MEDLINE | ID: mdl-16105126

ABSTRACT

BACKGROUND AND AIM: Previous reports regarding the clinical significance and pathogenicity of Blastocystis hominis have been contradictory. The aim of this study was to examine the association between Blastocystis and gastrointestinal symptoms in immunocompetent individuals. METHODS: We monitored over 2800 healthy people for a period of 15 months, taking stool specimens during both asymptomatic periods and periods of gastrointestinal symptoms. RESULTS: After excluding individuals in whom other fecal pathogens were simultaneously identified, we compared the proportions of asymptomatic and symptomatic individuals positive for Blastocystis and found no significant difference (P = 0.5). Symptom status did not correlate with parasite abundance. Some individuals were found to have Blastocystis detected during both asymptomatic and symptomatic periods, possibly suggesting carriage of the organism. CONCLUSION: We found no correlation between clinical symptoms and the presence or absence of Blastocystis in this healthy cohort.


Subject(s)
Blastocystis Infections/complications , Blastocystis hominis , Gastrointestinal Diseases/parasitology , Adolescent , Adult , Animals , Blastocystis Infections/immunology , Blastocystis Infections/parasitology , Child , Child, Preschool , Cohort Studies , Feces/parasitology , Female , Gastrointestinal Diseases/immunology , Humans , Immunocompetence , Male
8.
J Infect Dis ; 192(4): 618-21, 2005 Aug 15.
Article in English | MEDLINE | ID: mdl-16028130

ABSTRACT

This study assessed whether serological responses to Cryptosporidium antigens are associated with a reduced risk of diarrheal illness in cases of infection with human immunodeficiency virus (HIV). The association between serological responses to the Cryptosporidium 15/17-kDa and 27-kDa antigen groups and reported diarrheal illness was examined by use of data from a previously published study of cases of HIV infection. In immunosuppressed individuals, a strong serological response to the 27-kDa antigen group was associated with a reduced risk of diarrhea without weight loss. This finding suggests that acquired protective immunity to cryptosporidiosis may be important in controlling the burden of cryptosporidiosis in immunosuppressed individuals.


Subject(s)
Antigens, Protozoan/immunology , Cryptosporidiosis/immunology , Cryptosporidium/immunology , HIV Infections/immunology , AIDS-Related Opportunistic Infections/immunology , Adult , Animals , Antibodies, Protozoan/biosynthesis , CD4 Lymphocyte Count , Cross-Sectional Studies , Diarrhea/parasitology , Humans , Odds Ratio
9.
Emerg Infect Dis ; 10(10): 1797-805, 2004 Oct.
Article in English | MEDLINE | ID: mdl-15504266

ABSTRACT

As part of a study to determine the effects of water filtration on the incidence of community-acquired gastroenteritis in Melbourne, Australia, we examined fecal samples from patients with gastroenteritis and asymptomatic persons for diarrheagenic strains of Escherichia coli. Atypical strains of enteropathogenic E. coli (EPEC) were the most frequently identified pathogens of all bacterial, viral, and parasitic agents in patients with gastroenteritis. Moreover, atypical EPEC were more common in patients with gastroenteritis (89 [12.8%] of 696) than in asymptomatic persons (11 [2.3%] of 489, p < 0.0001). Twenty-two random isolates of atypical EPEC that were characterized further showed marked heterogeneity in terms of serotype, genetic subtype, and carriage of virulence-associated determinants. Apart from the surface protein, intimin, no virulence determinant or phenotype was uniformly present in atypical EPEC strains. This study shows that atypical EPEC are an important cause of gastroenteritis in Melbourne.
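
The prevalence comparison reported above (89/696 vs 11/489, p < 0.0001) can be reproduced in magnitude with a normal-approximation two-proportion test; the abstract does not state which test the authors used, so the sketch below is illustrative only.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test comparing two proportions (pooled normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value from the standard normal
    return p1, p2, z, p

# Counts taken from the abstract: atypical EPEC in patients with gastroenteritis
# versus asymptomatic persons.
p1, p2, z, p = two_proportion_z(89, 696, 11, 489)
print(f"{p1:.1%} vs {p2:.1%}, z = {z:.1f}, p = {p:.2g}")
```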


Subject(s)
Disease Outbreaks , Escherichia coli Infections/epidemiology , Escherichia coli/pathogenicity , Gastroenteritis/microbiology , Bacterial Adhesion , Cell Line , Community-Acquired Infections/microbiology , Escherichia coli/genetics , Escherichia coli/isolation & purification , Feces/microbiology , Gastroenteritis/epidemiology , Genetic Variation , Humans , Phenotype , Seasons , Victoria/epidemiology , Water Microbiology
10.
J Food Prot ; 67(4): 818-22, 2004 Apr.
Article in English | MEDLINE | ID: mdl-15083738

ABSTRACT

Poor food handling practices in the home are a likely cause of gastroenteritis. This study examined how often reported practices in Australian homes met public health food safety recommendations. During 1998 in Melbourne, Australia, food handling and food storage questionnaires were completed by an adult member in 524 and 515 families, respectively. Each family consisted of at least two adults and two children. Respondents were surveyed regarding washing of hands, cutting boards, and fresh produce; use of kitchen cloths; egg storage; where cooked foods were cooled; the duration before refrigeration of cooked foods; where food types were positioned in the refrigerator; and the method of thawing chicken. Nearly every household reported handling food in a way that could cause food to become contaminated. Overall, 99.0% of respondents reported some form of mishandling, which encompassed 70.3% who handled food preparation surfaces poorly, 46.6% who did not wash their hands appropriately or in a timely manner, 41.7% who mishandled raw foods, and 70.1% who mishandled cooked foods. Food was inappropriately located in the refrigerator by 81.2%, and chicken was thawed using unsafe means by 76.3% of respondents. People preparing food in the home need to be reminded of the increased risk of disease that can arise from poor food handling practices.


Subject(s)
Consumer Product Safety , Disinfection/methods , Food Handling/methods , Foodborne Diseases/prevention & control , Health Knowledge, Attitudes, Practice , Adolescent , Adult , Australia , Child , Child, Preschool , Cross Infection , Female , Food Microbiology , Humans , Infant , Male , Surveys and Questionnaires
12.
Ophthalmology ; 111(1): 75-84, 2004 Jan.
Article in English | MEDLINE | ID: mdl-14711717

ABSTRACT

OBJECTIVE: To determine whether treatment with vitamin E (500 IU daily) reduces either the incidence or rate of progression of age-related cataracts. DESIGN: A prospective, randomized, double-masked, placebo-controlled clinical trial entitled the Vitamin E, Cataract and Age-Related Maculopathy Trial. PARTICIPANTS: Of 1906 screened volunteers, 1193 eligible subjects with early or no cataract, aged 55 to 80 years, were enrolled and followed up for 4 years. INTERVENTION: Subjects were assigned randomly to receive either 500 IU of natural vitamin E in soybean oil encapsulated in gelatin or a placebo with an identical appearance. MAIN OUTCOME MEASURES: The incidence and progression rates of age-related cataract were assessed annually with both clinical lens opacity gradings and computerized analysis of Scheimpflug and retroillumination digital lens images obtained with a Nidek EAS-1000 lens camera. The analysis was undertaken using data from the eye with the more advanced opacity for each type of cataract separately and for any cataract changes in each individual. RESULTS: Overall, 87% of the study population completed the 4 years of follow-up, with 74% of the vitamin E group and 76% of the placebo group continuing on their randomized treatment allocation throughout this time. For cortical cataract, the 4-year cumulative incidence rate was 4.5% among those randomized to vitamin E and 4.8% among those randomized to placebo (P = 0.87). For nuclear cataract, the corresponding rates were 12.9% and 12.1% (P = 0.77). For posterior subcapsular cataract, the rates were 1.7% and 3.5% (P = 0.08), whereas for any of these forms of cataract, they were 17.1% and 16.7%, respectively. Progression of cortical cataract was seen in 16.7% of the vitamin E group and 18.4% of the placebo group (P = 0.76). Corresponding rates for nuclear cataract were 11.4% and 11.9% (P = 0.84), whereas those of any cataract were 16.5% and 16.7%, respectively. There was no difference in the rate of cataract extraction between the 2 groups (P = 0.87). Lens characteristics of the participants withdrawn from the randomized medications were not different from those who continued. CONCLUSIONS: Vitamin E given for 4 years at a dose of 500 IU daily did not reduce the incidence of or progression of nuclear, cortical, or posterior subcapsular cataracts. These findings do not support the use of vitamin E to prevent the development or to slow the progression of age-related cataracts.


Subject(s)
Antioxidants/administration & dosage , Cataract/physiopathology , Lens, Crystalline/physiopathology , Vitamin E/administration & dosage , Aged , Aged, 80 and over , Aging/physiology , Capsules , Cataract/epidemiology , Cataract/prevention & control , Diagnostic Techniques, Ophthalmological , Dietary Supplements , Disease Progression , Double-Blind Method , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Prospective Studies , Quality Control , Victoria/epidemiology
13.
J Med Virol ; 69(4): 568-78, 2003 Apr.
Article in English | MEDLINE | ID: mdl-12601766

ABSTRACT

Endemic gastroenteritis associated with the Norwalk-like viruses (NLVs) is little understood. This study tested for NLV in gastroenteritis cases in 257 households in Melbourne, Australia, between September 1997 and February 1999, using a reverse transcription hemi-nested polymerase chain reaction. Positive samples were studied by nucleotide sequencing and phylogenetic analysis. NLV was detected in 73 (11.4%) of 638 faecal specimens tested; 12 (1.9%) were NLV genogroup 1 (G1) and 61 (9.6%) NLV genogroup 2 (G2). Gastroenteritis symptoms associated with NLV G2 in the absence of other pathogens were significantly more severe than in cases where no NLV was detected. NLV G1 and NLV G2 were detected in both adults and children, and in both males and females. NLV G2 incidence showed a marked seasonal periodicity, with significant peaks in the Australian late spring/early summer, and NLV G1 seasonality was significantly different from that of NLV G2. Seven major NLV clusters were identified by phylogenetic analysis.
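
As a rough illustration of how seasonal periodicity in detections could be tabulated, the sketch below groups hypothetical specimen dates (not the study data) into southern-hemisphere seasons.

```python
from collections import Counter
from datetime import date

# Hypothetical NLV G2-positive specimen dates, made up for illustration.
g2_positive_dates = [
    date(1997, 11, 3), date(1997, 11, 21), date(1997, 12, 2), date(1997, 12, 18),
    date(1998, 1, 9), date(1998, 4, 14), date(1998, 7, 30), date(1998, 11, 11),
    date(1998, 11, 27), date(1998, 12, 5), date(1998, 12, 22), date(1999, 1, 15),
]

# Month-to-season mapping for the southern hemisphere.
SOUTHERN_SEASONS = {12: "summer", 1: "summer", 2: "summer",
                    3: "autumn", 4: "autumn", 5: "autumn",
                    6: "winter", 7: "winter", 8: "winter",
                    9: "spring", 10: "spring", 11: "spring"}

by_season = Counter(SOUTHERN_SEASONS[d.month] for d in g2_positive_dates)
for season, count in by_season.most_common():
    print(f"{season:7s} {count:2d}")
```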


Subject(s)
Gastroenteritis/epidemiology , Gastroenteritis/physiopathology , Norovirus/classification , Norovirus/genetics , Adolescent , Adult , Australia/epidemiology , Caliciviridae Infections/epidemiology , Caliciviridae Infections/physiopathology , Caliciviridae Infections/virology , Child , Child, Preschool , Endemic Diseases , Feces/virology , Female , Gastroenteritis/virology , Humans , Incidence , Infant , Infant, Newborn , Male , Middle Aged , Norovirus/isolation & purification , Norovirus/pathogenicity , Seasons , Sequence Analysis, DNA
14.
Aust N Z J Public Health ; 27(4): 399-404, 2003.
Article in English | MEDLINE | ID: mdl-14705301

ABSTRACT

OBJECTIVE: To provide recent data regarding the epidemiology of community-based respiratory infections in Australia. METHODS: A longitudinal study between 1997 and 1999 involving collection of a health diary from 600 families in Melbourne. RESULTS: More than 80% of study participants reported at least one respiratory episode over 15 months. An average of 2.2 respiratory episodes per person per year was reported, with a mean episode duration of 6.3 days. On average, subjects were symptomatic for 4.2% of the study days. Compared with other age groups, children aged less than two years were most likely to have at least one respiratory episode, had a greater number of episodes per person, and had the longest episode duration (6.8 days). Approximately one in three (28.7%) respiratory episodes was associated with a doctor's visit, and one in four (23%) necessitated time off school or work. Exposure to other people with respiratory symptoms was commonly reported. CONCLUSIONS: Respiratory infections are common, cause a significant amount of morbidity, and are major contributors to the total community health burden. IMPLICATIONS: The direct and indirect costs of respiratory infections to the community are substantial.


Subject(s)
Cost of Illness , Respiratory Tract Infections/epidemiology , Adolescent , Adult , Age Factors , Australia/epidemiology , Child , Child, Preschool , Gastroenteritis/complications , Gastroenteritis/microbiology , Humans , Infant , Infant, Newborn , Longitudinal Studies , Middle Aged , Residence Characteristics , Respiratory Tract Infections/complications , Time Factors , Water Supply
15.
Med J Aust ; 177(11-12): 609-13, 2002.
Article in English | MEDLINE | ID: mdl-12463978

ABSTRACT

The risk of contamination of drinking water supplies with microbial pathogens is minimised by modern approaches to water management, but microbial contamination remains the major public health concern. Chemical contaminants usually pose little health risk except at very high levels, although debate continues over the potential adverse health effects of low-level, chronic exposure to compounds such as disinfection byproducts. Recreational water contact can be associated with adverse health outcomes, either from microbial infections or from exposure to cyanobacterial toxins. Environmental issues such as increasing salinity and global warming are likely to affect the sustainability of our current drinking water supplies and increase the threat of waterborne disease outbreaks. New technologies, alternative water sources such as rainwater tanks, water reuse and water restrictions will undoubtedly be part of the solution to our diminishing water resources, but they have the potential to introduce new health threats.


Subject(s)
Water Pollution/adverse effects , Australia , Disease Outbreaks , Humans , Infections/epidemiology , Infections/etiology , Water Microbiology , Water Pollutants, Chemical/adverse effects , Water Pollutants, Chemical/analysis , Water Pollution/analysis , Water Pollution/prevention & control , Water Supply/standards
16.
Environ Health Perspect ; 110(7): 679-87, 2002 Jul.
Article in English | MEDLINE | ID: mdl-12117645

ABSTRACT

We addressed the need for a biomarker of ingestion exposure to drinking water disinfection by-products by performing a human exposure trial. We evaluated urinary excretion of trichloroacetic acid (TCAA) as an exposure biomarker using 10 volunteers who normally consume their domestic tap water. We recruited the volunteers at a water quality research laboratory in Adelaide, Australia. Participants maintained a detailed consumption and exposure diary over the 5-week study. We also analyzed tap water and first morning urine (FMU) samples for TCAA, and tap water for chloral hydrate (CH). We documented both interindividual and intraindividual variability in TCAA ingestion and urinary excretion, and both were substantial. With a TCAA-free bottled water intervention, we used creatinine-adjusted urinary TCAA levels to estimate urinary TCAA excretion half-lives for three of the participants. We observed correspondence over time between estimated TCAA excretion, calculated from TCAA + CH ingestion levels, and measured TCAA urinary excretion. This study demonstrates the merits and feasibility of using TCAA in FMU as an exposure biomarker, and reveals remaining concerns about possible alternate sources of TCAA exposure for individuals with low drinking water ingestion exposure.
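
Excretion half-lives of the kind mentioned above are typically estimated by fitting first-order decay to serial measurements. The sketch below fits ln(concentration) against time by ordinary least squares, using made-up creatinine-adjusted urinary TCAA values; this reflects a general assumption about the approach, not the authors' exact method.

```python
import math

def fit_half_life(times_h, concentrations):
    """Estimate an elimination half-life by least squares on ln(concentration) vs time.

    Assumes simple first-order (single-compartment) decay: C(t) = C0 * exp(-k * t).
    """
    n = len(times_h)
    ys = [math.log(c) for c in concentrations]
    mean_t = sum(times_h) / n
    mean_y = sum(ys) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(times_h, ys))
    den = sum((t - mean_t) ** 2 for t in times_h)
    k = -(num / den)                  # elimination rate constant (per hour)
    return math.log(2) / k

# Hypothetical creatinine-adjusted urinary TCAA values (hours after switching to
# TCAA-free bottled water); these numbers are made up for illustration.
times = [0, 24, 48, 72, 96, 120]
tcaa = [12.0, 9.8, 7.6, 6.1, 4.9, 4.0]
print(f"estimated half-life: {fit_half_life(times, tcaa):.0f} h")
```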


Subject(s)
Biomarkers/analysis , Caustics/analysis , Disinfectants/adverse effects , Environmental Exposure , Trichloroacetic Acid/urine , Water Supply , Adult , Female , Humans , Male , Middle Aged , Organic Chemicals