1.
Sci Rep ; 13(1): 1444, 2023 01 25.
Article in English | MEDLINE | ID: mdl-36697451

ABSTRACT

The rate of soil-transmitted helminth (STH) infection is estimated to be around 20% in Indonesia. Health promotion and health education are cost-effective strategies to supplement STH prevention and control programs. Existing studies suggest that quantitative tools for knowledge, attitudes and practices (KAP) are important to monitor effective community-based STH interventions. However, evidence is limited regarding the applicability of such tools. This study aims to identify the socio-demographic predictors of STH-related knowledge and practices and to validate the quantitative tools for population use. A cross-sectional study was conducted among residents of 16 villages in Central Java, Indonesia. Adult and child respondents were interviewed to assess general knowledge and practices in relation to STH. Two mixed effects models identified the significant factors in predicting knowledge and practice scores. The model-predicted knowledge and practice scores were compared with the observed scores to validate the quantitative measurements developed in this study. Participants' socio-demographic variables were significant in predicting an individual's STH-related knowledge level and their hand washing and hygiene practices, taking into account household-level variability. Model validation results confirmed that the quantitative measurement tools were suitable for assessing STH-associated knowledge and behaviour. The questionnaire developed in this study can be used to support school- and community-based health education interventions to maximize the effect of STH prevention and control programs.
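
A minimal sketch of the household-level mixed-effects modelling and predicted-versus-observed validation described above; the file name and variable names (knowledge_score, household_id, and the socio-demographic predictors) are illustrative assumptions, not the study's actual instrument.

```python
# Sketch: linear mixed-effects model of knowledge scores with a household
# random intercept, plus a predicted-vs-observed check. Hypothetical data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("kap_survey.csv")  # hypothetical long-format survey table

# Fixed effects: assumed socio-demographic predictors; the random intercept
# accounts for household-level variability, as in the abstract.
model = smf.mixedlm(
    "knowledge_score ~ age_group + sex + education + occupation",
    data=df,
    groups=df["household_id"],
)
fit = model.fit()
print(fit.summary())

# Validation: compare model-predicted scores with observed scores.
df["predicted"] = fit.fittedvalues
print(df[["knowledge_score", "predicted"]].corr())
```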


Subject(s)
Helminthiasis , Helminths , Child , Adult , Humans , Animals , Soil , Indonesia/epidemiology , Cross-Sectional Studies , Helminthiasis/epidemiology , Helminthiasis/prevention & control , Surveys and Questionnaires , Prevalence , Feces
3.
Epidemiol Infect ; 147: e152, 2019 01.
Article in English | MEDLINE | ID: mdl-31063089

ABSTRACT

Clostridium difficile infections (CDIs) affect patients in hospitals and in the community, but the relative importance of transmission in each setting is unknown. We developed a mathematical model of C. difficile transmission in a hospital and surrounding community that included infants, adults and transmission from animal reservoirs. We assessed the role of these transmission routes in maintaining disease and evaluated the recommended classification system for hospital- and community-acquired CDIs. The reproduction number in the hospital was <1 for all scenarios. Outside the hospital, the reproduction number was >1 for nearly all scenarios without transmission from animal reservoirs (range: 1.0-1.34). However, the reproduction number for the human population fell below 1 if a modest proportion (3.5-26.0%) of human exposures originated from animal reservoirs. Symptomatic adults accounted for <10% of transmission in the community. Under conservative assumptions, infants accounted for 17% of community transmission. An estimated 33-40% of community-acquired cases were reported, but 28-39% of these reported cases were misclassified as hospital-acquired by recommended definitions. Transmission could be plausibly sustained by asymptomatically colonised adults and infants in the community or exposure to animal reservoirs, but not by hospital transmission alone. Under-reporting of community-onset cases and systematic misclassification underplays the role of community transmission.
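
A toy sketch of the kind of two-setting (hospital plus community) colonisation model with an animal-reservoir force of infection described above; the compartments, rates and parameter values are placeholders, not the authors' fitted model.

```python
# Sketch: toy SIS-style model of C. difficile colonisation in a community
# linked to a hospital, with a constant force of infection from an animal
# reservoir. All parameters are illustrative placeholders.
import numpy as np
from scipy.integrate import odeint

beta_c, beta_h = 0.10, 0.05    # human-to-human transmission rates
lam_animal = 0.002             # exposure rate from animal reservoirs
gamma = 0.05                   # clearance rate
admit, discharge = 0.001, 0.1  # movement between community and hospital

def deriv(y, t):
    Sc, Cc, Sh, Ch = y  # susceptible/colonised, community and hospital
    foi_c = beta_c * Cc + lam_animal
    foi_h = beta_h * Ch + lam_animal
    dSc = gamma * Cc - foi_c * Sc - admit * Sc + discharge * Sh
    dCc = foi_c * Sc - gamma * Cc - admit * Cc + discharge * Ch
    dSh = gamma * Ch - foi_h * Sh + admit * Sc - discharge * Sh
    dCh = foi_h * Sh - gamma * Ch + admit * Cc - discharge * Ch
    return [dSc, dCc, dSh, dCh]

t = np.linspace(0, 3650, 3651)   # ten years, daily steps
y0 = [0.94, 0.05, 0.009, 0.001]  # initial compartment fractions
out = odeint(deriv, y0, t)
print("equilibrium colonised fraction:", out[-1, 1] + out[-1, 3])
```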


Subject(s)
Carrier State/epidemiology , Carrier State/veterinary , Clostridium Infections/transmission , Community-Acquired Infections/transmission , Disease Reservoirs , Disease Transmission, Infectious , Animals , Carrier State/microbiology , Clostridium Infections/epidemiology , Community-Acquired Infections/epidemiology , Humans , Infant , Models, Theoretical
4.
J Hosp Infect ; 102(2): 157-164, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30880267

ABSTRACT

BACKGROUND: Clostridium difficile infection (CDI) is the leading cause of antibiotic-associated diarrhoea, with peak incidence in late winter or early spring. Although CDI is commonly associated with hospitals, community transmission is important. AIM: To explore potential drivers of CDI seasonality and the effect of community-based interventions to reduce transmission. METHODS: A mechanistic compartmental model of C. difficile transmission in a hospital and surrounding community was used to determine the effect of reducing transmission or antibiotic prescriptions in these settings. The model was extended to allow for seasonal antibiotic prescriptions and seasonal transmission. FINDINGS: Modelling antibiotic seasonality reproduced the seasonality of CDI, including approximate magnitude (13.9-15.1% above annual mean) and timing of peaks (0.7-1.0 months after peak antibiotics). Halving seasonal excess prescriptions reduced the incidence of CDI by 6-18%. Seasonal transmission produced larger seasonal peaks in the prevalence of community colonization (14.8-22.1% above mean) than seasonal antibiotic prescriptions (0.2-1.7% above mean). Reducing transmission from symptomatic or hospitalized patients had little effect on community-acquired CDI, but reducing transmission in the community by ≥7% or transmission from infants by ≥30% eliminated the pathogen. Reducing antibiotic prescription rates led to approximately proportional reductions in infections, but limited reductions in the prevalence of colonization. CONCLUSION: Seasonal variation in antibiotic prescription rates can account for the observed magnitude and timing of C. difficile seasonality. Even complete prevention of transmission from hospitalized or symptomatic patients cannot eliminate the pathogen, but interventions to reduce transmission from community residents or infants could have a large impact on both hospital- and community-acquired infections.
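
A small numerical sketch of the mechanism described in the findings: a sinusoidal antibiotic prescription rate driving a simple infection compartment yields a damped seasonal peak that lags the prescription peak by under a month. All rates are illustrative placeholders, not the paper's fitted parameters.

```python
# Sketch: a sinusoidal antibiotic prescription rate feeding a simple
# infection compartment; the infection peak lags the prescription peak.
import numpy as np
from scipy.integrate import odeint

mean_rx, amp = 1.0, 0.14  # mean prescription rate, ~14% seasonal excess
recovery = 1.0            # per month; illustrative clearance of risk

def rx(t):
    # Monthly prescription rate peaking at t = 0, 12, 24, ... months.
    return mean_rx * (1 + amp * np.cos(2 * np.pi * t / 12))

def deriv(i, t):
    return rx(t) - recovery * i  # inflow from antibiotics, outflow recovery

t = np.linspace(0, 120, 12001)  # ten years in months
i = odeint(deriv, [mean_rx / recovery], t).ravel()

last_year = t >= 108            # prescriptions peak at t = 108
lag = t[last_year][np.argmax(i[last_year])] - 108
print(f"infection peak lags prescription peak by ~{lag:.1f} months")
```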


Subject(s)
Anti-Bacterial Agents/therapeutic use , Clostridium Infections/prevention & control , Clostridium Infections/transmission , Disease Transmission, Infectious/prevention & control , Drug Utilization , Infection Control/methods , Models, Theoretical , Adult , Aged , Humans , Infant , Prescriptions/statistics & numerical data , Prevalence , Seasons
5.
J Hosp Infect ; 99(4): 453-460, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29258917

ABSTRACT

BACKGROUND: Clostridium difficile infections occur frequently among hospitalized patients, with some infections acquired in hospital and others in the community. International guidelines classify cases as hospital-acquired if symptom onset occurs more than two days after admission. This classification informs surveillance and infection control, but has not been verified by empirical or modelling studies. AIM: To assess current classification of C. difficile acquisition using a simulation model as a reference standard. METHODS: C. difficile transmission was simulated in a range of hospital scenarios. The sensitivity, specificity and precision of classifications that use cut-offs ranging from 0.25 h to 40 days were calculated. The optimal cut-off that correctly estimated the proportion of cases that were hospital acquired and the balanced cut-off that had equal sensitivity and specificity were identified. FINDINGS: The recommended two-day cut-off overestimated the incidence of hospital-acquired cases in all scenarios and by >100% in the base scenario. The two-day cut-off had good sensitivity (96%) but poor specificity (48%) and precision (52%) to identify cases acquired during the current hospitalization. A five-day cut-off was balanced, and a six-day cut-off was optimal in the base scenario. The optimal and balanced cut-offs were more than two days for nearly all scenarios considered (ranges: four to nine days and two to eight days, respectively). CONCLUSION: Current guidelines for classifying C. difficile infections overestimate the proportion of cases acquired in hospital in all model scenarios. To reduce misclassification bias, an infection should be classified as being acquired prior to admission if symptoms begin within five days of admission.
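
A sketch of how day-based cut-offs can be scored against a simulated reference standard, in the spirit of the analysis above; the onset-day distributions and the 30% hospital-acquired share are invented for illustration.

```python
# Sketch: sensitivity, specificity and precision of onset-day cut-offs for
# classifying hospital-acquired cases against a simulated "truth".
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# True origin: True = acquired in hospital, False = acquired before admission.
hospital_acquired = rng.random(n) < 0.3
# Days from admission to symptom onset (placeholder distributions).
onset_day = np.where(hospital_acquired,
                     rng.gamma(shape=3.0, scale=2.0, size=n),  # later onset
                     rng.exponential(scale=2.0, size=n))       # earlier onset

for cutoff in [2, 5, 6, 10]:
    classified_ha = onset_day > cutoff
    tp = np.sum(classified_ha & hospital_acquired)
    sens = tp / hospital_acquired.sum()
    spec = np.sum(~classified_ha & ~hospital_acquired) / (~hospital_acquired).sum()
    prec = tp / classified_ha.sum()
    print(f"cut-off {cutoff:2d} d: sens={sens:.2f} spec={spec:.2f} prec={prec:.2f}")
```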


Subject(s)
Clostridioides difficile/isolation & purification , Clostridium Infections/epidemiology , Community-Acquired Infections/diagnosis , Community-Acquired Infections/epidemiology , Cross Infection/diagnosis , Cross Infection/epidemiology , Epidemiologic Methods , Clostridioides difficile/classification , Clostridioides difficile/genetics , Clostridium Infections/microbiology , Community-Acquired Infections/microbiology , Cross Infection/microbiology , Humans , Incidence , Models, Theoretical , Sensitivity and Specificity
6.
J Hosp Infect ; 97(2): 115-121, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28576454

ABSTRACT

BACKGROUND: Hospital volume is known to have a direct impact on the outcomes of major surgical procedures. However, it is unclear if the evidence applies specifically to surgical site infections. AIMS: To determine if there are procedure-specific hospital outliers [with higher surgical site infection rates (SSIRs)] for four major surgical procedures, and to examine if hospital volume is associated with SSIRs in the context of outlier performance in New South Wales (NSW), Australia. METHODS: Adults who underwent one of four surgical procedures (colorectal, joint replacement, spinal and cardiac procedures) at a NSW healthcare facility between 2002 and 2013 were included. The hospital volume for each of the four surgical procedures was categorized into tertiles (low, medium and high). Multi-variable logistic regression models were built to estimate the expected SSIR for each procedure. The expected SSIRs were used to compute indirectly standardized SSIRs, which were then plotted in funnel plots to identify hospital outliers. FINDINGS: One hospital was identified to be an overall outlier (higher SSIRs for three of the four procedures performed in its facilities), whereas two hospitals were outliers for one specific procedure throughout the entire study period. Low-volume facilities performed the best for colorectal surgery and worst for joint replacement and cardiac surgery. One high-volume facility was an outlier for spinal surgery. CONCLUSIONS: Surgical site infections seem to be mainly a procedure-specific, as opposed to a hospital-specific, phenomenon in NSW. The association between hospital volume and SSIRs differs for different surgical procedures.
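
A sketch of indirect standardization and funnel-limit screening for hospital outliers as described in the methods; the data file, case-mix covariates (age, sex, asa_score) and the 99.8% control limits are assumptions for illustration.

```python
# Sketch: indirectly standardised surgical site infection rates (SSIRs)
# and approximate funnel-plot control limits. Hypothetical data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("procedures.csv")  # hypothetical patient-level data

# Case-mix model: expected infection risk per patient (ssi coded 0/1).
fit = smf.logit("ssi ~ age + sex + asa_score", data=df).fit()
df["expected"] = fit.predict(df)

# Indirectly standardised SSIR per hospital: observed / expected.
by_hosp = df.groupby("hospital").agg(obs=("ssi", "sum"),
                                     exp=("expected", "sum"))
by_hosp["ssir"] = by_hosp["obs"] / by_hosp["exp"]

# Approximate 99.8% funnel limits around SSIR = 1.
z = stats.norm.ppf(0.999)
by_hosp["upper"] = 1 + z / np.sqrt(by_hosp["exp"])
by_hosp["lower"] = 1 - z / np.sqrt(by_hosp["exp"])
print(by_hosp[by_hosp["ssir"] > by_hosp["upper"]])  # high outliers
```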


Subject(s)
Arthroplasty, Replacement/statistics & numerical data , Cardiac Surgical Procedures/statistics & numerical data , Colorectal Surgery/statistics & numerical data , Hospitals/statistics & numerical data , Spine/surgery , Surgical Wound Infection/epidemiology , Aged , Cross Infection/epidemiology , Databases, Factual/statistics & numerical data , Female , Health Services Research , Humans , Logistic Models , Male , Middle Aged , New South Wales/epidemiology
7.
Prev Vet Med ; 140: 78-86, 2017 May 01.
Article in English | MEDLINE | ID: mdl-28460753

ABSTRACT

Results obtained from a nationwide longitudinal study were extended to estimate the population-level effects of selected risk factors on the incidence of bovine respiratory disease (BRD) during the first 50 days at risk in medium-sized to large Australian feedlots. Population attributable fractions (PAF) and population attributable risks (PAR) were used to rank selected risk factors in order of importance from the perspective of the Australian feedlot industry within two mutually exclusive categories: 'intervention' risk factors had practical strategies that feedlot managers could implement to avoid exposure of cattle to adverse levels of the risk factor and a precise estimate of the population-level effect, while 'others' did not. An alternative method was also used to quantify the expected effects of simultaneously preventing exposure to multiple management-related factors whilst not changing exposure to factors that were more difficult to modify. The most important 'intervention' risk factors were shared pen water (PAF: 0.70, 95% credible interval: 0.45-0.83), breed (PAF: 0.67, 95% credible interval: 0.54-0.77), the animal's prior lifetime history of mixing with cattle from other herds (PAF: 0.53, 95% credible interval: 0.30-0.69), timing of the animal's move to the vicinity of the feedlot (PAF: 0.45, 95% credible interval: 0.17-0.68), the presence of Bovine viral diarrhoea virus 1 (BVDV-1) in the animal's cohort (PAF: 0.30, 95% credible interval: 0.04-0.50), the number of study animals in the animal's group 13 days before induction (PAF: 0.30, 95% credible interval: 0.10-0.44) and induction weight (PAF: 0.16, 95% credible interval: 0.09-0.23). Other important risk factors identified and prioritised for further research were feedlot region, season of induction and cohort formation patterns. An estimated 82% of BRD incidence was attributable to management-related risk factors, whereby the lowest risk category of a composite management-related variable comprised animals in the lowest risk category of at least four of the five component variables (shared pen water, mixing, move timing, BVDV-1 in the cohort and the number of animals in the group 13 days before induction). This indicated that widespread adoption of appropriate interventions, including ensuring pen water is not shared between pens, optimising animal mixing before induction, moving animals to the vicinity of the feedlot earlier, managing group size before placement in feedlot pens, and avoiding BVDV-1 in cohorts, could markedly reduce the incidence of BRD in medium-sized to large Australian feedlots.
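
The ranking above rests on the population attributable fraction; below is a minimal sketch of Levin's formula, a standard way to compute a PAF from exposure prevalence and relative risk. The paper derives its estimates from fitted multilevel models, so this is the underlying concept rather than the authors' exact computation, and the inputs are illustrative.

```python
# Sketch: Levin's formula for the population attributable fraction (PAF).
def paf_levin(prevalence: float, relative_risk: float) -> float:
    """PAF = p(RR - 1) / (1 + p(RR - 1)) for exposure prevalence p."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# e.g. a hypothetical exposure present in 60% of cattle with RR = 4
print(f"PAF = {paf_levin(0.6, 4.0):.2f}")  # -> 0.64
```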


Subject(s)
Animal Husbandry/methods , Bovine Respiratory Disease Complex/epidemiology , Bovine Respiratory Disease Complex/etiology , Animals , Australia/epidemiology , Cattle , Female , Incidence , Logistic Models , Longitudinal Studies , Male , Risk Factors
8.
Clin Microbiol Infect ; 23(1): 48.e1-48.e7, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27615716

ABSTRACT

OBJECTIVES: To investigate the prevalence and risk factors for asymptomatic toxigenic (TCD) and nontoxigenic Clostridium difficile (NTCD) colonization in a broad cross section of the general hospital population over a 3-year period. METHODS: Patients without diarrhoea admitted to two Australian tertiary hospitals were randomly selected through six repeated cross-sectional surveys conducted between 2012 and 2014. Stool specimens were cultured under anaerobic conditions, and C. difficile isolates were tested for the presence of toxin genes and ribotyped. Patients were then grouped into noncolonized, TCD colonized or NTCD colonized for identifying risk factors using multinomial logistic regression models. RESULTS: A total of 1380 asymptomatic patients were enrolled; 76 patients (5.5%) were TCD colonized and 28 (2.0%) were NTCD colonized. There was a decreasing annual trend in TCD colonization, and asymptomatic colonization was more prevalent during the summer than winter months. TCD colonization was associated with gastro-oesophageal reflux disease (relative risk ratio (RRR) = 2.20; 95% confidence interval (CI) 1.17-4.14), higher number of admissions in the previous year (RRR = 1.24; 95% CI 1.10-1.39) and antimicrobial exposure during the current admission (RRR = 2.78; 95% CI 1.23-6.28). NTCD colonization was associated with chronic obstructive pulmonary disease (RRR = 3.88; 95% CI 1.66-9.07) and chronic kidney failure (RRR = 5.78; 95% CI 2.29-14.59). Forty-eight different ribotypes were identified, with 014/020 (n = 23), 018 (n = 10) and 056 (n = 6) being the most commonly isolated. CONCLUSIONS: Risk factors differ between patients with asymptomatic colonization by toxigenic and nontoxigenic strains. Given that morbidity is largely driven by toxigenic strains, this novel finding has important implications for disease control and prevention.
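
A sketch of the multinomial logistic regression described in the methods, with noncolonized patients as the reference category and coefficients exponentiated to relative risk ratios (RRRs); the data file and covariate names are assumptions.

```python
# Sketch: multinomial logistic regression for three colonisation states,
# reporting relative risk ratios (RRRs). Hypothetical data and columns.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("colonisation.csv")  # hypothetical patient-level data
# status: 0 = not colonised (reference), 1 = TCD, 2 = NTCD
fit = smf.mnlogit(
    "status ~ gord + admissions_last_year + current_antibiotics"
    " + copd + chronic_kidney_failure",
    data=df,
).fit()

rrr = np.exp(fit.params)     # coefficients exponentiate to RRRs
ci = np.exp(fit.conf_int())  # 95% intervals on the RRR scale
print(rrr)
```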


Subject(s)
Carrier State , Clostridioides difficile/isolation & purification , Hospitals , Adult , Aged , Aged, 80 and over , Australia/epidemiology , Clostridium Infections/epidemiology , Clostridium Infections/microbiology , Cross-Sectional Studies , Female , Humans , Male , Middle Aged , Risk Factors , Seasons
9.
Sci Rep ; 6: 30299, 2016 07 25.
Article in English | MEDLINE | ID: mdl-27452598

ABSTRACT

To prevent diseases associated with inadequate sanitation and poor hygiene, people needing latrines and behavioural interventions must be identified. We compared two indicators that could be used to identify those people. Indicator 1 of household latrine coverage was a simple Yes/No response to the question "Does your household have a latrine?" Indicator 2 was more comprehensive, combining questions about defecation behaviour with observations of latrine conditions. Using a standardized procedure and questionnaire, trained research assistants collected data from 6,599 residents of 16 rural villages in Indonesia. Indicator 1 identified 30.3% as not having a household latrine, while Indicator 2 identified 56.0% as using unimproved sanitation. Indicator 2 thus identified an additional 1,710 people who were missed by Indicator 1. Those 1,710 people were of lower socioeconomic status (p < 0.001), and a smaller percentage practiced appropriate hand-washing (p < 0.02). These results show how a good indicator of the need for sanitation and hygiene interventions can combine evidence of both access and use, drawn from self-reports and objective observation. Such an indicator can inform decisions about sanitation-related interventions and about scaling deworming programmes up or down. Further, a comprehensive and locally relevant indicator allows improved targeting of those most in need of a hygiene-behaviour intervention.
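
A sketch of how the two indicators might be constructed and cross-tabulated to find people flagged by the comprehensive indicator but missed by the simple one; the column names and response codings are assumptions, not the study's questionnaire.

```python
# Sketch: build both sanitation indicators from survey responses and
# cross-tabulate them. Hypothetical data and column names.
import pandas as pd

df = pd.read_csv("village_survey.csv")  # hypothetical resident-level data

# Indicator 1: simple self-report of household latrine ownership.
ind1_no_latrine = df["has_latrine"] == "no"

# Indicator 2: unimproved sanitation = reported open defecation OR an
# observed latrine that fails basic improvement criteria.
ind2_unimproved = (df["defecates_openly"] == "yes") | (
    (df["has_latrine"] == "yes") & (df["latrine_improved"] == "no")
)

# People flagged by the comprehensive indicator but missed by the simple one.
print(pd.crosstab(ind1_no_latrine, ind2_unimproved,
                  rownames=["no latrine (Ind. 1)"],
                  colnames=["unimproved (Ind. 2)"]))
```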


Subject(s)
Hand Disinfection , Hygiene , Sanitation , Adolescent , Adult , Aged , Aged, 80 and over , Clinical Trials as Topic , Environment , Family Characteristics , Female , Humans , Indonesia/epidemiology , Male , Middle Aged , Rural Population , Social Class , Surveys and Questionnaires , Toilet Facilities , Young Adult
10.
Environ Sci Pollut Res Int ; 23(18): 18639-48, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27306209

ABSTRACT

In this study, we evaluated the efficacy of propidium monoazide quantitative polymerase chain reaction (PMA-qPCR) to differentiate between viable and non-viable Ancylostoma caninum ova. The newly developed method was validated using raw wastewater seeded with known numbers of A. caninum ova. Results of this study confirmed that PMA-qPCR resulted in an average 88% reduction (P < 0.05) in gene copy numbers for mixtures of 50% viable + 50% non-viable ova when compared with 100% viable ova. A reduction of 100% in gene copies was observed for 100% non-viable ova when compared with 100% viable ova. Similar reductions (79-80%) in gene copies were observed for A. caninum ova-seeded raw wastewater samples (n = 18) collected from wastewater treatment plants (WWTPs) A and B. The newly developed PMA-qPCR method was applied to determine the viable ova of different helminths (A. caninum, A. duodenale, Necator americanus and Ascaris lumbricoides) in raw wastewater, human fecal and soil samples. None of the unseeded wastewater samples were positive for the above-mentioned helminths. N. americanus and A. lumbricoides ova were found in unseeded human fecal and soil samples. For the unseeded human fecal samples (1 g), the average gene copy concentrations obtained from qPCR and PMA-qPCR were similar (6.8 × 10⁵ ± 6.4 × 10⁵ and 6.3 × 10⁵ ± 4.7 × 10⁵, respectively), indicating the presence of viable N. americanus ova. Among the 24 unseeded soil samples tested, only one was positive for A. lumbricoides. The mean gene copy concentration in the positively identified soil sample was 1.0 × 10⁵ ± 1.5 × 10⁴ (determined by qPCR) compared to 4.9 × 10⁴ ± 3.7 × 10³ (determined by PMA-qPCR). The newly developed PMA-qPCR methods were able to detect viable helminth ova from wastewater and soil samples and could be adapted for health risk assessment.
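
The viability signal here is the drop in gene copies between untreated qPCR and PMA-qPCR; a trivial sketch of that comparison follows (the input numbers are illustrative, not the study's measurements).

```python
# Sketch: per-sample reduction in gene copies between qPCR (total ova)
# and PMA-qPCR (viable ova only).
def percent_reduction(qpcr_copies: float, pma_qpcr_copies: float) -> float:
    """Percent drop in gene copies after PMA treatment."""
    return 100 * (1 - pma_qpcr_copies / qpcr_copies)

# Illustrative values: a large reduction implies mostly non-viable ova.
print(f"{percent_reduction(1.0e5, 1.2e4):.0f}% reduction")
```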


Subject(s)
Environmental Monitoring/methods , Feces/parasitology , Helminths/physiology , Ovum , Propidium , Real-Time Polymerase Chain Reaction/methods , Soil/parasitology , Wastewater/parasitology , Animals , Azides , Humans , Propidium/analogs & derivatives
11.
Prev Vet Med ; 128: 23-32, 2016 Jun 01.
Article in English | MEDLINE | ID: mdl-27237387

ABSTRACT

Bovine respiratory disease (BRD) is the major cause of clinical disease and death in feedlot cattle. A prospective longitudinal study was conducted in a population of Australian feedlot cattle to assess associations between factors related to feedlot management and risk of BRD. In total, 35,131 animals in 170 pens (cohorts) inducted into 14 feedlots were included in statistical analyses. Causal diagrams were used to inform model building to allow separate estimation of total and direct effects. Multilevel mixed effects logistic regression models were fitted within the Bayesian framework. The placement of pen water troughs such that they could be accessed by animals in adjoining pens was associated with markedly increased risk of BRD (OR 4.3, 95% credible interval: 1.4-10.3). Adding animals to pens over multiple days was associated with increased risk of BRD across all animals in those pens compared to placing all animals in the pen on a single day (total effect: OR 1.9, 95% credible interval: 1.2-2.8). The much attenuated direct effect indicated that this was primarily mediated via factors on indirect pathways, so it may be possible to ameliorate the adverse effects of adding animals to pens over multiple days by altering exposure to these intervening factors (e.g. mixing history). In pens in which animals were added to the pen over multiple days, animals added ≥7 days (OR: 0.7, credible interval: 0.5-0.9) or 1-6 days (OR: 0.8, credible interval: 0.7-1.0) before the last animal was added were at modestly reduced risk of BRD compared to the animals that were added to the pen on the latest day. Further research is required to disentangle effects of cohort formation patterns at animal-level and higher levels on animal-level risk of BRD. Vaccination against Bovine herpesvirus 1 at feedlot entry was investigated but results were inconclusive and further research is required to evaluate vaccine efficacy. We conclude that there are practical interventions available to feedlot managers to reduce the risk of cattle developing BRD at the feedlot. We recommend placement of water troughs in feedlot pens so that they cannot be accessed by animals in adjoining pens. Further research is required to identify practical and cost-effective management strategies that allow longer adaptation times for cattle identified prior to induction as being at higher risk of developing BRD.
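
A sketch of the total-versus-direct-effect contrast described above, using plain single-level logistic fits as a stand-in for the paper's multilevel Bayesian models; the data file, exposure, confounder and mediator names are assumptions chosen to mirror the text.

```python
# Sketch: total vs. direct effect of filling a pen over multiple days,
# via models fitted without and with a hypothesised mediator.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("feedlot_cohorts.csv")  # hypothetical animal-level data

# Total effect: adjust only for assumed confounders of exposure -> BRD.
# fill_over_multiple_days is assumed to be a 0/1 indicator.
total = smf.logit(
    "brd ~ fill_over_multiple_days + season + feedlot_region", data=df
).fit()

# Direct effect: additionally adjust for a mediator on the indirect
# pathway (e.g. mixing history), blocking the mediated component.
direct = smf.logit(
    "brd ~ fill_over_multiple_days + season + feedlot_region + mixing_history",
    data=df,
).fit()

print("total  OR:", np.exp(total.params["fill_over_multiple_days"]))
print("direct OR:", np.exp(direct.params["fill_over_multiple_days"]))
```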


Subject(s)
Animal Husbandry/methods , Bovine Respiratory Disease Complex/epidemiology , Animals , Australia/epidemiology , Bayes Theorem , Bovine Respiratory Disease Complex/virology , Cattle , Logistic Models , Longitudinal Studies , Prospective Studies , Risk Factors
12.
Prev Vet Med ; 127: 37-43, 2016 May 01.
Article in English | MEDLINE | ID: mdl-27094138

ABSTRACT

Bovine respiratory disease (BRD) is the major cause of clinical disease and death in feedlot populations worldwide. A longitudinal study was conducted to assess associations between risk factors related to on-farm management prior to transport to the feedlot and risk of BRD in a population of feedlot beef cattle sourced from throughout the cattle producing regions of Australia. Exposure variables were derived from questionnaire data provided by farmers supplying cattle (N=10,721) that were a subset of the population included in a nationwide prospective study investigating numerous putative risk factors for BRD. Causal diagrams were used to inform model building to allow estimation of effects of interest. Multilevel mixed effects logistic regression models were fitted within the Bayesian framework. Animals that were yard weaned were at reduced risk (OR: 0.7, 95% credible interval: 0.5-1.0) of BRD at the feedlot compared to animals immediately returned to pasture after weaning. Animals that had previously been fed grain (OR: 0.6, 95% credible interval: 0.3-1.1) were probably at reduced risk of BRD at the feedlot compared to animals not previously fed grain. Animals that received prior vaccinations against Bovine viral diarrhoea virus 1 (OR: 0.8, 95% credible interval: 0.5-1.1) or Mannheimia haemolytica (OR: 0.8, 95% credible interval: 0.6-1.0) were also probably at reduced risk compared to non-vaccinated animals. The results of this study confirm that on-farm management before feedlot entry can alter risk of BRD after beef cattle enter feedlots.


Subject(s)
Animal Husbandry/methods , Bovine Respiratory Disease Complex/epidemiology , Animals , Australia/epidemiology , Bovine Respiratory Disease Complex/etiology , Cattle , Logistic Models , Longitudinal Studies , Prospective Studies , Risk Factors
13.
Prev Vet Med ; 127: 121-33, 2016 May 01.
Article in English | MEDLINE | ID: mdl-26972273

ABSTRACT

Bovine respiratory disease (BRD) is the most important cause of clinical disease and death in feedlot cattle. Respiratory viral infections are key components in predisposing cattle to the development of this disease. To quantify the contribution of four viruses commonly associated with BRD, a case-control study was conducted nested within the National Bovine Respiratory Disease Initiative project population in Australian feedlot cattle. Effects of exposure to Bovine viral diarrhoea virus 1 (BVDV-1), Bovine herpesvirus 1 (BoHV-1), Bovine respiratory syncytial virus (BRSV) and Bovine parainfluenza virus 3 (BPIV-3), and to combinations of these viruses, were investigated. Based on weighted seroprevalences at induction (when animals were enrolled and initial samples collected), the percentages of the project population estimated to be seropositive were 24% for BoHV-1, 69% for BVDV-1, 89% for BRSV and 91% for BPIV-3. For each of the four viruses, seropositivity at induction was associated with reduced risk of BRD (OR: 0.6-0.9), and seroincrease from induction to second blood sampling (35-60 days after induction) was associated with increased risk of BRD (OR: 1.3-1.5). Compared to animals that were seropositive for all four viruses at induction, animals were at progressively increased risk with increasing number of viruses for which they were seronegative; those seronegative for all four viruses were at greatest risk (OR: 2.4). Animals that seroincreased for one or more viruses from induction to second blood sampling were at increased risk (OR: 1.4-2.1) of BRD compared to animals that did not seroincrease for any viruses. Collectively these results confirm that prior exposure to these viruses is protective while exposure at or after feedlot entry increases the risk of development of BRD in feedlots. However, the modest increases in risk associated with seroincrease for each virus separately, and the progressive increases in risk with multiple viral exposures highlights the importance of concurrent infections in the aetiology of the BRD complex. These findings indicate that, while efficacious vaccines could aid in the control of BRD, vaccination against one of these viruses would not have large effects on population BRD incidence but vaccination against multiple viruses would be expected to result in greater reductions in incidence. The findings also confirm the multifactorial nature of BRD development, and indicate that multifaceted approaches in addition to efficacious vaccines against viruses will be required for substantial reductions in BRD incidence.


Subject(s)
Bovine Respiratory Disease Complex/epidemiology , Viruses/isolation & purification , Animals , Australia/epidemiology , Bovine Respiratory Disease Complex/virology , Case-Control Studies , Cattle , Female , Male , Prevalence , Seroepidemiologic Studies
14.
Prev Vet Med ; 126: 159-69, 2016 Apr 01.
Article in English | MEDLINE | ID: mdl-26907209

ABSTRACT

Viruses play a key role in the complex aetiology of bovine respiratory disease (BRD). Bovine viral diarrhoea virus 1 (BVDV-1) is widespread in Australia and has been shown to contribute to BRD occurrence. As part of a prospective longitudinal study on BRD, effects of exposure to BVDV-1 on risk of BRD in Australian feedlot cattle were investigated. A total of 35,160 animals were enrolled at induction (when animals were identified and characteristics recorded), held in feedlot pens with other cattle (cohorts) and monitored for occurrence of BRD over the first 50 days following induction. Biological samples collected from all animals were tested to determine which animals were persistently infected (PI) with BVDV-1. Data obtained from the Australian National Livestock Identification System database were used to determine which groups of study animals that had been together at the farm of origin and at 28 days prior to induction contained a PI animal, and hence to identify animals that had probably been exposed to a PI animal prior to induction. Multi-level Bayesian logistic regression models were fitted to estimate the effects of exposure to BVDV-1 on the risk of occurrence of BRD. Although only a total of 85 study animals (0.24%) were identified as being PI with BVDV-1, BVDV-1 was detected on quantitative polymerase chain reaction in 59% of cohorts. The PI animals were at moderately increased risk of BRD (OR 1.9; 95% credible interval 1.0-3.2). Exposure to BVDV-1 in the cohort was also associated with a moderately increased risk of BRD (OR 1.7; 95% credible interval 1.1-2.5) regardless of whether or not a PI animal was identified within the cohort. Additional analyses indicated that a single quantitative real-time PCR test is useful for distinguishing PI animals from transiently infected animals. The results of the study suggest that removal of PI animals and/or vaccination, both before feedlot entry, would reduce the impact of BVDV-1 on BRD risk in cattle in Australian feedlots. Economic assessment of these strategies under Australian conditions is required.
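
A sketch of the movement-database step described above: flagging animals that shared a pre-induction group with a persistently infected (PI) animal. The tables and column names are placeholders for the National Livestock Identification System extracts, not the actual schema.

```python
# Sketch: flag animals exposed to a PI animal via shared pre-induction
# groups derived from movement records. Hypothetical tables.
import pandas as pd

moves = pd.read_csv("movements.csv")  # animal_id, group_id_28d_before
pi = pd.read_csv("pi_status.csv")     # animal_id, is_pi (bool)

df = moves.merge(pi, on="animal_id", how="left").fillna({"is_pi": False})

# A group is exposed if any member is PI; every member inherits the flag,
# and PI animals themselves are excluded from the "exposed" category.
group_exposed = df.groupby("group_id_28d_before")["is_pi"].transform("any")
df["exposed_to_pi_pre_induction"] = group_exposed & ~df["is_pi"]

print(df["exposed_to_pi_pre_induction"].mean())
```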


Subject(s)
Bovine Virus Diarrhea-Mucosal Disease/epidemiology , Diarrhea Virus 1, Bovine Viral , Animal Feed/virology , Animal Husbandry , Animals , Antibodies, Viral/blood , Bovine Virus Diarrhea-Mucosal Disease/diagnosis , Bovine Virus Diarrhea-Mucosal Disease/prevention & control , Bovine Virus Diarrhea-Mucosal Disease/transmission , Cattle , Cohort Studies , Diarrhea Virus 1, Bovine Viral/genetics , Diarrhea Virus 1, Bovine Viral/isolation & purification , Prevalence , Real-Time Polymerase Chain Reaction/veterinary , Risk Factors , Viral Vaccines/administration & dosage
15.
Prev Vet Med ; 125: 66-74, 2016 Mar 01.
Article in English | MEDLINE | ID: mdl-26830058

ABSTRACT

A prospective longitudinal study was conducted in a population of Australian feedlot cattle to assess associations between animal-characteristic and environmental risk factors and the risk of bovine respiratory disease (BRD). Animal characteristics were recorded at induction, when animals were individually identified and enrolled into study cohorts (comprising animals in a feedlot pen). Environmental risk factors included the year and season of induction, source region, feedlot region and summary variables describing weather during the first week of follow-up. In total, 35,131 animals inducted into 170 cohorts within 14 feedlots were included in statistical analyses. Causal diagrams were used to inform model building and multilevel mixed effects logistic regression models were fitted within the Bayesian framework. Breed, induction weight and season of induction were significantly and strongly associated with risk of BRD. Compared to Angus cattle, Herefords were at markedly increased risk (OR: 2.0, 95% credible interval: 1.5-2.6) and tropically adapted breeds and their crosses were at markedly reduced risk (OR: 0.5, 95% credible interval: 0.3-0.7) of developing BRD. Risk of BRD declined with increased induction weight, with cattle in the heaviest weight category (≥480 kg) at moderately reduced risk compared to cattle weighing <400 kg at induction (OR: 0.6, 95% credible interval: 0.5-0.7). Animals inducted into feedlots during summer (OR: 2.4, 95% credible interval: 1.4-3.8) and autumn (OR: 2.1, 95% credible interval: 1.2-3.2) were at markedly increased risk compared to animals inducted during spring. Knowledge of these risk factors may be useful in predicting BRD risk for incoming groups of cattle in Australian feedlots. This would then provide the opportunity for feedlot managers to tailor management strategies for specific subsets of animals according to predicted BRD risk.


Subject(s)
Bovine Respiratory Disease Complex/epidemiology , Cattle Diseases/epidemiology , Environment , Animals , Australia/epidemiology , Body Weight , Bovine Respiratory Disease Complex/etiology , Bovine Respiratory Disease Complex/genetics , Cattle , Cattle Diseases/etiology , Cattle Diseases/genetics , Longitudinal Studies , Prospective Studies , Risk Factors , Seasons
16.
Epidemiol Infect ; 143(9): 1816-25, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25366865

ABSTRACT

There were multiple waves of influenza-like illness in 1918, the last of which resulted in a highly lethal pandemic killing 50 million people. It is difficult to study the initial waves of influenza-like illness in early 1918 because few deaths resulted and few morbidity records exist. Using extant military mortality records, we constructed mortality maps based on location of burial in France and Belgium for the British Army, and on home town in Vermont and New York for the US Army. Differences between early and more lethal later waves in late 1918 were consistent with historical descriptions in France. The maps of Vermont and New York support the hypothesis that previous exposure may have conferred a degree of protection against subsequent infections; soldiers from rural areas, who were likely to have experienced less mixing than soldiers from urban areas, were at higher risk of mortality. Differences between combat and disease mortality in 1918 were consistent with limited influenza virus circulation during the early 1918 wave. We suggest that it is likely that more than one influenza virus was circulating in 1918, which might help explain the higher mortality rates in those unlikely to have been infected in early 1918.


Subject(s)
Influenza A Virus, H1N1 Subtype/physiology , Influenza, Human/history , Pandemics , France/epidemiology , History, 20th Century , Humans , Influenza A Virus, H1N1 Subtype/genetics , Influenza, Human/epidemiology , Influenza, Human/mortality , Military Personnel , New York/epidemiology , Retrospective Studies , Risk Factors , United Kingdom/epidemiology , Vermont/epidemiology , Warfare
17.
Prev Vet Med ; 117(1): 160-9, 2014 Nov 01.
Article in English | MEDLINE | ID: mdl-25070491

ABSTRACT

A nationwide longitudinal study was conducted to investigate risk factors for bovine respiratory disease (BRD) in cattle in Australian feedlots. After induction (processing), cattle were placed in feedlot pens (cohorts) and monitored for occurrence of BRD over the first 50 days on feed. Data from a national cattle movement database were used to derive variables describing mixing of animals with cattle from other farms, numbers of animals in groups before arrival at the feedlot, exposure of animals to saleyards before arrival at the feedlot, and the timing and duration of the animal's move to the vicinity of the feedlot. Total and direct effects for each risk factor were estimated using a causal diagram-informed process to determine covariates to include in four-level Bayesian logistic regression models. Mixing, group size and timing of the animal's move to the feedlot were important predictors of BRD. Animals not mixed with cattle from other farms prior to 12 days before induction and then exposed to a high level of mixing (≥4 groups of animals mixed) had the highest risk of developing BRD (OR 3.7) compared to animals mixed at least 4 weeks before induction with fewer than 4 groups forming the cohort. Animals in groups formed at least 13 days before induction comprising 100 or more animals (OR 0.5) or 50-99 animals (OR 0.8) were at reduced risk compared to those in groups of fewer than 50 cattle. Animals moved to the vicinity of the feedlot at least 27 days before induction were at reduced risk (OR 0.4) compared to cattle undergoing short-haul transportation (<6 h) to the feedlot within a day of induction, while those experiencing longer transportation durations (6 h or more) within a day of induction were at slightly increased risk (OR 1.2). Knowledge of these risk factors could potentially be used to inform management decisions to reduce the risk of BRD in feedlot cattle.


Subject(s)
Animal Husbandry , Bovine Respiratory Disease Complex/epidemiology , Housing, Animal , Animals , Australia/epidemiology , Cattle , Risk Factors
18.
Spat Spatiotemporal Epidemiol ; 3(3): 225-34, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22749208

ABSTRACT

The health effects of environmental hazards are often examined using time series of the association between a daily response variable (e.g., death) and a daily level of exposure (e.g., temperature). Exposures are usually the average from a network of stations. This gives each station equal importance, and negates the opportunity for some stations to be better measures of exposure. We used a Bayesian hierarchical model that weighted stations using random variables between zero and one. We compared the weighted estimates to the standard model using data on health outcomes (deaths and hospital admissions) and exposures (air pollution and temperature) in Brisbane, Australia. The improvements in model fit were relatively small, and the estimated health effects of pollution were similar using either the standard or weighted estimates. Spatially weighted exposures would probably be more worthwhile when there is either greater spatial detail in the health outcome, or greater spatial variation in exposure.
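
A sketch contrasting an equal-weight station average with a weighted average as the exposure in a Poisson health model; here the weights are fixed random numbers for illustration, whereas the paper estimates them as random variables within a Bayesian hierarchical model. The file and column names are assumptions.

```python
# Sketch: equal-weight vs. weighted station averages as daily exposures
# in a Poisson regression of death counts. Hypothetical aligned data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

temps = pd.read_csv("stations.csv")  # hypothetical: one column per station
deaths = pd.read_csv("deaths.csv")   # hypothetical: daily death counts

station_cols = [c for c in temps.columns if c.startswith("station_")]
equal = temps[station_cols].mean(axis=1)

w = np.random.default_rng(1).random(len(station_cols))
w /= w.sum()  # weights in [0, 1], summing to one
weighted = temps[station_cols].to_numpy() @ w

df = deaths.assign(equal=equal.to_numpy(), weighted=weighted)
for exposure in ["equal", "weighted"]:
    fit = smf.poisson(f"count ~ {exposure}", data=df).fit()
    print(exposure, "AIC:", fit.aic)
```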


Subject(s)
Air Pollution/statistics & numerical data , Bayes Theorem , Environmental Exposure/statistics & numerical data , Environmental Monitoring/methods , Particulate Matter/adverse effects , Temperature , Air Pollution/adverse effects , Australia/epidemiology , Hospitalization/statistics & numerical data , Humans , Mortality , Spatial Analysis
19.
J Hosp Infect ; 80(4): 331-9, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22119567

ABSTRACT

BACKGROUND: Staffing deficits and workload may have a bearing on transmission of meticillin-resistant Staphylococcus aureus (MRSA) within intensive care units (ICUs). New MRSA acquisitions may provide a clearer picture of the relationship between MRSA acquisition and staffing in the ICU setting. AIM: To determine whether staffing and bed occupancy rates had an immediate or delayed impact on the number of new MRSA acquisitions in a well-staffed ICU, and whether these variables could be used as predictors of future MRSA acquisitions. METHODS: Data on new MRSA acquisitions in the ICU of a 796-bed metropolitan Australian hospital between January 2003 and December 2006 were used to build a model to predict the probability of actual new MRSA acquisitions in 2007. Cross-validation was performed using receiver operating characteristic analysis. FINDINGS: Sixty-one new MRSA acquisitions (21 infections, 40 colonizations) were identified in 51 individual weeks over the study period. The number of non-permanent staffing hours was relatively small. The area under the curve in the cross-validation analysis was 0.46 (95% CI: 0.25-0.67), which suggests that the model, built on data from 2003-2006, was not able to predict weeks in which new MRSA acquisitions occurred in 2007. CONCLUSION: The risks posed by high workloads may have been mitigated by good compliance with infection control measures, nurse training and adequate staffing ratios in the ICU. Consequently, staffing policies and infection control practices in the ICU do not need to be modified to address the rate of new MRSA acquisitions.
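
A minimal sketch of the ROC cross-validation step: scoring out-of-sample weekly predictions against observed acquisition weeks; the arrays are illustrative placeholders, not the study's data.

```python
# Sketch: scoring weekly MRSA-acquisition predictions with ROC analysis.
import numpy as np
from sklearn.metrics import roc_auc_score

# One entry per week: did >=1 new acquisition occur, and the model's
# predicted probability from the earlier training period (placeholders).
observed = np.array([0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1])
predicted = np.array([0.2, 0.3, 0.4, 0.1, 0.2, 0.5,
                      0.3, 0.2, 0.1, 0.4, 0.3, 0.2])

auc = roc_auc_score(observed, predicted)
print(f"AUC = {auc:.2f}")  # an AUC near 0.5 indicates no predictive skill
```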


Subject(s)
Health Workforce/statistics & numerical data , Methicillin-Resistant Staphylococcus aureus/isolation & purification , Staphylococcal Infections/epidemiology , Staphylococcal Infections/microbiology , Workload/statistics & numerical data , Attitude of Health Personnel , Australia/epidemiology , Guideline Adherence , Humans , Incidence , Intensive Care Units , Risk Assessment , Staphylococcal Infections/prevention & control
20.
Ann Trop Med Parasitol ; 104(4): 303-18, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20659391

ABSTRACT

In terms of their applicability to the field of tropical medicine, geographical information systems (GIS) have developed enormously in the last two decades. This article reviews some of the pertinent and representative applications of GIS, including the use of such systems and remote sensing for the mapping of Chagas disease and human helminthiases, the use of GIS in vaccine trials, and the global applications of GIS for health-information management, disease epidemiology, and pandemic planning. The future use of GIS as a decision-making tool and some barriers to the widespread implementation of such systems in developing settings are also discussed.


Subject(s)
Geographic Information Systems/organization & administration , Public Health Informatics/organization & administration , Satellite Communications/organization & administration , Tropical Medicine , Forecasting , Humans