Results 1 - 11 of 11
1.
Sci Total Environ ; 799: 149102, 2021 Dec 10.
Article in English | MEDLINE | ID: mdl-34388889

ABSTRACT

Infectious disease epidemics are plaguing the world, and much research focuses on developing models that reproduce disease dynamics for eco-environmental and biological investigation and for disease management. Leptospirosis is an example of a neglected zoonosis, strongly mediated by ecohydrological dynamics, with emerging endemic and epidemic patterns worldwide in both animal and human populations. By accounting for large heterogeneities of affected areas, we show how exponential endemics and scale-free epidemics are largely predictable and linked to common socio-environmental features via scaling laws with different exponents that inform about vulnerability factors. This led to the development of a novel pattern-oriented integrated model that can be used as an early-warning signal (EWS) tool for endemic-epidemic regime classification, risk determinant attribution, and near real-time forecasting of outbreaks. Forecasts are grounded on expected outbreak recurrence times dependent on exceedance probabilities and on statistical EWSs that sense outbreak onset. A stochastic spatially explicit model is shown to comprehensively predict outbreak dynamics (early sensing, timing, magnitude, decay, and eco-environmental determinants) and to derive a spreading factor characterizing endemics and epidemics, with average over maximum rainfall as the critical factor characterizing disease transitions. Dynamically, case cross-correlation across neighboring communities detects outbreaks 2 weeks in advance. Eco-environmental scaling relationships highlight how predicted host suitability and topographic index can be used as epidemiological footprints to effectively distinguish and control leptospirosis regimes and areas where hydro-climatological dynamics are the main trigger. The spatio-temporal scale invariance of epidemics - underpinning persistent criticality and neutrality, or independence among areas - is emphasized by the high accuracy in reproducing the sequence and magnitude of cases given reliable surveillance. Further investigation of the robustness and universality of eco-environmental determinants is required; nonetheless, a comprehensive and computationally simple EWS method for the full characterization of leptospirosis is provided. The tool is extendable to other climate-sensitive zoonoses to define vulnerability factors and predict outbreaks, supporting optimal disease risk prevention and control.
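
Below is a minimal Python sketch of the neighbor cross-correlation early-warning idea described in the abstract. The data, the 2-week lag, and all names are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: lagged cross-correlation between neighboring
# communities' weekly case series as an early-warning signal (EWS).
# Data are synthetic; a peak at lag = 2 would suggest the neighbor's
# cases lead the focal community's cases by the 2 weeks reported above.
import numpy as np

rng = np.random.default_rng(0)
weeks = 104
neighbor = rng.poisson(3, weeks).astype(float)
# The focal community trails its neighbor by ~2 weeks in this toy example.
focal = np.roll(neighbor, 2) + rng.poisson(1, weeks)

def lagged_corr(x, y, lag):
    """Pearson correlation of x against y shifted 'lag' weeks earlier."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[lag:], y[:-lag])[0, 1]

for lag in range(5):
    print(lag, round(lagged_corr(focal, neighbor, lag), 2))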


Subject(s)
Leptospirosis , Animals , Climate , Disease Outbreaks , Forecasting , Humans , Leptospirosis/epidemiology , Zoonoses
2.
J Dairy Sci ; 98(11): 8227-39, 2015 Nov.
Article in English | MEDLINE | ID: mdl-26364104

ABSTRACT

The objective of this study was to evaluate the performance of bacterial culture of feces and serum ELISA to correctly identify cows with Mycobacterium avium ssp. paratuberculosis (MAP) at heavy, light, and non-fecal-shedding levels. A total of 29,785 parallel test results from bacterial culture of feces and serum ELISA were collected from 17 dairy herds in Minnesota, Pennsylvania, and Colorado. Samples were obtained from adult cows from dairy herds enrolled for up to 10 yr in the National Johne's Disease Demonstration Herd Project. A Bayesian latent class model was fitted to estimate the probabilities that bacterial culture of feces (using 72-h sedimentation or 30-min centrifugation methods) and serum ELISA results correctly identified cows as high positive, low positive, or negative given that cows were heavy, light, and non-shedders, respectively. The model assumed that no gold-standard test was available and that conditional independence held between the diagnostic tests. The estimated conditional probabilities that bacterial culture of feces correctly identified heavy shedders, light shedders, and non-shedders were 70.9, 32.0, and 98.5%, respectively. The corresponding values for the serum ELISA were 60.6, 18.7, and 99.5%. Differences in diagnostic test performance were observed among states. These results improve the interpretation of results from bacterial culture of feces and serum ELISA for detection of MAP and MAP antibody, respectively, which can support on-farm infection control decisions and can be used to evaluate disease-testing strategies while taking into account the accuracy of these tests.
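
The reported conditional probabilities can be combined with a herd prevalence via Bayes' theorem to give predictive values. A minimal Python sketch follows, using the fecal-culture figures above; the 5% prevalence of heavy shedders is an assumption for illustration, and the two-class simplification is ours (the paper models three shedding categories).

# Sketch: probability that a culture-positive cow is truly a heavy
# shedder, via Bayes' theorem. Sensitivity/specificity are the paper's
# estimates; the 5% prevalence is an illustrative assumption.
se = 0.709   # P(positive culture | heavy shedder)
sp = 0.985   # P(negative culture | non-shedder)
prev = 0.05  # assumed herd prevalence of heavy shedders

ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
print(f"P(heavy shedder | positive culture) = {ppv:.2f}")  # ~0.71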


Subject(s)
Paratuberculosis/blood , Paratuberculosis/diagnosis , Animals , Bayes Theorem , Cattle , Cattle Diseases/blood , Cattle Diseases/diagnosis , Cattle Diseases/microbiology , Colorado , Enzyme-Linked Immunosorbent Assay/veterinary , Feces/microbiology , Minnesota , Mycobacterium avium subsp. paratuberculosis/isolation & purification , Pennsylvania
3.
Zoonoses Public Health ; 59(6): 389-92, 2012 Sep.
Article in English | MEDLINE | ID: mdl-23057081

ABSTRACT

Pork has traditionally been considered an important source of human Toxoplasma gondii infection. Pigs, like other meat-producing animals, can become infected by ingesting oocysts shed into the environment by infected cats or by consuming cysts present in the tissues of infected mammals, commonly small rodents. The objective of this study was to investigate the level of T. gondii infection in swine from southern Chile that can be attributed to the ingestion of oocysts and therefore to exposure to a contaminated environment. A total of 340 serum samples from swine were obtained from three commercial slaughterhouses located in the Araucania and Los Rios Regions of southern Chile. Study animals originated from local farms, mainly small commercial producers, and the meat is sold locally. Overall, 8.8% (30/340) of the samples showed T. gondii-specific IgG antibodies. Of these seropositive animals, 80% (24/30) were also positive for antibodies specific to the oocyst stage of the parasite, indicating that the animals had been infected recently by the ingestion of oocysts. These results suggest a high level of environmental contamination with oocysts on the farms of origin. In addition to the food safety problems associated with the consumption of meat from infected animals, the high level of environmental contamination on the farm represents a direct health risk for people living and/or working on these farms. Consequently, there is a need to develop on-farm monitoring programmes and to identify risk reduction strategies (food storage, water purification, rodent control, and contact with cats) that are appropriate and cost-effective for informal and outdoor types of farms.
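
For readers interested in the sampling uncertainty around the 8.8% (30/340) estimate, a short Python sketch computing a 95% Wilson interval; the choice of interval method is ours, not the study's.

# Sketch: 95% Wilson confidence interval for the 30/340 seroprevalence.
from statsmodels.stats.proportion import proportion_confint

low, high = proportion_confint(30, 340, alpha=0.05, method="wilson")
print(f"Seroprevalence 8.8%, 95% CI: {low:.1%} - {high:.1%}")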


Subject(s)
Antibodies, Protozoan/blood , Antigens, Protozoan/immunology , Meat/parasitology , Swine Diseases/parasitology , Toxoplasma/immunology , Toxoplasmosis, Animal/parasitology , Abattoirs , Animals , Chile/epidemiology , Environment , Feces/parasitology , Female , Humans , Male , Oocysts/immunology , Pilot Projects , Seroepidemiologic Studies , Species Specificity , Sporozoites/immunology , Swine , Swine Diseases/epidemiology , Swine Diseases/transmission , Toxoplasma/isolation & purification , Toxoplasmosis, Animal/epidemiology , Toxoplasmosis, Animal/transmission
4.
J Dairy Sci ; 92(12): 5964-76, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19923600

ABSTRACT

This study evaluates the changes in milk production (yield; MY) and milk electrical conductivity (MEC) before and after disease diagnosis and proposes a cow health monitoring scheme based on observing individual daily MY and MEC. All reproductive and health events were recorded on occurrence, and MY and MEC were collected at each milking from January 2004 through November 2006 for 587 cows. The first 24 mo (January 2004 until December 2005) were used to investigate the effects of disease on MY and MEC, to model MY and MEC of healthy animals, and to develop a health monitoring scheme to detect disease based on changes in a cow's MY or MEC. The remaining 11 mo of data (January to November 2006) were used to compare the performance of the health monitoring schemes developed in this study with the disease detection system currently used on the farm. A mixed model was used to examine the effect of diseases on MY and MEC. Days in milk (DIM), DIM x DIM, and ambient temperature were entered as quantitative variables, and number of calves, parity, calving difficulty, day relative to breeding, day of somatotropin treatment, and 25 health event categories were entered as categorical variables. Significant changes in MY and MEC were observed as early as 10 and 9 d before diagnosis, respectively. The greatest cumulative effect on MY over the 59-d evaluation period was estimated for miscellaneous digestive disorders (mainly diarrhea) and udder scald, at -304.42 and -304.17 kg, respectively. The greatest average daily effect was estimated for milk fever, with a 10.36-kg decrease in MY and an 8.3% increase in MEC. Milk yield and MEC were modeled by an autoregressive model using a subset of healthy cow records. Six different self-starting cumulative sum and Shewhart charting schemes were designed using 3 different specificities (98, 99, and 99.5%) and based on MY alone or on MY and MEC. The monitoring schemes developed in this study issue alerts earlier relative to the day of diagnosis of udder, reproductive, or metabolic problems, are more sensitive, and give fewer false-positive alerts than the disease detection system currently used on the farm.
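
A minimal Python sketch of the charting idea: a one-sided tabular cumulative sum (CUSUM) on standardized milk-yield residuals, flagging sustained drops. The residuals are synthetic, and the reference value k and decision limit h are generic textbook defaults rather than the parameters tuned in the study.

# Sketch: one-sided lower CUSUM on standardized milk-yield residuals.
# In the study, residuals would come from the autoregressive healthy-cow
# model; here they are synthetic. k and h are illustrative defaults.
import numpy as np

rng = np.random.default_rng(1)
residuals = rng.normal(0, 1, 60)
residuals[40:] -= 2.0          # simulate a yield drop (illness onset)

k, h = 0.5, 4.0                # reference value and decision limit
s_lo, alerts = 0.0, []
for day, z in enumerate(residuals):
    s_lo = min(0.0, s_lo + z + k)  # accumulate downward deviations only
    if s_lo < -h:
        alerts.append(day)
        s_lo = 0.0             # reset after an alert

print("alert days:", alerts)   # should flag shortly after day 40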


Subject(s)
Cattle Diseases/physiopathology , Dairying/methods , Electric Conductivity , Lactation/physiology , Milk/chemistry , Animals , Cattle , Cattle Diseases/diagnosis , Cattle Diseases/metabolism , Female , Models, Statistical , Predictive Value of Tests , Sensitivity and Specificity , Time Factors
5.
Vet Microbiol ; 133(3): 297-302, 2009 Jan 13.
Article in English | MEDLINE | ID: mdl-18778902

ABSTRACT

This paper describes a method to provide improved estimates of the probability that exposure to a specific dose of an airborne infectious pathogen will result in animal infection. Individual animals were exposed to a specific dose of the airborne pathogen. Following exposure, animals were individually housed and monitored for evidence of infection. Detection of specific antibodies and/or the pathogen in diagnostic specimens was taken as evidence that the exposure dose resulted in infection. If replicated over a range of doses, the results can be used to derive a dose-response curve for a variety of animal species and infectious pathogens. This information is useful for estimating the likelihood of infection associated with exposure to airborne infectious microorganisms. Applications include predicting the risk of transmission associated with exposure to airborne pathogens, modeling the transmission of airborne pathogens, and determining effective exposure doses for vaccines delivered in aerosols.
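
One common way to turn such replicated exposures into a dose-response curve is the exponential (single-hit) model, P(infection) = 1 - exp(-r * dose). The Python sketch below fits r by maximum likelihood on made-up data; the model choice and all numbers are illustrative assumptions, since the paper describes the exposure method rather than a specific curve.

# Sketch: exponential (single-hit) dose-response model fitted by
# maximum likelihood. Doses and outcomes below are synthetic.
import numpy as np
from scipy.optimize import minimize_scalar

doses = np.array([1e1, 1e2, 1e3, 1e4, 1e5], dtype=float)
exposed = np.array([10, 10, 10, 10, 10])
infected = np.array([0, 1, 3, 8, 10])

def neg_log_lik(log_r):
    p = 1.0 - np.exp(-np.exp(log_r) * doses)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(infected * np.log(p) + (exposed - infected) * np.log(1 - p))

fit = minimize_scalar(neg_log_lik, bounds=(-25, 0), method="bounded")
r = np.exp(fit.x)
print(f"r = {r:.2e}, ID50 = {np.log(2) / r:.0f} (same units as dose)")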


Subject(s)
Air Microbiology , Porcine Reproductive and Respiratory Syndrome/virology , Porcine respiratory and reproductive syndrome virus , Animals , Inhalation Exposure , Porcine Reproductive and Respiratory Syndrome/transmission , Swine
6.
J Dairy Sci ; 91(1): 433-41, 2008 Jan.
Article in English | MEDLINE | ID: mdl-18096968

ABSTRACT

The present study examines the process capability of 1,501 herds in the Upper Midwest and the performance of statistical process control charts and indices for monitoring and controlling milk quality on the farm. For 24 mo, bulk tank somatic cell count (SCC) data were collected daily or every other day. Consistency indices for 5 different SCC standards were developed. The indices calculate the maximum variation allowed to meet a desired SCC level at a given mean bulk tank SCC and were used to identify herds not capable of meeting a specific SCC standard. The consistency index method was compared with a test that identifies future bulk tank SCC standard violators based on herds' past violations. The performance of the consistency index test and the past violation method was evaluated by logistic regression. The comparison focused on detection probability and the certainty associated with a result. For the 5 SCC levels, detection probability and the certainty associated with a result ranged from 51 to 98%. The detection probability of all violators and the certainty associated with a negative result were greater for the consistency index across all 5 SCC levels (by 0.7 to 7.4% and 2.1 to 5.1%, respectively). Control charts were plotted and monthly consistency indices calculated for individual farms. Charts in combination with the consistency indices would warn 66 to 80% of the herds of an upcoming violation within the 30 d before it occurred. They offer a proactive approach to maintaining consistently high milk quality. By assessing process capability and distinguishing between significant changes and random variation in bulk tank SCC, the tools presented in this article encourage fact-based decisions in dairy farm milk quality management.
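
The consistency-index idea, i.e., the maximum variation a herd can tolerate while still meeting an SCC standard, can be illustrated under a lognormality assumption. The Python sketch below is our reconstruction for illustration and is not the paper's exact index.

# Sketch: maximum allowable SD of ln(bulk tank SCC) such that the
# probability of exceeding a standard stays below 5%. The lognormal
# assumption and the 5% level are illustrative, not the paper's index.
import numpy as np
from scipy.stats import norm

standard = 400_000     # example SCC standard, cells/mL
geo_mean = 250_000     # herd's geometric mean bulk tank SCC

z = norm.ppf(0.95)     # one-sided 95% point
max_sd_ln = (np.log(standard) - np.log(geo_mean)) / z
print(f"max SD of ln(SCC): {max_sd_ln:.3f}")
# A herd whose ln(SCC) varies more than this would be flagged as not
# capable of consistently meeting the 400,000 standard.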


Subject(s)
Cattle , Cell Count/veterinary , Dairying/methods , Milk/cytology , Milk/standards , Animals , Cell Count/methods , Female , Reproducibility of Results , Statistics, Nonparametric
7.
Vet Microbiol ; 110(1-2): 7-16, 2005 Sep 30.
Article in English | MEDLINE | ID: mdl-16098692

ABSTRACT

At the most elemental level, the design of effective strategies to control and/or eliminate porcine reproductive and respiratory syndrome (PRRS) virus depends on an accurate and comprehensive understanding of virus transmission. As a general rule, transmission is highly dependent on the route of exposure and the dose of virus. The objective of this study was to derive PRRS virus isolate VR-2332 dose-response curves for oral and intranasal routes of exposure, i.e., to determine the probability that a specific virus dose would result in infection. Individually housed pigs approximately 21 days of age were exposed to specific doses of PRRS virus isolate VR-2332 by either the oral or the intranasal route. Positive controls were intramuscularly inoculated with 10^2.2 50% tissue culture infective doses (TCID50) of PRRS virus, and negative controls were orally administered 100 mL of diluent with no virus. Pigs were monitored for evidence of infection for 21 days following exposure, i.e., serum samples were collected on days 0, 7, 14, and 21 and tested for virus and PRRS virus-specific antibodies. Dose-response curves and 95% confidence intervals for the oral and intranasal routes of exposure were derived using logistic models (logit and probit). The 50% infectious dose (ID50) for oral exposure was estimated to be 10^5.3 TCID50 (95% CI, 10^4.6 to 10^5.9); the ID50 for intranasal exposure was estimated to be 10^4.0 TCID50 (95% CI, 10^3.0 to 10^5.0). Given these estimates, it is worth noting that intramuscular exposure to 10^2.2 TCID50 (positive controls) resulted in infection in all animals. Thus, pigs were most susceptible to infection via parenteral exposure.
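
A minimal Python sketch of the logit approach: logistic regression of infection outcome on log10 dose, with the ID50 taken as the dose at which the predicted probability is 0.5. The per-dose outcomes are synthetic, chosen only to land near the reported oral ID50; they are not the study's data.

# Sketch: logit dose-response fit and ID50 extraction. Synthetic data;
# the paper's actual per-pig outcomes are not reproduced here.
import numpy as np
import statsmodels.api as sm

log10_dose = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
n = np.array([6, 6, 6, 6, 6])
infected = np.array([0, 1, 2, 5, 6])

X = sm.add_constant(log10_dose)
model = sm.GLM(np.column_stack([infected, n - infected]), X,
               family=sm.families.Binomial()).fit()
b0, b1 = model.params
id50 = -b0 / b1                       # log10 dose where p = 0.5
print(f"ID50 = 10^{id50:.1f} TCID50")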


Subject(s)
Antibodies, Viral/blood , Porcine Reproductive and Respiratory Syndrome/transmission , Porcine respiratory and reproductive syndrome virus/pathogenicity , Administration, Intranasal , Administration, Oral , Animals , Carrier State/veterinary , Injections, Intramuscular/veterinary , Logistic Models , Maximum Tolerated Dose , Porcine Reproductive and Respiratory Syndrome/immunology , Porcine Reproductive and Respiratory Syndrome/prevention & control , Porcine respiratory and reproductive syndrome virus/immunology , Random Allocation , Swine
8.
J Am Vet Med Assoc ; 219(10): 1426-31, 2001 Nov 15.
Article in English | MEDLINE | ID: mdl-11724183

ABSTRACT

OBJECTIVE: To evaluate the risk of bovine viral diarrhea virus (BVDV) infection between birth and 9 months of age for dairy replacement heifers raised under typical dry-lot management conditions. DESIGN: Longitudinal observational study. ANIMALS: 446 calves. PROCEDURE: Calves were randomly selected from 2 dairies that used killed and modified-live BVDV vaccines. Repeated serologic and BVDV polymerase chain reaction assays were used to estimate the risk of BVDV infection in calves of various ages (1 to 60 days; 61 to 100 days; 101 days to 9 months) and to estimate the overall infection rate by 9 months of age. RESULTS: Risk of BVDV infection increased with age (maximum risk, 150 to 260 days). The proportion of calves infected with BVDV by 9 months of age was higher for dairy A (0.665) than for dairy B (0.357). The percentage infected with BVDV type I did not differ between dairy A (18.2%) and dairy B (15.2%), whereas the percentage infected with BVDV type II for dairy A (50%) was twice that for dairy B (21%). Between 210 and 220 days of age, the daily risk of BVDV infection, regardless of type, was > 1.3%/d on dairy A and 0.5%/d on dairy B. CONCLUSIONS AND CLINICAL RELEVANCE: Under dry-lot conditions, a considerable amount of BVDV infection may occur before 9 months of age. Risk of infection increases with age. Although dairies may appear to have similar management practices, risks of BVDV infection can differ considerably among dairies.
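
To see how such daily rates compound, a short Python sketch converting a constant per-day risk into a cumulative risk over an interval; the 10-day window is an arbitrary illustration.

# Sketch: cumulative infection risk from a constant daily risk,
# using the >1.3%/d figure reported for dairy A at 210 to 220 days.
daily_risk = 0.013
days = 10
cumulative = 1 - (1 - daily_risk) ** days
print(f"{cumulative:.1%} of susceptible calves infected in {days} days")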


Subject(s)
Bovine Virus Diarrhea-Mucosal Disease/etiology , Dairying/methods , Age Factors , Animals , Animals, Newborn , Bovine Virus Diarrhea-Mucosal Disease/blood , Bovine Virus Diarrhea-Mucosal Disease/epidemiology , Bovine Virus Diarrhea-Mucosal Disease/prevention & control , California/epidemiology , Cattle , Diarrhea Viruses, Bovine Viral/immunology , Female , Longitudinal Studies , Risk Factors , Vaccination/veterinary , Vaccines, Attenuated , Vaccines, Inactivated , Viral Vaccines/administration & dosage , Viral Vaccines/immunology
9.
J Am Vet Med Assoc ; 219(7): 968-75, 2001 Oct 01.
Article in English | MEDLINE | ID: mdl-11601795

ABSTRACT

OBJECTIVE: To estimate transmission of bovine viral diarrhea virus (BVDV) and crude morbidity and mortality ratios in BVDV-vaccinated and unvaccinated dairy heifer calves managed under typical dairy drylot conditions. DESIGN: Randomized clinical trial. ANIMALS: 106 female Holstein calves. PROCEDURE: Seroconversion rates for BVDV types I and II and proportional morbidity and mortality ratios were compared between calves given a killed BVDV type-I vaccine at 15 days of age and a modified-live BVDV type-I vaccine at 40 to 45 days of age (n = 53) and calves given no BVDV vaccines (53). Sera were collected at 45-day intervals as calves moved from individual hutches to corrals holding increasingly larger numbers of calves. Seroconversion was used as evidence of exposure to BVDV. RESULTS: Crude proportional morbidity (0.16) and mortality (0.17) ratios for control calves did not differ significantly from those of vaccinated calves (0.28 and 0.12, respectively). The proportion of control calves that seroconverted to BVDV type I through 9 months of age (0.629) was significantly higher than the proportion of vaccinated calves that seroconverted, unrelated to vaccination, during the same period (0.536). The estimated overall protective effect of vaccination against BVDV type I through 4 to 9 months of age was 48%. The proportion of control calves that seroconverted to BVDV type II (0.356) was not significantly different from that of vaccinated calves (0.470). CONCLUSIONS AND CLINICAL RELEVANCE: Findings suggest that calfhood vaccination may be an appropriate strategy to help reduce short-term transmission of some but not necessarily all strains of BVDV.
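
Protective effect in a trial like this is typically computed as one minus the risk ratio of seroconversion in vaccinated versus control calves. A generic Python sketch follows; the counts are hypothetical, since the abstract reports the 48% estimate but not the underlying 4- to 9-month counts.

# Sketch: protective effect (vaccine efficacy) as 1 - risk ratio.
# Counts are hypothetical; the abstract gives only the 48% result.
def protective_effect(cases_vax, n_vax, cases_ctrl, n_ctrl):
    rr = (cases_vax / n_vax) / (cases_ctrl / n_ctrl)
    return 1 - rr

print(f"{protective_effect(13, 50, 25, 50):.0%}")  # 48% with these counts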


Subject(s)
Antibodies, Viral/blood , Bovine Virus Diarrhea-Mucosal Disease/prevention & control , Diarrhea Viruses, Bovine Viral/immunology , Viral Vaccines/immunology , Animals , Animals, Newborn , Antibodies, Viral/immunology , Bovine Virus Diarrhea-Mucosal Disease/epidemiology , Bovine Virus Diarrhea-Mucosal Disease/transmission , Cattle , Female , Morbidity , Vaccination/veterinary , Vaccines, Attenuated , Vaccines, Inactivated , Virus Shedding
10.
J Vet Diagn Invest ; 12(3): 195-203, 2000 May.
Article in English | MEDLINE | ID: mdl-10826831

ABSTRACT

The study was conducted to develop methodology for least-cost strategies for using polymerase chain reaction (PCR)/probe testing of pooled blood samples to identify animals in a herd persistently infected with bovine viral diarrhea virus (BVDV). Cost was estimated for 5 protocols using Monte Carlo simulations for herd prevalences of BVDV persistent infection (BVDV-PI) ranging from 0.5% to 3%, assuming a cost for a PCR/probe test of $20. The protocol associated with the least cost per cow involved an initial testing of pools followed by repooling and testing of positive pools. For a herd prevalence of 1%, the least cost per cow was $2.64 (95% prediction interval = $1.72, $3.68), where pool sizes for the initial and repooled testing were 20 and 5 blood samples per pool, respectively. Optimization of the least cost for pooled-sample testing depended on how well a presumed prevalence of BVDV-PI approximated the true prevalence of BVDV infection in the herd. As prevalence increased beyond 3%, the least cost increased, thereby diminishing the competitive benefit of pooled testing. The protocols presented for sample pooling have general application to screening or surveillance using a sensitive diagnostic test to detect very low prevalence diseases or pathogens in flocks or herds.
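
A minimal Monte Carlo sketch of the least-cost protocol described above: pools of 20, positive pools split into subpools of 5, and, as our assumption since the abstract stops at repooling, individual testing of positive subpools. Perfect test accuracy is also assumed.

# Sketch: Monte Carlo cost of pooled PCR testing to find PI animals.
# Assumes a perfectly sensitive/specific test and a final individual
# testing stage for positive subpools (the abstract describes only the
# pooling stages). $20/test and 1% PI prevalence, as in the paper.
import numpy as np

rng = np.random.default_rng(2)
N, prev, cost_per_test = 1000, 0.01, 20.0
pool_size, subpool_size, n_sims = 20, 5, 5000

costs = []
for _ in range(n_sims):
    status = rng.random(N) < prev          # True = persistently infected
    tests = 0
    for i in range(0, N, pool_size):       # stage 1: pools of 20
        pool = status[i:i + pool_size]
        tests += 1
        if pool.any():
            for j in range(0, pool_size, subpool_size):  # stage 2: pools of 5
                sub = pool[j:j + subpool_size]
                tests += 1
                if sub.any():
                    tests += subpool_size  # stage 3: individual tests
    costs.append(tests * cost_per_test / N)

print(f"mean cost/cow: ${np.mean(costs):.2f}")
# With these assumptions this lands around $2.7/cow at 1% prevalence,
# near the paper's reported $2.64.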


Subject(s)
Bovine Virus Diarrhea-Mucosal Disease/diagnosis , DNA, Viral/blood , Diarrhea Viruses, Bovine Viral/isolation & purification , Disease Reservoirs/veterinary , Polymerase Chain Reaction/veterinary , Animals , Bovine Virus Diarrhea-Mucosal Disease/epidemiology , Bovine Virus Diarrhea-Mucosal Disease/prevention & control , Cattle , Computer Simulation , DNA Primers/chemistry , Diarrhea Viruses, Bovine Viral/genetics , Monte Carlo Method , Polymerase Chain Reaction/economics , Polymerase Chain Reaction/methods , Prevalence
11.
Lab Anim Sci ; 49(6): 617-21, 1999 Dec.
Article in English | MEDLINE | ID: mdl-10638496

ABSTRACT

OBJECTIVE: Our purpose was to assess the extent to which early weaning and other weaning-management factors affect development of postweaning chronic diarrhea in captive rhesus monkeys at the California Regional Primate Research Center between 1992 and 1995. METHODS: Data for weaning, management, and onset of diarrhea were obtained from daily records. The Cox proportional hazard model was used to assess whether the risk of chronic diarrhea was related to early weaning. RESULTS: Monkeys that were lighter at weaning had a threefold increase in risk of postweaning chronic diarrhea (P = 0.07), compared with that in heavier monkeys. An episode of preweaning diarrhea increased the risk of postweaning chronic diarrhea twofold (P = 0.08). Relocation of monkeys to outdoor facilities in the fall was associated with a fivefold decrease in risk (P < 0.001), compared with that of other seasons, and weaning in 1993 was associated with a twofold decrease in risk, compared with that of other years (P = 0.04). CONCLUSIONS: Multiple factors need to be considered for prevention of postweaning chronic diarrhea, including weaning weight, preweaning diarrhea, season weaned, and weaning conditions that change from year to year.
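
A minimal Python sketch of fitting such a Cox proportional hazards model with the lifelines library; the data frame, column names, and covariates are synthetic illustrations, not the Center's records.

# Sketch: Cox proportional hazards fit for time to postweaning chronic
# diarrhea. Data are synthetic; column names are illustrative.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 200
df = pd.DataFrame({
    "days_to_event": np.maximum(rng.exponential(180, n).round(), 1),
    "observed": (rng.random(n) < 0.4).astype(int),      # 1 = chronic diarrhea
    "weaning_weight_kg": rng.normal(1.1, 0.2, n),
    "preweaning_diarrhea": (rng.random(n) < 0.3).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_event", event_col="observed")
cph.print_summary()   # hazard ratios per covariate, as in the abstract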


Subject(s)
Diarrhea/veterinary , Macaca mulatta/physiology , Monkey Diseases/physiopathology , Weaning , Animal Husbandry/methods , Animals , Body Weight , Chronic Disease , Diarrhea/etiology , Diarrhea/physiopathology , Diarrhea/prevention & control , Female , Male , Monkey Diseases/etiology , Monkey Diseases/prevention & control , Proportional Hazards Models , Risk Factors , Seasons , Time Factors