Results 1 - 7 of 7
1.
J Dairy Sci ; 103(2): 1541-1552, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31864753

ABSTRACT

The objective of this study was to evaluate the effect of supplementing enzymatically hydrolyzed yeast (EHY; Celmanax, Arm & Hammer Animal Nutrition, Princeton, NJ) on transition dairy cattle. Forty multiparous Holstein cows were blocked by predicted transmitting ability and randomly assigned to 1 of 2 treatments (EHY, n = 20; or control, CON, n = 20) from 21 d before expected calving to 60 d postpartum. The EHY cows received 56 and 28 g/d in close-up and lactating diets, respectively. Dry matter intake, health events, milk production parameters, feed efficiency, colostrum quality, reproductive parameters, body weight, and body condition score were monitored. Fecal samples collected on -21, -14, -7, 0, 1, 3, 5, 7, and 14 d relative to calving were analyzed for total coliforms, Clostridium perfringens, Salmonella, and Escherichia coli O157:H7. Blood samples were collected at 7, 14, and 21 d postpartum for analysis of β-hydroxybutyrate. Sterile quarter milk samples collected at dry-off, calving, and wk 1, 2, and 3 of lactation were analyzed for milk pathogens and somatic cell count. Pre- and postpartum dry matter intake, body weight, body condition score, milk yield, and milk protein and fat yields did not differ between treatments. Milk fat and protein concentrations were greater in EHY cows than in CON cows. β-Hydroxybutyrate and health events did not differ between treatments. The presence of fecal C. perfringens did not differ prepartum, but was lower in EHY cows postpartum. Milk pathogens and total intramammary infections did not differ between treatments at dry-off, calving, wk 1, or wk 2, but more EHY cows were infected with Staphylococcus sp. during wk 3 than CON cows. The EHY cows showed heat earlier than CON cows, but no other reproductive parameters were affected. EHY supplementation during the transition period did not affect dry matter intake, milk yield, health events, or reproductive parameters but did increase milk protein and fat concentrations.


Subject(s)
3-Hydroxybutyric Acid/blood , Cattle/physiology , Milk/metabolism , Reproduction , Yeast, Dried/metabolism , Animals , Body Weight , Colostrum/chemistry , Diet/veterinary , Female , Lactation , Milk/chemistry , Milk Proteins/analysis , Postpartum Period , Pregnancy , Random Allocation
2.
J Dairy Sci ; 102(7): 6391-6403, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31030920

ABSTRACT

Antimicrobials are frequently administered to calves with diarrhea, despite evidence suggesting questionable efficacy. Even if efficacious, providing the appropriate therapy to an animal requires accurate disease detection. The objective of this study was to use previously collected data to compare clinical scoring by a veterinarian with treatment decisions by on-farm personnel. Data describing daily clinical scores and farm treatments were previously collected from 4 farms for calves from birth to age 28 d. In this data set, a total of 460 calves were enrolled. Daily observations and clinical assessments were made on each farm by the same veterinarian, for a total of 12,101 calf observation days. Farm personnel made all treatment decisions based on their own observations, and these treatments were recorded by study personnel. Overall, the cumulative incidence of a calf exhibiting at least one abnormal clinical sign over the 28-d observation period was 0.93, with cumulative incidences of 0.85 and 0.33 for diarrhea and dehydration, respectively. The cumulative incidence of any treatment (including antibiotics and electrolytes) was 0.85, although the majority of treatments used an antimicrobial. The farm-specific probabilities that a calf with clinical signs of dehydration or diarrhea, respectively, received fluid or electrolyte therapy ranged from 0.08 to 0.27 and from 0.03 to 0.12. These probabilities were greater for the day a clinical sign was first observed. The farm-specific probability that a calf with clinical signs of diarrhea received an antimicrobial ranged from 0.23 to 0.65, and the probability that a calf exhibiting clinical signs of respiratory disease received an antimicrobial ranged from 0.33 to 0.76. Probabilities for the first observation of diarrhea were similar to those for all observations of diarrhea. The probability of treatment was greater for calves at their first observed abnormal respiratory signs. Probabilities that treatment with antimicrobials, or fluids or electrolytes, was associated with an abnormal clinical sign were low; that is, calves received treatments in the absence of any abnormal clinical signs. This study illustrates incongruity between treatment decisions by calf treaters (the designated personnel on each farm responsible for calf health assessment and treatment decisions) and those of an observer using a clinical scoring system to identify calves with abnormal clinical signs. These findings indicate opportunities and the need for dairy farmers and advisors to evaluate calf treatment protocols, reasons for treatment, and training programs for calf health and disease detection, as well as to develop monitoring programs for treatment protocol compliance and health outcomes following therapy.
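The cumulative-incidence and conditional-probability measures reported above can be sketched as a small calculation. This is an illustrative example on made-up per-calf records, not the study's data; the record layout and values are hypothetical.

```python
# Hypothetical per-calf records (not the study's data).
# Each record: (calf_id, had_diarrhea, received_fluid_therapy)
records = [
    ("c1", True,  True),
    ("c2", True,  False),
    ("c3", False, False),
    ("c4", True,  False),
    ("c5", False, True),   # treated despite no abnormal clinical sign
]

n = len(records)

# Cumulative incidence: proportion of calves that ever showed the sign
# over the observation period.
cum_incidence = sum(1 for _, sick, _ in records if sick) / n

# Conditional probability of fluid/electrolyte therapy given clinical
# signs of diarrhea (the farm-specific probabilities in the abstract).
sick = [r for r in records if r[1]]
p_treated_given_sick = sum(1 for r in sick if r[2]) / len(sick)

print(f"cumulative incidence of diarrhea: {cum_incidence:.2f}")   # 0.60
print(f"P(fluids | diarrhea): {p_treated_given_sick:.2f}")        # 0.33
```

With real data, the same ratios would be computed per farm and, as in the study, separately for the first day a sign was observed versus all observation days.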


Subject(s)
Decision Making , Diarrhea/veterinary , Veterinarians/psychology , Animals , Anti-Bacterial Agents/administration & dosage , Cattle , Cattle Diseases/epidemiology , Cattle Diseases/psychology , Diarrhea/drug therapy , Diarrhea/epidemiology , Diarrhea/psychology , Farmers/psychology , Farms/statistics & numerical data , Female , Humans , Incidence , Male , Pregnancy , Retrospective Studies
3.
J Dairy Sci ; 100(12): 9881-9891, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28987578

ABSTRACT

Automated milk feeders are used by dairy producers to manage preweaned calves in group housing, but little is known about how these feeding systems are being used in the United States. To better understand how US dairy producers are operating these systems, this study investigated characteristics of barn design, environment, and management practices on 38 farms in the Upper Midwest of the United States via a questionnaire and on-farm measurements. Farms using automated feeders ranged in size from 7 to 300 calves on site. Natural ventilation was used on 50% of the farms, followed by barns with mechanical ventilation (39.5%), tunnel ventilation (7.9%), or outdoor facilities (sheltered plastic domes; 2.6%). Calves were kept in groups of 17.6 ± 9.8 animals (range: 5.9 to 60.5) with an average space allowance of 4.6 ± 2.0 m2/animal (range: 1.6 to 11.9). Calves on these farms received 3.7 ± 0.75 L (range: 2 to 6) of colostrum, but 22% of the tested calves had serum total protein values lower than 5.0 g/dL. Calves had an initial daily allowance of 5.4 ± 2.1 L (range: 3 to 15 L) of milk or milk replacer, rising to a peak amount of 8.3 ± 2.0 L (range: 5 to 15 L) over 18 ± 11.4 d (range: 0 to 44 d). Milk replacer was fed to calves on 68.4% of the farms, compared with whole milk supplemented with nutrient balancer on 23.7% and whole milk alone on 7.9% of the farms. Calves were completely weaned at 56.8 ± 9.0 d of age (range: 40 to 85.5) and 52.1 ± 7.5 d (range: 40 to 79) since introduction into the group pen with the feeder. Notably, bacterial contamination of milk was common; the median coliform count was 10,430 cfu/mL (interquartile range: 233,111; range: 45 to 28,517,000) and the standard plate count was 2,566,867 cfu/mL (interquartile range: 15,860,194; range: 6,668 to 82,825,000) for samples collected from the feeder tube end (or feeder hose). These areas of deficiency are of concern, as they may be limiting the success of automated calf feeding systems. In particular, a better understanding of the dynamics of pathogen load is needed both in the group pen area and in the automated feeder unit itself, as these reservoirs represent a significant risk to calf health and welfare.


Subject(s)
Dairying/methods , Housing, Animal/statistics & numerical data , Milk/microbiology , Animals , Cattle , Dairying/statistics & numerical data , Female , Iowa , Minnesota , Wisconsin
4.
J Dairy Sci ; 100(11): 9186-9193, 2017 Nov.
Article in English | MEDLINE | ID: mdl-28918142

ABSTRACT

Automated calf feeding systems are increasing in use across the United States, yet information regarding health and mortality outcomes of animals in these systems is limited. The objective of this study was to investigate the relationship of farm management practices, housing, and environmental factors with mortality and health treatment rates of preweaned dairy calves housed in groups with automated feeding systems. Farm records were collected for health treatments and mortality on 26 farms in the Upper Midwest of the United States. Relationships between factors of interest and mortality or treatment rate were calculated using a correlation analysis. Overall median annual mortality rate was 2.6% (interquartile range = 3.6; range = 0.24-13.4%), and 57% of farms reported mortality rates below 3%/yr. Farms that disinfected the navels of newborn calves had a lower mortality rate (mean = 3.0%; standard error = 0.8; 78% of farms) than farms that did not disinfect (mean = 7.3%; standard error = 1.6; 22% of farms). Farm size (number of cows on site) was negatively associated [correlation coefficient (r) = -0.53], whereas the age range in calf groups was positively associated (r = 0.58) with mortality rate. Average serum total protein concentration tended to be negatively associated with annual mortality rate (r = -0.39; median = 5.4; range = 5.0-6.4 g/dL). Health treatment rate was positively associated with coliform bacterial count in feeder tube milk samples [r = 0.45; mean ± standard deviation (SD) = 6.45 ± 4.50 ln(cfu/mL)] and the age of calves at grouping (r = 0.50; mean ± SD = 5.1 ± 3.6 d). A positive trend was detected for coliform bacterial count of feeder mixing tank milk samples [r = 0.37; mean ± SD = 3.2 ± 6.4 ln(cfu/mL)] and calf age at weaning (r = 0.37; mean ± SD = 57.4 ± 9.6 d). Seasonal patterns indicated that winter was the season of highest treatment rate. Taken together, these results indicate that, although automated feeding systems can achieve mortality rates below the US average, improvements are needed in fundamental calf care practices, such as colostrum management and preventing bacterial contamination of the liquid diet and the calf environment.
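The farm-level correlation analysis described above can be sketched in a few lines. This is a minimal illustration assuming SciPy is available; the farm sizes and mortality rates below are invented for the example and are not the study's data.

```python
# Sketch of a farm-level Pearson correlation analysis, as described in the
# abstract. Values are hypothetical, chosen to mimic the reported negative
# association between farm size and annual calf mortality rate.
from scipy.stats import pearsonr

farm_size = [60, 120, 250, 500, 900]     # cows on site (hypothetical)
mortality = [8.0, 6.5, 4.2, 3.1, 1.0]    # annual mortality rate, % (hypothetical)

r, p_value = pearsonr(farm_size, mortality)
print(f"r = {r:.2f}, P = {p_value:.3f}")  # negative r: larger farms, lower mortality
```

In the study, each factor of interest (navel disinfection, serum total protein, bacterial counts, etc.) would be paired against mortality or treatment rate in the same way, one coefficient per factor.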


Subject(s)
Animal Husbandry/standards , Cattle Diseases/prevention & control , Milk , Animals , Bacterial Load , Cattle , Colostrum , Dairying/methods , Farms , Female , Midwestern United States , Pregnancy , Risk Factors , Seasons
5.
J Dairy Sci ; 100(12): 9769-9774, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28941820

ABSTRACT

The time required to adequately bucket-train a dairy calf to drink its milk allotment is unknown. Additionally, factors that could predict calves who are slow to learn have not been identified. A prospective observational study was conducted to describe timing of bucket training and possible calf birth and colostrum quality factors that might predict calves requiring extra time to train. On one dairy farm, 1,235 calves were enrolled at birth in a prospective cohort study. Calving ease score, calf presentation at birth, twinning, calf sex, and dam parity were recorded by farm personnel. An as-fed colostrum sample for each calf was collected and evaluated for total solids, total plate bacterial count, and coliform bacterial count. Calf serum total protein values were obtained by d 2 to 3 of life. Calves were observed before the morning milk feeding for attitude/posture, and after feeding for assistance needed to drink milk from their bucket. Attitude/posture was significantly associated with whether a calf required assistance or not. Almost 60% (n = 724) of calves consumed their morning milk allotment (2 L) after d 3 of life without assistance. Significant factors associated with the odds of requiring assistance with drinking after 3 d of age included calf sex, being born a twin, and the week the calf was enrolled. Knowing how long it takes to train a calf to drink from a bucket could be useful in allocating the time or labor required to successfully train calves.


Subject(s)
Cattle/physiology , Colostrum/chemistry , Dairying/methods , Drinking , Parturition/physiology , Animals , Female , Male , Milk , Prospective Studies , Time Factors
6.
J Dairy Sci ; 100(7): 5675-5686, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28456403

ABSTRACT

Automated calf feeding systems are becoming more common on US dairy farms. The objective of this study was to evaluate calf health in these systems and to identify risk factors associated with adverse health outcomes on farms in the Upper Midwest United States. Over an 18-mo period of bimonthly visits to 38 farms, calves (n = 10,179) were scored for attitude, ear, eye, and nasal health, as well as evidence of diarrhea (hide dirtiness score of the perianal region, underside of the tail, and tailhead). For all health score categories, a score of 0 represented an apparently healthy animal. Rectal temperatures were taken in calves scoring ≥2 in any category, and those with a temperature >39.4°C were categorized as having a fever (n = 550). Associations were determined between farm-level variables and health scores to identify risk factors for higher (worse) scores. All health outcomes were associated with season of measurement, with the fall and winter seasons increasing the odds of a high health score or detected fever. High bacterial counts measured in the milk or milk replacer were associated with increased odds of higher attitude and ear scores, and higher odds of calves having a detected fever. Higher peak milk allowance (L/d) was associated with lower hide dirtiness score, whereas a longer period of time (d) to reach peak milk allowance was associated with increased odds of higher scores for attitude, ear, eye, and hide dirtiness, as well as fever. Higher fat content in milk was associated with increased odds of high eye score. Less space per calf (m2/calf) was associated with higher ear and eye scores, whereas larger group sizes were associated with increased odds of higher nasal score and decreased odds of higher hide dirtiness score. Rectangular pen shape was associated with decreased odds of higher eye score. Absence of a positive pressure ventilation tube was associated with increased odds of having a calf detected with a fever. Based on these results, we hypothesize that these factors could be managed to improve health outcomes for dairy calves on automated feeding systems.


Subject(s)
Cattle Diseases/diagnosis , Feeding Methods/veterinary , Health Status , Animals , Cattle , Diarrhea/diagnosis , Diarrhea/veterinary , Feeding Methods/instrumentation , Milk/standards , Risk Factors , Seasons , United States
7.
J Anim Sci ; 93(2): 731-6, 2015 Feb.
Article in English | MEDLINE | ID: mdl-26020754

ABSTRACT

Previous studies have determined that stress causes decreases in feed intake and efficiency in livestock, but the effect of repeated transport on these parameters has not been well studied. This study determined how repeated transport affected calf post-transport behavior, feed intake, ADG, and feed conversion. Thirty-six 4-mo-old Holstein steer calves were housed in groups of 6, with each group randomly assigned to either transport or control treatments. Each calf was assigned to an individual Calan gate feeder and feed intake was recorded daily. Transport calves were transported for 6 h in their groups in a 7.3 by 2.4 m gooseneck trailer divided into 3 compartments, at an average density of 0.87 m2/calf, every 7 d for 5 consecutive weeks. After return to their home pens, behavior was recorded for transported calves at 5-min intervals for 1 h. Calf ADG and feed conversion were analyzed in a mixed model ANOVA, whereas feed intake was analyzed as a repeated measure in a mixed model ANOVA. Post-transport, calves followed a pattern of drinking, eating, and then lying down. The highest (82 ± 5% of calves) and lowest (0 ± 5% of calves) incidences of eating behavior occurred 10 and 60 min post-transport, respectively. Control calves had a higher feed intake than transported calves overall (7.29 ± 0.22 kg for control and 6.91 ± 0.21 kg for transport; P = 0.01), for the feeding post-treatment (6.78 ± 0.27 kg for control and 6.01 ± 0.28 kg for transport; P = 0.007), and the day after treatment (7.83 ± 0.23 kg for control and 7.08 ± 0.15 kg for transport; P = 0.02). Feed intake for the feeding post-transport for transport calves significantly decreased after the second transport but increased with each successive transport (P < 0.0001). Overall, control calves had higher ADG than transported calves (1.34 ± 0.13 kg/d for control and 1.15 ± 0.12 kg/d for transport; P = 0.006). No significant difference (P = 0.12) between treatments was detected for feed conversion. These results suggest that calves exposed to repeated transport may decrease feed intake compared with nontransported calves as an initial response to transport; however, overall feed conversion was not affected, and these Holstein calves may have quickly acclimated to repeated transport.


Subject(s)
Behavior, Animal/physiology , Cattle/physiology , Drinking/physiology , Eating/physiology , Feeding Behavior/physiology , Transportation , Acclimatization/physiology , Animals , Cattle/psychology , Housing, Animal , Male , Random Allocation , Stress, Psychological/psychology , Time Factors , Weight Gain/physiology