Results 1 - 20 of 92
1.
mSystems ; 9(6): e0115823, 2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38785438

ABSTRACT

In low-microbial biomass samples such as bovine milk, contaminants can outnumber endogenous bacteria. Because of this, milk microbiome research suffers from a critical knowledge gap, namely, does non-mastitis bovine milk contain a native microbiome? In this study, we sampled external and internal mammary epithelia and stripped and cisternal milk and used numerous negative controls, including air and sampling controls and extraction and library preparation blanks, to identify the potential sources of contamination. Two algorithms were used to mathematically remove contaminants and track the potential movement of microbes among samples. Results suggest that the majority (i.e., >75%) of sequence data generated from bovine milk and mammary epithelium samples represents contaminating DNA. Contaminants in milk samples were primarily sourced from DNA extraction kits and the internal and external skin of the teat, while teat canal and apex samples were mainly contaminated during the sampling process. After decontamination, the milk microbiome displayed a more dispersed, less diverse, and compositionally distinct bacterial profile compared with epithelial samples. Similar microbial compositions were observed between cisternal and stripped milk samples, as well as between teat apex and canal samples. Staphylococcus and Acinetobacter were the predominant genera detected in milk sample sequences, and bacterial culture showed growth of Staphylococcus and Corynebacterium spp. in 50% (7/14) of stripped milk samples and growth of Staphylococcus spp. in 7% (1/14) of cisternal milk samples. 
Our study suggests that microbiome data generated from milk samples obtained from clinically healthy bovine udders may be heavily biased by contaminants that enter the sample during sample collection and processing workflows. IMPORTANCE: Obtaining a non-contaminated sample of bovine milk is challenging due to the nature of the sampling environment and the route by which milk is typically extracted from the mammary gland. Furthermore, the very low bacterial biomass of bovine milk exacerbates the impacts of contaminant sequences in downstream analyses, which can lead to severe biases. Our findings showed that bovine milk contains very low bacterial biomass and that each contamination event (including the sampling procedure and DNA extraction process) introduces bacteria and/or DNA fragments that easily outnumber the native bacterial cells. This finding has important implications for our ability to draw robust conclusions from milk microbiome data, especially if the data have not been subjected to rigorous decontamination procedures. Based on these findings, we strongly urge researchers to incorporate numerous negative controls into their sampling and sample processing workflows and to utilize several complementary methods for identifying potential contaminants within the resulting sequence data. These measures will improve the accuracy, reliability, reproducibility, and interpretability of milk microbiome data and research.
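The two contaminant-removal algorithms used in this study are not named in the abstract. As a hypothetical sketch of the general prevalence-based idea behind such tools (in the spirit of packages like decontam), a taxon can be flagged as a likely contaminant when it is detected at least as frequently in negative controls as in biological samples; all taxa and detection flags below are invented for illustration:

```python
# Hypothetical sketch: flag taxa whose detection prevalence in negative
# controls is at least as high as in biological samples.

def prevalence(presence_flags):
    """Fraction of samples in which the taxon was detected."""
    return sum(presence_flags) / len(presence_flags)

def flag_contaminants(samples, controls):
    """samples/controls: {taxon: [0/1 detection flags per sample]}.
    Returns the set of taxa at least as prevalent in controls as in samples."""
    flagged = set()
    for taxon in samples:
        p_sample = prevalence(samples[taxon])
        p_control = prevalence(controls.get(taxon, [0]))
        if p_control >= p_sample:
            flagged.add(taxon)
    return flagged

# Made-up detection flags for three genera across 4 milk samples
# and 4 extraction blanks.
milk = {"Staphylococcus": [1, 1, 1, 0],
        "Acinetobacter":  [1, 0, 1, 1],
        "Ralstonia":      [1, 1, 1, 1]}
blanks = {"Staphylococcus": [0, 0, 1, 0],
          "Ralstonia":      [1, 1, 1, 1]}

print(flag_contaminants(milk, blanks))  # only Ralstonia is flagged
```

Real decontamination tools additionally use relative abundance, DNA concentration, and statistical testing; this sketch shows only the prevalence heuristic.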


Subject(s)
Microbiota , Milk , Animals , Cattle , Milk/microbiology , Microbiota/genetics , Female , DNA, Bacterial/analysis , DNA, Bacterial/genetics , Bacteria/isolation & purification , Bacteria/genetics , Bacteria/classification , Mammary Glands, Animal/microbiology , Specimen Handling/methods , RNA, Ribosomal, 16S/genetics , RNA, Ribosomal, 16S/analysis
2.
Appl Environ Microbiol ; 90(4): e0223423, 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38497641

ABSTRACT

The primary objective of this study was to identify associations between the prepartum teat apex microbiome and the presence of Staphylococcus aureus intramammary infections (IMI) in primiparous cows during the first 5 weeks after calving. We performed a case-control study using shotgun metagenomics of the teat apex and culture-based milk data collected longitudinally from 710 primiparous cows on five organic dairy farms. Cases had higher odds of having S. aureus metagenomic DNA on the teat apex prior to parturition compared to controls (OR = 38.9, 95% CI: 14.84-102.21). Differential abundance analysis confirmed this association, with the abundance of S. aureus being a 23.8 log fold change (LFC) higher in cases than in controls. Of the most prevalent microorganisms in controls, those associated with a lower risk of post-calving S. aureus IMI included Microbacterium phage Min 1 (OR = 0.37, 95% CI: 0.25-0.53), Corynebacterium efficiens (OR = 0.53, 95% CI: 0.30-0.94), Kocuria polaris (OR = 0.54, 95% CI: 0.35-0.82), Micrococcus terreus (OR = 0.64, 95% CI: 0.44-0.93), and Dietzia alimentaria (OR = 0.45, 95% CI: 0.26-0.75). Genes encoding Microcin B17 AMPs were the most prevalent on the teat apex of cases and controls (99.7% in both groups). The predicted abundance of genes encoding Microcin B17 was also higher in cases compared to controls (LFC 0.26). IMPORTANCE: Intramammary infections (IMI) caused by Staphylococcus aureus remain an important problem for the dairy industry. The microbiome on the external skin of the teat apex may play a role in mitigating S. aureus IMI risk, in particular through the production of antimicrobial peptides (AMPs) by commensal microbes. However, current studies of the teat apex microbiome utilize a 16S approach, which precludes the detection of genomic features such as genes that encode AMPs.
Therefore, further research using a shotgun metagenomic approach is needed to understand what role prepartum teat apex microbiome dynamics play in IMI risk.
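For readers unfamiliar with how odds ratios like those above are derived, the computation from a 2x2 table with a Wald confidence interval can be sketched as follows. The cell counts here are invented (the study's actual counts and model-adjusted estimates are not given in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(a=40, b=10, c=20, d=140)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

Note that the interval is computed on the log-odds scale and exponentiated, which is why published CIs such as 14.84-102.21 are asymmetric around the point estimate.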


Subject(s)
Mastitis, Bovine , Staphylococcal Infections , Female , Cattle , Animals , Staphylococcus aureus/genetics , Metagenome , Case-Control Studies , Mastitis, Bovine/epidemiology , Mastitis, Bovine/microbiology , Staphylococcal Infections/epidemiology , Staphylococcal Infections/veterinary , Staphylococcal Infections/microbiology , Milk/microbiology , Mammary Glands, Animal/microbiology
3.
J Dairy Sci ; 107(6): 3899-3915, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38216037

ABSTRACT

Acidogenic boluses can mitigate potential negative effects of high milk yield at dry-off on udder health. This randomized controlled trial aimed to investigate the effect of administering acidogenic boluses at dry-off on dry period intramammary infection (IMI) dynamics and on milk production parameters, somatic cell count linear score (LSCC), clinical mastitis (CM), and herd removal in the next lactation. A total of 901 cows from 3 dairy farms were randomly allocated to a control (CON, n = 458; no administration of acidogenic boluses at dry-off) or treatment group (TRT, n = 443; administration of 2 acidogenic boluses at dry-off). Quarter milk samples were collected at dry-off and after calving and submitted for bacteriological milk culture. The effects of treatment on the presence of quarter-level postpartum IMI, cure of existing IMI, and acquisition of new IMI, and on the prevalence of cow-level high LSCC (LSCC ≥4) in the first 30 days in milk (DIM) were analyzed using mixed effects logistic regression. Mixed linear regression was used to analyze cow-level milk production parameters (i.e., milk yield, fat corrected milk, fat and protein yield, and LSCC) in the first 90 DIM and until 300 DIM. For CM and herd removal, Cox proportional hazard regression models were used. In addition to treatment group, lactation group at dry-off, presence of high LSCC in the last test-day, average milk yield in the week before dry-off, presence of CM in the lactation of enrollment, and biologically relevant interactions were offered in all models. There was no evidence of a difference in IMI dynamics or in milk, fat corrected milk, protein or fat yields in the subsequent lactation between groups. The TRT group had a lower LSCC in the first 2 mo postpartum compared with the CON group (2.58 ± 0.3 vs. 2.92 ± 0.3 and 2.42 ± 0.3 vs. 2.81 ± 0.3, for first and second month postpartum). 
The prevalence of high LSCC in the first 30 DIM was 9.2 percentage points lower in the TRT group than in the CON group (16.3% vs. 25.5%; risk difference: -9.2; 95% confidence interval [CI]: -15.8, -2.5). Cows in the TRT group exhibited a reduced hazard of CM in the subsequent lactation compared with cows in the CON group (hazard ratio: 0.75; 95% CI: 0.63, 0.89), as well as a reduced hazard of herd removal (hazard ratio: 0.82; 95% CI: 0.77, 0.88). The administration of acidogenic boluses as a component of dry-off management is a promising approach to maintaining good udder health and reducing the hazard of CM and herd removal during the subsequent lactation.
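The risk difference reported above can be reproduced approximately from the group prevalences. The counts below are back-calculated from the reported percentages (16.3% of 443 TRT cows, 25.5% of 458 CON cows) and the interval is a crude Wald CI, not the model-based CI from the paper's mixed-effects regression:

```python
import math

def risk_difference(x1, n1, x0, n0, z=1.96):
    """Crude risk difference (treated minus control) with a Wald 95% CI.
    x = number of cows with high LSCC, n = group size."""
    p1, p0 = x1 / n1, x0 / n0
    rd = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, rd - z * se, rd + z * se

# Approximate, back-calculated counts -- illustrative only
rd, lo, hi = risk_difference(x1=72, n1=443, x0=117, n0=458)
print(f"RD = {rd*100:.1f} percentage points (95% CI {lo*100:.1f}, {hi*100:.1f})")
```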


Subject(s)
Lactation , Mammary Glands, Animal , Mastitis, Bovine , Milk , Animals , Cattle , Female , Cell Count/veterinary
4.
JDS Commun ; 4(4): 293-297, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37521060

ABSTRACT

Elevated milk production at dry-off can lead to increased udder pressure and, in turn, increased stress due to pain and discomfort, affecting natural behaviors. Administering acidogenic boluses at dry-off acts by inducing temporary and mild decreases in blood pH. This decreases dry matter intake, reduces milk yield, and increases cow comfort by lessening udder pressure. The objective of this study was to assess the effect of oral administration of acidogenic boluses at dry-off on total daily activity (TDA) and total daily rumination (TDR) behaviors in the first 2 wk of the dry period. This randomized clinical trial was conducted on a single farm and cows were randomly assigned to either treatment (TRT; n = 30) or control (CON; n = 34). The TRT group received 2 acidogenic boluses at dry-off and the CON group received no intervention. All cows received dry-cow therapy (intramammary antibiotic and internal teat sealant). The TDA and TDR data from 7 d before to 14 d after dry-off were measured using ear-mounted activity monitors. Analyses were performed using linear mixed-effects models with repeated measures. We observed a similar TDA in both groups throughout the study follow-up period. Overall, cows in the TRT group spent 17 min/d less time active than cows in the CON group in the first 2 wk after dry-off with the greatest difference observed on the second day of the dry period (TRT = 395 min/d; 95% CI: 370 to 420 vs. CON = 428 min/d; 95% CI: 404 to 451). The TRT group had lower TDR in the first 24 h after bolus administration (TRT = 437 min/d; 95% CI: 414 to 461 vs. CON = 488 min/d; 95% CI: 466 to 510) when compared with the CON group, but no differences were observed when comparing both groups in the 13 subsequent days. Our results indicate that administering acidogenic boluses at dry-off slightly decreased TDA during the first 2 wk of the dry period and decreased TDR on the first day after administration.

5.
J Dairy Sci ; 104(10): 11035-11046, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34253362

ABSTRACT

The objective of this observational study was to compare 4 cow-level algorithms to predict cow-level intramammary infection (IMI) status (culture and MALDI-TOF) in late-lactation US dairy cows using standard measures of test performance. Secondary objectives were to estimate the likely effect of each algorithm, if used to guide selective dry cow therapy (SDCT), on dry cow antibiotic use in US dairy herds, and to investigate the importance of including clinical mastitis criteria in algorithm-guided SDCT. Cows (n = 1,594) from 56 US dairy herds were recruited as part of a previously published cross-sectional study of bedding management and IMI in late-lactation cows. Each herd was visited twice for sampling. At each farm visit, aseptic quarter-milk samples were collected from 20 cows approaching dry-off (>180 d pregnant), which were cultured using standard bacteriological methods and MALDI-TOF for identification of isolates. Quarter-level culture results were used to establish cow-level IMI status, which was considered the reference test in this study. Clinical mastitis records and Dairy Herd Improvement Association test-day somatic cell count data were extracted from herd records and used to perform cow-level risk assessments (low vs. high risk) using 4 algorithms that have been proposed for SDCT in New Zealand, the Netherlands, United Kingdom, and the United States. Agreement between aerobic culture (reference test; IMI vs. no-IMI) and algorithm status (high vs. low risk) was described using Cohen's kappa, test sensitivity, specificity, negative predictive value, and positive predictive value. The proportion of cows classified as high risk among the 4 algorithms ranged from 0.31 to 0.63, indicating that these approaches to SDCT could reduce antibiotic use at dry-off by 37 to 69% in the average US herd. All algorithms had poor agreement with IMI status, with kappa values ranging from 0.05 to 0.13. 
Sensitivity varied by pathogen, with higher values observed when detecting IMI caused by Streptococcus uberis, Streptococcus dysgalactiae, Staphylococcus aureus, and Lactococcus lactis. Negative predictive values were high for major pathogens among all algorithms (≥0.87), which may explain why algorithm-guided SDCT programs have been successfully implemented in field trials despite poor agreement with overall IMI status. Removal of clinical mastitis criteria from each algorithm had little effect on the classification of cows, indicating that algorithms based on SCC alone may perform similarly to those based on both SCC and clinical mastitis criteria. We recommend that producers implementing algorithm-guided SDCT use algorithm criteria that match their relative priorities: reducing antibiotic use (high specificity, positive predictive value) or minimizing untreated IMI at dry-off (high sensitivity, negative predictive value).


Subject(s)
Mastitis, Bovine , Algorithms , Animals , Cattle , Cell Count/veterinary , Cross-Sectional Studies , Female , Lactation , Mammary Glands, Animal , Milk , Pregnancy , Streptococcus
6.
J Dairy Sci ; 104(5): 5652-5664, 2021 May.
Article in English | MEDLINE | ID: mdl-33685701

ABSTRACT

The objectives of this study were to (1) use partial budget analysis to estimate the cash impact for herds that switch from blanket dry cow therapy (BDCT) to culture- or algorithm-guided selective dry cow therapy (SDCT) and (2) conduct a sensitivity analysis to investigate effects in situations where SDCT increased clinical and subclinical mastitis risk during the subsequent lactation. A partial budget model was created using Monte Carlo simulation with @Risk software. Expenditures associated with dry-off procedures and health outcomes (clinical and subclinical mastitis) during the first 30 d in milk were used to model herd-level effects, expressed in units of US dollars per cow dry-off. Values for each economic component were derived from findings from a recent multisite clinical trial, peer-reviewed journal articles, USDA databases, and our experience in facilitating the implementation of SDCT on farms. Fixed values were used for variables expected to have minimal variation within the US dairy herd population (e.g., cost of rapid culture plates), and sampling distributions were used for variables hypothesized to vary enough to affect the herd net cash impact of one or more DCT approaches. For Objective 1, herd-level udder health was assumed to be unaffected by the implementation of SDCT. For culture-guided SDCT, producers could expect to save an average of $2.14 (5th and 95th percentiles: -$2.31 to $7.23) per cow dry-off compared with BDCT, with 75.5% of iterations being ≥$0.00. For algorithm-guided SDCT, the mean net cash impact was $7.85 ($3.39 to $12.90) per cow dry-off, with 100% of iterations being ≥$0.00. The major contributors to variance in cash impact for both SDCT approaches were the percentage of quarters treated at dry-off and the cost of dry cow antibiotics.
For Objective 2, we repeated the partial budget model with the 30-d clinical and subclinical mastitis incidence increasing by 1, 2, and 5% (i.e., risk difference = 0.01, 0.02, and 0.05) in both SDCT groups compared with BDCT. For algorithm-guided SDCT, average net cash impacts were ≥$0.00 per cow dry-off (i.e., cost effective) when mastitis incidence increased slightly. However, as clinical mastitis incidence increased, economic returns for SDCT diminished. These findings indicate that when SDCT is implemented appropriately (i.e., no to little negative effect on health), it might be a cost-effective practice for US herds under a range of economic conditions.
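The Monte Carlo partial budget approach can be sketched without proprietary @Risk software. Everything in this toy model is invented (the cost distributions, the fixed labor charge, and the structure of savings); the paper's model drew its parameters from trial data, USDA databases, and the literature:

```python
import random

random.seed(42)

def simulate_net_cash(n_iter=10_000):
    """Toy Monte Carlo partial budget for algorithm-guided SDCT vs BDCT,
    in US dollars per cow dry-off. All distributions are hypothetical."""
    results = []
    for _ in range(n_iter):
        tube_cost = random.uniform(2.50, 4.50)    # $ per antibiotic tube
        pct_treated = random.uniform(0.30, 0.60)  # quarters still treated
        labor_cost = 0.25                         # fixed extra labor per cow
        # Savings: tubes avoided on the untreated share of 4 quarters
        saved = 4 * tube_cost * (1 - pct_treated)
        results.append(saved - labor_cost)
    return results

res = simulate_net_cash()
mean = sum(res) / len(res)
pct_nonneg = sum(r >= 0 for r in res) / len(res)
print(f"mean = ${mean:.2f}/cow dry-off; {pct_nonneg:.0%} of iterations >= $0")
```

Reporting both the mean net cash impact and the fraction of iterations at or above $0.00, as the abstract does, summarizes the whole simulated distribution rather than a single point estimate.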


Subject(s)
Cattle Diseases , Mastitis, Bovine , Algorithms , Animals , Anti-Bacterial Agents/pharmacology , Cattle , Cattle Diseases/drug therapy , Cell Count/veterinary , Dairying , Female , Lactation , Mammary Glands, Animal , Mastitis, Bovine/drug therapy , Milk
7.
J Dairy Sci ; 104(5): 6061-6079, 2021 May.
Article in English | MEDLINE | ID: mdl-33685704

ABSTRACT

The objective of this prospective cohort study was to explore associations between intramammary infection (IMI) in late-lactation cows and postcalving udder health and productivity. Cows (n = 2,763) from 74 US dairy herds were recruited as part of a previously published cross-sectional study of bedding management and IMI in late-lactation cows. Each herd was visited twice for sampling. At each visit, aseptic quarter milk samples were collected from 20 cows approaching dry-off (>180 d pregnant), which were cultured using standard bacteriological methods and MALDI-TOF for identification of isolates. Quarter-level culture results were used to establish cow-level IMI status at enrollment. Cows were followed from enrollment until 120 d in milk (DIM) in the subsequent lactation. Herd records were used to establish whether subjects experienced clinical mastitis or removal from the herd, and DHIA test-day data were used to record subclinical mastitis events (somatic cell count >200,000 cells/mL) and milk yield (kg/d) during the follow-up period. Cox regression and generalized estimating equations were used to evaluate the associations between IMI and the outcome of interest. The presence of late-lactation IMI caused by major pathogens was positively associated with postcalving clinical mastitis [hazard ratio = 1.5, 95% confidence interval (CI): 1.2, 2.0] and subclinical mastitis (risk ratio = 1.5, 95% CI: 1.3, 1.9). Species within the non-aureus Staphylococcus (NAS) group varied in their associations with postcalving udder health, with some species being associated with increases in clinical and subclinical mastitis in the subsequent lactation. Late-lactation IMI caused by Streptococcus and Streptococcus (Strep)-like organisms, other than Aerococcus spp. (i.e., Enterococcus, Lactococcus, and Streptococcus spp.) were associated with increases in postcalving clinical and subclinical mastitis. 
Test-day milk yield from 1 to 120 DIM was lower (-0.9 kg, 95% CI: -1.6, -0.3) in late-lactation cows with any IMI compared with cows without IMI. No associations were detected between IMI in late lactation and risk of postcalving removal from the herd within the first 120 DIM. Effect estimates reported in this study may understate the underlying quarter-level effect size for IMI at dry-off and postcalving clinical and subclinical mastitis, because late-lactation IMI was used as a proxy for IMI at dry-off and because exposures and outcomes were measured at the cow level. Furthermore, the large number of models run in this study (n = 94) increases the likelihood of identifying spurious associations. Therefore, confirmatory studies should be conducted. We conclude that IMI in late lactation may increase the risk of clinical and subclinical mastitis in the subsequent lactation. The relationship between IMI and postcalving health and productivity is likely to vary among pathogens, with Staphylococcus aureus, Streptococcus spp., Enterococcus spp., and Lactococcus spp. being the most important pathogens identified in the current study.
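The authors' caution about running 94 models can be made concrete: under a global null hypothesis, the expected number of false-positive findings is the number of tests times the significance level, and a Bonferroni correction shrinks the per-test threshold accordingly. (The abstract does not state whether any multiplicity correction was applied; this is only the standard arithmetic.)

```python
# Expected false positives under a global null, and the Bonferroni-adjusted
# per-test threshold, for 94 models tested at alpha = 0.05.
n_models, alpha = 94, 0.05
expected_false_positives = n_models * alpha
bonferroni_alpha = alpha / n_models
print(f"expected false positives: {expected_false_positives:.1f}")
print(f"Bonferroni per-test alpha: {bonferroni_alpha:.5f}")
```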


Subject(s)
Aerococcus , Cattle Diseases , Mastitis, Bovine , Animals , Cattle , Cell Count/veterinary , Cross-Sectional Studies , Enterococcus , Female , Lactation , Lactococcus , Mammary Glands, Animal , Milk , Pregnancy , Prospective Studies , Staphylococcus , Streptococcus
8.
J Dairy Sci ; 104(3): 3495-3507, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33358809

ABSTRACT

Pair housing of dairy heifer calves during the preweaning period helps meet the natural social needs of the calf and has been shown to improve growth and starter intake during the preweaning period as compared with individual housing. However, there is little evidence to suggest that pair-housed calves maintain their social and growth advantages past the weaning phase. The objective of this study was to investigate the effect of pair housing on measures of calf performance, health, and behavior up to 16 wk of age. Healthy Holstein and crossbred heifer calves were enrolled in the study after colostrum feeding, with the first calf randomly assigned to 1 of 2 housing treatments: pair (PR; 2 hutches with common outdoor space) or individual (INDV; 1 hutch plus outdoor space). All calves were bucket fed 4 L of milk replacer twice daily and weaned at 50 d of age. Weaned calves (6/group) remained with their treatment group until exit from the study at 16 wk. A venous blood sample was collected from each calf between 24 h and 7 d of age to test for serum total protein (g/dL). Body weights (kg) were obtained at birth, weaning, and 16 wk. Each enrolled calf was scored for health each week and calf health treatments were also collected. A hair sample was collected from the left shoulder at birth and 16 wk to assess hair cortisol (pg/mL). At enrollment, each calf was fitted with a triaxial accelerometer on the left hind leg for continuous recording of standing and lying time (min/24 h) for 16 wk. Latency to find feed, water, and lie down (min) at entrance to the weaned pen were recorded by continuous video observation. Open field testing with a novel object was performed at 5, 10, and 16 wk. Behaviors analyzed by video observation included latency to approach the object (s), vocalizations (n), and time spent immobile, walking, or running (s/10 min). 
Linear mixed models were used to determine the effect of treatment (INDV or PR) on calf growth, activity, and behavioral outcomes, accounting for time, breed, the interaction of time and treatment, a random effect of pen, and, when appropriate, variability in testing day and repeated measurements within calf. Twenty-four Holstein and crossbred calves (PR: n = 12, 6 pairs; INDV: n = 12) were enrolled from November 2 to December 23, 2018. The PR calves were 7.1 kg heavier at weaning and gained 0.15 kg/d more during the preweaning period compared with INDV calves. In the 24 h after movement to the postweaning pen, PR calves lay down for longer (14.3 vs. 11.0 ± 0.4 h/d), and PR calves urinated more during novel object testing at 5 wk of age. Our study demonstrated benefits of pair housing calves during the preweaning period, such as better growth and increased lying time.
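The two growth figures above are consistent with each other, which is a quick arithmetic check worth showing. Average daily gain (ADG) is weight gained divided by days, and a 7.1 kg weaning-weight advantage over a 50-d preweaning period (calves were weaned at 50 d), assuming similar birth weights, implies roughly the reported 0.15 kg/d advantage:

```python
def average_daily_gain(birth_wt_kg, weaning_wt_kg, days):
    """Preweaning average daily gain in kg/d."""
    return (weaning_wt_kg - birth_wt_kg) / days

# Sanity check on the reported figures: 7.1 kg advantage over 50 d
adg_advantage = 7.1 / 50
print(f"{adg_advantage:.2f} kg/d")  # ~0.14, consistent with the reported 0.15

# Example with made-up weights: a calf growing from 40 to 65 kg in 50 d
print(average_daily_gain(40.0, 65.0, 50))  # 0.5 kg/d
```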


Subject(s)
Housing, Animal , Milk , Animal Feed/analysis , Animals , Behavior, Animal , Body Weight , Cattle , Diet/veterinary , Female , Pregnancy , Weaning
9.
J Dairy Sci ; 103(7): 6473-6492, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32448572

ABSTRACT

Selective dry-cow therapy (SDCT) could be used to reduce antibiotic use on commercial dairy farms in the United States but is not yet widely adopted, possibly due to concerns about the potential for negative effects on cow health. The objective of this study was to compare culture- and algorithm-guided SDCT programs with blanket dry-cow therapy (BDCT) in a multi-site, randomized, natural exposure, non-inferiority trial for the following quarter-level outcomes: antibiotic use at dry-off, dry period intramammary infection (IMI) cure risk, dry period new IMI risk, and IMI risk at 1 to 13 d in milk (DIM). Two days before planned dry-off, cows in each of 7 herds were randomly allocated to BDCT, culture-guided SDCT (cult-SDCT), or algorithm-guided SDCT (alg-SDCT). At dry-off, BDCT cows received an intramammary antibiotic (500 mg of ceftiofur hydrochloride) in all 4 quarters. Antibiotic treatments were selectively allocated to quarters of cult-SDCT cows by treating only quarters from which aseptically collected milk samples tested positive on the Minnesota Easy 4Cast plate (University of Minnesota, St. Paul, MN) after 30 to 40 h of incubation. For alg-SDCT cows, antibiotic treatments were selectively allocated at the cow level, with all quarters receiving antibiotic treatment if the cow had either a Dairy Herd Improvement Association test somatic cell count >200,000 cells/mL during the current lactation or 2 or more clinical mastitis cases during the current lactation. All quarters of all cows were treated with an internal teat sealant. Intramammary infection status at enrollment and at 1 to 13 DIM was determined using standard bacteriological methods. The effect of treatment group on dry period IMI cure, dry period new IMI, and IMI risk at 1 to 13 DIM was determined using generalized linear mixed models (logistic), with marginal standardization to derive risk difference (RD) estimates. 
Quarter-level antibiotic use at dry-off for each group was BDCT (100%), cult-SDCT (45%), and alg-SDCT (45%). The crude dry period IMI cure risk for all quarters was 87.5% (818/935), the crude dry period new IMI risk was 20.1% (764/3,794), and the prevalence of IMI at 1 to 13 DIM was 23% (961/4,173). Non-inferiority analysis indicated that culture- and algorithm-guided SDCT approaches performed at least as well as BDCT for dry period IMI cure risk. In addition, the final models indicated that the risks for each of the 3 IMI measures were similar between all 3 treatment groups (i.e., RD estimates and 95% confidence intervals all close to 0). These findings indicate that under the conditions of this trial, culture- and algorithm-guided SDCT can substantially reduce antibiotic use at dry-off without negatively affecting IMI dynamics.
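The non-inferiority logic used above reduces to a simple decision rule: for a "higher is better" outcome such as dry period IMI cure risk, SDCT is declared non-inferior to BDCT when the lower confidence limit of the risk difference (SDCT minus BDCT) lies above the negative non-inferiority margin. The margin value and the example intervals below are hypothetical, since the abstract does not report the margin used:

```python
def non_inferior(ci_low, margin):
    """True when the lower CI bound of the risk difference (new minus
    standard treatment) stays above the negative non-inferiority margin."""
    return ci_low > -margin

# Hypothetical 95% CIs for the risk difference, with a 5-point margin
print(non_inferior(ci_low=-0.04, margin=0.05))  # True: worst case within margin
print(non_inferior(ci_low=-0.08, margin=0.05))  # False: cannot rule out harm
```

This is why the abstract emphasizes that the risk difference estimates and their confidence intervals were all close to zero: the entire plausible range of harm stayed inside the margin.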


Subject(s)
Anti-Bacterial Agents/pharmacology , Cephalosporins/pharmacology , Lactation , Mammary Glands, Animal/drug effects , Mastitis, Bovine/prevention & control , Animals , Cattle , Cell Count/veterinary , Cephalosporins/administration & dosage , Female , Milk/drug effects , Prevalence
10.
J Dairy Sci ; 103(8): 7611-7624, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32448583

ABSTRACT

Passive immunity in calves is evaluated or quantified by measuring serum or plasma IgG or serum total protein within the first 7 d of age. While these measurements inform about circulating concentrations of this important protein, they also serve as a proxy for evaluating all of the additional benefits of colostral ingestion. The current individual-calf standard for categorizing dairy calves as having successful passive transfer or failure of passive transfer of immunity is based on serum IgG concentrations of ≥10 and <10 g/L, respectively. This cutoff was based on higher mortality rates in calves with serum IgG <10 g/L. Mortality rates have decreased since 1991, but the percentage of calves with morbidity events has not changed over the same time period. Almost 90% of calves sampled in the USDA National Animal Health Monitoring System's Dairy 2014 study had successful passive immunity based on the dichotomous standard. Based on these observations, a group of calf experts was assembled to evaluate current data and determine whether changes to the passive immunity standards were necessary to reduce morbidity and possibly mortality. In addition to the USDA National Animal Health Monitoring System's Dairy 2014 study, other peer-reviewed publications and personal experience were used to identify and evaluate potential standards. Four options were evaluated based on the observed statistical differences between categories. The proposed standard includes 4 serum IgG categories: excellent, good, fair, and poor, with serum IgG levels of ≥25.0, 18.0-24.9, 10.0-17.9, and <10 g/L, respectively. At the herd level, we propose an achievable standard of >40%, 30%, 20%, and <10% of calves in the excellent, good, fair, and poor categories, respectively. Because serum IgG concentrations are not practical for on-farm implementation, we provide corresponding serum total protein and %Brix values for use on farm.
With one-third of heifer calves in 2014 already meeting the goal of ≥25 g/L serum IgG at 24 h of life, this achievable standard will require more refinement of colostrum management programs on many dairy farms. Implementation of the proposed standard should further reduce the risk of both mortality and morbidity in preweaned dairy calves, improving overall calf health and welfare.
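The proposed four-category standard is a direct threshold classification on serum IgG, which can be expressed as a small function. The thresholds below come straight from the abstract; the function name and example values are ours:

```python
def igg_category(serum_igg_g_per_l):
    """Classify a calf's passive immunity status from serum IgG (g/L)
    using the proposed consensus categories."""
    if serum_igg_g_per_l >= 25.0:
        return "excellent"
    if serum_igg_g_per_l >= 18.0:
        return "good"
    if serum_igg_g_per_l >= 10.0:
        return "fair"
    return "poor"

print(igg_category(26.3))  # excellent
print(igg_category(18.0))  # good (boundary is inclusive)
print(igg_category(9.5))   # poor
```

On farm, the same logic would be applied to the corresponding serum total protein or %Brix cutoffs the authors provide, since direct IgG measurement is impractical there.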


Subject(s)
Cattle/immunology , Immunity, Herd , Immunoglobulin G/blood , Animals , Animals, Newborn/immunology , Colostrum/immunology , Consensus , Female , Male , Pregnancy , United States
11.
J Dairy Sci ; 103(7): 6493-6503, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32331877

ABSTRACT

The objective of this study was to compare culture- and algorithm-guided selective dry-cow therapy (SDCT) programs with blanket dry-cow therapy (BDCT) in a multi-site, randomized, natural exposure clinical trial for the following cow-level outcomes: clinical mastitis, removal from the herd, and Dairy Herd Improvement Association (DHIA) test-day milk yield and SCC measures during the first 120 d in milk (DIM). Two days before planned dry-off, cows in each of 7 herds were randomly allocated to BDCT, culture-guided SDCT (cult-SDCT), or algorithm-guided SDCT (alg-SDCT). At dry-off, BDCT cows received an intramammary antibiotic (500 mg of ceftiofur hydrochloride) in all 4 quarters. Antibiotic treatments were selectively allocated to quarters of cult-SDCT cows by only treating quarters from which aseptically collected milk samples tested positive on a rapid culture system after 30 to 40 h of incubation. For alg-SDCT cows, antibiotic treatments were selectively allocated at the cow level, with all quarters receiving antibiotic treatment if the cow met at least one of the following criteria: (1) any DHIA test with a somatic cell count >200,000 cells/mL during the current lactation, and (2) ≥2 clinical mastitis cases during the current lactation. All quarters of all cows were treated with an internal teat sealant. Clinical mastitis and removal from the herd events (i.e., culling or death) and DHIA test-day data from dry-off to 120 DIM were extracted from herd records. Hazard ratios (HR) for the effect of treatment group on clinical mastitis and removal from the herd during 1 to 120 DIM were determined using Cox proportional hazards regression. The effects of treatment group on test-day loge-transformed SCC and milk yield were determined using linear mixed models. 
Final models indicated that either SDCT program was unlikely to increase clinical mastitis risk (HRcult-SDCT/BDCT = 0.82, 95% CI: 0.58, 1.15; HRalg-SDCT/BDCT = 0.83, 95% CI: 0.63, 1.09) or test-day logeSCC (cult-SDCT minus BDCT = 0.05, 95% CI: -0.09, 0.18; alg-SDCT minus BDCT = 0.07, 95% CI: -0.07, 0.21). Risk of removal from the herd and test-day milk yield were similar between treatment groups. Findings from this study indicate that culture- or algorithm-guided SDCT can be used at dry-off without negatively affecting cow health and performance in early lactation.
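The logeSCC differences reported above are easier to interpret after back-transformation: a difference on the loge scale corresponds to a multiplicative change in geometric-mean SCC. For the cult-SDCT minus BDCT estimate of 0.05:

```python
import math

# Back-transform a loge(SCC) difference to a geometric-mean SCC ratio.
diff_loge_scc = 0.05
ratio = math.exp(diff_loge_scc)
print(f"geometric-mean SCC ratio: {ratio:.3f}")  # ~1.051, i.e. about +5%
```

Because the 95% CI for the loge difference spans zero (-0.09 to 0.18), the corresponding ratio interval spans 1.0, consistent with the conclusion of no meaningful SCC effect.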


Subject(s)
Anti-Bacterial Agents/pharmacology , Cephalosporins/pharmacology , Lactation/drug effects , Mammary Glands, Animal/drug effects , Mastitis, Bovine/prevention & control , Animals , Anti-Bacterial Agents/administration & dosage , Anti-Bacterial Agents/adverse effects , Cattle , Cell Count/veterinary , Cephalosporins/administration & dosage , Cephalosporins/adverse effects , Colostrum , Female , Milk/cytology , Pregnancy , Proportional Hazards Models
12.
J Dairy Sci ; 103(6): 5398-5413, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32278556

ABSTRACT

The use of an internal teat sealant (ITS) at dry-off has been repeatedly shown to improve udder health in the subsequent lactation. However, almost all ITS research conducted in North America has evaluated one product (Orbeseal, Zoetis, Parsippany, NJ). The objective of this study was to evaluate a new ITS product (Lockout, Boehringer-Ingelheim Animal Health, Duluth, GA), by comparing it directly to Orbeseal in a multi-site, randomized, positively controlled equivalence trial for health indicators during the dry period [quarter-level new intramammary infection (IMI) risk, IMI cure risk, and IMI risk at 1 to 13 d in milk, DIM] and during the first 100 DIM [clinical mastitis and culling or death risk and test-day milk somatic cell count (SCC) and milk yield]. At dry-off, cows were randomly allocated to be treated with Orbeseal or Lockout after blanket administration of a cloxacillin dry cow therapy product. Cows were then followed from dry-off until 100 DIM. Intramammary infection status at enrollment and at 1 to 13 DIM was determined using standard bacteriological methods, allowing for the measurement of IMI dynamics during the dry period (i.e., IMI cures and new IMI). The effect of ITS group on dry period IMI cure, dry period new IMI, and IMI risk at 1 to 13 DIM was determined using generalized linear mixed models (logistic). Marginal standardization was used to derive risk difference estimates. An equivalence hypothesis test was conducted to compare ITS groups for dry period new IMI risk (margin of equivalence was ±5% units). The effect of ITS group on clinical mastitis and culling or death was determined using Cox proportional hazards regression. The effect of ITS group on test-day SCC and milk yield was determined using linear mixed models. Final models indicated that measures of quarter-level IMI dynamics were similar between ITS groups (i.e., risk difference estimates and 95% confidence intervals all close to zero). 
Furthermore, Lockout was found to be equivalent to Orbeseal for dry period new IMI risk using an equivalence hypothesis test. Hazard ratio estimates for clinical mastitis and culling or death were close to 1 and differences in SCC and milk yield between ITS groups were close to 0, indicating negligible effects of ITS group on test-day SCC and milk yield. In most cases, these effect estimates were relatively precise (i.e., narrow 95% confidence intervals). We conclude that producers using blanket dry cow therapy could consider including Orbeseal or Lockout treatment in their programs.
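The equivalence test described above (a ±5 percentage-point margin on the dry period new-IMI risk difference) is commonly implemented as two one-sided tests (TOST). A minimal sketch using a normal (Wald) approximation for a risk difference; the counts below are illustrative, not the trial's data, and the exact test used by the authors may differ:

```python
from statistics import NormalDist

def tost_risk_difference(x1, n1, x2, n2, margin=0.05, alpha=0.05):
    """Two one-sided tests (TOST) for equivalence of two risks.
    Declares equivalence if both one-sided tests reject, i.e., the
    risk difference lies inside (-margin, +margin). Wald approximation."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = ((p1 * (1 - p1)) / n1 + (p2 * (1 - p2)) / n2) ** 0.5
    z_lower = (diff + margin) / se   # tests H0: diff <= -margin
    z_upper = (diff - margin) / se   # tests H0: diff >= +margin
    nd = NormalDist()
    p_lower = 1 - nd.cdf(z_lower)
    p_upper = nd.cdf(z_upper)
    equivalent = max(p_lower, p_upper) < alpha
    return diff, equivalent

# Illustrative: 120/1,000 vs. 125/1,000 new IMI; the small difference
# falls well inside the ±5% margin, so equivalence is declared (True).
print(tost_risk_difference(120, 1000, 125, 1000))
```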


Subject(s)
Anti-Bacterial Agents , Mammary Glands, Animal , Mastitis, Bovine , Tissue Adhesives , Animals , Cattle , Female , Anti-Bacterial Agents/pharmacology , Cell Count/veterinary , Cloxacillin/therapeutic use , Lactation , Mammary Glands, Animal/drug effects , Mastitis, Bovine/prevention & control , Milk/cytology , North America , Proportional Hazards Models , Tissue Adhesives/therapeutic use
13.
J Dairy Sci ; 102(12): 11384-11400, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31606215

ABSTRACT

Objectives of this study were to (1) describe the intramammary infection (IMI) prevalence and pathogen profiles in quarters of cows approaching dry-off in US dairy herds, (2) compare IMI prevalence in quarters of cows exposed to different bedding material types, and (3) identify associations between bedding bacteria count and IMI in cows approaching dry-off. Eighty herds using 1 of 4 common bedding materials (manure solids, organic non-manure, new sand, and recycled sand) were recruited in a multi-site cross-sectional study. Each herd was visited twice for sampling. At each visit, aseptic quarter-milk samples were collected from 20 cows approaching dry-off (>180 d pregnant). Samples of unused and used bedding were also collected. Aerobic culture was used to determine the IMI status of 10,448 quarters and to enumerate counts (log10 cfu/mL) of all bacteria, Staphylococcus spp., Streptococcus spp. and Streptococcus-like organisms (SSLO), coliforms, Klebsiella spp., noncoliform gram-negatives, Bacillus spp., and Prototheca spp. in unused (n = 148) and used (n = 150) bedding. The association between bedding bacteria count and IMI was determined using multivariable logistic regression with mixed effects. Quarter-level prevalence of IMI was 21.1%, which was primarily caused by non-aureus Staphylococcus spp. (11.4%) and SSLO (5.6%). Only modest differences in IMI prevalence were observed between the 4 common bedding material types. Counts of all bacteria in unused bedding were positively associated with odds of IMI caused by any pathogen [ALL-IMI; odds ratio (OR) = 1.08]. A positive association was also observed for counts of SSLO in unused bedding and SSLO-IMI (OR = 1.09). These patterns of association were generally consistent across the 4 common bedding materials.
In contrast, the association between counts of all bacteria in used bedding and ALL-IMI varied by bedding type, with positive associations observed in quarters exposed to manure solids (OR = 2.29) and organic non-manure (OR = 1.51) and a negative association in quarters exposed to new sand (OR = 0.47). Findings from this study suggest that quarter-level IMI prevalence in late-lactation cows is low in US dairy herds. Furthermore, bedding material type may not be an important risk factor for IMI in late lactation. Higher levels of bacteria in bedding may increase IMI prevalence at dry-off in general, but this relationship is likely to vary according to bedding material type.
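Odds ratios reported per 1 log10 cfu unit scale multiplicatively, so even a modest per-unit OR can matter across the wide range of bedding bacteria counts. A quick worked example of what the reported OR = 1.08 implies (the 3-unit difference is illustrative):

```python
# OR per 1-unit increase in log10 bacteria count, from the abstract above.
or_per_log10 = 1.08

# A 3 log10-unit difference (a 1,000-fold higher bacteria count)
# multiplies the odds of IMI by OR^3:
or_3_units = or_per_log10 ** 3
print(round(or_3_units, 3))  # -> 1.26, i.e., roughly 26% higher odds
```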


Subject(s)
Klebsiella Infections/veterinary , Mastitis, Bovine/microbiology , Milk/microbiology , Staphylococcal Infections/veterinary , Streptococcal Infections/veterinary , Animals , Bacterial Load/veterinary , Bedding and Linens/microbiology , Bedding and Linens/veterinary , Cattle , Cross-Sectional Studies , Female , Klebsiella/isolation & purification , Klebsiella Infections/epidemiology , Klebsiella Infections/microbiology , Lactation , Logistic Models , Mammary Glands, Animal/microbiology , Manure/microbiology , Mastitis, Bovine/epidemiology , Prevalence , Risk Factors , Staphylococcal Infections/epidemiology , Staphylococcal Infections/microbiology , Staphylococcus/isolation & purification , Streptococcal Infections/epidemiology , Streptococcal Infections/microbiology , Streptococcus/isolation & purification
14.
J Dairy Sci ; 102(12): 11401-11413, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31606221

ABSTRACT

Because cloth udder towels (CUT) may function as a fomite for mastitis-causing pathogens, most udder health laboratories offer towel culture services as a tool to monitor towel hygiene. However, no studies have investigated if an association exists between bacteria levels in CUT and udder health outcomes. The objectives of this cross-sectional study were to (1) describe associations between herd-level measures of towel bacteria count (ToBC) and quarter-level intramammary infection (IMI) status in late-lactation cows, (2) establish pathogen-specific target levels of bacteria in CUT to aid the interpretation of towel culture reports, and (3) identify laundering-related risk factors for high ToBC. The study was conducted in 67 herds from 10 dairy states in the United States that used CUT. These 67 herds were originally recruited as part of a larger (80 herd) cross-sectional study of bedding management. Each herd was visited once during December 2017 to April 2018 and quarter-milk samples (n = 4,656) were collected from late-gestation (>180 d pregnant) cows (n = 1,313). Two recently laundered CUT were collected and a questionnaire was used to collect information about pre-milking teat preparation and CUT management practices. Quarter-level IMI status was determined using standard bacteriologic methods. In addition, colony-forming units of all bacteria (total bacteria), Staphylococcus spp., Streptococcus spp. or Streptococcus-like organisms (SSLO), coliforms, noncoliform gram-negatives, and Bacillus spp. were determined for each pair of CUT (log10 cfu/cm2). The association between ToBC and IMI was determined using multivariable logistic regression with mixed effects. After dichotomizing ToBC into high and low categories, associations between towel management practices and ToBC category were determined using unconditional logistic regression. The quarter-level prevalence of IMI was 19.6%, which was predominantly caused by non-aureus Staphylococcus spp. 
(NAS; 10.2%) and SSLO (5.1%). The predominant bacteria in CUT were Bacillus spp. (median = 3.13 log10 cfu/cm2). Total bacteria count was not associated with odds of IMI (odds ratio = 1.06), likely due to the predominance of Bacillus spp. in CUT and low number of IMI caused by Bacillus spp. In contrast, counts of Staphylococcus spp. and SSLO were positively associated with odds of IMI caused by NAS (odds ratio = 1.33) and SSLO (odds ratio = 1.45), respectively. Of 12 CUT management practices evaluated, only the failure to use a dryer was identified as a clear predictor of risk for a high ToBC (risk ratio of high coliform count = 8.17). Our study findings suggest that CUT may act as a fomite for NAS and SSLO. We recommend that herds aim to keep counts of Staphylococcus spp. and SSLO in CUT below 32 cfu/cm2 (or 5 cfu/in2), and that laundered towels be completely dried in a hot air dryer.


Subject(s)
Klebsiella Infections/veterinary , Mastitis, Bovine/microbiology , Milk/microbiology , Staphylococcal Infections/veterinary , Streptococcal Infections/veterinary , Animals , Bacterial Load/veterinary , Cattle , Cross-Sectional Studies , Female , Klebsiella/isolation & purification , Klebsiella Infections/epidemiology , Klebsiella Infections/microbiology , Lactation , Logistic Models , Mammary Glands, Animal/microbiology , Manure/microbiology , Mastitis, Bovine/epidemiology , Pregnancy , Prevalence , Risk Factors , Staphylococcal Infections/epidemiology , Staphylococcal Infections/microbiology , Staphylococcus/isolation & purification , Streptococcal Infections/epidemiology , Streptococcal Infections/microbiology , Streptococcus/isolation & purification
15.
J Dairy Sci ; 102(11): 10213-10234, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31447166

ABSTRACT

Bedding is an important source of teat end exposure to environmental mastitis pathogens. To better control environmental mastitis, we need an improved understanding of the relationships among bedding selection and management, bedding bacteria counts (BBC), and udder health (UH). The objectives of this cross-sectional observational study were (1) to describe BBC, bedding characteristics, udder hygiene scores, bulk tank milk (BTM) quality, and UH in US dairy herds using 1 of 4 bedding materials; (2) describe the relationship between BBC and herd measures of UH; and (3) identify benchmarks for monitoring bedding hygiene. Local dairy veterinarians and university researchers enrolled and sampled 168 herds from 17 states. Herds were on a Dairy Herd Improvement Association (DHIA) testing program and used 1 of 4 bedding types for lactating cows: new sand, reclaimed sand, manure solids (MNS), or organic non-manure materials. Each herd was sampled twice (winter and summer) in 2016. Samples and data collected included unused and used bedding, BTM samples, udder hygiene scores, DHIA test data, and descriptions of facilities and herd management practices. Bedding was cultured to determine the total bacteria count and counts of Bacillus spp., coliforms, Klebsiella spp., non-coliform gram-negative organisms, streptococci or streptococci-like organisms (SSLO), and Staphylococcus spp. Bedding dry matter, organic matter, and pH were also measured. Bulk tank milk samples were cultured to determine counts of coliforms, NAS, SSLO, Staphylococcus aureus, and Mycoplasma spp. 
Udder health measures included DHIA test-day average linear score (LS); the proportion of cows with an intramammary infection (IMI), where infection was defined as LS ≥4.0; the proportion of cows with a new IMI, where new IMI was defined as LS changing from <4.0 to ≥4.0 in the last 2 tests; the proportion of cows with a chronic infection, where chronic was defined as LS ≥4.0 on the last 2 tests; and the cumulative incidence of clinical mastitis in the 30-d period preceding sample collection. Although much variation existed within and among bedding types, mixed linear regression showed the use of MNS bedding to be generally associated with higher BBC, dirtier udders, increased coliform and SSLO counts in BTM, and poorer UH measures compared with organic non-manure materials, reclaimed sand, or new sand bedding materials. While controlling for important farm traits and management practices, mixed linear regression showed that increased counts of coliforms, Klebsiella spp., SSLO, and Staphylococcus spp. in both unused and used bedding were associated with poorer values for 1 or more herd-level measures of UH. Achievable benchmarks identified for counts of coliforms (unused: ≤500 cfu/cm3; used: ≤10,000 cfu/cm3), Klebsiella spp. (0 cfu/cm3 for unused and used), Staphylococcus spp. (0 cfu/cm3 for unused and used), and SSLO (unused: 0 cfu/cm3; used: ≤500,000 cfu/cm3) can be used to monitor bedding hygiene in most bedding materials, with minor variations suggested for SSLO in unused MNS (≤1,000 cfu/cm3).
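The achievable benchmarks listed above can be gathered into a simple lookup for monitoring bedding culture reports. A sketch under stated assumptions: the function, data structure, and organism keys are my own; the thresholds (cfu/cm3) are taken directly from the abstract, including the exception for SSLO in unused manure solids (MNS):

```python
# Benchmarks from the abstract above, in cfu/cm3 of bedding.
BENCHMARKS = {
    ("coliforms", "unused"): 500,
    ("coliforms", "used"): 10_000,
    ("klebsiella", "unused"): 0,
    ("klebsiella", "used"): 0,
    ("staphylococcus", "unused"): 0,
    ("staphylococcus", "used"): 0,
    ("sslo", "unused"): 0,
    ("sslo", "used"): 500_000,
}

def within_benchmark(organism, bedding_state, count_cfu_cm3, material=None):
    """True if a bedding bacteria count meets the study's benchmark.
    The one exception noted in the abstract: SSLO in unused manure
    solids (MNS) uses <=1,000 cfu/cm3 rather than 0."""
    limit = BENCHMARKS[(organism, bedding_state)]
    if organism == "sslo" and bedding_state == "unused" and material == "MNS":
        limit = 1_000
    return count_cfu_cm3 <= limit

print(within_benchmark("coliforms", "used", 8_000))    # within benchmark -> True
print(within_benchmark("sslo", "unused", 800, "MNS"))  # MNS exception applies -> True
print(within_benchmark("sslo", "unused", 800))         # other materials -> False
```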


Subject(s)
Housing, Animal , Hygiene , Mastitis, Bovine/microbiology , Milk/standards , Animals , Bacterial Load/veterinary , Bedding and Linens/veterinary , Cattle , Cross-Sectional Studies , Dairying , Farms , Female , Floors and Floorcoverings , Lactation , Mammary Glands, Animal/microbiology , Manure/microbiology
16.
J Dairy Sci ; 102(8): 6885-6900, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31202649

ABSTRACT

Mesophilic and thermophilic spore-forming bacteria represent a challenge to the dairy industry, as these bacteria are capable of surviving adverse conditions associated with processing and sanitation and eventually spoil dairy products. The dairy farm environment, including soil, manure, silage, and bedding, has been implicated as a source for spores in raw milk. High levels of spores have previously been isolated from bedding, and different bedding materials have been associated with spore levels in bulk tank (BT) raw milk; however, the effect of different bedding types, bedding management practices, and bedding spore levels on the variance of spore levels in BT raw milk has not been investigated. To this end, farm and bedding management surveys were administered and unused bedding, used bedding, and BT raw milk samples were collected from dairy farms (1 or 2 times per farm) across the United States over 1 yr; the final data set included 182 dairy farms in 18 states. Bedding suspensions and BT raw milk were spore pasteurized (80°C for 12 min), and mesophilic and thermophilic spores were enumerated. Piecewise structural equation modeling analysis was used to determine direct and indirect pathways of association among farm and bedding practices, levels of spores in unused and used bedding, and levels of spores in BT raw milk. Separate models were constructed for mesophilic and thermophilic spore levels. The analyses showed that bedding material had a direct influence on levels of spores in unused and used bedding as well as an indirect association with spore levels in BT raw milk through used bedding spore levels. Specific bedding and farm management practices as well as cow hygiene in the housing area were associated with mesophilic and thermophilic spore levels in unused bedding, used bedding, and BT raw milk. 
Notably, levels of spores in used bedding were positively related to those in unused bedding, and used bedding spore levels were positively related to those in BT raw milk. The results of this study increase the understanding of the levels and ecology of mesophilic and thermophilic spores in raw milk, emphasize the possible role of bedding as a source of spores on-farm, and present opportunities for dairy producers to reduce spore levels in BT raw milk.


Subject(s)
Dairying/methods , Housing, Animal , Milk/microbiology , Spores, Bacterial/isolation & purification , Animals , Bedding and Linens/microbiology , Cattle , Colony Count, Microbial , Farms , Female , Pasteurization , Silage/microbiology , United States
17.
J Dairy Sci ; 102(5): 4704-4712, 2019 May.
Article in English | MEDLINE | ID: mdl-30852006

ABSTRACT

The majority of dairy heifer calves in the United States are destined to be dairy replacements. However, many dairy heifer and bull calves die before 6 mo of age. Of these calves, about 6% (more than 500,000 calves) die at birth or shortly after (i.e., currently termed "stillbirth"). An additional 6% of dairy heifers die during the preweaning period. Death loss in dairy calves is primarily due to stillbirths, failure to adapt to extrauterine life, and infectious disease processes. The reasons for preweaning heifer calf deaths caused by infectious diseases are generally categorized based on easily recognizable clinical signs such as digestive disease/scours or respiratory disease. Most causes of calf death can be mitigated by appropriate preventive care or well-tailored treatments, meaning that the typical death loss percentage could be decreased with better management. Producers could gather information on the circumstances near birth and at death if they had appropriate guidance on what details to record and monitor. This paper provides recommendations on data to collect at the time of birth (i.e., calf birth certificate data). The recording of these critical pieces of information is valuable in evaluating trends over time in morbidity and mortality events in dairy calves. Ideally, necropsy examination would substantially improve the identification of cause of death, but even without necropsy, attribution of cause of death can be improved by more carefully defining death loss categories in on-farm record systems. We propose a death loss categorization scheme that more clearly delineates causes of death. Recommendations are provided for additional data to be collected at the time of death. Recording and analyzing birth certificate and death loss data will allow producers and veterinarians to better evaluate associations between calf risk factors and death, with the goal of reducing dairy calf mortality.


Subject(s)
Animal Husbandry/methods , Birth Certificates , Cattle Diseases/mortality , Stillbirth/veterinary , Animals , Animals, Newborn , Animals, Suckling , Cattle , Dairying , Farms , Female , Male , Parturition , Pregnancy , Risk Factors
18.
J Dairy Sci ; 101(11): 10126-10141, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30172404

ABSTRACT

The objective of this prospective cohort study was to describe the relationship between exposure to antimicrobials, through both the milk diet and systemic therapy, and antimicrobial resistance of fecal Escherichia coli in dairy calves pre- and postweaning. A convenience sample of 15 Minnesota dairy farms was chosen, representing 3 equal cohorts of milk diet fed to preweaned calves: medicated milk replacer (MMR), nonmedicated milk replacer (NMR), or pasteurized nonsaleable milk (PNM). Five newborn calves were enrolled on each farm, with fecal samples collected from each calf at 1, 3, 5, and 16 wk of age. After isolation, 3 colonies of E. coli were randomly selected from each sample to determine antimicrobial susceptibility by minimum inhibitory concentration (Sensititre, Thermo Scientific, Waltham, MA) to 8 antimicrobials in 8 classes. Each isolate was given an antimicrobial resistance score (ARS) according to the number of antimicrobial classes to which it was resistant. Any isolate resistant to 3 or more antimicrobials was defined as being multidrug resistant (MDR). Relationships between ARS and MDR (dependent variables) and possible explanatory variables were analyzed using mixed multivariable linear and logistic regression models, respectively, with critical P-values adjusted for multiple contrasts. Seventy percent of isolates were resistant to sulfadimethoxine. For wk 1 and 3, the mean ARS values were greatest for fecal E. coli from calves fed MMR or PNM compared with NMR, with no difference in ARS values between the MMR and PNM groups at either time point. At wk 5, the mean ARS value was greatest for fecal E. coli from calves fed MMR (3.56 ± 0.45; mean ± SE), intermediate for calves fed PNM (2.64 ± 0.45), and lowest for calves fed NMR (1.54 ± 0.45). However, by wk 16, the mean ARS values were ≤1.0 and did not differ among milk diets.
Evaluation of the proportion of isolates with MDR mirrored the results of the ARS analysis (MDR more prevalent in MMR and PNM groups preweaning; no difference among milk diets at 16 wk). There was a tendency for an increase in ARS at wk 5 (1.28 ± 0.70), and the odds for MDR in fecal E. coli were estimated to be 5.2 times (95% confidence interval = 0.67, 35.7) and 101.1 times (95% confidence interval = 1.15, >999.9) higher at wk 3 and 5 if the calf was treated with a systemic antimicrobial within the 14-d period before sampling. These findings suggest that exposure to antimicrobials through the milk diet or systemic therapy may result in a transient increase in resistance in fecal E. coli, but once the antimicrobial pressure is removed, susceptible E. coli are able to flourish again, resulting in an overall decrease in resistance.
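The resistance scoring described above is straightforward to reproduce. A minimal sketch (the function and the class names in the example are illustrative; the study paired 8 antimicrobials with 8 classes, so counting classes matches its ARS definition):

```python
def resistance_profile(resistant_classes):
    """Given the set of antimicrobial classes an E. coli isolate is
    resistant to, return (ARS, MDR) as defined in the study above:
    ARS = number of classes with resistance; MDR = True if the isolate
    is resistant to 3 or more."""
    ars = len(set(resistant_classes))
    return ars, ars >= 3

# Illustrative isolates:
print(resistance_profile({"sulfonamides", "tetracyclines", "aminoglycosides"}))  # (3, True)
print(resistance_profile({"sulfonamides"}))                                      # (1, False)
```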


Subject(s)
Anti-Infective Agents/pharmacology , Drug Resistance, Bacterial/drug effects , Escherichia coli/drug effects , Milk/chemistry , Animals , Cattle , Cohort Studies , Dairying , Diet/veterinary , Farms , Feces/microbiology , Female , Logistic Models , Microbial Sensitivity Tests/veterinary , Minnesota , Prospective Studies
19.
J Dairy Sci ; 101(9): 8135-8145, 2018 Sep.
Article in English | MEDLINE | ID: mdl-30007809

ABSTRACT

Group housing and computerized feeding of preweaned dairy calves are gaining in popularity among dairy producers, yet disease detection remains a challenge for this management system. The aim of this study was to investigate the application of statistical process control charting techniques to daily average feeding behavior to predict and detect illness and to describe the diagnostic test characteristics of using this technique to find a sick calf compared with detection by calf personnel. This prospective cross-sectional study was conducted on 10 farms in Minnesota (n = 4) and Virginia (n = 6) utilizing group housing and computerized feeding from February until October 2014. Calves were enrolled upon entrance to the group pen. Calf personnel recorded morbidity and mortality events. Farms were visited either every week (MN) or every other week (VA) to collect calf enrollment data, computer-derived feeding behavior data, and calf personnel-recorded calf morbidity and mortality. Standardized self-starting cumulative sum (CUSUM) charts were generated for each calf for each daily average feeding behavior, including drinking speed (mL/min), milk consumption (L/d), and visits to the feeder without a milk meal (no.). A testing subset of 352 calves (176 treated, 176 healthy) was first used to find CUSUM chart parameters that provided the highest diagnostic test sensitivity and best signal timing, which were then applied to all calves (n = 1,052). Generalized estimating equations were used to estimate the diagnostic test characteristics of a single negative mean CUSUM chart signal to detect a sick calf for a single feeding behavior. Combinations of feeding behavior signals were also explored. Single signals and combinations of signals that included drinking speed provided the most sensitive and timely signal, finding a sick calf up to an average (±SE) of 3.1 ± 8.8 d before calf personnel. 
However, there was no clear advantage to using CUSUM charting over calf observation for any one feeding behavior or combination of feeding behaviors when predictive values were considered. The results of this study suggest that, for the feeding behaviors monitored, the use of CUSUM control charts does not provide sufficient sensitivity or predictive values to detect a sick calf in a timely manner compared with calf personnel. This approach to examining daily average feeding behaviors cannot take the place of careful daily observation.
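A standardized, self-starting CUSUM of the kind described above can be sketched as follows: each new observation is standardized against the running mean and SD of the calf's own earlier values (so no fixed in-control baseline is needed), then accumulated in a one-sided chart that signals a sustained decrease in feeding behavior. This is a generic illustration, not the study's implementation; the reference value k, decision limit h, and warm-up length are illustrative choices:

```python
from statistics import mean, stdev

def self_starting_cusum_low(values, k=0.5, h=4.0, warmup=5):
    """One-sided (lower) self-starting CUSUM. Returns the index of the
    first observation at which the chart signals a downward shift, or
    None if it never signals. Each observation after `warmup` is
    standardized using the mean/SD of all PRIOR values."""
    s = 0.0
    for i in range(warmup, len(values)):
        prior = values[:i]
        sd = stdev(prior)
        if sd == 0:          # no variation yet; cannot standardize
            continue
        z = (values[i] - mean(prior)) / sd
        s = max(0.0, s - z - k)   # accumulate evidence of a decrease
        if s > h:
            return i
    return None

# Daily average drinking speed (mL/min, illustrative): stable week,
# then a sharp drop such as might accompany illness.
speeds = [820, 805, 815, 790, 810, 800, 812, 500, 495, 505]
print(self_starting_cusum_low(speeds))  # -> 7 (first observation after the drop)
```

In practice the study ran one chart per calf per behavior (drinking speed, milk consumption, unrewarded visits) and compared signal dates against treatment records.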


Subject(s)
Cattle Diseases/epidemiology , Feeding Behavior , Housing, Animal , Animals , Cattle , Cattle Diseases/prevention & control , Cross-Sectional Studies , Minnesota , Prospective Studies , Virginia
20.
J Dairy Sci ; 101(9): 8100-8109, 2018 Sep.
Article in English | MEDLINE | ID: mdl-29908803

ABSTRACT

The objective of this study was to describe the effect of offering a fixed or increasing milk allowance in the first 1 to 2 wk of life. We hypothesized that calves offered a fixed amount of milk early in life would not experience more scours, but rather would experience improved health and growth compared with calves that had their daily milk allowance slowly increased over a period of 1 to 2 wk. This randomized controlled clinical trial was conducted on 5 dairy farms in Minnesota with both a summer (June-August 2016) and winter (December-February 2017) period of enrollment. Heifer calves were enrolled at birth, weighed, and systematically assigned by birth order to either the slowly increasing (INC) control group or fixed allowance (FIX) treatment group by farm personnel. Calves assigned to the INC group were slowly increased from 4 to 5 L/d to gradually reach the full peak milk allowance of 6 to 8 L/d over a 7- to 14-d period, whereas calves assigned to the FIX group were offered a full peak milk allowance of 6 to 8 L/d beginning on d 1 after birth. The average FIX calf consumed an extra 14 L of milk as compared with INC calves over the first 2 wk of life, corresponding to an average INC intake of 5 L/d during first 1 to 2 wk of life as compared with an average intake of 6.8 L/d in FIX calves. Study technicians visited all farms weekly to collect health and performance data. Multivariable mixed models were used to describe the effect of treatment (INC/FIX) on 3-wk average daily gain (kg/d), 3-wk weight (kg), and hip height at wk 1, 3, and 7, controlling for the effect of season, birth weight, and the random effect of calf within farm. Multivariable logistic regression models were used to describe the effect of treatment on odds of technician and producer reported health events. A total of 1,264 heifer calves were enrolled (FIX n = 641; INC n = 623) with no difference in enrollment weight or hip height between groups. 
By 3 wk of age, FIX calves weighed 1.4 (0.59) kg more than INC calves, though the magnitude of this difference varied depending on the period of time INC calves were slowly increased in milk allowance (7 vs. 10 vs. 14 d). Calves in the FIX group grew 0.1 kg/d faster and were 0.3 ± 0.15 cm taller at wk 3 of life. Forty-two percent (536/1,264) of all enrolled calves had a first treatment event, with no effect of treatment on technician-reported health scores and no overall effect on producer-reported treatment or mortality events. Under the conditions of this study, offering a fixed milk allowance from d 1 of life improved calf growth during the first 3 wk as compared with a gradual increase in milk allowance, with no detrimental effect on calf health.


Subject(s)
Animal Feed , Cattle/growth & development , Animal Nutritional Physiological Phenomena , Animals , Animals, Newborn/growth & development , Diet , Farms , Female , Milk , Minnesota , Pregnancy , Random Allocation , Seasons , Weaning