Results 1 - 20 of 236
1.
J Dairy Sci ; 106(8): 5402-5415, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37331873

ABSTRACT

This study evaluated the effects of supplementing calf milk replacer with essential AA on immune responses, blood metabolites, and nitrogen metabolism of 32 Holstein bull calves [28 d of age, 44 ± 0.8 kg of body weight (BW)] exposed to lipopolysaccharide (LPS). Calves were bottle-fed a commercial milk replacer (20% crude protein and 20% fat, dry matter basis) twice daily along with a calf starter (19% crude protein, dry matter basis) for 45 d. The experiment was a randomized complete block design and treatments were a 2 × 2 factorial arrangement. Treatments were milk replacer (fed twice daily at 0.5 kg/d of powder) supplemented with or without 10 essential AA (+AA vs. -AA), and subcutaneous injection of sterile saline with or without LPS (+LPS vs. -LPS) at 3 h after the morning feeding on d 15 (4 µg LPS per kg of BW) and 17 (2 µg LPS per kg of BW). Calves also received a 2-mL subcutaneous injection of ovalbumin (6 mg of ovalbumin/mL) on d 16 and 30. Rectal temperature and blood samples were collected on d 15 before LPS injection and at h 4, 8, 12, and 24 thereafter. From d 15 to 19, total fecal and urinary output were collected, and feed refusals were documented. Rectal temperature was greater in +LPS than -LPS calves at h 4, 8, and 12 after LPS injection. Serum cortisol was greater for +LPS than -LPS at h 4 after LPS exposure. At d 28, serum antiovalbumin IgG level was greater in +LPS +AA calves compared with +LPS -AA. Serum glucose was lower for +LPS than -LPS at h 4 and 8. Serum insulin was greater in +LPS than -LPS calves. Plasma concentrations of Thr, Gly, Asn, Ser, and hydroxyproline were lower for +LPS versus -LPS calves. Plasma concentrations of Met, Leu, Phe, His, Ile, Trp, Thr, and Orn were greater in +AA calves than -AA calves. Plasma urea N and N retention were not different among LPS and AA treatments. The lower concentrations of AA in +LPS than -LPS calves indicate higher demand for AA in immuno-compromised calves fed milk replacer. 
Additionally, the greater ovalbumin-specific IgG concentration in +LPS +AA calves compared with +LPS -AA calves suggests that supplementing essential AA to immunocompromised calves might improve immune status.


Subject(s)
Diet; Milk; Animals; Cattle; Male; Diet/veterinary; Milk/metabolism; Amino Acids, Essential; Endotoxins; Lipopolysaccharides; Ovalbumin; Body Weight; Immunity; Immunoglobulin G; Nitrogen/metabolism; Animal Feed/analysis; Weaning
2.
Animal ; 15(5): 100214, 2021 May.
Article in English | MEDLINE | ID: mdl-34029789

ABSTRACT

Nutritional strategies that optimize immunity of feedlot cattle are warranted due to increasing regulations on the use of feed-grade antimicrobials. This study evaluated physiological, health, and performance responses of cattle receiving a synbiotic supplement (yeast-derived prebiotic + Bacillus subtilis probiotic), which either replaced feed-grade antimicrobials or was fed in conjunction with monensin during the initial 45 days in the feedlot. Angus-influenced steers (n = 256) were acquired from an auction facility on day -2, and transported (800 km) to the feedlot. Shrunk BW was recorded upon arrival (day -1). Steers were allocated to 1 of 18 pens (day 0), and pens were assigned to receive (n = 6/treatment) a free-choice diet containing: (1) monensin and tylosin (RT; 360 mg/steer daily from Rumensin and 90 mg/steer daily from Tylan; Elanco Animal Health, Greenfield, IN, USA), (2) yeast-derived ingredient and B. subtilis probiotic (CC; 18 g/steer daily of Celmanax and 28 g/steer daily of Certillus; Church and Dwight Co., Inc., Princeton, NJ, USA), or (3) monensin in addition to yeast-derived and B. subtilis ingredients (RCC) as in RT and CC. Steers were assessed for bovine respiratory disease (BRD) and DMI daily. Steer BW was recorded on days 45 and 46, and averaged for final BW. Blood samples were collected on days 0, 7, 17, 31, and 45. Feed intake was greater (P ≤ 0.05) in CC vs. RCC and RT during the initial 3 weeks upon feedlot arrival. No treatment differences were noted (P ≥ 0.41) for average daily gain, BW, and feed efficiency. Incidence of BRD did not differ (P = 0.77) between treatments (average 80.1%). A greater proportion (P ≤ 0.03) of RT steers diagnosed with BRD required a second antimicrobial treatment compared with CC and RCC (57.3, 37.3, and 38.6%, respectively). Removal of steers from the trial due to severe morbidity + mortality was greater (P = 0.02) in RT vs. CC (22.4 and 7.0%), and did not differ (P ≥ 0.16) among RCC (12.9%) vs. RT and CC. Plasma glucose concentrations were greater (P ≤ 0.02) in CC vs. RCC and RT on day 7. Plasma concentrations of nonesterified fatty acids were greater (P ≤ 0.02) in RT and RCC vs. CC on day 7, and in RT vs. CC on day 17. Steers receiving the synbiotic supplement had an improved response to BRD treatment, suggesting heightened immunocompetence derived, in part, from enhanced metabolism and the nutraceutical effects of B. subtilis and yeast-derived compounds.


Subject(s)
Probiotics; Yeast, Dried; Animal Feed/analysis; Animals; Cattle; Diet/veterinary; Dietary Supplements; Probiotics/pharmacology
3.
Surgeon ; 14(5): 270-3, 2016 Oct.
Article in English | MEDLINE | ID: mdl-26148760

ABSTRACT

AIMS: Recently, lymph-node ratio (LNR) has emerged as a prognostic tool in staging rectal cancer. Studies to date have demonstrated threshold values above and below which survival is differentially altered. Neoadjuvant therapy significantly reduces the number of lymph nodes retrieved. The aim of the present study was to determine the effect of neoadjuvant therapy on LNR and its prognostic properties. METHODS: Consecutive patients who underwent curative rectal cancer resections in a single institution from 2007 to 2010 were reviewed. LNR was stratified into five subgroups of 0, 0.01-0.17, 0.18-0.41, 0.42-0.69 and 0.7-1.0 based on a previous study. The effect of neoadjuvant therapy on lymph node retrieval, LNR, locoregional (LR) and systemic recurrence (SR), disease-free (DFS) and overall survival (OS) was compared between patients who did (Neoadjuvant) and did not (Surgery Alone) receive neoadjuvant therapy. RESULTS: Neoadjuvant and Surgery Alone groups were comparable in gender, age and tumour stage. The number of lymph nodes retrieved was significantly lower in the Neoadjuvant group (p < 0.01). However, LNR remained similar in both groups (p = 0.36). There was no statistical difference in DFS and OS between the Neoadjuvant and Surgery Alone groups at the various LNR cut-off values in patients with AJCC Stage 3 tumours. CONCLUSIONS: LNR remains unaltered despite reduced lymph node retrieval after neoadjuvant therapy in rectal cancer. LNR may therefore be a more reliable prognostic indicator in this subgroup of patients.
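For readers reproducing such analyses, the ratio and the five strata quoted above can be sketched as a small helper (a hypothetical illustration: the cut-points come from the abstract, while the function name and the integer stratum encoding are our own):

```python
def lnr_stratum(positive_nodes: int, retrieved_nodes: int) -> int:
    """Assign the LNR stratum used in the study: 0, 0.01-0.17, 0.18-0.41,
    0.42-0.69, or 0.70-1.0 (encoded here as integers 0 through 4)."""
    if retrieved_nodes <= 0:
        raise ValueError("at least one retrieved node is required")
    lnr = positive_nodes / retrieved_nodes  # positive nodes / nodes retrieved
    if lnr == 0:
        return 0  # no positive nodes
    if lnr <= 0.17:
        return 1
    if lnr <= 0.41:
        return 2
    if lnr <= 0.69:
        return 3
    return 4
```

For example, a patient with 2 positive nodes out of 12 retrieved has an LNR of about 0.17 and falls in the second stratum.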


Subject(s)
Adenocarcinoma/pathology; Adenocarcinoma/therapy; Fluorouracil/therapeutic use; Immunosuppressive Agents/therapeutic use; Neoadjuvant Therapy; Rectal Neoplasms/pathology; Rectal Neoplasms/therapy; Adenocarcinoma/mortality; Aged; Chemotherapy, Adjuvant/methods; Disease-Free Survival; Female; Humans; Lymph Node Excision; Lymph Nodes/pathology; Lymphatic Metastasis; Male; Middle Aged; Neoadjuvant Therapy/methods; Neoplasm Staging; Prognosis; Radiotherapy, Adjuvant/methods; Rectal Neoplasms/mortality; Retrospective Studies
5.
J Anim Sci ; 91(10): 4984-90, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23942707

ABSTRACT

The effects of bacitracin methylene disalicylate (BMD) and scours on the fecal microbiome, animal performance, and health were studied in Holstein bull calves. Holstein bull calves (n = 150) were obtained from a single source at 12 to 24 h of age. Bull calves were randomly assigned to 1 of 2 treatments: CON (no BMD; n = 75 calves) and BMD (n = 75 calves). Starting 3 d after arrival, BMD was added to milk replacer (0.5 g/feeding; twice daily) and fed to the calves for 10 consecutive d. No differences (P > 0.10) were observed in ADG for d 0 to 28 and d 0 to 56, DMI for d 0 to 28, d 29 to 56, and d 0 to 56, or G:F for d 0 to 28, d 29 to 56, and d 0 to 56; ADG for d 29 to 56 tended to increase (P < 0.10) for BMD-treated calves compared with controls. Fecal samples were collected from 15 scouring calves and 10 cohorts (nonscouring calves received on the same day and administered the same treatment as the scouring calves). Animal morbidity and fecal score did not vary between the 2 treatments. Mortality was not influenced by the treatments during the BMD administration period or throughout the experiment. Fecal samples were subjected to pyrotagged 454 FLX pyrosequencing of 16S rRNA gene amplicons to examine compositional dynamics of fecal microbes. Escherichia, Enterococcus, and Shigella had greater (P < 0.05) populations in the BMD group, whereas Dorea, Roseburia, Faecalibacterium, Papillibacter, Collinsella, Eubacterium, Peptostreptococcus, and Prevotella were decreased (P < 0.05) by BMD treatment. Genus populations were also compared between scouring and nonscouring calves. Streptococcus was the only genus that showed a notable increase (P < 0.05) in fecal samples from scouring calves, whereas populations of Bacteroides, Roseburia, and Eubacterium were markedly (P < 0.05) greater in nonscouring calves. These results show that BMD can alter the composition of the fecal microbiome but failed to improve performance in Holstein bull calves.
The discrepancy in microbial profiles between scouring and nonscouring calves might be associated with the occurrence of scours, and the bacterial genera identified might be potential targets for treating diarrhea.


Subject(s)
Bacitracin/pharmacology; Bacteria/classification; Bacteria/genetics; Cattle Diseases/microbiology; Diarrhea/veterinary; Gastrointestinal Tract/microbiology; Animal Feed/analysis; Animals; Animals, Newborn; Bacitracin/administration & dosage; Bacteria/drug effects; Cattle; Cattle Diseases/prevention & control; Diarrhea/microbiology; Diarrhea/prevention & control; Feces/microbiology; Male; RNA, Bacterial/genetics; RNA, Ribosomal, 16S/genetics
6.
J Dent Res ; 92(8): 694-701, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23752171

ABSTRACT

Prevention reduces tooth loss, but little evidence supports biannual preventive care for all adults. We used risk-based approaches to test the association of tooth loss with 1 vs. 2 annual preventive visits in high-risk (HiR) and low-risk (LoR) patients. Insurance claims spanning 16 years for 5,117 adults were evaluated retrospectively for tooth extraction events. Patients were classified as HiR for progressive periodontitis if they had ≥ 1 of the risk factors (RFs) smoking, diabetes, or interleukin-1 genotype, or as LoR if they had no RFs. LoR event rates were 13.8% and 16.4% for 2 or 1 annual preventive visits (absolute risk reduction, 2.6%; 95%CI, 0.5% to 5.8%; p = .092). HiR event rates were 16.9% and 22.1% for 2 and 1 preventive visits (absolute risk reduction, 5.2%; 95%CI, 1.8% to 8.4%; p = .002). Increasing RFs increased events (p < .001). Oral health care costs were not increased by any single RF, regardless of prevention frequency (p > .41), but multiple RFs increased costs vs. no RFs (p < .001) or 1 RF (p = .001). For LoR individuals, the association between preventive dental visits and tooth loss was not significantly different whether the frequency was once or twice annually. A personalized medicine approach combining gene biomarkers with conventional risk factors to stratify populations may be useful in resource allocation for preventive dentistry (ClinicalTrials.gov, NCT01584479).
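The absolute risk reductions quoted above follow directly from the event rates; a minimal worked sketch (the number-needed-to-treat is our own illustrative addition and is not reported in the abstract):

```python
def arr_and_nnt(rate_one_visit: float, rate_two_visits: float):
    """Absolute risk reduction (ARR) and number-needed-to-treat (NNT).

    ARR = event rate with 1 annual visit minus rate with 2 annual visits;
    NNT = 1 / ARR (illustrative addition, not reported in the abstract).
    """
    arr = rate_one_visit - rate_two_visits
    nnt = float("inf") if arr == 0 else 1.0 / arr
    return arr, nnt

# High-risk group from the abstract: 22.1% events with 1 visit vs. 16.9% with 2
arr, nnt = arr_and_nnt(0.221, 0.169)  # arr ≈ 0.052, i.e. the reported 5.2%
```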


Subject(s)
Appointments and Schedules; Dental Care/statistics & numerical data; Tooth Loss/prevention & control; Adult; Chronic Disease; Cohort Studies; Dental Care/economics; Diabetes Mellitus, Type 1/epidemiology; Diabetes Mellitus, Type 2/epidemiology; Disease Susceptibility; Female; Genotype; Health Care Costs/statistics & numerical data; Humans; Insurance, Dental/statistics & numerical data; Interleukin-1/genetics; Male; Michigan/epidemiology; Middle Aged; Periodontitis/epidemiology; Preventive Dentistry/statistics & numerical data; Retrospective Studies; Risk Assessment; Risk Factors; Smoking/epidemiology; Tooth Extraction/statistics & numerical data; Tooth Loss/epidemiology
7.
Int J Colorectal Dis ; 28(10): 1377-84, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23715847

ABSTRACT

BACKGROUND: Lymph node ratio (LNR) is increasingly accepted as a useful prognostic indicator in colorectal cancer. However, variations in methodology, statistical stringency and cohort composition have led to inconsistency regarding the optimally prognostic LNR. OBJECTIVE: The aim was to apply a robust regression-based analysis to generate and appraise LNRs optimally prognostic for colon and rectal cancer, both separately and in combination. METHODS: LNR was established for all patients undergoing either a colonic (n = 379) or rectal (n = 160) cancer resection with curative intent. The optimal LNRs associated with disease-free and overall survival were established using a classification and regression tree technique. This process was repeated separately for patients who underwent either colonic or rectal resection and for the combined cohort. Survival associated with differing LNRs was estimated using the Kaplan-Meier method and compared using a log-rank test. Relationships between LNR, disease-free survival (DFS) and overall survival (OS) were further characterised using Cox regression analysis. All statistical analyses were conducted in the R programming environment, with statistical significance taken at p < 0.05. RESULTS: Optimal LNRs differed between each cohort, whether overall or disease-free survival was considered. LNRs generated from the combined cohort also differed from those generated by the individual cohorts. In relation to DFS, the LNR values obtained included 0.18 for the colon cancer cohort and 0.19 for the rectal and combined colorectal cancer cohorts. In relation to OS, multiple LNR values were obtained for the colon and combined cohorts; however, an optimal LNR was not evident in the rectal cancer cohort. Survival patterns according to LNR closely resembled those associated with standard nodal staging.
CONCLUSION: Application of a data-driven approach based on recursive partitioning generates differing lymph node ratios for colon, rectal and combined colorectal cohorts. In each cohort, LNR was similarly prognostic to standard nodal staging with respect to overall and disease-free survival. Overall survival was associated with multiple LNR values, whilst disease-free survival was associated with a single LNR only. These results demonstrate the merits of a data-driven approach for determining lymph node ratios from specific patient cohorts: it enabled generation of the LNRs most associated with particular survival trends, which differed markedly between the colon cancer, rectal cancer and combined cohorts, while yielding survival patterns similar to those observed with standard nodal staging.
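The study derived its cut-points with a classification and regression tree in R. A much-simplified sketch of the same idea, an exhaustive search for the single LNR cut-point that maximizes log-rank separation between the resulting groups, is shown below. This is our own illustration under simplifying assumptions (no tie refinements, one split only), not the authors' R-based implementation; the function names and candidate cut-points are hypothetical:

```python
def logrank_chi2(times_a, events_a, times_b, events_b):
    """Two-sample log-rank chi-square statistic (simplified, no tie refinements).
    times_*: follow-up times; events_*: 1 = event observed, 0 = censored."""
    data = [(t, e, 0) for t, e in zip(times_a, events_a)] + \
           [(t, e, 1) for t, e in zip(times_b, events_b)]
    event_times = sorted({t for t, e, _ in data if e})
    o_minus_e, var = 0.0, 0.0
    for t in event_times:
        n_a = sum(1 for tt, _, g in data if tt >= t and g == 0)  # at risk, group A
        n_b = sum(1 for tt, _, g in data if tt >= t and g == 1)  # at risk, group B
        d_a = sum(1 for tt, e, g in data if tt == t and e and g == 0)
        d_b = sum(1 for tt, e, g in data if tt == t and e and g == 1)
        n, d = n_a + n_b, d_a + d_b
        if n < 2:
            continue
        e_a = d * n_a / n                      # expected events in group A
        o_minus_e += d_a - e_a
        var += d * (n_a / n) * (n_b / n) * (n - d) / (n - 1)
    return (o_minus_e ** 2) / var if var > 0 else 0.0

def best_lnr_cutpoint(lnrs, times, events, candidates):
    """Pick the candidate cut-point that maximizes log-rank group separation."""
    best, best_chi2 = None, -1.0
    for c in candidates:
        lo = [i for i, r in enumerate(lnrs) if r <= c]
        hi = [i for i, r in enumerate(lnrs) if r > c]
        if not lo or not hi:
            continue  # cut-point must split the cohort into two groups
        chi2 = logrank_chi2([times[i] for i in lo], [events[i] for i in lo],
                            [times[i] for i in hi], [events[i] for i in hi])
        if chi2 > best_chi2:
            best, best_chi2 = c, chi2
    return best
```

A full recursive-partitioning implementation would apply this split search recursively and add stopping rules; packages such as R's rpart encapsulate that machinery.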


Subject(s)
Colonic Neoplasms/pathology , Lymph Nodes/pathology , Rectal Neoplasms/pathology , Aged , Colorectal Neoplasms/pathology , Demography , Disease-Free Survival , Female , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Regression Analysis
8.
Osteoarthritis Cartilage ; 21(7): 930-8, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23602982

ABSTRACT

OBJECTIVE: Within the interleukin-1 (IL-1) cytokine family, IL-1 receptor antagonist (IL1RN) gene variants have been associated with radiological severity of knee osteoarthritis (OA) in cross-sectional studies. The present study tested the relation between IL1RN gene variants and progression of knee OA assessed radiographically by change in Kellgren-Lawrence (KL) score over time. DESIGN: 1153 Caucasian adults (age range: 44-89) from the Johnson County Osteoarthritis Project were evaluated for unequivocal radiographic evidence of knee OA at baseline, defined as KL score ≥2, and were re-examined after 4-11 years for radiographic changes typical of OA progression. IL1RN gene variants were tested for association with OA progression and for potential interaction with body mass index (BMI). Other IL-1 gene variations were tested for association with OA progression as a secondary objective. RESULTS: Of 154 subjects with OA at baseline, 88 showed progression at follow-up. Seven IL1RN single nucleotide polymorphisms (SNPs) and one IL-1 receptor SNP were associated with progression. Four IL1RN haplotypes, each occurring in >5% of this population, showed different relationships with progression, including one (rs315931/rs4251961/rs2637988/rs3181052/rs1794066/rs419598/rs380092/rs579543/rs315952/rs9005/rs315943/rs1374281; ACAGATACTGCC) associated with increased progression [odds ratio (OR) 1.91 (95%CI 1.16-3.15); P = 0.012]. Haplotypes associated with progression by KL score were also associated with categorical change in joint space narrowing. BMI was associated with OA progression in subjects carrying a specific IL1RN haplotype, but not in subjects without that haplotype. CONCLUSION: A significantly greater likelihood of radiological progression of knee OA was associated with a commonly occurring IL1RN haplotype that could be tagged by three IL1RN SNPs (rs419598, rs9005, rs315943). Interactions were also observed between IL1RN gene variants and BMI relative to OA progression. 
This suggests that IL1RN gene markers may be useful in stratifying patients for medical management and drug development.


Subject(s)
Haplotypes/genetics; Interleukin 1 Receptor Antagonist Protein/genetics; Osteoarthritis, Knee/genetics; Polymorphism, Single Nucleotide/genetics; Adult; Aged; Aged, 80 and over; Body Mass Index; Female; Follow-Up Studies; Humans; Male; Middle Aged; Osteoarthritis, Knee/diagnostic imaging; Radiography; Risk Factors
9.
Philos Trans A Math Phys Eng Sci ; 370(1981): 5767-82, 2012 Dec 28.
Article in English | MEDLINE | ID: mdl-23166379

ABSTRACT

Artificial spin-ice systems consisting of nanolithographic arrays of isolated nanomagnets are model systems for the study of frustration-induced phenomena. We have recently demonstrated that monopoles and Dirac strings can be directly observed via synchrotron-based photoemission electron microscopy, where the magnetic state of individual nanoislands can be imaged in real space. These experimental results of Dirac string formation are in excellent agreement with Monte Carlo simulations of the hysteresis of an array of dipoles situated on a kagome lattice with randomized switching fields. This formation of one-dimensional avalanches in a two-dimensional system is in sharp contrast to disordered thin films, where avalanches associated with magnetization reversal are two-dimensional. The self-organized restriction of avalanches to one dimension provides an example of dimensional reduction due to frustration. We give simple explanations for the origin of this dimensional reduction and discuss the disorder dependence of these avalanches. We conclude with the explicit demonstration of how these avalanches can be controlled via locally modified anisotropies. Such a controlled start and stop of avalanches will have potential applications in data storage and information processing.

10.
J Dent Res ; 91(7 Suppl): 8S-11S, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22699674

ABSTRACT

Human differences in disease phenotype and treatment responses are well documented. Technological advances now allow healthcare providers to improve the prevention and treatment of chronic diseases by stratifying patient populations. Although personalized medicine has great promise, it has, so far, been primarily applied in oncology. Wider adoption requires changes in the healthcare system and in clinical decision-making, and early applications of personalized medicine appear to require strong clinical utility and sufficient value to drive adoption. Personalized medicine is likely to enter dentistry as patients start to demand it and as new drugs are developed for pathways common to oral diseases.


Subject(s)
Dentistry/trends; Precision Medicine/trends; Biomarkers/analysis; Chronic Disease; Decision Making; Dental Caries/therapy; Forecasting; Genetic Predisposition to Disease/genetics; Humans; Mouth Diseases/therapy; Neoplasms/drug therapy; Periodontitis/classification; Periodontitis/therapy; Pharmacogenetics; Phenotype; Risk Assessment; Temporomandibular Joint Disorders/therapy
11.
Ir J Med Sci ; 181(3): 309-13, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22422079

ABSTRACT

BACKGROUND: The use of radial augmentation index (rAI) as an indicator of vascular disease was investigated in the vascular imaging laboratory of a regional hospital. AIMS: The aim of this study was to investigate whether a correlation exists between ankle-brachial pressure index (ABPI) and rAI in normal subjects, patients with peripheral obstructive arterial disease, and diabetic patients. METHODS: A group of 46 patients and 14 controls had ABPI and rAI measured, and factors affecting rAI were assessed. RESULTS: rAI was found to have a negative correlation with ABPI (Spearman's ρ = -0.513, p < 0.01). There was a significant increase in the rAI scores of diabetic patients compared to normal subjects (the normal median was 64% lower than the diabetic median, p < 0.01) and of peripheral obstructive vascular disease patients compared to normal subjects (normal median 69% lower, p < 0.001). Among the factors assessed, age stood out, with rAI showing a positive correlation with age (Spearman's ρ = 0.68, p < 0.01). CONCLUSIONS: The augmentation index appears to be a significant indicator of cardiovascular disease and may be a useful tool in the diagnosis of vascular pathology.
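The Spearman coefficients quoted above can be reproduced with a short rank-correlation helper (a minimal sketch assuming no tied values; the function name is ours, and production analyses would use a library routine with tie handling):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via 1 - 6*sum(d^2)/(n(n^2-1)).
    Simple version: assumes no tied values in x or y."""
    n = len(x)
    def ranks(v):
        order = sorted(range(n), key=lambda i: v[i])
        r = [0] * n
        for rank, i in enumerate(order, start=1):
            r[i] = rank  # rank 1 = smallest value
        return r
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

Because it operates on ranks, any monotone relationship (such as rAI increasing with age) yields ρ = 1 even when the relationship is nonlinear.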


Subject(s)
Arterial Occlusive Diseases/physiopathology; Diabetes Mellitus/physiopathology; Peripheral Arterial Disease/physiopathology; Radial Artery/physiopathology; Vascular Stiffness; Adult; Aged; Aged, 80 and over; Ankle Brachial Index; Arterial Occlusive Diseases/diagnosis; Female; Humans; Male; Manometry; Middle Aged; Peripheral Arterial Disease/diagnosis; Young Adult
12.
J Anim Sci ; 90(8): 2818-25, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22408085

ABSTRACT

As the dairy industry continues to grow, more dairy calves are available for dairy, beef, and veal purposes. Rearing systems must be highly efficient for this industry to be cost effective, making the evaluation of rearing methods important for establishing the most practical approach. A study was designed and conducted to evaluate effects of housing and feeding systems on performance of neonatal Holstein bull calves. Treatments (2 × 2 factorial arrangement) consisted of: 1) individually housed, bottle-fed (n = 5 bull calves); 2) individually housed, bucket-fed (n = 5 bull calves); 3) group-housed, bottle-fed (n = 5 pens; 4 bull calves/pen); and 4) group-housed, bucket- (trough) fed (n = 5 pens; 3 or 4 bull calves/pen). Feeding treatments began on d 7, when calves had been acclimated to their new environment. Body weight measurements were collected every 7 d, and blood samples were collected on d 0, 28, 55, and 66 for β-hydroxybutyrate (BHBA) concentration as a gross indicator of ruminal development. No housing × feeding interactions or feeding treatment effects were observed (P > 0.10). Average DMI (dry feed plus milk replacer) was increased (P < 0.05) for group-housed vs. individually housed animals after d 41, and final BW was greater (P < 0.05) for group-housed calves compared with individually housed calves. Feed efficiency and ADG, however, remained similar (P > 0.10) for all treatments. Fecal scores (P > 0.26), CV for BW (P > 0.26), and BHBA concentrations (P > 0.14) showed no differences among treatments. Housing system had a greater effect on calf performance than did the milk feeding regimen.


Subject(s)
Animal Husbandry; Animals, Newborn; Cattle/growth & development; Housing, Animal; Aging; Animal Feed; Animal Nutritional Physiological Phenomena; Animal Welfare; Animals; Dairying; Diet/veterinary; Male
13.
Sex Transm Dis ; 38(7): 651-6, 2011 Jul.
Article in English | MEDLINE | ID: mdl-21301384

ABSTRACT

BACKGROUND: In randomized controlled trials of expedited partner therapy (EPT), among patients in the EPT arm, the proportion of partners believed to have taken the medication ranged from 56% to 85%. Little is known about the content of successful and unsuccessful EPT negotiations between patients and their partners. The aim of this study was to describe how patients made decisions about EPT and what they did with the EPT medication packs dispensed to them. METHODS: We performed a qualitative study at the Baltimore City Health Department sexually transmitted disease clinics, which instituted an EPT pilot program in 2007. In-depth interviews were conducted with 31 patients, 1 week to 3 months after they had accepted EPT to bring to their partners. Taped interviews were transcribed verbatim and coded using ATLAS.ti 6 qualitative software. Codes were further combined into more comprehensive themes that were mapped onto the study's main aim. RESULTS: Participants were innovative about how to get medication to their partners and indicated a deep sense of concern and responsibility for their partners' health. On the other hand, participants reported being anxious about the interaction and sometimes felt that they lacked the words to talk with their partners about EPT. Some participants used EPT in unexpected ways, such as giving it to people other than their sex partners or taking it themselves. CONCLUSIONS: Enhancing the counseling that accompanies EPT may improve patients' success in delivering it to their partners.


Subject(s)
Anti-Bacterial Agents/therapeutic use; Chlamydia Infections/drug therapy; Gonorrhea/drug therapy; Sexual Partners; Sexually Transmitted Diseases, Bacterial/drug therapy; Adult; Baltimore; Chlamydia Infections/diagnosis; Chlamydia Infections/prevention & control; Contact Tracing; Female; Gonorrhea/diagnosis; Gonorrhea/prevention & control; Humans; Interviews as Topic; Male; Patient Acceptance of Health Care; Qualitative Research; Sexually Transmitted Diseases, Bacterial/diagnosis; Sexually Transmitted Diseases, Bacterial/prevention & control; Treatment Refusal; Young Adult
14.
PLoS One ; 5(6): e11190, 2010 Jun 23.
Article in English | MEDLINE | ID: mdl-20585646

ABSTRACT

BACKGROUND: Relative to the attention given to improving the quality of and access to maternal health services, the influence of women's socio-economic situation on maternal health care use has received scant attention. The objective of this paper is to examine the relationship between women's economic, educational and empowerment status, introduced as the 3Es, and maternal health service utilization in developing countries. METHODS/PRINCIPAL FINDINGS: The analysis uses data from the most recent Demographic and Health Surveys conducted in 31 countries for which data on all the 3Es are available. Separate logistic regression models are fitted for modern contraceptive use, antenatal care and skilled birth attendance in relation to the three covariates of interest: economic, education and empowerment status, additionally controlling for women's age and residence. We use meta-analysis techniques to combine and summarize results from multiple countries. The 3Es are significantly associated with utilization of maternal health services. The odds of having a skilled attendant at delivery for women in the poorest wealth quintile are 94% lower than those for women in the highest wealth quintile, and almost 5 times higher for women with complete primary education relative to those less educated. The likelihood of using modern contraception and attending four or more antenatal care visits is, respectively, 2.01 and 2.89 times higher for women with complete primary education than for those less educated. Women with the highest empowerment score are between 1.31 and 1.82 times more likely than those with a null empowerment score to use modern contraception, attend four or more antenatal care visits and have a skilled attendant at birth. CONCLUSIONS/SIGNIFICANCE: Efforts to expand maternal health service utilization can be accelerated by parallel investments in programs aimed at poverty eradication (MDG 1), universal primary education (MDG 2), and women's empowerment (MDG 3).
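Combining per-country logistic-regression estimates as described can be sketched with fixed-effect inverse-variance pooling on the log odds-ratio scale (our own minimal illustration; the abstract does not specify the exact meta-analysis model used, and the function name and inputs are hypothetical):

```python
import math

def pooled_or(study_ors, study_cis):
    """Fixed-effect inverse-variance pooling of odds ratios.

    study_ors: per-country odds ratios; study_cis: matching (lower, upper)
    95% confidence intervals. Pooling is done on the log-OR scale, each
    study weighted by 1/SE^2, with SE recovered from the CI width.
    """
    num = den = 0.0
    for or_, (lo, hi) in zip(study_ors, study_cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from 95% CI width
        w = 1.0 / (se * se)                              # inverse-variance weight
        num += w * math.log(or_)
        den += w
    return math.exp(num / den)
```

With equal weights, the pooled estimate reduces to the geometric mean of the study odds ratios; narrower confidence intervals pull the pooled value toward the more precise studies.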


Subject(s)
Maternal Health Services/statistics & numerical data; Power, Psychological; Social Class; Developing Countries; Female; Humans
15.
Domest Anim Endocrinol ; 38(2): 86-94, 2010 Feb.
Article in English | MEDLINE | ID: mdl-19783118

ABSTRACT

To differentiate between the effects of heat stress (HS) and decreased dry matter intake (DMI) on physiological and metabolic variables in growing beef cattle, we conducted an experiment in which a thermoneutral (TN) control group (n=6) was pair fed (PF) to match nutrient intake with heat-stressed Holstein bull calves (n=6). Bulls (4 to 5 mo old, 135 kg body weight [BW]) housed in climate-controlled chambers were subjected to 2 experimental periods (P): (1) TN (18°C to 20°C) and ad libitum intake for 9 d, and (2) HS (cyclical daily temperatures ranging from 29.4°C to 40.0°C) and ad libitum intake, or PF (in TN conditions), for 9 d. During each period, blood was collected daily, and all calves were subjected to an intravenous insulin tolerance test (ITT) on day 7 and a glucose tolerance test (GTT) on day 8. Heat stress reduced DMI by 12%, and by design, PF calves had similar reductions in nutrient intake. During P1, BW gain was similar between environments and averaged 1.25 kg/d, and both HS and PF reduced (P<0.01) average daily gain (-0.09 kg/d) during P2. Compared to PF, HS decreased (P<0.05) basal circulating glucose concentrations (7%) and tended (P<0.07) to increase (30%) plasma insulin concentrations, but neither HS nor PF altered plasma nonesterified fatty acid concentrations. Although there were no treatment differences in P2, both HS and PF increased (P<0.05) plasma urea nitrogen concentrations (75%) compared with P1. Compared with P1, both HS and PF calves had increased (16%) glucose disposal, but compared with PF, HS calves had a greater (67%; P<0.05) insulin response to the GTT. Neither period nor environment acutely affected insulin action, but during P2, calves in both environments tended (P=0.11) to have a blunted overall glucose response to the ITT.
Independent of reduced nutrient intake, HS alters post-absorptive carbohydrate (basal and stimulated) metabolism, characterized primarily by increased basal insulin concentrations and insulin response to a GTT. However, HS-induced reduction in feed intake appears to fully explain decreased average daily gain in Holstein bull calves.


Subject(s)
Adaptation, Physiological/physiology; Cattle/growth & development; Cattle/metabolism; Hot Temperature; Animals; Blood Glucose/analysis; Body Temperature; Cattle/physiology; Diet; Eating; Fatty Acids, Nonesterified/blood; Glucose Tolerance Test/veterinary; Heart Rate; Insulin/blood; Male; Respiration; Weight Gain
16.
J Anim Sci ; 87(12): 4092-100, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19717762

ABSTRACT

Two experiments were conducted to evaluate the effectiveness of zilpaterol hydrochloride (ZH) to enhance growth performance and carcass characteristics in calf-fed Holstein steers. In Exp. 1, Holstein steers (n = 2,311) were fed in a large-pen trial in 2 phases at a commercial feed yard in the desert Southwest. In Exp. 2, a total of 359 steers were fed in a small-pen university study. In Exp. 1 and 2, cattle were implanted with a combination trenbolone acetate-estradiol implant approximately 120 d before slaughter. Cattle were fed ZH for 0, 20, 30, or 40 d before slaughter at a rate of 8.3 mg/kg (DM basis). A 3-d withdrawal was maintained immediately before slaughter. Cattle within an experiment were fed to a common number of days on feed. During the last 120 d before slaughter, ADG was not enhanced by feeding ZH for 20 d (P = 0.33 in Exp. 1, and P = 0.79 in Exp. 2). Gain-to-feed conversion was increased by feeding ZH for all durations in Exp. 1 (P < 0.05). Feeding ZH increased HCW by 9.3 (Exp. 2) to 11.6 (Exp. 1) kg at 20 d compared with the control groups. Across both experiments, dressing percent was increased for all durations of feeding ZH (P < 0.05). Although skeletal maturity score, liver integrity, lean color, fat thickness, and KPH were not affected by feeding ZH for 20 d in either experiment (P ≥ 0.6), LM area was increased for all durations of feeding ZH (P < 0.05). The percentage of carcasses identified as USDA Choice was reduced (P < 0.01) for all durations of feeding ZH in Exp. 1. This effect was not observed in Exp. 2. Holstein steers clearly respond to the beta-agonist ZH, and 20 d of feeding ZH with a 3-d withdrawal significantly increased carcass weights, muscling, and carcass leanness.


Subject(s)
Cattle/growth & development , Growth Substances/pharmacology , Trimethylsilyl Compounds/pharmacology , Animal Feed , Animal Husbandry , Animals , Cattle/physiology , Diet/veterinary , Food Additives/pharmacology , Male , Meat/standards , Nutritive Value , Weight Gain/drug effects
17.
J Anim Sci ; 86(6): 1426-33, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18310491

ABSTRACT

The objective of this study was to compare carcass characteristics of a newly introduced breed, the Waguli (Wagyu x Tuli), with the carcass characteristics of the Brahman breed. Brahman cattle are used extensively in the Southwest of the United States because of their tolerance to adverse environmental conditions. However, Brahman carcasses are discounted according to the height of their humps because of meat tenderness issues. The Waguli was developed in an attempt to obtain a breed that retained the heat tolerance of the Brahman but had meat quality attributes similar to the Wagyu. Twenty-four animals were used. Six steers from each breed were fed a 94% concentrate diet and 6 steers from each breed were fed an 86% concentrate diet. Eight steers, 2 from each group, were harvested after 128 d, after 142 d, and after 156 d on feed. Waguli steers had larger LM, greater backfat thickness, greater marbling scores, and greater quality grades than the Brahman steers (P < 0.05). The Japanese Wagyu breed is well known for its highly marbled and tender meat, and these traits are also present in the Waguli. The Waguli had significantly lower Warner-Bratzler shear force values than the Brahman steers after 7 and 10 d of postmortem aging (P < 0.05); this difference decreased after 14 d postmortem (P = 0.2), when tenderness of the slower-aging Brahman had increased to acceptable levels. Toughness of the Brahman has been associated with high levels of calpastatin in Brahman muscle, and the Waguli LM had significantly less calpastatin activity (P = 0.02) at 0 h postmortem than the Brahman LM. At 0 h postmortem, the total LM calpain activity did not differ between the Brahman and Waguli (P = 0.57). Neither diet nor days on feed had a significant effect on 0-h postmortem calpain or calpastatin activity, or on Warner-Bratzler shear force values.
In conclusion, LM from the Waguli steers had a high degree of marbling, lower shear force values, and low calpastatin activity, all of which are associated with more tender meat.


Subject(s)
Animal Feed , Body Composition/physiology , Calcium-Binding Proteins/metabolism , Calpain/metabolism , Meat/standards , Muscle, Skeletal/metabolism , Adipose Tissue/metabolism , Animal Nutritional Physiological Phenomena , Animals , Breeding , Calcium-Binding Proteins/adverse effects , Cattle , Food Technology , Male , Muscle, Skeletal/anatomy & histology , Phenotype , Random Allocation , Time Factors
18.
Allergy ; 62(5): 514-9, 2007 May.
Article in English | MEDLINE | ID: mdl-17441792

ABSTRACT

BACKGROUND: Cystatin A (CSTA) is a strong candidate gene for atopic dermatitis (AD) because it maps to an AD susceptibility locus on chromosome 3q21 and inhibits Der p 1 and Der f 1, the major house dust mite cysteine proteases and environmental triggers for AD and asthma. OBJECTIVE: To examine the association between polymorphisms in CSTA and AD, and to study their effect on the CSTA mRNA expression level. METHODS: We identified three polymorphisms and characterized the linkage disequilibrium structure of the CSTA gene. All three CSTA polymorphisms were genotyped in 100 AD patients and 203 matched controls. Subsequently, we performed transfection-based RNA stability assays. RESULTS: We found a significant association between the CSTA +344C variant and AD [odds ratio (OR) = 1.91; P = 0.024]. When a further 61 control samples were genotyped, the association with the CSTA +344C allele was strengthened (OR = 2.13; P = 0.006). To test whether the +344 polymorphism affected CSTA transcript stability, the decay rates of RNAs transcribed from the CSTA +344C and CSTA +344T variants were investigated. COS-7 cells were transfected with a pcDNA3.1-CSTA+344C or a pcDNA3.1-CSTA+344T construct and cultured in the presence or absence of actinomycin D. Real-time RT-PCR revealed that the CSTA +344C mRNA is less than half as stable as the CSTA +344T mRNA (P < 0.001). CONCLUSION: These results suggest that the CSTA +344C allele, being associated with unstable mRNA, may fail to protect the skin barrier of AD patients from both exogenous and endogenous proteases.
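The decay-rate comparison above can be sketched as a simple half-life calculation assuming first-order mRNA decay during actinomycin D treatment. The numbers below are illustrative only and are not the study's measurements:

```python
import math

def half_life(frac_remaining: float, hours: float) -> float:
    """Half-life from the fraction of mRNA remaining after `hours` of
    transcription blockade, assuming first-order decay:
    N(t) = N0 * exp(-k * t), so t_half = ln(2) / k."""
    k = -math.log(frac_remaining) / hours
    return math.log(2) / k

# Hypothetical values: if the +344T transcript retains 50% at 4 h while
# the +344C transcript retains only 25%, the C variant's half-life is
# half that of the T variant, i.e. "less than half as stable".
t_half_T = half_life(0.50, 4.0)  # -> 4.0 h
t_half_C = half_life(0.25, 4.0)  # -> 2.0 h
print(t_half_T, t_half_C)
```

Fitting the real-time RT-PCR time course to this exponential model is one common way decay rates of the two constructs would be compared.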


Subject(s)
Amino Acid Substitution/genetics , Cystatins/genetics , Cystatins/immunology , Cysteine Proteinase Inhibitors/immunology , Dermatitis, Atopic/immunology , Pyroglyphidae/immunology , RNA Stability/immunology , RNA, Messenger/metabolism , Animals , COS Cells , Case-Control Studies , Child, Preschool , Chlorocebus aethiops , Cystatin A , Cystatins/chemistry , Cysteine Proteinase Inhibitors/chemistry , Humans , Polymorphism, Single Nucleotide , Pyroglyphidae/genetics , Risk Factors
19.
J Anim Sci ; 85(3): 823-40, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17085724

ABSTRACT

Morbidity and mortality from bovine respiratory disease (BRD) and associated losses in performance and carcass merit continue to plague the beef cattle industry. Several viral/bacterial agents are responsible for BRD, and interactions occur among the agents. Viral agents often predispose animals to bacterial infections, and Mannheimia haemolytica is the most frequently isolated organism in cattle with BRD. Laboratory tests are available to characterize organisms causing BRD using easily obtained nasal swab samples. Testing for persistent infection with bovine viral diarrhea virus can be done by a 2-stage technique using PCR and immunohistochemistry. Preconditioning programs that include preweaning viral vaccination programs along with castration could have a significant influence on decreasing BRD in the cattle feeding industry. Metaphylactic antibiotic programs continue to be effective; however, antibiotic resistance is a public concern, and additional management options (e.g., direct-fed microbials or other compounds with antimicrobial properties) deserve attention. Diets with an increased energy concentration achieved by decreasing the dietary roughage concentration may slightly increase the rate of BRD morbidity; however, these diets also increase ADG, DMI, and G:F compared with lower-energy, greater-roughage diets. The extent to which performance and BRD morbidity are affected by dietary protein concentration needs further study, but low and high protein concentrations should probably be avoided. Several trace minerals (e.g., Cu, Se, and Zn) affect immune function, but the effects of supplementation on performance and immune function in model challenge systems and in field studies are equivocal. Adding vitamin E to receiving diets at pharmacological levels (e.g., >1,000 IU·animal⁻¹·day⁻¹) seems beneficial for decreasing BRD morbidity, but it has little effect on performance.
Given the limited ability to consistently modify immune function and BRD morbidity through dietary manipulations, we recommend that the diets for newly received cattle be formulated to adjust nutrient concentrations for low feed intake and to provide optimal performance during the receiving period.


Subject(s)
Animal Husbandry/methods , Cattle Diseases/prevention & control , Cattle/physiology , Respiratory Tract Infections/veterinary , Stress, Physiological/veterinary , Animals , Respiratory Tract Infections/prevention & control , Stress, Physiological/prevention & control
20.
J Anim Sci ; 84(12): 3421-32, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17093237

ABSTRACT

As cattle mature, the dietary protein requirement, as a percentage of the diet, decreases. Thus, decreasing the dietary CP concentration during the latter part of the finishing period might decrease feed costs and N losses to the environment. Three hundred eighteen medium-framed crossbred steers (315 ± 5 kg) fed 90% (DM basis) concentrate, steam-flaked, corn-based diets were used to evaluate the effect of phase-feeding of CP on performance and carcass characteristics, serum urea N concentrations, and manure characteristics. Steers were blocked by BW and assigned randomly to 36 feedlot pens (8 to 10 steers per pen). After a 21-d step-up period, the following dietary treatments (DM basis) were assigned randomly to pens within a weight block: 1) 11.5% CP diet fed throughout; 2) 13% CP diet fed throughout; 3) switched from an 11.5 to a 10% CP diet when approximately 56 d remained in the feeding period; 4) switched from a 13 to an 11.5% CP diet when 56 d remained; 5) switched from a 13 to a 10% CP diet when 56 d remained; and 6) switched from a 13 to an 11.5% CP diet when 28 d remained. Blocks of cattle were slaughtered when approximately 60% of the cattle within the weight block were visually estimated to grade USDA Choice (average days on feed = 182). Nitrogen volatilization losses were estimated by the change in the N:P ratio of the diet and pen surface manure. Cattle switched from 13 to 10% CP diets with 56 d remaining on feed or from 13 to 11.5% CP with only 28 d remaining on feed had lower (P < 0.05) ADG, DMI, and G:F than steers fed a 13% CP diet throughout. Steers on the phase-feeding regimens had lower (P = 0.05) ADG and DMI during the last 56 d on feed than steers fed a 13% CP diet throughout. Carcass characteristics were not affected by dietary regimen. Performance by cattle fed a constant 11.5% CP diet did not differ from those fed a 13% CP diet. Serum urea N concentrations increased (P < 0.05) with increasing dietary CP concentrations.
Phase-feeding decreased estimated N excretion by 1.5 to 3.8 kg/steer and nitrogen volatilization losses by 3 to 5 kg/steer. The results suggest that modest changes in dietary CP concentration in the latter portion of the feeding period may have relatively small effects on overall beef cattle performance, but that decreasing dietary CP to 10% of DM would adversely affect performance of cattle fed high-concentrate, steam-flaked, corn-based diets.
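The N:P-based volatilization estimate mentioned above relies on P being conserved in manure while N escapes to the air. A minimal sketch of that mass-balance logic, using hypothetical numbers rather than the study's data:

```python
def n_volatilized(n_excreted: float, p_excreted: float,
                  manure_np_ratio: float) -> float:
    """Estimate N volatilization loss (kg/steer) from the drop in the
    N:P ratio between excreted nutrients and recovered pen-surface manure.
    Assumes P is conserved (not volatile), so manure N = P * manure N:P,
    and any N unaccounted for was lost as gas (mainly ammonia)."""
    manure_n = p_excreted * manure_np_ratio
    return n_excreted - manure_n

# Hypothetical example: 20 kg N and 4 kg P excreted per steer (N:P = 5.0);
# if the manure N:P ratio has fallen to 3.75, the manure retains 15 kg N,
# so 5 kg N per steer is estimated to have volatilized.
loss = n_volatilized(20.0, 4.0, 3.75)
print(loss)  # 5.0
```

Because P passes through to the manure essentially intact, it serves as an internal marker, which is why a falling N:P ratio on the pen surface translates directly into an N loss estimate.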


Subject(s)
Cattle/physiology , Dietary Proteins/pharmacology , Feces/chemistry , Nitrogen/analysis , Nitrogen/metabolism , Animal Feed , Animals , Blood Urea Nitrogen , Body Composition/drug effects , Diet/veterinary , Dietary Proteins/metabolism , Male , Time Factors