Results 1 - 10 of 10
1.
J Anim Sci; 97(3): 1158-1170, 2019 Mar 01.
Article in English | MEDLINE | ID: mdl-30590611

ABSTRACT

The objectives of this study were to evaluate the effectiveness and accuracy of monitoring feeding behavior patterns using cumulative summation (CUSUM) procedures to predict the onset of bovine respiratory disease (BRD) in beef cattle. Growing bulls (N = 231) on a 70-d growth and efficiency trial were used in this study. Between days 28 and 38 of the study, 30 bulls were treated for BRD based on observed clinical signs and elevated rectal temperature (>39.5 °C); the remaining bulls (n = 201) were considered healthy. Clinically ill and healthy bulls were used to evaluate the sensitivity and specificity of CUSUM models, with accuracy calculated as the average of sensitivity and specificity. All data were standardized prior to generating CUSUM charts in a daily cumulative manner. Eight univariate CUSUM models were evaluated, including DMI, bunk visit (BV) frequency, BV duration, head down (HD) duration, eating rate, maximal nonfeeding interval (NFI Max), SD of nonfeeding interval (NFI SD), and time to bunk (TTB). Accuracies for detection of BRD were 80.1, 69.4, 72.4, 79.1, 63.7, 64.6, 73.2, and 48.7%, respectively, and average days of detection prior to observed symptoms of BRD were 1.0, 3.2, 3.2, 4.8, 10.2, 2.7, 1.5, and 0.6 d, respectively. Principal component analysis (PCA) of all 8 univariate traits (full model) was used to construct multivariate factors that were similarly monitored with CUSUM. Two reduced multivariate models were also constructed that included the 3 best-performing feeding behavior traits (BV duration, HD duration, NFI SD) with (RBD) and without DMI (RB). Accuracy of the full multivariate model (75.0%) was similar to the best of the univariate models. However, both of the reduced multivariate models (RB and RBD) were more accurate (84.0%) than the full multivariate model. All 3 of the multivariate models signaled (P < 0.05) 2.0 to 2.1 d prior to clinical observation. These results demonstrate that the use of PCA-derived multivariate factors in CUSUM charts was more accurate than the use of univariate CUSUM charts for preclinical detection of BRD. Furthermore, adding DMI to the RB model did not further improve accuracy or the signal day of BRD detection. The use of PCA-based multivariate models to monitor feeding behavior traits should be more robust than relying on univariate trait models for preclinical detection of BRD. Results from this study demonstrate the value of using CUSUM procedures to monitor feeding behavior patterns to more accurately detect BRD prior to clinical symptoms in feedlot cattle.
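For readers unfamiliar with CUSUM charting, the core mechanic is a running sum of standardized deviations that signals when it exceeds a decision interval. Below is a minimal one-sided (lower) tabular CUSUM sketch in Python for a declining trait such as DMI; the reference value k and decision interval h are illustrative defaults, not the parameters used in this study.

```python
# A minimal one-sided (lower) tabular CUSUM sketch for a standardized feeding
# trait such as daily DMI. k and h are illustrative defaults (in SD units),
# not the settings used by the authors.
import numpy as np

def lower_cusum_signal(z, k=0.5, h=4.0):
    """Return the index of the first day the lower CUSUM exceeds h, else None."""
    c = 0.0
    for t, zt in enumerate(z):
        c = max(0.0, c - zt - k)   # accumulates when the trait falls below baseline
        if c > h:
            return t
    return None

rng = np.random.default_rng(0)
dmi = rng.normal(0.0, 1.0, 40)     # 40 d of standardized DMI for one animal
dmi[28:] -= 1.5                    # simulated pre-clinical intake depression
print(lower_cusum_signal(dmi))     # day the chart would flag possible BRD onset
```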


Subject(s)
Cattle Diseases/diagnosis; Eating; Feeding Behavior; Fever/veterinary; Respiratory Tract Diseases/veterinary; Animal Feed; Animals; Cattle; Diet/veterinary; Male; Monitoring, Physiologic/statistics & numerical data; Monitoring, Physiologic/veterinary; Phenotype; Respiratory Tract Diseases/diagnosis; Sensitivity and Specificity; Time Factors
2.
CMAJ; 183(1): 47-53, 2011 Jan 11.
Article in English | MEDLINE | ID: mdl-21135082

ABSTRACT

BACKGROUND: Recent studies have reported a trend toward earlier initiation of dialysis (i.e., at higher levels of glomerular filtration rate) and an association between early initiation and increased risk of death. We examined trends in initiation of hemodialysis within Canada and compared the risk of death between patients with early and late initiation of dialysis. METHODS: The analytic cohort consisted of 25 910 patients at least 18 years of age who initiated hemodialysis, as identified from the Canadian Organ Replacement Register (2001-2007). We defined the initiation of dialysis as early if the estimated glomerular filtration rate was greater than 10.5 mL/min per 1.73 m². We fitted time-dependent proportional-hazards Cox models to compare the risk of death between patients with early and late initiation of dialysis. RESULTS: Between 2001 and 2007, the mean estimated glomerular filtration rate at initiation of dialysis increased from 9.3 (standard deviation [SD] 5.2) to 10.2 (SD 7.1) mL/min per 1.73 m² (p < 0.001), and the proportion of early starts rose from 28% (95% confidence interval [CI] 27%-30%) to 36% (95% CI 34%-37%). Mean glomerular filtration rate was 15.5 (SD 7.7) mL/min per 1.73 m² among those with early initiation and 7.1 (SD 2.0) mL/min per 1.73 m² among those with late initiation. The unadjusted hazard ratio (HR) for mortality with early relative to late initiation was 1.48 (95% CI 1.43-1.54). The HR decreased to 1.18 (95% CI 1.13-1.23) after adjustment for demographic characteristics, serum albumin, primary cause of end-stage renal disease, vascular access type, comorbidities, late referral and transplant status. The mortality differential between early and late initiation per 1000 patient-years narrowed after one year of follow-up but never reversed, and it began widening again after 24 months of follow-up. The differences were significant at 6, 12, 30 and 36 months. INTERPRETATION: In Canada, dialysis is being initiated at increasingly higher levels of glomerular filtration rate. A higher glomerular filtration rate at initiation of dialysis is associated with an increased risk of death that is not fully explained by differences in baseline characteristics.
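The hazard ratios above come from Cox proportional-hazards regression. A minimal sketch with the Python lifelines library follows, fit to simulated data; the covariates, effect sizes, and censoring scheme are invented, and the authors' time-dependent specification is simplified here to a standard Cox fit.

```python
# A minimal Cox proportional-hazards sketch with lifelines on simulated data.
# Cohort, covariates, and effect sizes are invented for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
early = rng.integers(0, 2, n)          # 1 = early start (eGFR > 10.5)
albumin = rng.normal(35.0, 5.0, n)     # serum albumin, g/L (illustrative)
# exponential survival times with a higher simulated hazard for early starts
t = rng.exponential(scale=1.0 / (0.1 * np.exp(0.3 * early)), size=n)
df = pd.DataFrame({"T": np.minimum(t, 5.0),        # censor at 5 years
                   "E": (t < 5.0).astype(int),
                   "early": early, "albumin": albumin})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
print(cph.hazard_ratios_["early"])     # adjusted HR for early vs. late start
```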


Subject(s)
Glomerular Filtration Rate; Kidney Diseases/mortality; Kidney Diseases/therapy; Renal Dialysis; Adult; Aged; Canada; Cohort Studies; Female; Humans; Kidney Diseases/physiopathology; Male; Middle Aged; Patient Selection; Retrospective Studies; Survival Rate; Time Factors; Treatment Outcome
3.
Arch Intern Med; 171(5): 396-403, 2011 Mar 14.
Article in English | MEDLINE | ID: mdl-21059968

ABSTRACT

BACKGROUND: A dramatic increase in the "early start" of dialysis, with an estimated glomerular filtration rate (eGFR) of at least 10 mL/min/1.73 m², has occurred in the United States since at least 1996. Several recent studies have reported a comorbidity-adjusted survival disadvantage of early start of dialysis. The current study examines a relatively "healthy" dialysis cohort to minimize confounding issues and determine whether early initiation of hemodialysis is associated with a survival benefit or harm. METHODS: We examined demographics, year of dialysis initiation, primary etiology of renal failure, and body mass index, hemoglobin, and serum albumin levels in 81,176 nondiabetic, 20- to 64-year-old, in-center incident hemodialysis patients with no reported comorbidity besides hypertension. We compared survival using a piecewise proportional hazards model to estimate covariate-adjusted mortality hazard ratios (HRs) for eGFR at the time of initiation of dialysis. We also performed time-dependent adjusted analysis stratified by initial serum albumin levels lower than 2.5 g/dL, 2.5 to 3.49 g/dL, and 3.5 g/dL or higher (the "healthiest" group [HG]). RESULTS: Unadjusted 1-year mortality by eGFR ranged from 6.8% in the reference group (eGFR <5.0 mL/min/1.73 m²) to 20.1% in the highest eGFR group (≥15.0 mL/min/1.73 m²). Compared with the reference group, the HR for the HG was 1.27 (eGFR, 5.0-9.9 mL/min/1.73 m²), 1.53 (eGFR, 10.0-14.9 mL/min/1.73 m²), and 2.18 (eGFR ≥15.0 mL/min/1.73 m²), and the HR ranged from 1.50 to 3.53 in the first year of dialysis for the early-start group. CONCLUSION: The increased HR during hemodialysis associated with early start in the healthiest group of patients undergoing dialysis indicates that early start of dialysis may be harmful.
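The "piecewise proportional hazards model" named in METHODS can be approximated by estimating a separate hazard ratio within successive follow-up windows, restricted to patients still at risk at each window's start. The sketch below illustrates that idea with lifelines on simulated data; the cutpoints, rates, and single covariate are invented, and the authors' actual model specification is not published in this abstract.

```python
# A piecewise proportional-hazards sketch on simulated data: a separate HR is
# estimated within each follow-up window among patients still at risk at the
# window's start. Cutpoints, rates, and the covariate are invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 2000
early = rng.integers(0, 2, n)                     # 1 = early start
t = rng.exponential(1.0 / (0.15 * np.exp(0.4 * early)), n)
df = pd.DataFrame({"T": np.minimum(t, 3.0),
                   "E": (t < 3.0).astype(int),    # censor at 3 years
                   "early": early})

for a, b in [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0)]:
    risk = df[df["T"] > a].copy()                 # still at risk at window start
    risk["T_w"] = np.minimum(risk["T"], b) - a    # time lived inside the window
    risk["E_w"] = ((risk["E"] == 1) & (risk["T"] <= b)).astype(int)
    fit = CoxPHFitter().fit(risk[["T_w", "E_w", "early"]], "T_w", "E_w")
    print(f"HR for early start, years {a:.0f}-{b:.0f}:",
          round(fit.hazard_ratios_["early"], 2))
```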


Subject(s)
Kidney Failure, Chronic/therapy; Renal Dialysis/methods; Adult; Cohort Studies; Comorbidity; Female; Glomerular Filtration Rate; Humans; Hypertension/mortality; Kidney Failure, Chronic/mortality; Male; Middle Aged; United States; Young Adult
4.
Environ Toxicol Chem; 28(12): 2715-24, 2009 Dec.
Article in English | MEDLINE | ID: mdl-19400597

ABSTRACT

The toxicity of oxytetracycline (OTC) was evaluated in adult grass shrimp, Palaemonetes pugio. Initially, static acute (96-h) toxicity tests were conducted with shrimp exposed to 0 to 1,000 mg/L OTC. A median lethal concentration (LC50) of 683.30 mg/L OTC (95% confidence interval 610.85-764.40 mg/L) was calculated from these tests, along with a lowest-observable-effect concentration of 750 mg/L and a no-observable-effect concentration of 500 mg/L. Moreover, chronic sublethal effects of OTC exposure on the grass shrimp intestinal bacterial population were assessed using doses from 0 to 32 mg/L OTC. Total viable counts in digestive tract content were 5.2 and 1 × 10⁴ colony-forming units per gram of tissue at 0 and 96 h, respectively. Aeromonas hydrophila isolates were the most resistant (27.78%) to OTC exposure. Vibrio alginolyticus showed significant positive growth following exposure to OTC, whereas the abundance of other bacterial species declined over time. A total of 268 bacterial isolates were screened using antibiotic resistance analysis from a library containing 459 isolates. Among the tested isolates from the OTC treatments, 15.4% were resistant to OTC and 84.6% were OTC sensitive. Oxytetracycline was not consistently quantifiable by liquid chromatography-mass spectrometry in shrimp homogenates; the only peak detected was at the 32 mg/L dose of OTC at 96 h. Nevertheless, OTC had a significant biological effect on the bacterial population. Resistance to five other antibiotics (penicillin G, sulfathiazole, trimethoprim, trimethoprim-sulfamethoxazole, and tetracycline) was strongly associated with OTC exposures. The present study indicates that OTC toxicity effects in P. pugio and changes in the shrimp microbial community would be expected only under special circumstances.
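The acute test's endpoint is an LC50 estimated from a concentration-mortality curve. Below is a minimal sketch of that kind of calculation: a two-parameter log-logistic curve fit with SciPy to invented mortality proportions. The paper's raw dose-response counts are not reproduced here, and published LC50s are often derived by probit analysis instead.

```python
# A minimal LC50 sketch: fit a two-parameter log-logistic mortality curve and
# read off the 50% point. The proportions below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([125.0, 250.0, 500.0, 750.0, 1000.0])   # mg/L OTC
dead = np.array([0.00, 0.05, 0.20, 0.60, 0.90])         # proportion dead, 96 h

def loglogistic(x, lc50, slope):
    """Mortality as a log-logistic function of concentration."""
    return 1.0 / (1.0 + (lc50 / x) ** slope)

popt, _ = curve_fit(loglogistic, conc, dead, p0=[600.0, 4.0])
print(f"estimated LC50 ~= {popt[0]:.1f} mg/L")
```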


Subject(s)
Anti-Bacterial Agents/toxicity; Bacteria/drug effects; Oxytetracycline/toxicity; Palaemonidae/microbiology; Water Pollutants, Chemical/toxicity; Animals; Drug Resistance, Bacterial; Oxytetracycline/analysis; Oxytetracycline/metabolism
5.
Pediatr Blood Cancer; 50(1): 98-103, 2008 Jan.
Article in English | MEDLINE | ID: mdl-17610265

ABSTRACT

BACKGROUND: This report evaluated the association between surviving pediatric cancer and receiving a diagnosis of a stress-related mental disorder (SRMD) (i.e., post-traumatic stress disorder (PTSD), acute stress disorder, or an adjustment disorder). PROCEDURE: The dataset comprised a cohort of Medicaid-eligible children, ages birth to 15 years during the baseline years 1996-2001, followed for at least 1 year until age 19 years or the end of 2003. Childhood cancer survivors (N = 390) identified from the SC Central Cancer Registry were frequency-matched within age groups at each baseline year to children with no history of malignancy (N = 1,329). Survival curves and the cumulative incidence of SRMD were estimated using the Kaplan-Meier method. Cox proportional hazards models were used to estimate hazard ratios (HR) and 95% confidence intervals (CI) for pediatric cancer survival and selected covariates. RESULTS: The 8-year incidence of SRMD was 18.6% (95% CI: 12.47, 24.8) among childhood cancer survivors and 7.3% (5.0, 9.6) among children with no history of malignancy, HR = 3.22 (2.17, 4.76). Significant covariates for this group included race, sex, and previous mental disorder, adjusted HR = 3.00 (2.02, 4.45). Significant predictors among the childhood cancer survivors included cancer type, age group, treatment, and previous mental disorder. CONCLUSIONS: Given the potential benefit of interventions for those with prior psychopathology, the fact that children are less likely to verbalize emotional problems, and the detrimental implications of undiagnosed mental disorders, the health evaluations of childhood cancer patients and the follow-up visits for survivors should incorporate assessment for mental disorders, especially SRMD.
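The 8-year cumulative incidence figures come from Kaplan-Meier estimation. The sketch below reproduces that kind of estimate with the lifelines library on simulated event times for a 390-person cohort; the event rate is invented so the result lands near the reported range, and the real analysis worked from registry dates rather than simulated times.

```python
# A minimal Kaplan-Meier sketch with lifelines on simulated event times for a
# 390-person survivor cohort; the event rate is invented for illustration.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
raw = rng.exponential(scale=38.0, size=390)   # years to first SRMD diagnosis
t = np.minimum(raw, 8.0)                      # administrative censoring at 8 y
e = (raw < 8.0).astype(int)

km = KaplanMeierFitter()
km.fit(t, event_observed=e, label="childhood cancer survivors")
print(1.0 - km.survival_function_.iloc[-1, 0])   # ~8-year cumulative incidence
```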


Subject(s)
Mental Disorders/etiology; Neoplasms/psychology; Stress, Psychological/complications; Survivors/psychology; Adolescent; Adult; Child; Child, Preschool; Female; Humans; Infant; Infant, Newborn; Male; Risk Factors
6.
AIDS Patient Care STDS; 21(9): 667-80, 2007 Sep.
Article in English | MEDLINE | ID: mdl-17919094

ABSTRACT

This study examined the relationships among sociodemographic factors, social support, coping, and adherence to antiretroviral therapy (ART) among HIV-positive women with depression. The analyses reported here were limited to the 224 women receiving ART among the 280 women recruited from community-based HIV/AIDS organizations serving rural areas of three states in the southeastern United States. Two indicators of medication adherence were measured: self-report of missed medications and reasons for missed medications in the past month. Descriptive statistics, correlation, and regression analyses were performed to systematically identify sociodemographic, coping, and social support variables that predicted medication adherence. In regression analysis, three variables were significant predictors, together accounting for approximately 30% of the variability in the self-reported reasons for missed medications. Coping focused on managing HIV disease was negatively associated with reasons for missed medications, whereas coping focused on avoidance/denial and the number of children were positively associated. Coping by spiritual activities and focusing on the present mediated the effect of social support on self-reported missed medications. The relationship of the predictor variables to self-reported missed medications was assessed using t tests and logistic regression analysis to determine the odds of self-reported medication adherence. Satisfaction with social support (p = 0.04) and coping focused on managing HIV disease (p = 0.002) were the best positive predictors, whereas the number of children (p = 0.02) was the lone significant negative predictor of medication adherence. The study findings have implications for designing, implementing, and testing interventions based on social support and coping theories for achieving better adherence to HIV medications.
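The adherence odds reported here come from logistic regression. Below is a hedged sketch of that kind of model in statsmodels on simulated data; the predictor names echo the abstract's variables, but the coefficients, distributions, and sample values are invented, not the study's data.

```python
# A hedged logistic-regression sketch in statsmodels on simulated data; all
# numbers are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 224
df = pd.DataFrame({
    "manage_coping": rng.normal(0, 1, n),   # coping focused on managing HIV
    "support": rng.normal(0, 1, n),         # satisfaction with social support
    "children": rng.poisson(1.5, n),        # number of children
})
lin = -0.2 + 0.6 * df["manage_coping"] + 0.4 * df["support"] - 0.3 * df["children"]
df["adherent"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

model = smf.logit("adherent ~ manage_coping + support + children", data=df).fit()
print(np.exp(model.params))                  # odds ratios for adherence
```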


Subject(s)
Adaptation, Psychological; Anti-Retroviral Agents/therapeutic use; Depression/complications; HIV Infections/drug therapy; Patient Compliance/psychology; Social Support; Adult; Cross-Sectional Studies; Female; HIV Infections/complications; HIV Infections/psychology; Humans; Middle Aged; Regression Analysis; Rural Health; Southeastern United States
7.
Qual Life Res; 15(5): 777-89, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16721638

ABSTRACT

The Chronic Illness Quality of Life Ladder (CIQOLL) underwent psychometric testing in a sample of 278 women with HIV disease. The CIQOLL, a self-anchoring striving scale based on Cantril's Ladder, measures seven domains (physical, emotional, financial, family and friends, spiritual well-being, peace of mind, and overall life satisfaction) across four time periods (present, past, future, and life without a diagnosis of HIV). The domains were derived from focus groups with persons with HIV disease. Women with a diagnosis of HIV infection, age 18 or older and residing in rural areas in the southeastern United States, completed questionnaires that measured physical functioning, HIV-related symptom frequency and distress, depressive symptoms, social support, and quality of life. Procedures used to assess reliability included item-item, item-total, and subscale-subscale correlations, and Cronbach's coefficient alpha. Criterion-related (concurrent) validity was assessed by correlating the CIQOLL with HIV symptoms, functional status, and social support. Construct validity was estimated using factor analysis and predictive modeling. Results provide preliminary evidence that the CIQOLL is a reliable and valid scale that may provide meaningful information about persons living with a chronic illness such as HIV disease, especially among low-literacy and unacculturated populations. Additional research is needed to weight the domains, test the sensitivity of the scale to changes over time, and explore the usefulness of discrepancy scores.
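Reliability testing of a multi-domain ladder like the CIQOLL typically centers on Cronbach's coefficient alpha. Below is a small sketch of the alpha computation on simulated ladder ratings, laid out as 278 respondents by 7 domains to match the sample; the rating values and their distribution are invented.

```python
# A small sketch of Cronbach's coefficient alpha on simulated ladder ratings
# (278 respondents x 7 domains); the data are invented for illustration.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix; returns coefficient alpha."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(4)
wellbeing = rng.normal(6.0, 2.0, (278, 1))            # shared latent level
ratings = np.clip(np.rint(wellbeing + rng.normal(0, 1.5, (278, 7))), 1, 10)
print(cronbach_alpha(ratings))                         # internal consistency
```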


Subject(s)
HIV Seropositivity/psychology; Quality of Life/psychology; Rural Population; Adult; Chronic Disease; Female; Humans; Middle Aged; Psychometrics; Southeastern United States; Surveys and Questionnaires
8.
Clin Nurs Res; 14(3): 273-93, 2005 Aug.
Article in English | MEDLINE | ID: mdl-15995155

ABSTRACT

This study examined the relationships among subjective sleep disturbance, depressive symptoms, and adherence to medications among HIV-infected women. HIV-infected women (N = 173) were recruited through community AIDS service organizations throughout South Carolina. Participants completed the Pittsburgh Sleep Quality Index (PSQI), the Center for Epidemiologic Studies Depression Scale (CES-D), and a modified version of the Adult AIDS Clinical Trials Group Adherence Baseline Questionnaire. Women who reported greater sleep disturbance also reported higher levels of depressive symptoms and poorer adherence to their medication regimens. Depression helped to explain the relationship between sleep quality and adherence. Results indicate that assessment and management of sleep disturbance and depressive symptoms in women with HIV disease are important for promoting medication adherence.
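The finding that depression "helped to explain" the sleep-adherence link is a mediation claim, classically probed with the Baron and Kenny regression steps. The sketch below shows that approach on simulated data with statsmodels; the variable names, effect sizes, and the choice of method are assumptions, since the abstract does not state how mediation was assessed.

```python
# A sketch of the Baron-and-Kenny mediation steps on simulated data: the
# sleep coefficient attenuates once depression enters the model. All values
# are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 173
sleep = rng.normal(0, 1, n)                     # PSQI-style disturbance score
depress = 0.5 * sleep + rng.normal(0, 1, n)     # CES-D-style symptoms
adhere = -0.4 * depress - 0.1 * sleep + rng.normal(0, 1, n)
df = pd.DataFrame({"sleep": sleep, "depress": depress, "adhere": adhere})

total = smf.ols("adhere ~ sleep", df).fit().params["sleep"]
direct = smf.ols("adhere ~ sleep + depress", df).fit().params["sleep"]
print(f"total effect {total:.2f} vs. direct effect {direct:.2f}")
```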


Subject(s)
Antiretroviral Therapy, Highly Active; Depression/psychology; HIV Infections/drug therapy; HIV Infections/psychology; Patient Compliance/psychology; Sleep Wake Disorders/psychology; Adolescent; Adult; Aged; Analysis of Variance; Cross-Sectional Studies; Depression/complications; Female; HIV Infections/complications; Humans; Middle Aged; Sleep Wake Disorders/complications; South Carolina
9.
J Assoc Nurses AIDS Care; 16(4): 25-38, 2005.
Article in English | MEDLINE | ID: mdl-16435528

ABSTRACT

Depressive symptoms are a common response to HIV disease, and women appear to be at particularly high risk. The authors report results from a cross-sectional analysis of data collected from 280 rural women with HIV/AIDS in the southeastern United States aimed at identifying risk factors for depressive symptoms. Stress theory provided a framework for identifying potential risk factors. Descriptive statistics, measures of association, and regression analyses were used to systematically identify patterns of risk. The final regression model included 22 factors that together accounted for 69% of the variance in depressive symptoms. The majority of the variance in depressive symptoms was accounted for by only six variables: the frequency of HIV symptoms, recent experiences of sadness/hopelessness, the availability of social support, and the use of three coping strategies (living positively with HIV, isolation/withdrawal, and denial/avoidance). The results suggest a number of intervention strategies for use with rural women with HIV/AIDS.


Subject(s)
Depression/epidemiology; HIV Seropositivity/psychology; Rural Population; Women/psychology; Adaptation, Psychological; Adolescent; Adult; Aged; Cross-Sectional Studies; Denial, Psychological; Depression/diagnosis; Depression/virology; Factor Analysis, Statistical; Female; Grief; HIV Seropositivity/complications; Humans; Life Change Events; Middle Aged; Models, Psychological; Regression Analysis; Risk Factors; Rural Population/statistics & numerical data; Social Support; Socioeconomic Factors; Southeastern United States/epidemiology; Surveys and Questionnaires
10.
J Pediatr Psychol; 29(4): 273-83, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15148350

ABSTRACT

OBJECTIVE: To examine the relationships among maternal perceptions of risk, stress, social support, safety-proofing behaviors, supervision practices, and unintentional injuries to children under 5 years old. METHODS: Household interviews were conducted with 159 mothers who had a preschool-age child. The secondary data were part of a population-based study that collected self-report data and home observational data. Diaries were used to collect prospective injury data. RESULTS: White children whose mothers were unemployed and whose homes needed repair were reported to be at higher injury risk than other children. Children's behavioral characteristics, as well as age older than 2.5 years, also predicted higher injury risk. Maternal social support, stress, and coping variables were not related to injury risk. Maternal risk-perception variables interacted with maternal safety-behavior variables in predicting injury risk. CONCLUSIONS: Childhood injuries are predicted by a set of interrelated sociodemographic, cognitive, behavioral, and child-related factors.


Subject(s)
Intention; Wounds and Injuries/epidemiology; Child; Child, Preschool; Cross-Sectional Studies; Humans; Infant; Prospective Studies; Risk Factors; Social Support