1.
BJS Open ; 4(5): 977-984, 2020 Oct.
Article in English | MEDLINE | ID: mdl-33179875

ABSTRACT

BACKGROUND: RCTs provide the scientific basis upon which treatment decisions are made. To facilitate critical review, it is important that methods and results are reported transparently. The aim of this study was to explore transparency in surgical RCTs with respect to trial registration, disclosure of funding sources, declarations of investigator conflicts and data-sharing. METHODS: This was a cross-sectional review of published surgical RCTs. Ten high-impact journals were searched systematically for RCTs published in the years 2009, 2012, 2015 and 2018. Four domains of transparency were explored: trial registration, disclosure of funding, disclosure of investigator conflicts, and a statement relating to data-sharing. RESULTS: Of 611 RCTs, 475 were eligible for analysis. Some 397 RCTs (83·6 per cent) were registered on a trial database, of which 190 (47·9 per cent) had been registered prospectively. Prospective registration increased over time (26 per cent in 2009, 33·0 per cent in 2012, 54 per cent in 2015, and 72·7 per cent in 2018). Funding disclosure was present in 55·0, 65·0, 69·4 and 75·4 per cent of manuscripts respectively. Conflict of interest disclosure was present in 49·5, 89·1, 94·6 and 98·3 per cent of manuscripts across the same time periods. Data-sharing statements were present in only 15 RCTs (3·2 per cent), 11 of which were published in 2018. CONCLUSION: Trial registration, disclosure of funding and disclosure of investigator conflicts in surgical RCTs have improved markedly over the past 10 years. Disclosure of data-sharing plans is exceptionally low. This may contribute to research waste and represents a target for improvement.


Subject(s)
Conflict of Interest , Disclosure , General Surgery , Periodicals as Topic/statistics & numerical data , Randomized Controlled Trials as Topic/economics , Cross-Sectional Studies , Editorial Policies , Humans , Journal Impact Factor , Periodicals as Topic/standards , Randomized Controlled Trials as Topic/ethics , Research Support as Topic
3.
Article in English | MEDLINE | ID: mdl-37538870

ABSTRACT

Background: Among ESRD patients, obesity may improve survival on dialysis but decreases the likelihood of transplantation; as such, obesity prevalence may directly affect growth of the dialysis population. Objective: The objective of this study was to assess BMI trends in the ESRD population as compared to the general population. Materials and Methods: Incident adult ESRD patients were identified from the United States Renal Data System from 01/01/1995-12/31/2010 (n=1,458,350). Data from the Behavioral Risk Factor Surveillance System (n=4,303,471) represented the US population. Trends in BMI and in obesity classes I (BMI of 30-34.9), II (BMI of 35-39.9), and III (BMI ≥ 40) were examined by year of dialysis initiation. Trends in BMI slope were compared between the ESRD and US populations using linear regression. Results: Mean BMI of ESRD patients in 1995 was 25.2 as compared to 29.4 in 2010, a 16.7% increase, while the US population's mean BMI increased from 25.3 to 27.2, a 7.5% increase. The BMI increase among the ESRD population was significantly more rapid than among the US population (β = 0.16, 95% CI: 0.14-0.18, p<0.001). Conclusions and Recommendations: Mean BMI among the ESRD population is increasing more rapidly than that of the US population. Given decreased access to kidney transplantation among ESRD patients with obesity, future research should be directed at controlling healthcare expenditures by identifying strategies to address the obesity epidemic among the US ESRD population.
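The slope comparison described above can be illustrated with an ordinary least-squares model containing a population-by-year interaction term, whose coefficient plays the role of the reported β. A minimal sketch in Python, assuming statsmodels and invented yearly mean-BMI values (not the study's data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical yearly mean BMI for each population, 1995-2010 (illustrative only)
years = np.arange(1995, 2011)
df = pd.DataFrame({
    "year": np.tile(years, 2),
    "group": ["esrd"] * len(years) + ["us"] * len(years),
    "mean_bmi": np.concatenate([
        np.linspace(25.2, 29.4, len(years)),  # ESRD trend endpoints from the abstract
        np.linspace(25.3, 27.2, len(years)),  # US trend endpoints from the abstract
    ]),
})

# The year:group interaction coefficient estimates the extra BMI gained per
# year in the ESRD population relative to the US population.
model = smf.ols("mean_bmi ~ year * C(group, Treatment('us'))", data=df).fit()
print(model.params)
```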

4.
Am J Transplant ; 17(12): 3114-3122, 2017 Dec.
Article in English | MEDLINE | ID: mdl-28696079

ABSTRACT

Excellent outcomes have been demonstrated among select HIV-positive kidney transplant (KT) recipients with well-controlled infection, but to date, no national study has explored outcomes among HIV+ KT recipients by antiretroviral therapy (ART) regimen. Intercontinental Marketing Services (IMS) pharmacy fills (1/1/01-10/1/12) were linked with Scientific Registry of Transplant Recipients (SRTR) data. A total of 332 recipients with pre- and posttransplantation fills were characterized by ART regimen at the time of transplantation as protease inhibitor (PI)-based or non-PI-based (88 PI vs. 244 non-PI). Cox proportional hazards models were adjusted for recipient and donor characteristics. Comparing recipients by ART regimen, there were no significant differences in age, race, or HCV status. Recipients on PI-based regimens were significantly more likely to have an Estimated Post Transplant Survival (EPTS) score of >20% (70.9% vs. 56.3%, p = 0.02) than those on non-PI regimens. On adjusted analyses, PI-based regimens were associated with a 1.8-fold increased risk of allograft loss (adjusted hazard ratio [aHR] 1.84, 95% confidence interval [CI] 1.22-2.77, p = 0.003), with the greatest risk observed in the first posttransplantation year (aHR 4.48, 95% CI 1.75-11.48, p = 0.002), and a 1.9-fold increased risk of death as compared to non-PI regimens (aHR 1.91, 95% CI 1.02-3.59, p = 0.05). These results suggest that whenever possible, recipients should be converted to a non-PI regimen prior to kidney transplantation.
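A minimal sketch of this kind of adjusted Cox model, assuming the lifelines package and a toy data frame whose column names and records are invented for illustration (the study used IMS-SRTR linked data):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy cohort: one row per recipient (all values fabricated)
df = pd.DataFrame({
    "years_to_graft_loss": [1.2, 5.0, 3.4, 0.8, 7.1, 2.6, 4.3, 6.0],
    "graft_lost":          [1,   0,   1,   0,   0,   1,   1,   0],
    "pi_based_art":        [1,   0,   1,   1,   0,   0,   1,   0],  # 1 = PI-based regimen
    "recipient_age":       [47,  39,  52,  64,  36,  50,  38,  41],
    "donor_age":           [30,  55,  41,  28,  60,  35,  45,  50],
})

# Fit a Cox proportional hazards model adjusted for the other covariates;
# exp(coef) for pi_based_art is the adjusted hazard ratio for graft loss.
cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_graft_loss", event_col="graft_lost")
cph.print_summary()
```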


Subject(s)
Anti-Retroviral Agents/pharmacology , Graft Rejection/mortality , HIV Infections/complications , Kidney Transplantation/methods , Postoperative Complications/mortality , Protease Inhibitors/pharmacology , Transplant Recipients , Adult , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/drug therapy , Graft Rejection/etiology , Graft Survival , HIV Infections/drug therapy , HIV Infections/virology , HIV-1/drug effects , Humans , Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/surgery , Kidney Function Tests , Male , Middle Aged , Prognosis , Risk Factors , Survival Rate
5.
Br J Surg ; 104(6): 734-741, 2017 May.
Article in English | MEDLINE | ID: mdl-28218394

ABSTRACT

BACKGROUND: Evidence supporting the implementation of novel surgical devices is unstandardized, despite recommendations for assessing novel innovations. This study aimed to determine the proportion of novel implantable devices used in gastrointestinal surgery that are supported by evidence from RCTs. METHODS: A list of novel implantable devices placed intra-abdominally during gastrointestinal surgery was produced. Systematic searches were performed for all devices via PubMed and clinical trial registries. The primary outcome measure was the availability of at least one published RCT for each device. Published RCTs were appraised using the Cochrane tool for assessing risk of bias. RESULTS: A total of 116 eligible devices were identified (implantable mesh 42, topical haemostatics 22, antiadhesion barriers 10, gastric bands 8, suture and staple-line reinforcement 7, artificial sphincters 5, other 22). One hundred and twenty-eight published RCTs were found for 33 of 116 devices (28·4 per cent). Most were assessed as having a high risk of bias, with only 12 of 116 devices (10·3 per cent) supported by a published RCT considered to be at low risk of bias. A further 95 ongoing and 23 unpublished RCTs were identified for 42 of 116 devices (36·2 per cent), but many (64 of 116, 55·2 per cent) had no evidence from published, ongoing or unpublished RCTs. The highest stage of innovation according to the IDEAL Framework was stage 1 for 11 devices, stage 2a for 23 devices, stage 2b for one device and stage 3 for 33 devices. The remaining 48 devices had no relevant clinical evidence. CONCLUSION: Only one in ten novel implantable devices available for use in gastrointestinal surgical practice is supported by high-quality RCT evidence.


Subject(s)
Digestive System Surgical Procedures/instrumentation , Prostheses and Implants , Cross-Sectional Studies , Diffusion of Innovation , Evidence-Based Medicine , Humans , Prospective Studies , Randomized Controlled Trials as Topic , Review Literature as Topic
6.
Am J Transplant ; 17(1): 173-179, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27305590

ABSTRACT

Excellent outcomes have been demonstrated in primary human immunodeficiency virus (HIV)-positive (HIV+) kidney transplant recipients, but a subset will lose their graft and seek retransplantation (re-KT). To date, no study has examined outcomes among HIV+ re-KT recipients. We studied risk for death and graft loss among 4149 (22 HIV+ vs. 4127 HIV-negative [HIV-]) adult re-KT recipients reported to the Scientific Registry of Transplant Recipients (SRTR) (2004-2013). Compared to HIV- re-KT recipients, HIV+ re-KT recipients were more commonly African American (63.6% vs. 26.7%, p < 0.001), infected with hepatitis C (31.8% vs. 5.0%, p < 0.001) and had longer median time on dialysis (4.8 years vs. 2.1 years, p = 0.02). There were no significant differences in length of time between the primary and re-KT events by HIV status (1.5 years vs. 1.4 years, p = 0.52). HIV+ re-KT recipients experienced a 3.11-fold increased risk of death (adjusted hazard ratio [aHR]: 3.11, 95% confidence interval [CI]: 1.82-5.34, p < 0.001) and a 1.96-fold increased risk of graft loss (aHR: 1.96, 95% CI: 1.14-3.36, p = 0.01) compared to HIV- re-KT recipients. Re-KT among HIV+ recipients was associated with increased risk for mortality and graft loss. Future research is needed to determine if a survival benefit is achieved with re-KT in this vulnerable population.


Subject(s)
Graft Rejection/mortality , HIV Infections/mortality , Kidney Failure, Chronic/mortality , Kidney Transplantation/mortality , Postoperative Complications/mortality , Reoperation , Adult , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/etiology , Graft Survival , HIV Infections/surgery , HIV Infections/virology , HIV-1/isolation & purification , Humans , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/virology , Kidney Function Tests , Kidney Transplantation/adverse effects , Male , Middle Aged , Prognosis , Risk Factors , Transplant Recipients
7.
Am J Transplant ; 16(8): 2377-83, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27140837

ABSTRACT

For some patient subgroups, human immunodeficiency virus (HIV) infection has been associated with worse outcomes after kidney transplantation (KT); potentially modifiable factors may be responsible. The study goal was to identify factors that predict a higher risk of graft loss among HIV-positive KT recipients compared with a similar transplant among HIV-negative recipients. In this study, 82 762 deceased donor KT recipients (HIV positive: 526; HIV negative: 82 236) reported to the Scientific Registry of Transplant Recipients (SRTR) (2001-2013) were studied by interaction term analysis. Compared to HIV-negative recipients, hepatitis C virus (HCV) coinfection amplified the risk of graft loss 2.72-fold among HIV-positive KT recipients (adjusted hazard ratio [aHR]: 2.72, 95% confidence interval [CI]: 1.75-4.22, p < 0.001). Forty-three percent of the excess risk was attributable to the interaction between HIV and HCV (attributable proportion of risk due to the interaction [AP]: 0.43, 95% CI: 0.23-0.63, p = 0.02). Among HIV-positive recipients with more than three HLA mismatches (MMs), risk was amplified 1.80-fold compared to HIV-negative recipients (aHR: 1.80, 95% CI: 1.31-2.47, p < 0.001); 42% of the excess risk was attributable to the interaction between HIV and more than three HLA MMs (AP: 0.42, 95% CI: 0.24-0.60, p = 0.01). High-risk (HIV-positive/HCV-positive with more than three HLA MMs) recipients had a 3.86-fold increased risk compared to low-risk (HIV-positive/HCV-negative with three or fewer HLA MMs) recipients (aHR: 3.86, 95% CI: 2.37-6.30, p < 0.001). Avoidance of more than three HLA MMs in HIV-positive KT recipients, particularly among coinfected patients, may mitigate the increased risk of graft loss associated with HIV infection.
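The attributable proportion due to interaction reported above is conventionally computed from the relative excess risk due to interaction (RERI). A short sketch with invented hazard ratios (not the paper's values), assuming the usual Rothman formulation:

```python
def attributable_proportion(hr_both: float, hr_a_only: float, hr_b_only: float) -> float:
    """AP due to additive interaction between exposures A and B.

    All hazard ratios are relative to the doubly unexposed group.
    RERI = HR11 - HR10 - HR01 + 1; AP = RERI / HR11.
    """
    reri = hr_both - hr_a_only - hr_b_only + 1.0
    return reri / hr_both

# e.g. HIV+/HCV+ vs. HIV+ alone vs. HCV+ alone (illustrative numbers)
print(attributable_proportion(hr_both=4.0, hr_a_only=1.4, hr_b_only=1.9))  # 0.425
```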


Subject(s)
Graft Rejection/prevention & control , HIV Infections/surgery , Hepatitis C/surgery , Kidney Failure, Chronic/surgery , Kidney Transplantation/standards , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Survival , HIV Infections/complications , HIV-1/isolation & purification , Hepacivirus/isolation & purification , Hepatitis C/complications , Histocompatibility Testing , Humans , Kidney Failure, Chronic/complications , Kidney Function Tests , Male , Middle Aged , Prognosis , Risk Factors
8.
Am J Transplant ; 15(8): 2096-104, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25773499

ABSTRACT

Excellent outcomes among HIV+ kidney transplant (KT) recipients have been reported by the NIH consortium, but it is unclear if experience with HIV+ KT is required to achieve these outcomes. We studied associations between experience measures and outcomes in 499 HIV+ recipients (SRTR data 2004-2011). Experience measures examined included: (1) center-level participation in the NIH consortium; (2) KT experiential learning curve; and (3) transplant era (2004-2007 vs. 2008-2011). There was no difference in outcomes among centers early in their experience (first 5 HIV+ KT) compared to centers having performed >6 HIV+ KT (graft survival [GS] adjusted hazard ratio [aHR]: 1.05, 95% CI: 0.68-1.61, p = 0.82; patient survival [PS] aHR: 0.93; 95% CI: 0.56-1.53, p = 0.76), and participation in the NIH study was not associated with any better outcomes (GS aHR: 1.08, 95% CI: 0.71-1.65, p = 0.71; PS aHR: 1.13; 95% CI: 0.68-1.89, p = 0.63). Transplant era was strongly associated with outcomes; HIV+ KTs performed in 2008-2011 had a 38% lower risk of graft loss (aHR: 0.62; 95% CI: 0.42-0.92, p = 0.02) and a 41% lower risk of death (aHR: 0.59; 95% CI: 0.39-0.90, p = 0.01) than those performed in 2004-2007. Outcomes after HIV+ KT have improved over time, but center-level experience or consortium participation is not necessary to achieve excellent outcomes, supporting continued expansion of HIV+ KT in the US.


Subject(s)
HIV Infections/surgery , Kidney Transplantation , Adolescent , Adult , Aged , Humans , Middle Aged , United States , Young Adult
9.
Equine Vet J ; 41(9): 883-8, 2009 Dec.
Article in English | MEDLINE | ID: mdl-20383986

ABSTRACT

REASON FOR PERFORMING STUDY: To improve efficiency at the farm level, a better understanding of how farm management factors impact reproductive performance is important. OBJECTIVE: To assess reproductive efficiency and effectiveness among Thoroughbred mares in central Kentucky. METHODS: A cohort of 1011 mares on 13 farms in central Kentucky was followed during the 2004 mating and 2005 foaling season. Information on farm level practices was collected via interviews with farm managers. Reproductive records were collected for each mare mated to obtain information on mare characteristics. The influence of mare age and status (maiden, foaling, barren) on Days 15 and 40 post mating pregnancy rates, foaling rates and total effective length of the mating season was assessed. The influence of stallion book size on reproductive performance measures was also examined. RESULTS: Per season pregnancy rates on Days 15 and 40 post mating and live foal rate were 92.1, 89.3 and 78.3%, respectively. Per cycle rates for the same time periods were 64.0, 58.3 and 50.8%. There were no significant associations between stallion book size and reproductive performance outcomes. The mean ± s.d. interval from the beginning of the mating season to the last mating of the mare was 36.5 ± 26.1 days. CONCLUSIONS: Mare age had a significant impact on efficiency of becoming pregnant, maintaining pregnancy and producing a live foal. Overall, fertility did not decrease among stallions with the largest book sizes. Total interval length of the mating season can be reduced if managers ensure maiden and barren mares are mated at the beginning of the season and foaling mares are mated at the earliest oestrus after acceptable uterine involution has been achieved. POTENTIAL RELEVANCE: Measures identified in the study can be used by owners, farm managers and veterinarians to improve mare reproductive performance and identify parameters to assist with the implementation of effective culling practices.
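The per-season and per-cycle rates above are linked by compounding over successive oestrous cycles. Assuming, simplistically, independent cycles with a constant per-cycle probability, a sketch:

```python
def per_season_rate(per_cycle: float, n_cycles: int) -> float:
    """Probability of pregnancy within n_cycles at a constant per-cycle rate."""
    return 1.0 - (1.0 - per_cycle) ** n_cycles

# With the reported 64.0% Day-15 per-cycle rate, three cycles already exceed
# the 92.1% per-season figure, consistent with most mares conceiving early.
print(round(per_season_rate(0.64, 3), 3))  # 0.953
```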


Subject(s)
Horses/physiology , Pregnancy Rate , Pregnancy, Animal/physiology , Animals , Female , Kentucky , Male , Pregnancy
10.
Equine Vet J ; 41(9): 889-94, 2009 Dec.
Article in English | MEDLINE | ID: mdl-20383987

ABSTRACT

REASON FOR PERFORMING STUDY: There have been no studies reporting the impact of reproductive efficiency and mare financial value on economic returns. OBJECTIVE: To explore the economic consequences of differences in reproductive efficiency over time in the Thoroughbred mare. METHODS: Complete production records for 1176 mares were obtained. Production history and drift in foaling date were calculated. Multiple logistic regression was used to identify factors influencing the probability of producing a registered foal in 2005. The 'net present value' and 'internal rate of return' were calculated for economic scenarios involving different initial mare financial values, levels of reproductive efficiency, and durations of investment. RESULTS: Among mares that did not produce a foal every year (63%), the mean time before failing to produce a registered foal was 3.4 years. The majority of mares drifted later in their foaling dates in subsequent foaling seasons. Increasing mare age, foaling after 1st April, needing to be mated multiple times during the season, and producing a lower number of foals in continuous sequence during previous years decreased the probability of producing a registered foal. Over a 7 year investment period, live foals must be produced in all but one year to yield a positive financial return. Profitability was highest among mares of greatest financial value. CONCLUSIONS: Mares are long-term investments due to the extended period before there is a return on the investment. Improving our understanding of mare, stallion and management factors that affect the likelihood of producing a live foal is critical to ensuring a positive financial return. Additional work is needed to test the robustness of the study's conclusions when the cost and revenue assumptions are varied. POTENTIAL RELEVANCE: This information can assist in assessing mare profitability and developing management strategies to maximise profitability.
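The 'net present value' and 'internal rate of return' named above can be illustrated with the numpy-financial package. The purchase price, annual costs, foal revenues and discount rate below are invented for the example, not the paper's assumptions:

```python
import numpy_financial as npf

purchase_price = 100_000                  # initial mare value (year-0 outflow)
annual_cost = 25_000                      # board, veterinary care, stud fees, etc.
foal_revenue = [0, 60_000, 60_000, 0, 60_000, 60_000, 60_000]  # two no-foal years

# Net cash flow per year over a 7-year investment horizon (index 0 = purchase)
cash_flows = [-purchase_price] + [r - annual_cost for r in foal_revenue]

print("NPV at 8%:", round(npf.npv(0.08, cash_flows)))  # negative: two missed foals
print("IRR:", round(npf.irr(cash_flows), 4))
```

Consistent with the paper's conclusion, missing more than one foal year in this hypothetical scenario pushes the NPV below zero.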


Subject(s)
Horses/physiology , Pregnancy, Animal/physiology , Animal Husbandry/economics , Animals , Female , Kentucky , Pregnancy , Pregnancy Outcome/veterinary , Pregnancy Rate
11.
J Anim Sci ; 85(5): 1144-55, 2007 May.
Article in English | MEDLINE | ID: mdl-17264235

ABSTRACT

Recent studies associate obesity and insulin resistance in horses with development of abnormal reproductive function and debilitating laminitis. The factors contributing to insulin resistance in obese horses are unknown. However, human studies provide evidence that elevated inflammatory cytokines such as tumor necrosis factor alpha (TNFalpha), IL1, and IL6 play direct roles in development of obesity-associated insulin resistance. Thus, inflammation may be a key link between obesity and insulin resistance in horses. The aim of the current investigation was to examine possible relationships between obesity, inflammatory cytokines, and insulin sensitivity (IS) in the horse. Age was recorded, and body condition score (BCS) and percent body fat (% FAT) were determined as measures of obesity in 60 mares. In addition, blood mRNA expression of IL1, IL6, and TNFalpha and circulating concentrations of TNFalpha protein (TNFp) were determined in each mare. Finally, fasting insulin concentrations were measured, and IS was determined using the hyperinsulinemic euglycemic clamp. Significant correlations between several variables provided evidence for the design of 4 population regression models to estimate relationships between measures of obesity, inflammatory factors, and IS in the sample population. The results of these analyses revealed that IS decreased as BCS and % FAT increased (P < 0.001) in the sample population. Additionally, increased IL1 (P < 0.05) and TNFp (P < 0.01) were associated with decreased IS. However, increased TNFalpha (P < 0.001) was associated with decreased IS only in mares 20 yr of age and older. Increased BCS and % FAT were associated with increased expression of TNFalpha (P = 0.053) and IL1 (P < 0.05), and increased TNFp (P < 0.05). Surprisingly, increased BCS and % FAT were associated with decreased IL6 expression (P = 0.05) in mares <20 yr of age. Finally, evaluation of the influence of obesity and inflammatory cytokines on IS within the same model suggested that BCS and % FAT (P < 0.001) with TNFalpha [mRNA (P = 0.07) and protein (P < 0.05)] are inversely associated with IS independently of one another. Combined, these results provide the first evidence associating obesity with increased inflammatory factors in the horse. Furthermore, the results suggest that an interrelationship exists among obesity, inflammatory cytokines, and IS in the horse and emphasize the need for further studies to elucidate the nature of these relationships.


Subject(s)
Cytokines/metabolism , Horse Diseases/metabolism , Horses/physiology , Inflammation/metabolism , Insulin Resistance/physiology , Obesity/veterinary , Aging , Animals , Body Composition/physiology , Female , Gene Expression Regulation , Obesity/metabolism , Obesity/physiopathology , RNA/metabolism , Regression Analysis
12.
J Dent Res ; 81(12): 860-5, 2002 Dec.
Article in English | MEDLINE | ID: mdl-12454103

ABSTRACT

Dental care can occur within or outside the formal health-care system. We hypothesized that certain subject characteristics would partly explain one type of dental self-care, non-professional extractions. A representative sample of diverse groups of dentate adults was studied. In-person interviews and clinical examinations were conducted at baseline, 24, 48, and 72 months, with semi-annual telephone interviews in between. Of 699 participants, 291 (42%) reported loss of at least one tooth, of whom 42 (14% of those with tooth loss) reported having lost the tooth at a place other than a health-care facility. Ninety-four percent of non-professionally lost teeth were self-extracted; relatives extracted the remainder. Fifty-eight percent of these teeth were deliberately removed; the remainder came out while subjects were eating or brushing their teeth, or due to injury. Attachment loss and mobility at the previous examination were consistent with the occurrence of non-professional extraction. The incidence was substantial and persisted throughout follow-up.


Subject(s)
Attitude to Health , Self Care/psychology , Tooth Extraction/psychology , Tooth Loss/epidemiology , Aged , Analysis of Variance , Dental Care/psychology , Dental Care/statistics & numerical data , Family , Humans , Incidence , Interviews as Topic , Longitudinal Studies , Middle Aged , Periodontal Attachment Loss/epidemiology , Regression Analysis , Sampling Studies , Self Care/statistics & numerical data , Surveys and Questionnaires , Tooth Extraction/statistics & numerical data
13.
Ann Rheum Dis ; 61(11): 994-9, 2002 Nov.
Article in English | MEDLINE | ID: mdl-12379522

ABSTRACT

OBJECTIVE: To determine whether rheumatoid arthritis (RA) is associated with excess mortality among older women. METHODS: RA associated mortality was examined in a prospective cohort study that was started in 1986, and included 31 336 women aged 55-69 years without a history of RA at baseline. Up to 1997, 158 cases of RA were identified and validated against medical records. The relative risk (RR) and 95% confidence interval (CI) were calculated as measures of association between RA onset and subsequent mortality (overall and cause-specific) using Cox proportional hazards regression. RESULTS: Compared with non-cases, women developing RA during follow up had a significantly increased mortality risk (RR=1.52; 95% CI 1.05 to 2.20). Mortality was higher among rheumatoid factor (RF) positive cases (RR=1.90; 95% CI 1.24 to 2.92) than among RF negative cases (RR=1.00; 95% CI 0.45 to 1.99). There were trends towards increased proportions of RA related deaths from infection (RR=3.61; 95% CI 0.89 to 14.69) and circulatory disease (RR=1.46; 95% CI 0.76 to 2.81) but not malignancy (RR=0.97; 95% CI 0.46 to 2.04). CONCLUSIONS: RA was associated with significantly increased mortality in a cohort of older women, and the association appeared to be restricted to those with RF positive disease.


Subject(s)
Arthritis, Rheumatoid/mortality , Aged , Female , Follow-Up Studies , Health Surveys , Humans , Iowa/epidemiology , Middle Aged , Proportional Hazards Models , Prospective Studies , Risk Assessment , Survival Rate
14.
Circulation ; 104(13): 1489-93, 2001 Sep 25.
Article in English | MEDLINE | ID: mdl-11571241

ABSTRACT

BACKGROUND: Patients with ischemic left ventricular (LV) dysfunction are at high risk of sudden death. However, no benefit from prophylactic defibrillator therapy was observed in a group of patients with LV dysfunction undergoing CABG (CABG Patch trial). Thus, the effect of CABG on future risk of sudden death in patients with LV dysfunction is of considerable interest. METHODS AND RESULTS: Mortality and modes of death in 5410 patients with ischemic LV dysfunction who were enrolled in the Studies of Left Ventricular Dysfunction (SOLVD) trials were evaluated. Outcomes of patients with (n=1870, 35%) versus without (n=3540) history of prior CABG were compared, and stratification by baseline ejection fraction (EF) values (<0.25, 0.25 to 0.30, and >0.30) was performed. Prior CABG was associated with a 25% (95% CI, 15% to 36%) reduction in risk of death and a 46% (95% CI, 30% to 58%) reduction in risk of sudden death independent of EF and severity of heart failure symptoms. As baseline EF declined, absolute reduction in risk of sudden death with prior CABG increased (P<0.01). No alteration in risk of death from progressive heart failure was observed with prior CABG. When these results were applied to a group of patients with LV dysfunction who had not undergone prior surgery (Coronary Artery Surgery Study Registry), predicted annual rates of death (8.2%) and sudden death (2.4%) were similar to those observed in the CABG Patch trial (7.9% and 2.3%, respectively). CONCLUSIONS: In patients with ischemic LV dysfunction, prior CABG is associated with a significant independent reduction in mortality. These results appear to account for the lack of benefit from defibrillator therapy in the CABG Patch trial.


Subject(s)
Coronary Artery Bypass , Myocardial Ischemia/mortality , Risk Assessment/statistics & numerical data , Ventricular Dysfunction, Left/mortality , Adult , Aged , Aged, 80 and over , Death, Sudden, Cardiac/etiology , Female , Humans , Male , Middle Aged , Myocardial Ischemia/complications , Preoperative Care , Ventricular Dysfunction, Left/complications
15.
Am J Respir Crit Care Med ; 163(6): 1331-7, 2001 May.
Article in English | MEDLINE | ID: mdl-11371397

ABSTRACT

There is considerable variability in the clinical course of disease in cystic fibrosis (CF). Although currently unidentified modifier genes might explain some of this heterogeneity, other factors are probably contributory. Socioeconomic status (SES) is an important predictor of health status in many chronic polygenic diseases, but its role in CF has not been systematically evaluated. We performed a historical cohort analysis of pediatric CF patients in the United States using National Cystic Fibrosis Foundation Patient Registry (NCFPR) data for 1986 to 1994, and used Medicaid status as a proxy for low SES. The adjusted risk of death was 3.65 times higher (95% confidence interval [CI]: 3.03 to 4.40) for Medicaid patients than for those not receiving Medicaid. The percent predicted FEV(1) of surviving Medicaid patients was 9.1% lower (95% CI: 6.9 to 11.2). Medicaid patients were 2.19 times more likely to be below the 5th percentile for weight (95% CI: 1.91 to 2.51) and 2.22 times more likely to be below the 5th percentile for height (95% CI: 1.95 to 2.52) than were non-Medicaid patients. Medicaid patients were 1.60 times more likely to require treatment for a pulmonary exacerbation (95% CI: 1.29 to 1.98). There was no difference in the number of outpatient clinic visits for Medicaid and non-Medicaid patients. We conclude that low SES is associated with significantly poorer outcomes in children with CF. Barriers in access to specialty health care do not seem to explain this difference. Further study is indicated to determine what adverse environmental factors might cluster in CF patients of low SES to cause worse outcomes.


Subject(s)
Cystic Fibrosis/economics , Cystic Fibrosis/mortality , Insurance, Health/economics , Medicaid/economics , Medically Uninsured/statistics & numerical data , Poverty/economics , Treatment Outcome , Cause of Death , Child , Cluster Analysis , Cystic Fibrosis/complications , Cystic Fibrosis/therapy , Female , Forced Expiratory Volume , Foundations , Humans , Longitudinal Studies , Lung Diseases/diagnosis , Lung Diseases/etiology , Male , Morbidity , Proportional Hazards Models , Registries , Residence Characteristics/statistics & numerical data , Risk Factors , Survival Analysis , United States/epidemiology , Vital Capacity
16.
NMR Biomed ; 14(1): 12-8, 2001 Feb.
Article in English | MEDLINE | ID: mdl-11252036

ABSTRACT

Human immunodeficiency virus (HIV) infection of the brain causes a complex cascade of cellular events involving several different cell types that eventually leads to neuronal cell death and the manifestation of the AIDS-associated dementia complex (ADC). At autopsy, HIV-infected individuals show lesions within subcortical regions of the brain, including the cerebellum. Previously, we demonstrated, in primary and cell culture models of rat and human astrocytes, a change in intracellular pH (pH(i)) due to increased Na(+)/H(+) exchange following exposure to inactivated virus or gp120, the major HIV envelope glycoprotein. To further investigate whether any such in vivo pH(i) changes occur in human brains subsequent to HIV infection, we measured the pH(i) of the cerebellum in eight HIV-positive individuals and nine healthy volunteers using (31)P magnetic resonance spectroscopy imaging (MRSI) at high field strength (4.1 T). The results showed a significant difference between the age-adjusted mean pH(i) in the cerebellum in the control and patient groups (7.11 ± 0.03 vs 7.16 ± 0.04), and, further, HIV-infected individuals displayed a significant increase in the number of cerebellar volume elements that were alkaline. We hypothesize that this propensity towards alterations in cerebellar pH(i) may portend later neurological involvement resulting from HIV infection.
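The pH(i) values above are derived from the chemical shift of inorganic phosphate relative to phosphocreatine in the (31)P spectrum via a modified Henderson-Hasselbalch relation. A sketch using one commonly cited calibration (the constants vary with temperature and ionic strength, so treat this as illustrative):

```python
import math

def phi_from_shift(delta_ppm: float) -> float:
    """Intracellular pH from the Pi-PCr chemical-shift difference (ppm)."""
    # pH = pKa + log10((delta - delta_base) / (delta_acid - delta))
    return 6.77 + math.log10((delta_ppm - 3.29) / (5.68 - delta_ppm))

print(round(phi_from_shift(4.93), 2))  # ~7.11, the control-group mean above
```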


Subject(s)
Cerebellum/chemistry , HIV Infections/metabolism , Adenosine Triphosphate/chemistry , Adult , Female , Humans , Hydrogen-Ion Concentration , Magnetic Resonance Spectroscopy , Male , Phosphates/chemistry , Phosphocreatine/chemistry
17.
Prev Med ; 32(2): 163-7, 2001 Feb.
Article in English | MEDLINE | ID: mdl-11162342

ABSTRACT

BACKGROUND: Diabetes mellitus (DM) may increase the risk of colorectal cancer, a leading cause of cancer death in the United States. This report examines factors associated with colorectal cancer screening, including DM status. METHODS: Data from the 1993/1995/1997 North Carolina (NC) Behavioral Risk Factor Surveillance System were analyzed to assess self-reported screening rates within guidelines for sigmoidoscopy/proctoscopy (sig/proct) and fecal occult blood test (FOBT). RESULTS: Overall, 28.6, 27.2, and 19.7% received a sig/proct, FOBT, or either test within guidelines, respectively. Screening rates varied according to some demographic variables, but not by DM status. However, DM status modified some relationships between screening and some demographic/health characteristics. CONCLUSIONS: Colorectal cancer screening in NC is similar to national rates, but certain subgroups are less likely to get screened. Persons with DM are as likely as those without to receive colorectal cancer screening, but some groups with DM (ethnic minorities, persons of low socioeconomic status) may be at high risk of not being screened. Educational efforts to increase screening should target these groups.


Subject(s)
Colorectal Neoplasms/prevention & control , Diabetes Mellitus/epidemiology , Mass Screening/statistics & numerical data , Age Distribution , Aged , Colorectal Neoplasms/epidemiology , Female , Humans , Male , Middle Aged , North Carolina/epidemiology , Occult Blood , Odds Ratio , Patient Selection , Proctoscopy/statistics & numerical data , Risk Factors , Sex Distribution , Sigmoidoscopy/statistics & numerical data
18.
J Stroke Cerebrovasc Dis ; 10(5): 231-5, 2001.
Article in English | MEDLINE | ID: mdl-17903830

ABSTRACT

GOAL: To develop a practical severity scale (Wake Forest Stroke Severity Scale [WFSSS]) to predict acute hospital outcomes and resource use after acute ischemic stroke based on the admission neurologic exam. BACKGROUND: A useful scheme enabling physicians and other health care providers to stratify stroke severity on admission to predict acute hospital outcomes and improve efficiency of inpatient care has not been described. METHODS: The study subjects consisted of 271 consecutive acute stroke patients admitted to the neurology department from July 1995 to June 1996 who were prospectively examined and whose stroke severity was classified on the basis of the admission neurologic exam (level of consciousness, strength, dysphasia, neglect, and gait) as mild, moderate, or severe, based on the WFSSS. The National Institutes of Health Stroke Scale (NIHSS) was performed early in the admission (70% within 24 hours). Discharge disposition (home, inpatient rehabilitation [rehab], skilled nursing facility [SNF], or death); length of stay (LOS); and hospital charges were associated with initial stroke severity ratings using chi-square and Kruskal-Wallis tests. RESULTS: Fifty percent (136) of strokes were classified as mild, 22% (60) as moderate, and 28% (75) as severe. Initial severity ratings were significantly related to discharge disposition, LOS, and hospital charges (all P values <.001). CONCLUSIONS: A practical clinical severity scale (WFSSS) for acute ischemic stroke patients based on the admission neurologic examination predicts hospital disposition, LOS, and hospital charges, and may allow more accurate severity-adjusted comparisons among institutions.
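The two association tests named above are standard and can be sketched with scipy; the contingency table and length-of-stay samples below are fabricated for illustration only:

```python
import numpy as np
from scipy import stats

# Hypothetical counts: rows = mild/moderate/severe, columns = home/rehab/SNF/died
disposition = np.array([
    [100, 25,  8,  3],
    [ 20, 25, 10,  5],
    [ 10, 20, 25, 20],
])
chi2, p, dof, _ = stats.chi2_contingency(disposition)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.3g}")

# Hypothetical length-of-stay samples (days) by severity group
los_mild, los_moderate, los_severe = [3, 4, 5, 4], [6, 8, 7, 9], [12, 15, 10, 14]
h, p_kw = stats.kruskal(los_mild, los_moderate, los_severe)
print(f"Kruskal-Wallis H = {h:.1f}, p = {p_kw:.3g}")
```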

19.
J Health Psychol ; 6(2): 159-68, 2001 Mar.
Article in English | MEDLINE | ID: mdl-22049319

ABSTRACT

The purpose of this study was to examine whether change in satisfaction with physical function (SF), satisfaction with physical appearance (SA), and self-efficacy (SE) mediate the effects that increased physical activity has on change in subjective well-being (SWB). Participants in this investigation consisted of 854 adults (471 men and 383 women) who took part in the Activity Counseling Trial (ACT). ACT was a 24-month multicenter, randomized controlled trial to evaluate the effectiveness of interventions to promote physical activity in the primary care setting. Participants were assigned to one of three treatments: standard care control, staff-assisted intervention, or staff-counseling intervention. Results revealed that, irrespective of treatment arm, change in physical activity was related to change in SWB and to change in all mediators of interest. A statistical test of mediation revealed that the influence of change in physical activity on SWB was due to change in all three mediators, with change in SF making the greatest contribution to the model.
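The mediation test described above is commonly run as a product-of-coefficients analysis: path a (activity → mediator) times path b (mediator → outcome, controlling for activity). A minimal sketch with simulated data and invented variable names, assuming statsmodels:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 854  # matches the ACT sample size; the data themselves are simulated
d_activity = rng.normal(size=n)                      # change in physical activity
d_sf = 0.5 * d_activity + rng.normal(size=n)         # change in satisfaction w/ function
d_swb = 0.4 * d_sf + 0.1 * d_activity + rng.normal(size=n)  # change in well-being

# Path a: activity -> mediator
a = sm.OLS(d_sf, sm.add_constant(d_activity)).fit().params[1]

# Path b: mediator -> outcome, controlling for activity
X = sm.add_constant(np.column_stack([d_activity, d_sf]))
b = sm.OLS(d_swb, X).fit().params[2]

print("indirect (mediated) effect a*b =", round(a * b, 3))
```

In practice the indirect effect would be tested with bootstrapped confidence intervals rather than read off directly.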

20.
AIHAJ ; 61(5): 738-42, 2000.
Article in English | MEDLINE | ID: mdl-11071427

ABSTRACT

This review seeks to assist industrial hygienists in the prevention of Legionnaires' disease caused by Legionella bacteria. The disease is caused by breathing water droplets contaminated with Legionella bacteria from systems in which the organism has been permitted to amplify. Possible sources of transmission include nearly all manmade building water systems. Legionella organisms, found in most natural water sources but at very low concentrations, can thrive under conditions of warmth in these manmade systems. Primary prevention of Legionnaires' disease requires prevention of amplification of Legionella in water systems. This, in turn, requires familiarity with the system and all its components, and effective maintenance and water treatment. However, good maintenance and water treatment regimens alone cannot assure that amplification will not occur somewhere in the system. Systematic microbiological testing for Legionella and appropriate interpretation of the testing results can be powerful assets in prevention by enabling the detection and control of amplification. The occurrence of a confirmed or suspected case of Legionnaires' disease in a building occupant may indicate transmission within the facility; this poses an immediate crisis for the facility manager. An aggressive intervention is indicated to search for previously unknown additional cases of illness, to detect potential sources of transmission, and to decontaminate any suspected sources of transmission on an emergency basis. Once adequate remediation has been achieved and confirmed by microbiological testing, on-going control measures are essential with periodic microbiological investigation to assure continuing prevention of amplification.


Subject(s)
Air Conditioning/instrumentation , Heating/instrumentation , Legionella/isolation & purification , Legionnaires' Disease , Ventilation/instrumentation , Water Microbiology , Decontamination , Humans , Legionnaires' Disease/epidemiology , Legionnaires' Disease/prevention & control , Legionnaires' Disease/transmission , Safety Management , United States , Water Supply