Results 1 - 4 of 4
1.
Dig Dis Sci; 64(11): 3274-3283, 2019 Nov.
Article in English | MEDLINE | ID: mdl-30607690

ABSTRACT

INTRODUCTION: Crohn's disease (CD) follows a relapsing and remitting course, incurring cumulative bowel damage over time. Whether the timing of initiating biologic therapy affects long-term disease progression remains unanswered. Herein, we calculated rates of change in the Lémann index, which quantifies accumulated bowel damage, as a function of the time between disease onset and initiation of biologic therapy. We aimed to explore the impact of earlier introduction of biologics on the rate of progression of long-term cumulative bowel damage. METHODS: Medical records of CD patients treated during 2009-2014 at The Mount Sinai Hospital were queried. Inclusion required two comprehensive assessments allowing calculation of the index at t1 and t2, two time-points ≥ 1 year apart. Patients in whom biologics were introduced before or within 3 months of inclusion (t1) were defined as Bio-pre-t1, and the remainder as Bio-post-t1. The rate of disease progression was calculated as the change in the index per year during t1-t2. RESULTS: A total of 88 patients were studied: 58 Bio-pre-t1 and 30 Bio-post-t1. Among the 58 Bio-pre-t1 cases, damage progressed in 29 (50%), regressed in 20 (34.5%), and stabilized in 9 (15.5%). Median time to initiation of biologics was nominally shorter among patients whose index improved than among those whose index progressed (8 vs. 15 years). Earlier introduction of biologics tended to correlate with a slower rate of progression (ρ = 0.241; p = 0.069). CONCLUSIONS: Earlier introduction of biologics tended to correlate with slower progression of bowel damage in CD, reflected by a reduced rate of Lémann index progression.


Subject(s)
Crohn Disease/diagnosis, Crohn Disease/drug therapy, Disease Progression, Time-to-Treatment/standards, Tumor Necrosis Factor Inhibitors/therapeutic use, Tumor Necrosis Factor-alpha/antagonists & inhibitors, Adult, Aged, Cohort Studies, Female, Follow-Up Studies, Humans, Longitudinal Studies, Male, Middle Aged, Retrospective Studies
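The methods above define disease progression as the change in the Lémann index per year between two assessments at least one year apart. A minimal sketch of that annualized calculation (the function name and the example dates/index values are hypothetical, not data from the study):

```python
from datetime import date

def annual_index_change(index_t1, index_t2, date_t1, date_t2):
    """Change in the Lemann index per year between assessments t1 and t2."""
    years = (date_t2 - date_t1).days / 365.25
    if years < 1:
        raise ValueError("assessments must be >= 1 year apart")
    return (index_t2 - index_t1) / years

# Hypothetical patient: index fell from 12.4 to 9.1 over ~2 years,
# giving a negative annual rate (i.e., bowel damage regressed).
rate = annual_index_change(12.4, 9.1, date(2010, 3, 1), date(2012, 3, 1))
```

A negative rate corresponds to the "regressed" group in the results, a positive rate to the "progressed" group, and a rate of zero to the "stabilized" group.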
2.
J Oral Maxillofac Surg; 76(2): 375-379, 2018 Feb.
Article in English | MEDLINE | ID: mdl-28963867

ABSTRACT

PURPOSE: The purpose of this project was to characterize patients with isolated head and neck burns admitted to the Grady Memorial Hospital (GMH) Burn Center (Atlanta, GA). MATERIALS AND METHODS: This was a retrospective case series of patients admitted to the GMH Burn Center with a primary diagnosis of head and neck burns from 2000 through 2015. Demographic data (gender and age) were recorded. Burn details (etiology, mechanism, percentage of total body surface area burned, depth, and associated injuries) were summarized. Patient management and hospital course were documented. Data were collected using a standardized collection form, and descriptive statistics were computed. RESULTS: There were 5,938 patients admitted to the GMH Burn Center during the study period. Of these, 2,547 had head and neck burns and 205 met the inclusion criteria. Most (n = 136; 66%) were male, with a mean age of 40 years. The most common burn depth was superficial partial thickness. Flame burns were the mechanism most likely to be associated with full-thickness injury. Approximately one-fourth of patients had an associated injury, such as inhalation or ocular injury. Surgical interventions consisted of tangential excision and split-thickness skin grafting, contracture release, excision of hypertrophic scars, and rotational flaps. Mean length of hospital stay for isolated head and neck burns was 4.4 days. Overall mortality was 2%. CONCLUSION: The results of this study show that superficial partial-thickness head and neck burns are more likely to occur from accidental exposure to flames in men older than 55 years. Owing to the increased risk and mortality of inhalation injury associated with head and neck burns, airway protection and respiratory management are critical considerations in head and neck burn management.


Subject(s)
Burns/surgery, Craniocerebral Trauma/surgery, Neck Injuries/surgery, Adolescent, Adult, Aged, Aged, 80 and over, Burns/mortality, Child, Child, Preschool, Craniocerebral Trauma/mortality, Female, Humans, Infant, Male, Middle Aged, Neck Injuries/mortality, Retrospective Studies, Survival Analysis, Treatment Outcome
3.
HPB (Oxford) ; 17(12): 1074-84, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26373873

ABSTRACT

BACKGROUND: The Model for End-stage Liver Disease (MELD) score has been used as a prognostic tool since 2002 to predict pre-transplant mortality. Increasing proportions of transplant candidates with higher MELD scores, combined with improvements in transplant outcomes, mandate the study of surgical outcomes in patients with MELD scores ≥ 40. METHODS: A retrospective longitudinal analysis of United Network for Organ Sharing (UNOS) data on all liver transplantations performed between February 2002 and June 2011 (n = 33,398), stratified by MELD score (< 30, 30-39, ≥ 40), was conducted. The primary outcomes of interest were short- and long-term graft and patient survival. The Kaplan-Meier product-limit method and Cox regression were used. A subanalysis of a futile population was performed to determine predictors of futility. RESULTS: Of the 33,398 transplant recipients analysed, 74% scored < 30, 18% scored 30-39, and 8% scored ≥ 40 at transplantation. Recipients with MELD scores ≥ 40 were more likely to be younger (P < 0.001), non-White, and to have shorter waitlist times (P < 0.001). Overall patient survival correlated inversely with increasing MELD score; this trend was consistent for both short-term (30 and 90 days) and long-term (1, 3 and 5 years) graft and patient survival. In multivariate analysis, increasing age, African-American ethnicity, donor obesity, and diabetes were negative predictors of survival. Predictors of futility included patient age > 60 years, obesity, peri-transplantation intensive care unit hospitalization with ventilation, and multiple comorbidities. CONCLUSIONS: Liver transplantation in recipients with MELD scores ≥ 40 offers acceptable long-term survival outcomes. The identified futility predictors indicate the need for prospective follow-up studies to define the population that would gain the highest benefit from this precious resource.


Subject(s)
Decision Support Techniques, Liver Diseases/surgery, Liver Transplantation, Survivors, Transplant Recipients, Adolescent, Adult, Aged, Allografts, Chi-Square Distribution, Databases, Factual, Female, Humans, Kaplan-Meier Estimate, Liver Diseases/diagnosis, Liver Diseases/mortality, Liver Transplantation/adverse effects, Liver Transplantation/mortality, Male, Middle Aged, Multivariate Analysis, Patient Selection, Predictive Value of Tests, Proportional Hazards Models, Retrospective Studies, Risk Factors, Survivors/statistics & numerical data, Time Factors, Tissue and Organ Procurement, Transplant Recipients/statistics & numerical data, Treatment Outcome, United States, Young Adult
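The survival analysis above relies on the Kaplan-Meier product-limit method, which drops the survival estimate at each event time by the fraction of at-risk subjects who experienced the event, while censored subjects simply leave the risk set. A self-contained sketch of that estimator (pure Python; the follow-up times and event flags below are illustrative only, not UNOS data):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g., death or graft loss) occurred, 0 if censored
    Returns a list of (time, survival probability) steps at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Group all subjects sharing this follow-up time.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            survival *= 1 - deaths / n_at_risk
            steps.append((t, survival))
        n_at_risk -= removed
    return steps

# Five hypothetical subjects; events at t=1, 2, 3, censoring at t=2 and 4.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

Stratifying such curves by MELD score group (< 30, 30-39, ≥ 40) and comparing them is the descriptive counterpart of the Cox regression reported in the results.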
4.
Transpl Int ; 28(8): 990-9, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25864733

ABSTRACT

This study analyzed outcomes of retransplantation from expanded criteria donors (ECD) over the last two decades to determine the benefits and risks of using ECD kidneys for retransplantation. Data from the United Network for Organ Sharing database were collected and analyzed. Graft survival, death-censored graft survival, and patient survival for retransplantation with ECD kidneys (re-ECD) were reported and compared with primary transplantation with ECD kidneys (prim-ECD) and retransplantation with standard criteria donor kidneys (re-SCD). Re-ECD kidneys carried a higher risk of graft failure than prim-ECD (hazard ratio [HR] = 1.19) and re-SCD (HR = 1.76). Patient survival was better in re-ECD than in prim-ECD (HR = 0.89) but worse than in re-SCD (HR = 1.82). After censoring patients who died with a functioning graft, re-ECD had a higher mortality risk than prim-ECD (HR = 1.45) and re-SCD (HR = 1.79). Transplantation improves quality of life and reduces healthcare costs; given the risks associated with resumption of hemodialysis and the longer waiting times for SCD kidneys, there is a benefit to accepting ECD kidneys for select patients requiring retransplantation. Even for these patients, however, retransplantation with ECD kidneys should be undertaken with caution, and appropriate informed consent should be obtained.


Subject(s)
Donor Selection/methods, Kidney Failure, Chronic/surgery, Kidney Transplantation, Adolescent, Adult, Aged, Aged, 80 and over, Databases, Factual, Donor Selection/standards, Female, Graft Survival, Humans, Kidney Failure, Chronic/mortality, Kidney Transplantation/mortality, Longitudinal Studies, Male, Middle Aged, Practice Guidelines as Topic, Reoperation, Retrospective Studies, Survival Analysis, Treatment Outcome, Young Adult