Results 1 - 20 of 54
1.
J Surg Oncol ; 129(2): 244-253, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37800378

ABSTRACT

INTRODUCTION: Adjuvant (A) multiagent chemotherapy (MC) is the standard of care for patients with pancreatic adenocarcinoma (PDAC). Tolerating MC following a morbid operation may be difficult; thus, neoadjuvant (NA) treatment is preferable. This study examined how the timing of chemotherapy related to the regimen given and, ultimately, to overall survival (OS). METHODS: The National Cancer Database was queried from 2006 to 2017 for nonmetastatic PDAC patients who underwent surgical resection and received MC or single-agent chemotherapy (SC) pre- or postresection. Predictors of receiving MC were determined using multivariable logistic regression. Five-year OS was evaluated using Kaplan-Meier and Cox proportional hazards models. RESULTS: A total of 12,440 patients (NA SC, n = 663; NA MC, n = 2313; A SC, n = 6152; A MC, n = 3312) were included. MC utilization increased from 2006-2010 to 2011-2017 (33.1%-49.7%; odds ratio [OR]: 0.59; p < 0.001). Younger age, fewer comorbidities, higher clinical stage, and larger tumor size were all associated with receipt of MC (all p < 0.001), but NA treatment was the greatest predictor (OR 5.18; 95% confidence interval [CI]: 4.63-5.80; p < 0.001). MC was associated with increased median 5-year OS (26.0 vs. 23.9 months; hazard ratio [HR]: 0.92; 95% CI: 0.88-0.96), and NA MC was associated with the highest survival (28.2 months) compared to NA SC (23.3 months), A SC (24.0 months), and A MC (24.6 months; p < 0.001). CONCLUSION: Use and timing of MC contribute to OS in PDAC, with improved 5-year OS compared with SC. The greatest predictor of receiving MC was NA timing, and the greatest survival benefit was observed in the NA MC subgroup. Randomized studies evaluating the timing of effective MC in PDAC are needed.
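
The survival comparison described in this abstract rests on Kaplan-Meier estimates and a Cox proportional hazards model. The following is a minimal sketch of that type of analysis using the Python lifelines library; the file name and column names (os_months, death, regimen_timing, multiagent, neoadjuvant, age, comorbidity_score, clinical_stage, tumor_size) are hypothetical stand-ins, since the NCDB extract itself is not distributed with the abstract.

```python
# Sketch only: assumes a tidy NCDB-style extract with one row per patient,
# 0/1 indicators for treatment variables, and numeric covariates.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("ncdb_pdac_extract.csv")  # hypothetical file

# Kaplan-Meier median OS by treatment group (e.g., "NA MC", "A SC", ...)
kmf = KaplanMeierFitter()
for group, sub in df.groupby("regimen_timing"):
    kmf.fit(sub["os_months"], event_observed=sub["death"], label=group)
    print(group, "median OS (months):", kmf.median_survival_time_)

# Cox proportional hazards model adjusting for baseline covariates
cph = CoxPHFitter()
cph.fit(
    df[["os_months", "death", "multiagent", "neoadjuvant", "age",
        "comorbidity_score", "clinical_stage", "tumor_size"]],
    duration_col="os_months",
    event_col="death",
)
cph.print_summary()  # hazard ratios with 95% confidence intervals
```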


Subject(s)
Adenocarcinoma , Pancreatic Neoplasms , Humans , Pancreatic Neoplasms/drug therapy , Pancreatic Neoplasms/surgery , Pancreatic Neoplasms/pathology , Adenocarcinoma/pathology , Chemotherapy, Adjuvant , Neoadjuvant Therapy , Proportional Hazards Models , Retrospective Studies
2.
Ann Surg Oncol ; 29(9): 6015-6028, 2022 Sep.
Article in English | MEDLINE | ID: mdl-35583691

ABSTRACT

BACKGROUND: Neoadjuvant chemotherapy (NAC) or chemoradiation (NAC+XRT) is incorporated into the treatment of localized pancreatic adenocarcinoma (PDAC), often with the goal of downstaging before resection. However, the effect of downstaging on overall survival, particularly the differential effects of NAC and NAC+XRT, remains undefined. This study examined the impact of downstaging from NAC and NAC+XRT on overall survival. METHODS: The National Cancer Data Base (NCDB) was queried from 2006 to 2015 for patients with non-metastatic PDAC who received NAC or NAC+XRT. Rates of overall and nodal downstaging and pathologic complete response (pCR) were assessed. Predictors of downstaging were evaluated using multivariable logistic regression. Overall survival (OS) was assessed with Kaplan-Meier and Cox proportional hazards modeling. RESULTS: The study enrolled 2475 patients (975 NAC and 1500 NAC+XRT patients). Compared with NAC, NAC+XRT was associated with higher rates of overall downstaging (38.3 % vs 23.6 %; p ≤ 0.001), nodal downstaging (16.0 % vs 7.8 %; p ≤ 0.001), and pCR (1.7 % vs 0.7 %; p = 0.041). Receipt of NAC+XRT was independently predictive of overall (odds ratio [OR] 2.28; p < 0.001) and nodal (OR 3.09; p < 0.001) downstaging. Downstaging by either method was associated with improved 5-year OS (30.5 vs 25.2 months; p ≤ 0.001). Downstaging with NAC was associated with an 8-month increase in median OS (33.7 vs 25.6 months; p = 0.005), and downstaging by NAC+XRT was associated with a 5-month increase in median OS (30.0 vs 25.0 months; p = 0.008). Cox regression showed an association of overall downstaging with an 18 % reduction in the risk of death (hazard ratio [HR] 0.82; 95 % confidence interval, 0.71-0.95; p = 0.01). CONCLUSION: Downstaging after neoadjuvant therapies improves survival. The addition of radiation therapy may increase the rate of downstaging without affecting overall oncologic outcomes.
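
The predictors-of-downstaging step in this abstract is a multivariable logistic regression reported as odds ratios. A minimal, hedged sketch with statsmodels is shown below; the input file and variable names (downstaged, nac_xrt, age, clinical_stage, facility_type) are illustrative assumptions, not the study's actual data dictionary.

```python
# Sketch only: assumes a patient-level table with a binary downstaging outcome
# and a 0/1 indicator for receipt of NAC+XRT versus NAC alone.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ncdb_pdac_neoadjuvant.csv")  # hypothetical extract

model = smf.logit(
    "downstaged ~ nac_xrt + age + clinical_stage + C(facility_type)",
    data=df,
).fit()

# Convert log-odds coefficients to odds ratios with 95% confidence intervals
summary = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(summary)
```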


Subject(s)
Adenocarcinoma , Pancreatic Neoplasms , Adenocarcinoma/pathology , Adenocarcinoma/therapy , Chemoradiotherapy , Chemotherapy, Adjuvant , Humans , Neoadjuvant Therapy , Neoplasm Staging , Pancreatic Neoplasms/pathology , Pancreatic Neoplasms/therapy , Retrospective Studies , Treatment Outcome , Pancreatic Neoplasms
3.
J Surg Res ; 276: 261-271, 2022 08.
Article in English | MEDLINE | ID: mdl-35398630

ABSTRACT

INTRODUCTION: Hepatocellular carcinoma (HCC) is rare among adolescent and young adult (AYA) patients, and resection or transplant remains the only curative therapy. The role of lymph node (LN) sampling is not well defined. The aim of this study was to describe practice patterns, as well as to investigate the impact of LN sampling on survival outcomes in this population. MATERIALS AND METHODS: A retrospective cohort study using the 2004-2018 National Cancer Database (NCDB) was performed. Patients ≤21 y old with nonmetastatic HCC who underwent liver resection or transplant were evaluated. Clinical features of patients who underwent LN sampling were compared to those who did not, and univariable and multivariable logistic regression were performed to evaluate independent predictive factors of node positivity. Survival analysis was performed using Kaplan-Meier methods and Cox proportional hazards regression. RESULTS: A total of 262 AYA patients with HCC were identified, of whom 137 (52%) underwent LN sampling. Forty-four patients had positive nodes, and 40 (95%) of these had tumors >5 cm. Of the patients with sampled nodes, 87 (64%) had fibrolamellar carcinoma (FLC), which was an independent risk factor for node positivity (P = 0.001). There was no difference in overall survival between patients who underwent LN sampling and those who did not; however, 5-y overall survival for node-positive patients was 40% versus 79% for node-negative patients (P < 0.0001). CONCLUSIONS: In AYA patients with HCC, LN sampling was not associated with an independent survival benefit. However, FLC was an independent risk factor for LN positivity, suggesting a role for routine LN sampling in these patients.


Subject(s)
Carcinoma, Hepatocellular , Liver Neoplasms , Adolescent , Carcinoma, Hepatocellular/pathology , Carcinoma, Hepatocellular/surgery , Humans , Liver Neoplasms/pathology , Liver Neoplasms/surgery , Lymph Node Excision , Lymph Nodes/pathology , Lymph Nodes/surgery , Neoplasm Staging , Prognosis , Retrospective Studies , Young Adult
4.
J Surg Res ; 268: 650-659, 2021 12.
Article in English | MEDLINE | ID: mdl-34474214

ABSTRACT

BACKGROUND: Surgical management of hepatic metastases in patients with stage IV breast cancer remains controversial. The purpose of this study was to examine the impact of hepatic metastasectomy on long-term outcomes. METHODS: The 2004-2015 National Cancer Database was queried for all patients diagnosed with stage IV breast cancer with metastases isolated to the liver. Patient demographics and disease-, treatment-, and outcome-related data were analyzed. RESULTS: Of 2,895 patients, only 90 (3.1%) underwent hepatic resection. Compared to patients who did not undergo metastasectomy, patients treated with metastasectomy tended to be younger (52 ± 12.7 versus 59.2 ± 14.6; P < 0.001) and to have private insurance (74.4% versus 45.3%; P < 0.001). Independent predictors of metastasectomy included younger age (OR 0.98; CI 0.96-0.99; P = 0.01), lobular carcinoma (OR 2.26; CI 1.06-4.82; P = 0.03), and prior surgery of the primary site, whether partial mastectomy (OR 6.96; CI 3.47-13.95; P < 0.001) or total mastectomy (OR 5.74; CI 3.06-10.76; P < 0.001). Compared to no metastasectomy, hepatic metastasectomy was independently associated with a 37% reduction in the risk of death (HR 0.63; CI 0.44-0.91; P = 0.01). CONCLUSIONS: Stage IV breast cancer with metastases to the liver is rare, and few patients undergo hepatic resection. However, in this select patient population, hepatic metastasectomy was associated with a significant survival advantage when included in the multimodal treatment of synchronous stage IV breast cancer.


Subject(s)
Breast Neoplasms , Colorectal Neoplasms , Liver Neoplasms , Lung Neoplasms , Metastasectomy , Breast Neoplasms/pathology , Colorectal Neoplasms/pathology , Combined Modality Therapy , Female , Humans , Liver/pathology , Liver Neoplasms/secondary , Lung Neoplasms/surgery , Mastectomy , Retrospective Studies , Survival Rate
5.
Am J Emerg Med ; 49: 338-342, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34229241

ABSTRACT

BACKGROUND: Some studies have suggested gender disparities in both pay and academic promotion, which may adversely affect salary and career progression for female physicians. The areas of research output, funding, and authorship have not been fully and systematically examined in the emergency medicine literature. We hypothesized that gender differences may exist in research output, impact, authorship, and funding. METHODS: We conducted a cross-sectional study examining all published articles in the top three emergency medicine journals, as determined by Impact Factor, between February 2015 and February 2018. We compared the authorship position, number of citations of each article, funding, and h-index of each author by gender. RESULTS: Of the 10,118 authors representing 4166 original articles in our sample, 7562 (74.7%) were male and 2556 (25.3%) were female, with females underrepresented relative to the known proportion of female emergency medicine faculty. Males were proportionally more likely to be last authors (OR 1.65, 95% CI, 1.47-1.86) and less likely to be first authors than females (OR 0.85, 95% CI, 0.77-0.94). No difference was found in the proportions of males and females named as having funding (OR 1.02, 95% CI, 0.78-1.35). Males had higher h-indexes than females (5 vs. 3, p < .001) as well as a higher average number of citations (OR 1.068, 95% CI, 1.018-1.119). CONCLUSIONS: Males outnumber females not only in number of publications but also in number of citations, h-index, and last authorship. Future studies on physician gender disparities in emergency medicine need to account for these population differences.
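
The authorship odds ratios reported above compare a binary outcome between two groups, which reduces to a 2x2 contingency table. The sketch below shows that calculation with statsmodels; the male/female last-author split is a made-up placeholder (only the group totals match the abstract), so the printed numbers are illustrative, not the study's results.

```python
# Illustrative only: the last-author counts are invented placeholders.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

#                  last author  not last author
table = np.array([[1200,        6362],    # male authors   (placeholder split)
                  [ 250,        2306]])   # female authors (placeholder split)

t = Table2x2(table)
print("OR:", round(t.oddsratio, 2))
print("95% CI:", tuple(round(x, 2) for x in t.oddsratio_confint()))
print("p-value:", round(t.oddsratio_pvalue(), 4))
```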


Subject(s)
Emergency Medicine/statistics & numerical data , Publications/standards , Sex Characteristics , Cross-Sectional Studies , Female , Humans , Male , Periodicals as Topic/statistics & numerical data , Publications/statistics & numerical data , Sexism/psychology , Sexism/statistics & numerical data
6.
Ann Surg Oncol ; 28(6): 2960-2972, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33566248

ABSTRACT

INTRODUCTION: Lymphadenectomy (LND) is recommended following surgical resection of ≥ T1b gallbladder cancer (GBC). However, frequency and stage-specific survival benefits of LND remain unclear. PATIENTS AND METHODS: The National Cancer Database (NCDB; 2006-15) was queried for resected pathologic stage I-III GBC. LND performance, predictors of receiving LND, and LND association with overall survival (OS) were assessed. RESULTS: Of 2302 total patients, 1343 (58.3%) underwent LND. Patients who underwent LND were younger and more frequently had private health insurance, a negative surgical margin, higher pathologic T stage, and received adjuvant chemotherapy (all p < 0.001). LND rates were highest at academic centers (70.1%) relative to all other facility types (p < 0.001). LND was independently associated with improved OS [hazard ratio (HR) 0.52, 95% confidence interval (CI) 0.44-0.61]. LND was associated with improved OS for pT1b, pT2, and pT3 patients (all p < 0.05) on univariate analysis. LND was independently associated with improved OS in pT2 (HR 0.44, CI 0.35-0.56) and pT3 (HR 0.54, CI 0.43-0.69) patients. CONCLUSIONS: LND is associated with a 48% reduction in risk of death in patients with resectable non-metastatic GBC, with greatest impact in pT2-3 patients. Patients without LND have similar OS to patients with node-positive disease, highlighting the importance of LND. Underutilization of LND likely results in undertreatment of patients with undiagnosed nodal disease, which may contribute to unfavorable oncologic outcomes.


Subject(s)
Carcinoma in Situ , Gallbladder Neoplasms , Chemotherapy, Adjuvant , Gallbladder Neoplasms/pathology , Gallbladder Neoplasms/surgery , Humans , Lymph Node Excision , Neoplasm Staging , Proportional Hazards Models
7.
World J Surg ; 45(2): 531-542, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33151372

ABSTRACT

BACKGROUND: Surgical debulking of primary neuroendocrine tumors (NETs) and hepatic resection of metastatic NET disease may each independently improve overall survival. However, evidence for combined primary site debulking and metastasectomy on survival and impact on short-term perioperative outcomes is limited. METHODS: The 2014-2016 ACS-NSQIP targeted hepatectomy database was queried for all patients undergoing liver resection for metastatic NET. Secondary procedure codes were evaluated for major concurrent operations. Multivariable analysis was performed to determine risk factors for 30-day morbidity and mortality. RESULTS: A total of 472 patients were identified, of whom 153 (32.4%) underwent ≥1 additional concurrent major operation. The most common concurrent procedures were small bowel resection (14.6%), partial colectomy (8.9%), and radical lymphadenectomy (7.4%). Among all patients, overall 30-day mortality and morbidity were 1.5% and 25.6%, respectively. Modifiable and treatment-related factors associated with increased major postoperative morbidity risk included >10% weight loss within six months of surgery (p = 0.05), increasing number of hepatic lesions treated (p = 0.05), and biliary reconstruction (p = 0.001). No major concurrent procedure was associated with increased 30-day morbidity (all p > 0.05). CONCLUSIONS: Approximately one-third of patients with stage IV NET underwent combined hepatic and multi-organ resection. Although modifiable and treatment-related factors predictive of perioperative morbidity were identified, performance of concurrent major procedures did not increase perioperative morbidity. These results support consideration of multi-organ resection in carefully selected patients with metastatic NET.


Subject(s)
Carcinoma/secondary , Carcinoma/surgery , Liver Neoplasms/secondary , Liver Neoplasms/surgery , Neoplasm Metastasis/pathology , Neuroendocrine Tumors/surgery , Adult , Aged , Colectomy , Colorectal Neoplasms/surgery , Female , Hepatectomy/adverse effects , Humans , Liver Neoplasms/pathology , Lymph Node Excision , Male , Middle Aged , Neoplasm Staging , Postoperative Complications , Retrospective Studies , Survival Analysis , Treatment Outcome
8.
Ann Surg Oncol ; 28(3): 1466-1480, 2021 Mar.
Article in English | MEDLINE | ID: mdl-32749621

ABSTRACT

BACKGROUND: Adjuvant chemotherapy (AC) is recommended following surgical resection of gallbladder cancer regardless of stage. However, stage-specific benefits of AC in gallbladder cancer are unclear. PATIENTS AND METHODS: Patients with resected pathologic stage I-III gallbladder cancer were identified using the 2006-2015 National Cancer Database. Utilization trends, predictors of use, and impact of AC on overall survival (OS) were determined. RESULTS: A total of 5656 patients were included. Use of AC increased from 9.9% in 2006 to 24.2% in 2015 (OR 2.91; 95% CI 2.06-4.09; p < 0.001). However, only 17.5% of patients overall and only 32.4% of node-positive (stage IIIb) patients received AC. Patients receiving AC were younger and had fewer comorbidities, shorter hospitalizations, more advanced disease, and more margin-positive resections (all p < 0.01). Higher pathologic T stage and positive nodal status represented the greatest independent predictors of receipt of AC. While AC demonstrated no OS advantage for stage I patients (p = 0.83), AC was associated with improved OS among stage II patients (p = 0.003), though this association did not persist on multivariable analysis. AC was independently associated with improved OS among stage IIIb patients, with a 30% reduction in risk of death (HR 0.70; 95% CI 0.58-0.83; p < 0.001). Younger age, fewer comorbidities, and shorter hospitalization all predicted receipt of AC among stage IIIb patients (all p < 0.05). CONCLUSIONS: Systemic therapy remains underprescribed, particularly among patients who would seem to benefit most. Adjuvant chemotherapy likely improves survival in node-positive gallbladder cancer, but its utility in the treatment of node-negative disease has not been demonstrated.


Subject(s)
Gallbladder Neoplasms , Chemotherapy, Adjuvant , Databases, Factual , Gallbladder Neoplasms/drug therapy , Gallbladder Neoplasms/surgery , Humans , Neoplasm Staging , Proportional Hazards Models
9.
Mil Med ; 185(1-2): e178-e182, 2020 02 12.
Article in English | MEDLINE | ID: mdl-31184698

ABSTRACT

INTRODUCTION: Often referred to as aseptic necrosis or osteonecrosis, avascular necrosis (AVN) typically affects people between 30 and 50 years of age. Given the substantial morbidity associated with AVN and the overlap between the age range of military servicemembers and the typical age at AVN diagnosis, the military represents an ideal cohort for a large database study of the incidence and epidemiology of AVN. The purpose of this study was to identify demographic risk factors for AVN in the United States military. MATERIALS AND METHODS: First-time occurrences of ICD-9-CM codes for all types of AVN (head of humerus, head and neck of femur, medial femoral condyle, talus, and other bone) between 2004 and 2014 were queried in the Defense Medical Epidemiology Database. Multivariate analysis was performed to obtain adjusted rates (adjusted for age, sex, race, rank, and branch of service). RESULTS: Between 2005 and 2014, 2,671 cases of AVN occurred among an at-risk population of 13,820,906 person-years, for an unadjusted IR of 0.19 per 1,000 person-years. The most common location was the proximal femur, accounting for 41.7% of all cases. With an adjusted rate ratio of 18.7, the over-40 age group accounted for 53.3% of cases. Servicemembers of black race, senior rank, and the Army branch of service were at greater risk for AVN. CONCLUSIONS: The overall incidence of AVN was 0.19 per 1,000 person-years. While increasing age had the greatest influence on the development of symptomatic AVN, black race, senior enlisted rank, and Army branch of service were also statistically significant risk factors.
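
The unadjusted incidence rate above is simply cases divided by person-years, and the adjusted rate ratios come from a covariate-adjusted Poisson model. Below is a hedged sketch of both steps: the crude-rate arithmetic uses the figures quoted in the abstract, while the regression assumes a hypothetical aggregated table of cases and person-years per demographic stratum (the file and column names are illustrative).

```python
# Sketch only: reproduces the crude-rate arithmetic from the abstract and
# shows the general shape of a person-time-offset Poisson model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Crude incidence rate: cases per 1,000 person-years
cases, person_years = 2671, 13_820_906
print(f"Unadjusted IR: {cases / person_years * 1000:.2f} per 1,000 person-years")  # ~0.19

# Adjusted rate ratios via Poisson regression on stratum-level counts
strata = pd.read_csv("dmed_avn_strata.csv")  # hypothetical aggregate table
model = smf.poisson(
    "cases ~ C(age_group) + C(sex) + C(race) + C(rank) + C(service)",
    data=strata,
    exposure=strata["person_years"],  # log person-time offset
).fit()
print("Adjusted rate ratios:")
print(np.exp(model.params))
```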


Subject(s)
Military Personnel , Osteonecrosis , Femur , Humans , Incidence , Osteonecrosis/epidemiology , Osteonecrosis/etiology , Risk Factors , United States/epidemiology
10.
J Surg Orthop Adv ; 28(2): 137-143, 2019.
Article in English | MEDLINE | ID: mdl-31411960

ABSTRACT

Timing of definitive fixation of femoral shaft fractures is a subject of continued controversy. The purpose of this study was to determine if early definitive fixation of femoral shaft fractures in the setting of polytrauma decreased the risk of pulmonary complications and mortality. The 2009-2012 National Sample Program of the National Trauma Data Bank was queried for all patients 18 to 65 years of age with Injury Severity Scores (ISS) >15 who underwent definitive fixation of femoral shaft fractures. Mortality, perioperative complications, and length of intensive care unit (ICU) and hospital stay were the primary outcome measures of interest. Following multivariate analyses, increased time to surgery was found to portend a statistically significant increase in the risk of acute respiratory distress syndrome (ARDS) and mortality, as well as in mean ventilator time and length of ICU and hospital stay. Earlier definitive fixation of femoral shaft fractures in the setting of polytrauma is associated with a significantly decreased risk of ARDS and mortality, as well as shorter mean ventilator time and length of ICU and hospital stay. (Journal of Surgical Orthopaedic Advances 28(2):137-143, 2019).


Subject(s)
Femoral Fractures , Lung Diseases , Multiple Trauma , Femoral Fractures/complications , Femoral Fractures/surgery , Humans , Injury Severity Score , Length of Stay , Lung Diseases/etiology , Lung Diseases/prevention & control , Multiple Trauma/therapy , Retrospective Studies , Survival Analysis
11.
Physiol Behav ; 184: 196-204, 2018 02 01.
Article in English | MEDLINE | ID: mdl-29155246

ABSTRACT

Housing and enrichment conditions are essential factors to consider when using animal models of behavior, as they can alter the behavior that is under investigation. The goal of this study was to determine the impact of the relatively enriched environment recommended by current animal care guidelines on development and maintenance of binge-type behavior in rats, using the limited access (LA) binge model. Non-food-deprived rats were divided into two groups, enriched and nonenriched, with all rats housed in shoebox cages. Bedding, nesting material, toys, and a solid floor were provided only to the enriched group to create a state of relative enrichment, or RE, compared to the nonenriched conditions historically used in the LA model. Enriched and nonenriched groups were further divided into control and experimental groups. Control rats received access to an optional source of fat (vegetable shortening) for 30 min each day (daily access) while experimental rats received 30-min optional fat access on Monday, Wednesday, and Friday only (intermittent access). The four groups were designated C-E (Control-Enriched), C-NE (Control-Nonenriched), I-E (Intermittent-Enriched), and I-NE (Intermittent-Nonenriched). Bingeing in the LA model is established when a group with intermittent access (i.e., the I-E or I-NE group) consumes significantly more vegetable shortening during the limited access period than a group with daily access (i.e., the C-E or C-NE group). Access sessions continued for 8 weeks under these conditions, at which time the housing conditions of the I-E and I-NE groups were reversed for an additional 8 weeks of access sessions. Intakes of the C-E and C-NE groups were similar and data from these two groups were combined. Relative to this Combined Control Group (CCG), the I-NE group began bingeing in week 3 while the I-E group binged during weeks 6 and 8. Following the reversal at the beginning of week 9, the newly enriched I-NE group ceased bingeing in week 9 but resumed bingeing in weeks 10-16. The newly nonenriched I-E group continued bingeing through the remainder of the study. Intakes of the I-E and I-NE groups were not significantly different at any time during the study. These results indicate that RE delays binge onset; that is, RE increases the time between the first fat access session and the first occurrence of bingeing. However, RE does not significantly alter the amount of fat consumed during binge sessions. Furthermore, addition of RE to a nonenriched group of animals (I-NE) does not reverse established binge behavior. Thus it appears that regardless of enrichment condition, intermittent access to vegetable shortening induces greater consumption of fat than does daily access. However, it is clear that a certain level of austerity in housing conditions is required for rapid development of lasting binge-type eating to occur. In addition, results suggest that it is unlikely that enrichment, to the degree provided in this study, can prevent or reverse binge-type eating in rats.


Subject(s)
Bulimia/prevention & control , Bulimia/psychology , Environment , Animals , Behavior, Animal , Body Weight/physiology , Dietary Fats/adverse effects , Disease Models, Animal , Eating/physiology , Feeding Behavior , Female , Food Deprivation/physiology , Rats , Rats, Wistar , Time Factors
12.
Int J Surg ; 48: 286-290, 2017 Dec.
Article in English | MEDLINE | ID: mdl-29191407

ABSTRACT

BACKGROUND: The long-term impact of gun violence on physical function and occupational disability remains poorly explored. We sought to examine the effect of combat-related gunshot injury on work capacity within a cohort of military servicemembers and to identify clinical characteristics that influence the capacity to return to work. METHODS: A query was performed to identify all servicemembers injured by gunshot in the years 2005-2009. These soldiers were then followed through the end of 2014 to identify those separated from service due to an inability to perform military duties as a result of their injury. Sociodemographic and clinical characteristics were considered covariates. The dependent variable in this study was inability to effectively return to work, as delineated by the proxy of medical separation from military service. A multivariable logistic regression model was used to evaluate factors associated with an increased likelihood of medical separation following gunshot injury. RESULTS: Of the 1417 individuals meeting inclusion criteria, 40% (n = 572) of the cohort were medically separated during the study period. Significant predictors of separation included non-thoracic injuries, increased injury severity score (ISS; OR 1.05; 95% CI 1.04, 1.06), Senior Enlisted rank (OR 3.90; 95% CI 2.16, 7.01), and Junior Enlisted rank (OR 6.99; 95% CI 3.93, 12.44). CONCLUSIONS: This is the largest study in the literature, in any population, to assess the long-term capacity to return to work following gunshot injury. Individuals in high-demand occupations and those with non-thoracic wounds or elevated ISS should be counseled in the post-gunshot injury period regarding the negative associations of these characteristics with the capacity to return to work. Enhanced access to social services in the period following injury could similarly benefit individuals of low socioeconomic background.


Subject(s)
Military Personnel , Occupational Injuries/epidemiology , Return to Work/statistics & numerical data , Wounds, Gunshot/epidemiology , Adult , Cohort Studies , Female , Humans , Injury Severity Score , Male , Middle Aged , Multivariate Analysis , United States/epidemiology , Warfare , Young Adult
13.
Orthopedics ; 40(1): e1-e10, 2017 Jan 01.
Article in English | MEDLINE | ID: mdl-27648576

ABSTRACT

This study was conducted to determine the incidence rate, risk factors, and postoperative conditions associated with 30-day readmission after total shoulder arthroplasty (TSA). A total of 3547 patients who underwent primary TSA were identified from the 2011-2013 American College of Surgeons National Surgical Quality Improvement Program. The 30-day readmission rate was 2.9%. The only preoperative predictors of hospital readmission were American Society of Anesthesiologists classification of 3 or greater (odds ratio, 2.16; 95% confidence interval, 1.30-3.61) and a history of cardiac disease (odds ratio, 2.13; 95% confidence interval, 1.05-4.31). Of patients with any perioperative complications, 42 (34%) were readmitted, and the presence of any complication increased the risk of readmission (odds ratio, 28.95; 95% confidence interval, 18.44-45.46). Periprosthetic joint infection, myocardial infarction, pulmonary embolism, deep venous thrombosis, and pneumonia were significant predictors of hospital readmission after TSA (P<.0001). The incidence of hospital readmission after TSA peaked within the first 5 days after discharge, and 26%, 32%, and 55% of all hospital readmissions occurred by postoperative days 5, 7, and 14, respectively. Preoperative medical optimization to reduce the rates of postoperative complications, such as periprosthetic joint infection, myocardial infarction, pulmonary embolism, deep venous thrombosis, pneumonia, and urinary tract infection, is likely to decrease the need for subsequent readmission. Patients should be counseled about these risk factors preoperatively. [Orthopedics. 2017; 40(1):e1-e10.].


Subject(s)
Arthroplasty, Replacement, Shoulder , Patient Readmission/statistics & numerical data , Postoperative Complications/epidemiology , Aged , Aged, 80 and over , Arthritis, Infectious/epidemiology , Databases, Factual , Female , Humans , Incidence , Logistic Models , Male , Middle Aged , Multivariate Analysis , Myocardial Infarction/epidemiology , Odds Ratio , Pneumonia/epidemiology , Prosthesis-Related Infections/epidemiology , Pulmonary Embolism/epidemiology , Quality Improvement , Risk Factors , Shoulder/surgery , Shoulder Joint/surgery , Shoulder Prosthesis , Surgical Wound Infection/epidemiology , Venous Thrombosis/epidemiology
14.
Mil Med ; 181(10): 1308-1313, 2016 10.
Article in English | MEDLINE | ID: mdl-27753569

ABSTRACT

PURPOSE: To comprehensively quantify established risk factors for the development of lower extremity stress fractures within a contemporary U.S. military cohort. METHODS: Using the Defense Medical Epidemiological Database, all U.S. service members diagnosed with tibia/fibula, metatarsal, other bone, femoral neck, and femoral shaft stress fractures were identified based on International Classification of Diseases, 9th Revision, Clinical Modification codes from 2009 to 2012. Incidence rates (IRs) and adjusted IRs controlling for sex, race, age, rank, and branch of service were obtained with multivariate Poisson regression analysis. RESULTS: Between 2009 and 2012, 31,758 lower extremity stress fractures occurred among 5,580,875 person-years, for an unadjusted IR of 5.69 per 1,000 person-years. Tibial/fibular involvement (40%) was the most common. A bimodal age distribution revealed that service members under 20 years old (23.06; 95% confidence interval [CI] 22.52, 23.55) or ≥40 years old (6.86; 95% CI 6.65, 7.07) had the greatest risk. Females were at higher risk for total lower extremity stress fracture (3.11; 95% CI, 3.03, 3.18). White service members were also at greater risk than Black service members (p < 0.0001). The majority of stress fractures (77.5%) occurred in junior enlisted service members, with the Army and Marines most at risk. CONCLUSION: This investigation elucidates several nonmodifiable risk factors for stress fractures in the military and may inform screening measures to reduce this significant source of disability.


Subject(s)
Fractures, Stress/epidemiology , Lower Extremity/injuries , Military Personnel/statistics & numerical data , Adolescent , Adult , Age Distribution , Female , Humans , Incidence , Male , Racial Groups/statistics & numerical data , United States/epidemiology
15.
J Arthroplasty ; 31(10): 2108-14, 2016 10.
Article in English | MEDLINE | ID: mdl-27181491

ABSTRACT

BACKGROUND: This investigation sought to quantify incidence rates (IRs) and risk factors for primary and secondary (ie, posttraumatic) osteoarthritis (OA) of the knee in an active military population. METHODS: We performed a retrospective review of United States military active duty servicemembers with a first-time diagnosis of primary (International Classification of Diseases, 9th Revision code: 715.16) and secondary (International Classification of Diseases, 9th Revision code: 715.26) OA of the knee between 2005 and 2014 using the Defense Medical Epidemiology Database. IRs and 95% CIs were expressed per 1000 person-years, with stratified subgroup analysis adjusted for sex, age, race, military rank, and branch of military service. Relative risk factors were evaluated using IR ratios and multiple regression analysis. RESULTS: A total of 21,318 cases of OA of the knee were identified among an at-risk population of 13,820,906 person-years for an overall IR of 1.54 per 1000 person-years, including 19,504 cases of primary (IR: 1.41) and 1814 cases of secondary OA (IR: 0.13). The IRs of both primary and secondary OA increased significantly from 2005 to 2014. Increasing age (P < .0001); black race (P < .001); senior military rank (P < .0001); and Army, Marines, and Air Force services (P < .0001) were significantly associated with an increased risk for knee OA. CONCLUSION: This study is the first large-scale report of knee OA in a young athletic population. An increasing incidence and several risk factors for knee OA were identified, indicating a need for better preventative strategies and forecasting increased demand for knee arthroplasty among US military servicemembers.


Subject(s)
Knee Injuries/complications , Military Personnel/statistics & numerical data , Osteoarthritis, Knee/epidemiology , Adult , Databases, Factual , Female , Humans , Incidence , International Classification of Diseases , Male , Multivariate Analysis , Osteoarthritis, Knee/etiology , Regression Analysis , Retrospective Studies , Risk Factors , United States/epidemiology , Young Adult
16.
J Arthroplasty ; 31(6): 1170-1174, 2016 06.
Article in English | MEDLINE | ID: mdl-26777548

ABSTRACT

BACKGROUND: Current indices fail to consistently predict risk for major adverse cardiac events after major total joint arthroplasty. METHODS: All primary total knee arthroplasty (TKA) and total hip arthroplasty (THA) cases were identified from the National Surgical Quality Improvement Program data set. Based on prior analyses, age ≥80 years, history of hypertension, and history of cardiac disease were evaluated as predictors of myocardial infarction and cardiac arrest using stepwise multivariate logistic regression. A series of predictive scores was constructed and weighted to identify the influence of each variable on 30-day postoperative cardiac events, with performance compared against the Revised Cardiac Risk Index (RCRI). RESULTS: Among 85,129 patients, age ≥80 years, hypertension, and a history of cardiac disease were all statistically significant predictors of postoperative cardiac events (0.32%; n = 275) after TKA and THA (P ≤ .02). Equal weighting of all variables maintained the highest discriminative capacity in both THA and TKA cohorts. Adjusted models explained 75% and 71% of the variation in postoperative cardiac events for those with THA and TKA, respectively, without statistically significant lack of fit (P = .52; P = .23, respectively). Conversely, the RCRI was not a significant predictor of postoperative cardiac events after TKA (odds ratio, 3.36; 95% CI, 0.19, 58.04; P = .40), although it maintained a similar discriminative capacity after THA (76%). CONCLUSION: The current total joint arthroplasty Cardiac Risk Index score was the most economical in predicting postoperative cardiac complications after primary unilateral TKA and THA. The RCRI was not a significant predictor of perioperative cardiac events for TKA patients but performed similarly to the current model for THA.
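
The score construction described here amounts to counting equally weighted risk factors and then comparing discrimination against an existing index. The sketch below illustrates that comparison using scikit-learn's ROC AUC (c-statistic); the file and column names (age_80, hypertension, cardiac_history, cardiac_event, rcri_score) are hypothetical, and the equal-weight sum is a generic stand-in for the scoring approach rather than the study's exact model.

```python
# Sketch only: assumes 0/1 risk-factor columns, a 0/1 outcome column, and a
# precomputed comparator score in an NSQIP-style patient-level table.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("nsqip_tja_extract.csv")  # hypothetical extract

# Equal weighting: one point per risk factor present (score range 0-3)
df["tja_cri"] = df[["age_80", "hypertension", "cardiac_history"]].sum(axis=1)

auc_new = roc_auc_score(df["cardiac_event"], df["tja_cri"])
auc_rcri = roc_auc_score(df["cardiac_event"], df["rcri_score"])
print(f"Equal-weight score c-statistic: {auc_new:.2f}")
print(f"RCRI c-statistic:               {auc_rcri:.2f}")
```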


Subject(s)
Arthroplasty, Replacement, Hip/adverse effects , Arthroplasty, Replacement, Knee/adverse effects , Heart Arrest/etiology , Myocardial Infarction/etiology , Postoperative Complications/etiology , Risk Adjustment , Aged , Aged, 80 and over , Female , Humans , Knee Joint , Logistic Models , Male , Middle Aged
17.
Knee Surg Sports Traumatol Arthrosc ; 24(10): 3329-3338, 2016 Oct.
Article in English | MEDLINE | ID: mdl-26387125

ABSTRACT

PURPOSE: Hospital readmission is emerging as an important quality measure, yet modifiable predictors of readmission remain unknown. This study was designed to identify risk factors for readmission following revision total knee arthroplasty. METHODS: The National Surgical Quality Improvement Program dataset was queried to identify patients undergoing revision total knee arthroplasty from 2011 to 2012. Patient demographics, medical co-morbidities, laboratory values, surgical characteristics and surgical outcomes were examined using bivariate and multivariate logistic regression to identify significant predictors for readmission within 30 days of discharge. RESULTS: There were 108 readmissions (6.2 %) among 1754 patients. Risk factors for readmission included a history of transient ischaemic attack/cerebrovascular accident (OR 3.47; 95 % CI 1.30, 9.25), female sex (OR 1.75, 95 % CI 1.15, 2.68) and general anaesthesia (OR 1.74, 95 % CI 1.09, 2.79). Hypertension treated with medication (OR 0.61, 95 % CI 0.39, 0.96) was associated with a lower risk of readmission. Post-operative complications that were significant predictors of hospital readmission included periprosthetic joint infection (OR 15.09, 95 % CI 5.57, 40.91), superficial wound infection (OR 16.57, 95 % CI 5.82, 47.22) and deep venous thrombosis (OR 8.59, 95 % CI 2.36, 31.24). CONCLUSIONS: The preferred use of neuraxial anaesthesia and coordinated discharge planning in patients with a history of transient ischaemic attack/cerebrovascular accident may reduce the risk of readmission following discharge after revision total knee arthroplasty. Additionally, patients with post-operative infections and deep venous thrombosis following these procedures can benefit from close observation in the first weeks following discharge to minimize the likelihood of readmission. LEVEL OF EVIDENCE: III.


Subject(s)
Arthroplasty, Replacement, Knee/adverse effects , Patient Readmission , Postoperative Complications , Aged , Aged, 80 and over , Comorbidity , Female , Humans , Logistic Models , Male , Middle Aged , Multivariate Analysis , Patient Readmission/standards , Quality Improvement , Retrospective Studies , Risk Factors
18.
Foot Ankle Int ; 36(7): 780-6, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25791034

ABSTRACT

BACKGROUND: Literature evaluating surgical outcomes after ankle fixation in an active patient population is limited. This study determined occupational outcomes and return to running following ankle fracture fixation in a military cohort. METHODS: All service members undergoing ankle fracture fixation at a single military hospital from August 2007 to August 2012 were reviewed. Univariate analysis determined the association between patient demographic information, type of fracture fixation, and the development of posttraumatic ankle arthritis and functional outcomes, including medical separation, return to running, and reoperation. Seventy-two primary ankle fracture fixation procedures were performed on patients with a mean age of 29.1 years. The majority of patients were male (88%), were 25 years of age or older (61%), were of junior rank (57%), underwent unimalleolar fracture fixation (78%), and did not require syndesmotic fixation (54%). The average follow-up was 35.9 months. RESULTS: The mean time to radiographic union was 8.6 weeks. Twelve service members (17%) were medically separated from the military due to refractory pain following ankle fracture fixation, with a minimum of 2 years of occupational follow-up. Among military service members undergoing ankle fracture fixation, 64% returned to running. Service members with higher occupational demands showed a statistical trend toward returning to running (odds ratio [OR] 2.49; 95% CI, 0.93-6.68). Junior enlisted rank was a risk factor for medical separation (OR 11.00; 95% CI, 1.34-90.57). Radiographic evidence of posttraumatic ankle osteoarthritis occurred in 8 (11%) service members. CONCLUSIONS: At a mean 3-year follow-up, 83% of service members undergoing ankle fracture fixation remained on active duty or had successfully completed their military service, while nearly two-thirds returned to occupationally required daily running. LEVEL OF EVIDENCE: Level IV, retrospective case series.


Subject(s)
Ankle Fractures/surgery , Fracture Fixation, Internal , Military Personnel , Return to Work , Running/physiology , Adult , Ankle Fractures/physiopathology , Ankle Joint/pathology , Female , Humans , Male , Occupations , Osteoarthritis/pathology , Recovery of Function/physiology , Retrospective Studies , Young Adult
19.
J Orthop Trauma ; 29(12): e476-82, 2015 Dec.
Article in English | MEDLINE | ID: mdl-25785357

ABSTRACT

OBJECTIVE: The purpose was to calculate the incidence rates and determine risk factors for 30-day postoperative mortality and morbidity after ankle fracture open reduction and internal fixation (ORIF). METHODS: The NSQIP database was queried to identify patients undergoing ankle fracture ORIF from 2006 to 2011, with extraction of patient-based or surgical variables and the 30-day clinical course. Multivariable logistic regression analysis identified significant predictors of outcome measures. RESULTS: Among the 3328 patients identified, mean age was 50.3 (±18.2) years, and diabetes mellitus (12.8%) and body mass index ≥40 kg/m² (9.2%) were documented. The 30-day mortality rate was 0.30%, and complications occurred in 5.1%. Chronic obstructive pulmonary disease [odds ratio (OR): 4.23, 95% confidence interval (CI): 1.19-15.06] and a nonindependent functional status before surgery (OR: 2.25, 95% CI: 1.13-4.51) were the sole independent predictors of mortality and major local complications, respectively. Major local complications occurred in 2.2% of patients, and significant predictors were peripheral vascular disease (OR: 6.14; 95% CI: 1.95-19.35), open wound (OR: 5.04; 95% CI: 2.25-11.27), nonclean wound classification (OR: 3.02; 95% CI: 1.31-6.93), and smoking (OR: 2.85; 95% CI: 1.42-5.70). Independent predictors of hospital stay >3 days were cardiac disease, age 70 years or older, open wound, partially/totally dependent functional status, American Society of Anesthesiologists (ASA) classification ≥3, body mass index ≥40 kg/m², bimalleolar or trimalleolar ankle fracture pattern, female sex, and diabetes. CONCLUSIONS: Chronic obstructive pulmonary disease increased the risk of mortality after ankle fracture ORIF. Risk factors for postoperative complications included peripheral vascular disease, open wound, nonclean wound classification, age 70 years or older, and ASA classification ≥3. LEVEL OF EVIDENCE: Prognostic Level II. See Instructions for Authors for a complete description of levels of evidence.


Subject(s)
Ankle Fractures/mortality , Ankle Fractures/surgery , Fracture Fixation, Internal/mortality , Peripheral Vascular Diseases/mortality , Postoperative Complications/mortality , Pulmonary Disease, Chronic Obstructive/mortality , Adult , Age Distribution , Aged , Aged, 80 and over , Ankle Fractures/diagnosis , Comorbidity , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Risk Factors , Sex Distribution , Survival Rate , Texas/epidemiology , Treatment Outcome
20.
J Bone Joint Surg Am ; 96(24): 2025-31, 2014 Dec 17.
Article in English | MEDLINE | ID: mdl-25520335

ABSTRACT

BACKGROUND: Cardiac complications are a major cause of postoperative morbidity. The purpose of this study was to determine the rates, risk factors, and time of occurrence for cardiac complications within thirty days after primary unilateral total knee arthroplasty and total hip arthroplasty. METHODS: The American College of Surgeons National Surgical Quality Improvement Program data set from 2006 to 2011 was used to identify all total knee arthroplasties and total hip arthroplasties. Cardiac complications occurring within thirty days after surgery were the primary outcome measure. Patients were designated as having a history of cardiac disease if they had a new diagnosis or exacerbation of chronic congestive heart failure or a history of angina within thirty days before surgery, a history of myocardial infarction within six months, and/or any percutaneous cardiac intervention or other major cardiac surgery at any time. An analysis of the occurrence of all major cardiac complications and deaths within the thirty-day postoperative time frame was performed. RESULTS: For the 46,322 patients managed with total knee arthroplasty or total hip arthroplasty, the cardiac complication rate was 0.33% (n = 153) at thirty days postoperatively. In both the total knee arthroplasty and total hip arthroplasty groups, an age of eighty years or more (odds ratios [ORs] = 27.95 and 3.72), hypertension requiring medication (ORs = 4.74 and 2.59), and a history of cardiac disease (ORs = 4.46 and 2.80) were the three most significant predictors for the development of postoperative cardiac complications. Of the patients with a cardiac complication, the time of occurrence was within seven days after surgery for 79% (129 of the 164 patients for whom the time of occurrence could be determined). CONCLUSIONS: An age of eighty years or more, a history of cardiac disease, and hypertension requiring medication are significant risk factors for developing postoperative cardiac complications following primary unilateral total knee arthroplasty and total hip arthroplasty. Consideration should be given to a preoperative cardiology evaluation and co-management in the perioperative period for individuals with these risk factors.


Subject(s)
Arthroplasty, Replacement, Hip/adverse effects , Arthroplasty, Replacement, Knee/adverse effects , Heart Arrest/epidemiology , Myocardial Infarction/epidemiology , Aged , Aged, 80 and over , Female , Heart Arrest/etiology , Humans , Incidence , Male , Middle Aged , Myocardial Infarction/etiology , Retrospective Studies , Risk Factors