Results 1 - 20 of 22
1.
Clin Transplant ; 31(10)2017 Oct.
Article in English | MEDLINE | ID: mdl-28758236

ABSTRACT

BACKGROUND: As patients with chronic kidney disease become older, there is greater need to identify who will most benefit from kidney transplantation. Analytic morphomics has emerged as an objective risk assessment tool distinct from chronologic age. We hypothesize that morphometric age is a significant predictor of survival following transplantation. METHODS: A retrospective cohort of 158 kidney transplant patients from 2005 to 2014 with 1-year preoperative imaging was identified. Based on a control population comprising trauma patients and kidney donors, morphometric age was calculated using the validated characteristics of psoas area, psoas density, and abdominal aortic calcification. The primary outcome was post-transplant survival. RESULTS: Cox regression showed morphometric age was a significant predictor of survival (hazard ratio, 1.06 per morphometric year [95% confidence interval, 1.03-1.08]; P < .001). Chronologic age was not significant (hazard ratio, 1.03 per year [0.98-1.07]; P = .22). Among the chronologically oldest patients, those with younger morphometric age had greater survival rates compared to those with older morphometric age. CONCLUSIONS: Morphometric age predicts survival following kidney transplantation. Particularly for older patients, it offers improved risk stratification compared to chronologic age. Morphomics may improve the transplant selection process and provide a better assessment of prospective survival benefit.
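The "morphometric age" construction described in this abstract can be sketched as a linear regression of chronologic age on standardized morphomic features in a control population, with the fitted model then applied to transplant candidates. The feature values, coefficients, and data below are entirely synthetic illustrations, not the published model.

```python
import numpy as np

# Hypothetical sketch: regress chronologic age on standardized morphomic
# features (psoas area, psoas density, aortic calcification) in a synthetic
# control population, then apply the model to a new patient.
rng = np.random.default_rng(0)

n = 200
psoas_area = rng.normal(2500, 400, n)       # mm^2, synthetic
psoas_density = rng.normal(45, 8, n)        # HU, synthetic
aortic_calcification = rng.normal(5, 2, n)  # arbitrary score, synthetic
# Synthetic "truth": age rises as muscle shrinks and calcification grows.
age = (60 - 0.005 * (psoas_area - 2500) - 0.3 * (psoas_density - 45)
       + 1.5 * (aortic_calcification - 5) + rng.normal(0, 3, n))

def standardize(x):
    return (x - x.mean()) / x.std()

# Design matrix: intercept plus z-scored features; fit by least squares.
X = np.column_stack([
    np.ones(n),
    standardize(psoas_area),
    standardize(psoas_density),
    standardize(aortic_calcification),
])
coef, *_ = np.linalg.lstsq(X, age, rcond=None)

def morphometric_age(area, density, calc):
    """Map one patient's morphomic features to a model-predicted age."""
    feats = np.array([
        1.0,
        (area - psoas_area.mean()) / psoas_area.std(),
        (density - psoas_density.mean()) / psoas_density.std(),
        (calc - aortic_calcification.mean()) / aortic_calcification.std(),
    ])
    return float(feats @ coef)

# A candidate with a small, low-density psoas and heavy calcification maps
# to an older morphometric age than a candidate with average features.
old_body = morphometric_age(1800, 30, 9)
avg_body = morphometric_age(2500, 45, 5)
```

The key design point is that the model is trained only on controls, so a patient's morphometric age can diverge from their chronologic age in either direction.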


Subject(s)
Graft Rejection/diagnostic imaging , Graft Rejection/mortality , Kidney Failure, Chronic/mortality , Kidney Transplantation/mortality , Postoperative Complications , Tomography, X-Ray Computed/methods , Adult , Age Factors , Aged , Aged, 80 and over , Case-Control Studies , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/etiology , Graft Survival , Humans , Kidney Failure, Chronic/diagnostic imaging , Kidney Failure, Chronic/surgery , Kidney Function Tests , Kidney Transplantation/adverse effects , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Assessment , Risk Factors
2.
Clin Transplant ; 31(9)2017 Sep.
Article in English | MEDLINE | ID: mdl-28640481

ABSTRACT

BACKGROUND: Better risk assessment tools are needed to predict post-transplantation diabetes mellitus (PTDM). Using analytic morphomic measurements from computed tomography (CT) scans, we aimed to identify specific measures of body composition associated with PTDM. METHODS: We retrospectively reviewed 99 non-diabetic kidney transplant recipients who received pre-transplant CT scans at a single institution between 1/2005 and 5/2014. Analytic morphomic techniques were used to measure abdominal adiposity, abdominal size, and psoas muscle area and density, standardized by gender. We measured the associations of these morphomic factors with PTDM. RESULTS: One-year incidence of PTDM was 18%. The morphomic factors significantly associated with PTDM included visceral fat area (OR=1.84 per standard deviation increase, P=.020), body depth (OR=1.79, P=.035), and total body area (OR=1.67, P=.049). Clinical factors significantly associated with PTDM included African American race (OR=3.01, P=.044), hypertension (OR=2.97, P=.041), and dialysis vintage (OR=1.24 per year on dialysis, P=.048). Body mass index was not associated with PTDM (OR=1.05, P=.188). On multivariate modeling, visceral fat area was an independent predictor of PTDM (OR=1.91, P=.035). CONCLUSIONS: Analytic morphomics can identify pre-transplant measurements of body composition that are predictive of PTDM in kidney transplant recipients. Pre-transplant imaging contains a wealth of underutilized data that may inform PTDM prevention strategies.
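The "OR per standard deviation increase" reporting used in this abstract can be illustrated by standardizing a predictor, fitting a one-variable logistic regression, and exponentiating the slope. This is a generic sketch with synthetic data; the variable name and coefficients are assumptions, not values from the study.

```python
import math
import random

# Sketch of an odds ratio per 1-SD increase in a predictor, using a
# single-variable logistic regression fit by gradient descent.
random.seed(1)

n = 400
visceral_fat = [random.gauss(100, 30) for _ in range(n)]  # cm^2, synthetic
mean = sum(visceral_fat) / n
sd = (sum((x - mean) ** 2 for x in visceral_fat) / n) ** 0.5
z = [(x - mean) / sd for x in visceral_fat]  # standardized predictor

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Synthetic outcome: higher visceral fat -> higher probability of the event.
outcome = [1 if random.random() < sigmoid(-1.5 + 0.6 * zi) else 0 for zi in z]

# Gradient descent on the average negative log-likelihood.
b0, b1 = 0.0, 0.0
lr = 0.1
for _ in range(2000):
    g0 = sum(sigmoid(b0 + b1 * zi) - yi for zi, yi in zip(z, outcome)) / n
    g1 = sum((sigmoid(b0 + b1 * zi) - yi) * zi for zi, yi in zip(z, outcome)) / n
    b0 -= lr * g0
    b1 -= lr * g1

# Because the predictor is z-scored, exp(slope) is the odds ratio per SD.
or_per_sd = math.exp(b1)
```

Standardizing before fitting is what makes odds ratios comparable across predictors measured in different units, as in the abstract's comparison of fat area, body depth, and dialysis vintage.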


Subject(s)
Body Composition , Body Weights and Measures/methods , Diabetes Mellitus/etiology , Kidney Transplantation , Postoperative Complications/etiology , Tomography, X-Ray Computed , Adult , Diabetes Mellitus/diagnosis , Diabetes Mellitus/epidemiology , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Postoperative Complications/diagnosis , Postoperative Complications/epidemiology , Preoperative Period , Retrospective Studies , Risk Factors
3.
Clin Transplant ; 30(3): 289-94, 2016 Mar.
Article in English | MEDLINE | ID: mdl-26717257

ABSTRACT

BACKGROUND: Current measures of obesity do not accurately describe body composition. Using cross-sectional imaging, objective measures of musculature and adiposity are possible and may inform efforts to optimize liver transplantation outcomes. METHODS: Abdominal visceral fat area and psoas muscle cross-sectional area were measured on CT scans for 348 liver transplant recipients. After controlling for donor and recipient characteristics, survival analysis was performed using Cox regression. RESULTS: Visceral fat area was significantly associated with post-transplant mortality (p < 0.001; HR = 1.06 per 10 cm², 95% CI: 1.04-1.09), as were positive hepatitis C status (p = 0.004; HR = 1.78, 95% CI: 1.21-2.61) and total psoas area (TPA) (p < 0.001; HR = 0.91 per cm², 95% CI: 0.88-0.94). Among patients with smaller TPA, the patients with high visceral fat area had 71.8% one-yr survival compared to 81.8% for those with low visceral fat area (p = 0.15). At five yr, the smaller muscle patients with high visceral fat area had 36.9% survival compared to 58.2% for those with low visceral fat area (p = 0.023). CONCLUSIONS: Abdominal adiposity is associated with survival after liver transplantation, especially in patients with small trunk muscle size. When coupled with trunk musculature, abdominal adiposity offers direct characterization of body composition that can aid preoperative risk evaluation and inform transplant decision-making.


Subject(s)
Adiposity , Body Composition , Intra-Abdominal Fat/pathology , Liver Diseases/mortality , Liver Transplantation/mortality , Obesity/mortality , Body Mass Index , Cross-Sectional Studies , Female , Follow-Up Studies , Humans , Image Processing, Computer-Assisted/methods , Intra-Abdominal Fat/diagnostic imaging , Liver Diseases/diagnostic imaging , Liver Diseases/surgery , Male , Middle Aged , Obesity/diagnostic imaging , Prognosis , Psoas Muscles/diagnostic imaging , Retrospective Studies , Risk Factors , Survival Rate , Tomography, X-Ray Computed/methods
4.
Clin Transplant ; 29(12): 1076-80, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26358578

ABSTRACT

INTRODUCTION: Sarcopenic liver transplant recipients have higher rates of mortality, but mechanisms underlying these rates remain unclear. Failure to rescue (FTR) has been shown to be a primary driver of mortality following major general and vascular surgery. We hypothesized that FTR is common in sarcopenic liver transplant recipients. METHODS: We retrospectively reviewed 348 liver transplant recipients with perioperative CT scans. Analytic morphomic techniques were used to assess trunk muscle size via total psoas area (TPA). One-yr major complication and FTR rates were calculated across TPA tertiles. RESULTS: The one-yr complication rate was 77% and the FTR rate was 19%. Multivariate regression showed TPA as a significant predictor of FTR (OR = 0.27 per 1000 mm² increase in TPA, p < 0.001). Compared to patients in the largest muscle tertile, patients in the smallest tertile had 1.4-fold higher adjusted complication rates (91% vs. 66%) and 2.8-fold higher adjusted FTR rates (22% vs. 8%). DISCUSSION: These results suggest that mortality in sarcopenic liver transplant recipients may be strongly related to FTR. Efforts aimed at early recognition and management of complications may decrease postoperative mortality. Additionally, this work highlights the need for expanded multicenter collaborations aimed at collection and analysis of postoperative complications in liver transplant recipients.


Subject(s)
End Stage Liver Disease/complications , Graft Rejection/etiology , Hospital Mortality , Liver Transplantation/adverse effects , Postoperative Complications , Sarcopenia/etiology , Adult , End Stage Liver Disease/surgery , Female , Follow-Up Studies , Graft Survival , Humans , Male , Middle Aged , Retrospective Studies , Risk Factors , Sarcopenia/diagnosis , Treatment Failure
5.
J Surg Res ; 199(1): 51-5, 2015 Nov.
Article in English | MEDLINE | ID: mdl-25990695

ABSTRACT

BACKGROUND: It is well established that sarcopenic patients are at higher risk of postoperative complications and short-term health care utilization. Less well understood is how these patients fare over the long term after surviving the immediate postoperative period. We explored costs over the first postoperative year among sarcopenic patients. METHODS: We identified 1279 patients in the Michigan Surgical Quality Collaborative database who underwent inpatient elective surgery at a single institution from 2006-2011. Sarcopenia, defined by gender-stratified tertiles of lean psoas area, was determined from preoperative computed tomography scans using validated analytic morphomics. Data were analyzed to assess sarcopenia's relationship to costs, readmissions, discharge location, intensive care unit admissions, hospital length of stay, and mortality. Multivariate models were adjusted for patient demographics and surgical risk factors. RESULTS: Sarcopenia was independently associated with increased adjusted costs at 30, 90, and 180 d, but not at 365 d. The difference in adjusted postsurgical costs between sarcopenic and nonsarcopenic patients was $16,455 at 30 d and $14,093 at 1 y. Sarcopenic patients were more likely to be discharged somewhere other than home (P < 0.001). Sarcopenia was not an independent predictor of increased readmission rates in the postsurgical year. CONCLUSIONS: The effects of sarcopenia on health care costs are concentrated in the immediate postoperative period. It may be appropriate to allocate additional resources to sarcopenic patients in the perioperative setting to reduce the incidence of negative postoperative outcomes.


Subject(s)
Hospital Costs/statistics & numerical data , Postoperative Care/economics , Sarcopenia/surgery , Adult , Aged , Critical Care/economics , Female , Humans , Length of Stay/economics , Linear Models , Logistic Models , Male , Michigan , Middle Aged , Multivariate Analysis , Patient Readmission/economics , Patient Readmission/statistics & numerical data , Postoperative Complications/economics , Postoperative Complications/therapy , Retrospective Studies , Sarcopenia/economics , Sarcopenia/mortality
6.
Clin Transplant ; 29(5): 458-64, 2015 May.
Article in English | MEDLINE | ID: mdl-25740081

ABSTRACT

UNLABELLED: Among liver transplant recipients, development of post-transplant complications such as new-onset diabetes after transplantation (NODAT) is common and highly morbid. Current methods of predicting patient risk are inaccurate in the pre-transplant period, making implementation of targeted therapies difficult. We sought to determine whether analytic morphomics (using computed tomography scans) could be used to predict the incidence of NODAT. We analyzed peri-transplant scans from 216 patients with varying indications for liver transplantation, among whom 61 (28%) developed NODAT. Combinations of visceral fat, subcutaneous fat, and psoas area were considered in addition to traditional risk factors. On multivariate analysis adjusting for usual risk factors such as type of immunosuppression, subcutaneous fat thickness remained significantly associated with NODAT (OR = 1.43, 95% CI 1.00-1.88, p = 0.047). Subgroup analysis showed that patients with later-onset of NODAT had higher visceral fat, whereas subcutaneous fat thickness was more correlated with earlier-onset of NODAT (using 10 months post-transplant as the cut-off). CONCLUSION: Analytic morphomics may be used to help assess NODAT risk in patients undergoing liver transplantation.


Subject(s)
Diabetes Mellitus/diagnosis , Diabetes Mellitus/etiology , Graft Rejection/epidemiology , Intra-Abdominal Fat/pathology , Liver Diseases/surgery , Liver Transplantation/adverse effects , Postoperative Complications , Subcutaneous Fat/pathology , Age of Onset , Female , Follow-Up Studies , Graft Rejection/etiology , Graft Survival , Humans , Incidence , Male , Michigan/epidemiology , Middle Aged , Prognosis , Retrospective Studies , Risk Factors , Tomography, X-Ray Computed
7.
J Surg Res ; 193(1): 497-503, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25201576

ABSTRACT

BACKGROUND: The component separation technique (CST) is an important technique now used frequently in complex ventral hernia repair (VHR). Although this technique has demonstrated superior success rates, there is a paucity of research describing how release of the external obliques coupled with rectus myofascial advancement alters the morphology of the abdominal architecture. In this study, we apply the new concept of analytic morphomics to describe the immediate changes in morphology of the abdomen that take place after VHR by CST. METHODS: We identified 21 patients who underwent VHR by CST and received both preoperative and postoperative computed tomography scans between 2004 and 2009 in our clinical database. The surgical technique involved incisional release of the external oblique muscle lateral to the linea semilunaris with rectus abdominis myofascial advancement in all patients. Using semiautomated morphomic analysis, we measured the pre- and post-operative dimensions of the abdominal wall including the anterior-posterior distance from the anterior vertebra to skin and fascia along with the circumferential area of the skin and fascial compartments. Paired Student t-tests were used to compare pre- and post-operative values. RESULTS: After hernia repair, there was a decrease in the anterior vertebra-to-skin distance (16.6 cm to 15.8 cm, P = 0.007). There were also decreases in total body area (968.0 cm² to 928.6 cm², P = 0.017) and total body circumference (113.6 cm to 111.4 cm, P = 0.016). The distance from fascia to skin decreased as well, almost to the point of statistical significance (3.3 cm to 2.9 cm, P = 0.0505). Interestingly, fascia area and circumference did not decrease significantly after the operation (578.2 cm² to 572.5 cm², P = 0.519, and 89.1 cm to 88.6 cm, P = 0.394, respectively). CONCLUSIONS: Morphomic analysis can be used to compare pre- and post-operative changes in patients undergoing abdominal surgery. Our study demonstrates that component separation affects the dimensions of the entire abdomen, but leaves the fascia area and circumference relatively unchanged. These changes in the abdominal wall may help explain the muscular changes observed as a result of this operation and demonstrate that this is a functional operation that preserves fascial area. By better defining the effects of this procedure, we can better understand the reason for its clinical success.


Subject(s)
Abdominal Wall/surgery , Fasciotomy , Hernia, Ventral/surgery , Herniorrhaphy/methods , Rectus Abdominis/surgery , Adult , Anatomic Landmarks , Fascia/diagnostic imaging , Hernia, Ventral/diagnostic imaging , Humans , Middle Aged , Rectus Abdominis/diagnostic imaging , Retrospective Studies , Skin , Spine , Subcutaneous Fat/diagnostic imaging , Subcutaneous Fat/surgery , Tomography, X-Ray Computed , Wound Healing
8.
Clin Transplant ; 28(10): 1092-8, 2014 Oct.
Article in English | MEDLINE | ID: mdl-25040933

ABSTRACT

INTRODUCTION: Better measures of liver transplant risk stratification are needed. Our previous work noted a strong relationship between psoas muscle area and survival following liver transplantation. The dorsal muscle group is easier to measure, but it is unclear whether its area is similarly correlated with surgical outcomes. METHODS: Our study population included liver transplant recipients with a preoperative CT scan. Cross-sectional areas of the dorsal muscle group at the T12 vertebral level were measured. The primary outcomes for this study were one- and five-yr mortality and one-yr complications. The relationship between dorsal muscle group area and post-transplantation outcome was assessed using univariate and multivariate techniques. RESULTS: Dorsal muscle group area measurements were strongly associated with psoas area (r = 0.72; p < 0.001). Postoperative outcomes were observed in 325 patients. Multivariate logistic regression revealed dorsal muscle group area to be a significant predictor of one-yr mortality (odds ratio [OR] = 0.53, p = 0.001), five-yr mortality (OR = 0.53, p < 0.001), and one-yr complications (OR = 0.67, p = 0.007). CONCLUSION: Larger dorsal muscle group size is associated with improved post-transplantation outcomes. This muscle group is easier to measure and may represent a clinically relevant marker of postoperative risk.


Subject(s)
Liver Diseases/surgery , Liver Transplantation/adverse effects , Psoas Muscles/physiopathology , Cross-Sectional Studies , Female , Follow-Up Studies , Humans , Liver Diseases/physiopathology , Male , Middle Aged , Postoperative Complications , Prognosis , Retrospective Studies , Risk Factors , Survival Rate
9.
J Surg Res ; 192(1): 19-26, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25015750

ABSTRACT

BACKGROUND: Older patients account for nearly half of the United States surgical volume, and age alone is insufficient to predict surgical fitness. Various metrics exist for risk stratification, but little work has been done to describe the association between measures. We aimed to determine whether analytic morphomics, a novel objective risk assessment tool, correlates with functional measures currently recommended in the preoperative evaluation of older patients. MATERIALS AND METHODS: We retrospectively identified 184 elective general surgery patients aged >70 y with both a preoperative computed tomography scan and a Vulnerable Elderly Surgical Pathways and outcomes Assessment within 90 d of surgery. We used analytic morphomics to calculate trunk muscle size (or total psoas area [TPA]) and univariate logistic regression to assess the relationship between TPA and domains of geriatric function: mobility, basic and instrumental activities of daily living (ADLs), and cognitive ability. RESULTS: Greater TPA was inversely correlated with impaired mobility (odds ratio [OR] = 0.46, 95% confidence interval [CI] 0.25-0.85, P = 0.013). Greater TPA was associated with decreased odds of deficit in any basic ADLs (OR = 0.36 per standard deviation unit increase in TPA, 95% CI 0.15-0.87, P < 0.03) and any instrumental ADLs (OR = 0.53, 95% CI 0.34-0.81; P < 0.005). Finally, patients with larger TPA were less likely to have cognitive difficulty assessed by Mini-Cog scale (OR = 0.55, 95% CI 0.35-0.86, P < 0.01). Controlling for age did not change results. CONCLUSIONS: Older surgical candidates with greater trunk muscle size, or greater TPA, are less likely to have physical impairment, cognitive difficulty, or decreased ability to perform daily self-care. Further research linking these assessments to clinical outcomes is needed.


Subject(s)
Elective Surgical Procedures , Geriatric Assessment/methods , Patient Selection , Physical Fitness , Preoperative Care/methods , Psoas Muscles/anatomy & histology , Activities of Daily Living , Aged , Aged, 80 and over , Cognition , Cross-Sectional Studies , Female , Humans , Logistic Models , Male , Motor Activity , Psoas Muscles/physiology , Retrospective Studies , Risk Assessment/methods
10.
J Surg Res ; 192(1): 76-81, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25016439

ABSTRACT

BACKGROUND: Objective measures for preoperative risk assessment are needed to inform surgical risk stratification. Previous studies using preoperative imaging have shown that the psoas muscle is a significant predictor of postoperative outcomes. Because psoas measurements are not always available, additional trunk muscles should be identified as alternative measures of risk assessment. Our research assessed the relationship between paraspinous muscle area, psoas muscle area, and surgical outcomes. METHODS: Using the Michigan Surgical Quality Collaborative database, we retrospectively identified 1309 surgical patients who had preoperative abdominal computerized tomography scans within 90 d of operation. Analytic morphomic techniques were used to measure the cross-sectional area of the paraspinous muscle at the T12 vertebral level. The primary outcome was 1-y mortality. Analyses were stratified by sex, and logistic regression was used to assess the relationship between muscle area and postoperative outcome. RESULTS: The measurements of paraspinous muscle area at T12 were normally distributed. There was a strong correlation between paraspinous muscle area at T12 and total psoas area at L4 (r = 0.72, P <0.001). Paraspinous area was significantly associated with 1-y mortality in both females (odds ratio = 0.70 per standard deviation increase in paraspinous area, 95% confidence interval 0.50-0.99, P = 0.046) and males (odds ratio = 0.64, 95% confidence interval 0.47-0.88, P = 0.006). CONCLUSIONS: Paraspinous muscle area correlates with psoas muscle area, and larger paraspinous muscle area is associated with lower mortality rates after surgery. This suggests that the paraspinous muscle may be an alternative to the psoas muscle in the context of objective measures of risk stratification.


Subject(s)
Digestive System Surgical Procedures/mortality , Elective Surgical Procedures/mortality , Paraspinal Muscles/anatomy & histology , Preoperative Care/methods , Psoas Muscles/anatomy & histology , Adult , Aged , Databases, Factual/statistics & numerical data , Female , Humans , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Predictive Value of Tests , Retrospective Studies , Risk Assessment/methods
11.
J Surg Res ; 192(2): 670-7, 2014 Dec.
Article in English | MEDLINE | ID: mdl-24972736

ABSTRACT

BACKGROUND: Sternal reconstruction with vascularized flaps is central to the management of sternal wound infections and mediastinitis but carries a high risk of complications. There is a need to identify reliable predictors of complication risk to help inform patients and clinicians in preparation for surgery. Unfortunately, body mass index and serum albumin may not be reliable predictors of complication rates. Analytic morphomics provides a robust quantitative method to measure patients' obesity as it pertains to their risk of complications in undergoing sternal reconstruction. METHODS: We identified 34 patients with preoperative computed tomography scans of the abdomen from a cohort of sternal reconstructions performed between 1997 and 2010. Using semiautomated analytic morphomics, we identified the patients' skin and fascia layers between the ninth and 12th thoracic spine levels; from these landmarks, we calculated morphomic measurements of the patients' abdomens, including their total body cross-sectional area and the cross-sectional area of their subcutaneous fat. We obtained the incidence of complications from chart review and correlated the incidence of complications (including seroma, hematoma, recurrent wounds, mediastinitis, tracheostomy, and death) with patients' morphomic measurements. RESULTS: Sixty-two percent of patients (n = 21) suffered complications after their operation. Those who suffered from complications, relative to those who did not have complications, had increased visceral fat area (12,547.2 mm² versus 6569.9 mm², P = 0.0080), subcutaneous fat area (16,520.2 mm² versus 8020.1 mm², P = 0.0036), total body area (91,028.6 mm² versus 67,506.5 mm², P = 0.0022), fascia area (69,238.4 mm² versus 56,730.9 mm², P = 0.0118), total body circumference (1101.8 mm versus 950.2 mm, P = 0.0017), and fascia circumference (967.5 mm versus 868.1 mm, P = 0.0077). We also demonstrated a significant positive correlation between the previously mentioned morphomic measurements and the incidence of complications in multivariate logistic regression models, with odds ratios ranging from 1.19-3.10 (P values ranging from 0.010-0.022). CONCLUSIONS: Increases in abdominal morphomic measurements correlate strongly with the incidence of complications in patients undergoing sternal reconstruction. This finding may influence preoperative risk stratification and surgical decision making in this patient population.


Subject(s)
Abdomen/anatomy & histology , Body Surface Area , Plastic Surgery Procedures/adverse effects , Plastic Surgery Procedures/methods , Postoperative Complications/etiology , Sternum/surgery , Adult , Aged , Fascia/anatomy & histology , Female , Humans , Incidence , Intra-Abdominal Fat/anatomy & histology , Male , Middle Aged , Obesity/complications , Obesity/epidemiology , Obesity/pathology , Postoperative Complications/epidemiology , Postoperative Complications/pathology , Predictive Value of Tests , Preoperative Period , Risk Factors , Sternum/diagnostic imaging , Subcutaneous Fat/anatomy & histology , Tomography, X-Ray Computed , Treatment Outcome
12.
J Surg Res ; 191(1): 106-12, 2014 Sep.
Article in English | MEDLINE | ID: mdl-24750985

ABSTRACT

BACKGROUND: Surgeons often face difficult decisions in selecting which patients can tolerate major surgical procedures. Although recent studies suggest the potential for trunk muscle size, as measured on preoperative imaging, to inform surgical risk, these measures are static and do not account for the effect of the surgery itself. We hypothesize that trunk muscle size will show dynamic changes over the perioperative period, and this change correlates with postoperative mortality risk. METHODS: A total of 425 patients who underwent inpatient general surgery were identified to have both a 90-d preoperative and a 90-d postoperative abdominal computed tomography scan. The change in trunk muscle size was calculated using analytic morphomic techniques. The primary outcome was 1-y survival. Covariate-adjusted outcomes were assessed using multivariable logistic regression. RESULTS: A total of 82.6% of patients (n = 351) experienced a decrease in trunk muscle size in the time between their scans (average 62.1 d). When stratifying patients into tertiles of rate of change in trunk muscle size and adjusting for other covariates, patients in the tertile of the greatest rate loss had a significantly increased risk of 1-y mortality compared with those in the tertile of the least rate loss (P = 0.002; odds ratio = 3.40; 95% confidence interval, 1.55-7.47). The adjusted mortality rate for the tertile of the greatest rate loss was 24.0% compared with 13.3% for the tertile of the least decrease. CONCLUSIONS: Trunk muscle size changes rapidly in the perioperative period and correlates with mortality. Trunk muscle size may be a critical target for interventional programs focusing on perioperative optimization of the surgical patient.


Subject(s)
Abdomen/surgery , Postoperative Complications/mortality , Psoas Muscles/anatomy & histology , Psoas Muscles/diagnostic imaging , Surgical Procedures, Operative/mortality , Tomography, X-Ray Computed/methods , Female , Humans , Kaplan-Meier Estimate , Logistic Models , Male , Middle Aged , Multivariate Analysis , Predictive Value of Tests , Preoperative Care/methods , Risk Adjustment/methods , Sex Distribution , Surgical Procedures, Operative/adverse effects
13.
Clin Transplant ; 28(4): 419-22, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24617506

ABSTRACT

BACKGROUND: Living kidney donor pools are expanding with the use of "medically complex" donors. Whether or not to include cigarette smokers as living kidney donors remains unclear. The aim of this study was to determine the relationship between donor smoking and recipient outcomes. We hypothesized that donor smoking would increase donor complications and decrease allograft and recipient survival over time. METHODS: The charts of 602 living kidney donors and their recipients were retrospectively reviewed. Kaplan-Meier survival analysis and Cox modeling were used to assess the relationships between smoking and recipient and allograft survival. RESULTS: No difference in postoperative complications was seen in smoking versus non-smoking donors. Donor smoking at time of evaluation did not significantly decrease allograft survival (HR = 1.19, p = 0.52), but recipient smoking at evaluation did reduce allograft survival (HR = 1.74, p = 0.05). Both donor and recipient smoking decreased recipient survival (HR = 1.93, p < 0.01 vs HR = 1.74, p = 0.048). DISCUSSION: When controlled for donor and recipient factors, cigarette smoking by living kidney donors significantly reduced recipient survival. These findings suggest that careful attention to smoking history is important when counseling potential donors and recipients. Policy efforts to limit donors with a recent smoking history should be balanced with the overall shortage of appropriate kidney donors.
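The Kaplan-Meier survival analysis cited in this abstract can be sketched with a minimal estimator: at each distinct event time, survival is multiplied by (1 - deaths / number at risk), with censored subjects leaving the risk set without contributing a death. The data below are hypothetical, not from the study.

```python
# Minimal Kaplan-Meier estimator (product-limit form), with censoring.
def kaplan_meier(times, events):
    """Return [(t, S(t))] for each distinct time at which a death occurs.

    times: follow-up time per subject; events: 1 = death, 0 = censored.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = 0
        ties = 0
        # Group all subjects sharing this follow-up time.
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            ties += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        # Deaths and censored subjects alike leave the risk set.
        n_at_risk -= ties
    return curve

# Three subjects, all with observed deaths at t = 1, 2, 3:
# S(1) = 2/3, S(2) = 1/3, S(3) = 0.
curve = kaplan_meier([1, 2, 3], [1, 1, 1])
```

Comparing two such curves (e.g. recipients of smoking versus non-smoking donors) is the nonparametric step; the hazard ratios quoted in the abstract come from the complementary Cox model, which adjusts for covariates.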


Subject(s)
Graft Survival , Kidney Transplantation/mortality , Living Donors , Postoperative Complications/etiology , Smoking/adverse effects , Adolescent , Adult , Aged , Aged, 80 and over , Female , Follow-Up Studies , Humans , Kaplan-Meier Estimate , Kidney Transplantation/methods , Male , Middle Aged , Patient Outcome Assessment , Postoperative Complications/epidemiology , Proportional Hazards Models , Retrospective Studies , Risk Factors , Transplantation, Homologous/methods , Transplantation, Homologous/mortality , Young Adult
14.
Plast Reconstr Surg ; 133(4): 559e-566e, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24675208

ABSTRACT

BACKGROUND: Body mass index does not allow accurate risk stratification for individuals undergoing component separation repair of ventral hernias. The authors hypothesized that tissue morphology measurements (morphomics) of preoperative computed tomography scans stratify the risk of surgical site infection in patients undergoing ventral hernia repair with a component separation technique. METHODS: The authors identified 93 patients who underwent component release ventral hernia repair (2004 to 2012). The surgical technique involved release of the external oblique muscle lateral to the linea semilunaris. Using analytic morphomic techniques, the authors measured patients' morphology using routine preoperative computed tomography scans. Two-sample t test was used to evaluate the effect of morphomic and demographic factors on surgical site infection. Separate logistic regression analyses were performed on these morphomic factors to evaluate their predictive value in assessing the risk of surgical site infection, controlling for demographic covariates. RESULTS: Surgical site infections were observed in 31 percent (n = 29) of the population. Subcutaneous fat area, total body area, and total body circumference had increased odds ratios for surgical site infection (p = 0.004, 0.014, and 0.012, respectively), indicating that these measures are better associated with surgical site infection than body mass index. These calculations control for demographic covariates, confirming that these morphomic parameters are predictive of surgical site infection. CONCLUSION: Specific morphomic values are better predictors of surgical site infection in patients undergoing component separation hernia repair than currently used measures such as body mass index. CLINICAL QUESTION/LEVEL OF EVIDENCE: Risk, III.


Subject(s)
Abdominal Fat/diagnostic imaging , Body Composition , Hernia, Ventral/epidemiology , Hernia, Ventral/surgery , Obesity/epidemiology , Surgical Wound Infection/epidemiology , Comorbidity , Female , Humans , Logistic Models , Male , Middle Aged , Preoperative Period , Risk Assessment , Surgical Mesh , Tomography, X-Ray Computed
15.
JAMA Surg ; 149(4): 335-40, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24500820

ABSTRACT

IMPORTANCE: Morphometric assessment has emerged as a strong predictor of postoperative morbidity and mortality. However, a gap exists in translating this knowledge to bedside decision making. We introduced a novel measure of patient-centered surgical risk assessment: morphometric age. OBJECTIVE: To investigate the relationship between morphometric age and posttransplant survival. DATA SOURCES: Medical records of recipients of deceased-donor liver transplants (study population) and kidney donors/trauma patients (morphometric age control population). STUDY SELECTION: A retrospective cohort study of 348 liver transplant patients and 3313 control patients. We assessed medical records for validated morphometric characteristics of aging (psoas area, psoas density, and abdominal aortic calcification). Using multivariate linear regression models stratified by sex, we derived a morphometric age equation from the control population. These models were then applied to the study population to determine each patient's morphometric age. DATA EXTRACTION AND SYNTHESIS: All analytic steps related to measuring morphometric characteristics were obtained via custom algorithms programmed into commercially available software. An independent observer confirmed all algorithm outputs. Trained assistants performed medical record review to obtain patient characteristics. RESULTS: Cox proportional hazards regression model showed that morphometric age was a significant independent predictor of overall mortality (hazard ratio, 1.03 per morphometric year [95% CI, 1.02-1.04; P < .001]) after liver transplant. Chronologic age was not a significant covariate for survival (hazard ratio, 1.02 per year [95% CI, 0.99-1.04; P = .21]). Morphometric age stratified patients at high and low risk for mortality.
For example, patients in the middle chronologic age tertile who jumped to the oldest morphometric tertile had worse outcomes than those who jumped to the youngest morphometric tertile (74.4% vs 93.2% survival at 1 year [P = .03]; 45.2% vs 75.0% at 5 years [P = .03]). CONCLUSIONS AND RELEVANCE: Morphometric age correlated with mortality after liver transplant with better discrimination than chronologic age. Assigning a morphometric age to potential liver transplant recipients could improve prediction of postoperative mortality risk.
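The morphometric-age construction above amounts to fitting a regression in a control cohort and then evaluating the fitted equation for study patients. The sketch below is a deliberately simplified single-predictor version (psoas density only, no sex stratification) with made-up numbers; the study used multiple characteristics and sex-stratified models.

```python
from statistics import mean

def fit_ols(x, y):
    """Ordinary least squares for y = intercept + slope * x."""
    mx, my = mean(x), mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

# Hypothetical control cohort: psoas density (HU) falls with chronologic age
density = [55.0, 50.0, 45.0, 40.0, 35.0]
age     = [30.0, 40.0, 50.0, 60.0, 70.0]
intercept, slope = fit_ols(density, age)

def morphometric_age(psoas_density):
    """Evaluate the control-derived equation for a study patient."""
    return intercept + slope * psoas_density

# A study patient whose tissue "looks" older than a typical 50-year-old's
m_age = morphometric_age(38.0)
```

The key design point is that the equation is fit only in the control population, so the study patients' morphometric ages are out-of-sample predictions.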


Subject(s)
Donor Selection/methods , Graft Survival , Liver Transplantation/mortality , Risk Assessment/methods , Tissue Donors , Adult , Age Factors , Cross-Sectional Studies , Female , Follow-Up Studies , Humans , Kaplan-Meier Estimate , Male , Michigan/epidemiology , Middle Aged , Retrospective Studies , Risk Factors , Survival Rate/trends , Tomography, X-Ray Computed , Treatment Outcome
16.
J Am Geriatr Soc ; 62(2): 352-7, 2014 Feb.
Article in English | MEDLINE | ID: mdl-24428139

ABSTRACT

OBJECTIVES: To determine whether failure to rescue, as a driver of mortality, can be used to identify which hospitals attenuate the specific risks inherent to elderly adults undergoing surgery. DESIGN: Retrospective cohort study. SETTING: State-wide surgical collaborative in Michigan. PARTICIPANTS: Older adults undergoing major general or vascular surgery between 2006 and 2011 (N = 24,216). MEASUREMENTS: Thirty-four hospitals were ranked according to risk-adjusted 30-day mortality and grouped into tertiles. Within each tertile, rates of major complications and failure to rescue were calculated, stratifying outcomes according to age (<75 vs ≥ 75). Next, differences in failure-to-rescue rates between age groups within each hospital were calculated. RESULTS: Failure-to-rescue rates were more than two times as high in elderly adults as in younger individuals in each tertile of hospital mortality (26.0% vs 10.3% at high-mortality hospitals, P < .001). Within hospitals, the average difference in failure-to-rescue rates was 12.5%. Nine centers performed better than expected, and three performed worse than expected, with the largest differences exceeding 25%. CONCLUSION: Although elderly adults experience higher failure-to-rescue rates, this does not account for hospitals' overall capacity to rescue individuals from complications. Comparing rates of younger and elderly adults within hospitals may identify centers where efforts toward complication rescue favor, or are customized for, elderly adults. These centers should be studied as part of the collaborative's effort to address the disparate outcomes that elderly adults in Michigan experience.
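Failure to rescue is death among patients who suffered at least one major complication. The age-stratified calculation used above can be sketched as follows; the patient records are hypothetical.

```python
def ftr_rate(patients):
    """Failure to rescue: share of complicated patients who died."""
    complicated = [p for p in patients if p["complication"]]
    if not complicated:
        return 0.0
    return sum(p["died"] for p in complicated) / len(complicated)

# Hypothetical records, stratified at the abstract's cut point (< 75 vs >= 75)
cohort = [
    {"age": 68, "complication": True,  "died": False},
    {"age": 71, "complication": True,  "died": False},
    {"age": 80, "complication": True,  "died": True},
    {"age": 82, "complication": True,  "died": False},
    {"age": 79, "complication": False, "died": False},  # no complication: excluded
]
young = ftr_rate([p for p in cohort if p["age"] < 75])
old   = ftr_rate([p for p in cohort if p["age"] >= 75])
```

Note that patients without complications never enter the denominator, which is what distinguishes failure to rescue from crude mortality.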


Subject(s)
Outcome Assessment, Health Care , Postoperative Complications/mortality , Quality of Health Care/standards , Surgical Procedures, Operative/mortality , Aged , Female , Follow-Up Studies , Hospital Mortality/trends , Humans , Male , Michigan/epidemiology , Middle Aged , Retrospective Studies , Survival Rate/trends , Vascular Surgical Procedures/mortality
17.
Surg Infect (Larchmt) ; 14(6): 512-9, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24274058

ABSTRACT

BACKGROUND: Laparoscopic cholecystectomy (LC) is the procedure of choice for treatment of cholelithiasis/cholecystitis. Conversion rates (CR) to open cholecystectomy (OC) have been reported previously as 5-15% in elective cases, and up to 25% in patients with acute cholecystitis. We examined the CR in a tertiary-care academic hospital and a statewide surgery quality collaborative, and compared complications and outcomes in elective and emergency cholecystectomy. METHODS: Prospective data were obtained from: 1) Non-Trauma Emergency Surgery (NTE) database of all emergent cholecystectomies 1/1/2008-12/31/2009; and 2) Michigan Surgical Quality Collaborative (MSQC) database with a random sample of 20-30% of all operations performed 1/1/2005-12/31/2010, including both University of Michigan (UM) data and statewide data from 34 participating hospitals. Patient characteristics, CR, and outcomes were compared for emergent vs. elective cases. RESULTS: NTE patients had a mean hospital length of stay (HLOS) of 4.9 d. HLOS increased with conversion and open surgery (4.0 d for LC; 7.9 d for LC converted to OC; 8.7 d for OC; p<0.0001); mortality was 0.35% and CR was 17.5%. In the UM-MSQC dataset, HLOS was likewise greater for OC (6.8 d for OC vs. 4.6 d for LC; p<0.001); mortality was 0.65%; CR was 9.1% in elective cases and 14.9% in emergent cases. CR was almost two-fold higher [17.5% of all NTE cholecystectomies vs. 9.1% of UM-MSQC elective cholecystectomies (p=0.00078)]. The statewide MSQC cholecystectomy data showed significantly increased HLOS in emergent cholecystectomy patients (4.34 vs. 2.65 d; p<0.0001). Morbidity (8.8 vs. 3.7%) and mortality (2.6 vs. 0.5%) rates were also significantly higher in emergent vs. elective cholecystectomies (p<0.0001). CONCLUSION: In NTE patients requiring cholecystectomy, CR is almost two-fold higher but is lower than in reports published previously (25%).
However, there is wide variability in mortality and morbidity for emergency cholecystectomy in both unadjusted and risk-adjusted analyses. Further studies are required to determine modifiable risk factors to improve outcomes in emergency cholecystectomy.
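The comparison of conversion rates above can be illustrated with a two-sided Fisher exact test. The implementation is a standard hypergeometric version; the 2x2 counts are illustrative values chosen to approximate the reported 17.5% and 9.1% rates, since the abstract does not give raw counts.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    def p_table(x):
        # Hypergeometric probability of a table with top-left cell = x
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    probs = [p_table(x) for x in range(lo, hi + 1)]
    # Two-sided: sum over all tables no more probable than the observed one
    return sum(p for p in probs if p <= p_obs * (1 + 1e-9))

# Hypothetical counts approximating the reported rates:
# 35/200 NTE conversions (17.5%) vs. 27/300 elective conversions (9.0%)
p = fisher_exact_two_sided(35, 165, 27, 273)
```

On Fisher's classic tea-tasting table [[3, 1], [1, 3]] this returns the textbook two-sided p of 34/70 ≈ 0.486, which is a quick sanity check of the implementation.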


Subject(s)
Cholecystectomy/methods , Cholecystitis/surgery , Cholelithiasis/surgery , Elective Surgical Procedures/methods , Emergency Medicine/methods , Adolescent , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Michigan , Middle Aged , Survival Analysis , Treatment Outcome , Young Adult
18.
J Am Coll Surg ; 217(5): 813-8, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24119996

ABSTRACT

BACKGROUND: Sarcopenia is associated with poor outcomes after major surgery. There are currently no data regarding the financial implications of providing care for these high-risk patients. STUDY DESIGN: We identified 1,593 patients within the Michigan Surgical Quality Collaborative (MSQC) who underwent elective major general or vascular surgery at a single institution between 2006 and 2011. Patient sarcopenia, determined by lean psoas area (LPA), was derived from preoperative CT scans using validated analytic morphomic methods. Financial data including hospital revenue and direct costs were acquired for each patient through the hospital's finance department. Financial data were adjusted for patient and procedural factors using multiple linear regression methods, and Mann-Whitney U test was used for significance testing. RESULTS: After controlling for patient and procedural factors, decreasing LPA was independently associated with increasing payer costs ($6,989.17 per 1,000 mm(2) LPA, p < 0.001). The influence of LPA on payer costs increased to $26,988.41 per 1,000 mm(2) decrease in LPA (p < 0.001) in patients who experienced a postoperative complication. Further, the covariate-adjusted hospital margin decreased by $2,620 per 1,000 mm(2) decrease in LPA (p < 0.001) such that average negative margins were observed in the third of patients with the smallest LPA. CONCLUSIONS: Sarcopenia is associated with high payer costs and negative margins after major surgery. Although postoperative complications are universally expensive to payers and providers, sarcopenic patients represent a uniquely costly patient demographic. Given that sarcopenia may be remediable, efforts to attenuate costs associated with major surgery should focus on targeted preoperative interventions to optimize these high risk patients for surgery.
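The reported cost associations are linear in LPA, so the adjusted incremental cost for a given muscle deficit is a simple scaling. The sketch below hard-codes the abstract's two coefficients and treats them as exact; it is an arithmetic illustration, not the study's adjustment model.

```python
# Covariate-adjusted payer cost per 1,000 mm^2 decrease in lean psoas area (LPA),
# taken from the abstract; the complication figure applies only to patients
# who experienced a postoperative complication.
COST_PER_1000_MM2 = 6989.17
COST_PER_1000_MM2_WITH_COMPLICATION = 26988.41

def added_payer_cost(lpa_deficit_mm2, complication=False):
    """Estimated extra payer cost for a patient whose LPA is lpa_deficit_mm2 below reference."""
    rate = COST_PER_1000_MM2_WITH_COMPLICATION if complication else COST_PER_1000_MM2
    return lpa_deficit_mm2 / 1000 * rate
```

For example, a patient 2,000 mm^2 below the reference LPA carries roughly $13,978 in added payer cost without a complication, and far more with one.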


Subject(s)
Sarcopenia , Surgical Procedures, Operative/economics , Aged , Costs and Cost Analysis , Female , Humans , Male , Middle Aged , Postoperative Complications/economics , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Retrospective Studies , Sarcopenia/complications
19.
J Am Coll Surg ; 216(5): 976-85, 2013 May.
Article in English | MEDLINE | ID: mdl-23522786

ABSTRACT

BACKGROUND: A cornerstone of a surgeon's clinical assessment of suitability for major surgery is best described as the "eyeball test." Preoperative imaging may provide objective measures of this subjective assessment by calculating a patient's morphometric age. Our hypothesis is that morphometric age is a surgical risk factor distinct from chronologic age and comorbidity and correlates with surgical mortality and length of stay. STUDY DESIGN: This is a retrospective cohort study within a large academic medical center. Using novel analytic morphomic techniques on preoperative CT scans, a morphometric age was assigned to a random sample of patients having inpatient general and vascular abdominal surgery from 2006 to 2011. The primary outcomes for this study were postoperative mortality (1-year) and length of stay (LOS). RESULTS: The study cohort (n = 1,370) was stratified into tertiles based on morphometric age. The postoperative risk of mortality was significantly higher in the morphometric old age group when compared with the morphometric middle age group (odds ratio 2.42, 95% CI 1.52 to 3.84, p < 0.001). Morphometric old age patients were predicted to have a LOS 4.6 days longer than the morphometric middle age tertile. Similar trends were appreciated when comparing morphometric middle and young age tertiles. Chronologic age correlated poorly with these outcomes. Furthermore, patients in the chronologic middle age tertile found to be of morphometric old age had significantly inferior outcomes (mortality 21.4% and mean LOS 13.8 days) compared with patients in the chronologic middle age tertile found to be of morphometric young age (mortality 4.5% and mean LOS 6.3 days, p < 0.001 for both). CONCLUSIONS: Preoperative imaging can be used to assign a morphometric age to patients, which accurately predicts mortality and length of stay.
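The tertile stratification used above can be sketched as follows; the morphometric ages are hypothetical, and tie handling at the cut points is a simplification.

```python
def tertiles(values):
    """Assign each value to a tertile: 0 = youngest, 2 = oldest."""
    s = sorted(values)
    n = len(s)
    c1, c2 = s[n // 3], s[2 * n // 3]  # lower and upper cut points
    return [0 if v < c1 else (1 if v < c2 else 2) for v in values]

# Hypothetical morphometric ages for nine surgical patients
morph_ages = [41, 77, 58, 63, 35, 82, 49, 70, 55]
groups = tertiles(morph_ages)
```

Outcomes (mortality, length of stay) would then be compared across the three groups, and cross-tabulated against chronologic-age tertiles to find "chronologically middle-aged but morphometrically old" patients.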


Subject(s)
Aging , Length of Stay , Surgical Procedures, Operative/mortality , Adult , Age Factors , Aged , Area Under Curve , Comorbidity , Female , Humans , Length of Stay/statistics & numerical data , Linear Models , Male , Middle Aged , Odds Ratio , ROC Curve , Retrospective Studies , Risk Assessment , Risk Factors , Tomography, X-Ray Computed , Vascular Surgical Procedures/mortality
20.
Ann Surg ; 257(3): 427-32, 2013 Mar.
Article in English | MEDLINE | ID: mdl-23388351

ABSTRACT

OBJECTIVE: Alvimopan was approved by the Food and Drug Administration in May 2008 and has been shown to accelerate gastrointestinal recovery after colectomy. Our independent study evaluated alvimopan as it is used in actual hospital practice in the state of Michigan. We hypothesized that alvimopan significantly decreases incidence of prolonged ileus and reduces length of stay (LOS) in patients who have undergone colectomy. METHODS: We identified 4749 patients from the Michigan Surgical Quality Collaborative (N = 28 hospitals) database between August 2007 and December 2010 who underwent elective colectomy operations. A total of 528 patients received alvimopan both pre- and postoperatively. We first selected a control group of patients from hospitals that had never administered alvimopan (n = 1833) and used propensity matching to manage differences in patient demographics and clinical characteristics. To control for hospital and surgeon characteristics, we then performed a sensitivity analysis, using a separate group of historical control patients treated before May 2008 in hospitals that would later administer alvimopan (n = 270). The Fisher exact test was used to compare complication rates, and the Student t test was used to compare LOS. RESULTS: Patients who received alvimopan had significantly lower incidence of prolonged ileus (2.3% vs 7.9%; P < 0.001) and a significantly shorter LOS (4.84 ± 4.54 vs 6.40 ± 4.45 days; P < 0.001) than control patients in hospitals that had never administered alvimopan. No differences were noted in these outcomes using sensitivity analysis. CONCLUSION: This study suggests that the actual utilization of alvimopan leads to a reduction in prolonged ileus and LOS in patients who underwent colectomy. By accelerating postoperative recovery, alvimopan has the potential to benefit patients and health care systems by improving outcomes, ensuring patient comfort, and reducing cost.
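The propensity-matching step above can be sketched as greedy 1:1 nearest-neighbor matching on precomputed propensity scores within a caliper. The abstract specifies neither the algorithm nor a caliper, so both are assumptions here, and the scores are hypothetical.

```python
def greedy_match(treated_ps, control_ps, caliper=0.05):
    """Greedy 1:1 nearest-neighbor propensity-score match without replacement."""
    available = dict(enumerate(control_ps))  # control index -> score
    pairs = []
    # Match highest-score (hardest-to-match) treated patients first
    for ti, ps in sorted(enumerate(treated_ps), key=lambda kv: kv[1], reverse=True):
        if not available:
            break
        cj = min(available, key=lambda j: abs(available[j] - ps))
        if abs(available[cj] - ps) <= caliper:
            pairs.append((ti, cj))
            del available[cj]  # each control used at most once
    return pairs

# Hypothetical propensity scores (probability of receiving alvimopan)
alvimopan = [0.81, 0.42, 0.63]
control   = [0.40, 0.85, 0.66, 0.10]
pairs = greedy_match(alvimopan, control)
```

Matching without replacement inside a caliper is one common way to balance demographics and clinical characteristics before comparing ileus rates and length of stay between groups.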


Subject(s)
Colectomy/adverse effects , Ileus/prevention & control , Piperidines/administration & dosage , Colonic Diseases/surgery , Dose-Response Relationship, Drug , Female , Gastrointestinal Agents , Humans , Ileus/epidemiology , Ileus/etiology , Incidence , Laparoscopy , Length of Stay/trends , Male , Michigan/epidemiology , Middle Aged , Postoperative Period , Recovery of Function/drug effects , Treatment Outcome