Results 1 - 20 of 53
1.
Ann Surg ; 275(3): 591-595, 2022 03 01.
Article in English | MEDLINE | ID: mdl-32657945

ABSTRACT

OBJECTIVE: To review outcomes after robotic-assisted laparoscopic living donor nephrectomy (RLDN) in the first and largest series reported to date. SUMMARY OF BACKGROUND DATA: The introduction of minimally invasive laparoscopic donor nephrectomy has increased live kidney donation, paving the way for further innovation to expand the donor pool with RLDN. METHODS: Retrospective chart review of 1084 consecutive RLDNs performed between 2000 and 2017. Patient demographics, surgical data, and complications were collected. RESULTS: Six patients underwent conversion to open procedures between 2002 and 2005, whereas all remaining procedures were successfully completed robotically. Median donor age was 35.7 (17.4) years, with a median BMI of 28.6 (7.7) kg/m2. Nephrectomies were preferentially performed on the left side (95.2%). Multiple renal arteries were present in 24.1%. Median operative time was 159 (54) minutes, warm ischemia time 180 (90) seconds, estimated blood loss 50 (32) mL, and length of stay 3 (1) days. The median follow-up was 15 (28) months. Complications were reported in 216 patients (19.9%), of which 176 (81.5%) were minor (Clavien-Dindo class I and II). Duration of surgery, warm ischemia time, operative blood loss, conversion, and complication rates were not associated with increasing body mass index. CONCLUSION: RLDN is a safe technique and offers a reasonable alternative to conventional laparoscopic surgery, particularly in donors with higher body mass index and multiple renal arteries. It offers transplant surgeons a platform to develop the robotic-assisted surgical skills needed in the more advanced setting of minimally invasive recipient operations.


Subject(s)
Kidney Transplantation , Laparoscopy , Nephrectomy , Robotic Surgical Procedures , Tissue and Organ Harvesting/methods , Adolescent , Adult , Female , Humans , Living Donors , Male , Middle Aged , Retrospective Studies , Young Adult
2.
J Transplant ; 2021: 6612453, 2021.
Article in English | MEDLINE | ID: mdl-33564467

ABSTRACT

BACKGROUND: Prior to 2014, treatment for hepatitis C was limited. However, the subsequent introduction of direct-acting antiviral (DAA) medications against hepatitis C led to improvements in morbidity and better medication tolerance. DAA therapy allowed for an increase in treatment rates of hepatitis C in patients on the liver transplant waiting list. With the popularization of DAAs, concern grew about the utility of hepatitis C-positive (HCV+) deceased liver donors, especially after treating HCV+ potential recipients on the transplant waiting list. METHODS: This is a retrospective, observational study using the Mid-America Transplant Services (MTS) database from 2008 to 2017. Comparison was made between the era before widespread DAA use, 2008-2013 (pre-DAA), and the era of their common use in practice, 2014-2017 (post-DAA). All deceased liver donors with HCV antibody or nucleic acid positive results were evaluated. RESULTS: Between 2008 and 2017, 96 deceased liver donors were positive for HCV. In the pre-DAA era, 47 deceased liver donors were positive for HCV, of which 32 (68.1%) were transplanted and 15 (31.9%) were discarded. In the post-DAA era, a total of 49 HCV+ organs were identified, of which 43 (87.8%) livers were transplanted and 6 (12.2%) were discarded. The discard rate was significantly higher in the pre-DAA population (31.9% vs. 12.2%, p = 0.026). Secondary analysis showed a distinct trend towards increased regional sharing and utilization of HCV+ donors. CONCLUSION: To reduce discard rates of HCV+ organs, our data suggest that transplant centers could potentially delay HCV treatment in patients on the transplant waitlist.
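The discard-rate comparison above can be reconstructed approximately from the reported counts (15 of 47 pre-DAA vs. 6 of 49 post-DAA livers discarded). The abstract does not state which test produced p = 0.026, so the sketch below simply runs both a chi-square test and Fisher's exact test on the 2x2 table as a plausible illustration, not as the authors' actual analysis.

```python
# Hypothetical reconstruction of the discard-rate comparison from the counts
# reported in the abstract; the test actually used by the authors is not stated.
from scipy.stats import chi2_contingency, fisher_exact

# Rows: pre-DAA era (2008-2013), post-DAA era (2014-2017);
# columns: discarded, transplanted HCV+ livers.
table = [[15, 32],
         [6, 43]]

chi2, p_chi2, dof, _ = chi2_contingency(table)  # chi-square with Yates correction
odds_ratio, p_fisher = fisher_exact(table)      # Fisher's exact test

print(f"Discard rates: {15/47:.1%} (pre-DAA) vs {6/49:.1%} (post-DAA)")
print(f"Chi-square p = {p_chi2:.3f}; Fisher exact p = {p_fisher:.3f}")
```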

3.
Transplant Proc ; 52(3): 932-937, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32139274

ABSTRACT

BACKGROUND: With increased demand for liver transplantation, sicker patients are being transplanted more frequently. These patients are at a higher risk of significant postoperative morbidity, including respiratory failure. This study evaluated the phenotype that characterizes liver transplant candidates who may benefit from early tracheostomy. METHODS: This was a single-center retrospective review of all liver transplant candidates between January 2012 and December 2017. Patients who eventually required tracheostomies were identified and compared to their counterparts. RESULTS: Of the 130 liver transplants performed during the study period, 11 patients required tracheostomy. Although patients in the tracheostomized population (TP) did not have significantly worse preoperative functional status (<4 metabolic equivalents; 64% vs 42%, P = .21), they had a higher native Model for End-Stage Liver Disease (MELD) score (37 vs 30, P < .05) at the time of transplantation. Patients who eventually succumbed to respiratory failure had lower arterial partial pressure of oxygen/fraction of inspired oxygen (PaO2/FiO2) ratios at the start of surgery, and these ratios remained unchanged for the duration of surgery compared with the nontracheostomy group (P < .05). TP patients required more net fluid intraoperatively (7.3 vs 5.0 L, P < .05), a longer time to attempted extubation (3.5 vs 1 day, P < .05), more days of mechanical ventilation (15 vs 1 day, P < .05), an increased length of stay (37 vs 9 days, P < .05), and had higher 1-year mortality (36% vs 8%, P < .05). CONCLUSIONS: Based on our findings, patients with a high MELD score (>30), net postoperative fluid balance >6 L, and PaO2/FiO2 ratio ≤300 who fail to wean off mechanical ventilation after 72 hours may benefit from tracheostomy during the postoperative period.


Subject(s)
Liver Transplantation , Respiratory Insufficiency/complications , Tracheostomy , Adult , Female , Humans , Liver Transplantation/mortality , Male , Middle Aged , Postoperative Period , Retrospective Studies , Risk Factors
4.
Am J Transplant ; 20(2): 430-440, 2020 02.
Article in English | MEDLINE | ID: mdl-31571369

ABSTRACT

Despite increasing obesity rates in the dialysis population, obese kidney transplant candidates are still denied transplantation by many centers. We performed a single-center retrospective analysis of a robotic-assisted kidney transplant (RAKT) cohort from January 2009 to December 2018. A total of 239 patients were included in this analysis. The median BMI was 41.4 kg/m2, with the majority (53.1%) of patients being African American and 69.4% of organs sourced from living donors. The median surgery duration and warm ischemia times were 4.8 hours and 45 minutes, respectively. Wound complications (mostly seromas and hematomas) occurred in 3.8% of patients, with 1 patient developing a surgical site infection (SSI). Seventeen (7.1%) graft failures, mostly due to acute rejection, were reported during follow-up. Patient survival was 98% and 95%, whereas graft survival was 98% and 93%, at 1 and 3 years, respectively. Similar survival statistics were obtained for patients undergoing open transplantation over the same time period in the UNOS database. In conclusion, RAKT can be safely performed in obese patients with minimal SSI risk, excellent graft function, and patient outcomes comparable to national data. RAKT could improve access to kidney transplantation in obese patients due to the low surgical complication rate.


Subject(s)
Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Obesity/complications , Robotic Surgical Procedures , Adult , Female , Follow-Up Studies , Graft Survival , Humans , Kaplan-Meier Estimate , Kidney Failure, Chronic/complications , Kidney Failure, Chronic/mortality , Male , Middle Aged , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Retrospective Studies , Risk Factors , Treatment Outcome
5.
Transpl Int ; 33(6): 581-589, 2020 06.
Article in English | MEDLINE | ID: mdl-31667905

ABSTRACT

The prevalence of obesity among patients with chronic kidney disease continues to increase, reflecting the trend observed in the general population. This review analyses the factors affecting access to the waiting list and the transplantability of this specific population. Observational studies indicate that kidney transplantation in obese patients carries an increased risk of surgical complications compared to the nonobese population; therefore, many centres have been reluctant to proceed with transplantation, even though this treatment modality confers a survival advantage over dialysis. As a consequence, obese patients continue to face decreased access to the waiting list, with a lower likelihood of being transplanted and longer waiting times when compared to nonobese candidates. The current strategies for treatment of obesity in different settings (pretransplant, at transplant, and post-transplant) are described. Obesity represents a risk factor for surgical complications but not a contraindication for kidney transplantation; outcomes can be greatly improved with multidisciplinary and multimodal treatment. Modern minimally invasive techniques, mainly using the robotic platform, allow a reduction in the surgical complication rate, with graft and patient survival rates comparable to those of nonobese counterparts.


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Humans , Obesity/complications , Renal Dialysis , Treatment Outcome , Waiting Lists
6.
Transpl Int ; 33(3): 321-329, 2020 03.
Article in English | MEDLINE | ID: mdl-31730258

ABSTRACT

Patients with end-stage renal disease and severe iliac atherosclerosis are frequently denied renal transplantation due to technical challenges and the risk of potential steal syndrome in the allograft or ipsilateral limb. Few studies have evaluated the safety and efficacy of performing an endarterectomy in this setting. A single-center retrospective review of renal transplant patients from 1/2013 to 12/2017 was performed. Patients requiring endarterectomy at the time of transplant were matched to a nonendarterectomized cohort in a 1:2 fashion using propensity score matching. Patients were followed for a minimum of 12 months. Simultaneous endarterectomy and renal transplantation was performed in 23 patients, who were subsequently matched to 42 controls. Ankle-brachial index was lower in the endarterectomized group (P = 0.04). Delayed graft function (26.1% vs. 19%, P = 0.54), graft loss (8.7% vs. 7.1%, P = 0.53), 1-year mortality (8.7% vs. 4.8%, P = 0.53), and renal function at 12 months were comparable in both groups. There were no incidents of ipsilateral limb loss in the endarterectomized population. This is the first matched study investigating endarterectomy and renal transplantation. Long-term follow-up of limb and graft function is indicated. Despite the small sample size, our findings suggest that a combined procedure can safely provide renal transplantation access to a previously underserved population.
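The 1:2 propensity score matching described above can be illustrated with a minimal nearest-neighbor sketch. The covariates, column names, and matching details below are assumptions for illustration only (the abstract does not list the variables used), and controls are matched with replacement for simplicity, which is not necessarily how the authors matched.

```python
# Minimal 1:2 propensity-score matching sketch; covariates and column names
# are hypothetical, and controls are matched with replacement for simplicity.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_1_to_2(df: pd.DataFrame, treatment: str, covariates: list) -> pd.DataFrame:
    """Match each treated patient to its 2 nearest controls by propensity score."""
    # Estimate the propensity score: P(treatment | covariates).
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df[treatment] == 1]
    controls = df[df[treatment] == 0]

    # For every treated patient, take the 2 controls with the closest score.
    nn = NearestNeighbors(n_neighbors=2).fit(controls[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    return pd.concat([treated, controls.iloc[idx.ravel()]])

# Simulated example; age, BMI, and diabetes stand in for unknown covariates.
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "endarterectomy": rng.integers(0, 2, 300),
    "age": rng.normal(60, 10, 300),
    "bmi": rng.normal(28, 5, 300),
    "diabetes": rng.integers(0, 2, 300),
})
matched_cohort = match_1_to_2(data, "endarterectomy", ["age", "bmi", "diabetes"])
```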


Subject(s)
Kidney Transplantation , Endarterectomy , Humans , Iliac Artery/surgery , Propensity Score , Retrospective Studies , Treatment Outcome
7.
Transplant Proc ; 51(10): 3205-3212, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31732201

ABSTRACT

BACKGROUND: Equitable deceased donor liver allocation and distribution have remained a heated topic in transplant medicine. Despite the establishment of numerous policies, mixed reports regarding organ allocation persist. METHODS: Patient data were obtained from the United Network for Organ Sharing liver transplant database between January 2016 and September 2017. A total of 20,190 patients were included in the analysis. Of this number, 8790 transplanted patients had a median Model for End-Stage Liver Disease (MELD) score of 25 (17-33), after a wait time of 129 (32-273) days. Patients were grouped into low MELD and high MELD regions using a score of 25 as the cutoff. RESULTS: Significant differences were noted between low and high MELD regions in ethnicity (white 77.4% vs 60.4%, Hispanic 8.1% vs 24.5%; P < .001) and highest level of education (grade school 4.8% vs 8.5%, Associate/Bachelor's degree 19% vs 15.7%, P < .001). Patients in high MELD regions were more likely to be multiply listed if they had a diagnosis of hepatocellular carcinoma (12.1% vs 15%, P = .046). Wait-list mortality (4.8% vs 6%, P < .001) and wait-list time (110 [27-238] vs 156 [42-309] days, P < .001) were greater in the high MELD regions. CONCLUSIONS: These results highlight some of the existing disparities in the recently updated allocation and distribution policy of deceased donor livers. Our findings are consistent with previous work and support the liver distribution policy revision.


Subject(s)
End Stage Liver Disease/classification , Liver Transplantation/statistics & numerical data , Tissue and Organ Procurement/organization & administration , Carcinoma, Hepatocellular/surgery , End Stage Liver Disease/epidemiology , End Stage Liver Disease/surgery , Ethnicity , Female , Geography, Medical , Humans , Hyponatremia , Liver Neoplasms/surgery , Liver Transplantation/mortality , Male , Middle Aged , Proportional Hazards Models , Socioeconomic Factors , United States/epidemiology , Waiting Lists
8.
Transpl Int ; 32(11): 1173-1181, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31250486

ABSTRACT

The prevalence of obesity within the diabetic population is on the rise. This development poses unique challenges for pancreas transplantation candidates, as obese individuals are often denied access to transplant. The introduction of the robotic approach to transplantation has been shown to improve outcomes in obese patients. A single-center retrospective review of pancreas transplant cases over a 4-year period ending December 2018 was performed. Patients undergoing robotic surgery were compared to their counterparts undergoing open transplant. Forty-nine patients (10 robotic, 39 open) received pancreas transplants over the study period. Mean age was 43.1 ± 7.5 vs. 42.8 ± 9.7 years. There were no significant differences in demographics except body mass index (33.7 ± 5.2 vs. 27.1 ± 6.6, P = 0.005). Operative duration (7.6 ± 1.6 vs. 5.3 ± 1.4, P < 0.001) and warm ischemia times [45.5 (IQR: 13.7) vs. 33 (7), P < 0.001] were longer in the robotic arm. There were no wound complications in the robotic group. Graft (100% vs. 88%, P = 0.37) and patient survival (100% vs. 100%, P = 0.72) after 1 year were similar. Our findings suggest that robotic pancreas transplantation is both safe and effective in obese diabetic patients, without added risk of wound complications. Wide adoption of the technique is encouraged while long-term follow-up of our recipients is awaited.


Subject(s)
Diabetes Mellitus, Type 1/surgery , Obesity/surgery , Pancreas Transplantation/methods , Robotic Surgical Procedures/methods , Adult , Body Mass Index , Diabetes Mellitus, Type 1/complications , Female , Humans , Kidney Failure, Chronic/surgery , Male , Middle Aged , Obesity/complications , Postoperative Complications , Retrospective Studies , Treatment Outcome , Warm Ischemia
9.
Ann Vasc Surg ; 59: 225-230, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31009722

ABSTRACT

BACKGROUND: Central venous occlusion may occur in hemodialysis patients, resulting in arm or facial swelling and failure of dialysis access. Endovascular management with balloon angioplasty or stenting has been described, but there are minimal data on the use of covered stents in this pathology. We sought to review a single institution's experience with the use of covered stents for central venous occlusive disease in hemodialysis patients. METHODS: A retrospective review of all patients undergoing placement of covered stents between April 2014 and December 2016 for central venous occlusive disease to preserve a failing dialysis access was performed. Patients' records were reviewed to identify demographics, medical comorbidities, operative variables, primary patency rates, and secondary interventions. RESULTS: A total of 29 patients were included in the analysis. Viabahn (W.L. Gore and Associates, Flagstaff, AZ) stent grafts were exclusively used in all patients. The technical success rate was 100%. The patients were predominantly female (65.5%), with a mean age of 67.9 ± 12.1 years and medical comorbidities of hypertension (86%), diabetes (76%), and tobacco use (7%). The majority (86%) had prior angioplasty, and 17 of 29 (59%) patients had previous central venous catheters. The right brachiocephalic vein was the most commonly stented vessel (28%). The median stent length and diameter used were 50 millimeters (range: 25-100 millimeters) and 13 millimeters (range: 9-13 millimeters), respectively. The majority of patients (83%) received a single stent, with only 2 patients requiring more than one. Median follow-up was 24 months (range: 6-41 months). Four of 29 (13.8%) patients developed symptomatic stent restenosis requiring secondary intervention, all of which occurred in patients with primary stenosis between 50% and 75%. When compared to the patients without restenosis, longer stents were significantly associated with restenosis (62.5 millimeters, interquartile range [IQR]: 0, vs. 50 millimeters, IQR: 0, P = 0.002). Primary patency rates were 92.9%, 91.7%, and 80.0% at 6, 12, and 24 months, respectively. Secondary patency rates were 96.4%, 95.8%, and 93.3% at 6, 12, and 24 months, respectively. The overall primary patency rate was estimated at 86.2% using Kaplan-Meier analysis at 30.5 months (95% confidence interval: 26.5-34.5 months). CONCLUSIONS: Covered stent grafts have reasonable primary patency and excellent secondary patency when used for central venous stenosis in dialysis patients. Greater stent-graft length is associated with poorer long-term patency rates.
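Patency figures like those above are conventionally derived from Kaplan-Meier analysis. The sketch below, using the lifelines package on made-up follow-up data, shows how such estimates are obtained; the durations are invented, since patient-level data are not available from the abstract.

```python
# Kaplan-Meier sketch for primary patency; follow-up data are invented,
# only the method mirrors the analysis named in the abstract.
from lifelines import KaplanMeierFitter

# Months until loss of primary patency (event=1) or censoring (event=0).
durations = [6, 9, 12, 14, 18, 20, 22, 24, 24, 26, 28, 30, 31, 33, 35, 36, 38, 40, 41]
events    = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="primary patency")

# Patency estimates at 6, 12, and 24 months, analogous to the rates reported above.
print(kmf.survival_function_at_times([6, 12, 24]))
```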


Subject(s)
Blood Vessel Prosthesis Implantation/adverse effects , Blood Vessel Prosthesis Implantation/instrumentation , Blood Vessel Prosthesis , Catheterization, Central Venous/adverse effects , Endovascular Procedures/adverse effects , Endovascular Procedures/instrumentation , Graft Occlusion, Vascular/etiology , Kidney Failure, Chronic/therapy , Renal Dialysis , Stents , Vascular Diseases/surgery , Vascular Patency , Aged , Aged, 80 and over , Constriction, Pathologic , Female , Graft Occlusion, Vascular/diagnostic imaging , Graft Occlusion, Vascular/physiopathology , Humans , Kidney Failure, Chronic/diagnosis , Kidney Failure, Chronic/physiopathology , Male , Middle Aged , Prosthesis Design , Registries , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , Vascular Diseases/diagnostic imaging , Vascular Diseases/etiology , Vascular Diseases/physiopathology
10.
Am Surg ; 84(5): 667-671, 2018 May 01.
Article in English | MEDLINE | ID: mdl-29966566

ABSTRACT

Mirizzi syndrome (MS) is an uncommon complication of cholelithiasis caused by extrinsic biliary compression by stones in the gallbladder infundibulum or cystic duct. The purpose of this study was to evaluate the outcomes associated with a laparoscopic approach to this disease process. This is a 10-year, retrospective study conducted at two academic medical centers with established acute care surgery practices. Patients with a diagnosis of MS confirmed intraoperatively were included. Eighty-eight patients with MS were identified, with 55 (62.5%) being type 1. Twenty-six patients (29.5%), all type 1, underwent successful laparoscopic cholecystectomy. Of the 62 patients who underwent open cholecystectomy, 27.3 percent had a laparoscopic procedure converted to an open one. There was no significant difference in overall complications (19.2% vs 29%) among those undergoing laparoscopic versus open cholecystectomy. Length of stay was shorter in patients who had a laparoscopic approach (P = 0.001). Laparoscopic cholecystectomy can safely be attempted in type 1 MS and seems to be associated with fewer overall complications and a shorter length of stay compared with an open approach.


Subject(s)
Cholecystectomy, Laparoscopic , Mirizzi Syndrome/surgery , Adult , Conversion to Open Surgery/statistics & numerical data , Female , Humans , Length of Stay/statistics & numerical data , Male , Middle Aged , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Retrospective Studies , Treatment Outcome
11.
Glob Surg ; 4(1)2018 Apr.
Article in English | MEDLINE | ID: mdl-29782618

ABSTRACT

Hepatocellular carcinoma (HCC) is an aggressive neoplastic disease that has been rapidly increasing in incidence. It usually occurs in the background of liver disease and cirrhosis. Definitive therapy requires surgical resection. However, in the majority of cases surgical resection is not tolerated, especially in the presence of portal hypertension and cirrhosis. Orthotopic liver transplant (OLT) in well-selected candidates has been accepted as a viable option. Due to the relative scarcity of donors compared to the number of listed recipients, long waiting times are anticipated. To prevent patients with HCC from dropping out of the transplant list due to progression of their disease, most centers utilize loco-regional therapies. These loco-regional therapies (LRT) include minimally invasive treatments such as percutaneous thermal ablation, trans-arterial chemoembolization, trans-arterial radio-embolization, or a combination thereof. The type of therapy or combination used is determined by the size and location of the HCC and the Barcelona Clinic Liver Cancer (BCLC) classification. The data regarding the efficacy of LRT in reducing post-transplant recurrence or improving disease-free survival are limited. This article reviews the available therapies, their strengths, limitations, and current use in the management of patients with hepatocellular carcinoma awaiting transplant.

12.
Int J Surg Case Rep ; 41: 251-254, 2017.
Article in English | MEDLINE | ID: mdl-29102862

ABSTRACT

INTRODUCTION: Portal vein thrombosis (PVT) poses an extremely difficult problem in cirrhotic patients who are in need of a liver transplant. The prevalence of PVT in patients with cirrhosis ranges from 0.6% to 26% (Nery et al., 2015) [1]. The presence of PVT is associated with a more technically difficult liver transplant and, in certain cases, can be a contraindication to liver transplantation. The only option for patients with extensive PVT would be a multi-visceral transplant; the latter, unfortunately, has much higher morbidity and mortality compared to liver-only transplantation (Smith et al., 2016) [2]. An alternative approach is needed to provide a safe and reliable outcome. PRESENTATION OF CASE: In this case series, we present our experience with the reno-portal shunt as an alternative inflow for the liver allograft. DISCUSSION: This approach appears to be safe, with good long-term outcomes. Although this technique has been described before, we provide additional considerations that produced good outcomes in our patients. CONCLUSION: We believe that meticulous preoperative planning with high-resolution triple-phase CT imaging, including measurement of the diameter of the spleno-renal shunt, along with a duplex scan measuring flow through the shunt, is key to successful transplantation. Moreover, appropriate donor liver size is also of extreme importance to avoid portal hypoperfusion.

13.
Dig Surg ; 34(5): 421-428, 2017.
Article in English | MEDLINE | ID: mdl-28668951

ABSTRACT

BACKGROUND: Aging has been associated with increasing common bile duct (CBD) diameter and reported as independently predictive of the likelihood of choledocholithiasis. These associations are controversial, with uncertain diagnostic utility in patients presenting with symptomatic disease. The current study examined the relationship between age, CBD size, and the diagnostic probability of choledocholithiasis. METHODS: Symptomatic patients undergoing evaluation for suspected choledocholithiasis from January 2008 to February 2011 were reviewed. In the cohort without choledocholithiasis, the relationship between aging and CBD size was examined as a continuous variable and by comparing mean CBD size across stratified age groups. Multivariate analysis examined the relationship between increasing age and the diagnostic probability of choledocholithiasis in all patients. RESULTS: Choledocholithiasis was diagnosed by MR cholangiopancreatography (MRCP) or endoscopic retrograde cholangiopancreatography (ERCP) in 496 of 1,000 patients reviewed. Mean CBD diameter was 6.0 mm (±2.8 mm) in the 504 of 1,000 patients without choledocholithiasis on ERCP/MRCP. Increasing age had no correlation with CBD size as a continuous variable (r2 = 0.011, p = 0.811). No difference was observed across age groups (Kruskal-Wallis, p = 0.157). Age had no association with the diagnostic likelihood of choledocholithiasis (AOR [95% CI] 0.99 [0.98-1.01], adjusted-p = 0.335). CONCLUSION: In a large population undergoing investigation for biliary disease, increasing age was neither associated with increasing CBD diameter nor predictive of the likelihood of choledocholithiasis.
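The two analyses named above (a correlation of age with CBD size and a Kruskal-Wallis comparison across age strata) correspond to standard statistical calls. The sketch below runs them on simulated data purely to illustrate the approach; the numbers it produces have no relation to the study's values, and the age cut-points are hypothetical.

```python
# Illustration of the analyses named in the abstract, on simulated data only.
import numpy as np
from scipy.stats import linregress, kruskal

rng = np.random.default_rng(1)
age = rng.uniform(20, 85, 504)                  # simulated ages
cbd_mm = rng.normal(6.0, 2.8, 504).clip(min=1)  # simulated CBD diameters, mm

# Linear fit of CBD size on age; r**2 plays the role of the reported r2.
fit = linregress(age, cbd_mm)
print(f"r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.3f}")

# Kruskal-Wallis test of CBD size across stratified age groups (cut-points assumed).
strata = np.digitize(age, [40, 60, 75])
groups = [cbd_mm[strata == s] for s in np.unique(strata)]
print(kruskal(*groups))
```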


Subject(s)
Aging/pathology , Choledocholithiasis/diagnostic imaging , Choledocholithiasis/pathology , Common Bile Duct/pathology , Adult , Age Factors , Aged , Cholangiopancreatography, Endoscopic Retrograde , Cholangiopancreatography, Magnetic Resonance , Female , Humans , Male , Middle Aged , Probability , Retrospective Studies
14.
Am J Surg ; 214(1): 19-23, 2017 Jul.
Article in English | MEDLINE | ID: mdl-27769542

ABSTRACT

INTRODUCTION: A daily chest x-ray (CXR) is obtained in many surgical intensive care units (SICU). This study implemented a selective CXR protocol in a high-volume academic SICU and evaluated its impact on clinical outcomes. METHODS: All SICU patients admitted in February 2010 were compared with patients admitted in February 2012. Between the two time periods, a protocol eliminating routine daily CXRs was instituted. RESULTS: In February 2010 and February 2012, 107 and 90 patients were admitted to the SICU, respectively, for a total of 1384 patient-days. CXRs decreased from 365 (57.1% of patient-days) in 2010 to 299 (40.9% of patient-days; p < 0.001) in 2012. A greater proportion of physician-directed CXRs (PDCXRs) had new findings (80.8%) compared to automatic daily CXRs (ADCXRs) (23.5%, p < 0.001). There was no difference in overall or SICU length of stay, ventilator-free days, morbidity, or mortality. CONCLUSION: Eliminating ADCXRs decreased the number of CXRs performed without affecting length of stay, mechanical ventilation, morbidity, or mortality. Physician-directed ordering of CXRs increased the diagnostic value of the CXR and decreased the number of clinically irrelevant CXRs performed.


Subject(s)
Clinical Protocols , Critical Care/methods , Intensive Care Units , Radiography, Thoracic/statistics & numerical data , Adolescent , Adult , Aged , Female , Hospital Mortality , Humans , Length of Stay , Los Angeles , Male , Middle Aged , Prospective Studies , Respiration, Artificial , Unnecessary Procedures , Young Adult
15.
Ann Surg ; 264(4): 599-604, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27433911

ABSTRACT

OBJECTIVE: To prospectively evaluate the ability of radio frequency detection (RFD) system-embedded sponges to mitigate the incidence of retained surgical sponges (RSS) after emergency surgery. BACKGROUND: Emergency surgery patients are at high risk for retained foreign bodies. METHODS: All emergent trauma and nontrauma cavitary operations over a 5-year period (January 2010-December 2014) were prospectively enrolled. For damage-control procedures, only the definitive closure was included. RFD sponges were used exclusively throughout the study period. Before closure, the sponge and instrument count was followed by RFD scanning and x-ray evaluation for retained sponges. RSS and near-misses averted using the RFD system were analyzed. RESULTS: In all, 2051 patients [median (range) age 41 (1-101) years, 72.2% male, 46.8% trauma patients] underwent 2148 operations (1824 laparotomy, 100 thoracotomy, 30 sternotomy, and 97 combined). RFD detected retained sponges in 11 (0.5%) patients (81.8% laparotomy, 18.2% sternotomy) before cavitary closure. All postclosure x-rays were negative. No retained sponges were missed by the RFD system. Body mass index was 29 (23-43), estimated blood loss 1.0 L (0-23), and operating room time 160 minutes (71-869). Procedures started between 18:00 and 06:00 hours in 45.5% of the patients. The sponge count was incorrect in 36.4%, not performed due to time constraints in 45.5%, and correct in 18.2%. The additional cost of using RFD-embedded disposables was $0.17 for a 4X18 laparotomy sponge and $0.46 for a 10-pack of 12-ply 4X8 sponges. CONCLUSIONS: Emergent surgical procedures carry a high risk of retained sponges, even when sponge counts are performed and found to be correct. Implementation of an RFD system was effective in preventing this complication and should be considered for emergent operations in an effort to improve patient safety.


Subject(s)
Foreign Bodies/prevention & control , Postoperative Complications/prevention & control , Radio Waves , Surgical Sponges , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Female , Foreign Bodies/etiology , Humans , Infant , Laparotomy/adverse effects , Laparotomy/instrumentation , Male , Middle Aged , Postoperative Complications/etiology , Prospective Studies , Sternotomy/adverse effects , Sternotomy/instrumentation , Thoracotomy/adverse effects , Thoracotomy/instrumentation , Young Adult
16.
Am Surg ; 82(2): 134-9, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26874135

ABSTRACT

We sought to use the National Trauma Databank to determine the demographics, injury distribution, associated abdominal injuries, and outcomes of patients who are restrained versus unrestrained. All victims of motor vehicle collisions (MVCs) were identified from the National Trauma Databank and stratified into subpopulations depending on the use of seat belts. A total of 150,161 MVC victims were included in this study; 72,394 (48%) were belted. Young, male passengers were the least likely to be wearing a seat belt. Restrained victims were less likely to have severe injury as measured by the Injury Severity Score and Abbreviated Injury Score. Restrained victims were also less likely to suffer solid organ injuries (9.7% vs 12%, P < 0.001), but more likely to have hollow viscus injuries (1.9% vs 1.3%, P < 0.001). Hospital and intensive care unit lengths of stay were significantly shorter in belted victims, with adjusted mean differences of -1.36 (-1.45, -1.27) and -0.96 (-1.02, -0.90), respectively. Seat belt use was associated with significantly lower crude mortality than no restraint (1.9% vs 3.3%, P < 0.001), and after adjusting for differences in age, gender, position in vehicle, and deployment of air bags, the protective effect remained (adjusted odds ratio for mortality 0.50, 95% confidence interval 0.47, 0.54). In conclusion, MVC victims wearing seat belts have a significant reduction in the severity of injuries in all body areas, lower mortality, a shorter hospital stay, and a decreased length of stay in the intensive care unit. The nature of abdominal injuries, however, was significantly different, with a higher incidence of hollow viscus injury in those wearing seat belts.


Subject(s)
Abdominal Injuries/etiology , Accidents, Traffic , Seat Belts/statistics & numerical data , Abdominal Injuries/epidemiology , Accidents, Traffic/mortality , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Databases, Factual , Female , Humans , Infant , Infant, Newborn , Length of Stay , Male , Middle Aged , Retrospective Studies , Seat Belts/adverse effects , Trauma Severity Indices , United States/epidemiology , Wounds and Injuries/epidemiology , Wounds and Injuries/etiology , Wounds and Injuries/prevention & control , Young Adult
17.
Am J Surg ; 209(6): 959-68, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25669120

ABSTRACT

BACKGROUND: The traditional theory that pulmonary emboli (PE) originate from the lower extremity has been challenged. METHODS: All autopsies performed in Los Angeles County between 2002 and 2010 where PE was the cause of death were reviewed. RESULTS: Of the 491 PE deaths identified, 36% were surgical and 64% medical. Venous dissection for clots was performed in 380 patients; the PE source was the lower extremity (70.8%), pelvic veins (4.2%), and upper extremity (1.1%). No source was identified in 22.6% of patients. Body mass index (adjusted odds ratio [AOR] 1.044, 95% confidence interval [CI] 1.011 to 1.078, P = .009) and age (AOR 1.018, 95% CI 1.001 to 1.036, P = .042) were independent predictors for identifying a PE source. Chronic obstructive pulmonary disease (AOR .173, 95% CI .046 to .646, P = .009) was predictive of not identifying a PE source. CONCLUSIONS: Most medical and surgical patients with fatal PE had an identified lower extremity source, but a significant number had no source identified. Age and body mass index were positively associated with PE source identification. However, a diagnosis of chronic obstructive pulmonary disease was associated with failure to identify a PE source.


Subject(s)
Postoperative Complications , Pulmonary Embolism/etiology , Venous Thrombosis/complications , Wounds and Injuries/complications , Adult , Aged , Female , Humans , Logistic Models , Lower Extremity/blood supply , Male , Middle Aged , Odds Ratio , Postoperative Complications/mortality , Pulmonary Embolism/mortality , Retrospective Studies , Risk Factors , Upper Extremity/blood supply
18.
JAMA Surg ; 150(4): 332-6, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25692391

ABSTRACT

IMPORTANCE: The standard practice of irrigation and debridement (I&D) of open fractures within 6 hours of injury remains controversial. OBJECTIVE: To prospectively evaluate the effect of the time from injury to the initial I&D on infectious complications. DESIGN, SETTING, AND PARTICIPANTS: A total of 315 patients who were admitted to a level 1 trauma center with open extremity fractures from September 22, 2008, through June 21, 2011, were enrolled in a prospective observational study and followed up for 1 year after discharge (mean [SD] age, 33.9 [16.3] years; 79% male; 78.4% of injuries due to blunt trauma). Demographics, mechanism of injury, time to I&D, operative intervention, and incidence of local infectious complications were documented. Patients were stratified into 4 groups based on the time of I&D (<6 hours, 7-12 hours, 13-18 hours, and 19-24 hours after injury). Univariate and multivariable analyses were used to determine the effect of time to I&D on outcomes. MAIN OUTCOMES AND MEASURES: Development of local infectious complications at early (<30 days) or late (>30 days and <1 year) intervals from admission. RESULTS: The most frequently injured site was the lower extremity (70.2%), and 47.9% of all injuries were Gustilo classification type III. There was no difference in fracture location, degree of contamination, or antibiotic use between groups. All patients underwent I&D within 24 hours. Overall, 14 patients (4.4%) developed early wound infections, while 10 (3.2%) developed late wound infections (after 30 days). The infection rate was not statistically different on univariate (<6 hours, 4.7%; 7-12 hours, 7.5%; 13-18 hours, 3.1%; and 19-24 hours, 3.6%; P = .65) or multivariable analysis (<6-hour group [reference], P = .65; 7- to 12-hour group adjusted odds ratio [AOR] [95% CI], 2.1 [0.4-10.2], P = .37; 13- to 18-hour group AOR [95% CI], 0.8 [0.1-4.5], P = .81; 19- to 24-hour group AOR [95% CI], 1.1 [0.2-6.2], P = .90). Time to I&D did not affect the rate of nonunion, hardware failure, length of stay, or mortality. CONCLUSIONS AND RELEVANCE: In this prospective analysis, time to I&D did not affect the development of local infectious complications provided it was performed within 24 hours of arrival.
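The adjusted odds ratios above come from a multivariable logistic regression with the <6-hour group as the reference. A minimal sketch of that kind of model, built with statsmodels on simulated data, is shown below; the covariate names and values are hypothetical, and the output is not intended to reproduce the study's estimates.

```python
# Sketch of a multivariable logistic regression reporting adjusted odds ratios;
# data and covariate names are simulated and hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 315
df = pd.DataFrame({
    "infection": rng.integers(0, 2, n),
    "time_group": rng.choice(["<6h", "7-12h", "13-18h", "19-24h"], n),
    "gustilo_iii": rng.integers(0, 2, n),
    "lower_extremity": rng.integers(0, 2, n),
})

# The <6-hour group is the reference level, matching the abstract's comparison.
model = smf.logit(
    "infection ~ C(time_group, Treatment(reference='<6h')) + gustilo_iii + lower_extremity",
    data=df,
).fit(disp=False)

aor = np.exp(model.params)      # adjusted odds ratios
ci = np.exp(model.conf_int())   # 95% confidence intervals
print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```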


Subject(s)
Debridement , Fractures, Open/surgery , Therapeutic Irrigation , Time-to-Treatment , Adult , Female , Follow-Up Studies , Humans , Incidence , Male , Postoperative Complications/epidemiology , Prospective Studies , Surgical Wound Infection/epidemiology , Trauma Centers , Treatment Outcome
19.
Transfusion ; 55(3): 532-43, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25582335

ABSTRACT

BACKGROUND: The Mirasol system has been demonstrated to effectively inactivate white blood cells (WBCs) and reduce pathogens in whole blood in vitro. The purpose of this study was to compare the safety and efficacy of Mirasol-treated fresh whole blood (FWB) to untreated FWB in an in vivo model of surgical bleeding. STUDY DESIGN AND METHODS: A total of 18 anesthetized pigs (40 kg) underwent a 35% total blood volume bleed, cooling to 33°C, and a standardized liver injury. Animals were then randomly assigned to resuscitation with either Mirasol-treated or untreated FWB, and intraoperative blood loss was measured. After abdominal closure, the animals were observed for 14 days, after which they were euthanized and tissues were obtained for histopathologic examination. Mortality, tissue near-infrared spectroscopy, red blood cell (RBC) variables, platelets (PLTs), WBCs, and coagulation indices were analyzed. RESULTS: Total intraoperative blood loss was similar in test and control arms (8.3 ± 3.2 mL/kg vs. 7.7 ± 3.9 mL/kg, p = 0.720). All animals survived to Day 14. Trended values over time did not show significant differences: tissue oxygenation (p = 0.605), hemoglobin (p = 0.461), PLTs (p = 0.807), WBCs (p = 0.435), prothrombin time (p = 0.655), activated partial thromboplastin time (p = 0.416), thromboelastography (TEG)-reaction time (p = 0.265), or TEG-clot formation time (p = 0.081). Histopathology did not show significant differences between arms. CONCLUSIONS: Mirasol-treated FWB did not impact survival, blood loss, tissue oxygen delivery, RBC indices, or coagulation variables in a standardized liver injury model. These data suggest that Mirasol-treated FWB is both safe and efficacious in vivo.


Subject(s)
Blood Safety , Blood Transfusion/methods , Blood/drug effects , Blood/radiation effects , Hemorrhage/therapy , Resuscitation/methods , Riboflavin/pharmacology , Ultraviolet Rays , Animals , Blood Cells/drug effects , Blood Cells/radiation effects , Blood Coagulation Tests , Blood Preservation , Erythrocyte Indices , Female , Hemodilution , Hemorrhage/etiology , Hypothermia, Induced , Lacerations/complications , Lacerations/therapy , Laparotomy , Liver/injuries , Liver/pathology , Male , Random Allocation , Sus scrofa , Swine , Thrombelastography