Results 1 - 20 of 31
1.
NAR Cancer ; 6(1): zcae007, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38406263

ABSTRACT

Diffuse large B-cell lymphoma (DLBCL) is a commonly diagnosed, aggressive non-Hodgkin's lymphoma. While R-CHOP chemoimmunotherapy is potentially curative, about 40% of DLBCL patients will experience treatment failure, highlighting the need to identify biomarkers to optimize management. SAMHD1 has a dNTPase-independent role in promoting resection to facilitate DNA double-strand break (DSB) repair by homologous recombination. We evaluated the relationship of SAMHD1 levels with sensitivity to DSB-sensitizing agents in DLBCL cells, as well as the association of SAMHD1 expression with clinical outcomes in 79 DLBCL patients treated with definitive therapy and in an independent cohort dataset of 234 DLBCL patients. Low SAMHD1 expression, or Vpx- or siRNA-mediated degradation/depletion of SAMHD1, in DLBCL cells was associated with greater sensitivity to doxorubicin and PARP inhibitors. On Kaplan-Meier log-rank survival analysis, low SAMHD1 expression was associated with improved overall survival (OS), which on subset analysis remained significant only in patients with advanced stage (III-IV) and moderate to high risk (International Prognostic Index [IPI] 2-5). The association of low SAMHD1 expression with improved OS remained significant on multivariate analysis independent of other adverse factors, including IPI, and was validated in an independent cohort. Our findings suggest that SAMHD1 expression mediates doxorubicin resistance and may be an important prognostic biomarker in advanced, higher-risk DLBCL patients.
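
The Kaplan-Meier log-rank comparison described above can be illustrated with a short Python sketch using the lifelines library; the DataFrame and column names ('os_months', 'death', 'samhd1_low') are hypothetical, not from the study.

```python
# Sketch of a Kaplan-Meier / log-rank comparison of overall survival by
# SAMHD1 expression. Assumes a pandas DataFrame with hypothetical
# columns: 'os_months' (follow-up), 'death' (1 = event, 0 = censored),
# and 'samhd1_low' (1 = low expression). Requires: pip install lifelines
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_os_by_samhd1(df: pd.DataFrame) -> float:
    low = df[df['samhd1_low'] == 1]
    high = df[df['samhd1_low'] == 0]

    # Fit and plot the two survival curves on one axis
    kmf = KaplanMeierFitter()
    ax = kmf.fit(low['os_months'], low['death'],
                 label='SAMHD1 low').plot_survival_function()
    kmf.fit(high['os_months'], high['death'],
            label='SAMHD1 high').plot_survival_function(ax=ax)

    # Log-rank test for a difference between the curves
    result = logrank_test(low['os_months'], high['os_months'],
                          event_observed_A=low['death'],
                          event_observed_B=high['death'])
    return result.p_value
```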

2.
Aging Ment Health ; 28(3): 473-481, 2024.
Article in English | MEDLINE | ID: mdl-37655598

ABSTRACT

OBJECTIVES: Disparities impacting dementia health care exist in racial/ethnic minority groups, including Asian Americans, an understudied population in Alzheimer's disease and related dementias. This qualitative study explored caregiving experiences and potential cultural influences among Asian Indian, Chinese, Korean, and Vietnamese family care partners of persons living with dementia. METHODS: We conducted focus groups and individual interviews with 32 care partners from these four Asian subgroups using Zoom, WeChat, or telephone. RESULTS: Four themes emerged from the data: (1) Family obligations influencing caregiving decisions; (2) Evolving challenges related to dementia caregiving; (3) Caregiving burdens/negative impacts from caregiving (relationship burdens and emotional distress); and (4) Coping with their situation in their own ways (cognitive, behavioral, and social strategies). CONCLUSION: Cultural values (e.g., familism or filial piety) played a significant role in caregiving decisions and experiences. There is a need to raise public awareness of dementia and to create culturally and linguistically appropriate training programs for this population.


Subject(s)
Alzheimer Disease , Asian , Caregivers , Humans , Caregivers/psychology , Ethnicity , Minority Groups
3.
Surg Obes Relat Dis ; 19(8): 808-816, 2023 08.
Article in English | MEDLINE | ID: mdl-37353413

ABSTRACT

BACKGROUND: Venous thromboembolism (VTE) is a leading cause of 30-day mortality after metabolic and bariatric surgery (MBS). Multiple predictive tools exist for VTE risk assessment and for determining the need for extended VTE chemoprophylaxis. OBJECTIVE: To review existing risk-stratification tools and compare their predictive abilities. SETTING: MBSAQIP database. METHODS: Retrospective analysis of the Metabolic and Bariatric Surgery Accreditation and Quality Improvement Program (MBSAQIP) database was performed (2015-2019) for primary minimally invasive MBS cases. VTE clinical factors and risk-assessment tools were evaluated: a body mass index threshold of 50 kg/m2; the Caprini risk-assessment model; and 3 bariatric-specific tools (the Cleveland Clinic VTE risk tool, the Michigan Bariatric Surgery Collaborative tool, and BariClot). Patients were deemed high risk based on criteria from each tool, and each tool was assessed for sensitivity, specificity, and positive predictive value. RESULTS: Overall, 709,304 patients were identified, with a 0.37% VTE rate. Bariatric-specific tools included multiple predictors: procedure, age, race, gender, operative time, length of stay, heart failure, and dyspnea at rest; operative time was the only variable common to all. The body mass index cutoff and the Caprini risk-assessment model had higher sensitivity but lower specificity when compared with the Michigan Bariatric Surgery Collaborative and BariClot tools. While the sensitivity of the tools varied widely and was overall low, the Cleveland Clinic tool had the highest sensitivity. The bariatric-specific tools would have recommended extended prophylaxis for 1.1%-15.6% of patients. CONCLUSIONS: Existing MBS VTE risk-assessment tools differ widely in inclusion variables, high-risk definitions, and predictive performance. Further research and registry inclusion of all significant risk factors are needed to determine the optimal risk-stratified approach for predicting VTE events and determining the need for extended prophylaxis.
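
A minimal sketch of the 2x2 metrics used to compare the tools, assuming each tool reduces to a binary high-risk flag scored against observed VTE events (function and inputs are illustrative, not from the MBSAQIP analysis):

```python
# Score a risk tool's binary high-risk flag against observed VTE events.
# `flagged` and `vte` are equal-length sequences of 0/1 values.
def tool_performance(flagged, vte):
    tp = sum(f and v for f, v in zip(flagged, vte))          # true positives
    fp = sum(f and not v for f, v in zip(flagged, vte))      # false positives
    fn = sum(not f and v for f, v in zip(flagged, vte))      # false negatives
    tn = sum(not f and not v for f, v in zip(flagged, vte))  # true negatives
    return {
        'sensitivity': tp / (tp + fn) if tp + fn else float('nan'),
        'specificity': tn / (tn + fp) if tn + fp else float('nan'),
        'ppv': tp / (tp + fp) if tp + fp else float('nan'),
    }
```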


Subject(s)
Bariatric Surgery , Venous Thromboembolism , Humans , Venous Thromboembolism/etiology , Venous Thromboembolism/prevention & control , Quality Improvement , Retrospective Studies , Postoperative Complications/etiology , Anticoagulants/therapeutic use , Bariatric Surgery/adverse effects , Bariatric Surgery/methods , Risk Factors
4.
Cureus ; 14(6): e25845, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35832750

ABSTRACT

Bidirectional ventricular tachycardia (BVT) is a rare ventricular dysrhythmia characterized by a beat-to-beat alternation of the QRS axis. This can sometimes manifest as alternating left and right bundle branch blocks. To the best of our knowledge, there are two previous reports of BVT in the setting of type I myocardial infarction; our case is the third, and it showed a subtle change in the anterior-posterior axis that can be seen in lead V2. Coronary angiography demonstrated severe multivessel coronary artery disease with total occlusion of the proximal dominant right coronary artery, 100% in-stent restenosis of the ostial left circumflex artery, 40% stenosis of the left main artery, and 90% stenosis of the mid left anterior descending artery (LAD). The BVT resolved after two amiodarone boluses followed by a drip. We attempted to transition to oral mexiletine; however, the patient was unable to tolerate the medication due to intractable nausea and vomiting. The patient subsequently underwent high-risk coronary artery bypass graft surgery, had no further episodes of BVT following revascularization, and was discharged after six weeks of hospitalization. Although rare, type I myocardial infarction is an important consideration in the differential diagnosis of BVT.

5.
Article in English | MEDLINE | ID: mdl-35206567

ABSTRACT

BACKGROUND: Many older adults suffer from poor oral health, including tooth loss, and disparities among racial/ethnic and socially disadvantaged populations persist. METHODS: Data were obtained from the National Health and Nutrition Examination Survey of the adult population in the U.S. The prevalence of edentulism was estimated, and multiple regression models were fitted, for 15,821 adults, including Asians, Blacks, Hispanics, Whites, and others, to assess the relationships between tooth loss and its predictors. RESULTS: The prevalence of complete tooth loss increased with age, from 0.7% for ages 20-44 to 20.2% for ages 65 and over. There were disparities in complete tooth loss by race/ethnicity, with the highest percentages (9%) among Whites and Blacks and the lowest among Asians (3%) and Hispanics (4%). After adjusting for predictors, their impact on tooth loss was not consistent across racial/ethnic groups, as Asians had more tooth loss from Model 1 (β = -1.974, p < 0.0001) to Model 5 (β = -1.1705, p < 0.0001). CONCLUSION: Tooth loss was significantly higher among older adults and certain racial/ethnic groups even after controlling for other predictors in a nationally representative sample. The findings indicate that subgroup-tailored prevention efforts are necessary.
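
For readers unfamiliar with the modeling approach, a hedged Python sketch of one such regression follows; the column names are hypothetical, and a real NHANES analysis would additionally apply the survey's sampling weights and design variables.

```python
# Illustrative model in the spirit of "Model 1": edentulism (0/1)
# regressed on age and race/ethnicity with statsmodels. Later models
# would add further covariates (income, education, smoking, ...).
import pandas as pd
import statsmodels.formula.api as smf

def fit_tooth_loss_model(df: pd.DataFrame):
    result = smf.logit('edentulous ~ age + C(race_ethnicity)', data=df).fit()
    print(result.summary())
    return result
```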


Subject(s)
Ethnicity , Tooth Loss , Aged , Humans , Nutrition Surveys , Racial Groups , Risk Factors , Tooth Loss/epidemiology , United States/epidemiology
6.
Mil Med ; 185(9-10): e1528-e1535, 2020 09 18.
Article in English | MEDLINE | ID: mdl-32962326

ABSTRACT

INTRODUCTION: Combined burn injury and hemorrhagic shock is a common injury pattern in wounded warfighters. Current protocols for resuscitation of isolated burn injury and isolated hemorrhagic shock are well defined, but the optimal strategy for combined injury is not fully established. Direct peritoneal resuscitation (DPR) has been shown to improve survival in rats after hemorrhagic shock, but its role in a combined burn/hemorrhage injury is unknown. We hypothesized that DPR would improve survival in mice subjected to combined burn injury and hemorrhage. MATERIALS AND METHODS: Male C57BL/6J mice aged 8 weeks were subjected to a 7-second, 30% total body surface area scald in a 90°C water bath. Following the scald, mice received DPR with 1.5 mL normal saline or 1.5 mL peritoneal dialysis solution (Delflex). Control mice received no peritoneal solution. Mice then underwent controlled hemorrhagic shock via femoral artery cannulation to a systolic blood pressure of 25 mm Hg for 30 minutes. Mice were then resuscitated to a target blood pressure with either lactated Ringer's (LR) or a 1:1 ratio of packed red blood cells (pRBCs) and fresh frozen plasma (FFP). Mice were observed for 24 hours following injury. RESULTS: Median survival time for mice with no DPR was 1.47 hours in combination with intravascular LR resuscitation and 2.08 hours with 1:1 pRBC:FFP. Median survival time improved significantly with the addition of intraperitoneal normal saline or Delflex. Mice that received DPR followed by 1:1 pRBC:FFP required less intravascular volume than mice that received DPR with LR, pRBC:FFP alone, or LR alone. Intraperitoneal Delflex was associated with higher levels of tumor necrosis factor alpha and macrophage inflammatory protein 1 alpha and lower levels of interleukin 10 and intestinal fatty acid binding protein. Intraperitoneal normal saline resulted in less lung injury 1 hour postresuscitation, but lung injury increased to a severity similar to that of Delflex by 4 hours. CONCLUSIONS: After a combined burn injury and hemorrhage, DPR leads to increased survival in mice. Survival was similar with the use of normal saline or Delflex. DPR with normal saline reduced the inflammatory response seen with Delflex and delayed the progression of acute lung injury. DPR may be a valuable strategy in the treatment of patients with combined burn injury and hemorrhage.


Subject(s)
Burns , Resuscitation , Shock, Hemorrhagic , Animals , Burns/complications , Burns/therapy , Disease Models, Animal , Humans , Male , Mice , Rats , Rats, Sprague-Dawley , Shock, Hemorrhagic/complications , Shock, Hemorrhagic/therapy
7.
J Surg Res ; 247: 453-460, 2020 03.
Article in English | MEDLINE | ID: mdl-31668606

ABSTRACT

BACKGROUND: Acute lung injury (ALI) is a frequent complication after severe trauma. Lung-protective ventilation strategies and damage control resuscitation have been proposed for the prevention of ALI; however, there are no clinical or laboratory parameters to predict who is at risk of developing ALI after trauma. In the present study, we explored pulmonary inflammatory markers as a potential predictor of ALI using a porcine model of hemorrhagic shock. MATERIALS AND METHODS: Female swine were randomized to mechanical ventilation with low tidal volume (VT) (6 mL/kg) or high VT (12 mL/kg). After equilibration, animals underwent pressure-controlled hemorrhage (mean arterial pressure [MAP] 35 ± 5 mmHg) for 1 h, followed by resuscitation with fresh whole blood or Hextend. They were maintained at a MAP of 50 ± 5 mmHg for 3 h in the postresuscitation phase. Bronchoalveolar lavage fluids were collected hourly and analyzed for inflammatory markers. Lung samples were taken, and porcine neutrophil antibody staining was used to evaluate the presence of neutrophils. Serum porcine surfactant protein D levels were evaluated by ELISA. Sham animals were used as negative controls. RESULTS: Pigs that underwent hemorrhagic shock had higher heart rates, lower cardiac output, lower MAPs, and worse acidosis compared with sham at the early time points (P < 0.05 each). There were no significant differences in central venous pressure or pulmonary capillary wedge pressure between groups. Pulmonary neutrophil infiltration, as defined by neutrophil antibody staining on lung samples, was greater in the shock groups regardless of resuscitation fluid (P < 0.05 each). Bronchoalveolar lavage fluid neutrophil levels were not different between groups. There were no differences in levels of porcine surfactant protein D between groups at any time point, and the levels did not change over time in each respective group. CONCLUSIONS: Our study demonstrates the reproducibility of a porcine model of hemorrhagic shock that is consistent with the physiologic changes seen in humans in hemorrhagic shock. Pulmonary neutrophil infiltration may serve as an early marker for ALI; however, the practical utility of this finding has yet to be determined.


Subject(s)
Acute Lung Injury/diagnosis , Neutrophils/immunology , Shock, Hemorrhagic/complications , Acute Lung Injury/immunology , Acute Lung Injury/physiopathology , Acute Lung Injury/prevention & control , Animals , Blood Transfusion , Bronchoalveolar Lavage Fluid/cytology , Bronchoalveolar Lavage Fluid/immunology , Cardiac Output/immunology , Disease Models, Animal , Female , Heart Rate/immunology , Humans , Lung/cytology , Lung/immunology , Lung/pathology , Neutrophil Infiltration , Predictive Value of Tests , Prognosis , Pulmonary Surfactant-Associated Protein D/analysis , Pulmonary Surfactant-Associated Protein D/immunology , Pulmonary Surfactant-Associated Protein D/metabolism , Reproducibility of Results , Respiration, Artificial/instrumentation , Respiration, Artificial/methods , Resuscitation/methods , Shock, Hemorrhagic/immunology , Shock, Hemorrhagic/therapy , Sus scrofa , Time Factors
8.
Thromb Res ; 185: 160-166, 2020 01.
Article in English | MEDLINE | ID: mdl-31821908

ABSTRACT

INTRODUCTION: During storage, packed red blood cells undergo a series of physical, metabolic, and chemical changes collectively known as the red blood cell storage lesion. One key component of the red blood cell storage lesion is the accumulation of microparticles, which are submicron vesicles shed from erythrocytes as part of the aging process. Previous studies from our laboratory indicate that transfusion of these microparticles leads to lung injury, but the mechanism underlying this process is unknown. In the present study, we hypothesized that microparticles from aged packed red blood cell units induce pulmonary thrombosis. MATERIALS AND METHODS: Leukoreduced, platelet-depleted, murine packed red blood cells (pRBCs) were prepared and then stored for up to 14 days. Microparticles were isolated from stored units via high-speed centrifugation. Mice were transfused with microparticles. The presence of pulmonary microthrombi was determined with light microscopy, Martius Scarlet Blue, and thrombocyte stains. In additional studies, microparticles were labelled with CFSE prior to injection. Murine lung endothelial cells were cultured, and P-selectin concentrations were determined by ELISA. In subsequent studies, P-selectin was inhibited by PSI-697 injection prior to transfusion. RESULTS: We observed an increase in microthrombi formation in the lung vasculature of mice receiving microparticles from stored packed red blood cell units as compared with controls. These microthrombi contained platelets, fibrin, and microparticles. Treatment of cultured lung endothelial cells with microparticles led to increased P-selectin in the media. Treatment of mice with a P-selectin inhibitor prior to microparticle infusion decreased microthrombi formation. CONCLUSIONS: These data suggest that microparticles isolated from aged packed red blood cell units promote the development of pulmonary microthrombi in a murine model of transfusion. This pro-thrombotic event appears to be mediated by P-selectin.


Subject(s)
Cell-Derived Microparticles , Thrombosis , Animals , Blood Preservation , Endothelial Cells , Erythrocytes , Lung , Mice , Mice, Inbred C57BL , P-Selectin
9.
Liver Transpl ; 25(11): 1673-1681, 2019 11.
Article in English | MEDLINE | ID: mdl-31518478

ABSTRACT

Obesity has become an epidemic in the United States over the past decade, and recent studies have shown this trend in the liver transplantation (LT) population. These patients may be candidates for laparoscopic sleeve gastrectomy (LSG) to promote significant and sustained weight loss to prevent recurrence of nonalcoholic steatohepatitis. However, safety remains a concern, and efficacy in this setting is uncertain. A single-institution database from 2014 to 2018 was queried for patients undergoing LSG following LT. The selection criteria for surgery were consistent with National Institutes of Health guidelines, and patients were at least 6 months after LT. A total of 15 patients (median age, 59.0 years; Caucasian, 86.7%; and female, 60%) underwent LSG following LT. Median time from LT to LSG was 2.2 years, with a median follow-up period of 2.6 years. The median hospital length of stay (LOS) was 2 days after LSG. Mortality and the rate of liver allograft rejection were both zero, and there was 1 postoperative complication (a surgical site infection). Following LSG, body mass index (BMI) decreased from 42.7 to 35.9 kg/m2 (P < 0.01), and in 12 patients with at least 1 year of follow-up, the total body weight loss was 20.6%. Following LSG in patients with diabetes, the median daily insulin requirements decreased from 98 (49-118) to 0 (0-29) units/day (P = 0.02), and 60% discontinued insulin. Post-LT patients had a similar decrease in BMI and reduction in comorbidities at 1 year compared with a matched non-LT patient cohort. In the largest patient series to date, we show that LSG following LT is safe, effective, and does not increase the incidence of liver allograft rejection. Larger, longer-term studies are needed to confirm the underlying metabolic changes following LSG.
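
The 20.6% figure above refers to percent total body weight loss (%TBWL); as a small worked example (the weights below are made up for illustration):

```python
# %TBWL = 100 * (preoperative weight - current weight) / preoperative weight
def percent_tbwl(weight_pre_kg: float, weight_post_kg: float) -> float:
    return 100.0 * (weight_pre_kg - weight_post_kg) / weight_pre_kg

# e.g., a drop from 130.0 kg to 103.2 kg gives ~20.6% TBWL:
print(round(percent_tbwl(130.0, 103.2), 1))  # 20.6
```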


Subject(s)
Bariatric Surgery/adverse effects , Graft Rejection/epidemiology , Liver Transplantation/adverse effects , Non-alcoholic Fatty Liver Disease/surgery , Obesity, Morbid/surgery , Secondary Prevention/methods , Bariatric Surgery/methods , Female , Gastrectomy/adverse effects , Gastrectomy/methods , Graft Rejection/etiology , Graft Rejection/prevention & control , Humans , Incidence , Laparoscopy/adverse effects , Laparoscopy/statistics & numerical data , Length of Stay , Male , Middle Aged , Non-alcoholic Fatty Liver Disease/etiology , Non-alcoholic Fatty Liver Disease/prevention & control , Obesity, Morbid/complications , Postoperative Period , Retrospective Studies , Secondary Prevention/statistics & numerical data , Surgical Wound Infection/epidemiology , Surgical Wound Infection/etiology , Time-to-Treatment , Treatment Outcome , Weight Loss
10.
Surgery ; 166(4): 632-638, 2019 10.
Article in English | MEDLINE | ID: mdl-31472973

ABSTRACT

BACKGROUND: The impact of recent preoperative opioid exposure on outcomes of colorectal surgery is unclear. Our aim was to evaluate the impact of preoperative opioid use on outcomes and opioid prescribing patterns after colorectal surgery. METHODS: We performed a retrospective review of all patients undergoing elective resection at a single institution from 2015 to 2017. Primary outcomes included in-hospital narcotic use and cost. Secondary outcomes included postoperative surgical outcomes and discharge prescribing patterns. RESULTS: A total of 390 patients underwent elective colorectal surgery, of whom 63 (16%) had a recent history of preoperative opioid use. Opioid users had similar age, sex, American Society of Anesthesiologists score, and operative indication compared with opioid-naïve patients (P > .05 for each). Postoperatively, the 30-day readmission rate was greater among opioid users (18% vs 9%, P = .03). Opioid users had greater total narcotic use (218 morphine milligram equivalents vs 111 morphine milligram equivalents, P = .04) and direct costs ($11,165 vs $8,911, P < .01). These patients were also more likely to require an opioid prescription on discharge (90% vs 68%, P < .01) and an opioid refill within 30 days (54% vs 21%, P < .01). CONCLUSION: Recent preoperative opioid exposure among colorectal surgery patients was associated with increased opioid consumption and costs. Moreover, unadjusted analysis showed more readmissions after surgery among preoperative opioid users. This work underscores the negative impact of preoperative, chronic opioid use on surgical outcomes and highlights the need for developing protocols to minimize perioperative narcotics.
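
The abstract does not state which statistical tests were used; for skewed totals such as morphine milligram equivalents, a nonparametric comparison like the Mann-Whitney U test is one common choice, sketched here with illustrative inputs:

```python
# Compare in-hospital narcotic totals (MME) between opioid-exposed and
# opioid-naive groups; a sketch, not the study's actual analysis.
from scipy.stats import mannwhitneyu

def compare_mme(exposed_mme, naive_mme):
    stat, p = mannwhitneyu(exposed_mme, naive_mme, alternative='two-sided')
    return stat, p

# Example with made-up values:
# compare_mme([180, 250, 220, 310], [90, 110, 130, 100])
```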


Subject(s)
Analgesics, Opioid/administration & dosage , Colorectal Neoplasms/surgery , Colorectal Surgery/methods , Elective Surgical Procedures/methods , Length of Stay/economics , Pain, Postoperative/drug therapy , Aged , Analgesics, Opioid/economics , Cohort Studies , Colorectal Neoplasms/mortality , Colorectal Neoplasms/pathology , Colorectal Surgery/mortality , Cost-Benefit Analysis , Elective Surgical Procedures/mortality , Female , Humans , Male , Middle Aged , Pain Measurement , Pain, Postoperative/physiopathology , Preoperative Period , Prognosis , Propensity Score , Retrospective Studies , Risk Assessment , Statistics, Nonparametric , Treatment Outcome
11.
Surg Open Sci ; 1(2): 74-79, 2019 Oct.
Article in English | MEDLINE | ID: mdl-32754696

ABSTRACT

BACKGROUND: Enhanced recovery protocols are associated with improved postoperative outcomes. However, data on outcomes following the implementation of an enhanced recovery protocol in colorectal cancer are limited. We set out to study the postoperative outcomes, opioid use patterns, and cost impact for patients undergoing colon or rectal resection for cancer. METHODS: A retrospective review of all elective colorectal cancer resections from January 2015 to June 2018 at a single institution was performed. Patient demographics, operative details, and postoperative outcomes were collected. Colon and rectal patients were studied separately, with comparison of patients before and after the implementation of an enhanced recovery protocol. RESULTS: One hundred ninety-two patients underwent elective colorectal resection for cancer. In January 2016, an enhanced recovery protocol was implemented for all elective resections: 71 patients (33 colon and 38 rectal) underwent surgery before implementation and 121 patients (56 colon and 65 rectal) after. There were no differences with regard to age, gender, or body mass index before or after implementation (all P > .05). For both colon and rectal cancer patients, the enhanced recovery protocol reduced time to regular diet (both P < .05) and length of stay (colon: 3 vs 4 days; rectal: 4 vs 6 days; both P < .01). Enhanced recovery protocol patients also consumed fewer total narcotics (colon: 44 vs 184 morphine milligram equivalents, P < .01; rectal: 121 vs 393 morphine milligram equivalents, P < .01). CONCLUSIONS: Enhanced recovery protocol use reduced length of stay and narcotic use, with similar total costs and no difference in 30-day complications, for both colon and rectal cancer resections.

12.
J Surg Res ; 231: 373-379, 2018 11.
Article in English | MEDLINE | ID: mdl-30278956

ABSTRACT

BACKGROUND: Minimizing the interval between diagnosis of sepsis and administration of antibiotics improves patient outcomes. We hypothesized that a commercially available bedside clinical surveillance visualization system (BSV) would hasten antibiotic administration and decrease length of stay (LOS) in surgical intensive care unit (SICU) patients. METHODS: A BSV, integrated with the electronic medical record and displayed at bedside, was implemented in our SICU in July 2016. A visual sepsis screen score (SSS) was added in July 2017. All patients admitted to SICU beds equipped with a BSV were analyzed to determine mean SSS, maximum SSS, time from positive SSS to antibiotic administration, SICU LOS, and mortality. RESULTS: During the study period, 232 patients were admitted to beds equipped with a BSV. Thirty patients demonstrated a positive SSS followed by confirmed sepsis (23 pre-visual SSS versus 7 post-visual SSS). Mean and maximum SSS were similar. Time from positive SSS to antibiotic administration was shorter in patients with a visual SSS (16.2 ± 9.2 h versus 55.3 ± 15.5 h; P < 0.05). SICU and hospital LOS were also decreased (P < 0.01). CONCLUSIONS: Implementation of a visual SSS into a BSV decreased the time interval between a positive SSS and administration of antibiotics and was associated with shorter SICU and hospital LOS. Integration of a visual decision support system may help providers adhere to Surviving Sepsis Guidelines.
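
The abstract does not describe the components of the commercial sepsis screen score; purely as a hypothetical sketch, a bedside rule of this kind might look like the following (the threshold and field names are invented for illustration):

```python
# Flag a patient when a sepsis screen score (SSS) crosses a threshold
# and compute the interval to antibiotic administration, if given.
from datetime import datetime
from typing import Optional, Tuple

SSS_THRESHOLD = 3  # illustrative value only

def check_sepsis_screen(sss_score: int, flagged_at: datetime,
                        antibiotics_at: Optional[datetime]
                        ) -> Tuple[bool, Optional[float]]:
    alert = sss_score >= SSS_THRESHOLD
    hours_to_abx = None
    if alert and antibiotics_at is not None:
        hours_to_abx = (antibiotics_at - flagged_at).total_seconds() / 3600
    return alert, hours_to_abx
```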


Subject(s)
Computer Systems , Critical Care/methods , Decision Support Systems, Clinical , Point-of-Care Testing , Postoperative Complications/diagnosis , Quality Improvement/statistics & numerical data , Sepsis/diagnosis , Adult , Aged , Anti-Bacterial Agents/therapeutic use , Cohort Studies , Critical Care/standards , Female , Guideline Adherence/statistics & numerical data , Humans , Length of Stay/statistics & numerical data , Male , Middle Aged , Postoperative Complications/drug therapy , Postoperative Complications/mortality , Practice Guidelines as Topic , Sepsis/drug therapy , Sepsis/etiology , Sepsis/mortality , Time Factors , Treatment Outcome
13.
Brain Inj ; 32(13-14): 1834-1842, 2018.
Article in English | MEDLINE | ID: mdl-30136863

ABSTRACT

BACKGROUND: Resuscitation strategies for combined traumatic brain injury (TBI) and haemorrhage in austere environments are not fully established. Our aim was to establish the effects of various saline concentrations in a murine model of combined TBI and haemorrhage, and to identify an effective resuscitative strategy for the far-forward environment. METHODS: Male C57BL/6 mice underwent closed head injury and were subjected to controlled haemorrhage to a systolic blood pressure of 25 mmHg via femoral artery cannulation for 60 min. Mice were resuscitated with a fixed-volume bolus or variable volumes of fluid to achieve a systolic blood pressure goal of 80 mmHg with 0.9% saline, 3% saline, a 0.1-mL bolus of 23.4% saline, or a 0.1-mL bolus of 23.4% saline followed by 0.9% saline (23.4+). RESULTS: Resuscitation with 23.4% saline or 23.4+ resulted in higher mortality at 6 h compared to 0.9% saline. Use of 3% saline required less volume to achieve targeted resuscitation, did not affect survival, and did not exacerbate post-traumatic inflammation. While 23.4+ resuscitation utilized a lower volume, it resulted in hypernatremia, azotemia, and elevated systemic pro-inflammatory cytokines. All groups except 3% saline demonstrated progression of neuron damage, with cerebral oedema highest with 0.9% saline. CONCLUSIONS: 3% saline demonstrated a favourable balance of survival, blood pressure restoration, minimization of inflammation, and prevention of ongoing neurologic injury without contributing to significant physiologic derangements. 23.4% saline administration may not be appropriate in the setting of concomitant hypotension.


Subject(s)
Brain Injuries, Traumatic/complications , Brain Injuries, Traumatic/drug therapy , Hemorrhage/complications , Hemorrhage/drug therapy , Resuscitation/methods , Saline Solution/therapeutic use , Animals , Blood-Brain Barrier/drug effects , Blood-Brain Barrier/pathology , Cytokines/metabolism , Disease Models, Animal , Dose-Response Relationship, Drug , Electrolytes/blood , Hemodynamics/drug effects , Male , Mice , Mice, Inbred C57BL , Survival Rate
14.
Shock ; 49(3): 288-294, 2018 03.
Article in English | MEDLINE | ID: mdl-29438268

ABSTRACT

Microparticles are submicron vesicles shed from aging erythrocytes as a characteristic feature of the red blood cell (RBC) storage lesion. Exposure of pulmonary endothelial cells to RBC-derived microparticles promotes an inflammatory response, but the mechanisms underlying microparticle-induced endothelial cell activation are poorly understood. In the present study, cultured murine lung endothelial cells (MLECs) were treated with microparticles isolated from aged murine packed RBCs or with vehicle. Microparticle-treated cells demonstrated increased expression of the adhesion molecules ICAM and E-selectin, as well as the cytokine IL-6. To identify mechanisms that mediate these effects of microparticles on MLECs, cells were treated with microparticles covalently bound to carboxyfluorescein succinimidyl ester (CFSE), and cellular uptake of microparticles was quantified via flow cytometry. Compared with controls, there was a greater proportion of CFSE-positive MLECs from 15 min up to 24 h, suggesting endocytosis of the microparticles by endothelial cells. Colocalization of microparticles with lysosomes was observed via immunofluorescence, indicating endocytosis and endolysosomal trafficking. This process was blocked by endocytosis inhibitors. siRNA knockdown of the Rab5 signaling protein in endothelial cells resulted in impaired microparticle uptake as compared with nonsense siRNA-treated cells, as well as an attenuation of the inflammatory response to microparticle treatment. Taken together, these data suggest that endocytosis of RBC-derived microparticles by lung endothelial cells results in endothelial cell activation. This response appears to be mediated, in part, by the Rab5 signaling protein.


Subject(s)
Cell-Derived Microparticles/metabolism , Endocytosis , Epithelial Cells/metabolism , Erythrocytes/metabolism , Lung/metabolism , Respiratory Mucosa/metabolism , rab5 GTP-Binding Proteins/metabolism , Animals , Epithelial Cells/cytology , Erythrocytes/cytology , Lung/cytology , Male , Mice , Respiratory Mucosa/cytology
15.
J Am Coll Surg ; 226(4): 586-593, 2018 04.
Article in English | MEDLINE | ID: mdl-29421693

ABSTRACT

BACKGROUND: Enhanced recovery pathways (ERPs) aim to reduce length of stay without adversely affecting short-term outcomes. High pharmaceutical costs associated with ERP regimens, however, remain a significant barrier to widespread implementation. We hypothesized that an ERP would reduce hospital costs after elective colorectal resections, despite the use of more expensive pharmaceutical agents. STUDY DESIGN: An ERP was implemented in January 2016 at our institution. We collected data on consecutive colorectal resections for 1 year before adoption of the ERP (traditional, n = 160) and compared them with consecutive resections after universal adoption of the ERP (n = 146). Short-term surgical outcomes, total direct costs, and direct hospital pharmacy costs were compared between patients who received the ERP and those who did not. RESULTS: After implementation of the ERP, median length of stay decreased from 5.0 to 3.0 days (p < 0.01). There were no differences in 30-day complications (8.1% vs 8.9%) or hospital readmission (11.9% vs 11.0%). The ERP patients required significantly less narcotics during their index hospitalization (211.7 vs 720.2 morphine equivalent units; p < 0.01) and tolerated a regular diet 1 day sooner (p < 0.01). Despite a higher daily pharmacy cost ($477 per day vs $318 per day in the traditional cohort), the total direct pharmacy cost for the hospitalization was reduced in ERP patients ($1,534 vs $1,859; p = 0.016). Total direct cost was also lower in ERP patients ($9,791 vs $11,508; p = 0.004). CONCLUSIONS: Implementation of an ERP for patients undergoing elective colorectal resection substantially reduced length of stay, total hospital cost, and direct pharmacy cost without increasing complications or readmission rates. An enhanced recovery pathway after colorectal resection has both clinical and financial benefits. Widespread implementation has the potential for a dramatic impact on healthcare costs.


Subject(s)
Colectomy/economics , Critical Pathways/economics , Direct Service Costs , Drug Costs , Hospital Costs , Proctectomy/economics , Adult , Aged , Female , Humans , Length of Stay/economics , Male , Middle Aged , Perioperative Care/economics
16.
J Gastrointest Surg ; 22(1): 98-106, 2018 01.
Article in English | MEDLINE | ID: mdl-28849353

ABSTRACT

BACKGROUND: Due to disparities in access to care, patients with Medicaid or no health insurance are at risk of not receiving appropriate adjuvant treatment following resection of pancreatic cancer. We have previously shown inferior short-term outcomes following surgery at safety-net hospitals. Subsequently, we hypothesized that safety-net hospitals caring for these vulnerable populations utilize less adjuvant chemoradiation, resulting in inferior long-term outcomes. METHODS: The American College of Surgeons National Cancer Data Base was queried for patients diagnosed with pancreatic adenocarcinoma (n = 32,296) from 1998 to 2010. Hospitals were grouped according to safety-net burden, defined as the proportion of patients with Medicaid or no insurance. The highest quartile, representing safety-net hospitals, was compared to lower-burden hospitals with regard to patient demographics, disease characteristics, surgical management, delivery of multimodal systemic therapy, and survival. RESULTS: Patients at safety-net hospitals were less often white, had lower income, and were less educated. Safety-net hospital patients were just as likely to undergo surgical resection (OR 1.03, p = 0.73), achieving similar rates of negative surgical margins when compared to patients at medium and low burden hospitals (70% vs. 73% vs. 66%). Thirty-day mortality rates were 5.6% for high burden hospitals, 5.2% for medium burden hospitals, and 4.3% for low burden hospitals. No clinically significant differences were noted in the proportion of surgical patients receiving either chemotherapy (48% vs. 52% vs. 52%) or radiation therapy (26% vs. 30% vs. 29%), or in the time between diagnosis and start of systemic therapy (58 days vs. 61 days vs. 53 days). Across safety-net burden groups, no difference was noted in stage-specific median survival (all p > 0.05) or in receipt of adjuvant as opposed to neoadjuvant systemic therapy (82% vs. 85% vs. 85%). Multivariate analysis adjusting for cancer stage revealed no difference in survival for safety-net hospital patients who had surgery and survived > 30 days (HR 1.02, p = 0.63). CONCLUSION: For patients surviving the perioperative period following pancreatic cancer surgery, safety-net hospitals achieve equivalent long-term survival outcomes, potentially due to delivery of multimodal therapy equivalent to that at non-safety-net hospitals. Safety-net hospitals are a crucial resource that provides quality long-term cancer treatment for vulnerable populations.


Subject(s)
Hospitals/statistics & numerical data , Pancreatic Neoplasms/therapy , Quality of Health Care/statistics & numerical data , Safety-net Providers/statistics & numerical data , Aged , Chemotherapy, Adjuvant/statistics & numerical data , Databases, Factual , Female , Hospitals/classification , Humans , Male , Medicaid/statistics & numerical data , Medically Uninsured/statistics & numerical data , Neoadjuvant Therapy/statistics & numerical data , Neoplasm Staging , Neoplasm, Residual , Pancreatectomy , Radiotherapy, Adjuvant/statistics & numerical data , Time Factors , United States
17.
Surgery ; 163(2): 423-429, 2018 02.
Article in English | MEDLINE | ID: mdl-29198748

ABSTRACT

BACKGROUND: Red blood cell-derived microparticles are biologically active, submicron vesicles shed by erythrocytes during storage. Recent clinical studies have linked the duration of red blood cell storage with thromboembolic events in critically ill transfusion recipients. In the present study, we hypothesized that microparticles from aged packed red blood cell units promote a hypercoagulable state in a murine model of transfusion. METHODS: Microparticles were isolated from aged, murine packed red blood cell units via serial centrifugation. Healthy male C57BL/6 mice were transfused with microparticles or an equivalent volume of vehicle, and whole blood was harvested for analysis via rotational thromboelastometry. Serum was harvested from a separate set of mice after microparticle or saline injection, and analyzed for fibrinogen levels. Red blood cell-derived microparticles were analyzed for their ability to convert prothrombin to thrombin. Finally, mice were transfused with either red blood cell microparticles or saline vehicle, and a tail bleeding time assay was performed after an equilibration period of 2, 6, 12, or 24 hours. RESULTS: Mice injected with red blood cell-derived microparticles demonstrated an accelerated clot formation time (109.3 ± 26.9 vs 141.6 ± 28.2 sec) and increased α angle (68.8 ± 5.0 degrees vs 62.8 ± 4.7 degrees) compared with control (each P < .05). Clotting time and maximum clot firmness were not significantly different between the 2 groups. Red blood cell-derived microparticles exhibited a hundredfold greater conversion of prothrombin substrate to its active thrombin form (66.60 ± 0.03 vs 0.70 ± 0.01 peak OD; P < .0001). Additionally, serum fibrinogen levels were lower in microparticle-injected mice compared with saline vehicle, suggesting thrombin-mediated conversion to insoluble fibrin (14.0 vs 16.5 µg/mL; P < .05). In the tail bleeding time model, there was a more rapid cessation of bleeding at 2 hours posttransfusion (90.6 vs 123.7 sec) and 6 hours posttransfusion (87.1 vs 141.4 sec) in microparticle-injected mice as compared with saline vehicle (each P < .05). There was no difference in tail bleeding time at 12 or 24 hours. CONCLUSION: Red blood cell-derived microparticles induce a transient hypercoagulable state through accelerated activation of clotting factors.


Subject(s)
Cell-Derived Microparticles , Thrombophilia , Transfusion Reaction , Animals , Blood Transfusion , Male , Mice, Inbred C57BL , Models, Animal
18.
Surgery ; 163(3): 528-534, 2018 03.
Article in English | MEDLINE | ID: mdl-29198768

ABSTRACT

BACKGROUND: Before elective colectomy, many advocate mechanical bowel preparation with oral antibiotics, whereas enhanced recovery pathways avoid mechanical bowel preparations. The optimal preparation for right versus left colectomy is also unclear. We sought to determine which strategy for bowel preparation decreases surgical site infection (SSI) and anastomotic leak (AL). METHODS: Elective colectomies from the National Surgical Quality Improvement Program colectomy database (2012-2015) were divided by (1) type of bowel preparation: no preparation (NP), mechanical preparation (MP), oral antibiotics (PO), or mechanical and oral antibiotics (PO/MP); and (2) type of colonic resection: right, left, or segmental colectomy. Univariate and multivariate analyses identified predictors of SSI and AL, and their risk-adjusted incidence was determined by logistic regression. RESULTS: When analyzed as odds ratios compared with NP, the PO and PO/MP groups were associated with a decrease in SSI (PO = 0.70 [0.55-0.88] and PO/MP = 0.47 [0.42-0.53]; P < .01). Use of PO/MP was associated with a decrease in SSI across all types of resections (right colectomy = 0.40 [0.33-0.50], left colectomy = 0.57 [0.47-0.68], and segmental colectomy = 0.43 [0.34-0.54]; P < .01). Similarly, use of PO/MP was associated with a decrease in AL in left colectomy (0.50 [0.37-0.69]; P < .01) and segmental colectomy (0.53 [0.36-0.80]; P < .01). CONCLUSION: Mechanical bowel preparation with oral antibiotics is the preferred preoperative preparation strategy in elective colectomy because of the decreased incidence of SSI and AL.
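
A sketch of the risk-adjusted odds-ratio estimation described above, using logistic regression with NP as the reference category; the variable names are hypothetical, not the NSQIP field names.

```python
# Logistic regression of SSI on preparation type plus covariates;
# exponentiated coefficients are the adjusted odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def ssi_odds_ratios(df: pd.DataFrame) -> pd.DataFrame:
    res = smf.logit(
        "ssi ~ C(prep, Treatment(reference='NP')) + age + C(resection_type)",
        data=df,
    ).fit()
    ors = np.exp(res.params).rename('OR')       # odds ratios vs. NP
    ci = np.exp(res.conf_int()).rename(columns={0: 'lo95', 1: 'hi95'})
    return pd.concat([ors, ci], axis=1)
```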


Subject(s)
Anastomotic Leak/prevention & control , Antibiotic Prophylaxis , Cathartics/therapeutic use , Colectomy/adverse effects , Preoperative Care , Surgical Wound Infection/prevention & control , Administration, Oral , Adolescent , Adult , Aged , Aged, 80 and over , Anastomotic Leak/epidemiology , Databases, Factual , Female , Humans , Incidence , Logistic Models , Male , Middle Aged , Odds Ratio , Retrospective Studies , Surgical Wound Infection/epidemiology , United States , Young Adult
19.
Surgery ; 163(3): 565-570, 2018 03.
Article in English | MEDLINE | ID: mdl-29195735

ABSTRACT

BACKGROUND: Despite the potential benefits of social media, health care providers are often hesitant to engage patients through these sites. Our aim was to explore how implementation of social media may affect patient engagement and satisfaction. METHODS: In September 2016, a Facebook support group was created for liver transplant patients to use as a virtual community forum. Data including user demographics and group activity were reviewed. A survey was conducted evaluating users' perceptions regarding participation in the group. RESULTS: Over 9 months, 350 unique users (50% liver transplant patients, 36% caregivers/friends, 14% health care providers) contributed 339 posts, 2,338 comments, and 6,274 reactions to the group; 98% of posts were reacted to or commented on by other group members. Patients were the most active users, compared with health care providers and caregivers. A total of 95% of survey respondents reported that joining the group had a positive impact on their care, and 97% reported that their main motivation for joining was to provide or receive support from other patients. CONCLUSION: This pilot study indicates that integrating social media into clinical practice can enable surgeons to build an effective patient support community that augments patient engagement and satisfaction.


Subject(s)
Liver Transplantation , Patient Participation , Social Media , Social Support , Adolescent , Adult , Female , Humans , Male , Middle Aged , Motivation , Patient Satisfaction , Pilot Projects , Qualitative Research , Time Factors , Young Adult
20.
HPB (Oxford) ; 20(3): 268-276, 2018 03.
Article in English | MEDLINE | ID: mdl-28988703

ABSTRACT

BACKGROUND: We aimed to characterize variability in cost after straightforward orthotopic liver transplant (OLT). METHODS: Using the University HealthSystem Consortium and Scientific Registry of Transplant Recipients databases, we identified patients who underwent OLT between 2011 and 2014. Patients meeting criteria for straightforward OLT, defined as a length of stay < 14 days with discharge to home, were selected (n = 5763) and grouped into tertiles (low, medium, high) according to the cost of the perioperative stay. RESULTS: Patients undergoing straightforward OLT had similar demographics regardless of cost. High-cost patients were more likely to require preoperative hemodialysis and had higher severity of illness and higher model for end-stage liver disease (MELD) scores (p < 0.01). High-cost patients required greater utilization of resources, including lab tests, blood transfusions, and opioids (p < 0.01). Despite having a higher burden of disease and requiring increased resource utilization, high-cost OLT patients with a straightforward perioperative course had 2-year graft and overall survival identical to that of lower-cost patients (p = 0.82 and p = 0.63, respectively). CONCLUSION: Providing adequate perioperative care for OLT patients with higher severity of illness and disease burden requires increased cost and resource utilization; however, doing so provides these patients with long-term survival equivalent to that of more routine patients.
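
The tertile grouping described above is straightforward to reproduce; a minimal pandas sketch follows ('periop_cost' is a hypothetical column name).

```python
# Split patients into low/medium/high cost tertiles by perioperative cost.
import pandas as pd

def add_cost_tertiles(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out['cost_tertile'] = pd.qcut(out['periop_cost'], q=3,
                                  labels=['low', 'medium', 'high'])
    return out
```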


Subject(s)
End Stage Liver Disease/economics , End Stage Liver Disease/surgery , Hospital Costs , Liver Transplantation/economics , Outcome and Process Assessment, Health Care/economics , Adolescent , Adult , Aged , End Stage Liver Disease/diagnosis , Female , Graft Survival , Health Status , Humans , Length of Stay/economics , Liver Transplantation/adverse effects , Male , Middle Aged , Patient Discharge/economics , Postoperative Care/economics , Renal Dialysis/economics , Retrospective Studies , Severity of Illness Index , Time Factors , Treatment Outcome , United States , Young Adult