1.
J Perinatol ; 2024 May 20.
Article in English | MEDLINE | ID: mdl-38769336

ABSTRACT

OBJECTIVE: To determine the safety and effectiveness of sodium bicarbonate administration in the management of metabolic acidemia and short-term outcomes in neonates with hypoxic-ischemic encephalopathy (HIE). STUDY DESIGN: Retrospective cohort study of neonates born at ≥35 weeks of gestation and receiving therapeutic hypothermia. Demographics, pH, lactate, base deficit, treatment, MRI findings, seizure incidence, and death prior to discharge were collected. RESULTS: In univariate analysis, the bicarbonate group had higher mortality (p = 0.010) and more injury on MRI (p = 0.008), primarily deep gray matter (p < 0.001) and cortical injury (p = 0.003), compared to controls. The combined outcome of death or abnormal MRI was not significantly associated with bicarbonate administration (OR 1.97, 95% CI 0.80-4.87, p = 0.141) after adjusting for sex, 5-minute Apgar score, and initial base deficit. CONCLUSION: This study demonstrated an association between bicarbonate use after HIE and worse short-term outcomes. Future prospective trials could overcome the treatment-bias limitation demonstrated in this retrospective study.

2.
J Arthroplasty ; 2024 Apr 09.
Article in English | MEDLINE | ID: mdl-38604279

ABSTRACT

BACKGROUND: Tibial bone defects are commonly encountered in revision total knee arthroplasty (rTKA) and can be managed with metaphyseal cones or sleeves. Few studies have directly compared tibial cones and sleeves in rTKA, and none have limited this comparison to the most severe tibial defects. The purpose of this study was to evaluate and compare the outcomes of metaphyseal cones and sleeves for tibial reconstruction in rTKA with regard to implant fixation and clinical outcomes. METHODS: A retrospective review was conducted of patients undergoing rTKA in which metaphyseal cones or sleeves were used to address metaphyseal bone loss (34 cones and 18 sleeves). Tibial bone loss was classified according to the Anderson Orthopaedic Research Institute bone defect classification, with types 2B and 3 included. Patient-reported outcomes and postoperative complications were collected, and radiographic evaluation of osseointegration or loosening was performed. RESULTS: There were 52 knees included (34 cones, 18 sleeves), with a median follow-up of 41.0 months. All-cause implant survival was 100% at 2 years and 96% (95% confidence interval: 76 to 99%) at 4 years, with 98% of tibial components demonstrating osseointegration at the final follow-up. During follow-up, there were a total of 11 revisions, of which 1 sleeve was revised secondary to implant loosening. Tibial sleeves had a higher risk of revision compared to tibial cones (P < .01), and sleeves fixed with a hybrid technique were more likely to need revision than cones fixed by the same method (P = .01). CONCLUSIONS: Porous metaphyseal tibial cones and metaphyseal sleeves both performed well at a 41-month median follow-up, with no difference in aseptic survivorship between the 2 constructs. Both demonstrated high rates of osseointegration, low rates of aseptic failure, and significant improvement in Knee Society Scores in patients with severe tibial defects in rTKA.

3.
Bone Joint J ; 106-B(3 Supple A): 38-43, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38423114

ABSTRACT

Aims: Oxidized zirconium (OxZi) and highly cross-linked polyethylene (HXLPE) were developed to minimize wear and risk of osteolysis in total hip arthroplasty (THA). However, retrieval studies have shown that scratched femoral heads may lead to runaway wear, and few reports of long-term results have been published. The purpose of this investigation is to report minimum ten-year wear rates and clinical outcomes of THA with OxZi femoral heads on HXLPE, and to compare them with a retrospective control group of cobalt chrome (CoCr) or ceramic heads on HXLPE. Methods: From 2003 to 2006, 108 THAs were performed on 96 patients using an OxZi head on an HXLPE liner, with minimum ten-year follow-up. Harris Hip Scores (HHS) were collected preoperatively and at the most recent follow-up (mean 13.3 years). Linear and volumetric liner wear was measured on radiographs of 85 hips with a minimum ten-year follow-up (mean 14.5 years). This was compared to a retrospective control group of 45 THAs using ceramic or CoCr heads from October 1999 to February 2005, with a minimum of ten years' follow-up. Results: Average HHS improved from 50.8 to 91.9 and 51.0 to 89.8 in the OxZi group and control group, respectively (p = 0.644), with no osteolysis in either group. Linear and volumetric wear rates in the OxZi group averaged 0.03 mm/year and 3.46 mm3/year, respectively. There was no statistically significant difference in HHS scores, nor in linear or volumetric wear rate between the groups, and no revision for any indication. Conclusion: The radiological and clinical outcomes, and survivorship of THA with OxZi femoral heads and HXLPE liners, were excellent, and comparable to CoCr or ceramic heads at minimum ten-year follow-up. Wear rates are below what would be expected for development of osteolysis. OxZi-HXLPE is a durable bearing couple with excellent long-term outcomes.


Subject(s)
Arthroplasty, Replacement, Hip , Hip Prosthesis , Osteolysis , Humans , Polyethylene , Zirconium , Retrospective Studies , Femur Head/surgery , Osteolysis/etiology , Osteolysis/surgery , Prosthesis Failure , Prosthesis Design , Chromium Alloys
4.
Ann Pharmacother ; : 10600280231221241, 2024 Jan 21.
Article in English | MEDLINE | ID: mdl-38247044

ABSTRACT

BACKGROUND: Phenobarbital may offer advantages over benzodiazepines for severe alcohol withdrawal syndrome (SAWS), but its impact on clinical outcomes has not been fully elucidated. OBJECTIVE: The purpose of this study was to determine the clinical impact of phenobarbital versus benzodiazepines for SAWS. METHODS: This retrospective cohort study compared phenobarbital to benzodiazepines for the management of SAWS in patients admitted to progressive or intensive care units (ICUs) between July 2018 and July 2022. Patients included had a history of delirium tremens (DT) or seizures, Clinical Institute Withdrawal Assessment of Alcohol-Revised (CIWA-Ar) >15, or Prediction of Alcohol Withdrawal Severity Scale (PAWSS) score ≥4. The primary outcome was hospital length of stay (LOS). Secondary outcomes included progressive or ICU LOS, incidence of adjunctive pharmacotherapy, and incidence/duration of mechanical ventilation. RESULTS: The final analysis included 126 phenobarbital and 98 benzodiazepine encounters. Patients treated with phenobarbital had a shorter median hospital LOS than those treated with benzodiazepines (2.8 vs 4.7 days; P < 0.0001), a finding corroborated by multivariable analysis. The phenobarbital group also had a shorter median progressive/ICU LOS (0.7 vs 1.3 days; P < 0.0001) and lower incidence of dexmedetomidine initiation (P < 0.0001) and antipsychotic initiation (P < 0.0001). Fewer patients in the phenobarbital group than in the benzodiazepine group received new mechanical ventilation (P = 0.045), but median duration was similar (1.2 vs 1.6 days; P = 1.00). CONCLUSION AND RELEVANCE: Scheduled phenobarbital was associated with decreased hospital LOS compared to benzodiazepines for SAWS. This was the first study to compare outcomes of fixed-dose, nonoverlapping phenobarbital to benzodiazepines in patients with clearly defined SAWS, and it details a readily implementable protocol.

5.
Infect Control Hosp Epidemiol ; 45(1): 35-39, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37466074

ABSTRACT

OBJECTIVE: To determine whether antiseptic-containing disinfecting caps alone are as effective at decreasing microbial colonization of vascular catheter connectors as antiseptic-containing caps plus a 5-second manual alcohol disinfection. SETTING: The study was conducted in a 718-bed, tertiary-care, academic hospital. PATIENTS: A convenience sample of adult patients across intensive care units and acute care wards with peripheral and central venous catheters covered with antiseptic-containing caps. METHODS: Quality improvement study completed over 5 days. The standard-of-care group consisted of catheter connectors with antiseptic-containing caps cleaned with a 5-second alcohol wipe scrub prior to culture. The comparison group consisted of catheter connectors with antiseptic-containing caps without a 5-second alcohol wipe scrub prior to culture. The connectors were pressed directly onto blood agar plates and incubated. Plates were assessed for growth after 48-72 hours. RESULTS: In total, 356 catheter connectors were cultured: 165 in the standard-of-care group, 165 in the comparison group, and 26 catheter connectors without an antiseptic-containing cap, which were designated as controls. Overall, 18 catheter connectors (5.06%) yielded microbial growth. Of the 18 connectors with microbial growth, 2 (1.21%) were from the comparison group, 1 (0.61%) was from the standard-of-care group, and 15 were controls without an antiseptic-containing cap. CONCLUSIONS: Bacterial colonization rates were similar between catheter connectors with antiseptic-containing caps cultured alone and those cultured after a 5-second alcohol scrub. This finding suggests that the use of antiseptic-containing caps obviates the need for additional manual disinfection.


Subject(s)
Anti-Infective Agents, Local , Catheter-Related Infections , Catheterization, Central Venous , Central Venous Catheters , Adult , Humans , Anti-Infective Agents, Local/pharmacology , Disinfection , Ethanol , Chlorhexidine/pharmacology , Catheter-Related Infections/prevention & control , Equipment Contamination/prevention & control
6.
Nutrients ; 15(19)2023 Oct 07.
Article in English | MEDLINE | ID: mdl-37836561

ABSTRACT

Little is known about the inflammatory potential of diet and its relation to bone health. This cross-sectional study examined the association between the inflammatory potential of diet and bone-related outcomes in Midwestern, post-menopausal women enrolled in the Heartland Osteoporosis Prevention Study (HOPS) randomized controlled trial. Dietary intake from the HOPS cohort was used to calculate Dietary Inflammatory Index (DII®) scores, which were energy-adjusted (E-DII) and analyzed by quartile. The association between E-DII and lumbar and hip bone mineral density (BMD) and lumbar trabecular bone scores (TBS; bone structure) was assessed using ANCOVA with pairwise comparisons, adjusting for relevant confounders (age, education, race/ethnicity, smoking history, family history of osteoporosis/osteopenia, BMI, physical activity, and calcium intake). The cohort included 272 women, who were predominantly white (89%) and educated (78% with a college degree or higher), with a mean BMI of 27 kg/m2, age of 55 years, and E-DII score of -2.0 ± 1.9 (more anti-inflammatory). After adjustment, E-DII score was not significantly associated with lumbar spine BMD (p = 0.53), hip BMD (p = 0.29), or TBS at any lumbar location (p > 0.05). Future studies should examine the longitudinal association between E-DII scores and bone health in larger, more diverse cohorts.


Subject(s)
Osteoporosis , Postmenopause , Humans , Female , Middle Aged , Cross-Sectional Studies , Diet , Bone Density , Absorptiometry, Photon , Lumbar Vertebrae
7.
Antioxidants (Basel) ; 12(4)2023 Apr 18.
Article in English | MEDLINE | ID: mdl-37107321

ABSTRACT

Inflammation plays a key role in cancer development. As an important modulator of inflammation, the role of diet should be explored. The purpose of this study was to determine the association between diets with a higher inflammatory potential, as measured by the Dietary Inflammatory Index (DII®), and cancer development in a cohort of rural post-menopausal women. Dietary intake from a randomized controlled trial cohort of rural, post-menopausal women in Nebraska was used to compute energy-adjusted DII (E-DII) scores at baseline and four years later (visit 9). A linear mixed model analysis and multivariate logistic regression evaluated the association between E-DII scores (baseline, visit 9, and change score) and cancer status. Of 1977 eligible participants, those who developed cancer (n = 91, 4.6%) had a significantly larger, pro-inflammatory change in E-DII scores (non-cancer: Δ 0.19 ± 1.43 vs cancer: Δ 0.55 ± 1.43, p = 0.02). After adjustment, the odds of cancer development were over 20% higher in those with a larger (more pro-inflammatory) change in E-DII scores than in those with smaller changes (OR = 1.21, 95% CI [1.02, 1.42], p = 0.02). Shifting to a more pro-inflammatory diet pattern over four years was thus associated with increased odds of cancer development, whereas E-DII at baseline or visit 9 alone was not.
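The adjusted odds ratio and confidence interval above come from a logistic regression. Converting a fitted coefficient and its standard error into an odds ratio with a Wald 95% interval is a short computation; the coefficient and standard error below are assumed values chosen only to illustrate the arithmetic, not the study's actual fit.

```python
from math import exp, log

def odds_ratio_ci(beta, se, z_crit=1.96):
    """Turn a logistic-regression coefficient (log-odds) and its
    standard error into an odds ratio with a Wald 95% CI."""
    return exp(beta), exp(beta - z_crit * se), exp(beta + z_crit * se)

# Hypothetical coefficient for a one-unit pro-inflammatory change
# in E-DII score (illustrative values, not the study's model)
or_, lo, hi = odds_ratio_ci(log(1.21), 0.084)
```

The interval is symmetric on the log-odds scale, which is why published CIs around an OR are asymmetric around the point estimate.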

8.
J Arthroplasty ; 38(6S): S326-S330, 2023 06.
Article in English | MEDLINE | ID: mdl-36813212

ABSTRACT

BACKGROUND: Periprosthetic joint infection (PJI) is a devastating complication of knee and hip arthroplasty. Past literature has shown that gram-positive bacteria are commonly responsible for these infections, although limited research exists on changes in the microbial profile of PJIs over time. This study sought to analyze the incidence and trends of pathogens responsible for PJI over three decades. METHODS: This is a multi-institutional retrospective review of patients who had a knee or hip PJI from 1990 to 2020. Patients with a known causative organism were included, and those with insufficient culture sensitivity data were excluded. There were 731 eligible joint infections from 715 patients identified. Organisms were divided into multiple categories based on genus/species, and 5-year increments were used to analyze the study period. Cochran-Armitage trend tests were used to evaluate linear trends in the microbial profile over time, and a P-value <.05 was considered statistically significant. RESULTS: There was a statistically significant positive linear trend in the incidence of methicillin-resistant Staphylococcus aureus over time (P = .0088) as well as a statistically significant negative linear trend in the incidence of coagulase-negative staphylococci over time (P = .0018). There was no statistically significant association between organism and affected joint (knee vs hip). CONCLUSION: The incidence of methicillin-resistant Staphylococcus aureus PJI is increasing over time, whereas coagulase-negative staphylococci PJI is decreasing, paralleling the global trend of antibiotic resistance. Identifying these trends may help with the prevention and treatment of PJI through methods such as remodeling perioperative protocols, modifying prophylactic/empiric antimicrobial approaches, or transitioning to alternative therapeutic strategies.
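The Cochran-Armitage test used above checks for a linear trend in a proportion across ordered groups (here, 5-year periods) via a normal approximation. A minimal sketch, with default integer scores per period and made-up counts that are not the study's data:

```python
from math import erfc, sqrt

def cochran_armitage(successes, totals, scores=None):
    """Two-sided Cochran-Armitage trend test (no continuity
    correction) for proportions across ordered groups."""
    scores = scores or list(range(len(successes)))
    N = sum(totals)
    p_bar = sum(successes) / N  # pooled proportion under H0
    t_stat = sum(t * r for t, r in zip(scores, successes))
    s1 = sum(t * n for t, n in zip(scores, totals))
    s2 = sum(n * t * t for t, n in zip(scores, totals))
    var = p_bar * (1 - p_bar) * (s2 - s1 * s1 / N)
    z = (t_stat - p_bar * s1) / sqrt(var)
    p = erfc(abs(z) / sqrt(2))  # two-sided normal p-value
    return z, p

# Hypothetical counts: pathogen-positive PJIs per period, of all PJIs
z, p = cochran_armitage([10, 20, 30], [100, 100, 100])
```

A rising proportion across periods gives a positive z; the sign encodes the direction of the trend.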


Subject(s)
Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Methicillin-Resistant Staphylococcus aureus , Prosthesis-Related Infections , Staphylococcal Infections , Humans , Arthroplasty, Replacement, Hip/adverse effects , Incidence , Coagulase/therapeutic use , Arthroplasty, Replacement, Knee/adverse effects , Retrospective Studies , Prosthesis-Related Infections/epidemiology , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/drug therapy , Anti-Bacterial Agents/therapeutic use , Staphylococcal Infections/epidemiology , Staphylococcal Infections/etiology , Staphylococcal Infections/drug therapy
9.
J Interv Card Electrophysiol ; 65(3): 765-772, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36056221

ABSTRACT

BACKGROUND: New left bundle branch block (LBBB) is frequently seen after TAVR and is a known risk factor for progression to high-degree AV block. The timing and likelihood of progression to complete heart block are variable, and it can develop after hospital discharge. We sought to determine predictors of high-degree AV block in patients who developed LBBB following TAVR. METHODS: All patients who developed LBBB following TAVR between 2014 and 2019 underwent an electrophysiology study. Data extracted included baseline characteristics, echocardiographic parameters, EKG variables, HV interval, and the need for subsequent pacemaker implantation. A prolonged HV interval was defined as ≥ 65 ms. Clinically significant conduction abnormality was defined as development of high-degree AV block or clinically significant complete heart block. RESULTS: Thirty-four patients were included in our study, of whom 10 (29.4%) developed clinically significant heart block and 24 (70.6%) did not. The mean HV interval was 70.1 ms for patients with clinically significant heart block vs 57.8 ms for those without (p = 0.022). Pre-existing first-degree heart block prior to TAVR (p = 0.026), history of atrial fibrillation (p = 0.05), and STS score (p = 0.037) were predictors of high-degree AV block in our patient population. CONCLUSIONS: In patients who develop LBBB following TAVR, the HV interval, pre-existing first-degree heart block, and STS score predict progression to high-degree AV block. A routine electrophysiology study should be considered for high-risk patients who develop LBBB following TAVR.


Subject(s)
Atrioventricular Block , Transcatheter Aortic Valve Replacement , Humans , Atrioventricular Block/epidemiology , Atrioventricular Block/etiology , Atrioventricular Block/therapy , Bundle-Branch Block/epidemiology , Bundle-Branch Block/etiology , Bundle-Branch Block/therapy , Transcatheter Aortic Valve Replacement/adverse effects
10.
J Cardiovasc Pharmacol ; 80(3): 471-475, 2022 09 01.
Article in English | MEDLINE | ID: mdl-35881901

ABSTRACT

Initial warfarin dosing and time in therapeutic range (TTR) are poorly characterized for early post-operative left ventricular assist device (LVAD) patients. This study evaluated TTR after LVAD implantation, compared between patients receiving low-dose (<3 mg) and high-dose (≥3 mg) warfarin. This single-center, retrospective analysis included 234 LVAD patients who received warfarin within 5 days of implantation. The primary outcome was TTR during the 5 days following the first international normalized ratio (INR) ≥2, compared between the low-dose and high-dose groups. Secondary outcomes were hospital and intensive care unit length of stay, time to first INR ≥2, TTR after first INR ≥2, and reinitiation of parenteral anticoagulation. No difference in TTR was detected between warfarin groups (57.2% vs. 62.7%, P = 0.13). Multivariable analysis did not detect any factors predictive of TTR during the primary outcome timeframe, but age and body mass index were associated with warfarin dose. The low-dose group received a mean warfarin dose of 1.9 mg (±0.64 mg), and the high-dose group received 4.34 mg (±1.38 mg). Cohort TTR was 60.5% during the primary outcome timeframe and 56.5% over the full hospitalization. The low-dose group had a longer intensive care unit length of stay, a shorter time to therapeutic INR, and more frequently reinitiated parenteral anticoagulation. Patients with recent LVAD implantation are complex and have diverse warfarin sensitivity factors, which did not allow for detection of an optimal warfarin dose, although half of all patients received doses between 2.04 mg and 4.33 mg. Individualized dosing should be used, adjusting for patient-specific factors such as age, body mass index, and drug interactions.
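The abstract does not state how TTR was computed, but the conventional approach is Rosendaal linear interpolation: the INR is assumed to move linearly between successive measurements, and TTR is the fraction of elapsed time the interpolated INR lies inside the target range. A minimal sketch (the INR trajectory below is invented for illustration):

```python
def ttr(days, inrs, low=2.0, high=3.0):
    """Rosendaal time-in-therapeutic-range: linearly interpolate the
    INR between measurements and return the fraction of elapsed time
    the interpolated INR lies within [low, high]."""
    total = in_range = 0.0
    for (t0, y0), (t1, y1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        dt = t1 - t0
        total += dt
        if y1 == y0:  # flat segment: entirely in range or entirely out
            in_range += dt if low <= y0 <= high else 0.0
            continue
        # fractions of the segment at which the INR crosses each bound
        u1, u2 = sorted(((low - y0) / (y1 - y0), (high - y0) / (y1 - y0)))
        in_range += max(0.0, min(1.0, u2) - max(0.0, u1)) * dt
    return in_range / total

# Invented early post-implant INR series over six days
fraction = ttr([0, 1, 3, 6], [1.5, 2.2, 2.8, 3.4])
```

Because interpolation weights by time rather than by measurement count, sparse sampling during stable periods does not bias the estimate the way a simple fraction-of-INRs-in-range would.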


Subject(s)
Heart-Assist Devices , Warfarin , Anticoagulants , Heart-Assist Devices/adverse effects , Humans , International Normalized Ratio , Retrospective Studies
11.
Int J Mol Sci ; 23(12)2022 Jun 16.
Article in English | MEDLINE | ID: mdl-35743155

ABSTRACT

B-cell chronic lymphocytic leukemia (CLL) results from intrinsic genetic defects and complex microenvironment stimuli that fuel CLL cell growth through an array of survival signaling pathways. Novel small-molecule agents targeting the B-cell receptor pathway and anti-apoptotic proteins, alone or in combination, have revolutionized the management of CLL, yet combination therapy carries significant toxicity and CLL remains incurable due to residual disease and relapse. Single-molecule inhibitors that can target multiple disease-driving factors are thus an attractive approach to combat both drug resistance and combination-therapy-related toxicities. We demonstrate that SRX3305, a novel small-molecule BTK/PI3K/BRD4 inhibitor that targets three distinctive facets of CLL biology, attenuates CLL cell proliferation and promotes apoptosis in a dose-dependent fashion. SRX3305 also inhibits the activation-induced proliferation of primary CLL cells in vitro and effectively blocks microenvironment-mediated survival signals, including stromal cell contact. Furthermore, SRX3305 blocks CLL cell migration toward CXCL12 and CXCL13, major chemokines involved in CLL cell homing and retention in microenvironment niches. Importantly, SRX3305 maintains its anti-tumor effects in ibrutinib-resistant CLL cells. Collectively, this study establishes the preclinical efficacy of SRX3305 in CLL, providing significant rationale for its development as a therapeutic agent for CLL and related disorders.


Subject(s)
Leukemia, Lymphocytic, Chronic, B-Cell , Cell Cycle Proteins/pharmacology , Humans , Leukemia, Lymphocytic, Chronic, B-Cell/pathology , Nuclear Proteins , Phosphatidylinositol 3-Kinases , Protein Kinase Inhibitors/pharmacology , Protein Kinase Inhibitors/therapeutic use , Receptors, Antigen, B-Cell/metabolism , Transcription Factors , Tumor Microenvironment
12.
Health Secur ; 20(3): 238-245, 2022.
Article in English | MEDLINE | ID: mdl-35675667

ABSTRACT

During the COVID-19 pandemic, academic health centers suspended clinical clerkships for students. A need emerged for innovative virtual curricula to continue fostering professional competencies. In March 2020, a multidisciplinary team from the University of Nebraska Medical Center had 2 weeks to create a course on the impact of infectious diseases that addressed the COVID-19 pandemic in real time for upper-level medical and physician assistant students. Content addressing social determinants of health, medical ethics, population health, service learning, health security, and emergency preparedness was interwoven throughout the course to emphasize critical roles during a pandemic. In total, 320 students were invited to complete a survey on knowledge gained and attitudes about the course objectives and materials, and 139 responded (response rate 43%). Students documented over 8,000 total hours of service learning; many created nonprofit organizations, aligned their initiatives with health system efforts, and partnered with community-based organizations. Thematic analysis of qualitative evaluations revealed that learners found the greatest value in the emphasis on social determinants of health, bioethics, and service learning. The use of predeveloped, asynchronous e-modules was widely noted as the least effective aspect of the course. The COVID-19 pandemic introduced substantial challenges in medical education but also provided trainees with an unprecedented opportunity to learn from real-world emergency preparedness and public health responses. The University of Nebraska Medical Center plans to create a health security elective that includes traditional competencies for emergency preparedness and interrogates the social and structural vulnerabilities that drive disproportionately worse outcomes among marginalized communities. With further evaluation, many components of the curriculum could be broadly scaled to meet the increasing need for more public health and health security medical education.


Subject(s)
COVID-19 , Civil Defense , Communicable Diseases , Curriculum , Humans , Pandemics/prevention & control
13.
J Arthroplasty ; 37(7S): S674-S677, 2022 07.
Article in English | MEDLINE | ID: mdl-35283230

ABSTRACT

BACKGROUND: Two-stage reimplantation is an effective treatment for periprosthetic joint infection (PJI). Many factors are involved in the variable success of this procedure. The purpose of this study was to examine the effect of patient risk factors, comorbidities, and the pathogen on reinfection rates following two-stage reimplantation. METHODS: We evaluated 158 patients treated for PJI from 2008 to 2019. Only patients who had completed a two-stage exchange were included. Patient demographics, comorbidities, laboratory values, time-to-reimplantation, pathogen, antibiotic sensitivities, host status, and reinfection rates were assessed. Multivariate analysis was performed to identify correlations between risk factors and reinfection. A P-value < .05 was considered statistically significant. RESULTS: 31 patients (19.6%) experienced a reinfection. There was a statistically significant association between infection with methicillin-sensitive Staphylococcus aureus (MSSA) and reinfection (P = .046). Patients with a reinfection also had a significantly greater median serum C-reactive protein (CRP) level at the time of diagnosis (12.65 mg/dL) than patients without a reinfection (5.0 mg/dL) (P = .010). Median erythrocyte sedimentation rate (ESR) (56 without reinfection vs 69 with reinfection) and time-to-reimplantation (101 days without reinfection vs 141 days with reinfection) trended toward an association with reinfection but were not statistically significant (P = .055 and P = .054, respectively). CONCLUSION: As the number of arthroplasties continues to rise, PJIs are increasing proportionately and represent a significant revision burden. Elevated CRP levels and MSSA infection were strongly associated with failure of a two-stage reimplantation. While not statistically significant in our series, there were strong trends toward an association between elevated ESR, longer time-to-reimplantation, and reinfection.


Subject(s)
Arthritis, Infectious , Arthroplasty, Replacement, Hip , Arthroplasty, Replacement, Knee , Prosthesis-Related Infections , Reinfection , Replantation , Staphylococcal Infections , Anti-Bacterial Agents/pharmacology , Anti-Bacterial Agents/therapeutic use , Arthritis, Infectious/etiology , Arthroplasty, Replacement, Hip/adverse effects , Arthroplasty, Replacement, Knee/adverse effects , C-Reactive Protein/analysis , Humans , Methicillin/pharmacology , Methicillin/therapeutic use , Prosthesis-Related Infections/diagnosis , Prosthesis-Related Infections/etiology , Reoperation , Retrospective Studies , Staphylococcal Infections/diagnosis , Staphylococcal Infections/drug therapy , Staphylococcal Infections/etiology
14.
J Arthroplasty ; 37(6S): S327-S332, 2022 06.
Article in English | MEDLINE | ID: mdl-35074448

ABSTRACT

BACKGROUND: Long-term reinfection and mortality rates and clinical outcomes in sufficiently large cohorts remain limited for patients undergoing two-stage exchange arthroplasty for chronic periprosthetic knee infection. The purpose of this study was to determine the long-term reinfection, complication, and mortality rates following reimplantation in two-stage exchange knee arthroplasty. METHODS: Retrospective review of 178 patients who underwent two-stage exchange knee arthroplasty for chronic periprosthetic joint infection (PJI) at three large tertiary referral institutions from 1990 to 2015, with an average follow-up of 6.63 years from reimplantation. Rates of reinfection, mortality, and all-cause revision were calculated, along with the cumulative incidence of reinfection with death as a competing risk. Risk factors for reinfection were determined using Cox multivariate regression analysis. RESULTS: The overall rate of infection eradication was 85.41%, with a mortality rate of 30.33%. Patients with minimum 5-year follow-up (n = 118, average 8.32 years) had an infection eradication rate of 88.98%, with a mortality rate of 33.05%. CONCLUSION: This is a large series with long-term follow-up evaluating outcomes of two-stage exchange knee arthroplasty, demonstrating adequate infection eradication but high mortality. Results were maintained at longer follow-up. This technique should be considered in patients with chronic PJI; however, realistic expectations regarding long-term outcomes must be discussed with patients.


Subject(s)
Arthroplasty, Replacement, Knee , Prosthesis-Related Infections , Anti-Bacterial Agents/therapeutic use , Arthroplasty, Replacement, Knee/adverse effects , Arthroplasty, Replacement, Knee/methods , Humans , Knee Joint/surgery , Prosthesis-Related Infections/epidemiology , Prosthesis-Related Infections/etiology , Prosthesis-Related Infections/surgery , Reinfection , Reoperation/adverse effects , Retrospective Studies , Treatment Outcome
15.
J Matern Fetal Neonatal Med ; 35(5): 907-913, 2022 Mar.
Article in English | MEDLINE | ID: mdl-32146832

ABSTRACT

OBJECTIVE: Neonatal brain injury is a potentially devastating cause of neurodevelopmental impairment. There is no consensus, however, on the appropriate timing and frequency of routine head ultrasound (HUS) screening for such injuries. We evaluated the diagnostic utility of routine HUS screening at 30 days of life ("late HUS") for detecting severe intraventricular hemorrhage (IVH) or cystic periventricular leukomalacia (c-PVL) in preterm infants with a negative HUS before 14 days of life ("early HUS"). METHODS: Single-center retrospective cohort analysis of infants born at ≤ 32 weeks gestational age (GA) admitted to the University of Nebraska Medical Center NICU from 2011 to 2018. Demographics and HUS and MRI diagnoses were abstracted from clinical records. Fisher's exact test and the t-test assessed associations for categorical and continuous variables, respectively. RESULTS: 205 infants were included: 120 very preterm (28-32 weeks GA) and 85 extremely preterm (<28 weeks GA). The negative predictive value of early HUS for clinically significant anomalies (severe IVH or c-PVL) on late HUS was 100% for extremely and 99.2% for very preterm infants. Term-equivalent MRI detected previously undiagnosed c-PVL in 16.7% of the 24 patients who received MRI; all infants with new c-PVL on MRI had severe IVH on early HUS. CONCLUSION: Following a negative early HUS, late HUS detected significant new abnormalities in only one infant. These data suggest that in a unit with a low prevalence of c-PVL, 30-day HUS may have limited clinical utility after negative screening. In infants with an abnormal early HUS, clinicians should consider term-equivalent MRI screening to detect c-PVL.
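Negative predictive value is the fraction of negative early screens that remain free of the outcome on later testing. A small sketch; the 2 x 2 counts below are assumed for illustration to be consistent with the reported 99.2% in very preterm infants, not taken from the paper:

```python
def npv(true_negatives, false_negatives):
    """Negative predictive value: among negative screens, the
    fraction that are truly free of the outcome."""
    return true_negatives / (true_negatives + false_negatives)

# Assumed counts: one very preterm infant with a negative early HUS
# later showed a significant abnormality on late HUS
very_preterm_npv = npv(119, 1)
```

Note that NPV depends on prevalence, which is why the authors caveat their conclusion for units with a higher prevalence of c-PVL.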


Subject(s)
Infant, Premature, Diseases , Leukomalacia, Periventricular , Cerebral Hemorrhage , Humans , Infant , Infant, Newborn , Infant, Premature , Leukomalacia, Periventricular/diagnostic imaging , Leukomalacia, Periventricular/epidemiology , Retrospective Studies , Ultrasonography
16.
Pediatr Emerg Care ; 38(1): e170-e172, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-32675710

ABSTRACT

OBJECTIVES: Abusive head trauma (AHT) is the leading cause of death from trauma in children less than 2 years of age. A delay in presentation for care has been reported as a risk factor for abuse; however, there has been limited research on this topic. We compared children diagnosed with AHT to children diagnosed with accidental head trauma to determine whether there was a delay in presentation. METHODS: We retrospectively studied children less than 6 years old who had an acute head injury and were admitted to the pediatric intensive care unit at a pediatric hospital from 2013 to 2017. Cases were reviewed to determine the duration from symptom onset to presentation to care and the nature of the head injury (abusive vs accidental). RESULTS: A total of 59 children met inclusion criteria. Patients who had AHT were significantly more likely to present to care more than 30 minutes after symptom onset (P = 0.0015). Children who had AHT were more likely to be younger (median, 4 vs 31 months; P < 0.0001) and to receive Medicaid (P < 0.0001) than those who had accidental head trauma. Patients who had AHT were more likely to have a longer length of stay (median, 11 vs 3 days; P < 0.0001) and were less likely to be discharged home than patients who had accidental head trauma (38% vs 84%; P = 0.0005). CONCLUSIONS: Children who had AHT were more likely to have a delayed presentation for care compared with children whose head trauma was accidental. A delay in care should prompt clinicians to strongly consider a workup for abusive injury.


Subject(s)
Child Abuse , Craniocerebral Trauma , Child , Child Abuse/diagnosis , Craniocerebral Trauma/diagnosis , Craniocerebral Trauma/epidemiology , Craniocerebral Trauma/etiology , Humans , Intensive Care Units, Pediatric , Retrospective Studies , Risk Factors , United States
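Group comparisons of proportions like those in the abstract above (e.g., discharged home, 38% vs 84%) are commonly tested with Fisher's exact test on a 2×2 table, especially with small counts. An illustrative sketch: the abstract does not state which test produced each P-value, and the example table below is a standard textbook one, not the study's data.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed table."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    def prob(x):  # P(top-left cell = x) under fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # small tolerance guards against floating-point ties
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Example 2x2 table [[8, 2], [1, 5]]; two-sided P is about 0.035.
print(round(fisher_exact_two_sided(8, 2, 1, 5), 4))
```

In practice a vetted implementation (e.g., `scipy.stats.fisher_exact`) would be used; the hand-rolled version only shows what the test computes.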
17.
Ann Pharmacother ; 56(4): 377-386, 2022 Apr.
Article in English | MEDLINE | ID: mdl-34282636

ABSTRACT

BACKGROUND: The gut microbiome plays a critical role in modulating the therapeutic effect of immune checkpoint inhibitors (ICIs). Proton pump inhibitors (PPIs) are commonly used in cancer patients and may affect the gut microbiome by altering gut pH. OBJECTIVE: To evaluate whether concurrent use of PPIs is associated with overall survival (OS) and progression-free survival (PFS) in patients with stage IV non-small-cell lung cancer (NSCLC), melanoma, renal cell carcinoma, transitional cell carcinoma, or head and neck squamous cell carcinoma. METHODS: This was a single-center retrospective cohort study of adult patients with advanced cancer who received nivolumab or pembrolizumab between September 1, 2014, and August 31, 2019. Concomitant PPI exposure was defined as PPI use 0 to 30 days before or after initiation of ICIs. Treatment outcomes were OS and PFS. RESULTS: A total of 233 patients were included in our study. Concomitant PPI use was not significantly associated with OS (hazard ratio [HR] = 1.22; 95% CI = 0.80-1.86) or PFS (HR = 1.05; 95% CI = 0.76-1.45) in patients with ICI use. The effect estimates were robust after adjusting for covariates in multivariate analysis and in patients with NSCLC. CONCLUSION AND RELEVANCE: Concomitant PPI use was not associated with the effectiveness of nivolumab or pembrolizumab. Potential predictors of survival outcomes related to PPI use in patients receiving immunotherapy, such as the time window and indication of PPI exposure and autoimmune disorders, should be explored in the future to better delineate the impact of PPIs on the effectiveness of ICI use.


Subject(s)
Carcinoma, Non-Small-Cell Lung , Lung Neoplasms , Carcinoma, Non-Small-Cell Lung/drug therapy , Humans , Immune Checkpoint Inhibitors/therapeutic use , Lung Neoplasms/drug therapy , Proton Pump Inhibitors/adverse effects , Retrospective Studies
18.
J Contin Educ Health Prof ; 42(1): e88-e91, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-34459438

ABSTRACT

INTRODUCTION: Rapid and accurate detection of the novel coronavirus using a nasopharyngeal specimen requires training for professionals who may have limited experience. To respond to the urgent need, an interprofessional team created a just-in-time (JIT) module to provide only what was needed, precisely when needed, and rapidly deployed training sessions to a large group of health professionals. METHODS: In April and May 2020, health professionals from the hospital, ambulatory clinics, and public health attended training. Procedural comfort/knowledge and perception of the training were assessed with pre- and post-surveys. RESULTS: Comfort level in collecting a nasopharyngeal specimen among participating health professionals increased from 2.89 (n = 338) on the pre-survey to 4.51 (n = 300) on the post-survey on a 5-point scale. Results revealed a significant difference (P < .01) between pre- and post-training knowledge questions regarding the correct angle and depth of the swab to obtain an adequate sample from the nasopharynx. DISCUSSION: This study demonstrates that a JIT intervention can improve knowledge and comfort regarding the nasopharyngeal swab procedure. In preparation for the prevention and mitigation of future viral outbreaks (ie, coronavirus and influenza), educators should consider creating JIT skills training for health care professionals who may be deployed to assist in mass testing efforts.


Subject(s)
COVID-19 , Simulation Training , COVID-19/epidemiology , COVID-19 Testing , Humans , Nasopharynx , Pandemics , SARS-CoV-2 , Specimen Handling/methods
19.
Orthop Clin North Am ; 52(4): 317-321, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34538344

ABSTRACT

This article is a retrospective review of a consecutive series of 401 primary total hip arthroplasties using the cementless, ream-and-broach Synergy stem (Smith & Nephew, Memphis, TN, USA) with a minimum 10-year follow-up. We report an overall 10-year survivorship of 99.6%, with a total of 15 fractures during the study period; six of these fractures occurred intraoperatively. To our knowledge, this is the largest series reporting greater than 10-year follow-up. This stem has excellent survivorship with an overall low risk of periprosthetic fracture.


Subject(s)
Arthroplasty, Replacement, Hip/adverse effects , Hip Prosthesis , Periprosthetic Fractures , Prosthesis Design , Arthroplasty, Replacement, Hip/instrumentation , Bone Cements , Cementation , Hip Prosthesis/adverse effects , Humans , Periprosthetic Fractures/etiology , Risk Factors
20.
J Dent Educ ; 2021 Aug 13.
Article in English | MEDLINE | ID: mdl-34387866

ABSTRACT

BACKGROUND: Technological advances and pedagogical shifts toward active learning have led dental academics to explore alternatives to traditional didactic lectures, yet questions remain regarding the effectiveness of new modalities at both relaying foundational knowledge and inspiring critical thinking. Here, we developed an integrative e-learning module on the subject of bone growth and recruited novice learners from undergraduate institutions to participate. The aim of the study was to investigate the impact of learning modality on novice learners' ability to apply newly acquired knowledge to critical thinking exercises related to dentistry. METHODS: In the fall of 2019, 42 undergraduate students from the University of Nebraska and Nebraska Wesleyan University campuses voluntarily participated in a study involving a pretest, an intervention, a posttest, and a retention test with a survey; results were analyzed. RESULTS: Our data reveal a significant difference between mean pre- and posttest scores within both the traditional-lecture and e-module cohorts (p < 0.0001) and no statistically significant difference between cohorts in posttest scores. Similarly, there was no significant difference in student performance on higher-level cognitive skill questions between cohorts, indicating that students learning via e-module were able to apply foundational knowledge to clinical scenarios similarly to students learning via content-expert lecture discussions. CONCLUSION: The authors shed light on an opportunity to integrate e-learning into dental education, relieving time constraints for faculty and meeting the needs of tech-savvy students, without compromising the fostering of critical thinking skills in future dentists.
