Results 1 - 20 of 345
1.
Front Med (Lausanne) ; 11: 1373593, 2024.
Article in English | MEDLINE | ID: mdl-38756942

ABSTRACT

Objective: To examine the impact of the introduction of the Universal Anaesthesia Machine (UAM), a device designed for clinical environments with limited perioperative resources, on the choice of general anesthesia technique and safe anesthesia practice in a tertiary-care hospital in Sierra Leone. Methods: We introduced the UAM at Connaught Hospital, Freetown, Sierra Leone, and conducted a prospective observational study of anesthesia practice, with an examination of perioperative clinical parameters among surgical patients, to determine the usability of the device, its impact on anesthesia capacity, and changes in general anesthesia technique. Findings: We observed a shift from ketamine total intravenous anesthesia to inhalational anesthesia. This shift was most evident in anesthesia care for appendectomies and surgical wound management. In 10 of 17 power outages that occurred during inhalational general anesthesia, anesthesia delivery was uninterrupted because the inhalational anesthetic was being delivered with the UAM. Conclusion: Anesthesia technologies tailored to austere environments can support the delivery of safe anesthesia care while maintaining fidelity to recommended international anesthesia practice standards.

2.
Front Med (Lausanne) ; 11: 1326144, 2024.
Article in English | MEDLINE | ID: mdl-38444409

ABSTRACT

Introduction: Intravenous (IV) therapy is a crucial aspect of care for the critically ill patient. Barriers to IV infusion pumps in low-resource settings include high costs, lack of access to electricity, and insufficient technical support. Inaccuracy of traditional drop-counting practices places patients at risk. By conducting a comparative assessment of IV infusion methods, we analyzed the efficacy of different devices and identified the one that most effectively bridges the gap between accuracy, cost, and electricity reliance in low-resource environments. Methods: In this prospective mixed-methods study, nurses, residents, and medical students used drop counting, a manual flow regulator, an infusion pump, a DripAssist, and a DripAssist with a manual flow regulator to collect normal saline at goal rates of 240, 120, and 60 mL/h. Each participant's station setup time and the volume of fluid collected in 10 minutes (in milliliters) were recorded. Participants then completed a post-trial survey rating each method (on a scale of 1 to 5) for understandability, time consumption, and operability. Cost-effectiveness for use in low-resource settings was also evaluated. Results: The manual flow regulator had the fastest setup time, was the most cost-effective, and was rated as the least time consuming to use and the easiest to understand and operate. In contrast, the combination of the DripAssist and manual flow regulator was the most time consuming to use and the hardest to understand and operate. Conclusion: The manual flow regulator alone was the least time consuming, the easiest to operate, and the most cost-effective. The DripAssist/manual flow regulator combination increased accuracy, but it was the most difficult to operate. Healthcare providers can adapt these devices to their practice environments and improve the safety of rate-sensitive IV medications without significant strain on electricity, time, or personnel resources.
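To illustrate how infusion accuracy can be quantified from the kind of timed collection described above, the following Python sketch computes the volume expected over 10 minutes at each goal rate, the percent deviation of a measured volume from that target, and the drop-counting rate implied by a drip factor. The measured volumes and the 20 gtt/mL drip factor are hypothetical assumptions, not values from the study.

```python
# Illustrative only: quantifying IV infusion accuracy from a 10-minute timed
# collection, as in the comparison described above. The goal rates (240, 120,
# 60 mL/h) come from the abstract; the measured volumes and the 20 gtt/mL drip
# factor are hypothetical assumptions.

def expected_volume_ml(goal_rate_ml_per_h: float, minutes: float = 10) -> float:
    """Volume that should be collected at the goal rate over the timed window."""
    return goal_rate_ml_per_h * minutes / 60.0

def percent_error(measured_ml: float, goal_rate_ml_per_h: float) -> float:
    """Deviation of the delivered volume from the goal, as a percentage."""
    expected = expected_volume_ml(goal_rate_ml_per_h)
    return (measured_ml - expected) / expected * 100.0

def drops_per_minute(goal_rate_ml_per_h: float, drip_factor_gtt_per_ml: int = 20) -> float:
    """Drop-counting target for a given goal rate (drip factor is an assumption)."""
    return goal_rate_ml_per_h * drip_factor_gtt_per_ml / 60.0

if __name__ == "__main__":
    for rate, measured in [(240, 37.5), (120, 21.0), (60, 9.2)]:  # hypothetical volumes
        print(f"{rate} mL/h: expected {expected_volume_ml(rate):.1f} mL, "
              f"error {percent_error(measured, rate):+.1f}%, "
              f"target {drops_per_minute(rate):.0f} gtt/min")
```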

3.
World Neurosurg ; 180: e449-e459, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37769846

ABSTRACT

OBJECTIVE: Preoperative assessment is important for neurosurgical risk stratification, but the level of evidence for individual screening tests is low. Preoperative urinalysis (UA) testing may significantly increase costs and lead to inappropriate antibiotic treatment. We prospectively evaluated whether eliminating preoperative UA was noninferior to routine preoperative UA, as measured by 30-day readmission for surgical site infection, in adult elective neurosurgical procedures. METHODS: A single-institution prospective, pragmatic study of patients receiving elective neurosurgical procedures from 2018 to 2020 was conducted. Patients were allocated based on same-day versus preoperative admission status. Rates of preoperative UA and subsequent wound infection were measured along with detailed demographic, surgical, and laboratory data. RESULTS: The study included 879 patients. The most common types of surgery were cranial (54.7%), spine (17.4%), and stereotactic/functional (19.5%). No preoperative UA was performed in 315 patients, while 564 underwent UA. Of tested patients, 103 (18.3%) met criteria for suspected urinary tract infection, and 69 (12.2%) received subsequent antibiotic treatment. Fourteen patients were readmitted within 30 days for subsequent wound infection (7 without UA [2.2%] vs. 7 with UA [1.2%]), a risk difference of 0.98% (95% confidence interval -0.89% to 2.85%). The upper limit of the confidence interval exceeded the preselected noninferiority margin of 1%. CONCLUSIONS: In this prospective study of preoperative UA for elective neurosurgical procedures using a pragmatic, real-world design, risk of readmission due to surgical site infection was very low across the study cohort, suggesting a limited role for preoperative UA in elective neurosurgical procedures.


Subject(s)
Surgical Wound Infection , Urinary Tract Infections , Adult , Humans , Surgical Wound Infection/diagnosis , Surgical Wound Infection/epidemiology , Surgical Wound Infection/prevention & control , Prospective Studies , Urinalysis , Anti-Bacterial Agents/therapeutic use , Spine , Urinary Tract Infections/diagnosis , Urinary Tract Infections/etiology , Urinary Tract Infections/prevention & control
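The noninferiority comparison reported above can be reconstructed from the abstract's counts (7/315 readmissions without preoperative UA vs. 7/564 with UA). The sketch below uses a Wald interval for the risk difference and checks it against the prespecified 1% margin; the Wald method is an assumption about how the interval was derived, but it reproduces the reported 0.98% (-0.89% to 2.85%) and the conclusion that the upper limit exceeds the margin.

```python
# A minimal sketch of the noninferiority comparison described above, using the
# counts reported in the abstract (7/315 readmissions without preoperative UA
# vs 7/564 with UA) and a Wald interval for the risk difference. The 1%
# noninferiority margin is the abstract's; the Wald method is an assumption
# about how the interval was computed.
from math import sqrt

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

diff, lo, hi = risk_difference_ci(7, 315, 7, 564)
margin = 0.01  # prespecified noninferiority margin of 1%
print(f"risk difference {diff:.2%} (95% CI {lo:.2%} to {hi:.2%})")
print("noninferior" if hi < margin else "noninferiority not shown (upper CI exceeds margin)")
```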
4.
Nat Cancer ; 4(9): 1258-1272, 2023 09.
Article in English | MEDLINE | ID: mdl-37537301

ABSTRACT

The accepted paradigm for both cellular and anti-tumor immunity relies upon tumor cell killing by CD8+ T cells recognizing cognate antigens presented in the context of target cell major histocompatibility complex (MHC) class I (MHC-I) molecules. Likewise, a classically described mechanism of tumor immune escape is tumor MHC-I downregulation. Here, we report that CD8+ T cells maintain the capacity to kill tumor cells that are entirely devoid of MHC-I expression. This capacity proves to be dependent instead on interactions between T cell natural killer group 2D (NKG2D) and tumor NKG2D ligands (NKG2DLs), the latter of which are highly expressed on MHC-loss variants. Necessarily, tumor cell killing in these instances is antigen independent, although prior T cell antigen-specific activation is required and can be furnished by myeloid cells or even neighboring MHC-replete tumor cells. In this manner, adaptive priming can beget innate killing. These mechanisms are active in vivo in mice as well as in vitro in human tumor systems and are obviated by NKG2D knockout or blockade. These studies challenge the long-advanced notion that downregulation of MHC-I is a viable means of tumor immune escape and instead identify the NKG2D-NKG2DL axis as a therapeutic target for enhancing T cell-dependent anti-tumor immunity against MHC-loss variants.


Subject(s)
CD8-Positive T-Lymphocytes , Neoplasms , Animals , Humans , Mice , Antigens/metabolism , CD8-Positive T-Lymphocytes/pathology , Histocompatibility Antigens Class I/genetics , Histocompatibility Antigens Class I/metabolism , Neoplasms/genetics , NK Cell Lectin-Like Receptor Subfamily K/genetics , NK Cell Lectin-Like Receptor Subfamily K/metabolism
5.
BioDrugs ; 37(4): 489-503, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37256535

ABSTRACT

Glioblastoma is highly aggressive and remains difficult to treat despite being the most common malignant primary brain tumor in adults. Current standard-of-care treatment calls for maximum resection of the tumor mass followed by concurrent chemotherapy and radiotherapy and further adjuvant chemotherapy if necessary. Despite this regimen, prognosis remains grim. Immunotherapy has shown promising success in a variety of solid tumor types, but efficacy in glioblastoma is yet to be demonstrated. Barriers to the success of immunotherapy in glioblastoma include: a heterogeneous tumor cell population, a highly immunosuppressive microenvironment, and the blood-brain barrier, to name a few. Several immunotherapeutic approaches are actively being investigated and developed to overcome these limitations. In this review, we present different classes of immunotherapy targeting glioblastoma, their most recent results, and potential future directions.


Subject(s)
Brain Neoplasms , Glioblastoma , Adult , Humans , Glioblastoma/therapy , Immunotherapy , Blood-Brain Barrier , Brain Neoplasms/therapy , Immunosuppressive Agents , Tumor Microenvironment
6.
Int J Sports Physiol Perform ; 18(6): 625-633, 2023 Jun 01.
Article in English | MEDLINE | ID: mdl-37059425

ABSTRACT

PURPOSE: To assess objective strain and subjective muscle soreness in "Bigs" (offensive and defensive line), "Combos" (tight ends, quarterbacks, line backers, and running backs), and "Skills" (wide receivers and defensive backs) in American college football players during off-season, fall camp, and in-season phases. METHODS: Twenty-three male players were assessed once weekly (3-wk off-season, 4-wk fall camp, and 3-wk in-season) for hydroperoxides (free oxygen radical test [FORT]), antioxidant capacity (free oxygen radical defense test [FORD]), oxidative stress index (OSI), countermovement-jump flight time, Reactive Strength Index (RSI) modified, and subjective soreness. Linear mixed models analyzed the effect of a 2-within-subject-SD change between predictor and dependent variables. RESULTS: Compared to fall camp and in-season phases, off-season FORT (P ≤ .001 and <.001), FORD (P ≤ .001 and <.001), OSI (P ≤ .001 and <.001), flight time (P ≤ .001 and <.001), RSI modified (P ≤ .001 and <.001), and soreness (P ≤ .001 and <.001) were higher for "Bigs," whereas FORT (P ≤ .001 and <.001) and OSI (P = .02 and <.001) were lower for "Combos." FORT was higher for "Bigs" compared to "Combos" in all phases (P ≤ .001, .02, and .01). FORD was higher for "Skills" compared with "Bigs" in off-season (P = .02) and "Combos" in-season (P = .01). OSI was higher for "Bigs" compared with "Combos" (P ≤ .001) and "Skills" (P = .01) during off-season and to "Combos" in-season (P ≤ .001). Flight time was higher for "Skills" in fall camp compared with "Bigs" (P = .04) and to "Combos" in-season (P = .01). RSI modified was higher for "Skills" during off-season compared with "Bigs" (P = .02) and "Combos" during fall camp (P = .03), and in-season (P = .03). CONCLUSION: Off-season American college football training resulted in higher objective strain and subjective muscle soreness in "Bigs" compared with fall camp and during in-season compared with "Combos" and "Skills" players.


Subject(s)
Football , Myalgia , Humans , Male , United States , Football/physiology , Seasons , Reactive Oxygen Species , Universities
7.
Adv Radiat Oncol ; 8(2): 101166, 2023.
Article in English | MEDLINE | ID: mdl-36845614

ABSTRACT

Purpose: Hypofractionated stereotactic radiosurgery (HF-SRS) with or without surgical resection is potentially a preferred treatment for larger or symptomatic brain metastases (BMs). Herein, we report clinical outcomes and predictive factors following HF-SRS. Methods and Materials: Patients undergoing HF-SRS for intact (iHF-SRS) or resected (rHF-SRS) BMs from 2008 to 2018 were retrospectively identified. Linear accelerator-based image-guided HF-SRS consisted of 5 fractions at 5, 5.5, or 6 Gy per fraction. Time to local progression (LP), time to distant brain progression (DBP), and overall survival (OS) were calculated. Cox models assessed the effect of clinical factors on OS. Fine and Gray's cumulative incidence model for competing events examined the effect of factors on LP and DBP. The occurrence of leptomeningeal disease (LMD) was determined. Logistic regression examined predictors of LMD. Results: Among 445 patients, median age was 63.5 years; 87% had Karnofsky performance status ≥70. Fifty-three percent of patients underwent surgical resection, and 75% received 5 Gy per fraction. Patients with resected BMs had higher Karnofsky performance status (90-100, 41 vs 30%), less extracranial disease (absent, 25 vs 13%), and fewer BMs (multiple, 32 vs 67%). Median diameter of the dominant BM was 3.0 cm (interquartile range, 1.8-3.6 cm) for intact BMs and 4.6 cm (interquartile range, 3.9-5.5 cm) for resected BMs. Median OS was 5.1 months (95% confidence interval [CI], 4.3-6.0) following iHF-SRS and 12.8 months (95% CI, 10.8-16.2) following rHF-SRS (P < .01). Cumulative LP incidence was 14.5% at 18 months (95% CI, 11.4-18.0%), significantly associated with greater total GTV (hazard ratio, 1.12; 95% CI, 1.05-1.20) following iHF-SRS, and with recurrent versus newly diagnosed BMs across all patients (hazard ratio, 2.28; 95% CI, 1.01-5.15). Cumulative DBP incidence was significantly greater following rHF-SRS than iHF-SRS (P = .01), with respective 24-month rates of 50.0% (95% CI, 43.3-56.3%) and 35.7% (95% CI, 29.2-42.2%). LMD (57 events total; 33% nodular, 67% diffuse) was observed in 17.1% of rHF-SRS and 8.1% of iHF-SRS cases (odds ratio, 2.46; 95% CI, 1.34-4.53). Any radionecrosis and grade 2+ radionecrosis events were observed in 14% and 8% of cases, respectively. Conclusions: HF-SRS demonstrated favorable rates of local control and radionecrosis in both postoperative and intact settings. Corresponding LMD and radionecrosis rates were comparable to those of other studies.
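The abstract above models cumulative incidence of local progression with competing events (Fine and Gray). Fine-Gray subdistribution regression is most commonly fit in R (e.g., cmprsk); as a hedged Python analog, the sketch below uses the nonparametric Aalen-Johansen estimator from lifelines to estimate cumulative incidence of local progression with death as a competing event. The data frame and its columns are hypothetical, not the study's data.

```python
# Hedged sketch: the abstract reports cumulative incidence of local progression
# (LP) accounting for competing events. Fine-Gray subdistribution regression is
# typically fit in R (cmprsk/riskRegression); here is a nonparametric analog in
# Python using lifelines' Aalen-Johansen estimator. The columns are hypothetical:
# event = 1 for LP, 2 for death without LP, 0 for censoring.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "months": [3.1, 7.4, 12.0, 18.5, 2.2, 9.8, 15.0, 24.0],
    "event":  [1,   0,   2,    1,    2,   0,   1,    0],
})

ajf = AalenJohansenFitter()
ajf.fit(durations=df["months"], event_observed=df["event"], event_of_interest=1)

# Cumulative incidence of LP with death treated as a competing risk; the curve
# can be read at 18 months to mirror the abstract's 18-month LP estimate.
print(ajf.cumulative_density_)
```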

8.
Front Immunol ; 14: 1085547, 2023.
Article in English | MEDLINE | ID: mdl-36817432

ABSTRACT

Chimeric antigen receptor (CAR) T cell therapy in glioblastoma faces many challenges, including insufficient CAR T cell abundance and antigen-negative tumor cells evading targeting. Unfortunately, preclinical studies evaluating CAR T cells in glioblastoma focus on tumor models that express a single antigen, use immunocompromised animals, and/or pre-treat with lymphodepleting agents. While lymphodepletion enhances CAR T cell efficacy, it diminishes the endogenous immune system that has the potential for tumor eradication. Here, we engineered CAR T cells to express IL7 and/or Flt3L and tested them in orthotopic tumors composed of 50% EGFRvIII-positive and 50% EGFRvIII-negative cells, pre-conditioned with non-lymphodepleting irradiation. IL7 and IL7 Flt3L CAR T cells increased intratumoral CAR T cell abundance seven days after treatment. IL7 co-expression with Flt3L modestly increased conventional dendritic cells as well as the CD103+XCR1+ population known to have migratory and antigen cross-presenting capabilities. Treatment with IL7 or IL7 Flt3L CAR T cells improved overall survival to 67% and 50%, respectively, compared to 9% survival with conventional or Flt3L CAR T cells. We concluded that CAR T cells modified to express IL7 enhanced CAR T cell abundance and improved overall survival in EGFRvIII-heterogeneous tumors pre-conditioned with non-lymphodepleting irradiation. IL7 or IL7 Flt3L CAR T cells may thus provide new opportunities to combine CAR T cells with other immunotherapies for the treatment of glioblastoma.


Subject(s)
Glioblastoma , Glioma , Animals , Mice , ErbB Receptors , Glioblastoma/therapy , Interleukin-7 , T-Lymphocytes
9.
Neuro Oncol ; 25(6): 1085-1097, 2023 06 02.
Article in English | MEDLINE | ID: mdl-36640127

ABSTRACT

BACKGROUND: MDNA55 is an interleukin 4 receptor (IL4R)-targeting toxin in development for recurrent GBM, a universally fatal disease. IL4R is overexpressed in GBM as well as in cells of the tumor microenvironment. High expression of IL4R is associated with poor clinical outcomes. METHODS: MDNA55-05 is an open-label, single-arm phase IIb study of MDNA55 in recurrent GBM (rGBM) patients with an aggressive form of GBM (de novo GBM, IDH wild-type, and nonresectable at recurrence) at their 1st or 2nd recurrence. MDNA55 was administered intratumorally as a single-dose treatment (dose range of 18 to 240 µg) using convection-enhanced delivery (CED) with up to 4 stereotactically placed catheters. It was co-infused with a contrast agent (Gd-DTPA, Magnevist®) to assess distribution in and around the tumor margins. The flow rate of each catheter did not exceed 10 µL/min to ensure that the infusion duration did not exceed 48 h. The primary endpoint was median overall survival (mOS), with secondary endpoints determining the effects of IL4R status on mOS and progression-free survival (PFS). RESULTS: MDNA55 showed an acceptable safety profile at doses up to 240 µg. In all evaluable patients (n = 44), mOS was 11.64 months (80% one-sided CI 8.62, 15.02) and 12-month OS (OS-12) was 46%. A subgroup (n = 32) consisting of IL4R High and IL4R Low patients treated with high-dose MDNA55 (>180 µg) showed the best benefit, with mOS of 15 months and OS-12 of 55%. Based on mRANO criteria, tumor control was observed in 81% (26/32), including patients who exhibited pseudo-progression (15/26). CONCLUSIONS: MDNA55 demonstrated tumor control and promising survival and may benefit rGBM patients when treated at high dose, irrespective of IL4R expression level. Trial Registration: Clinicaltrials.gov NCT02858895.


Subject(s)
Brain Neoplasms , Glioblastoma , Humans , Glioblastoma/drug therapy , Glioblastoma/genetics , Glioblastoma/pathology , Brain Neoplasms/drug therapy , Brain Neoplasms/genetics , Brain Neoplasms/pathology , Receptors, Interleukin-4/therapeutic use , Neoplasm Recurrence, Local/drug therapy , Neoplasm Recurrence, Local/pathology , Tumor Microenvironment
10.
Mil Med ; 188(3-4): 670-677, 2023 03 20.
Article in English | MEDLINE | ID: mdl-34986241

ABSTRACT

INTRODUCTION: Subjective measures may offer practitioners a relatively simple method to monitor recruit responses to basic military training (BMT). Yet, a lack of agreement between subjective and objective measures may present a problem for practitioners wishing to implement subjective monitoring strategies. This study therefore aims to examine associations between subjective and objective measures of workload and sleep in Australian Army recruits. MATERIALS AND METHODS: Thirty recruits provided a daily rating of perceived exertion (RPE) and differential RPE (d-RPE) for breathlessness and leg muscle exertion each evening. Daily internal workloads determined via heart rate monitors were expressed as Edwards training impulse (TRIMP) and average heart rate. External workloads were determined via global positioning system (PlayerLoad™) and activity monitors (step count). Subjective sleep quality and duration were monitored in 29 different recruits via a customized questionnaire. Activity monitors assessed objective sleep measures. Linear mixed models assessed associations between objective and subjective measures. The Akaike Information Criterion assessed whether the inclusion of d-RPE measures resulted in a more parsimonious model. Mean bias, typical error of the estimate (TEE), and within-subject repeated-measures correlations examined agreement between subjective and objective sleep duration. RESULTS: Conditional R2 for associations between objective and subjective workloads ranged from 0.18 to 0.78, P < 0.01, with strong associations between subjective measures of workload and TRIMP (0.65-0.78), average heart rate (0.57-0.73), and PlayerLoad™ (0.54-0.68). Including d-RPE lowered the Akaike Information Criterion. The slope estimate between objective and subjective measures of sleep quality was not significant. A trivial relationship (r = 0.12; CI -0.03, 0.27) was observed between objective and subjective sleep duration, with subjective measures overestimating sleep duration (mean bias 25 min; TEE 41 min). CONCLUSIONS: Daily RPE offers a proxy measure of internal workload in Australian Army recruits; however, the current subjective sleep questionnaire should not be considered a proxy for objective sleep measures.


Subject(s)
Sleep , Workload , Humans , Australia , Sleep/physiology , Surveys and Questionnaires , Sleep Duration , Physical Exertion/physiology , Heart Rate
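As a sketch of the analysis framework described above, the following hypothetical example fits a random-intercept linear mixed model (objective TRIMP against daily RPE, grouped by recruit) with statsmodels and computes a simple mean-bias estimate for subjective versus objective sleep duration. All column names and the synthetic data are assumptions, not the study's data or exact model specification.

```python
# Illustrative sketch of the type of analysis described above: a linear mixed
# model relating an objective internal-workload measure (Edwards TRIMP) to daily
# RPE with a random intercept per recruit, plus a simple mean-bias check between
# subjective and objective sleep duration. The data frame is synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_recruits, n_days = 30, 14
df = pd.DataFrame({
    "recruit_id": np.repeat(np.arange(n_recruits), n_days),
    "rpe": rng.integers(1, 10, n_recruits * n_days),
})
df["trimp"] = 40 + 15 * df["rpe"] + rng.normal(0, 20, len(df))  # synthetic response

# Random-intercept mixed model: TRIMP ~ RPE, grouped by recruit.
fit = smf.mixedlm("trimp ~ rpe", data=df, groups=df["recruit_id"]).fit()
print(fit.summary())

# Agreement between subjective and objective sleep duration (minutes):
subj = rng.normal(420, 40, 200)        # hypothetical questionnaire estimates
obj = subj - rng.normal(25, 41, 200)   # hypothetical actigraphy, ~25 min lower on average
print("mean bias (subjective - objective):", round(float(np.mean(subj - obj)), 1), "min")
```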
11.
ATS Sch ; 4(4): 502-516, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38196674

ABSTRACT

Background: The coronavirus disease (COVID-19) pandemic resulted in an increased need for medical professionals with expertise in managing patients with acute hypoxemic respiratory failure, overwhelming the existing critical care workforce in many low-resource countries. Objective: To address this need in Sierra Leone, we developed, piloted, and evaluated a synchronous simulation-based tele-education workshop for healthcare providers on the fundamental principles of intensive care unit (ICU) management of the COVID-19 patient in a low-resource setting. Methods: Thirteen 2-day virtual workshops were implemented between April and July 2020 with frontline Sierra Leone physicians and nurses caring for potential ICU patients in hospitals throughout Sierra Leone. Although all training sessions took place at the 34 Military Hospital (a national COVID-19 center) in Freetown, participants were drawn from hospitals in each of the provinces of Sierra Leone. The workshops included synchronous tele-education-directed didactic sessions on COVID-19 and hypoxemia management and hands-on simulation training on mechanical ventilation. Measures included pre- and post-workshop knowledge tests, simulation checklists, and a post-test survey. Test results were analyzed with a paired-sample t test; Likert-scale survey responses were reported using descriptive statistics; and open-ended responses were analyzed using thematic analysis. Results: Seventy-five participants enrolled in the program. On average, participants showed a 20.8% improvement (a score difference of 4.00 out of a maximum total score of 20) between pre- and post-workshop knowledge tests (P = 0.004). Participants reported satisfaction with the training (96%; n = 73), achieved 100% of simulation checklist objectives, and reported increased confidence with ventilator skills (96%; n = 73). Themes from the participants' feedback included increased readiness to train colleagues on critical care ventilators at their hospitals, the need for longer and more frequent training, and the need for access to critical care ventilators at their hospitals. Conclusion: This synchronous tele-education-directed medical simulation workshop, implemented through partnerships between U.S. physicians and Sierra Leone healthcare providers, was a feasible, acceptable, and effective means of providing training on COVID-19, hypoxemia management, and mechanical ventilation. Future ICU ventilator training opportunities may consider extending the training beyond 2 days to allow more time for hands-on simulation scenarios using the ICU ventilator and for assessing knowledge application in long-term follow-up.
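The pre/post knowledge-test comparison above is a paired-sample t test; a minimal sketch with hypothetical scores (maximum 20 points) is shown below.

```python
# Minimal sketch of the pre/post knowledge-test comparison described above:
# a paired-sample t test on each participant's pre- and post-workshop scores
# (maximum score 20). The score arrays are hypothetical, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.integers(8, 16, size=75).astype(float)        # hypothetical pre-test scores
post = np.clip(pre + rng.normal(4, 3, size=75), 0, 20)  # ~4-point mean improvement

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean improvement {np.mean(post - pre):.2f} points, t = {t_stat:.2f}, p = {p_value:.4f}")
```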

12.
J Sports Sci ; 41(19): 1753-1761, 2023 Oct.
Article in English | MEDLINE | ID: mdl-38179709

ABSTRACT

Adolescent elite-level footballers are exposed to unique physical and psychological stressors which may increase injury risk, with fluctuating injury prevalence and burden. This study investigates the patterns of injury incidence and burden from 2017 to 2020 within combined pre-, start-of-, mid- and end-of-season and school-holiday phases in U13-U18 Australian male academy players. Injury incidence rate and burden were calculated for medical attention (MA), full and partial time-loss (TL) and non-time-loss (non-TL) injuries. Injury rate ratios (IRR) for injury incidences were assessed using Generalised Linear Mixed Models, and 99% confidence intervals for injury burden differences between phases. MA and non-TL injury incidence rates were higher during pre-season (IRR 1.65, p = 0.01; IRR 2.08, p = 0.02, respectively), and mid-season showed a higher non-TL incidence rate (IRR 2.15, p = 0.02) and burden (69 days with injury/1000 hrs, CI 47-103) compared to end-of-season (25 days with injury/1000 hrs, CI 15-45). MA injury rates and partial TL injury burden were higher during school compared to holiday periods (IRR 0.6, p = 0.04; 61 partial days lost/1000 hrs, CI 35-104; 13 partial days lost/1000 hrs, CI 8-23). Season phase and return-to-school may increase injury risks for elite academy footballers, and considering these phases may assist in developing injury prevention systems.


Subject(s)
Athletic Injuries , Soccer , Adolescent , Humans , Male , Soccer/injuries , Incidence , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Seasons , Australia/epidemiology
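As a simplified illustration of how injury incidence rate ratios between season phases can be estimated, the sketch below fits a Poisson GLM with log exposure hours as an offset. The study used generalised linear mixed models with player-level random effects; this fixed-effects analog, and the injury counts and exposure hours, are assumptions for illustration only.

```python
# A hedged sketch of estimating injury incidence rate ratios (IRR) between
# season phases: a Poisson GLM with log exposure hours as an offset. The study
# itself used generalised linear mixed models; this simpler fixed-effects
# analog and the example counts are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "phase":          ["pre", "start", "mid", "end"],
    "injuries":       [34, 21, 40, 18],         # hypothetical medical-attention injuries
    "exposure_hours": [2100, 1900, 3800, 2000], # hypothetical player-exposure hours
})

fit = smf.glm(
    "injuries ~ C(phase, Treatment(reference='end'))",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["exposure_hours"]),
).fit()

print(np.exp(fit.params))          # IRRs relative to the end-of-season phase
print(np.exp(fit.conf_int(0.01)))  # 99% CIs, mirroring the paper's reporting
```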
13.
Sci Med Footb ; : 1-9, 2022 Dec 13.
Article in English | MEDLINE | ID: mdl-36473725

ABSTRACT

In football, the number of days without full participation in training/competition is often used as a surrogate measure for time-loss (TL) caused by injury. However, injury management and return-to-play processes frequently include modified participation, which to date has only been recorded through self-reports. This study aims to demonstrate the differentiation between 'full' (no participation in team football) and 'partial' (reduced/modified participation in team football) burden. Injury and exposure data were collected from 118 male elite footballers (U13-U18) over 3 consecutive seasons according to the Football Consensus Statement. TL injury burden was calculated separately as the number of total, 'full' and 'partial' days lost per 1000 h of exposure. Injury burden (137.2 days lost/1000 h, 95% CI 133.4-141.0) was comprised of 23% (31.9 days lost/1000 h, 95% CI 30.1-33.8) partial TL and 77% (105.3 days lost/1000 h, 95% CI 102.0-108.6) full TL burden. Injuries of moderate severity (8-28 days lost) showed 40% of partial TL. TL injury incidence rate (6.6 injuries/1000 h, 95% CI 5.8-7.5), the number of severe injuries (16%), and the distribution of TL and non-TL injuries (56% and 44%) were comparable to other reports in elite youth footballers. Almost one-quarter of the TL injury burden showed that injured players were still included in some team football activities, which, for injuries with TL >7 days, was likely related to the return to play process. Therefore, reporting on partial TL provides insight into the true impact of injury on participation levels.
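The burden measure used above is straightforward to reproduce: days lost per 1000 hours of exposure, partitioned into full and partial time loss. In the sketch below, the exposure and days-lost totals are hypothetical but chosen so the resulting rates match the burdens reported in the abstract (105.3 full and 31.9 partial days lost per 1000 h).

```python
# A worked sketch of the burden calculation used above: days lost per 1000 h of
# exposure, partitioned into 'full' (no team participation) and 'partial'
# (modified participation) time loss. The totals below are hypothetical, chosen
# so the printed rates match the abstract's reported 77%/23% full/partial split.

def burden_per_1000h(days_lost: float, exposure_hours: float) -> float:
    return days_lost / exposure_hours * 1000.0

exposure_hours = 50_000      # hypothetical total exposure across 3 seasons
full_days_lost = 5_265       # hypothetical fully missed days
partial_days_lost = 1_595    # hypothetical modified-participation days

full = burden_per_1000h(full_days_lost, exposure_hours)
partial = burden_per_1000h(partial_days_lost, exposure_hours)
total = full + partial
print(f"total {total:.1f}, full {full:.1f} ({full/total:.0%}), "
      f"partial {partial:.1f} ({partial/total:.0%}) days lost/1000 h")
```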

14.
Nat Commun ; 13(1): 6483, 2022 10 29.
Article in English | MEDLINE | ID: mdl-36309495

ABSTRACT

Glioblastoma (GBM) is notorious for its immunosuppressive tumor microenvironment (TME) and is refractory to immune checkpoint blockade (ICB). Here, we identify calmodulin-dependent kinase kinase 2 (CaMKK2) as a driver of ICB resistance. CaMKK2 is highly expressed in pro-tumor cells and is associated with worsened survival in patients with GBM. Host CaMKK2, specifically, reduces survival and promotes ICB resistance. Multimodal profiling of the TME reveals that CaMKK2 is associated with several ICB resistance-associated immune phenotypes. CaMKK2 promotes exhaustion in CD8+ T cells and reduces the expansion of effector CD4+ T cells, additionally limiting their tumor penetrance. CaMKK2 also maintains myeloid cells in a disease-associated microglia-like phenotype. Lastly, neuronal CaMKK2 is required for maintaining the ICB resistance-associated myeloid phenotype, is deleterious to survival, and promotes ICB resistance. Our findings reveal CaMKK2 as a contributor to ICB resistance and identify neurons as a driver of immunotherapeutic resistance in GBM.


Subject(s)
Glioblastoma , Humans , Glioblastoma/drug therapy , Glioblastoma/genetics , CD8-Positive T-Lymphocytes , Tumor Microenvironment , Immunosuppression Therapy , Neurons/pathology , Calcium-Calmodulin-Dependent Protein Kinase Kinase/genetics
15.
Mil Med ; 2022 Jul 04.
Article in English | MEDLINE | ID: mdl-35781513

ABSTRACT

INTRODUCTION: The injury definitions and surveillance methods commonly used in Army basic military training (BMT) research may underestimate the extent of injury. This study therefore aims to obtain a comprehensive understanding of injuries sustained during BMT by employing recording methods that capture all physical complaints. MATERIALS AND METHODS: Six hundred and forty-six recruits were assessed over the 12-week Australian Army BMT course. Throughout BMT, injury data were recorded via (1) physiotherapy reports following recruit consultation, (2) a member of the research team (third party) present at physical training sessions, and (3) recruit daily self-reports. RESULTS: Two hundred and thirty-five recruits had ≥1 incident injury recorded by physiotherapists, 365 recruits had ≥1 incident injury recorded by the third party, and 542 recruits reported ≥1 injury-related problem via the self-reported health questionnaire. In total, 621, 687, and 2,964 incident injuries were recorded from 997 physiotherapy reports, 1,937 third-party reports, and 13,181 self-reported injury-related problems, respectively. The lower extremity was the most commonly injured general body region as indicated by all three recording methods. Overuse accounted for 79% and 76% of documented incident injuries from physiotherapists and the third party, respectively. CONCLUSIONS: This study highlights that injury recording methods affect injury reporting during BMT. The present findings suggest that traditional injury surveillance methods, which rely on medical encounters, underestimate the injury profile during BMT. Because accurate injury surveillance is fundamental to injury prevention, implementing additional injury recording methods during BMT may improve injury surveillance and better inform training modifications and injury prevention programs.

16.
Sci Adv ; 8(29): eabm7833, 2022 07 22.
Article in English | MEDLINE | ID: mdl-35857833

ABSTRACT

Subunit vaccines inducing antibodies against tumor-specific antigens have yet to be clinically successful. Here, we use a supramolecular α-helical peptide nanofiber approach to design epitope-specific vaccines raising simultaneous B cell, CD8+ T cell, and CD4+ T cell responses against combinations of selected epitopes and show that the concurrent induction of these responses generates strong antitumor effects in mice, with significant improvements over antibody or CD8+ T cell-based vaccines alone, in both prophylactic and therapeutic subcutaneous melanoma models. Nanofiber vaccine-induced antibodies mediated in vitro tumoricidal antibody-dependent cellular cytotoxicity (ADCC) and antibody-dependent cellular phagocytosis (ADCP). The addition of immune checkpoint and phagocytosis checkpoint blockade antibodies further improved the therapeutic effect of the nanofiber vaccines against murine melanoma. These findings highlight the potential clinical benefit of vaccine-induced antibody responses for tumor treatments, provided that they are accompanied by simultaneous CD8+ and CD4+ responses, and they illustrate a multiepitope cancer vaccine design approach using supramolecular nanomaterials.


Subject(s)
Cancer Vaccines , Melanoma , Nanofibers , Animals , Epitopes , Immunity, Cellular , Mice , Peptides
17.
Neurosurgery ; 91(3): 427-436, 2022 09 01.
Article in English | MEDLINE | ID: mdl-35593705

ABSTRACT

BACKGROUND: Extracranial multisystem organ failure is a common sequela of severe traumatic brain injury (TBI). Risk factors for developing circulatory shock and long-term functional outcomes of this patient subset are poorly understood. OBJECTIVE: To identify emergency department predictors of circulatory shock after moderate-severe TBI and examine long-term functional outcomes in patients with moderate-severe TBI who developed circulatory shock. METHODS: We conducted a retrospective cohort study using the Transforming Clinical Research and Knowledge in TBI database for adult patients with moderate-severe TBI, defined as a Glasgow Coma Scale (GCS) score of <13 and stratified by the development of circulatory shock within 72 hours of hospital admission (Sequential Organ Failure Assessment score ≥2). Demographic and clinical data were assessed with descriptive statistics. A forward selection regression model examined risk factors for the development of circulatory shock. Functional outcomes were examined using multivariable regression models. RESULTS: Of our moderate-severe TBI population (n = 407), 168 (41.2%) developed circulatory shock. Our predictive model suggested that race, computed tomography Rotterdam scores <3, GCS in the emergency department, and development of hypotension in the emergency department were associated with developing circulatory shock. Those who developed shock had less favorable 6-month functional outcomes measured by the 6-month GCS-Extended (odds ratio 0.36, P = .002) and 6-month Disability Rating Scale score (Diff. in means 3.86, P = .002) and a longer length of hospital stay (Diff. in means 11.0 days, P < .001). CONCLUSION: We report potential risk factors for circulatory shock after moderate-severe TBI. Our study suggests that developing circulatory shock after moderate-severe TBI is associated with poor long-term functional outcomes.


Subject(s)
Brain Injuries, Traumatic , Brain Injuries , Adult , Brain Injuries, Traumatic/complications , Brain Injuries, Traumatic/epidemiology , Glasgow Coma Scale , Humans , Retrospective Studies , Risk Factors
18.
Neurooncol Adv ; 4(1): vdac025, 2022.
Article in English | MEDLINE | ID: mdl-35402913

ABSTRACT

Background: The phase 1 cohorts (1c+1d) of CheckMate 143 (NCT02017717) evaluated the safety/tolerability and efficacy of nivolumab plus radiotherapy (RT) ± temozolomide (TMZ) in newly diagnosed glioblastoma. Methods: In total, 136 patients were enrolled. In part A (safety lead-in), 31 patients (n = 15, methylated/unknown MGMT promoter; n = 16, unmethylated MGMT promoter) received nivolumab and RT+TMZ (NIVO+RT+TMZ) and 30 patients with unmethylated MGMT promoter received NIVO+RT. In part B (expansion), patients with unmethylated MGMT promoter were randomized to NIVO+RT+TMZ (n = 29) or NIVO+RT (n = 30). Primary endpoint was safety/tolerability; secondary endpoint was overall survival (OS). Results: NIVO+RT±TMZ was tolerable; grade 3/4 treatment-related adverse events occurred in 51.6% (NIVO+RT+TMZ) and 30.0% (NIVO+RT) of patients in part A and 46.4% (NIVO+RT+TMZ) and 28.6% (NIVO+RT) in part B. No new safety signals were detected. In part A, median OS (mOS) with NIVO+RT+TMZ was 33.38 months (95% CI, 16.2 to not estimable) in patients with methylated MGMT promoter. In patients with unmethylated MGMT promoter, mOS was 16.49 months (12.94-22.08) with NIVO+RT+TMZ and 14.41 months (12.55-17.31) with NIVO+RT. In part B, mOS was 14.75 months (10.01-18.6) with NIVO+RT+TMZ and 13.96 months (10.81-18.14) with NIVO+RT in patients with unmethylated MGMT promoter. Conclusions: CheckMate 143 was the first trial evaluating immune checkpoint inhibition with first-line treatment of glioblastoma. Results showed that NIVO can be safely combined with RT±TMZ, with no new safety signals. Toxicities, including lymphopenia, were more frequent with NIVO+RT+TMZ. OS was similar with or without TMZ in patients with unmethylated MGMT promoter, and differences by MGMT methylation status were observed.

19.
J Sci Med Sport ; 25(5): 432-438, 2022 May.
Article in English | MEDLINE | ID: mdl-35277344

ABSTRACT

OBJECTIVES: To investigate: (i) the chronicity and phasic variability of sleep patterns and restriction in recruits during basic military training (BMT); and (ii) subjective sleep quality in young adult recruits prior to entry into BMT. DESIGN: Prospective observational study. METHODS: Sleep was monitored using wrist-worn actigraphy in Army recruits (n = 57, 18-43 y) throughout 12 weeks of BMT. The Pittsburgh Sleep Quality Index (PSQI) was completed in the first week of training to provide a subjective estimate of pre-BMT sleep patterns. A mixed-effects model was used to compare week-to-week and training phase (Orientation, Development, Field, Drill) differences for rates of sub-optimal sleep (6-7 h), sleep restriction (≤6 h), and actigraphy-recorded sleep measures. RESULTS: Sleep duration was 06:24 ± 00:18 h (mean ± SD) during BMT, with all recruits experiencing sub-optimal sleep and 42% (n = 24) sleep restricted for ≥2 consecutive weeks. During Field, sleep duration (06:06 ± 00:36 h) and efficiency (71 ± 6%; p < 0.01) were reduced by 15-18 min (minimum - maximum) and 7-8%, respectively; whereas sleep latency (30 ± 15 min), wake after sleep onset (121 ± 23 min), sleep fragmentation index (41 ± 4%), and average awakening length (6.5 ± 1.6 min) were greater than in non-Field phases (p < 0.01) by 16-18 min, 28-33 min, 8-10%, and 2.5-3 min, respectively. Pre-BMT global PSQI score was 5 ± 3, and pre-BMT sleep duration and efficiency were 7.4 ± 1.3 h and 88 ± 9%, respectively. Sleep schedule was highly variable pre-BMT (bedtime: 22:34 ± 7:46 h; wake time: 6:59 ± 1:42 h), unlike during BMT (2200-0600 h). CONCLUSIONS: The chronicity of sub-optimal sleep and sleep restriction is substantial during BMT, and increased training demands exacerbate sleep disruption. Exploration of sleep strategies (e.g., napping, night-time routines) is required to mitigate sleep-associated performance detriments and maladaptive outcomes during BMT.


Subject(s)
Military Personnel , Sleep Wake Disorders , Actigraphy , Humans , Sleep , Sleep Deprivation , Young Adult
20.
Anesth Analg ; 135(6): 1245-1252, 2022 12 01.
Article in English | MEDLINE | ID: mdl-35203085

ABSTRACT

BACKGROUND: Early hypotension after severe traumatic brain injury (sTBI) is associated with increased mortality and poor long-term outcomes. Current guidelines suggest the use of intravenous vasopressors, commonly norepinephrine and phenylephrine, to support blood pressure after TBI. However, guidelines do not specify vasopressor type, resulting in variation in clinical practice. We describe early vasopressor utilization patterns in critically ill patients with TBI and examine the association between utilization of norepinephrine, compared to phenylephrine, with hospital mortality after sTBI. METHODS: We conducted a retrospective cohort study of US hospitals participating in the Premier Healthcare Database between 2009 and 2018. We examined adult patients (>17 years of age) with a primary diagnosis of sTBI who were treated in an intensive care unit (ICU) after injury. The primary exposure was vasopressor choice (phenylephrine versus norepinephrine) within the first 2 days of hospital admission. The primary outcome was in-hospital mortality. Secondary outcomes examined included hospital length of stay (LOS) and ICU LOS. We conducted a post hoc subgroup analysis in all patients with intracranial pressure (ICP) monitor placement. Regression analysis was used to assess differences in outcomes between patients exposed to phenylephrine versus norepinephrine, with propensity matching to address selection bias due to the nonrandom allocation of treatment groups. RESULTS: From 2009 to 2018, 24,718 (37.1%) of 66,610 sTBI patients received vasopressors within the first 2 days of hospitalization. Among these patients, 60.6% (n = 14,991) received only phenylephrine, 10.8% (n = 2668) received only norepinephrine, 3.5% (n = 877) received other vasopressors, and 25.0% (n = 6182) received multiple vasopressors. In that time period, the use of all vasopressors after sTBI increased. A moderate degree of variation in vasopressor choice was explained at the individual hospital level (23.1%). In propensity-matched analysis, the use of norepinephrine compared to phenylephrine was associated with an increased risk of in-hospital mortality (OR, 1.65; CI, 1.46-1.86; P < .0001). CONCLUSIONS: Early vasopressor utilization among critically ill patients with sTBI is common, increasing over the last decade, and varies across hospitals caring for TBI patients. Compared to phenylephrine, norepinephrine was associated with increased risk of in-hospital mortality in propensity-matched analysis. Given the wide variation in vasopressor utilization and possible differences in efficacy, our analysis suggests the need for randomized controlled trials to better inform vasopressor choice for patients with sTBI.


Subject(s)
Brain Injuries, Traumatic , Critical Illness , Adult , Humans , Retrospective Studies , Vasoconstrictor Agents/therapeutic use , Phenylephrine/therapeutic use , Norepinephrine/therapeutic use , Brain Injuries, Traumatic/diagnosis , Brain Injuries, Traumatic/drug therapy , Brain Injuries, Traumatic/chemically induced
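As a rough sketch of the propensity-matched comparison described above, the example below models the probability of receiving norepinephrine (versus phenylephrine) from baseline covariates, performs 1:1 nearest-neighbor matching on the propensity score (with replacement, for simplicity), and estimates the in-hospital mortality odds ratio in the matched cohort. The covariates, column names, and synthetic data are assumptions; the abstract does not specify the study's matching model.

```python
# Hedged sketch of a propensity-matched comparison like the one described above:
# model P(norepinephrine | covariates), match 1:1 on the propensity score, then
# estimate the mortality odds ratio in the matched cohort. Columns, covariates,
# and the synthetic data are assumptions, not the study's specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(7)
n = 4000
df = pd.DataFrame({
    "age": rng.normal(45, 18, n).clip(18, 90),
    "gcs": rng.integers(3, 13, n),          # severe TBI defined as GCS < 13 in the abstract
    "norepi": rng.integers(0, 2, n),        # 1 = norepinephrine, 0 = phenylephrine
})
df["died"] = rng.binomial(1, 0.25 + 0.05 * df["norepi"])  # synthetic outcome

# 1) Propensity score: probability of receiving norepinephrine given covariates.
X = df[["age", "gcs"]]
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["norepi"]).predict_proba(X)[:, 1]

# 2) 1:1 nearest-neighbor matching of treated to untreated on the propensity score
#    (with replacement, for simplicity).
treated = df[df["norepi"] == 1]
control = df[df["norepi"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3) Mortality odds ratio for norepinephrine vs phenylephrine in the matched cohort.
logit = sm.Logit(matched["died"], sm.add_constant(matched["norepi"].astype(float))).fit(disp=0)
or_, (lo, hi) = np.exp(logit.params["norepi"]), np.exp(logit.conf_int().loc["norepi"])
print(f"matched OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```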