Results 1-20 of 4,363
1.
J Craniovertebr Junction Spine ; 15(2): 216-223, 2024.
Article in English | MEDLINE | ID: mdl-38957762

ABSTRACT

Background: Posterior cervical fusion (PCF) with lateral mass screws is a favorable treatment option to revise a symptomatic pseudarthrosis due to reliable rates of arthrodesis; however, this technique introduces elevated risk for wound infection and hospital readmission. A tissue-sparing PCF approach involving facet fixation instrumentation reduces the rates of postoperative complications while stabilizing the symptomatic level to achieve arthrodesis; however, these outcomes have been limited to small study cohorts from individual surgeons, commonly with mixed indications for treatment. Materials and Methods: One hundred and fifty cases were identified from a retrospective chart review performed by seven surgeons across six sites in the United States. All cases involved PCF revision for a pseudarthrosis at one or more levels from C3 to C7 following anterior cervical discectomy and fusion (ACDF). PCF was performed using a tissue-sparing technique with facet instrumentation. Cases involving additional supplemental fixation such as lateral mass screws, rods, wires, or other hardware were excluded. Demographics, operative notes, postoperative complications, hospital readmission, and subsequent surgical interventions were summarized as an entire cohort and according to the following risk factors: age, sex, number of levels revised, body mass index (BMI), and history of nicotine use. Results: The average age of patients at the time of PCF revision was 55 ± 11 years and 63% were female. The average BMI was 29 ± 6 kg/m2 and 19% reported a history of nicotine use. Postoperative follow-up visits were available with a median of 68 days (interquartile range = 41-209 days) from revision PCF. There were 91 1-level, 49 2-level, 8 3-level, and 2 4-level PCF revision cases. The mean operative duration was 52 ± 3 min with an estimated blood loss of 14 ± 1.5 cc. Participants were discharged an average of 1 ± 0.05 days following surgery.
Multilevel treatment resulted in longer procedure times (single = 45 min, multi = 59 min, P = 0.01) but did not impact estimated blood loss (P = 0.94). Total nights in the hospital increased by 0.2 nights with multilevel treatment (P = 0.01). Sex, age, nicotine history, and BMI had no effect on recorded perioperative outcomes. There was one instance of rehospitalization due to deep-vein thrombosis, one instance of persistent pseudarthrosis at the revised level treated with ACDF, and four instances of adjacent segment disease. In patients initially treated with multilevel ACDF, revisions occurred most commonly on the caudal level (48% of revised levels), followed by the cranial (43%), and least often in the middle level (9%). Conclusions: This chart review of perioperative and safety outcomes provides evidence in support of tissue-sparing PCF with facet instrumentation as a treatment for symptomatic pseudarthrosis after ACDF. The most common locations requiring revision were the caudal and cranial levels. Operative duration and estimated blood loss were favorable when compared to open alternatives. There were no instances of postoperative wound infection, and the majority of patients were discharged the day following surgery.

2.
HPB (Oxford) ; 2024 Jun 22.
Article in English | MEDLINE | ID: mdl-38960764

ABSTRACT

BACKGROUND: The demand for liver transplants (LT) in the United States far surpasses the availability of allografts. New allocation schemes have resulted in occasional difficulties with allograft placement and increased intraoperative turndowns. We aimed to evaluate the outcomes related to the use of late-turndown liver allografts. METHODS: A review of prospectively collected data of LTs at a single center from July 2019 to July 2023 was performed. Late-turndown placement was defined as an open offer 6 h prior to donation, intraoperative turndown by the primary center, or post-cross-clamp turndown. RESULTS: Of 565 LTs, 25.1% (n = 142) received a late-turndown liver allograft. There were no significant differences in recipient age, gender, BMI, or race (all p > 0.05), but MELD was lower for the late-turndown LT recipient group (median 15 vs 21, p < 0.001). No difference in 30-day, 6-month, or 1-year survival was noted on logistic regression, and no difference in patient or graft survival was noted on Cox proportional hazard regression. Late-turndown utilization increased during the study from 17.2% to 25.8%, and median waitlist time decreased from 77 days in 2019 to 18 days in 2023 (p < 0.001). CONCLUSION: Use of late-turndown livers has increased, and with appropriate selection it can increase transplant rates without compromising post-transplant outcomes.

3.
Environ Pollut ; : 124491, 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38964646

ABSTRACT

The Deep Geological Repository (DGR) concept consists of storing radioactive waste in metal canisters, surrounded by compacted bentonite, and placed into a geological formation. Here, bentonite slurry microcosms with a copper canister, inoculated with a bacterial consortium and amended with acetate, lactate, and sulfate, were set up to investigate their geochemical evolution over a year under anoxic conditions. The impact of microbial communities on the corrosion of the copper canister at an early stage (45 days) was also assessed. The amended bacterial consortium and electron donors/acceptor accelerated the microbial activity, while heat-shock treatment of the bentonite had a retarding effect. The microbial communities partially oxidized lactate to acetate, which was subsequently consumed once the lactate was depleted. Early-stage microbial communities showed that the bacterial consortium reduced microbial diversity, with Pseudomonas and Stenotrophomonas dominating the community. However, sulfate-reducing bacteria such as Desulfocurvibacter, Anaerosolibacter, and Desulfosporosinus were enriched, coupling oxidation of lactate/acetate with reduction of sulfates. The generated biogenic sulfides could mediate the conversion of copper oxides (possibly formed by trapped oxygen molecules on the bentonite or driven by the reduction of H2O) to copper sulfide (Cu2S), identified by X-ray photoelectron spectroscopy (XPS). Overall, these findings shed light on the geochemical conditions that would affect the stability of DGR barriers, emphasizing the impact of the SRB on the corrosion of the metal canisters, the gas generation, and the interaction with components of the bentonite.

4.
Plast Reconstr Surg ; 2024 Jun 28.
Article in English | MEDLINE | ID: mdl-38967623

ABSTRACT

Facial buttresses are supportive bony structures of the facial skeleton that form a thick, strong, and protective framework for the face. Surgical fixation may be required to restore morphology and function when damage to these buttresses occurs. We sought to determine if, similar to buttresses of the facial skeleton, buttresses of the internal orbit exist. Hence, we analyzed 10 human cadaver skulls imaged by microcomputed tomography (micro-CT). Image processing and thickness/heat mapping were performed using Avizo and ImageJ software. After identifying the orbital buttresses, we reviewed CT scans of patients who had orbital fractures across three years to determine the frequency of fracture of the orbital buttresses. We identified 5 buttresses of the internal orbit: the superomedial fronto-ethmoidal strut with the deep orbital buttress, the inferomedial strut with the posterior ledge, the inferior orbital fissure, the sphenoid-frontal superolateral strut, and the sphenoid lip. The average threshold orbital buttress thickness was 1.36 (0.25) mm. A total of 1186 orbits of 593 individuals were analyzed for orbital buttress involvement. Orbital buttresses were spared in 770 (65%) orbits. The inferomedial strut with the posterior ledge was the most commonly fractured buttress, involved in 14.4% of orbits (n=171), followed by the sphenoid strut and lip (66 [5.6%]). To our knowledge, this is the first description of the buttresses of the internal orbit. Orbital reconstruction for fracture repair or oncologic purposes requires the support of orbital buttresses. Understanding the anatomy of orbital buttresses is crucial for successful surgical planning, proper implant positioning, and restoration of function and appearance.

5.
JCI Insight ; 9(13)2024 May 21.
Article in English | MEDLINE | ID: mdl-38973611

ABSTRACT

Optimization of protective immune responses against SARS-CoV-2 remains an urgent worldwide priority. In this regard, type III IFN (IFN-λ) restricts SARS-CoV-2 infection in vitro, and treatment with IFN-λ limits infection, inflammation, and pathogenesis in murine models. Furthermore, IFN-λ has been developed for clinical use to limit COVID-19 severity. However, whether endogenous IFN-λ signaling has an effect on SARS-CoV-2 antiviral immunity and long-term immune protection in vivo is unknown. In this study, we identified a requirement for IFN-λ signaling in promoting viral clearance and protective immune programming in SARS-CoV-2 infection of mice. Expression of both IFN and IFN-stimulated genes (ISGs) in the lungs was minimally affected by the absence of IFN-λ signaling and correlated with transient increases in viral titers. We found that IFN-λ supported the generation of protective CD8 T cell responses against SARS-CoV-2 by facilitating accumulation of CD103+ DCs in lung-draining lymph nodes (dLNs). IFN-λ signaling specifically in DCs promoted the upregulation of costimulatory molecules and the proliferation of CD8 T cells. Intriguingly, antigen-specific CD8 T cell immunity to SARS-CoV-2 was independent of type I IFN signaling, revealing a nonredundant function of IFN-λ. Overall, these studies demonstrate a critical role for IFN-λ in protective innate and adaptive immunity upon infection with SARS-CoV-2 and suggest that IFN-λ serves as an immune adjuvant to support CD8 T cell immunity.


Subjects
CD8-Positive T-Lymphocytes , COVID-19 , Interferon Type I , SARS-CoV-2 , Animals , CD8-Positive T-Lymphocytes/immunology , SARS-CoV-2/immunology , Mice , COVID-19/immunology , COVID-19/virology , Interferon Type I/immunology , Interferon Type I/metabolism , Lung/immunology , Lung/virology , Signal Transduction/immunology , Disease Models, Animal , Interferon lambda , Interferons/immunology , Interferons/metabolism , Mice, Inbred C57BL , Mice, Knockout , Dendritic Cells/immunology , Humans
6.
J Clin Anesth ; 97: 111534, 2024 Jun 28.
Article in English | MEDLINE | ID: mdl-38943851

ABSTRACT

STUDY OBJECTIVE: Describe dosing of local anesthetic when both a periarticular injection (PAI) and peripheral nerve block (PNB) are utilized for knee arthroplasty analgesia, compare that dosing to the suggested maximum, and look for evidence of local anesthetic systemic toxicity (LAST). DESIGN: A single-center retrospective cohort study between May 2018 and November 2022. SETTING: A major academic hospital. PATIENTS: Patients who had both a PAI and PNB while undergoing primary, revision, total, partial, unilateral, or bilateral knee arthroplasty. INTERVENTIONS: None. MEASUREMENTS: Calculate the dose of local anesthetic given via PAI, PNB, and both routes combined based on lean body weight, and compare that to the suggested maximum dosing. Look for medications, clinical interventions, and critical event notes suggestive of a LAST event. MAIN RESULTS: There were 4527 knee arthroplasties where both a PAI and PNB were performed during the study period. When combining PAI and PNB doses, >75% of patients received more than the suggested maximum dose of 3 mg/kg lean body weight. The median local anesthetic dosing over the study period, 4.4 mg/kg (IQR 3.5-5.9), was 147% of the suggested maximum dose (IQR 117-197). There was no conclusive evidence of LAST among any of the patients in the study. CONCLUSIONS: Over the course of our study, we had 4527 knee arthroplasties with a median PAI and PNB local anesthetic dose that was 147% of the suggested maximum without any clear clinical evidence of a LAST event.
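The dose comparison described above is simple arithmetic; the following is a minimal illustrative sketch (function names and the example weights are mine; only the 3 mg/kg lean-body-weight threshold comes from the abstract):

```python
def combined_dose_mg_per_kg(pai_mg: float, pnb_mg: float, lean_body_weight_kg: float) -> float:
    """Total local anesthetic dose (PAI + PNB) normalized to lean body weight."""
    return (pai_mg + pnb_mg) / lean_body_weight_kg

def percent_of_suggested_max(dose_mg_per_kg: float, suggested_max: float = 3.0) -> float:
    """Dose as a percentage of the suggested maximum (3 mg/kg lean body weight)."""
    return 100.0 * dose_mg_per_kg / suggested_max

# Hypothetical example: 200 mg via PAI plus 100 mg via PNB
# in a patient with 60 kg lean body weight.
dose = combined_dose_mg_per_kg(200, 100, 60)   # 5.0 mg/kg
print(percent_of_suggested_max(dose))          # ~166.7% of the suggested maximum
```

By this calculation, the study's median dose of 4.4 mg/kg corresponds to roughly 147% of the 3 mg/kg maximum, matching the figure quoted in the abstract.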

8.
Gene ; 927: 148731, 2024 Jun 27.
Article in English | MEDLINE | ID: mdl-38944164

ABSTRACT

Vascular calcification is prevalent in chronic kidney disease (CKD). Genetic causes of CKD account for 10-20% of adult-onset disease. Vascular calcification is thought to be one of the most important risk factors for increased cardiovascular morbidity and mortality in CKD patients and is detectable in 80% of patients with end-stage kidney disease (ESKD). Despite the high prevalence of vascular calcification in CKD, no single-gene cause has been described. We hypothesized that variants in vascular calcification genes may contribute to disease pathogenesis in CKD, particularly in families who exhibit a predominant vascular calcification phenotype. We developed a list of eight genes that are hypothesized to play a role in vascular calcification due to their involvement in the ectopic calcification pathway: ABCC6, ALPL, ANK1, ENPP1, NT5E, SLC29A1, SLC20A2, and S100A12. With this, we assessed exome data from 77 CKD patients who remained unsolved following evaluation for all known monogenic causes of CKD. We also analyzed an independent cohort (Ontario Neurodegenerative Disease Research Initiative (ONDRI), n = 520), which was screened for variants in ABCC6, and compared this to a control cohort of healthy adults (n = 52). We identified two CKD families with heterozygous pathogenic variants (R1141X and A667fs) in ABCC6. We identified 10 participants from the ONDRI cohort with a heterozygous pathogenic or likely pathogenic variant in ABCC6. Replication in a healthy control cohort did not reveal any variants. Our study provides preliminary data supporting the hypothesis that ABCC6 may play a role in vascular calcification in CKD. By screening CKD patients for genetic causes early in the diagnostic pathway, patients with genetic causes associated with vascular calcification can potentially be treated preventatively with new therapeutics, with the aim of decreasing mortality.

9.
Article in English | MEDLINE | ID: mdl-38835938

ABSTRACT

Introduction: Orthopaedic surgery continues to be one of the most competitive specialties to match into as a medical student, particularly for osteopathic medical students. Therefore, in this study, we sought to examine the prevalence of osteopathic (DO) students matching into orthopaedic surgery at traditional Accreditation Council for Graduate Medical Education (ACGME)-accredited programs (former allopathic residency programs) in recent years. Methods: A retrospective review of National Resident Matching Program annual reports and Association of American Medical Colleges' Electronic Residency Application Service statistics reports was performed to determine the number of applications and match rates among osteopathic (DO) and allopathic (MD) medical students into orthopaedic surgery from 2019 to 2023. Data on the degree type of current residents at all ACGME-accredited residency programs were identified. Results: During the analyzed study period of 2019 to 2023, 3,473 allopathic students (a 74.5% match rate) and 571 osteopathic students (a 59.9% match rate) successfully matched into orthopaedic surgery. Of the 3,506 medical students who hold postgraduate orthopaedic surgery positions at former allopathic programs over the past 5 years, only 58 (1.7%) hold an osteopathic degree. Of the 560 medical students who hold postgraduate orthopaedic surgery positions at former osteopathic programs over the past 5 years, 47 (8.4%) hold an allopathic degree. The match rate of allopathic students at former osteopathic programs is significantly higher than the match rate of osteopathic students at former allopathic programs. Conclusions: Osteopathic students continue to match into orthopaedic surgery at lower rates than their allopathic counterparts. In addition, there remains a consistently low number of osteopathic students matching into former allopathic programs. Allopathic students also have a higher likelihood of matching into former osteopathic programs than osteopathic students have of matching into former allopathic orthopaedic surgery programs.

11.
bioRxiv ; 2024 Jun 02.
Article in English | MEDLINE | ID: mdl-38853965

ABSTRACT

Microbes in soil navigate interactions by recognizing kin, forming social groups, exhibiting antagonistic behavior, and engaging in competitive kin rivalry. Here, we investigated a novel phenomenon of self-growth suppression (sibling rivalry) observed in Bradyrhizobium diazoefficiens USDA 110. Swimming colonies of USDA 110 developed a distinct demarcation line and inter-colony zone when inoculated adjacent to each other. In addition to self, USDA 110 suppressed growth of other Bradyrhizobium strains and several other soil bacteria. We demonstrated that the phenomenon of sibling rivalry is due to growth suppression but not cell death. The cells in the inter-colony zone were culturable but had reduced respiratory activity, ATP levels, and motility. The observed growth suppression was due to the presence of a diffusible effector compound. This effector was labile, preventing extraction and identification, but it is unlikely to be a protein or a strong acid or base. This counterintuitive phenomenon of self-growth suppression suggests a strategic adaptation for conserving energy and resources in competitive soil environments. Bradyrhizobium's utilization of antagonism, including self-growth suppression, likely provides a competitive advantage for long-term success in soil ecosystems.

12.
AJR Am J Roentgenol ; 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38899842

ABSTRACT

Background: Differences in survival and morbidity among treatment options (ablation, surgical resection, and transplant) for early-stage hepatocellular carcinoma (HCC) have been well-studied. Additional understanding of the costs of such care would help to identify drivers of high costs and potential barriers to care delivery. Objective: To quantify total and patient out-of-pocket costs for ablation, surgical resection, and transplant in the management of early-stage HCC and to identify factors predictive of these costs. Methods: This retrospective U.S. population-based study used the SEER-Medicare linked dataset to identify a sample of 1067 Medicare beneficiaries (mean age, 73 years; 674 men, 393 women) diagnosed with early-stage HCC (size ≤5 cm) treated with ablation (N=623), resection (N=201), or transplant (N=243) between January 2009 and December 2016. Total costs and patient out-of-pocket costs for the index procedure as well as for any care within 30 days and 90 days post-procedure were identified and stratified by treatment modality. Additional comparisons were performed among propensity-score matched subgroups of patients treated by ablation or resection (each N=172). Multivariable linear regression models were used to identify factors predictive of total costs and out-of-pocket costs for index procedures as well as for 30-day and 90-day post-procedure periods. Results: For ablation, resection, and transplant, median index-procedure total cost was $6689, $25,614, and $66,034; index-procedure out-of-pocket cost was $1235, $1650, and $1317; 30-day total cost was $9456, $29,754, and $69,856; 30-day out-of-pocket cost was $1646, $2208, and $3198; 90-day total cost was $14,572, $34,984, and $88,103; and 90-day out-of-pocket cost was $2138, $2462, and $3876, respectively (all p<.001). 
In propensity-matched subgroups, ablation and resection had median index-procedure, 30-day, and 90-day total costs of $6690 and $25,716, $9995 and $30,365, and $15,851 and $34,455, respectively. In multivariable analysis adjusting for socioeconomic factors, comorbidities, and liver-disease prognostic indicators, surgical treatment (resection or transplant) was predictive of significantly greater costs compared with ablation at all time points. Conclusion: Total and out-of-pocket costs for index procedures as well as for 30-day and 90-day post-procedure periods were lowest for ablation, followed by resection and then transplant. Clinical Impact: This comprehensive cost analysis could help inform future cost-effectiveness analyses.

13.
Immun Ageing ; 21(1): 36, 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38867294

ABSTRACT

BACKGROUND AND PURPOSE: The immune response changes during aging and the progression of Alzheimer's disease (AD) and related dementia (ADRD). Terminally differentiated effector memory T cells (called TEMRA) are important during aging and AD due to their cytotoxic phenotype and association with cognitive decline. However, it is not clear if the changes seen in TEMRAs are specific to AD-related cognitive decline or are more generally correlated with cognitive decline. This study aimed to examine whether TEMRAs are associated with cognition and plasma biomarkers of AD, neurodegeneration, and neuroinflammation in a community-based cohort of older adults. METHODS: Study participants from a University of Kentucky Alzheimer's Disease Research Center (UK-ADRC) community-based cohort of aging and dementia were used to test our hypothesis. There were 84 participants, 44 women and 40 men. Participants underwent physical examination, neurological examination, medical history, cognitive testing, and blood collection to determine plasma biomarker levels (Aβ42/Aβ40 ratio, total tau, Neurofilament Light chain (Nf-L), Glial Fibrillary Acidic Protein (GFAP)) and to isolate peripheral blood mononuclear cells (PBMCs). Flow cytometry was used to analyze PBMCs from study participants for effector and memory T cell populations, including CD4+ and CD8+ central memory T cells (TCM), naïve T cells, effector memory T cells (TEM), and effector memory CD45RA+ T cells (TEMRA) immune cell markers. RESULTS: CD8+ TEMRAs were positively correlated with Nf-L and GFAP. We found no significant difference in CD8+ TEMRAs based on cognitive scores and no associations between CD8+ TEMRAs and AD-related biomarkers. CD4+ TEMRAs were associated with cognitive impairment on the MMSE. Gender was not associated with TEMRAs, but it did show an association with other T cell populations.
CONCLUSION: These findings suggest that the accumulation of CD8+ TEMRAs may be a response to neuronal injury (Nf-L) and neuroinflammation (GFAP) during aging or the progression of AD and ADRD. Because our community-based cohort included all ADRDs rather than clinically defined AD participants, TEMRAs may reflect changes in systemic immune T cell subsets associated with the onset of pathology.

14.
Burns ; 2024 May 09.
Article in English | MEDLINE | ID: mdl-38890052

ABSTRACT

BACKGROUND: Long-term cognitive impairment (LTCI) is experienced by up to two thirds of patients discharged from burns intensive care units (BICUs); however, little is known about its neurobiological basis. This study investigated whether patients previously admitted to BICU showed structural and functional MRI changes of the Default Mode Network (DMN). METHODS: Fifteen patients previously admitted to BICU with a significant burns injury, and 15 matched volunteers, underwent structural and functional MRI scans. Functional connectivity, fractional anisotropy, and cortical thickness of the main DMN subdivisions (anterior DMN (aDMN), posterior DMN (pDMN), and right (rTPJ) and left (lTPJ) temporo-parietal junctions) were compared between patients and volunteers, with differences correlated against cognitive performance. RESULTS: Functional connectivity between rTPJ and pDMN (t = 2.91, p = 0.011) and between rTPJ and lTPJ (t = 3.18, p = 0.008) was lower in patients compared to volunteers. Functional connectivity between rTPJ and pDMN correlated with cognitive performance (r2 = 0.33, p < 0.001). Mean fractional anisotropy of rTPJ (t = 2.70, p = 0.008) and lTPJ (t = 2.39, p = 0.015) was lower in patients, but there was no difference in cortical thickness. CONCLUSIONS: Patients previously admitted to BICU show structural and functional disruption of the DMN. Since functional changes correlate with cognitive performance, this should direct further research into intensive-care-related cognitive impairment.

15.
Vaccine ; 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38890105

ABSTRACT

The first dengue "endgame" summit was held in Syracuse, NY on August 9 and 10, 2023. Organized and hosted by the Institute for Global Health and Translational Sciences at SUNY Upstate Medical University, the gathering brought together researchers, clinicians, drug and vaccine developers, government officials, and other key stakeholders in the dengue field for a highly collaborative and discussion-oriented event. The objective of the gathering was to discuss the current state of dengue around the world, what dengue "control" might look like, and a potential roadmap to achieving functional dengue control. Over the course of 7 sessions, speakers with a diverse array of expertise highlighted both current and historic challenges associated with dengue control, the state of dengue countermeasure development and deployment, as well as fundamental virologic, immunologic, and medical barriers to achieving dengue control. While sustained eradication of dengue was considered challenging, attendees were optimistic that a significant reduction in the burden of dengue can be achieved by integrating vector control with effective application of therapeutics and vaccines.

16.
bioRxiv ; 2024 Jun 09.
Article in English | MEDLINE | ID: mdl-38895264

ABSTRACT

Ovarian cancer is the deadliest gynecological malignancy, owing to its late-stage diagnosis and high rates of recurrence and resistance following standard-of-care treatment, highlighting the need for novel treatment approaches. Through an unbiased drug screen, we identified the kinase inhibitor lestaurtinib as a potent antineoplastic agent for chemotherapy- and PARP-inhibitor (PARPi)-sensitive and -resistant ovarian cancer cells and patient-derived xenografts (PDXs). RNA sequencing revealed that lestaurtinib potently suppressed JAK/STAT signaling, and lestaurtinib efficacy was shown to be directly related to JAK/STAT pathway activity in cell lines and PDX models. Most ovarian cancer cells exhibited constitutive JAK/STAT pathway activation, and genetic loss of STAT1 and STAT3 resulted in growth inhibition. Lestaurtinib also displayed synergy when combined with cisplatin and olaparib, including in a model of PARPi resistance. In contrast, the most well-known JAK/STAT inhibitor, ruxolitinib, lacked antineoplastic activity against all ovarian cancer cell lines and PDX models tested. This divergent behavior was reflected in the ability of lestaurtinib to block both Y701/705 and S727 phosphorylation of STAT1 and STAT3, whereas ruxolitinib failed to block S727. Consistent with these findings, lestaurtinib additionally inhibited JNK and ERK activity, leading to more complete suppression of STAT phosphorylation. Concordantly, combinatorial treatment with ruxolitinib and a JNK or ERK inhibitor resulted in synergistic antineoplastic effects at dose levels where single agents were ineffective. Taken together, these findings indicate that lestaurtinib, and other treatments that converge on JAK/STAT signaling, are worthy of further pre-clinical and clinical exploration for the treatment of highly aggressive and advanced forms of ovarian cancer.
Statement of significance: Lestaurtinib is a novel inhibitor of ovarian cancer, including chemotherapy- and PARPi-resistant models, that acts through robust inhibition of the JAK/STAT pathway and synergizes with standard-of-care agents at clinically relevant concentrations.

17.
Article in English | MEDLINE | ID: mdl-38895979

ABSTRACT

Our purpose was to determine how age affects metabolic flexibility and underlying glucose kinetics in healthy young and older adults. Therefore, glucose and lactate tracers, along with pulmonary gas exchange data, were used to determine glucose kinetics and respiratory exchange ratios (RER = VCO2/VO2) during a 2-hour 75-gram oral glucose tolerance test (OGTT). After a 12-hour overnight fast, 28 participants, 15 young (21-35 yr; 7 men and 8 women) and 13 older (60-80 yr; 7 men and 6 women), received venous primed-continuous infusions of [6,6-2H]glucose and [3-13C]lactate with a H13CO3- bolus. Following a 90-minute metabolic stabilization and tracer equilibration period, volunteers underwent an OGTT. Arterialized glucose concentrations ([glucose]) started to rise 15 minutes post-glucose consumption, peaked at 60 minutes, and remained elevated. As assessed by rates of appearance (Ra), disposal (Rd), and metabolic clearance (MCR), glucose kinetics were suppressed in older compared to young individuals. As well, unlike in young individuals, fractional gluconeogenesis (fGNG) remained elevated in the older population following the oral glucose challenge. Lastly, there were no differences in 12-hour fasting baseline or peak RER values following an oral glucose challenge in older compared to young men and women, making RER an incomplete measure of metabolic flexibility in the volunteers we evaluated. Our study revealed that glucose kinetics are significantly altered in a healthy aged population following a glucose challenge. Further, those physiological deficits are not detected from changes in RER during an OGTT.
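For readers unfamiliar with tracer-dilution methodology, the quantities Ra, Rd, and MCR rest on standard steady-state relations (these are textbook equations and a common simplification, not formulas given in this abstract; $F$ denotes the tracer infusion rate and $E$ the plasma isotopic enrichment):

\[
R_a = \frac{F}{E}, \qquad R_d \approx R_a \ \text{(at metabolic steady state)}, \qquad \mathrm{MCR} = \frac{R_d}{[\text{glucose}]}
\]

During a non-steady-state perturbation such as an OGTT, time-varying corrections (e.g., Steele's equations) are applied, but the same quantities are being estimated.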

18.
J Surg Res ; 301: 163-171, 2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38936245

ABSTRACT

INTRODUCTION: Many patients suffering from isolated severe traumatic brain injury (sTBI) receive blood transfusion on hospital arrival due to hypotension. We hypothesized that increasing blood transfusions in isolated sTBI patients would be associated with an increase in mortality. METHODS: We performed a Trauma Quality Improvement Program (TQIP) (2017-2019) and single-center (2013-2021) database review, filtering for patients with isolated sTBI (Abbreviated Injury Scale head ≥3 and all other areas ≤2). Age, initial Glasgow Coma Score (GCS), Injury Severity Score (ISS), initial systolic blood pressure (SBP), mechanism (blunt/penetrating), packed red blood cell (pRBC) and fresh frozen plasma (FFP) transfusion volume (units) within the first 4 h, FFP/pRBC ratio (4 h), and in-hospital mortality were obtained from the TQIP Public User Files. RESULTS: In the TQIP database, 9257 patients had isolated sTBI and received pRBC transfusion within the first 4 h. The mortality rate within this group was 47.3%. The increase in mortality associated with the first unit of pRBCs was 20%, then increasing approximately 4% per unit transfused to a maximum mortality of 74% for 11 or more units. When adjusted for age, initial GCS, ISS, initial SBP, and mechanism, pRBC volume (1.09 [1.08-1.10]), FFP volume (1.08 [1.07-1.09]), and FFP/pRBC ratio (1.18 [1.08-1.28]) were associated with in-hospital mortality. Our single-center study yielded 138 patients with isolated sTBI who received pRBC transfusion. These patients experienced a 60.1% in-hospital mortality rate. Logistic regression corrected for age, initial GCS, ISS, initial SBP, and mechanism demonstrated no significant association between pRBC transfusion volume (1.14 [0.81-1.61]), FFP transfusion volume (1.29 [0.91-1.82]), or FFP/pRBC ratio (6.42 [0.25-164.89]) and in-hospital mortality.
CONCLUSIONS: Patients suffering from isolated sTBI have a higher rate of mortality with increasing amounts of pRBC or FFP transfusion within the first 4 h of arrival.

19.
Mol Genet Metab ; 142(4): 108516, 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38941880

ABSTRACT

Glutaric aciduria type II (GAII) is a heterogeneous genetic disorder affecting mitochondrial fatty acid, amino acid, and choline oxidation. Clinical manifestations vary across the lifespan, and onset may occur at any time from the early neonatal period to advanced adulthood. Historically, some patients, particularly those with late-onset disease, have derived significant benefit from riboflavin supplementation. GAII has been considered an autosomal recessive condition caused by pathogenic variants in the gene encoding electron-transfer flavoprotein ubiquinone-oxidoreductase (ETFDH) or in the genes encoding electron-transfer flavoprotein subunits A and B (ETFA and ETFB, respectively). Variants in genes involved in riboflavin metabolism have also been reported. In some patients, however, molecular analysis has failed to yield a diagnosis. In this study, we report the outcome of molecular analysis in 28 Australian patients across the lifespan (10 paediatric, 18 adult) who had a diagnosis of glutaric aciduria type II based on both clinical and biochemical parameters. Whole genome sequencing was performed on 26 of the patients; the two patients with neonatal-onset disease had targeted sequencing of candidate genes, and both carried biallelic pathogenic variants (in ETFA and ETFDH, respectively). None of the 26 patients who underwent whole genome sequencing had biallelic variants in any of the primary candidate genes. Interestingly, nine of these patients (34.6%) had a monoallelic pathogenic or likely pathogenic variant in a single primary candidate gene, and one patient (3.8%) had monoallelic pathogenic or likely pathogenic variants in two separate genes within the same pathway. The frequencies of the damaging variants within ETFDH and the FAD transporter gene SLC25A32 were significantly higher than the corresponding allele frequencies in the general population. The remaining 16 patients (61.5%) had no pathogenic or likely pathogenic variants in the candidate genes. Ten (56%) of the 18 adult patients were taking the selective serotonin reuptake inhibitor antidepressant sertraline, which has been shown to produce a GAII phenotype, and another two adults (11%) were taking a serotonin-norepinephrine reuptake inhibitor antidepressant (venlafaxine or duloxetine), whose mechanism of action overlaps that of sertraline. Riboflavin deficiency can also mimic both the clinical and biochemical phenotype of GAII. Several patients on these antidepressants showed an initial response to riboflavin that subsequently waned. These results suggest that the GAII phenotype can result from a complex interaction between monoallelic variants and the cellular environment, and that whole genome or targeted gene panel analysis may not provide a clear molecular diagnosis.

20.
Clin Infect Dis ; 2024 Jun 25.
Article in English | MEDLINE | ID: mdl-38917034

ABSTRACT

BACKGROUND: Gram-negative bloodstream infections (GNBSI) occur more commonly in children with comorbidities and are increasingly associated with antimicrobial resistance. Few large studies of GNBSI in children relate the clinical presentation, pathogen characteristics, and outcomes. METHODS: A 3-year prospective study of GNBSI in children aged <18 years was conducted in five Australian children's hospitals from 2019 to 2021. Clinical characteristics, disease severity, and outcomes were recorded. Causative pathogens underwent antibiotic susceptibility testing and whole genome sequencing. RESULTS: There were 931 GNBSI episodes involving 818 children. Median age was 3 years (IQR 0.6-8.5). 576/931 episodes (62%) were community onset, though 661/931 (71%) occurred in children with comorbidities, and a central venous catheter (CVC) was present in 558/931 (60%). CVCs (145/931) and the urinary tract (149/931) were the most common sources (16% each). 100/931 children (11%) required intensive care unit (ICU) admission, and a further 11% (105/931) developed GNBSI while in the ICU. 659/927 isolates (71%) were Enterobacterales, of which 22% (138/630) were third-generation cephalosporin resistant (3GCR). Extended-spectrum beta-lactamase (ESBL) genes were confirmed in 65/138 (47%) of 3GCR Enterobacterales. The most common ESBL genes were blaCTX-M-15 (34/94, 36%) and blaSHV-12 (10/94, 11%). There were 48 deaths overall, and 30-day in-hospital mortality was 3% (32/931). Infection with 3GCR Enterobacterales was independently associated with higher mortality (adjusted OR 3.2, 95% CI 1.6-6.4). CONCLUSION: GNBSI in children are frequently healthcare-associated and predominantly affect children under 5 years. Infections with 3GCR Enterobacterales were associated with worse outcomes. These findings will inform optimal management guidelines and help prioritise future antimicrobial clinical trials.
