Results 1 - 20 of 3,404
1.
Gastroenterology Res ; 17(3): 109-115, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38993547

ABSTRACT

Background: Recent studies suggest an inverse relationship between baseline levels of hepatitis B virus (HBV) DNA and the on-treatment risk of hepatocellular carcinoma (HCC) in patients with chronic hepatitis B (CHB). However, data are limited to Asian cohorts, and it is unclear whether similar associations hold true for non-Asians with CHB. We aimed to evaluate the association of baseline HBV DNA with long-term risks of cirrhosis and HCC in a predominantly non-Asian cohort of CHB patients in the USA. Methods: Using longitudinal data from the national Veterans Affairs database, we evaluated the risk of cirrhosis or HCC among adults with non-cirrhotic CHB on continuous antiviral therapy, stratified by moderate baseline HBV DNA (4.00 - 6.99 log10 IU/mL) vs. high baseline HBV DNA (7.00 log10 IU/mL or higher). Propensity score weighting was applied, and competing-risks cumulative incidence functions and Cox proportional hazards models were used. Results: Among 1,129 non-cirrhotic CHB patients (41% non-Hispanic White, 36% African American, mean age 57.0 years, 62.2% hepatitis B e antigen (HBeAg) positive), 585 had moderate baseline HBV DNA and 544 had high baseline HBV DNA. After propensity score weighting, no significant difference in the risk of cirrhosis was observed between moderate and high baseline HBV DNA (4.55 vs. 5.22 per 100 person-years, hazard ratio (HR): 0.87, 95% confidence interval (CI): 0.69 - 1.09, P = 0.22), but the risk of HCC was significantly higher in patients with moderate vs. high baseline HBV DNA (0.84 vs. 0.69 per 100 person-years, HR: 1.33, 95% CI: 1.09 - 1.62, P < 0.01). Conclusions: In a national cohort of predominantly non-Asian US veterans with non-cirrhotic CHB on antiviral therapy, moderate baseline HBV DNA was associated with a higher risk of HCC than high baseline HBV DNA.
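
A minimal sketch of the kind of propensity-weighted Cox analysis described above is shown below, assuming the weights have already been estimated; the column names (time_to_hcc, hcc_event, high_dna, ps_weight) are hypothetical placeholders rather than the study's actual variables.

```python
# Hedged sketch: propensity-score-weighted Cox model; columns are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

def weighted_cox(df: pd.DataFrame) -> CoxPHFitter:
    """Fit a Cox proportional hazards model with propensity-score weights."""
    cph = CoxPHFitter()
    cph.fit(
        df[["time_to_hcc", "hcc_event", "high_dna", "ps_weight"]],
        duration_col="time_to_hcc",   # follow-up time in years
        event_col="hcc_event",        # 1 = incident HCC, 0 = censored
        weights_col="ps_weight",      # stabilized propensity-score weights
        robust=True,                  # robust (sandwich) SEs recommended with weights
    )
    return cph

# Example usage with the real cohort file (path is a placeholder):
# df = pd.read_csv("chb_cohort.csv")
# print(weighted_cox(df).summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```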

2.
World Neurosurg ; 2024 Jul 06.
Article in English | MEDLINE | ID: mdl-38977129

ABSTRACT

OBJECTIVE: The Pfirrmann scoring system classifies lumbosacral disc degeneration based on MRI signal intensity. The relationship between pre-existing disc degeneration and patient-reported outcome measures (PROMs) after one-level lumbar fusion is not well documented. The purpose of this study was to investigate the relationship between the severity of preoperative intervertebral disc degeneration and pre- and postoperative PROMs in patients undergoing one-level lumbar fusion. METHODS: All adult patients who underwent posterior lumbar decompression and fusion (PLDF) or transforaminal lumbar interbody fusion (TLIF) between 2014 and 2022 were included. Patient demographics and comorbidities were extracted from medical records. Lumbar intervertebral discs on sagittal T2-weighted MRI images were assessed by two independent graders using the Pfirrmann criteria. Grades I-III were categorized as low-grade disc degeneration, while grades IV-V were considered high-grade. Multivariable linear regression assessed the impact of disc degeneration on PROMs. RESULTS: A total of 150 patients were included, of whom 69 (46%) had low-grade disc degeneration and 81 (54%) had high-grade degeneration. Patients with high-grade degeneration had higher preoperative VAS-Leg scores (6.10 vs. 4.54, p=0.005) and displayed greater one-year postoperative improvements in VAS-Back scores (-2.11 vs. -0.66, p=0.002). Multivariable regression demonstrated Pfirrmann scores as independent predictors of both preoperative VAS-Leg scores (p=0.004) and postoperative VAS-Back improvement (p=0.005). CONCLUSIONS: In patients undergoing one-level lumbar fusion, higher Pfirrmann scores were associated with greater preoperative leg pain and greater one-year postoperative improvement in back pain. Further study of the relationship between preoperative disc degeneration and postoperative outcomes may help guide clinical decision making and patient expectations.
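
The analysis above boils down to dichotomizing Pfirrmann grades and fitting an adjusted linear model for the change in PROMs. A minimal sketch under those assumptions follows; the variable names (pfirrmann, delta_vas_back, age, bmi, sex) are hypothetical placeholders, not the study's schema.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_proms_model(df: pd.DataFrame):
    # Pfirrmann grades I-III -> low-grade (0), grades IV-V -> high-grade (1), as in the abstract
    df = df.assign(high_grade=(df["pfirrmann"] >= 4).astype(int))
    # Multivariable linear model: outcome is the one-year change in VAS-Back
    return smf.ols("delta_vas_back ~ high_grade + age + bmi + C(sex)", data=df).fit()

# model = fit_proms_model(pd.read_csv("fusion_cohort.csv"))  # placeholder path
# print(model.summary())
```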

3.
Nat Rev Cardiol ; 2024 Jul 19.
Article in English | MEDLINE | ID: mdl-39030270

ABSTRACT

For more than 60 years, humans have travelled into space. Until now, the majority of astronauts have been professional, government agency astronauts selected, in part, for their superlative physical fitness and the absence of disease. Commercial spaceflight is now becoming accessible to members of the public, many of whom would previously have been excluded owing to unsatisfactory fitness or the presence of cardiorespiratory diseases. While data exist on the effects of gravitational and acceleration (G) forces on human physiology, data on the effects of the aerospace environment in unselected members of the public, and particularly in those with clinically significant pathology, are limited. Although short in duration, these high acceleration forces can potentially either impair the experience or, more seriously, pose a risk to health in some individuals. Rather than expose individuals with existing pathology to G forces to collect data, computational modelling might be useful to predict the nature and severity of cardiovascular diseases that are of sufficient risk to restrict access, require modification, or suggest further investigation or training before flight. In this Review, we explore state-of-the-art, zero-dimensional, compartmentalized models of human cardiovascular pathophysiology that can be used to simulate the effects of acceleration forces, homeostatic regulation and ventilation-perfusion matching, using data generated by long-arm centrifuge facilities of the US National Aeronautics and Space Administration and the European Space Agency to risk stratify individuals and help to improve safety in commercial suborbital spaceflight.
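
As a rough illustration of the zero-dimensional, lumped-parameter modelling the review surveys, the sketch below integrates a two-element Windkessel whose inflow is crudely reduced as +Gz acceleration rises; all parameter values are illustrative assumptions, not data from the NASA or ESA centrifuge programmes.

```python
# Hedged toy model: two-element Windkessel with a crude +Gz preload penalty.
import numpy as np
from scipy.integrate import solve_ivp

R = 1.0    # systemic resistance (mmHg*s/mL), illustrative
C = 1.5    # arterial compliance (mL/mmHg), illustrative

def cardiac_inflow(t: float, gz: float) -> float:
    """Half-rectified pulsatile inflow (mL/s); stroke volume falls as +Gz reduces venous return."""
    hr = 70 / 60.0                                       # beats per second
    preload_factor = max(0.2, 1.0 - 0.15 * (gz - 1.0))   # crude, assumed preload penalty
    return 300.0 * preload_factor * max(0.0, np.sin(2 * np.pi * hr * t))

def windkessel(t, p, gz):
    # dP/dt = (Q_in - P/R) / C
    return [(cardiac_inflow(t, gz) - p[0] / R) / C]

for gz in (1.0, 3.5, 6.0):                               # rest vs. notional launch/re-entry loads
    sol = solve_ivp(windkessel, (0, 20), [80.0], args=(gz,), max_step=0.01)
    print(f"Gz={gz:.1f}: mean arterial pressure ~ {sol.y[0][-500:].mean():.1f} mmHg")
```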

4.
bioRxiv ; 2024 Jul 14.
Article in English | MEDLINE | ID: mdl-39026747

ABSTRACT

Since their initial discovery in maize, transposable elements (TEs) have emerged as integral to the evolution of maize, accounting for 80% of its genome. However, the repetitive nature of TEs has hindered our understanding of their regulatory potential. Here, we demonstrate that long-read chromatin fiber sequencing (Fiber-seq) permits the comprehensive annotation of the regulatory potential of maize TEs. We uncover that only 94 LTR retrotransposons contain the functional epigenetic architecture required for mobilization within maize leaves. This epigenetic architecture degenerates with evolutionary age, resulting in solo TE enhancers being preferentially marked by simultaneous hyper-CpG methylation and chromatin accessibility, an architecture markedly divergent from canonical enhancers. We find that TEs shape maize gene regulation by creating novel promoters within the TE itself as well as through TE-mediated gene amplification. Lastly, we uncover a pervasive epigenetic code directing TEs to specific loci, including the locus that sparked McClintock's discovery of TEs.

5.
Mil Med ; 2024 Jun 13.
Article in English | MEDLINE | ID: mdl-38870040

ABSTRACT

INTRODUCTION: Uncontrolled torso hemorrhage is the primary cause of potentially survivable deaths on the battlefield. Zone 1 Resuscitative Endovascular Balloon Occlusion of the Aorta (REBOA), in conjunction with damage control resuscitation, may be an effective management strategy for these patients in the prehospital or austere phase of their care. However, the effect of whole blood (WB) transfusion during REBOA on post-occlusion circulatory collapse is not fully understood. MATERIALS AND METHODS: Yorkshire male swine (n = 6 per group, 70-90 kg) underwent a 40% volume-controlled hemorrhage. After a 10-minute hemorrhagic shock period, a REBOA balloon was inflated in Zone 1. Fifteen minutes after inflation, 0, 1, or 3 units (450 mL/unit) of autologous WB were infused through the left jugular vein. Thirty minutes after initial balloon inflation, the balloon was deflated slowly over 3 minutes. Following deflation, normal saline was administered (up to 3,000 mL) and the swine were observed for 2 hours. Survival (primary outcome), hemodynamics, and blood gas values were compared among groups. Statistical significance was determined by log-rank test, one-way ANOVA, and repeated-measures ANOVA. RESULTS: Survival rates were comparable among groups (P = .345), with 66% of control, 33% of 1-unit, and 50% of 3-unit animals surviving until the end of the study. Following WB infusion, both the 1-unit and 3-unit groups had significantly higher blood pressure (P < .01), pulmonary artery pressure (P < .01), and carotid artery flow (P < .01) than the control group. CONCLUSIONS: WB transfusion during Zone 1 REBOA was not associated with increased short-term survival in this large animal model of severe hemorrhage. We observed no signal that WB transfusion mitigates post-occlusion circulatory collapse; however, there was evidence of supra-normal blood pressures during WB transfusion.

6.
Front Netw Physiol ; 4: 1396383, 2024.
Article in English | MEDLINE | ID: mdl-38840902

ABSTRACT

Pulmonary fibrosis is a deadly disease that involves the dysregulation of fibroblasts and myofibroblasts, which are mechanosensitive. Previous computational models have succeeded in modeling stiffness-mediated fibroblast behaviors; however, these models have neglected stretch-mediated behaviors, especially stretch-sensitive channels and the stretch-mediated release of latent TGF-β. Here, we develop and explore a hybrid agent-based and spring-network model capable of recapitulating both stiffness- and stretch-mediated behaviors. Using the model, we evaluate the role of mechanical signaling in homeostasis and disease progression during self-healing and fibrosis, respectively. We develop the model such that it exhibits a fibrotic threshold, near which the network tends towards instability and fibrosis and below which the network tends to heal. The healing response is due to the stretch signal, whereas the fibrotic response occurs when the stiffness signal overpowers the stretch signal, creating a positive feedback loop. We also find that by changing the proportional weights of the stretch and stiffness signals, we observe heterogeneity in pathological network structure similar to that seen in human idiopathic pulmonary fibrosis (IPF) tissue. The system also shows emergent behavior and bifurcations: whether the network heals or turns fibrotic depends on the initial network organization of the damage, clearly demonstrating the pivotal role of structure in healing or fibrosis of the overall network. In summary, these results strongly suggest that the mechanical signaling present in the lungs, combined with network effects, contributes to both homeostasis and disease progression.
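
The sketch below is a deliberately simplified, single-patch caricature of the stiffness-versus-stretch feedback described above, not the authors' agent-based/spring-network hybrid; the weights, rates, and initial stiffness values are invented. It shows how a dominant stiffness signal produces runaway fibrosis, how a dominant stretch signal heals, and how near the threshold the outcome depends on the initial damage.

```python
def simulate_patch(w_stiff: float, w_stretch: float, s0: float = 1.5,
                   steps: int = 400, dt: float = 0.05) -> float:
    """Return the final stiffness of a single tissue patch (1.0 = healthy baseline)."""
    s = s0
    for _ in range(steps):
        stretch = 1.0 / s                              # stiffer tissue stretches less
        deposition = w_stiff * (s - 1.0)               # stiffness signal: positive feedback
        resorption = w_stretch * stretch * (s - 1.0)   # stretch signal: pulls back to baseline
        s = s + dt * (deposition - resorption)
        s = min(20.0, max(1.0, s))                     # healed floor and end-stage cap
    return s

print("stretch-dominated (heals):      ", round(simulate_patch(0.4, 1.0), 2))
print("stiffness-dominated (fibroses): ", round(simulate_patch(1.0, 0.4), 2))
# Near the threshold the outcome depends on the initial damage, echoing the bifurcation:
print("w_stiff=0.8, mild damage s0=1.2: ", round(simulate_patch(0.8, 1.0, s0=1.2), 2))
print("w_stiff=0.8, worse damage s0=1.4:", round(simulate_patch(0.8, 1.0, s0=1.4), 2))
```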

7.
Heliyon ; 10(11): e31815, 2024 Jun 15.
Article in English | MEDLINE | ID: mdl-38845891

ABSTRACT

The occurrence of pharmaceuticals and xenoestrogen compounds (PXCs) in drinking water poses a serious human health risk. The problem stems from the high anthropogenic pollution load on source water and the inefficiency of conventional water treatment plants in removing PXCs. This study assessed PXC levels and the associated health risks of exposure for tap water from selected Ghanaian communities, as well as for raw water samples from the respective treatment plants. In doing so, the PXC treatment efficiency of two drinking water treatment plants in the metropolises studied was also assessed, and source apportionment of the PXCs in the tap water was conducted. Twenty-six (26) tap and raw water samples from communities in the Cape Coast and Sekondi-Takoradi metropolises were extracted using SPE cartridges and analysed for PXCs using an ultra-fast HPLC-UV instrument. Elevated PXC levels of up to 24.79 and 22.02 µg/L were recorded in raw and tap water samples, respectively. Consequently, elevated non-cancer health risks (HI > 1) for residential adults were found for tap water samples from the Cape Coast metropolis and for some samples from the Sekondi-Takoradi metropolis. In addition, elevated cumulative oral cancer risks above 10⁻⁵ and dermal cancer risks of up to 4 × 10⁻⁵ were recorded. The source apportionment revealed three significant sources of PXCs in the tap water samples studied. The results revealed the inefficiency of the treatment plants in removing PXCs from raw water during treatment. The situation requires urgent attention to safeguard public health. It is recommended that the conventional water treatment processes employed be augmented with advanced treatment technologies to improve their efficacy in PXC removal.
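
The reported hazard indices and cancer risks follow the standard USEPA-style screening arithmetic (chronic daily intake, hazard quotient, slope factor). The sketch below illustrates those calculations; the concentrations, reference doses, and slope factors are placeholder values, not the study's measurements.

```python
def chronic_daily_intake(conc_ug_L, intake_L_day=2.0, ef_days=350, ed_years=30,
                         bw_kg=70.0, at_days=365 * 30):
    """Oral chronic daily intake (mg/kg/day) for ingestion of tap water (simplified)."""
    conc_mg_L = conc_ug_L / 1000.0
    return (conc_mg_L * intake_L_day * ef_days * ed_years) / (bw_kg * at_days)

compounds = {
    # name: (concentration in ug/L, oral RfD in mg/kg/day, oral slope factor in (mg/kg/day)^-1)
    "compound_A": (24.79, 0.004, 0.0),     # placeholder values for illustration only
    "compound_B": (5.10, 0.010, 0.002),
}

hazard_index = 0.0
cancer_risk = 0.0
for name, (conc, rfd, sf) in compounds.items():
    cdi = chronic_daily_intake(conc)
    hazard_index += cdi / rfd          # hazard quotient, summed into the hazard index (HI)
    cancer_risk += cdi * sf            # incremental lifetime cancer risk

print(f"HI = {hazard_index:.2f} (HI > 1 flags non-cancer concern)")
print(f"Cumulative cancer risk = {cancer_risk:.1e} (screening threshold ~1e-5 used above)")
```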

8.
Ecol Evol ; 14(6): e11440, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38855318

ABSTRACT

Species rarity is a common phenomenon across global ecosystems and is becoming increasingly common under climate change. Although rarity is often considered a stochastic response to environmental and ecological constraints, we examined the hypothesis that plant rarity is a consequence of natural selection acting on performance traits that affect a species' range size, habitat specificity, and population aggregation: three primary descriptors of rarity. Using a common garden of 25 species of Tasmanian Eucalyptus, we find that the rarest species have 70% lower biomass than common species. Although rare species have lower biomass, they allocate proportionally more biomass aboveground than common species. There is also a negative phylogenetic autocorrelation underlying the biomass of rare and common species, indicating that traits associated with rarity have diverged within subgenera as a result of environmental factors to reach different associated optima. In support of our hypothesis, we found significant positive relationships between species biomass and both range size and habitat specificity, but not population aggregation. These results demonstrate repeated convergent evolution of the trait-based determinants of rarity across the phylogeny in Tasmanian eucalypts. Furthermore, the phylogenetically driven patterns in biomass and biomass allocation seen in rare species may be representative of a larger plant strategy, not yet considered, that offers a mechanism for how rare species continue to persist despite the inherent constraints of small, specialized ranges and populations. These results suggest that if rarity can evolve and is related to plant traits such as biomass, rather than being a random outcome of environmental constraints, conservation efforts for these and other rare species may need to reconsider the abiotic and biotic factors that underlie the distributions of rare plant species.

9.
J Hepatol ; 81(1): 33-41, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38906621

ABSTRACT

BACKGROUND & AIMS: Oral antiviral therapy with nucleos(t)ide analogues (NAs) for chronic hepatitis B (CHB) is well-tolerated and lifesaving, but real-world data on utilization are limited. We examined rates of evaluation and treatment in patients from the REAL-B consortium. METHODS: This was a cross-sectional study nested within our retrospective multinational clinical consortium (2000-2021). We determined the proportions of patients receiving adequate evaluation, meeting AASLD treatment criteria, and initiating treatment at any time during the study period. We also identified factors associated with receiving adequate evaluation and treatment using multivariable logistic regression analyses. RESULTS: We analyzed 12,566 adult treatment-naïve patients with CHB from 25 centers in 9 countries (mean age 47.1 years, 41.7% female, 96.1% Asian, 49.6% Western region, 8.7% cirrhosis). Overall, 73.3% (9,206 patients) received adequate evaluation. Among the adequately evaluated, 32.6% (3,001 patients) were treatment eligible by AASLD criteria, 83.3% (2,500 patients) of whom were initiated on NAs, with consistent findings in analyses using EASL criteria. On multivariable logistic regression adjusting for age, sex, cirrhosis, and ethnicity plus region, female sex was associated with adequate evaluation (adjusted odds ratio [aOR] 1.13, p = 0.004), but female treatment-eligible patients were about 50% less likely to initiate NAs (aOR 0.54, p <0.001). Additionally, the lowest evaluation and treatment rates were among Asian patients from the West, but no difference was observed between non-Asian patients and Asian patients from the East. Asian patients from the West (vs. East) were about 40-50% less likely to undergo adequate evaluation (aOR 0.60) and initiate NAs (aOR 0.54) (both p <0.001). CONCLUSIONS: Evaluation and treatment rates were suboptimal for patients with CHB in both the East and West, with significant sex and ethnic disparities. Improved linkage to care with linguistically competent and culturally sensitive approaches is needed. IMPACT AND IMPLICATIONS: Significant sex and ethnic disparities exist in hepatitis B evaluation and treatment, with female treatment-eligible patients about 50% less likely to receive antiviral treatment and Asian patients from Western regions also about 50% less likely to receive adequate evaluation or treatment compared to Asians from the East (there was no significant difference between Asian patients from the East and non-Asian patients). Improved linkage to care with linguistically competent and culturally sensitive approaches is needed.


Subjects
Antiviral Agents , Healthcare Disparities , Hepatitis B, Chronic , Humans , Female , Male , Antiviral Agents/therapeutic use , Cross-Sectional Studies , Middle Aged , Retrospective Studies , Hepatitis B, Chronic/drug therapy , Hepatitis B, Chronic/ethnology , Adult , Healthcare Disparities/statistics & numerical data , Healthcare Disparities/ethnology , Sex Factors , Ethnicity/statistics & numerical data , Global Health
10.
Hepatol Commun ; 8(7), 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38896066

ABSTRACT

BACKGROUND: Steatotic liver disease (SLD) is a growing phenomenon, and our understanding of its determinants has been limited by the difficulty of identifying it clinically. Natural language processing (NLP) can potentially identify hepatic steatosis systematically within large clinical repositories of imaging reports. We validated the performance of an NLP algorithm for the identification of SLD in clinical imaging reports and applied this tool to a large population of people with and without HIV. METHODS: Patients were included in the analysis if they enrolled in the Veterans Aging Cohort Study between 2001 and 2017, had an imaging report inclusive of the liver, and had ≥2 years of observation before the imaging study. SLD was considered present when reports contained the terms "fatty," "steatosis," "steatotic," or "steatohepatitis." The performance of the SLD NLP algorithm was compared to a clinical review of 800 reports. We then applied the NLP algorithm to the first eligible imaging study and compared patient characteristics by SLD and HIV status. RESULTS: The NLP algorithm achieved 100% sensitivity and 88.5% positive predictive value for the identification of SLD. When applied to 26,706 eligible Veterans Aging Cohort Study imaging reports, SLD was identified in 72.2% and did not differ significantly by HIV status. SLD was associated with a higher prevalence of metabolic comorbidities, alcohol use disorder, and hepatitis B and C, but not with HIV infection. CONCLUSIONS: Although limited to those undergoing radiologic study, the NLP algorithm accurately identified SLD in people with and without HIV and offers a valuable tool to evaluate the determinants and consequences of hepatic steatosis.
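
In the spirit of the rule described above (a report is flagged as SLD if it mentions any of the four steatosis terms), a minimal keyword-matching sketch with a sensitivity/PPV check is shown below; it omits the negation handling, report parsing, and the 800-report validation of the actual algorithm.

```python
import re

SLD_PATTERN = re.compile(r"\b(fatty|steatosis|steatotic|steatohepatitis)\b", re.IGNORECASE)

def flag_sld(report_text):
    """Return True if the imaging report mentions hepatic steatosis terminology."""
    return bool(SLD_PATTERN.search(report_text))

def sensitivity_and_ppv(pred, truth):
    tp = sum(p and t for p, t in zip(pred, truth))
    fn = sum((not p) and t for p, t in zip(pred, truth))
    fp = sum(p and (not t) for p, t in zip(pred, truth))
    return tp / (tp + fn), tp / (tp + fp)

# Toy check against hand-labelled reports (invented examples, not study data):
reports = ["Liver shows diffuse fatty infiltration.", "Normal hepatic echotexture."]
labels = [True, False]
preds = [flag_sld(r) for r in reports]
print(sensitivity_and_ppv(preds, labels))   # -> (1.0, 1.0) on this toy pair
```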


Subjects
Algorithms , Fatty Liver , HIV Infections , Natural Language Processing , Humans , Male , Female , HIV Infections/complications , HIV Infections/epidemiology , Middle Aged , Fatty Liver/diagnostic imaging , Fatty Liver/complications , Aged , Cohort Studies , Adult , Sensitivity and Specificity
11.
BJOG ; 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38899437

ABSTRACT

OBJECTIVE: To estimate temporal changes in the prevalence of pre-existing chronic conditions among pregnant women in Sweden and to evaluate the extent to which secular changes in maternal age, birth cohorts and obesity are associated with these trends. DESIGN: Population-based cross-sectional study. SETTING: Sweden, 2002-2019. POPULATION: All women (aged 15-49 years) who delivered in Sweden (2002-2019). METHODS: An age-period-cohort analysis was used to evaluate the effects of age, calendar period and birth cohort on the observed temporal trends. MAIN OUTCOME MEASURES: Pre-existing chronic conditions, comprising 17 categories of physical and psychiatric conditions recorded within 5 years before childbirth, presented as prevalence rates and rate ratios (RRs) with 95% confidence intervals (CIs). Temporal trends were also adjusted for pre-pregnancy body mass index (BMI) and the mother's country of birth. RESULTS: The overall prevalence of at least one pre-existing chronic condition was 8.7% (147 458 of 1 703 731 women). The rate of pre-existing chronic conditions in pregnancy increased almost threefold between 2002-2006 and 2016-2019 (RR 2.82, 95% CI 2.77-2.87). Rates of psychiatric (RR 3.80, 95% CI 3.71-3.89), circulatory/metabolic (RR 1.62, 95% CI 1.55-1.71), autoimmune/neurological (RR 1.69, 95% CI 1.61-1.78) and other (RR 2.10, 95% CI 1.99-2.22) conditions increased substantially from 2002-2006 to 2016-2019, although the increases were less pronounced between 2012-2015 and 2016-2019. No birth cohort effect was evident for any of the pre-existing chronic conditions, and adjusting for secular changes in obesity and the mother's country of birth did not affect these associations. CONCLUSIONS: The burden of pre-existing chronic conditions in pregnancy in Sweden increased from 2002 to 2019. This increase may reflect improved reporting of diagnoses and advances in the treatment of chronic conditions, which may have enhanced fecundity among affected women.
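
A common way to obtain period rate ratios of the kind reported above is a Poisson model on aggregated counts with a person-denominator offset. The sketch below shows that approach on invented toy counts; it is not the Swedish register data and omits the full age-period-cohort machinery.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Toy aggregated data (counts and deliveries are invented placeholders):
toy = pd.DataFrame({
    "period":     ["2002-2006", "2007-2011", "2012-2015", "2016-2019"] * 2,
    "age_band":   ["15-29"] * 4 + ["30-49"] * 4,
    "cases":      [9000, 14000, 19000, 24000, 15000, 23000, 31000, 38000],
    "deliveries": [220000, 215000, 180000, 175000, 250000, 255000, 210000, 205000],
})

model = smf.glm(
    "cases ~ C(period, Treatment(reference='2002-2006')) + C(age_band)",
    data=toy,
    family=sm.families.Poisson(),
    offset=np.log(toy["deliveries"]),      # log person-denominator as offset
).fit()

rr = np.exp(model.params)                  # exponentiated coefficients = rate ratios vs. 2002-2006
print(rr.filter(like="period"))
```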

12.
AIDS Behav ; 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38900313

ABSTRACT

Peer advocacy can promote HIV protective behaviors, but little is known about concordance in reports of prevention advocacy (PA) between people living with HIV (PLWH) and their social network members. We examined the prevalence and correlates of such concordance and its association with the targeted HIV protective behavior of the social network member. Data were analyzed from 193 PLWH (index participants) and their 599 social network members (alters). Kappa statistics measured concordance between index and alter reports of PA in the past 3 months. Logistic and multinomial regressions evaluated the relationship between advocacy concordance and alter condom use and HIV testing behavior, and the correlates of PA concordance. Advocacy concordance was observed in 0.3% of index-alter dyads for PrEP discussion, 9% for condom use, 18% for HIV testing, 26% for care engagement, and 49% for antiretroviral use discussions. Fewer index participants than alters reported condom use PA (23.5% vs. 28.1%; χ² = 3.7, p = 0.05) and HIV testing PA (30.5% vs. 50.5%; χ² = 25.3, p < 0.001) occurring. Condom advocacy concordance was higher if the index and alter were romantic partners (OR = 3.50; p = 0.02) and lower if the index was 10 years younger than the alter (OR = 0.23; p = 0.02). Alters had higher odds of using condoms with their main partner when both reported condom advocacy compared with dyads where neither reported advocacy (OR = 3.90; p < 0.001) and compared with dyads where only the index reported such advocacy (OR = 3.71; p = 0.01). Age difference and relationship status affect advocacy agreement, and concordant perceptions of advocacy are linked to increased HIV protective behaviors. Alters' perceptions may be crucial for behavior change, informing strategies for improving advocacy.
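
Concordance between paired index and alter reports is what the kappa statistics above quantify. A minimal sketch follows, using scikit-learn's Cohen's kappa on invented paired responses.

```python
from sklearn.metrics import cohen_kappa_score

# 1 = advocacy reported in the past 3 months, 0 = not reported (invented toy data)
index_reports = [1, 0, 1, 1, 0, 0, 1, 0]
alter_reports = [1, 0, 0, 1, 0, 1, 1, 0]

kappa = cohen_kappa_score(index_reports, alter_reports)
agreement = sum(i == a for i, a in zip(index_reports, alter_reports)) / len(index_reports)
print(f"raw agreement = {agreement:.2f}, Cohen's kappa = {kappa:.2f}")
```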

13.
JAMA Intern Med ; 2024 Jun 24.
Article in English | MEDLINE | ID: mdl-38913369

ABSTRACT

Importance: Current approaches to classify the hepatotoxic potential of medications are based on cumulative case reports of acute liver injury (ALI), which do not consider the size of the exposed population. There is little evidence from real-world data (data relating to patient health status and/or the delivery of health care routinely collected from sources outside of a research setting) on incidence rates of severe ALI after initiation of medications, accounting for duration of exposure. Objective: To identify the most potentially hepatotoxic medications based on real-world incidence rates of severe ALI and to examine how these rates compare with categorization based on case reports. Design, Setting, and Participants: This series of cohort studies obtained data from the US Department of Veterans Affairs on persons without preexisting liver or biliary disease who initiated a suspected hepatotoxic medication in the outpatient setting between October 1, 2000, and September 30, 2021. Data were analyzed from June 2020 to November 2023. Exposures: Outpatient initiation of any one of 194 medications with 4 or more published reports of hepatotoxicity. Main Outcomes and Measures: Hospitalization for severe ALI, defined by either of the following recorded within the first 2 days of admission: (1) alanine aminotransferase level greater than 120 U/L plus total bilirubin level greater than 2.0 mg/dL, or (2) international normalized ratio of 1.5 or higher plus total bilirubin level greater than 2.0 mg/dL. A diagnosis of acute or chronic liver or biliary disease recorded during follow-up, or as a discharge diagnosis of a hospitalization for severe ALI, resulted in censoring. This study calculated age- and sex-adjusted incidence rates of severe ALI and compared observed rates with hepatotoxicity categories based on cumulative published case reports. Results: The study included 7 899 888 patients across 194 medication cohorts (mean [SD] age, 64.4 [16.4] years; 7 305 558 males [92.5%]; 4 354 136 individuals [55.1%] with polypharmacy). Incidence rates of severe ALI ranged from 0 events per 10 000 person-years (candesartan, minocycline) to 86.4 events per 10 000 person-years (stavudine). Seven medications (stavudine, erlotinib, lenalidomide or thalidomide, chlorpromazine, metronidazole, prochlorperazine, and isoniazid) exhibited rates of 10.0 or more events per 10 000 person-years, and 10 (moxifloxacin, azathioprine, levofloxacin, clarithromycin, ketoconazole, fluconazole, captopril, amoxicillin-clavulanate, sulfamethoxazole-trimethoprim, and ciprofloxacin) had rates between 5.0 and 9.9 events per 10 000 person-years. Of these 17 medications with the highest observed rates of severe ALI, 11 (64%) were not included in the highest hepatotoxicity category when based on case reports. Conclusions and Relevance: In this study, incidence rates of severe ALI using real-world data identified the most potentially hepatotoxic medications and can serve as a tool to investigate hepatotoxicity safety signals obtained from case reports. Case report counts did not accurately reflect the observed rates of severe ALI after medication initiation.
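
The case definition and person-time rates above translate directly into code. The sketch below applies the severe-ALI laboratory criteria and computes a rate per 10,000 person-years on an invented toy cohort; it omits the censoring rules and age/sex adjustment used in the study.

```python
def severe_ali(alt_u_l, bilirubin_mg_dl, inr):
    """Severe ALI: (ALT > 120 U/L or INR >= 1.5) together with total bilirubin > 2.0 mg/dL."""
    return bilirubin_mg_dl > 2.0 and (alt_u_l > 120 or inr >= 1.5)

def rate_per_10k_person_years(events, person_years):
    return 10000.0 * events / person_years

cohort = [  # (ALT, bilirubin, INR, follow-up in person-years); invented toy values
    (310.0, 3.4, 1.2, 2.1),
    (45.0, 0.8, 1.0, 6.5),
    (95.0, 2.6, 1.7, 1.3),
]
events = sum(severe_ali(alt, bili, inr) for alt, bili, inr, _ in cohort)
pt = sum(py for *_, py in cohort)
print(f"{events} events over {pt:.1f} person-years "
      f"-> {rate_per_10k_person_years(events, pt):.0f} per 10,000 person-years")
```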

14.
BMC Plant Biol ; 24(1): 476, 2024 May 30.
Article in English | MEDLINE | ID: mdl-38816799

ABSTRACT

BACKGROUND: Interest in the evolution of climatic niches, particularly in understanding the potential adaptive responses of species under climate change, has increased both theoretically and within macroecological studies. These studies have provided valuable insights into how the climatic traits of species influence their niche evolution. In this study, we investigate whether niche conservatism plays a role in species diversification of Nymphaea, a group of aquatic plants with a cosmopolitan distribution that is facing severe habitat loss. We used climatic models and phylogenetic data for 23 species to reconstruct niche evolution in Nymphaea, measure niche overlap, and assess disparity through time while testing evolutionary models. RESULTS: There was substantial niche overlap both within and between groups, especially for widely distributed species. The breadth and peaks of the niche profiles varied among bioclimatic variables, suggesting that species evolved in different ways to cope with changes in climate. The analysis also showed that evolutionary changes occurred across the phylogeny, with weak to moderate phylogenetic signal. The morphological disparity index (MDI) values indicated disparities within subclades over time, but not between or among them. Niche reconstruction and evolution analysis revealed both convergent and divergent evolution among various variables. For example, N. immutabilis, N. atrans, N. violacea, and N. nouchali evolved towards intermediate temperatures for bio2 and bio3 (mean diurnal range and isothermality) while moving towards extreme temperatures for bio8 and bio9 (mean temperatures of the wettest and driest quarters). CONCLUSION: Our study improves our understanding of how changes in climatic niches potentially drive the evolution of Nymphaea. It has significant implications for the limits, assemblages, evolution, and diversification of species. This information is crucial for ongoing conservation and management efforts, particularly considering the inevitable effects of climate change.
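
Niche overlap in studies like this is often summarized with Schoener's D. The sketch below computes D along a single bioclimatic axis using invented density values; it illustrates the metric rather than reproducing the study's climatic models.

```python
import numpy as np

def schoeners_d(p1: np.ndarray, p2: np.ndarray) -> float:
    """D = 1 - 0.5 * sum(|p1 - p2|), with each density vector normalised to sum to 1."""
    p1 = p1 / p1.sum()
    p2 = p2 / p2.sum()
    return 1.0 - 0.5 * np.abs(p1 - p2).sum()

# Toy occupancy densities across 6 bins of a temperature gradient (e.g., bio8):
species_a = np.array([0.05, 0.20, 0.40, 0.25, 0.08, 0.02])
species_b = np.array([0.01, 0.10, 0.30, 0.35, 0.18, 0.06])
print(f"Schoener's D = {schoeners_d(species_a, species_b):.2f}")  # 1 = identical niches, 0 = none
```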


Subjects
Biological Evolution , Climate , Ecosystem , Phylogeny , South America , Australia , Africa , Climate Change
15.
J Vasc Surg ; 2024 May 18.
Article in English | MEDLINE | ID: mdl-38768833

ABSTRACT

BACKGROUND: Length of stay (LOS) is a major driver of cost and resource utilization following lower extremity bypass (LEB). However, the variable comorbidity burden and mobility status of LEB patients make implementing enhanced recovery after surgery pathways challenging. The aim of this study was to use a large national database to identify patient factors associated with ultrashort LOS among patients undergoing LEB for peripheral artery disease. METHODS: All patients undergoing LEB for peripheral artery disease in the National Surgical Quality Improvement Program database from 2011 to 2018 were included. Patients were divided into two groups based on the postoperative length of stay: ultrashort (≤2 days) and standard (>2 days). Thirty-day outcomes were compared using descriptive statistics, and multivariable logistic regression was used to identify patient factors associated with ultrashort LOS. RESULTS: Overall, 17,510 patients were identified who underwent LEB, of whom 2,678 patients (15.3%) had an ultrashort postoperative LOS (mean, 1.8 days) and 14,832 patients (84.7%) had a standard LOS (mean, 7.1 days). When compared to patients with a standard LOS, patients with an ultrashort LOS were more likely to be admitted from home (95.9% vs 88.0%; P < .001), undergo elective surgery (86.1% vs 59.1%; P < .001), and be active smokers (52.1% vs 40.4%; P < .001). Patients with an ultrashort LOS were also more likely to have claudication as the indication for LEB (53.1% vs 22.5%; P < .001), have a popliteal revascularization target rather than a tibial/pedal target (76.7% vs 55.3%; P < .001), and have a prosthetic conduit (40.0% vs 29.9%; P < .001). There was no significant difference in mortality between the two groups (1.4% vs 1.8%; P = .21); however, patients with an ultrashort LOS had a lower frequency of unplanned readmission (10.7% vs 18.8%; P < .001) and need for major reintervention (1.9% vs 5.6%; P < .001). On multivariable analysis, elective status (odds ratio [OR], 2.66; 95% confidence interval [CI], 2.33-3.04), active smoking (OR, 1.18; 95% CI, 1.07-1.30), and lack of vein harvest (OR, 1.55; 95% CI, 1.41-1.70) were associated with ultrashort LOS. Presence of rest pain (OR, 0.57; 95% CI, 0.51-0.63), tissue loss (OR, 0.30; 95% CI, 0.27-0.34), and totally dependent functional status (OR, 0.54; 95% CI, 0.35-0.84) were negatively associated with ultrashort LOS. When examining the subgroup of patients who underwent vein harvest, totally dependent (OR, 0.38; 95% CI, 0.19-0.75) and partially dependent (OR, 0.53; 95% CI, 0.32-0.88) functional status remained negatively associated with ultrashort LOS. CONCLUSIONS: Ultrashort LOS (≤2 days) after LEB is uncommon but feasible in select patients. Preoperative functional status and mobility are important factors to consider when identifying LEB patients who may be candidates for early discharge.
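
A multivariable logistic model with exponentiated coefficients is the workhorse behind the odds ratios above. A minimal sketch follows; the column names (ultrashort, elective, active_smoker, vein_harvest, rest_pain, tissue_loss) are hypothetical stand-ins for NSQIP-derived variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def ultrashort_los_model(df: pd.DataFrame) -> pd.DataFrame:
    """Return odds ratios, 95% CIs, and p-values from a multivariable logistic model."""
    fit = smf.logit(
        "ultrashort ~ elective + active_smoker + vein_harvest + rest_pain + tissue_loss",
        data=df,
    ).fit(disp=False)
    ci = fit.conf_int()
    return pd.DataFrame({
        "OR": np.exp(fit.params),
        "CI_low": np.exp(ci[0]),
        "CI_high": np.exp(ci[1]),
        "p": fit.pvalues,
    })

# odds_ratios = ultrashort_los_model(pd.read_csv("leb_nsqip.csv"))  # placeholder path
# print(odds_ratios.round(2))
```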

16.
Clin Toxicol (Phila) ; 62(5): 314-321, 2024 May.
Article in English | MEDLINE | ID: mdl-38804837

ABSTRACT

INTRODUCTION: North American pit viper envenomation occurs more than 4,000 times annually in the United States, with polyvalent Fab antivenom being the primary treatment. Fasciotomy is occasionally performed because of concerns about compartment syndrome. Using our direct access to Texas Poison Center Network data, we created a new snakebite abstraction form and a database of relevant available information from 2004 to 2021 in order to identify, describe, and estimate the incidence of fasciotomy following pit viper envenomation in Texas. METHODS: We searched the Texas Poison Center Network database for cases during 2004-2021 using keywords such as fasciotomy, surgery, compartment pressure, and compartment syndrome. Descriptive statistics summarized the data. RESULTS: Of 16,911 reported envenomations, 0.69 percent (n = 117) involved fasciotomies. The most common bite sites were the digits/hands and lower extremities. Patients who underwent fasciotomy were typically male, aged 20-59 years, and 10 years younger than the overall snakebite population. Only 6 percent of reported compartment syndrome cases had a compartment pressure measurement. Antivenom was administered in 101 (86.3 percent) cases, 92 (91.1 percent) of which received only a Fab antivenom product. Rattlesnake bites accounted for most fasciotomies (47.9 percent). DISCUSSION: Our findings suggest a potential increase in snakebite exposures accompanied by a decrease in fasciotomies. Overall, copperheads accounted for the majority of snakebites, but most fasciotomies followed rattlesnake envenomations (47.9 percent). In this cohort, compartment syndrome diagnosis and decisions regarding fasciotomy were based primarily on clinical evaluation and surgeon expertise, without compartment pressure measurements. Despite the efficacy of antivenom, only 86.3 percent of patients in our study received it. CONCLUSIONS: Fasciotomy after North American pit viper envenomation in Texas is uncommon (0.69 percent) and has decreased over time, possibly due to increased antivenom use or greater surgeon comfort with nonsurgical management.


Subjects
Antivenins , Compartment Syndromes , Fasciotomy , Snake Bites , Snake Bites/epidemiology , Texas/epidemiology , Humans , Antivenins/therapeutic use , Male , Adult , Animals , Female , Middle Aged , Compartment Syndromes/etiology , Compartment Syndromes/epidemiology , Compartment Syndromes/surgery , Young Adult , Child , Adolescent , Crotalinae , Child, Preschool , Aged , Poison Control Centers/statistics & numerical data , Crotalid Venoms/antagonists & inhibitors , Databases, Factual
19.
Laryngoscope Investig Otolaryngol ; 9(3): e1277, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38803462

ABSTRACT

Background: Chronic rhinosinusitis (CRS) is a heterogeneous disorder with a wide range of validated subjective and objective tools for assessing disease severity. However, a comprehensive and easy-to-use tool that integrates these measures to determine disease severity and response to treatment is still lacking. The objective of this study was to develop a standardized assessment tool that facilitates diagnosis, uniform patient monitoring, and comparison of treatment outcomes between different centers, both in routine clinical practice and in research. Methods: To develop this tool, published literature on assessment tools was searched in various databases. A panel of 12 steering committee members conducted an advisory board meeting to review the findings. Specific outcome measures to be included in a comprehensive assessment tool and follow-up sheet were then collated following consensus approval from the panel. The tool was further validated for content and revised with expert recommendations to arrive at the finalized Nasal Polyp Patient Assessment Scoring Sheet (N-PASS). Results: The N-PASS tool was developed by integrating subjective and objective measures of CRS assessment. Based on expert opinion, N-PASS was revised into an easy-to-use guidance tool that captures patient-reported and physician-assessed components for comprehensively assessing disease status and response to treatment. Conclusion: The N-PASS tool can be used to aid the diagnosis and management of CRS with nasal polyps. The tool would also support improved patient monitoring and pave the way for an international disease registry. Level of evidence: Oxford Level 3.
