Results 1 - 20 of 5,787
1.
J Ethn Subst Abuse ; : 1-16, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38949863

ABSTRACT

In Greenland, where addiction-related concerns significantly affect well-being, research has explored alcohol's impact on health and mortality. However, no studies have focused on mortality among those who received addiction treatment. This study investigates whether individuals treated for addiction in Greenland experience elevated mortality rates compared to the general population. The study encompassed individuals receiving addiction treatment through the national system between 2012 and December 31, 2022. Data on treatment were sourced from the National Addiction Database and Statistics Greenland. Person-years at risk were calculated and used to estimate crude mortality rates (CMRs). Adjusted standardized mortality ratios (SMRs), accounting for age, sex, and calendar year, were estimated using an indirect method based on observed and expected deaths. Of the 3286 individuals in treatment, 53.9% were women, and the median age was 37. About a third had undergone multiple treatment episodes, and 60.1% received treatment in 2019 or later. The cohort was followed for a median of 2.89 years, yielding 12,068 person-years. The overall CMR was 7.79 deaths per 1000 person-years, with an SMR of 1.42 (95% confidence interval: 1.15; 1.74). Notably, SMRs differed by age at treatment entry, with younger groups exhibiting higher SMRs (p value = .021). This study found that individuals seeking treatment for addiction problems in Greenland had a higher mortality rate than the general population. Importantly, these SMRs were substantially lower than those observed in clinical populations in other countries.
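The two mortality measures used in this abstract can be sketched as below. The function names are hypothetical, and the inputs in the usage note are back-calculated approximations from the reported figures (roughly 94 observed vs. 66 expected deaths), not values from the paper:

```python
import math

def crude_mortality_rate(deaths, person_years, per=1000):
    """Crude mortality rate: deaths per `per` person-years of follow-up."""
    return deaths / person_years * per

def smr(observed, expected, z=1.96):
    """Indirectly standardized mortality ratio (observed / expected deaths)
    with an approximate log-normal confidence interval."""
    ratio = observed / expected
    se_log = 1 / math.sqrt(observed)  # approximate SE of log(SMR)
    lo = math.exp(math.log(ratio) - z * se_log)
    hi = math.exp(math.log(ratio) + z * se_log)
    return ratio, lo, hi
```

Under those assumed inputs, `crude_mortality_rate(94, 12068)` gives about 7.79 per 1000 person-years and `smr(94, 66.2)` gives about (1.42, 1.16, 1.74), close to the abstract's headline figures.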

2.
BMC Emerg Med ; 24(1): 110, 2024 Jul 09.
Article in English | MEDLINE | ID: mdl-38982351

ABSTRACT

BACKGROUND: Substance misuse poses a significant public health challenge, characterized by premature morbidity and mortality, and heightened healthcare utilization. While studies have demonstrated that previous hospitalizations and emergency department visits are associated with increased mortality in patients with substance misuse, it is unknown whether prior utilization of emergency medical service (EMS) is similarly associated with poor outcomes among this population. The objective of this study is to determine the association between EMS utilization in the 30 days before a hospitalization or emergency department visit and in-hospital outcomes among patients with substance misuse. METHODS: We conducted a retrospective analysis of adult emergency department visits and hospitalizations (referred to as a hospital encounter) between 2017 and 2021 within the Substance Misuse Data Commons, which maintains electronic health records from substance misuse patients seen at two University of Wisconsin hospitals, linked with state agency, claims, and socioeconomic datasets. Using regression models, we examined the association between EMS use and the outcomes of in-hospital death, hospital length of stay, intensive care unit (ICU) admission, and critical illness events, defined by invasive mechanical ventilation or vasoactive drug administration. Models were adjusted for age, comorbidities, initial severity of illness, substance misuse type, and socioeconomic status. RESULTS: Among 19,402 encounters, individuals with substance misuse who had at least one EMS incident within 30 days of a hospital encounter experienced a higher likelihood of in-hospital mortality (OR 1.52, 95% CI [1.05 - 2.14]) compared to those without prior EMS use, after adjusting for confounders. Using EMS in the 30 days prior to an encounter was associated with a small increase in hospital length of stay but was not associated with ICU admission or critical illness events. 
CONCLUSIONS: Individuals with substance misuse who have used EMS in the month preceding a hospital encounter are at an increased risk of in-hospital mortality. Enhanced monitoring of EMS users in this population could improve overall patient outcomes.


Subject(s)
Emergency Medical Services , Hospital Mortality , Substance-Related Disorders , Humans , Retrospective Studies , Male , Female , Middle Aged , Adult , Risk Factors , Emergency Medical Services/statistics & numerical data , Wisconsin/epidemiology , Length of Stay/statistics & numerical data , Aged
3.
Brief Bioinform ; 25(4)2024 May 23.
Article in English | MEDLINE | ID: mdl-38980374

ABSTRACT

Gene-environment (GE) interactions are essential to understanding human complex traits, and identifying them is necessary for deciphering the biological basis of such traits. In this study, we review state-of-the-art methods for estimating the proportion of phenotypic variance explained by genome-wide GE interactions and introduce a novel statistical method, Linkage-Disequilibrium Eigenvalue Regression for Gene-Environment interactions (LDER-GE). LDER-GE improves the accuracy of estimating the phenotypic variance component explained by genome-wide GE interactions using large-scale biobank association summary statistics. LDER-GE leverages the complete linkage disequilibrium (LD) matrix, as opposed to only the diagonal squared LD matrix utilized by LDSC (Linkage Disequilibrium Score regression)-based methods. Our extensive simulation studies demonstrate that LDER-GE performs better than LDSC-based approaches, enhancing statistical efficiency by ~23%, which is equivalent to a sample size increase of around 51%. Additionally, LDER-GE effectively controls the type-I error rate and produces unbiased results. We conducted an analysis using UK Biobank data, comprising 307,259 unrelated European-ancestry subjects and 966,766 variants, across 217 environmental covariate-phenotype (E-Y) pairs. LDER-GE identified 34 significant E-Y pairs, while the LDSC-based method identified only 23, of which 22 overlapped with LDER-GE's findings. Furthermore, we employed LDER-GE to estimate the aggregated variance component attributed to multiple GE interactions, which increased the explained phenotypic variance compared to considering main genetic effects only. Our results underscore the importance of GE interactions for human complex traits.


Subject(s)
Gene-Environment Interaction , Linkage Disequilibrium , Phenotype , Humans , Multifactorial Inheritance , Genome-Wide Association Study/methods , Polymorphism, Single Nucleotide , Models, Genetic
4.
BMC Med Inform Decis Mak ; 24(1): 193, 2024 Jul 09.
Article in English | MEDLINE | ID: mdl-38982481

ABSTRACT

BACKGROUND: Linkage errors, which vary with the level of personally identifiable information used for linkage, can adversely affect the accuracy and reliability of analysis results. This study aimed to identify differences in results according to linkage level, sample size, and analysis method through empirical analysis. METHODS: Two linkage levels were compared: indirectly identifiable information (III) linkage based on name, date of birth, and sex, and directly identifiable information (DII) linkage based on resident registration number. The datasets linked at each level were named databaseIII (DBIII) and databaseDII (DBDII), respectively. Taking the analysis results of the DII-linked dataset as the gold standard, descriptive statistics, group comparisons, incidence estimation, treatment effects, and moderation effects were assessed. RESULTS: The linkage rates for DBDII and DBIII were 71.1% and 99.7%, respectively. For descriptive statistics and group comparison analyses, the difference in effect in most cases ranged from "none" to "very little." For cervical cancer, which had a relatively small sample size, analysis of DBIII underestimated the incidence in the control group and overestimated it in the treatment group (hazard ratio [HR] = 2.62 [95% confidence interval (CI): 1.63-4.23] in DBIII vs. 1.80 [95% CI: 1.18-2.73] in DBDII). For prostate cancer, the treatment effect was over- or underestimated depending on the Surveillance, Epidemiology, and End Results summary stage (HR = 2.27 [95% CI: 1.91-2.70] in DBIII vs. 1.92 [95% CI: 1.70-2.17] in DBDII for the localized stage; HR = 1.80 [95% CI: 1.37-2.36] in DBIII vs. 2.05 [95% CI: 1.67-2.52] in DBDII for the regional stage).
CONCLUSIONS: To prevent distortion of analysis results in health and medical research, it is important to check that the patient population and the sample size for each factor of interest (FOI) are sufficient when datasets are linked at the III level. For rare diseases, or when the sample size for an FOI is small, DII linkage is likely to be unavoidable.


Subject(s)
Big Data , Medical Record Linkage , Humans , Female , Biomedical Research , Male , Empirical Research
5.
Lancet Reg Health Eur ; 43: 100960, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38975590

ABSTRACT

Background: Guidelines recommend high-sensitivity cardiac troponin to risk stratify patients with possible myocardial infarction and identify those eligible for discharge. Our aim was to evaluate adoption of this approach in practice and to determine whether effectiveness and safety varies by age, sex, ethnicity, or socioeconomic deprivation status. Methods: A multi-centre cohort study was conducted in 13 hospitals across the United Kingdom from November 1st, 2021, to October 31st, 2022. Routinely collected data including high-sensitivity cardiac troponin I or T measurements were linked to outcomes. The primary effectiveness and safety outcomes were the proportion discharged from the Emergency Department, and the proportion dead or with a subsequent myocardial infarction at 30 days, respectively. Patients were stratified using peak troponin concentration as low (<5 ng/L), intermediate (5 ng/L to sex-specific 99th percentile), or high-risk (>sex-specific 99th percentile). Findings: In total 137,881 patients (49% [67,709/137,881] female) were included of whom 60,707 (44%), 42,727 (31%), and 34,447 (25%) were stratified as low-, intermediate- and high-risk, respectively. Overall, 65.8% (39,918/60,707) of low-risk patients were discharged from the Emergency Department, but this varied from 26.8% [2200/8216] to 93.5% [918/982] by site. The safety outcome occurred in 0.5% (277/60,707) and 11.4% (3917/34,447) of patients classified as low- or high-risk, of whom 0.03% (18/60,707) and 1% (304/34,447) had a subsequent myocardial infarction at 30 days, respectively. 
A similar proportion of male and female patients were discharged (52% [36,838/70,759] versus 54% [36,113/67,109]), but discharge was more likely if patients were <70 years old (61% [58,533/95,227] versus 34% [14,428/42,654]), from areas of low socioeconomic deprivation (48% [6697/14,087] versus 43% [12,090/28,116]), or were Black or Asian compared with Caucasian (62% [5458/8877] and 55% [10,026/18,231] versus 46% [35,138/75,820]). Interpretation: Despite high-sensitivity cardiac troponin correctly identifying half of all patients with possible myocardial infarction as being at low risk, only two-thirds of these patients were discharged. Substantial variation in the discharge of patients by age, ethnicity, socioeconomic deprivation, and site was observed, identifying important opportunities to improve care. Funding: UK Research and Innovation.
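The three-tier stratification rule from the Methods can be written down directly. The tier boundaries follow the abstract, but the numeric 99th-percentile values below are illustrative assumptions, since these are assay-dependent:

```python
# Illustrative sex-specific 99th-percentile thresholds (ng/L); the exact
# values depend on the troponin assay and are assumptions here.
P99 = {"female": 16, "male": 34}

def stratify(peak_troponin_ng_l, sex):
    """Risk-stratify a patient by peak high-sensitivity cardiac troponin:
    low (<5 ng/L), intermediate (5 to sex-specific 99th percentile),
    or high (above the sex-specific 99th percentile)."""
    if peak_troponin_ng_l < 5:
        return "low"
    if peak_troponin_ng_l <= P99[sex]:
        return "intermediate"
    return "high"
```

For example, `stratify(3, "female")` returns `"low"`, while `stratify(40, "male")` returns `"high"` under the assumed percentile values.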

6.
Small ; : e2403394, 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38958093

ABSTRACT

The rapid growth of the Internet of Things (IoT) in recent years has increased demand for sensors that collect a wide range of data, and demand has grown notably for force sensors that can recognize physical phenomena in 3D space. Recent research has focused on developing energy harvesting methods for sensors to address their maintenance problems. Triboelectric nanogenerator (TENG)-based force sensors are a promising solution for converting external motion into electrical signals. However, conventional TENG-based force sensors that use the signal peak can suffer from reduced data accuracy. In this study, a Scott-Russell linkage-inspired TENG (SRI-TENG) is developed. The SRI-TENG has completely separate signal generation and measurement sections, and the number of peaks in the electrical output is counted, making the measurement robust to disturbances in the output signal. In addition, a lubricant liquid enhances durability, enabling stable force signal measurements over 270,000 cycles. The SRI system demonstrates consistent peak counts and high accuracy across different contacting surfaces, indicating that it can function as a contact-material-independent self-powered force sensor. Furthermore, using a deep learning method, it is demonstrated that the sensor can function as a multimodal sensor by recognizing the tactile properties of various materials.

7.
Mol Biol Evol ; 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38958167

ABSTRACT

Admixture between populations and species is common in nature. Since the influx of new genetic material might be either facilitated or hindered by selection, variation in mixture proportions along the genome is expected in organisms undergoing recombination. Various graph-based models have been developed to better understand these evolutionary dynamics of population splits and mixtures. However, current models assume a single mixture rate for the entire genome and do not explicitly account for linkage. Here, we introduce TreeSwirl, a novel method for inferring branch lengths and locus-specific mixture proportions by using genome-wide allele frequency data, assuming that the admixture graph is known or has been inferred. TreeSwirl builds upon TreeMix that uses Gaussian processes to estimate the presence of gene flow between diverged populations. However, in contrast to TreeMix, our model infers locus-specific mixture proportions employing a Hidden Markov Model that accounts for linkage. Through simulated data, we demonstrate that TreeSwirl can accurately estimate locus-specific mixture proportions and handle complex demographic scenarios. It also outperforms related D- and f-statistics in terms of accuracy and sensitivity to detect introgressed loci.

8.
Front Immunol ; 15: 1352789, 2024.
Article in English | MEDLINE | ID: mdl-38966639

ABSTRACT

Introduction: Extracellular ATP (eATP) released from damaged cells activates the P2X7 receptor (P2X7R) ion channel on the surface of surrounding cells, resulting in calcium influx, potassium efflux, and inflammasome activation. Inherited changes in the P2X7R gene (P2RX7) influence eATP-induced responses. Single nucleotide polymorphisms (SNPs) of P2RX7 influence both function and signaling of the receptor, which, in addition to ion flux, includes pathogen control and immunity. Methods: Subjects (n = 105) were admitted to the ICU at the University Hospital Ulm, Germany, between June 2018 and August 2019. Of these, 75 subjects were diagnosed with sepsis, including 24 with septic shock and 42 with pneumonia. Subjects with pneumonia (n = 43) included those without sepsis (n = 1), with sepsis without shock (n = 29), and with pneumonia and septic shock (n = 13). Of the 75 sepsis/septic shock patients, 33 were not diagnosed with pneumonia. Controls (n = 30) were recruited from trauma and surgical patients without sepsis, septic shock, or pneumonia. Frequencies were determined for 16 P2RX7 SNPs known to affect P2X7R function, and association studies compared the frequencies of these SNPs in sepsis, septic shock, and pneumonia against controls. Results: The loss-of-function (LOF) SNP rs17525809 (T253C) was found more frequently in patients with septic shock and in non-septic trauma patients than in patients with sepsis. The LOF SNP rs2230911 (C1096G) was more frequent in patients with sepsis and septic shock than in non-septic trauma patients. The frequencies of these SNPs were even higher in sepsis and septic shock patients with pneumonia. The current study also confirmed a previous finding by our group that a five-SNP combination (designated #21211), which includes the gain-of-function (GOF) SNPs rs208294 (C489T) and rs2230912 (Q460R), was associated with increased odds of survival in severe sepsis.
Discussion: The results show an association between LOF P2RX7 SNPs and presentation to the ICU with sepsis and septic shock compared with control ICU patients. Furthermore, frequencies of LOF SNPs were higher in sepsis patients with pneumonia than in those without pneumonia. In addition, a five-SNP GOF combination was associated with increased odds of survival in severe sepsis. These results suggest that P2RX7 is required to control infection in pneumonia and that inheritance of LOF variants increases the risk of sepsis associated with pneumonia. This study confirms that P2RX7 genotyping in pneumonia may identify patients at risk of developing sepsis. The study also identifies P2X7R as a target in sepsis associated with an excessive immune response in subjects with GOF SNP combinations.


Subject(s)
Pneumonia , Polymorphism, Single Nucleotide , Receptors, Purinergic P2X7 , Sepsis , Shock, Septic , Humans , Receptors, Purinergic P2X7/genetics , Male , Female , Shock, Septic/genetics , Shock, Septic/mortality , Shock, Septic/immunology , Middle Aged , Pneumonia/genetics , Pneumonia/mortality , Aged , Sepsis/genetics , Sepsis/mortality , Genetic Predisposition to Disease , Adenosine Triphosphate/metabolism , Adult , Aged, 80 and over
9.
Am J Obstet Gynecol ; 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38960017

ABSTRACT

There is an increasing burden of hepatitis C virus (HCV) among persons of reproductive age, including pregnant and breastfeeding women, in many regions worldwide. Routine health services during pregnancy present a critical window of opportunity to diagnose and link women with HCV infection for care and treatment to decrease HCV-related morbidity and early mortality. Effective treatment of HCV infection in women diagnosed during pregnancy also prevents HCV-related adverse events in pregnancy and HCV vertical transmission in future pregnancies. However, linkage to care and treatment for women diagnosed in pregnancy remains insufficient. Currently, there are no best practice recommendations from professional societies to ensure appropriate peripartum linkage to HCV care and treatment. We convened a virtual Community of Practice (CoP) to understand key challenges to the HCV care cascade for women diagnosed with HCV in pregnancy, highlight published models of integrated HCV services for pregnant and postpartum women, and preview upcoming research and programmatic initiatives to improve linkage to HCV care for this population. Four hundred seventy-three participants from 43 countries participated in the CoP, including a diverse range of practitioners from public health, primary care, and clinical specialties. The CoP included panel sessions with representatives from major professional societies in obstetrics/gynecology, maternal fetal medicine, addiction medicine, hepatology, and infectious diseases. From this CoP, we provide a series of best practices to improve linkage to HCV treatment for pregnant and postpartum women, including specific interventions to enhance co-location of services, treatment by non-specialist providers, active engagement and patient navigation, and decreasing time to HCV treatment initiation.
The CoP aims to further support antenatal providers in improving linkage to care by producing and disseminating detailed operational guidance and recommendations and supporting operational research on models for linkage and treatment. Additionally, the CoP may be leveraged to build training materials and toolkits for antenatal providers, convene experts to formalize operational recommendations, and conduct surveys to understand needs of antenatal providers. Such actions are required to ensure equitable access to HCV treatment for women diagnosed with HCV in pregnancy and urgently needed to achieve the ambitious targets for HCV elimination by 2030.

10.
Trop Med Int Health ; 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38961819

ABSTRACT

OBJECTIVES: The objectives of this study were to assess the outcomes of children, adolescents and young adults with HIV reported as lost to follow-up (LTFU), and to correct mortality estimates for unascertained outcomes among those LTFU using tracing and linkage data separately, based on data from the International epidemiology Databases to Evaluate AIDS in Southern Africa. METHODS: We included data from two populations of children, adolescents and young adults with HIV: (1) clinical data from those aged ≤24 years from Lesotho, Malawi, Mozambique, Zambia and Zimbabwe; and (2) clinical data from those aged ≤14 years from the Western Cape (WC) in South Africa. Outcomes of patients LTFU were available from (1) a tracing study and (2) linkage to a health information exchange. For both populations, we compared six methods for correcting mortality estimates for all children, adolescents and young adults with HIV. RESULTS: Mortality estimates varied substantially between those reported as LTFU and those retained in care. Ascertained mortality was higher among lost and traceable patients, and lower among lost and linkable patients, than among those retained in care (mortality: 13.4% [traced] vs. 12.6% [retained, other Southern African countries]; 3.4% [linked] vs. 9.4% [retained, WC]). A high proportion of LTFU patients had self-transferred (21.0% and 47.0% in the traced and linked samples, respectively). The uncorrected method of non-informative censoring yielded the lowest mortality estimates of all methods at 2 years from ART start, for both the tracing (6.0%) and linkage (4.0%) approaches.
Among corrected methods using ascertained data, multiple imputation incorporating ascertained data (MI(asc.)) and inverse probability weighting with logistic weights were most robust for the tracing approach, whereas MI(asc.) was the most robust for the linkage approach. CONCLUSIONS: Our findings emphasise that loss to follow-up is non-ignorable and that both tracing and linkage improved outcome ascertainment: tracing identified substantial mortality among those reported as LTFU, whereas linkage did not identify out-of-facility deaths but showed that a large proportion of those reported as LTFU were self-transfers.
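As a minimal illustration of the correction idea, not the study's actual estimators, deaths observed in a traced subsample of the LTFU group can be weighted up to the whole LTFU group in inverse-probability-weighting style; all numbers and names here are hypothetical:

```python
def corrected_mortality(retained_deaths, retained_n,
                        ltfu_n, traced_deaths, traced_n):
    """Weight deaths seen in a traced subsample of the LTFU group up to
    the full LTFU group (weight = inverse probability of being traced),
    then combine with deaths among those retained in care."""
    w = ltfu_n / traced_n  # each traced person stands in for w LTFU patients
    total_deaths = retained_deaths + traced_deaths * w
    return total_deaths / (retained_n + ltfu_n)
```

With these made-up figures, naive censoring would report 90/1000 = 9.0% mortality, whereas `corrected_mortality(90, 1000, 500, 20, 100)` gives about 12.7% once traced deaths are weighted up, mirroring how tracing raised ascertained mortality above the uncorrected estimate.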

11.
medRxiv ; 2024 Jun 20.
Article in English | MEDLINE | ID: mdl-38946964

ABSTRACT

Background: The use of big data and large language models in healthcare can play a key role in improving patient treatment and healthcare management, especially when applied to large-scale administrative data. A major challenge is ensuring that patient confidentiality and personal information are protected. One way to overcome this is by augmenting clinical data with administrative laboratory dataset linkages that avoid the use of demographic information. Methods: We explored an alternative method of examining patient files from a large administrative dataset in South Africa (the National Health Laboratory Services, or NHLS) by linking external data to the NHLS database using the specimen barcodes associated with laboratory tests. This offers a deterministic way of performing data linkages without accessing demographic information. In this paper, we quantify the performance of this approach. Results: Linkage of the large NHLS dataset to external hospital data using specimen barcodes achieved a 95% success rate. Of the 1200 records in the validation sample, 87% were exact matches and 9% were matches after typographic correction. The remaining 5% were either complete mismatches or were due to duplicates in the administrative data. Conclusions: The high success rate indicates the reliability of using barcodes to link data without demographic identifiers. Specimen barcodes are an effective tool for deterministic linkage in health data and may provide a way to create large linked datasets without compromising patient confidentiality.
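A sketch of the matching logic implied by the Results (exact match first, then a single-typo correction). The data layout and the one-edit threshold are assumptions, not details from the paper:

```python
def edit_distance_le1(a, b):
    """True if strings a and b differ by at most one substitution,
    insertion, or deletion (a cheap typographic-error check)."""
    if a == b:
        return True
    if abs(len(a) - len(b)) > 1:
        return False
    if len(a) == len(b):  # one substitution allowed
        return sum(x != y for x, y in zip(a, b)) == 1
    if len(a) > len(b):
        a, b = b, a
    # one insertion: deleting some char of the longer string must yield the shorter
    return any(b[:i] + b[i + 1:] == a for i in range(len(b)))

def link(external_barcodes, admin_barcodes):
    """Classify each external barcode as an exact match, a typo-corrected
    match (unique candidate within one edit), or a mismatch."""
    results = {}
    for code in external_barcodes:
        if code in admin_barcodes:
            results[code] = "exact"
        else:
            candidates = [b for b in admin_barcodes if edit_distance_le1(code, b)]
            results[code] = "typo-corrected" if len(candidates) == 1 else "mismatch"
    return results
```

Requiring a unique one-edit candidate keeps the linkage deterministic: an ambiguous or distant barcode falls through to "mismatch" rather than being guessed.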

12.
Skin Res Technol ; 30(7): e13840, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38965811

ABSTRACT

BACKGROUND: Psoriasis is a chronic inflammatory disease that causes significant disability, yet little is known about its underlying metabolic mechanisms. Our study aims to investigate the causality of 975 blood metabolites with the risk of psoriasis. MATERIALS AND METHODS: We applied genetic analysis to explore possible associations between 975 blood metabolites and psoriasis. The inverse variance weighted (IVW) method was used as the primary analysis to assess possible associations of blood metabolites with psoriasis, and generalized summary-data-based Mendelian randomization (GSMR) was used as a supplementary analysis. In addition, linkage disequilibrium score regression (LDSC) was used to further investigate their genetic correlation. Metabolic pathway analysis of the most suggestive metabolites was also performed using MetaboAnalyst 5.0. RESULTS: In our primary analysis, 17 metabolites, including unsaturated fatty acid, phospholipid, and triglyceride traits, were selected as potential factors in psoriasis, with odds ratios (ORs) ranging from 0.986 to 1.01. The GSMR method confirmed these results (β = 0.001, p < 0.05). LDSC analysis suggested genetic correlations (rg) between psoriasis and the selected metabolites ranging from 0.088 to 0.155. Based on the selected metabolites, metabolic pathway analysis suggested seven metabolic pathways, including ketone body metabolism, that may be prominent pathways for metabolites in psoriasis. CONCLUSION: Our study supports a causal role of unsaturated fatty acid properties and lipid traits in psoriasis. These properties may be regulated by the ketone body metabolic pathway.
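The fixed-effect IVW estimator named as the primary method combines per-variant summary statistics as beta = Σ w_j βX_j βY_j / Σ w_j βX_j², with weights w_j = 1/se(βY_j)². A minimal sketch; the summary statistics in the usage note are hypothetical:

```python
import math

def ivw_estimate(beta_exp, beta_out, se_out):
    """Fixed-effect inverse variance weighted causal estimate from
    per-variant exposure effects, outcome effects, and outcome SEs."""
    num = den = 0.0
    for bx, by, se in zip(beta_exp, beta_out, se_out):
        w = 1.0 / se ** 2       # inverse-variance weight
        num += w * bx * by
        den += w * bx ** 2
    beta = num / den            # pooled causal effect estimate
    se_beta = math.sqrt(1.0 / den)
    return beta, se_beta
```

With a single hypothetical variant (βX = 0.5, βY = 0.1, se = 0.05), the estimate reduces to the Wald ratio 0.1/0.5 = 0.2; adding more concordant variants leaves the estimate unchanged while shrinking its standard error.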


Subject(s)
Mendelian Randomization Analysis , Psoriasis , Psoriasis/blood , Psoriasis/genetics , Psoriasis/metabolism , Humans , Genetic Predisposition to Disease/genetics , Polymorphism, Single Nucleotide , Linkage Disequilibrium , Metabolome/physiology , Metabolome/genetics , Metabolic Networks and Pathways/genetics
13.
Drug Des Devel Ther ; 18: 2249-2256, 2024.
Article in English | MEDLINE | ID: mdl-38895174

ABSTRACT

Objective: Considerable research worldwide has sought to improve the osseointegration of dental implants. In this study, we investigated the effect of local zoledronic acid application on implants with machined (MAC), resorbable blast material (RBM), and sandblasted and acid-etched (SLA) surfaces integrated into rat tibias. Methodology: A total of 60 female Wistar rats weighing between 270 and 300 g were used. The rats were divided into six groups: controls, MAC (n = 10), RBM (n = 10), and SLA (n = 10); and local zoledronic acid (LZA) groups, LZA-MAC (n = 10), LZA-RBM (n = 10), and LZA-SLA (n = 10). Implants were surgically placed into the rat tibias under general anesthesia. After a four-week experimental period, the biomechanical bone-implant connection was determined by reverse torque analysis. Results: Osseointegration was higher for SLA- and RBM-surfaced implants than for machined-surfaced implants in both the control and treatment groups (p < 0.05). Additionally, local application of zoledronic acid increased biomechanical osseointegration in all three surface groups compared with controls (p < 0.05). Conclusion: In this study, we observed that local application of zoledronic acid can increase osseointegration, and that RBM and SLA surfaces may outperform machined surfaces in terms of bone-implant connection. In addition, local application of zoledronic acid may be a safer method than systemic application.


Subject(s)
Dental Implants , Osseointegration , Rats, Wistar , Zoledronic Acid , Animals , Zoledronic Acid/pharmacology , Zoledronic Acid/administration & dosage , Osseointegration/drug effects , Rats , Female , Surface Properties , Tibia/drug effects , Tibia/surgery , Bone Density Conservation Agents/pharmacology , Bone Density Conservation Agents/administration & dosage
14.
HLA ; 103(6): e15542, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38887889

ABSTRACT

To analyse the effect of HLA-DPA1 and HLA-DPB1 allelic mismatches on the outcomes of unrelated donor haematopoietic stem cell transplantation (URD-HSCT), we studied 258 recipients with haematological disease who underwent HLA-10/10 matched URD-HSCT. HLA-A, -B, -C, -DRB1, -DQB1, -DRB3/4/5, -DQA1, -DPA1 and -DPB1 typing was performed for donors and recipients using next-generation sequencing (NGS). After excluding 8 cases with DQA1 or DRB3/4/5 mismatches, we included 250 cases with HLA-14/14 matching for further analysis. Only 10.4% (26/250) of cases were matched at both DPA1 and DPB1; the remaining 89.6% of donor-recipient pairs had a DPA1 or DPB1 mismatch. In the DPA1-matched, DPB1-mismatched group, accounting for 18.8% (47/250) of the cohort, DPB1*02:01/DPB1*03:01 allelic mismatches were associated with decreased 2-year overall survival (OS) and increased non-relapse mortality (NRM), whereas DPB1*02:02/DPB1*05:01 and DPB1*02:01/DPB1*05:01 mismatches showed no impact on outcomes. Moreover, the specific allelic mismatches observed were consistent with the DPB1 T-cell epitope (TCE) classification of permissive and non-permissive mismatches. We established a novel analysis method for DPA1 ~ DPB1 linkage mismatch in cases with both DPA1 and DPB1 mismatched, accounting for 70% (175/250) of the total. DPA1*02:02 ~ DPB1*05:01/DPA1*02:01 ~ DPB1*17:01 linkage mismatches were associated with lower 2-year OS, especially among AML/MDS recipients, whereas DPA1*02:02 ~ DPB1*05:01/DPA1*01:03 ~ DPB1*02:01 linkage mismatches showed no impact on outcomes. In conclusion, the DPA1 ~ DPB1 linkage mismatch analysis approach can identify mismatch types that affect transplant outcomes and provide valuable insight for selecting optimal donors for AML/MDS and ALL recipients.


Subject(s)
Alleles , HLA-DP alpha-Chains , HLA-DP beta-Chains , Hematopoietic Stem Cell Transplantation , Histocompatibility Testing , Unrelated Donors , Humans , HLA-DP beta-Chains/genetics , Hematopoietic Stem Cell Transplantation/methods , HLA-DP alpha-Chains/genetics , Male , Histocompatibility Testing/methods , Female , Adult , Middle Aged , Adolescent , Young Adult , Child , Child, Preschool , Aged , High-Throughput Nucleotide Sequencing/methods , Graft vs Host Disease/genetics , Graft vs Host Disease/immunology
16.
Endocrine ; 2024 Jun 08.
Article in English | MEDLINE | ID: mdl-38851644

ABSTRACT

BACKGROUND: Falls are the most common consequence of low bone mineral density (BMD). However, due to limitations inherent in observational studies, the causal relationship between the two remains unestablished. METHODS: This study utilized Mendelian Randomization (MR) analysis to explore the causal relationship between BMD and the risk of falling, incorporating linkage disequilibrium score (LDSC) regression for genetic correlation assessment. The primary method was inverse-variance weighted (IVW), supplemented with sensitivity analyses and the causal analysis using summary effect estimates (CAUSE) to address heterogeneity and pleiotropy biases. RESULTS: LDSC analysis indicated significant genetic correlations between BMD at various sites and falling risk (rg range: -0.82 to 0.76, all P < 0.05). IVW analysis, with False Discovery Rate (FDR) correction, showed a protective causal effect of total body BMD (OR = 0.85, 95% CI 0.82-0.88, P = 7.63 × 10-17, PFDR = 1.91 × 10-16), femoral neck BMD (OR = 0.81, 95% CI 0.75-0.88, P = 3.33 × 10-7, PFDR = 5.55 × 10-7), lumbar spine BMD (OR = 0.85, 95% CI 0.79-0.91, P = 9.56 × 10-7, PFDR = 1.20 × 10-6), and heel BMD (OR = 0.82, 95% CI 0.79-0.81, P = 1.69 × 10-39, PFDR = 8.45 × 10-39) on falling risk. No causal relationship was found for forearm BMD (OR = 1.02, 95% CI 0.94-1.11, P = 0.64, PFDR = 0.64). Replication datasets and CAUSE analysis provided causal evidence consistent with the main findings. CONCLUSION: The study established a causal relationship between BMD at four different sites and the risk of falling, highlighting potential areas for targeted prevention strategies.

17.
BMC Glob Public Health ; 2(1): 30, 2024.
Article in English | MEDLINE | ID: mdl-38832047

ABSTRACT

Background: Tuberculosis (TB), a leading cause of infectious death, is curable when patients complete a course of multi-drug treatment. Because entry into the TB treatment cascade usually relies on symptomatic individuals seeking care, little is known about linkage to care and completion of treatment in people with subclinical TB identified through community-based screening. Methods: Participants of the Vukuzazi study, a community-based survey that provided TB screening in the rural uMkhanyakude district of KwaZulu-Natal from May 2018 to March 2020, who had a positive sputum test (GeneXpert or Mtb culture; microbiologically-confirmed TB) or a chest x-ray consistent with active TB (radiologically-suggested TB) were referred to the public health system. Telephonic follow-up surveys were conducted from May 2021 to January 2023 to assess linkage to care and treatment status. Linked electronic TB register data were accessed. We analyzed the effect of baseline HIV and symptom status (by WHO 4-symptom screen) on the TB treatment cascade. Results: Seventy percent (122/174) of people with microbiologically-confirmed TB completed the telephonic survey. In this group, 84% (103/122) were asymptomatic and 46% (56/122) were people living with HIV (PLWH). By self-report, 98% (119/122) attended a healthcare facility after screening, 94% (115/122) started TB treatment, and 93% (113/122) completed treatment. Analysis of electronic TB register data confirmed that 67% (116/174) of eligible individuals started TB treatment. Neither symptom status nor HIV status affected linkage to care. Among people with radiologically-suggested TB, 48% (153/318) completed the telephonic survey, of whom 80% (122/153) were asymptomatic and 52% (79/153) were PLWH. By self-report, 75% (114/153) attended a healthcare facility after screening, 16% (24/153) started TB treatment, and 14% (22/153) completed treatment.
Nine percent (28/318) of eligible individuals had TB register data confirming that they started treatment. Conclusions: Despite high rates of subclinical TB, most people diagnosed with microbiologically-confirmed TB after community-based screening were willing to link to care and complete TB treatment. Lower rates of linkage to care in people with radiologically-suggested TB highlight the importance of streamlined care pathways for this group. Clearer guidelines for the management of people who screen positive during community-based TB screening are needed. Supplementary Information: The online version contains supplementary material available at 10.1186/s44263-024-00059-0.

18.
Front Public Health ; 12: 1281079, 2024.
Article in English | MEDLINE | ID: mdl-38832223

ABSTRACT

Introduction: Many individuals living with hepatitis C virus (HCV) are unaware of their diagnosis and/or have not been linked to programs providing HCV care. The use of electronic medical record (EMR) systems may assist with HCV infection identification and linkage to care. Methods: In October 2021, we implemented HCV serology-focused best practice alerts (BPAs) at The Ottawa Hospital (TOH) via our EMR (EPIC). Our BPAs were programmed to identify previously tested HCV seropositive individuals. Physicians were prompted to conduct HCV RNA testing and submit consultation requests to the TOH Viral Hepatitis Program. We evaluated data post-BPA implementation to assess the design and related outcomes. Results: From 1 September 2022 to 15 December 2022, a total of 2,029 BPAs were triggered for 139 individuals. As a consequence of the BPA prompts, nine HCV seropositive and nine HCV RNA-positive individuals were linked to care. The proportion of total consultations coming from TOH physicians increased post-BPA implementation. The BPA alerts were frequently declined, and physician engagement with our BPAs varied across specialty groups. Programming issues led to unnecessary BPA prompts (e.g., prompts lacked a hard stop and continued firing even after an individual had been treated and cured, and individuals were linked to care without first undergoing HCV RNA testing). A fixed 6-month lookback period for test results limited our ability to identify many individuals. Conclusion: An EMR-based BPA can assist with the identification and engagement of HCV-infected individuals in care. However, challenges including issues with programming, time commitment toward BPA configuration, productive communication between healthcare providers and the programming team, and physician responsiveness to the BPAs require attention to optimize the impact of BPAs.
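The suppression logic the authors describe as missing (a hard stop after cure, and the fixed lookback window) can be sketched as a simple decision function. All field names, the 183-day window, and the rule set are illustrative assumptions, not the EPIC implementation.

```python
from datetime import date, timedelta

LOOKBACK = timedelta(days=183)  # ~6-month lookback window, as described in the abstract

def should_fire_bpa(seropositive_date, rna_result, cured, today):
    """Decide whether an HCV best-practice alert should prompt the physician.

    Hard stops: never fire for patients already treated and cured, or
    patients whose RNA test has already come back negative.
    """
    if cured:
        return False                      # hard stop: treated and cured
    if rna_result == "negative":
        return False                      # active infection already ruled out
    if seropositive_date is None:
        return False                      # no positive serology on record
    if today - seropositive_date > LOOKBACK:
        # A fixed lookback window misses older positives -- one of the
        # limitations the abstract highlights.
        return False
    return True

print(should_fire_bpa(date(2022, 10, 1), None, False, date(2022, 12, 15)))  # True
print(should_fire_bpa(date(2022, 10, 1), None, True, date(2022, 12, 15)))   # False
```

The last branch makes the abstract's limitation concrete: a seropositive result older than the window never triggers the alert, however relevant it may be.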


Subject(s)
Electronic Health Records , Hepacivirus , Hepatitis C , Humans , Hepatitis C/diagnosis , Male , Female , Hepacivirus/isolation & purification , Middle Aged , Adult , Practice Guidelines as Topic , Ontario
19.
Bio Protoc ; 14(6): e4955, 2024 Mar 20.
Article in English | MEDLINE | ID: mdl-38835995

ABSTRACT

Estimating the time of the most recent common ancestor (tMRCA) is important for tracing the origin of pathogenic viruses. This analysis is based on the genetic diversity accumulated over a certain time period. Thousands of mutation sites have arisen in the genomes of SARS-CoV-2 since the COVID-19 pandemic started; six highly linked mutation sites that occurred early in the pandemic can be used to classify the genomes into three main haplotypes. Tracing the origin of those three haplotypes may help to understand the origin of SARS-CoV-2. In this article, we present a complete protocol for classifying SARS-CoV-2 genomes and calculating tMRCA using a Bayesian phylodynamic method. This protocol may also be used in the analysis of other viral genomes. Key features • Filtering and alignment of a massive number of viral genomes using custom scripts and ViralMSA. • Classification of genomes based on highly linked sites using custom scripts. • Phylodynamic analysis of viral genomes using Bayesian evolutionary analysis sampling trees (BEAST). • Visualization of the posterior distribution of tMRCA using Tracer v1.7.2. • Optimized for SARS-CoV-2.
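The classification step (assigning each aligned genome to a haplotype by the bases at a set of linked sites) can be sketched as below. The alignment positions, alleles, and haplotype labels are placeholders, not the six linked sites or three haplotypes from the protocol.

```python
# Classify aligned SARS-CoV-2 genomes into haplotypes by the bases found at
# a set of linked sites. Positions and alleles are illustrative placeholders.
LINKED_SITES = [100, 2500, 11000]  # 0-based positions in the alignment (hypothetical)
HAPLOTYPES = {
    ("C", "T", "G"): "H1",
    ("T", "T", "G"): "H2",
    ("T", "C", "A"): "H3",
}

def classify(aligned_seq):
    """Return the haplotype label for one aligned genome, or None if the
    bases at the linked sites match no known haplotype."""
    key = tuple(aligned_seq[pos] for pos in LINKED_SITES)
    return HAPLOTYPES.get(key)

# A toy aligned sequence carrying the H1 alleles at the three sites
seq = ["A"] * 11001
seq[100], seq[2500], seq[11000] = "C", "T", "G"
print(classify(seq))  # H1
```

In practice the alignment would come from ViralMSA output, and genomes matching no haplotype would be filtered out before the BEAST run.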

20.
Front Genet ; 15: 1383696, 2024.
Article in English | MEDLINE | ID: mdl-38836040

ABSTRACT

Background: Rheumatoid arthritis (RA) frequently presents with oral manifestations, including gingival inflammation, loose teeth, and mouth ulcers; however, the causal connections between these conditions remain unclear. This study aims to explore the genetic correlations and causal relationships between RA and prevalent oral phenotypes. Methods: Using summary data from genome-wide association studies of European populations, a cross-trait linkage disequilibrium score regression was conducted to estimate the genetic correlations between RA and six oral phenotypes. Subsequently, a two-sample Mendelian randomization (MR) approach was employed to assess the causal relationships, corroborated by various sensitivity analyses. Heterogeneity was addressed through the RadialMR method, while potential covariates were corrected using the multivariable MR approach. Results: A significant negative genetic correlation was detected between RA and denture usage (r_g = -0.192, p = 4.88 × 10⁻⁸). Meanwhile, a heterogeneous causal relationship between RA and mouth ulcers was observed (OR = 1.027 [1.005-1.05], p = 0.016, P_heterogeneity = 4.69 × 10⁻⁸), which remained robust across sensitivity analyses. After excluding outlier variants, the results remained robustly consistent (OR = 1.021 [1.008-1.035], p = 1.99 × 10⁻³, P_heterogeneity = 0.044). However, upon adjusting for covariates such as smoking, alcohol consumption, body mass index, and obesity, the significance diminished, revealing no evidence to support independent genetic associations. Conclusion: Genetically predicted RA increases the risk of mouth ulcers, and a negative genetic correlation is identified between RA and denture use. The observed heterogeneity suggests that shared immunological mechanisms and environmental factors may play significant roles. These findings highlight the importance of targeted dental management strategies for RA patients. Further clinical guidelines are required to improve oral health among vulnerable RA patients.
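Heterogeneity of the kind reported above (P_heterogeneity) is conventionally quantified with Cochran's Q over the variant-level Wald ratios; excluding an outlier variant, as the study does, shrinks Q. A minimal sketch with hypothetical effect sizes:

```python
import numpy as np

def cochran_q(beta_exp, beta_out, se_out):
    """Cochran's Q for heterogeneity across variant-level Wald ratios.

    Q is the weighted sum of squared deviations of each ratio from the
    IVW estimate; under homogeneity it follows chi-square with k-1 df.
    """
    beta_exp = np.asarray(beta_exp, dtype=float)
    beta_out = np.asarray(beta_out, dtype=float)
    se_out = np.asarray(se_out, dtype=float)
    ratios = beta_out / beta_exp
    weights = beta_exp**2 / se_out**2      # inverse-variance weights of the ratios
    beta_ivw = np.sum(weights * ratios) / np.sum(weights)
    q = np.sum(weights * (ratios - beta_ivw) ** 2)
    return q, len(ratios) - 1              # Q statistic and its degrees of freedom

# Homogeneous instruments vs. one outlier (all values hypothetical)
q_clean, df = cochran_q([0.1, 0.1, 0.1], [0.020, 0.021, 0.019], [0.005] * 3)
q_outlier, _ = cochran_q([0.1, 0.1, 0.1], [0.020, 0.021, 0.100], [0.005] * 3)
print(q_clean, q_outlier, df)
```

Q far above its degrees of freedom (as in the outlier case) signals the heterogeneity that motivates outlier-robust methods such as RadialMR.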
