Results 1 - 20 of 277
1.
PLoS One ; 19(9): e0308018, 2024.
Article in English | MEDLINE | ID: mdl-39240838

ABSTRACT

INTRODUCTION: Obstetrics research has predominantly focused on the management and identification of factors associated with labor dystocia. Despite these efforts, clinicians currently lack the tools to effectively predict a woman's risk of experiencing labor dystocia. The objective of this study was therefore to develop a predictive model for labor dystocia. MATERIAL AND METHODS: The study population included nulliparous women with a single fetus in cephalic presentation in spontaneous labor at term. Using a cohort-based registry design with data from the Copenhagen Pregnancy Cohort and the Danish Medical Birth Registry, we included women who had given birth from 2014 to 2020 at Copenhagen University Hospital-Rigshospitalet, Denmark. Logistic regression analysis, augmented by a super learner algorithm, was employed to construct the prediction model, with candidate predictors pre-selected based on clinical reasoning and existing evidence. These predictors included maternal age, pre-pregnancy body mass index, height, gestational age, physical activity, self-reported medical condition, WHO-5 score, and fertility treatment. Model performance was evaluated using the area under the receiver operating characteristic curve (AUC) for discriminative capacity and the Brier score for calibration. RESULTS: A total of 12,445 women were included, of whom 5,525 (44%) experienced labor dystocia. All candidate predictors were retained in the final model, which demonstrated discriminative ability with an AUC of 62.3% (95% CI: 60.7-64.0) and a Brier score of 0.24. CONCLUSIONS: Our model represents an initial advance in the prediction of labor dystocia using readily available information obtainable on admission in active labor. As a next step, further model development and external validation in other populations are warranted. In time, a well-performing model may facilitate risk stratification and the development of a user-friendly online tool for clinicians.
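A minimal sketch of the modelling step described above, assuming a one-row-per-woman data frame with invented file and column names; the authors' super learner layer is omitted and plain logistic regression stands in for it. AUC measures discrimination, the Brier score calibration.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss, roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical admission predictors mirroring those listed in the abstract
predictors = ["maternal_age", "pre_pregnancy_bmi", "height", "gestational_age",
              "physical_activity", "medical_condition", "who5_score",
              "fertility_treatment"]

df = pd.read_csv("cohort.csv")  # assumed file: one row per woman, 0/1 'dystocia'
X_train, X_test, y_train, y_test = train_test_split(
    df[predictors], df["dystocia"], test_size=0.3, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
risk = model.predict_proba(X_test)[:, 1]

print(f"AUC:   {roc_auc_score(y_test, risk):.3f}")    # discrimination
print(f"Brier: {brier_score_loss(y_test, risk):.3f}")  # calibration
```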


Subject(s)
Body Mass Index , Dystocia , Maternal Age , Parity , Humans , Female , Pregnancy , Dystocia/epidemiology , Adult , Risk Factors , Denmark/epidemiology , ROC Curve , Labor Onset , Registries , Gestational Age
2.
Clin Epidemiol ; 16: 631-640, 2024.
Article in English | MEDLINE | ID: mdl-39345298

ABSTRACT

Background: Heart failure (HF) is associated with increased risks of death and hospitalization, but for patients initiating guideline-directed medical therapy, it is unknown how these risks compare with those of the general population, and how they vary with age and comorbidity. Methods: In this retrospective cohort study, we identified patients diagnosed with HF in the period 2011-2017 who survived the initial 120 days after diagnosis. Patients on an angiotensin-converting enzyme inhibitor (ACEi) or angiotensin receptor blocker (ARB) plus a beta-blocker were included, and each was matched on age and sex to 5 non-HF individuals from the background population. We assessed the 5-year risk of all-cause death and of HF and non-HF hospitalization according to sex, age, and baseline comorbidity. Results: We included 35,367 patients with HF and 176,835 matched non-HF individuals. Patients with HF had a five-year excess risk (absolute risk difference) of death of 13% (31% for HF vs 18% for non-HF), of HF hospitalization of 17%, and of non-HF hospitalization of 24%. Excess risk of death increased with age, whereas the relative risk decreased: for women in their twenties, the excess risk was 7% (risk ratio 7.2), while for women in their eighties it was 18% (risk ratio 1.5). A 60-year-old man with HF had a five-year risk of death similar to that of a 75-year-old man without HF. Further, HF was associated with an excess risk of non-HF hospitalization, ranging from 8% for patients >85 years to 30% for patients <30 years. Conclusion: Regardless of age, sex, and comorbidity, HF was associated with excess risks of mortality and non-HF hospitalization, but the risk ratio diminished sharply with advancing age, which may influence allocation of resources for medical care across populations.
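The headline figures are absolute risk differences at five years. A minimal sketch of that contrast from matched data using Kaplan-Meier estimates, assuming a hypothetical data frame; the authors' handling of hospitalization (where death is a competing risk) is not reproduced.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Assumed columns: years (follow-up), died (0/1), hf (1 = HF patient, 0 = matched)
df = pd.read_csv("matched_cohort.csv")

risk = {}
for group, d in df.groupby("hf"):
    km = KaplanMeierFitter().fit(d["years"], d["died"])
    risk[group] = 1 - km.predict(5.0)  # 5-year risk of death = 1 - S(5)

print(f"5-year risk: HF {risk[1]:.1%}, non-HF {risk[0]:.1%}, "
      f"excess risk {risk[1] - risk[0]:.1%}")
```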

3.
Medicine (Baltimore) ; 103(19): e38070, 2024 May 10.
Article in English | MEDLINE | ID: mdl-38728490

ABSTRACT

This study used demographic data in a novel prediction model to identify areas with a high risk of out-of-hospital cardiac arrest (OHCA) in order to target prehospital preparedness. We combined data from the nationwide Danish Cardiac Arrest Registry with geographical and demographic data at the hectare level. Hectares were classified in a hierarchy according to their characteristics and pooled to square kilometers (km2). The historical OHCA incidence of each hectare group was supplemented with a predicted annual risk of at least 1 OHCA to ensure future applicability. We recorded 19,090 valid OHCAs during 2016 to 2019. The mean annual OHCA rate was highest in residential areas with no point of public interest and 100 to 1,000 residents per hectare (9.7/year/km2), followed by pedestrian streets with multiple shops (5.8/year/km2), areas with no point of public interest and 50 to 100 residents (5.5/year/km2), and malls (4.6/year/km2). Other high-incidence areas were public transport stations, schools, and areas without a point of public interest and 10 to 50 residents. These areas combined cover 1,496 km2, corresponding to 3.4% of the total area of Denmark, yet account for 65% of the OHCA incidence. Our prediction model confirms these areas to be high risk and outperforms simple historical incidence in identifying future risk sites. Two thirds of out-of-hospital cardiac arrests occurred in only 3.4% of the area of Denmark. This area was readily identifiable by its residential density or by the presence of airports, malls, pedestrian shopping streets, or schools. This result has important implications for targeted interventions such as publicly available automated external defibrillators. Further, demographic information should be considered when implementing such interventions.
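Under a Poisson assumption, an area's annual risk of at least one OHCA follows directly from its incidence rate, P(N ≥ 1) = 1 − exp(−λ); whether the authors' prediction model used exactly this form is an assumption. Applied to the mean annual incidences quoted above:

```python
import math

# Mean annual OHCA incidence per km^2, from the abstract
annual_rates_per_km2 = {
    "residential, 100-1000 residents/ha": 9.7,
    "pedestrian streets with shops": 5.8,
    "residential, 50-100 residents/ha": 5.5,
    "malls": 4.6,
}

for area, lam in annual_rates_per_km2.items():
    # Poisson: probability of at least one arrest per km^2 in one year
    print(f"{area}: P(>=1 OHCA/year) = {1 - math.exp(-lam):.1%}")
```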


Subject(s)
Out-of-Hospital Cardiac Arrest , Humans , Out-of-Hospital Cardiac Arrest/epidemiology , Male , Female , Denmark/epidemiology , Aged , Middle Aged , Incidence , Registries , Adult , Forecasting , Aged, 80 and over
5.
BMJ ; 385: e078063, 2024 04 15.
Article in English | MEDLINE | ID: mdl-38621801

ABSTRACT

OBJECTIVE: To train and test a super learner strategy for risk prediction of kidney failure and mortality in people with incident moderate to severe chronic kidney disease (stage G3b to G4). DESIGN: Multinational, longitudinal, population based, cohort study. SETTING: Linked population health data from Canada (training and temporal testing), and Denmark and Scotland (geographical testing). PARTICIPANTS: People with newly recorded chronic kidney disease at stage G3b-G4, estimated glomerular filtration rate (eGFR) 15-44 mL/min/1.73 m2. MODELLING: The super learner algorithm selected the best performing regression models or machine learning algorithms (learners) based on their ability to predict kidney failure and mortality with minimised cross-validated prediction error (Brier score, the lower the better). Prespecified predictors included age, sex, eGFR, albuminuria, and the presence or absence of diabetes and cardiovascular disease. The index of prediction accuracy, a measure of calibration and discrimination calculated from the Brier score (the higher the better), was used to compare KDpredict with the benchmark kidney failure risk equation, which does not account for the competing risk of death, and to evaluate the performance of the KDpredict mortality models. RESULTS: 67 942 Canadian, 17 528 Danish, and 7740 Scottish residents with chronic kidney disease at stage G3b to G4 were included (median age 77-80 years; median eGFR 39 mL/min/1.73 m2). Median follow-up times were five to six years in all cohorts. Rates were 0.8-1.1 per 100 person years for kidney failure and 10-12 per 100 person years for death. KDpredict was more accurate than the kidney failure risk equation in predicting kidney failure risk: five year index of prediction accuracy 27.8% (95% confidence interval 25.2% to 30.6%) versus 18.1% (15.7% to 20.4%) in Denmark and 30.5% (27.8% to 33.5%) versus 14.2% (12.0% to 16.5%) in Scotland. Predictions from the kidney failure risk equation and KDpredict differed substantially, potentially leading to diverging treatment decisions. An 80-year-old man with an eGFR of 30 mL/min/1.73 m2 and an albumin-to-creatinine ratio of 100 mg/g (11 mg/mmol) would receive a five year kidney failure risk prediction of 10% from the kidney failure risk equation (above the current nephrology referral threshold of 5%). The same man would receive five year risk predictions of 2% for kidney failure and 57% for mortality from KDpredict. Individual risk predictions from KDpredict with four or six variables were accurate for both outcomes. The KDpredict models retrained using older data provided accurate predictions when tested in temporally distinct, more recent data. CONCLUSIONS: KDpredict could be incorporated into electronic medical records or accessed online to accurately predict the risks of kidney failure and death in people with moderate to severe chronic kidney disease. The KDpredict learning strategy is designed to be adapted to local needs and regularly revised over time to account for changes in the underlying health system and care processes.
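The index of prediction accuracy (IPA) rescales the Brier score against a null model that assigns everyone the average risk, so 0 means no better than the null model. The sketch below shows the binary-outcome core of the calculation on toy data; the paper's version handles censoring and the competing risk of death, which is omitted here.

```python
import numpy as np

def ipa(y, p):
    """Index of prediction accuracy: 1 - Brier(model) / Brier(null)."""
    brier_model = np.mean((y - p) ** 2)
    brier_null = np.mean((y - y.mean()) ** 2)  # null model: overall event rate
    return 1 - brier_model / brier_null

y = np.array([0, 0, 1, 0, 1, 0, 0, 1])                  # toy observed outcomes
p = np.array([0.1, 0.2, 0.8, 0.1, 0.6, 0.3, 0.2, 0.7])  # toy predicted risks
print(f"IPA = {ipa(y, p):.2f}")
```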


Subject(s)
Kidney Failure, Chronic , Renal Insufficiency, Chronic , Renal Insufficiency , Aged , Aged, 80 and over , Humans , Canada , Glomerular Filtration Rate , Renal Insufficiency, Chronic/complications , Renal Insufficiency, Chronic/epidemiology , Denmark , Scotland , Longitudinal Studies
6.
PLoS One ; 19(3): e0297386, 2024.
Article in English | MEDLINE | ID: mdl-38470907

ABSTRACT

BACKGROUND: Prevention and management of childhood overweight involve the entire family. We aimed to investigate purchase patterns in households with at least one member with overweight in childhood by describing expenditure on different food groups. METHODS: This Danish register-based cohort study included households where at least one member donated receipts for consumer purchases in 2019-2021 and at least one member had their body mass index (BMI) measured in childhood within ten years before the first purchase. A probability index model was used to evaluate differences in the proportion of expenditure spent on specific food groups. RESULTS: We identified 737 households that included a member with a childhood BMI measurement, 220 with overweight and 517 with underweight or normal weight (reference households). Adjusting for education, income, family type, and urbanization, households with a member whose childhood BMI was classified as overweight had a statistically significantly higher probability of spending a larger proportion of expenditure on ready meals (56.29%; 95% CI: 51.70-60.78) and sugary drinks (55.98%; 95% CI: 51.63-60.23). Conversely, they had a statistically significantly lower probability of spending a larger proportion of expenditure on vegetables (38.44%; 95% CI: 34.09-42.99) compared with the reference households. CONCLUSION: Households with a member whose BMI was classified as overweight in childhood spent more on unhealthy foods and less on vegetables compared with the reference households. This study highlights the need for household/family-oriented nutrition education and intervention.
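The reported percentages behave like probabilistic indices: the probability that a randomly drawn exposed household spends a larger expenditure share than a randomly drawn reference household (ties counted as half). A minimal unadjusted sketch via the Mann-Whitney U statistic on simulated shares; the paper's model additionally adjusts for education, income, family type, and urbanization.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
share_exposed = rng.beta(2.2, 8.0, size=220)     # toy expenditure shares
share_reference = rng.beta(2.0, 8.0, size=517)

u, pval = mannwhitneyu(share_exposed, share_reference)
# U / (n1 * n2) estimates P(exposed > reference) + 0.5 * P(tie)
prob_index = u / (len(share_exposed) * len(share_reference))
print(f"Probabilistic index = {prob_index:.2%} (p = {pval:.3f})")
```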


Subject(s)
Income , Overweight , Humans , Cohort Studies , Vegetables , Denmark , Consumer Behavior
7.
Acta Psychiatr Scand ; 149(5): 378-388, 2024 05.
Article in English | MEDLINE | ID: mdl-38379028

ABSTRACT

BACKGROUND: Long-term studies comparing non-response to antidepressants for major depressive disorder (MDD) are lacking. AIMS: To present systematic population-based nationwide register data on comparative 2-year non-response within six antidepressant drug classes and 17 different antidepressants in patients with MDD. METHOD: The study included all 106,920 patients in Denmark with a first main index diagnosis of MDD at a psychiatric hospital inpatient or outpatient contact who subsequently purchased an antidepressant in the period from 1995 to 2018. Non-response to the first antidepressant within a 2-year study period was defined as switch to or add-on of another antidepressant, antipsychotic medication, or lithium, or hospitalization. Analyses emulated a target trial in populations standardized according to age, sex, socioeconomic status, and comorbidity with psychiatric and physical disorders. RESULTS: Compared with sertraline, there was no difference for citalopram (RR: 1.00 [95% CI: 0.98-1.02]), but fluoxetine (1.13 [95% CI: 1.10-1.17]), paroxetine (1.06 [95% CI: 1.01-1.10]), and escitalopram (1.22 [95% CI: 1.18-1.25]) were associated with higher risk ratios of non-response. Within selective noradrenaline reuptake inhibitors, sertraline outperformed reboxetine; within serotonin-norepinephrine reuptake inhibitors, venlafaxine outperformed duloxetine; within noradrenergic and specific serotonergic antidepressants, mirtazapine outperformed mianserin; and within the class of other antidepressants, sertraline outperformed agomelatine and vortioxetine. Within tricyclic antidepressants, compared with amitriptyline, nortriptyline, dosulepin, and clomipramine had higher non-response, whereas there was no difference for imipramine. CONCLUSIONS: These analyses emulating a randomized trial on "real world" observational register-based data show that 2-year non-response rates are higher for some antidepressants than for others within six drug classes.


Subject(s)
Depressive Disorder, Major , Humans , Antidepressive Agents/therapeutic use , Depressive Disorder, Major/drug therapy , Depressive Disorder, Major/epidemiology , Fluoxetine/therapeutic use , Selective Serotonin Reuptake Inhibitors , Sertraline/therapeutic use
8.
Eur J Emerg Med ; 31(2): 127-135, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-37788126

ABSTRACT

BACKGROUND AND IMPORTANCE: Telephone calls are often patients' first contact with a healthcare service, but outcomes associated with waiting times are unknown. OBJECTIVES: To examine the association between the waiting time to answer at a medical helpline and 1- and 30-day mortality. DESIGN, SETTING AND PARTICIPANTS: Registry-based cohort study using data on phone calls (January 2014 to December 2018) to the Capital Region of Denmark's medical helpline. The service refers callers to hospital assessment/treatment, dispatches ambulances, or provides self-care guidance. EXPOSURE: Waiting time was grouped into the following intervals in accordance with political service targets for waiting time in the Capital Region: <30 s, 0:30-2:59, 3:00-9:59, and ≥10 min. OUTCOME MEASURES AND ANALYSIS: The association between time intervals and 1- and 30-day mortality per call was estimated using logistic regression with strata defined by age and sex. MAIN RESULTS: In total, 1 244 252 callers were included, who made 3 956 243 calls; 78% of calls were answered within 10 min. Among callers, 30-day mortality was 1% (16 560 deaths). For calls by females aged 85-110, 30-day mortality increased with longer waiting time, particularly within the first minute: 9.6% for waiting times <30 s, 10.8% between 30 s and 1 min, and 9.1% between 1 and 2 min. For calls by males aged 85-110, 30-day mortality was 11.1%, 12.9%, and 11.1%, respectively. Additionally, among calls from patients with a Charlson score of 2 or higher, longer waiting times were likewise associated with increased mortality. For calls by females aged 85-110, 30-day mortality was 11.6% for waiting times <30 s, 12.9% between 30 s and 1 min, and 11.2% between 1 and 2 min. For calls by males aged 85-110, 30-day mortality was 12.7%, 14.1%, and 12.6%, respectively. Fewer ambulances were dispatched as waiting times increased (4% of calls answered within 30 s vs 2% of calls waiting more than 10 min). CONCLUSION: Longer waiting times for telephone contact with a medical helpline were associated with increased 1- and 30-day mortality within the first minute of waiting, especially among elderly callers and those with more comorbidity.


Subject(s)
Triage , Waiting Lists , Aged , Male , Female , Humans , Cohort Studies , Telephone , Registries , Denmark
9.
J Clin Endocrinol Metab ; 109(3): e1029-e1039, 2024 Feb 20.
Article in English | MEDLINE | ID: mdl-37955862

ABSTRACT

CONTEXT: Longitudinal data regarding vitamin D status in adolescence are scarce. This study presents population-based data from an Arctic adolescent population (n = 589) at ages 16 and 18 years. OBJECTIVE: The aims of this study were to investigate changes in vitamin D status over 2 years in adolescence and whether lifestyle changes were associated with serum 25-hydroxyvitamin D (s-25(OH)D) at follow-up. METHODS: Fit Futures is a longitudinal study at 69°N in Norway. Participants had their s-25(OH)D levels analyzed in their first and third year of upper secondary school (median age 16 and 18 years), in Fit Futures 1 (FF1) and Fit Futures 2 (FF2), respectively. Self-reported lifestyle habits were registered through questionnaires. The association between lifestyle changes and s-25(OH)D levels at follow-up was assessed by regression analyses, controlling for baseline s-25(OH)D levels. RESULTS: Longitudinal data were available for 309 girls and 280 boys. The proportion of adolescents with s-25(OH)D <50 nmol/L was 73.7% in FF1 and 77.1% in FF2, while the proportion <30 nmol/L was 35.7% in FF1 and 40.9% in FF2. Of those with s-25(OH)D <30 nmol/L (severe vitamin D deficiency) in FF1, 73.3% remained severely deficient in FF2. Among boys, an increase in UV exposure was significantly associated with higher s-25(OH)D levels in FF2 (beta [nmol/L]: 12.9; CI: 9.1, 16.7). In girls, decreased vitamin/mineral supplement intake was significantly associated with lower s-25(OH)D at FF2 (-6.7; -10.2, -3.1), while increased UV exposure (10.8; 7.0, 14.7) and combined hormonal contraceptive use (12.1; 6.0, 18.1) in FF2 were significantly associated with higher s-25(OH)D levels in FF2. CONCLUSION: Severe vitamin D deficiency was prevalent throughout adolescence. Lifestyle changes may alter s-25(OH)D levels in this age group.


Subject(s)
Vitamin D Deficiency , Vitamin D , Male , Female , Adolescent , Humans , Longitudinal Studies , Follow-Up Studies , Vitamins , Vitamin D Deficiency/epidemiology , Life Style , Seasons
10.
Dent Traumatol ; 40(2): 137-143, 2024 Apr.
Article in English | MEDLINE | ID: mdl-37864425

ABSTRACT

BACKGROUND/AIM: There are few long-term clinical follow-up studies on human teeth replanted immediately or after storage in a suitable medium prior to replantation. This study aimed to assess the risk of ankylosis in avulsed human teeth replanted immediately or after short-term storage in physiologic media. MATERIAL: Data from 116 patients with 145 replanted avulsed permanent teeth were selected from a comprehensive dental trauma database at Copenhagen University Hospital. The following teeth were selected: Group 1 comprised 36 teeth replanted immediately (dry time <6 min; wet time <6 min). Group 2 comprised 61 teeth replanted after storage in physiologic media (saliva or saline) (dry time <6 min; wet time >5 min; wet time ranged from 7 to 170 min, mean 59 min). Group 3 (control) included 48 teeth replanted after dry storage (dry time >60 min). METHOD: Clinical and radiographic registrations were carried out according to a standardized protocol; follow-up ranged from 7 months to 23 years. Ankylosis was diagnosed by percussion test and radiographs and related to the conditions prior to replantation and the stage of root development. RESULTS: The overall risk of ankylosis was 17.2% [95% CI: 4.61; 29.79] for immediately replanted teeth, 55.3% [95% CI: 42.54; 68.00] for teeth stored in physiologic media before replantation, and 85.7% [95% CI: 75.70; 95.73] for teeth stored dry for more than 1 h. Mature teeth showed a significantly higher risk of ankylosis than immature teeth. CONCLUSION: This long-term clinical study confirms earlier experimental studies showing that immediate replantation carries the lowest risk of ankylosis. Physiologic storage media are good alternatives that also reduce the risk of ankylosis compared with dry storage, after which ankylosis is likely although not inevitable. Mature teeth are significantly more likely to develop ankylosis.


Subject(s)
Root Resorption , Tooth Ankylosis , Tooth Avulsion , Humans , Dentition, Permanent , Tooth Ankylosis/etiology , Tooth Replantation/methods
11.
Lifetime Data Anal ; 30(1): 143-180, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37270750

ABSTRACT

In this article we study the effect of a baseline exposure on a terminal time-to-event outcome either directly or mediated by the illness state of a continuous-time illness-death process with baseline covariates. We propose a definition of the corresponding direct and indirect effects using the concept of separable (interventionist) effects (Robins and Richardson in Causality and psychopathology: finding the determinants of disorders and their cures, Oxford University Press, 2011; Robins et al. in arXiv:2008.06019 , 2021; Stensrud et al. in J Am Stat Assoc 117:175-183, 2022). Our proposal generalizes Martinussen and Stensrud (Biometrics 79:127-139, 2023) who consider similar causal estimands for disentangling the causal treatment effects on the event of interest and competing events in the standard continuous-time competing risk model. Unlike natural direct and indirect effects (Robins and Greenland in Epidemiology 3:143-155, 1992; Pearl in Proceedings of the seventeenth conference on uncertainty in artificial intelligence, Morgan Kaufmann, 2001) which are usually defined through manipulations of the mediator independently of the exposure (so-called cross-world interventions), separable direct and indirect effects are defined through interventions on different components of the exposure that exert their effects through distinct causal mechanisms. This approach allows us to define meaningful mediation targets even though the mediating event is truncated by the terminal event. We present the conditions for identifiability, which include some arguably restrictive structural assumptions on the treatment mechanism, and discuss when such assumptions are valid. The identifying functionals are used to construct plug-in estimators for the separable direct and indirect effects. We also present multiply robust and asymptotically efficient estimators based on the efficient influence functions. We verify the theoretical properties of the estimators in a simulation study, and we demonstrate the use of the estimators using data from a Danish registry study.
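In notation of our choosing (the article's precise estimands may differ), the separable-effects idea splits the exposure A into components (A_d, A_i) acting through distinct mechanisms, so that the total effect on the terminal-event risk F_{a_d,a_i}(t) decomposes without any cross-world intervention on the mediator:

```latex
\begin{align*}
  \underbrace{F_{1,1}(t) - F_{0,0}(t)}_{\text{total effect}}
  = \underbrace{F_{1,1}(t) - F_{0,1}(t)}_{\text{separable direct effect}}
  + \underbrace{F_{0,1}(t) - F_{0,0}(t)}_{\text{separable indirect effect}}
\end{align*}
```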


Subject(s)
Artificial Intelligence , Models, Statistical , Humans , Biometry , Causality , Computer Simulation , Mediation Analysis , Survival Analysis
12.
BMJ ; 382: e074450, 2023 09 06.
Article in English | MEDLINE | ID: mdl-37673431

ABSTRACT

OBJECTIVE: To study the influence of concomitant use of hormonal contraception and non-steroidal anti-inflammatory drugs (NSAIDs) on the risk of venous thromboembolism. DESIGN: Nationwide cohort study. SETTING: Denmark, through national registries. PARTICIPANTS: All women aged 15-49 years living in Denmark between 1996 and 2017 with no medical history of any venous or arterial thrombotic event, cancer, thrombophilia, hysterectomy, bilateral oophorectomy, sterilisation, or infertility treatment (n=2 029 065). MAIN OUTCOME MEASURE: A first time discharge diagnosis of lower limb deep venous thrombosis or pulmonary embolism. RESULTS: Among 2.0 million women followed for 21.0 million person years, 8710 venous thromboembolic events occurred. Compared with non-use of NSAIDs, use of NSAIDs was associated with an adjusted incidence rate ratio of venous thromboembolism of 7.2 (95% confidence interval 6.0 to 8.5) in women not using hormonal contraception, 11.0 (9.6 to 12.6) in women using high risk hormonal contraception, 7.9 (5.9 to 10.6) in those using medium risk hormonal contraception, and 4.5 (2.6 to 8.1) in users of low/no risk hormonal contraception. The corresponding numbers of extra venous thromboembolic events per 100 000 women over the first week of NSAID treatment compared with non-use of NSAIDs were 4 (3 to 5) in women not using hormonal contraception, 23 (19 to 27) in women using high risk hormonal contraception, 11 (7 to 15) in those using medium risk hormonal contraception, and 3 (0 to 5) in users of low/no risk hormonal contraception. CONCLUSIONS: NSAID use was positively associated with the development of venous thromboembolism in women of reproductive age. The number of extra venous thromboembolic events with NSAID use compared with non-use was significantly larger with concomitant use of high/medium risk hormonal contraception than with concomitant use of low/no risk hormonal contraception. Women needing both hormonal contraception and regular use of NSAIDs should be advised accordingly.
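Adjusted incidence rate ratios of this kind are commonly estimated with Poisson regression on person-time. A hedged sketch on toy aggregated data with invented columns; the paper's analysis is individual-level and adjusts for more covariates.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "events":       [40, 30, 160, 120],      # toy VTE counts
    "person_years": [9e5, 1e5, 8e5, 9e4],
    "nsaid":        [0, 1, 0, 1],
    "hc_high_risk": [0, 0, 1, 1],            # high-risk hormonal contraception
})

# Poisson model with a log person-years offset; exponentiated coefficients
# are incidence rate ratios
fit = smf.glm("events ~ nsaid * hc_high_risk", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["person_years"])).fit()
print(np.exp(fit.params))
```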


Subject(s)
Venous Thromboembolism , Female , Humans , Adolescent , Young Adult , Adult , Middle Aged , Venous Thromboembolism/chemically induced , Venous Thromboembolism/epidemiology , Cohort Studies , Hormonal Contraception , Anti-Inflammatory Agents, Non-Steroidal/adverse effects , Hysterectomy
13.
Diabetologia ; 66(11): 2017-2029, 2023 11.
Article in English | MEDLINE | ID: mdl-37528178

ABSTRACT

AIMS/HYPOTHESIS: We aimed to examine whether individuals with initial omission of glucose-lowering drug treatment (GLDT), including those achieving initial remission of type 2 diabetes, may experience a higher risk of major adverse cardiovascular events (MACE) compared with well-controlled individuals on GLDT after a new type 2 diabetes diagnosis in real-world clinical practice. Furthermore, we examined whether a higher risk could be related to lower initiation of statins and renin-angiotensin system inhibitors (RASi). METHODS: In this cohort study, we used Danish registers to identify individuals with a first measured HbA1c between 48 and 57 mmol/mol (6.5-7.4%) from 2014 to 2020. Six months later, we divided participants into four groups according to GLDT and achieved HbA1c (<48 vs ≥48 mmol/mol [6.5%]): well-controlled and poorly controlled on GLDT; remission and persistent type 2 diabetes not on GLDT. We reported how much the standardised 5 year risk of MACE could be reduced for each group if initiation of statins and RASi was the same as in the well-controlled group on GLDT. RESULTS: We included 14,221 individuals. Compared with well-controlled participants on GLDT, the 5 year standardised risk of MACE was higher in the three other exposure groups: by 3.3% (95% CI 1.6, 5.1) in the persistent type 2 diabetes group not on GLDT; 2.0% (95% CI 0.4, 3.7) in the remission group not on GLDT; and 3.5% (95% CI 1.3, 5.7) in the poorly controlled group on GLDT. Fewer individuals not on GLDT initiated statins and RASi compared with individuals on GLDT. If initiation of statins and RASi had been the same as in the well-controlled group on GLDT, participants not on GLDT could have reduced their risk of MACE by 2.1% (95% CI 1.2, 2.9) in the persistent type 2 diabetes group and by 1.1% (95% CI 0.4, 1.9) in the remission group. CONCLUSIONS/INTERPRETATION: Compared with well-controlled individuals on GLDT, individuals not on initial GLDT had a higher 5 year risk of MACE, even among those achieving remission of type 2 diabetes. This may be related to lower use of statins and RASi.
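The "standardised risk if initiation had been the same" is a g-computation-style contrast: fit an outcome model, set the treatment variable to a counterfactual value for everyone, and average the predictions. A binary-outcome toy sketch with invented column names; the paper's time-to-event handling (censoring, competing risks) is omitted.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Assumed columns: mace5y (0/1), statin_rasi (0/1 initiation), age, sex
df = pd.read_csv("t2d_cohort.csv")
X = df[["statin_rasi", "age", "sex"]]
model = LogisticRegression(max_iter=1000).fit(X, df["mace5y"])

risk_observed = model.predict_proba(X)[:, 1].mean()
risk_if_all_treated = model.predict_proba(X.assign(statin_rasi=1))[:, 1].mean()
print(f"Risk reduction if all initiated statins/RASi: "
      f"{risk_observed - risk_if_all_treated:.1%}")
```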


Subject(s)
Cardiovascular Diseases , Diabetes Mellitus, Type 2 , Hydroxymethylglutaryl-CoA Reductase Inhibitors , Humans , Diabetes Mellitus, Type 2/drug therapy , Cohort Studies , Hydroxymethylglutaryl-CoA Reductase Inhibitors/therapeutic use , Glucose , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/drug therapy , Denmark/epidemiology
14.
Circ Heart Fail ; 16(10): e010617, 2023 10.
Article in English | MEDLINE | ID: mdl-37503624

ABSTRACT

BACKGROUND: Patients with heart failure are vulnerable to SARS-CoV-2 infection. However, limited evidence exists on the safety of the SARS-CoV-2 mRNA vaccines in this patient population. The objective of this study was to investigate the risk of all-cause mortality, worsening heart failure, venous thromboembolism, and myocarditis associated with the mRNA vaccines in patients with heart failure. METHODS: Using Danish nationwide registries, we constructed 2 cohorts: (1) all prevalent heart failure patients in 2019 aged 40 to 95 years and (2) all prevalent heart failure patients in 2021 aged 40 to 95 years who were vaccinated with either of the 2 mRNA vaccines (BNT162b2 or mRNA-1273). The patients in the 2 cohorts were matched 1:1 using exact exposure matching on age, sex, and duration of heart failure. To estimate standardized absolute risks, outcome-specific Cox regression analyses were performed. RESULTS: The total study population comprised 101 786 patients. The median age was 74 years (interquartile range, 66-81). The standardized risk of all-cause mortality within 90 days was 2.23% (95% CI, 2.10%-2.36%) in the vaccinated (2021) cohort and 2.56% (95% CI, 2.43%-2.70%) in the unvaccinated (2019) cohort (90-day risk difference, -0.33% [95% CI, -0.52% to -0.15%]). The standardized risk of worsening heart failure within 90 days was 1.10% (95% CI, 1.01%-1.19%) in the 2021 (vaccinated) cohort and 1.08% (95% CI, 0.99%-1.17%) in the 2019 (unvaccinated) cohort (risk difference, 0.02% [95% CI, -0.11% to 0.15%]). No significant differences were found for venous thromboembolism or myocarditis. CONCLUSIONS: Receiving an mRNA vaccine was not associated with an increased risk of worsening heart failure, myocarditis, venous thromboembolism, or all-cause mortality.


Subject(s)
COVID-19 , Heart Failure , Myocarditis , Venous Thromboembolism , Humans , Aged , Heart Failure/epidemiology , BNT162 Vaccine , COVID-19 Vaccines/adverse effects , COVID-19/prevention & control , SARS-CoV-2 , Vaccination/adverse effects , mRNA Vaccines
15.
Dent Traumatol ; 39(5): 455-461, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37272585

ABSTRACT

BACKGROUND/AIM: Primary teeth are frequently affected by traumatic dental injuries. Root fractures are rare, with a reported incidence of 2% in the primary dentition; hence, there is limited evidence on this topic. This study aims to evaluate the risk of healing complications in primary teeth with root fracture and to identify possible sequelae in the permanent dentition following root fracture in the primary dentition. MATERIALS AND METHODS: We performed a retrospective analysis of a cohort of 53 patients with 74 root-fractured primary teeth. The standard follow-up program included clinical and radiographic examination 4 weeks, 8 weeks, 6 months, and 1 year after the trauma and when the patient was 6 years of age. The following complications were registered: pulp necrosis (PN), pulp canal obliteration (PCO), ankylosis with replacement root resorption (ARR), infection-related root resorption (IRR), premature tooth loss (PTL), and repair-related resorption (RRR). STATISTICS: The Kaplan-Meier and Aalen-Johansen estimators were employed. The level of significance was 5%. RESULTS: A total of 74 teeth were included, of which 42 were extracted at the initial examination. Risks estimated after 3 years were: PTL 45.9% [95% CI: 28.8-63.0], PCO 12.9% [95% CI: 2.3-23.4], PN 14.9% [95% CI: 3.9-25.9], and RRR 2.6% [95% CI: 0.0-7.5]. No teeth showed ARR or IRR. All complications were diagnosed within the first year. The most common sequela in the permanent dentition was demarcated opacities, with an estimated risk of 20% [95% CI: 8.2-41.3]. CONCLUSIONS: There is a low risk of healing complications following a root fracture in the primary dentition. Root fractures often result in early extraction of the coronal fragment; the remaining apical fragment undergoes physiological resorption. Aside from opacities, there is a low risk of sequelae in the permanent dentition.
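The Aalen-Johansen estimator gives cumulative incidences in the presence of competing complications, which is presumably why it appears here alongside Kaplan-Meier. A toy lifelines sketch with invented event coding:

```python
from lifelines import AalenJohansenFitter

durations = [0.1, 0.5, 0.9, 1.2, 1.8, 2.2, 2.6, 2.9, 3.0]  # years since trauma
events    = [1,   1,   2,   1,   0,   2,   1,   0,   0]    # 0=censored, 1=PTL, 2=other complication

# Cumulative incidence of premature tooth loss, treating other
# complications as competing events
ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)
print(ajf.cumulative_density_.tail(1))  # ~3-year cumulative incidence of PTL
```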


Subject(s)
Fractures, Bone , Root Resorption , Tooth Ankylosis , Tooth Avulsion , Tooth Fractures , Tooth Loss , Humans , Retrospective Studies , Root Resorption/etiology , Tooth Avulsion/complications , Tooth Ankylosis/etiology , Dental Pulp Necrosis/etiology , Fractures, Bone/complications , Tooth Fractures/complications , Tooth Loss/etiology , Tooth, Deciduous , Tooth Root/diagnostic imaging , Tooth Root/injuries
16.
Front Immunol ; 14: 1161301, 2023.
Article in English | MEDLINE | ID: mdl-37197657

ABSTRACT

Background: Naturally acquired immunity to malaria may involve different immune mechanisms working in concert; however, their respective contributions and potential antigenic targets have not been clearly established. Here, we assessed the roles of opsonic phagocytosis and antibody-mediated merozoite growth inhibition in Plasmodium falciparum (P. falciparum) infection outcomes in Ghanaian children. Methods: Levels of merozoite opsonic phagocytosis, growth inhibition activity, and IgG specific for six P. falciparum antigens were measured in plasma samples from children (n=238, aged 0.5 to 13 years) at baseline prior to the malaria seasons in southern Ghana. The children were then followed up actively and passively for detection of febrile malaria and asymptomatic P. falciparum infection in a 50-week longitudinal cohort. P. falciparum infection outcome was modelled as a function of the measured immune parameters while accounting for important demographic factors. Results: High plasma activity of opsonic phagocytosis (adjusted odds ratio [aOR] 0.16; 95% CI 0.05-0.50; p = 0.002) and of growth inhibition (aOR 0.15; 95% CI 0.04-0.47; p = 0.001) were individually associated with protection against febrile malaria. There was no evidence of correlation between the two assays (b = 0.13; 95% CI -0.04 to 0.30; p = 0.14). IgG antibodies against MSPDBL1 correlated with opsonic phagocytosis, while IgG against PfRh2a correlated with growth inhibition. Notably, IgG antibodies against RON4 correlated with both assays. Conclusion: Opsonic phagocytosis and growth inhibition are protective immune mechanisms against malaria that may act independently to confer overall protection. Vaccines incorporating RON4 may benefit from both immune mechanisms.


Subject(s)
Malaria, Falciparum , Malaria , Animals , Humans , Child , Ghana , Merozoites , Antigens, Protozoan , Protozoan Proteins , Antibodies, Protozoan , Phagocytosis , Immunoglobulin G , Fever , Asymptomatic Infections
17.
Eur J Epidemiol ; 38(5): 523-531, 2023 May.
Article in English | MEDLINE | ID: mdl-37012504

ABSTRACT

A substantial part of the mortality during the COVID-19 pandemic occurred among nursing home residents, which caused alarm in many countries. We investigated nursing home mortality in relation to the expected mortality prior to the pandemic. This nationwide register-based study included all 135,501 Danish nursing home residents from 2015 until October 6, 2021. All-cause mortality rates were calculated using a standardization method based on the sex and age distribution of 2020. Survival probability and lifetime lost over 180 days were calculated using Kaplan-Meier estimates. Of 3,587 COVID-19-related deaths, 1,137 (32%) occurred among nursing home residents. The yearly all-cause mortality rates per 100,000 person-years in 2015, 2016, and 2017 were 35,301 (95% CI: 34,671-35,943), 34,801 (95% CI: 34,180-35,432), and 35,708 (95% CI: 35,085-36,343), respectively. Slightly elevated mortality rates per 100,000 person-years were seen in 2018, 2019, 2020, and 2021: 38,268 (95% CI: 37,620-38,929), 36,956 (95% CI: 36,323-37,600), 37,475 (95% CI: 36,838-38,122), and 38,536 (95% CI: 37,798-39,287), respectively. For nursing home residents infected with SARS-CoV-2 in 2020, the difference in lifetime lost was 42 days (95% CI: 38-46) compared with non-infected residents in 2018. Among vaccinated residents in 2021, the difference in lifetime lost was 25 days (95% CI: 18-32) for SARS-CoV-2-infected versus non-infected residents. Even though a high proportion of COVID-19 fatalities took place in nursing homes and SARS-CoV-2 infection increased the individual risk of death, the annual mortality was only slightly elevated. For future epidemics or pandemics, reporting numbers of fatal cases in relation to expected mortality is critical.
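Direct standardization weights stratum-specific rates by a fixed standard population, here the 2020 sex and age distribution, so yearly rates stay comparable as the resident population shifts. A toy sketch of the arithmetic with invented numbers:

```python
import pandas as pd

strata = pd.DataFrame({
    "stratum":      ["F65-79", "F80+", "M65-79", "M80+"],
    "deaths":       [120, 800, 150, 600],        # toy counts for one year
    "person_years": [1500, 4000, 1200, 2500],
    "weight_2020":  [0.20, 0.45, 0.15, 0.20],    # 2020 standard distribution
})

strata["rate"] = strata["deaths"] / strata["person_years"]
standardized = (strata["rate"] * strata["weight_2020"]).sum() * 100_000
print(f"Standardized rate: {standardized:,.0f} per 100,000 person-years")
```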


Subject(s)
COVID-19 , Homes for the Aged , Mortality , Nursing Homes , Humans , Cohort Studies , COVID-19/epidemiology , Denmark/epidemiology , Pandemics/prevention & control , SARS-CoV-2
18.
J Neurol Sci ; 447: 120581, 2023 04 15.
Article in English | MEDLINE | ID: mdl-36827718

ABSTRACT

OBJECTIVE: The association between common electrocardiogram (ECG) markers and Alzheimer's disease has scarcely been investigated, and it is unknown whether ECG markers can improve risk prediction. We therefore aimed to examine the association between common ECG markers and Alzheimer's disease in a large population. METHODS: We studied the association between ECG markers and Alzheimer's disease using Cox models with adjustment for age, sex, and comorbidities in a large primary care population of patients aged 60 years or more. RESULTS: We followed 172,236 subjects for a median of 7.5 years. An increased PR interval (hazard ratio for PR > 188 ms: 0.76 [95% confidence interval: 0.69-0.83], p < 0.001) and an increased QTc interval (hazard ratio for QTc 426-439 ms: 0.90 [0.83-0.98], p = 0.02) were associated with a decreased rate of Alzheimer's disease. A positive Sokolow-Lyon index (>35 mm; 1.22 [1.13-1.33], p < 0.001) and an increased T-wave amplitude (>4.1 mm; 1.15 [1.04-1.27]) were associated with an increased rate of Alzheimer's disease. Upon addition of the ECG markers to a reference model, the 10-year prediction area under the receiver operating characteristic curve (AUC) improved by 0.39 [0.06-0.67] percentage points. The 10-year absolute risk of Alzheimer's disease was 6.5% and 5.2% for an 82-year-old female and male, respectively, with a favorable ECG, and 12% and 9.2%, respectively, with an unfavorable ECG, almost twice as high. CONCLUSIONS: We identified several common ECG markers that were associated with Alzheimer's disease and that improved risk prediction for Alzheimer's disease.
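A sketch of the adjusted Cox analysis described above, using lifelines on hypothetical columns; the paper's full comorbidity adjustment and marker cut points are not reproduced.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Assumed columns: followup_years, alzheimer (0/1 event), ECG markers, age, sex
df = pd.read_csv("primary_care_ecg.csv")

cph = CoxPHFitter()
cph.fit(df[["followup_years", "alzheimer", "pr_ms", "qtc_ms",
            "sokolow_mm", "t_wave_mm", "age", "sex"]],
        duration_col="followup_years", event_col="alzheimer")
cph.print_summary(decimals=3)  # hazard ratios with 95% CIs
```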


Subject(s)
Alzheimer Disease , Female , Humans , Male , Middle Aged , Aged, 80 and over , Alzheimer Disease/diagnostic imaging , Electrocardiography , Comorbidity , Biomarkers , Primary Health Care
19.
Scand J Gastroenterol ; 58(8): 937-944, 2023.
Article in English | MEDLINE | ID: mdl-36756743

ABSTRACT

INTRODUCTION: Overall caecum intubation rate (oCIR) and overall polyp detection rate (oPDR) have been proposed as performance indicators, but varying case-mix complexity among endoscopists may affect their validity. This study explores the effect of adjusting for case mix on individual endoscopist performance by calculating case mix-adjusted performance estimates (cmCIR and cmPDR) and comparing them with overall performance estimates (oCIR and oPDR). The study also provides an R program for case-mix analysis. METHODS: Logistic regression associated endoscopist, colonoscopy indication, patient age, and patient gender with the binary outcomes of caecum intubation and polyp detection. Case mix-adjusted performance indicators were calculated for each endoscopist based on logistic regression and bootstrap resampling. Endoscopists were ranked from best to worst by overall and case mix-adjusted performance estimates, and differences were evaluated using percentage points (pp) and rank changes. RESULTS: The dataset consisted of 7376 colonoscopies performed by 47 endoscopists. The maximum rank change for an endoscopist comparing oCIR and cmCIR was eight positions (IQR: 1-3); the maximum change in CIR was 1.95 pp (IQR: 0.27-0.86). The maximum rank change in the oPDR versus cmPDR analysis was 17 positions (IQR: 1.5-8.5); the maximum change in PDR was 11.21 pp (IQR: 2.05-6.70). With case mix-adjusted estimates, three endoscopists improved from significantly inferior to within the 95% confidence interval (CI) range of the performance targets. CONCLUSIONS: Most endoscopists were unaffected by adjustment for case mix, but a few had an unfavourable case mix that could invite incorrect suspicion of inferior performance.
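Case-mix adjustment of this kind can be done by marginal standardization: fit one logistic model for intubation on endoscopist plus case-mix factors, then score every endoscopist against the same overall case mix. The sketch below mirrors that idea in Python with invented column names (the authors supply an R program); the bootstrap confidence intervals are omitted.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: cecum (0/1), endoscopist, indication, age, sex
df = pd.read_csv("colonoscopies.csv")
fit = smf.logit("cecum ~ C(endoscopist) + C(indication) + age + C(sex)",
                data=df).fit(disp=False)

# Predict each endoscopist's intubation rate on the full, shared case mix
cm_cir = {doc: fit.predict(df.assign(endoscopist=doc)).mean()
          for doc in df["endoscopist"].unique()}
print(pd.Series(cm_cir).sort_values(ascending=False).head())
```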


Subject(s)
Colonic Polyps , Colorectal Neoplasms , Humans , Colonic Polyps/diagnosis , Colonoscopy , Cecum , Logistic Models , Diagnosis-Related Groups , Colorectal Neoplasms/diagnosis
20.
Stat Med ; 42(10): 1542-1564, 2023 05 10.
Article in English | MEDLINE | ID: mdl-36815690

ABSTRACT

Linkage between drug claims data and clinical outcomes allows a data-driven experimental approach to drug repurposing. We develop a procedure based on generalized random forests for estimating time-point-specific average treatment effects in a time-to-event setting with competing risks. To handle right-censoring, we propose a two-step estimation procedure, applying inverse probability of censoring weighting to construct time-point-specific weighted outcomes as input for the generalized random forest. The generalized random forests adaptively handle covariate effects on treatment assignment by applying a splitting rule that targets a causal parameter. Using simulated data, we demonstrate that the method is effective for a causal search through a list of treatments, ranking them according to the magnitude of their effect on the clinical outcome. We illustrate the method using the Danish national health registries, where it is of interest to discover drugs with an unexpected protective effect against relapse of severe depression.
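A hedged sketch of the two-step construction on hypothetical columns: estimate the censoring distribution with Kaplan-Meier, then form inverse-probability-of-censoring weighted outcomes at a time point of interest; the forest's causal splitting rule itself is not reproduced, and G(T-) is approximated by G(T).

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

# Assumed columns: time, event (0 = censored, 1 = relapse, 2 = competing death)
df = pd.read_csv("claims_cohort.csv")
t0 = 2.0  # time point of interest, in years

# Step 1: Kaplan-Meier for the censoring distribution (censoring as the event)
km_cens = KaplanMeierFitter().fit(df["time"], event_observed=(df["event"] == 0))

# Step 2: weighted outcome 1{T <= t0, cause 1} / G(T) for observed events
observed = (df["time"] <= t0) & (df["event"] == 1)
G = km_cens.predict(np.minimum(df["time"].to_numpy(), t0)).to_numpy()
df["weighted_outcome"] = np.where(observed, 1.0 / np.clip(G, 1e-6, None), 0.0)
# df["weighted_outcome"] is now the input for a (generalized) random forest.
```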


Subject(s)
Random Forest , Humans , Probability