Results 1 - 20 of 1,469
1.
medRxiv ; 2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38529496

ABSTRACT

Background: Seed amplification assay (SAA) testing has become an important biomarker in the diagnosis of alpha-synuclein related neurodegenerative disorders. Objectives: To assess the rate of alpha-synuclein SAA positivity in progressive supranuclear palsy (PSP) and corticobasal syndrome (CBS), and analyse the clinical and pathological features of SAA positive and negative cases. Methods: 106 CSF samples from clinically diagnosed PSP (n=59), CBS (n=37) and indeterminate parkinsonism cases (n=10) were analysed using alpha-synuclein SAA. Results: Three cases (1 PSP, 2 CBS) were Multiple System Atrophy (MSA)-type SAA positive. 5/59 (8.5%) PSP cases were Parkinson's disease (PD)-type SAA positive, and these cases were older and had a shorter disease duration compared with SAA negative cases. In contrast, 9/35 (25.7%) CBS cases were PD-type SAA positive. Conclusions: Our results suggest that PD-type seeds can be detected in PSP and CBS using a CSF alpha-synuclein SAA, and in PSP this may impact on clinical course.

2.
Public Health ; 230: 38-44, 2024 May.
Article in English | MEDLINE | ID: mdl-38492260

ABSTRACT

OBJECTIVES: Disease progression, loss to follow-up, and mortality of HIV-2 compared with HIV-1 in children are not well understood. This is the first nationwide study reporting outcomes in children with the two HIV types in Guinea-Bissau. STUDY DESIGN: Nationwide retrospective follow-up study. METHODS: This is a retrospective follow-up study among HIV-infected children <15 years at nine ART centers from 2006 to 2021. Baseline parameters and disease outcomes for children with HIV-2 and HIV-1 were compared. RESULTS: The annual number of children diagnosed with HIV peaked in 2017. Children infected with HIV-2 (n = 64) and HIV-1 (n = 1945) differed in baseline median age (6.5 vs 3.1 years, P < 0.01) but had similar levels of severe immunodeficiency (P = 0.58) and severe anemia (P = 0.26). Within the first year of follow-up, 36.3% were lost, 5.9% died, 2.7% had transferred clinic, and 55.2% remained in follow-up. Mortality (HR = 1.05, 95% CI: 0.53-2.08 for HIV-2) and attrition (HR = 0.86, 95% CI: 0.62-1.19 for HIV-2) rates were similar for the two HIV types. CONCLUSIONS: The decline in children diagnosed per year since 2017 is possibly due to lower HIV prevalence, lack of HIV tests, and the SARS-CoV-2 epidemic. Children with HIV-2 were twice as old as children with HIV-1 when diagnosed, which suggests a slower disease progression. However, once they develop immunosuppression, mortality is similar.


Subject(s)
HIV Infections , HIV Seropositivity , HIV-1 , Child , Humans , Child, Preschool , Follow-Up Studies , Retrospective Studies , HIV Infections/epidemiology , HIV-2 , Guinea-Bissau/epidemiology , Disease Progression
3.
Acta Physiol (Oxf) ; 240(6): e14117, 2024 06.
Article in English | MEDLINE | ID: mdl-38404156

ABSTRACT

AIM: To investigate effects of hormone replacement therapy in postmenopausal women on factors associated with metabolic flexibility at the whole-body level, including fat oxidation, resting energy expenditure, body composition, and plasma concentrations of fatty acids, glucose, insulin, cortisol, and lipids, and at the mitochondrial level, including mitochondrial content, respiratory capacity, efficiency, and hydrogen peroxide emission. METHODS: Twenty-two postmenopausal women were included: 11 were undergoing estradiol and progestin treatment (HT), and 11 were matched non-treated controls (CONT). Peak oxygen consumption, maximal fat oxidation, glycated hemoglobin, body composition, and resting energy expenditure were measured. Blood samples were collected at rest and during 45 min of ergometer exercise (65% VO2peak). Muscle biopsies were obtained at rest and immediately post-exercise. Mitochondrial respiratory capacity, efficiency, and hydrogen peroxide emission in permeabilized fibers and isolated mitochondria were measured, and citrate synthase (CS) and 3-hydroxyacyl-CoA dehydrogenase (HAD) activity were assessed. RESULTS: HT showed higher absolute mitochondrial respiratory capacity and post-exercise hydrogen peroxide emission in permeabilized fibers and higher CS and HAD activities. Respiration normalized to CS activity showed no significant group differences in permeabilized fibers or isolated mitochondria. There were no differences in resting energy expenditure, maximal and resting fat oxidation, or plasma markers. HT had significantly lower visceral and total fat mass compared to CONT. CONCLUSION: Use of hormone therapy is associated with higher mitochondrial content and respiratory capacity and a lower visceral and total fat mass. Resting energy expenditure and fat oxidation did not differ between HT and CONT.


Subject(s)
Energy Metabolism , Postmenopause , Humans , Female , Postmenopause/metabolism , Middle Aged , Energy Metabolism/drug effects , Aged , Oxygen Consumption/drug effects , Hormone Replacement Therapy , Estrogen Replacement Therapy , Mitochondria/metabolism , Mitochondria/drug effects , Body Composition/drug effects , Estradiol/blood , Estradiol/metabolism , Mitochondria, Muscle/metabolism , Mitochondria, Muscle/drug effects , Muscle, Skeletal/metabolism , Muscle, Skeletal/drug effects , Adipose Tissue/metabolism , Adipose Tissue/drug effects
4.
Eur J Pain ; 28(6): 943-959, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38189159

ABSTRACT

BACKGROUND: The negative consequences of prescription opioid misuse and opioid use disorder make it relevant to identify factors associated with this problem in individuals with chronic pain. This cross-sectional study aimed at identifying subgroups of people with chronic pain based on their psychological profiles, prescription opioid misuse, craving, and withdrawal. METHODS: The sample comprised 185 individuals with chronic pain. We performed hierarchical cluster analysis on impulsivity, anxiety sensitivity, pain acceptance, pain intensity, opioid misuse, craving, and withdrawal. RESULTS: The four-cluster solution was the optimal one. Misuse, craving, and anxiety sensitivity were higher among people in the Severe-problems cluster than among people in the other three clusters. Withdrawal was the highest in the High-withdrawal cluster. Impulsivity was higher among people in the Severe-problems and High-withdrawal clusters than those in the Moderate-problems and Mild-problems clusters. Pain acceptance was higher among people in the Mild-problems cluster than among people in the other three clusters. Anxiety sensitivity and misuse were higher among people in the Moderate-problems cluster than among people in the Mild-problems cluster. CONCLUSIONS: These results support that impulsivity, anxiety sensitivity, and pain acceptance are useful constructs to identify subgroups of people with chronic pain according to their level of prescription opioid misuse, craving, and withdrawal. The results of this study may help in selecting the early intervention most suitable for each of the identified profiles. SIGNIFICANCE: The psychological profile of individuals with chronic pain, prescription opioid misuse, craving, and withdrawal is characterized by fearing anxiety-related symptoms due to the catastrophic interpretation of such symptoms and reacting impulsively to negative moods. 
In contrast, participants with high pain acceptance had less prescription opioid misuse, craving, and withdrawal. The profiles identified in this study could help clinicians select targets for intervention among profiles with similar needs and facilitate early interventions to prevent opioid misuse onset or aggravation.
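The hierarchical cluster analysis described above would in practice be run with standard statistical software; as a minimal stdlib-only sketch of the idea (the two-feature data below are hypothetical, not the study's seven measures), complete-linkage agglomerative clustering on standardized scores looks like this:

```python
import math

def standardize(rows):
    # z-score each column so differently scaled measures (e.g., pain intensity
    # vs. misuse scores) contribute comparably to the distance
    cols = list(zip(*rows))
    means = [sum(c) / len(c) for c in cols]
    sds = [math.sqrt(sum((x - m) ** 2 for x in c) / len(c)) or 1.0
           for c, m in zip(cols, means)]
    return [[(x - m) / s for x, m, s in zip(r, means, sds)] for r in rows]

def agglomerative(rows, k):
    # complete-linkage agglomerative clustering: start with singletons and
    # repeatedly merge the two clusters whose farthest members are closest
    clusters = [[i] for i in range(len(rows))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = max(math.dist(rows[a], rows[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i].extend(clusters[j])
        del clusters[j]
    return clusters  # list of lists of row indices
```

Choosing the number of clusters (the study settled on four) is a separate step, typically judged from the dendrogram or an index such as silhouette width.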


Subject(s)
Analgesics, Opioid , Anxiety , Chronic Pain , Craving , Opioid-Related Disorders , Prescription Drug Misuse , Substance Withdrawal Syndrome , Humans , Chronic Pain/psychology , Chronic Pain/drug therapy , Male , Female , Middle Aged , Adult , Substance Withdrawal Syndrome/psychology , Opioid-Related Disorders/psychology , Analgesics, Opioid/therapeutic use , Analgesics, Opioid/adverse effects , Cross-Sectional Studies , Anxiety/psychology , Prescription Drug Misuse/psychology , Impulsive Behavior , Aged
6.
Hepatology ; 79(6): 1279-1292, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38146932

ABSTRACT

BACKGROUND AND AIMS: Alagille syndrome (ALGS) is characterized by chronic cholestasis with associated pruritus and extrahepatic anomalies. Maralixibat, an ileal bile acid transporter inhibitor, is an approved pharmacologic therapy for cholestatic pruritus in ALGS. Since long-term placebo-controlled studies are not feasible or ethical in children with rare diseases, a novel approach was taken comparing 6-year outcomes from maralixibat trials with an aligned and harmonized natural history cohort from the Global ALagille Alliance (GALA) study. APPROACH AND RESULTS: Maralixibat trials comprise 84 patients with ALGS with up to 6 years of treatment. GALA contains retrospective data from 1438 participants. GALA was filtered to align with key maralixibat eligibility criteria, yielding 469 participants. Serum bile acids could not be included in the GALA filtering criteria as these are not routinely performed in clinical practice. Index time was determined through maximum likelihood estimation in an effort to align the disease severity between the two cohorts with the initiation of maralixibat. Event-free survival, defined as the time to first event of manifestations of portal hypertension (variceal bleeding, ascites requiring therapy), surgical biliary diversion, liver transplant, or death, was analyzed by Cox proportional hazards methods. Sensitivity analyses and adjustments for covariates were applied. Age, total bilirubin, gamma-glutamyl transferase, and alanine aminotransferase were balanced between groups with no statistical differences. Event-free survival in the maralixibat cohort was significantly better than in the GALA cohort (HR, 0.305; 95% CI, 0.189-0.491; p < 0.0001). Multiple sensitivity and subgroup analyses (including serum bile acid availability) showed similar findings.
CONCLUSIONS: This study demonstrates a novel application of a robust statistical method to evaluate outcomes in long-term intervention studies where placebo comparisons are not feasible, providing wide application for rare diseases. This comparison with real-world natural history data suggests that maralixibat improves event-free survival in patients with ALGS.
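The comparison above relies on Cox proportional hazards modeling, which needs a statistics package; the underlying event-free survival curve itself can be sketched with a plain Kaplan-Meier estimator. A minimal version (the follow-up data in any use would be hypothetical, not the trial's):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of event-free survival.

    times  : follow-up time per subject (any consistent unit)
    events : 1 if the composite event occurred at that time, 0 if censored
    Returns a list of (time, survival probability) steps at each event time.
    """
    surv = 1.0
    steps = []
    for t in sorted(set(times)):
        # events and subjects still at risk at time t
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)
        if d > 0:
            surv *= 1 - d / n
            steps.append((t, surv))
    return steps
```

A Cox model then compares the hazard between cohorts while adjusting for covariates; the Kaplan-Meier curve is the model-free picture the hazard ratio summarizes.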


Subject(s)
Alagille Syndrome , Humans , Alagille Syndrome/complications , Alagille Syndrome/drug therapy , Female , Male , Retrospective Studies , Child , Infant , Child, Preschool , Progression-Free Survival , Adolescent , Carrier Proteins , Membrane Glycoproteins
7.
Hum Reprod ; 38(12): 2373-2381, 2023 Dec 04.
Article in English | MEDLINE | ID: mdl-37897214

ABSTRACT

STUDY QUESTION: How common is bleeding in early pregnancy after Hormone Replacement Therapy (HRT) Frozen Embryo Transfer (FET), and does bleeding affect the reproductive outcome? SUMMARY ANSWER: A total of 47% of HRT-FET patients experience bleeding before the eighth week of gestation; however, bleeding does not affect the reproductive outcome. WHAT IS KNOWN ALREADY: Bleeding occurs in 20% of spontaneously conceived pregnancies, although most will proceed to term. However, our knowledge regarding bleeding in early pregnancy after HRT-FET and the reproductive outcome is sparse. STUDY DESIGN, SIZE, DURATION: We performed a systematic review of the existing literature on early pregnancy bleeding after assisted reproductive technology (ART) to evaluate the bleeding prevalence and resulting reproductive outcome in this population. A random-effects proportional meta-analysis was conducted. Subsequently, we performed a prospective cohort study including 320 pregnant patients undergoing HRT-FET, and a secondary analysis of the cohort study was performed to evaluate bleeding prevalence and reproductive outcome. The trial was conducted from January 2020 to November 2022 in a public fertility clinic. PARTICIPANTS/MATERIALS, SETTING, METHODS: A systematic literature search was performed using MeSH terms and included studies with data from ART patients and with early pregnancy bleeding as a separate outcome. The cohort study included patients with autologous vitrified blastocyst transfer treated in an HRT-FET protocol. In the event of a positive hCG test, an early pregnancy scan was performed around 8 weeks of gestation. During this visit, patients answered a questionnaire regarding bleeding or spotting and its duration after the positive pregnancy test. The information was verified through medical files, and these were used to obtain information on reproductive outcomes. MAIN RESULTS AND THE ROLE OF CHANCE: The review revealed a total of 12 studies of interest.
The studies reported a prevalence of early pregnancy bleeding ranging from 2.1% to 36.2%. The random-effects proportional meta-analysis resulted in a pooled effect estimate of the prevalence of early pregnancy bleeding in the ART population of 18.1% (95% CI (10.5; 27.1)). Four of the included studies included data on miscarriage rate following an episode of bleeding. All four studies showed a significantly increased risk of miscarriage in patients with early pregnancy bleeding as compared to patients with no history of bleeding. No studies investigated bleeding after HRT-FET specifically. In our HRT-FET cohort study, we found that a total of 47% (149/320) of patients with a positive pregnancy test experienced bleeding before 8 weeks of gestation. Generally, the bleeding was described as spotting, with a median duration of 2 days (range 0.5-16 days). Out of 149 patients with one or several bleeding episodes, a total of 106 patients (71%) had an ongoing pregnancy at 12 weeks of gestation. In comparison, 171 patients reported no bleeding episodes, and a total of 115 (67%) of these patients had an ongoing pregnancy at 12 weeks of gestation. This difference was not significant (P = 0.45). Furthermore, there was no difference in the live birth rate between the two groups (P = 0.29). LIMITATIONS, REASONS FOR CAUTION: Most studies included in the review were older, and not all studies specified the type of ART. Moreover, the studies were of moderate methodological quality. The patients in the cohort study were treated in a personalized HRT-FET protocol using a rectal supplementary rescue regimen if serum progesterone levels were <35 nmol/l at embryo transfer. The results may not be applicable to other FET protocols, and the present data were based on self-reported symptoms. The systematic review revealed an increased risk of miscarriage following an episode of early pregnancy bleeding. However, our cohort study found no such association.
This discrepancy may be partly due to the fact that the four studies in the review only included episodes of heavy bleeding. Also, none of the four studies included data on HRT-FET cycles, making them unfit for direct comparison. WIDER IMPLICATIONS OF THE FINDINGS: Episodes of early bleeding during pregnancy are associated with distress for the pregnant woman, especially in a cohort of infertile patients. Our cohort study showed that at least minor bleeding seems to be a common adverse event of early pregnancy after HRT-FET. From the systematic review, it seems that this prevalence is higher than what has previously been described in relation to other types of ART. However, minor bleeding during early pregnancy after HRT-FET does not seem to affect the reproductive outcome. Knowledge regarding the frequent occurrence of bleeding during early pregnancy after HRT-FET, and the fact that this should not be used as a prognostic parameter, will help the clinician in counselling patients. STUDY FUNDING/COMPETING INTEREST(S): Gedeon Richter Nordic supported this investigator-initiated study with an unrestricted grant as well as study medication (Cyclogest). B.A. has received an unrestricted grant from Gedeon Richter Nordic and Merck and honoraria for lectures from Gedeon Richter, Merck, IBSA, and Marckyrl Pharma. P.H. received honoraria for lectures from Merck, Gedeon Richter, Institut Biochimique SA (IBSA), and Besins as well as unrestricted research grants from Merck, Gedeon Richter, and Institut Biochimique SA (IBSA). The other authors have no conflict of interest to declare. TRIAL REGISTRATION NUMBER: EudraCT no.: 2019-001539-29.
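The random-effects proportional meta-analysis named above is commonly implemented with a DerSimonian-Laird estimator on the logit scale. A minimal sketch, using hypothetical per-study counts rather than the 12 reviewed studies:

```python
import math

def pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions (logit scale).

    events/totals: per-study counts (each count must be strictly between
    0 and n, and at least two studies are required).
    Returns the pooled prevalence as a proportion.
    """
    y = [math.log(e / (n - e)) for e, n in zip(events, totals)]  # logit prevalence
    v = [1 / e + 1 / (n - e) for e, n in zip(events, totals)]    # within-study variance
    w = [1 / vi for vi in v]                                     # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))       # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                      # between-study variance
    wr = [1 / (vi + tau2) for vi in v]                           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    return 1 / (1 + math.exp(-pooled))                           # back to a proportion
```

Published meta-analyses often use variance-stabilizing transforms (e.g., Freeman-Tukey) instead of the logit; the weighting logic is the same.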


Subject(s)
Abortion, Spontaneous , Female , Pregnancy , Humans , Pregnancy Rate , Cohort Studies , Abortion, Spontaneous/epidemiology , Abortion, Spontaneous/etiology , Prospective Studies , Secondary Data Analysis , Treatment Outcome , Embryo Transfer/methods , Hormone Replacement Therapy
8.
Hum Reprod ; 38(11): 2221-2229, 2023 11 02.
Article in English | MEDLINE | ID: mdl-37759346

ABSTRACT

STUDY QUESTION: Can supplementation with rectal administration of progesterone secure high ongoing pregnancy rates (OPRs) in patients with low serum progesterone (P4) on the day of blastocyst transfer (ET)? SUMMARY ANSWER: Rectally administered progesterone commencing on the ET day secures high OPRs in patients with serum P4 levels below 35 nmol/l (11 ng/ml). WHAT IS KNOWN ALREADY: Low serum P4 levels at peri-implantation in Hormone Replacement Therapy Frozen Embryo Transfer (HRT-FET) cycles impact reproductive outcomes negatively. However, studies have shown that patients with low P4 after a standard vaginal progesterone treatment can obtain live birth rates (LBRs) comparable to patients with optimal P4 levels if they receive additional subcutaneous progesterone, starting around the day of blastocyst transfer. In contrast, increasing vaginal progesterone supplementation in low serum P4 patients does not increase LBR. Another route of administration rarely used in ART is the rectal route, despite the fact that progesterone is well absorbed and serum P4 levels reach a maximum after ∼2 h. STUDY DESIGN, SIZE, DURATION: This prospective interventional study included a cohort of 488 HRT-FET cycles, in which a total of 374 patients had serum P4 levels ≥35 nmol/l (11 ng/ml) at ET, and 114 patients had serum P4 levels <35 nmol/l (11 ng/ml). The study was conducted from January 2020 to November 2022. PARTICIPANTS/MATERIALS, SETTING, METHODS: Patients underwent HRT-FET in a public Fertility Clinic, and endometrial preparation included oral oestradiol (6 mg/24 h), followed by vaginal micronized progesterone, 400 mg/12 h. Blastocyst transfer and P4 measurements were performed on the sixth day of progesterone administration. In patients with serum P4 <35 nmol/l (11 ng/ml), 'rescue' was performed by rectal administration of progesterone (400 mg/12 h) starting that same day.
In pregnant patients, rectal administration continued until Week 8 of gestation, and oestradiol and vaginal progesterone treatment continued until Week 10 of gestation. MAIN RESULTS AND THE ROLE OF CHANCE: Among 488 HRT-FET single blastocyst transfers, the mean age of the patients at oocyte retrieval (OR) was 30.9 ± 4.6 years, and the mean BMI at ET was 25.1 ± 3.5 kg/m2. The mean serum P4 level after vaginal progesterone administration on the day of ET was 48.9 ± 21.0 nmol/l (15.4 ± 6.6 ng/ml), and a total of 23% (114/488) of the patients had a serum P4 level lower than 35 nmol/l (11 ng/ml). The overall positive hCG rate, clinical pregnancy rate, OPR at week 12, and total pregnancy loss rate were 66% (320/488), 54% (265/488), 45% (221/488), and 31% (99/320), respectively. There was no significant difference in either OPR at week 12 or total pregnancy loss rate between patients with P4 ≥35 nmol/l (11 ng/ml) and patients with P4 <35 nmol/l who received rescue with rectally administered progesterone: 45% versus 46%, P = 0.77, and 30% versus 34%, P = 0.53, respectively. OPR did not differ whether patients had initially low P4 and rectal rescue or were above the P4 cut-off. Logistic regression analysis showed that only age at OR and blastocyst scoring correlated with OPR at week 12, independently of other factors like BMI and vitrification day of blastocysts (Day 5 or 6). LIMITATIONS, REASONS FOR CAUTION: In this study, vaginal micronized progesterone pessaries, a solid pessary with progesterone suspended in vegetable hard fat, were used vaginally as well as rectally. It is unknown whether other vaginal progesterone products, such as capsules, gels, or tablets, could be used rectally with the same rescue effect. WIDER IMPLICATIONS OF THE FINDINGS: A substantial part of HRT-FET patients receiving vaginal progesterone treatment has low serum P4. Adding rectally administered progesterone in these patients improves the reproductive outcome.
Importantly, rectal progesterone administration is considered convenient, and progesterone pessaries are easy to administer rectally and of low cost. STUDY FUNDING/COMPETING INTEREST(S): Gedeon Richter Nordic supported the study with an unrestricted grant as well as study medication. B.A. has received an unrestricted grant from Gedeon Richter Nordic and Merck and honoraria for lectures from Gedeon Richter, Merck, IBSA, and Marckyrl Pharma. P.H. has received honoraria for lectures from Gedeon Richter, Merck, and IBSA, and U.S.K. has received grants from Gedeon Richter Nordic, IBSA, and Merck for studies outside this work, honoraria for teaching from Merck and Tillotts Pharma AB, and conference expenses covered by Merck. The other co-authors have no conflict of interest to declare. TRIAL REGISTRATION NUMBER: EudraCT no.: 2019-001539-29.


Subject(s)
Abortion, Spontaneous , Progesterone , Female , Pregnancy , Humans , Adult , Pregnancy Rate , Prospective Studies , Administration, Rectal , Embryo Transfer/methods , Estradiol , Hormone Replacement Therapy , Retrospective Studies
9.
JDS Commun ; 4(4): 278-283, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37521069

ABSTRACT

This study investigated play behavior of dairy calves kept with the cow either full time or half time, in groups of 4 dam-calf pairs in 68-m2 straw-bedded pens. Twenty-four of 48 calves entering the experiment (2 calves in each of 12 pens) were observed for 24 h at 3 and 7 wk of age. The total duration of locomotor play did not differ between the 2 treatments, but full-time calves performed less frontal pushing (social play). Irrespective of treatment, calves performed less parallel locomotor play and more frontal pushing at 7 wk than at 3 wk of age. Analysis of the intensity of play behavior during morning and afternoon milking, during the night, and between the morning and afternoon milkings showed that calves performed locomotor play more intensively after the cows had left the pen for milking than during other periods. The results illustrate the importance of space and external stimulation for the performance of play behavior.

10.
J Dairy Sci ; 106(9): 5853-5879, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37474370

ABSTRACT

The aim of this systematic review was to summarize the literature assessing the effects of milk feeding practices on the behavior, health, and performance of dairy calves. Peer-reviewed, published articles, written in English, directly comparing the effects of milk allowance, milk feeding methods, or milk feeding frequency on dairy calves were eligible for inclusion. Outcome measures could include sucking behavior, sucking on a teat (nutritive sucking, non-nutritive sucking on a teat), abnormal sucking behavior (non-nutritive sucking on pen fixtures, other oral behaviors, or cross-sucking), signs of hunger (vocalizations or unrewarded visits at the milk feeder), activity (lying time or locomotor play), feeding behavior (milk intake, starter intake, milk meal duration, or starter meal duration), growth (body weight or average daily gain), and health (occurrence of diarrhea, respiratory disease, or mortality). We conducted 2 targeted searches using Web of Science and PubMed to identify key literature. The resulting articles underwent a 2-step screening process. This process resulted in a final sample of 94 studies. The majority of studies investigated milk allowance (n = 69). Feeding higher milk allowances had a positive or desirable effect on growth, reduced signs of hunger, and increased locomotor play behavior during the preweaning period, whereas starter intake was reduced. Studies addressing health pointed to no effect of milk allowance, with no consistent evidence indicating that higher milk allowances result in diarrhea. Studies addressing milk feeding methods (n = 14) found that feeding milk by teat reduced cross-sucking and other abnormal oral behaviors. However, results on the effect of access to a dry teat were few and mixed. Milk feeding frequency (n = 14 studies) appeared to have little effect on feed intakes and growth; however, there is some evidence that calves fed less frequently experience hunger.
Overall, findings strongly suggest feeding higher volumes of milk using a teat; however, further work is needed to determine the optimal feeding frequency for dairy calves.


Subject(s)
Feeding Behavior , Milk , Animals , Cattle , Eating , Body Weight , Diarrhea/veterinary , Health Behavior , Weaning , Animal Feed , Diet/veterinary
11.
NPJ Breast Cancer ; 9(1): 47, 2023 May 31.
Article in English | MEDLINE | ID: mdl-37258527

ABSTRACT

Estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) expression guide the use of neoadjuvant chemotherapy (NACT) in patients with early breast cancer. We evaluate the independent predictive value of adding a multigene profile (CIT256 and PAM50) to the immunohistochemical (IHC) profile regarding pathological complete response (pCR) and conversion of positive to negative axillary lymph node status. The cohort includes 458 patients who had genomic profiling performed as standard of care. Using logistic regression, higher pCR and node conversion rates among patients with Non-luminal subtypes are shown, and importantly, the predictive value is independent of IHC profile. In patients with ER-positive and HER2-negative breast cancer, an odds ratio of 9.78 (95% CI 2.60-36.8; P < 0.001) is found for pCR among CIT256 Non-luminal vs. Luminal subtypes. The results suggest a role for integrated use of up-front multigene subtyping for selection of a neoadjuvant approach in ER-positive, HER2-negative breast cancer.
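The odds ratio reported above comes from a logistic regression model. For intuition, a crude odds ratio with a Wald confidence interval can be computed directly from a 2x2 table; the counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:

                 outcome+  outcome-
    exposed         a         b
    unexposed       c         d
    All four cells must be non-zero.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

A logistic regression odds ratio additionally adjusts for covariates (here, the IHC profile), which a raw 2x2 table cannot do.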

12.
J Surg Res ; 290: 28-35, 2023 10.
Article in English | MEDLINE | ID: mdl-37178557

ABSTRACT

INTRODUCTION: In July 2017, a policy to increase the use of segmental grafts (SGs) was implemented at our institution. The aim was to compare changes in waitlist activity after implementation of this policy. METHODS: A single-center, retrospective study. Pediatric patients on the liver waiting list between January 2015 and December 2019 were screened. Patients were classified as receiving a liver transplant (LT) before (Period 1) or after (Period 2) the policy change. Primary end points were transplant rates and time to transplant. RESULTS: Sixty-five first LTs performed on 65 patients were included. Twenty-nine LTs were performed during Period 1 and 36 during Period 2. More than half (55%) of LTs in Period 2 were SGs, compared to 10.3% in Period 1 (P < 0.001). Forty-nine and 56 pediatric candidates on the waiting list accounted for 38.78 and 24.48 person-years during Period 1 and Period 2, respectively. Transplant rates per 100 person-years on the waiting list increased from 85.09 during Period 1 to 187.87 in Period 2 (rate ratio: 2.20; P < 0.001). Median time to receive an LT decreased from 229 d in Period 1 to 75 d during Period 2 (P = 0.013). One-year patient survival rates were 96.6% in Period 1 and 95.7% in Period 2. One-year graft survival rates were 89.7% and 88% in Period 1 and Period 2, respectively. CONCLUSIONS: A policy to increase the use of SGs was associated with significantly higher transplant rates and lower waiting times. Implementation of this policy can be done successfully with no observed negative impact on patient and graft survival.
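The transplant rates above are incidence rates per 100 person-years, and the rate ratio is simply the quotient of the two periods' rates. A small sketch (hypothetical numbers):

```python
def rate_per_100py(events, person_years):
    """Incidence rate per 100 person-years: events / time at risk x 100."""
    return 100.0 * events / person_years

def rate_ratio(rate_after, rate_before):
    """Ratio of two incidence rates, e.g. Period 2 vs Period 1."""
    return rate_after / rate_before
```

Person-years denominators credit each candidate for their actual time on the list, which is why rates can rise even when the raw transplant counts look similar.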


Subject(s)
Liver Transplantation , Humans , Child , Retrospective Studies , Liver , Survival Rate , Waiting Lists
13.
Prev Vet Med ; 214: 105899, 2023 May.
Article in English | MEDLINE | ID: mdl-36940534

ABSTRACT

Research has long established the connection between antimicrobial use (AMU) and antimicrobial resistance (AMR) in production animals, and shown that ceasing AMU reduces AMR. Our previous study of Danish slaughter-pig production found a quantitative relationship between lifetime AMU and the abundance of antimicrobial resistance genes (ARGs). This study aimed to generate further quantitative knowledge on how changes in AMU on farms influence the abundance of ARGs, both with immediate effect and over time. The study included 83 farms that were visited 1 to 5 times. From each visit, a pooled faecal sample was produced. The abundance of ARGs was obtained by metagenomics. We used two-level linear mixed models for estimating the effect of AMU on the abundance of ARGs against six antimicrobial classes. The lifetime AMU of each batch was calculated from usage during their three rearing periods: as piglets, weaners, and slaughter pigs (rearing pathway). AMU at farm level was estimated as the mean lifetime AMU of the sampled batches from each farm. At batch level, AMU was measured as the deviation between the batch-specific lifetime AMU and the general mean lifetime AMU at the farm. For peroral tetracycline and macrolide use, there was a significant quantitative linear effect on the abundance of ARGs in batches within individual farms, indicating an immediate effect of changed AMU from batch to batch within farms. These estimated effects between batches within farms were approximately 1/2-1/3 of the effect estimated between farms. For all antimicrobial classes, the effect of mean farm-level AMU on the abundance of ARGs present in the faeces of slaughter pigs was significant. This effect was identified only for peroral use, except for lincosamides, where the effect was for parenteral use.
The results also indicated that the abundance of ARGs against a specific antimicrobial class increased with the peroral usage of one or several other antimicrobial classes, except for ARGs against beta-lactams. These effects were generally lower than the AMU effect of the specific antimicrobial class itself. Overall, mean farm-level peroral lifetime AMU affected both the abundance of ARGs of the corresponding antimicrobial class and the abundance of ARGs of other classes. However, batch-level deviations in AMU affected only the abundance of ARGs of the same antimicrobial class. The results do not exclude an effect of parenteral antimicrobial usage on the abundance of ARGs.


Subject(s)
Anti-Bacterial Agents , Anti-Infective Agents , Swine , Animals , Anti-Bacterial Agents/pharmacology , Farms , Drug Resistance, Bacterial/genetics , Anti-Infective Agents/pharmacology , Denmark
14.
J Pediatr ; 257: 113339, 2023 06.
Article in English | MEDLINE | ID: mdl-36731714

ABSTRACT

OBJECTIVES: To determine whether neonatal conjugated or direct bilirubin levels were elevated in infants with biliary atresia (BA) and to estimate the number of newborns who would have positive screens in the nursery necessitating repeat testing after discharge. STUDY DESIGN: We used administrative data from a large integrated healthcare network in Utah to identify newborns who had a fractionated bilirubin recorded during the birth admission from 2005 through 2019. Elevated conjugated bilirubin was defined as greater than 0.2 mg/dL and elevated direct bilirubin as greater than 0.5 mg/dL (>97.5th percentile for the assays). We performed simulations to estimate the anticipated number of false-positive screens. RESULTS: There were 32 cases of BA and 468,161 live births during the study period (approximately 1 in 14,700). There were 252,892 newborns with fractionated bilirubin assessed, including 26 of those subsequently confirmed to have BA. Conjugated or direct bilirubin was elevated in all 26 infants with BA and in an additional 3246 newborns (1.3%) without BA. Simulated data suggest that 9-21 per 1000 screened newborns will have an elevated conjugated or direct bilirubin using laboratory-based thresholds for a positive screen. Screening characteristics improved with higher thresholds without increasing false-negative tests. CONCLUSIONS: This study validates previous findings that conjugated or direct bilirubin is elevated in the newborn period in patients with BA. A higher threshold for conjugated bilirubin improved screening performance. Future studies are warranted to determine the optimal screening test for BA and to assess the effectiveness and cost-effectiveness of implementing such a program.
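The screening performance follows from the reported counts. Taking 26 true positives, 3246 false positives, and 252,892 screened newborns, and assuming no BA cases were missed among those tested (an assumption; only 26 of the 32 BA cases had fractionated bilirubin measured), the basic metrics can be computed as:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and false-positive rate from screen counts."""
    sens = tp / (tp + fn)   # elevated among true cases
    spec = tn / (tn + fp)   # normal among non-cases
    ppv = tp / (tp + fp)    # true cases among positive screens
    fpr = fp / (fp + tn)    # positive screens among non-cases
    return sens, spec, ppv, fpr
```

Under these assumptions (tn = 252,892 − 26 − 3246), the false-positive rate is ≈1.3%, consistent with the abstract, and the positive predictive value is ≈0.8%, illustrating why a higher conjugated-bilirubin threshold improves screening performance.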


Subject(s)
Biliary Atresia , Infant , Infant, Newborn , Humans , Biliary Atresia/diagnosis , Bilirubin , Cohort Studies , Utah/epidemiology , Liver Function Tests
15.
J Pediatr Gastroenterol Nutr ; 76(4): 440-446, 2023 04 01.
Article in English | MEDLINE | ID: mdl-36720105

ABSTRACT

OBJECTIVES: We sought to evaluate the safety and effectiveness of fecal microbiota transplantation (FMT) for recurrent Clostridioides difficile infection (CDI) in pediatric immunocompromised (IC) patients. METHODS: This is a multicenter retrospective cohort study of pediatric participants who underwent FMT between March 2013 and April 2020 with 12-week follow-up. Pediatric patients were included if they met the definition of IC and were treated with FMT for an indication of recurrent CDI. We excluded patients over 18 years of age, those with incomplete records, insufficient follow-up, or not meeting study definition of IC. We also excluded those treated for Clostridioides difficile recurrence without meeting the study definition and those with inflammatory bowel disease without another immunocompromising condition. RESULTS: Of 59 pediatric patients identified at 9 centers, there were 42 who met inclusion and no exclusion criteria. Included patients had a median age of 6.7 years. Etiology of IC included: solid organ transplantation (18, 43%), malignancy (12, 28%), primary immunodeficiency (10, 24%), or other chronic conditions (2, 5%). Success rate was 79% after first FMT and 86% after 1 or more FMT. There were no statistically significant differences in patient characteristics or procedural components when patients with a failed FMT were compared to those with a successful FMT. There were 15 total serious adverse events (SAEs) in 13 out of 42 (31%) patients that occurred during the follow-up period; 4 (9.5%) of which were likely treatment-related. There were no deaths or infections with multidrug resistant organisms during follow-up and all patients with a SAE fully recovered. CONCLUSIONS: The success rate of FMT for recurrent CDI in this pediatric IC cohort is high and mirrors data for IC adults and immunocompetent children. FMT-related SAEs do occur (9.5%) and highlight the need for careful consideration of risk and benefit.


Subject(s)
Clostridioides difficile , Clostridium Infections , Adult , Humans , Child , Adolescent , Fecal Microbiota Transplantation/adverse effects , Retrospective Studies , Treatment Outcome , Recurrence , Clostridium Infections/therapy
16.
N Z Vet J ; 71(3): 128-132, 2023 May.
Article in English | MEDLINE | ID: mdl-36688794

ABSTRACT

AIMS: To compare the effect on mortality and length of hospital stay of propofol with that of sodium thiopentone for the management of dogs with status epilepticus (SE) and refractory status epilepticus (RSE). METHODS: In this cohort study, medical records of a veterinary referral clinic in Argentina were retrospectively searched for dogs that were hospitalised and required induction of therapeutic coma (TC) with either propofol or sodium thiopentone for the management of SE or RSE of any cause. A logistic regression model was performed to evaluate the association between the type of anaesthetic used and in-hospital mortality adjusting for the type of epilepsy (idiopathic, structural, or reactive). Kaplan-Meier estimated survival curves for the length of hospital stay by the type of anaesthetic drug were compared using the log-rank test (deaths were considered censored events). Cox proportional hazards regression was used to estimate hazard ratios for time to hospital discharge, unadjusted and adjusted for type of epilepsy. RESULTS: A total of 24 dogs with SE were included in the study: eight treated with propofol and 16 treated with sodium thiopentone. Four dogs treated with propofol (proportion = 0.50; 95% CI = 0.15-0.84), and eight treated with sodium thiopentone (proportion = 0.50; 95% CI = 0.50-0.74) died during hospitalisation. The median hospitalisation time was 43 (IQR 24-56) hours for dogs that were treated with propofol and 72 (IQR 64-96) hours for dogs that were treated with sodium thiopentone. There was no evidence of a difference in the median duration of TC in dogs treated with propofol (12 (IQR 8-24) hours) or with sodium thiopentone (12 (IQR 7.5-20) hours; p = 0.946). In the logistic regression model, no evidence of association between the anaesthetic protocol for the management of RSE and in-hospital mortality, adjusted for the type of epilepsy, was found (OR 1.09 (95% CI = 0.17-6.87); p = 0.925). 
Cox regression analysis revealed a difference in the time to hospital discharge, adjusted by the type of epilepsy, between treatment groups (HR = 0.05 (95% CI = 0.01-0.54); p = 0.013). CONCLUSIONS AND CLINICAL RELEVANCE: The time spent in hospital before discharge was longer in dogs with RSE treated with sodium thiopentone compared to those treated with propofol. However, as the sample size was very small, the results obtained in the present study should be analysed with caution. Further studies including a greater number of dogs are required.
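The Kaplan-Meier estimator used for the length-of-stay comparison above can be written in a few lines; this is a generic sketch with toy data, not the study's records (under the study's convention, event = 1 means discharge and deaths are censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of remaining-in-hospital probability.

    times: time of discharge or censoring; events: 1 = discharge observed,
    0 = censored (e.g. death, under this study's convention).
    Returns a list of (time, probability still hospitalised) steps.
    """
    data = sorted(zip(times, events))
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # discharges at t
        n = sum(1 for tt, _ in data if tt >= t)             # still at risk at t
        if d:
            surv *= 1 - d / n
            curve.append((t, surv))
        while i < len(data) and data[i][0] == t:            # advance past time t
            i += 1
    return curve
```

For example, `kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])` steps the estimate down at each observed discharge and skips the censored record at t = 3. The log-rank test then compares two such curves under the null hypothesis that they come from the same survival function.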


Subject(s)
Anesthetics , Dog Diseases , Propofol , Status Epilepticus , Dogs , Animals , Thiopental/therapeutic use , Thiopental/pharmacology , Propofol/therapeutic use , Propofol/pharmacology , Cohort Studies , Retrospective Studies , Status Epilepticus/drug therapy , Status Epilepticus/veterinary , Anesthetics/therapeutic use , Sodium/therapeutic use , Dog Diseases/drug therapy
17.
J Dairy Sci ; 106(2): 937-953, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36460507

ABSTRACT

The aim of the study was to investigate the effects of substituting silage of chopped grass with pulp silage of grass fractionated once or twice in a biorefinery using a screw press on fiber kinetics, protein value, and production of CH4 in dairy cows. Six lactating multiparous Holstein cows in mid-lactation (176 ± 93 d in milk), cannulated in the rumen, duodenum, and ileum, were used in an incomplete 6 × 4 Latin square design with a 2 × 3 factorial arrangement of treatments. Perennial ryegrass was harvested in third regrowth from the same field at early and late developmental stage (35 and 44 d of regrowth, respectively) and subjected to 1 of 3 types of processing within each developmental stage. Grass was either harvested for normal silage making (mowed, wilted, chopped, and ensiled), or harvested fresh and fractionated using a screw press. Half of the pulp from the first fractionation was ensiled, whereas the other half of the pulp was rehydrated, fractionated a second time, and the resulting pulp was ensiled. The grass and pulp silages were used with concentrates (65:35 forage to concentrate ratio) to make total mixed rations (TMR) based on either silage of chopped grass (GS), pulp silage of grass fractionated once (1×P), or pulp silage of grass fractionated twice (2×P), harvested either at early (E) or late (L) developmental stage, resulting in 6 different TMR treatments (EGS, E1×P, E2×P, LGS, L1×P, L2×P). The TMR were fed for ad libitum intake and samples of intestinal digesta and feces were collected for determination of digestibility. The effect of processing on ash-free neutral detergent fiber (aNDFom) concentration in silages depended on developmental stage; within each developmental stage, however, pulp silage of grass fractionated twice had a higher aNDFom concentration than pulp silage of grass fractionated once and silage of chopped grass. The 2×P resulted in lower (14.9 ± 0.55 vs. 17.5 ± 0.54 kg/d) dry matter intake (DMI) compared with GS.
The effects of processing and developmental stage interacted such that apparent total-tract aNDFom digestibility was higher (784 ± 13 vs. 715 ± 13 g/kg) for L2×P compared with LGS, whereas no difference was found between E2×P and EGS. Moreover, the protein value was higher (106 ± 5 vs. 92 ± 5 g AA digested in the small intestine/kg of DMI) for 2×P compared with GS. Unexpectedly, processing had no effect on fractional rate of digestion of digestible aNDFom or CH4 yield (L/kg of DMI), whereas feeding forages harvested at early compared with late developmental stage resulted in lower CH4 yield. Cows fed pulp silage of grass fractionated once generally showed results intermediate between those fed silage of chopped grass and those fed pulp silage of grass fractionated twice. This study showed that pulp silage of fractionated grass could serve as feed for dairy cows because the fiber digestibility and protein value improved, but further research investigating effects of physical processing of forage on fiber kinetics is required.


Subject(s)
Lolium , Female , Cattle , Animals , Silage/analysis , Lactation , Diet/veterinary , Poaceae/metabolism , Milk/metabolism , Rumen/metabolism , Digestion , Zea mays
18.
Hepatology ; 77(2): 512-529, 2023 02 01.
Article in English | MEDLINE | ID: mdl-36036223

ABSTRACT

BACKGROUND AND AIMS: Alagille syndrome (ALGS) is a multisystem disorder, characterized by cholestasis. Existing outcome data are largely derived from tertiary centers, and real-world data are lacking. This study aimed to elucidate the natural history of liver disease in a contemporary, international cohort of children with ALGS. APPROACH AND RESULTS: This was a multicenter retrospective study of children with a clinically and/or genetically confirmed ALGS diagnosis, born between January 1997 and August 2019. Native liver survival (NLS) and event-free survival rates were assessed. Cox models were constructed to identify early biochemical predictors of clinically evident portal hypertension (CEPH) and NLS. In total, 1433 children (57% male) from 67 centers in 29 countries were included. The 10- and 18-year NLS rates were 54.4% and 40.3%, respectively. By 10 and 18 years, 51.5% and 66.0% of children with ALGS experienced ≥1 adverse liver-related event (CEPH, transplant, or death). Children (>6 and ≤12 months) with median total bilirubin (TB) levels between ≥5.0 and <10.0 mg/dl had a 4.1-fold (95% confidence interval [CI], 1.6-10.8), and those ≥10.0 mg/dl had an 8.0-fold (95% CI, 3.4-18.4) increased risk of developing CEPH compared with those <5.0 mg/dl. Median TB levels between ≥5.0 and <10.0 mg/dl and ≥10.0 mg/dl were associated with a 4.8-fold (95% CI, 2.4-9.7) and a 15.6-fold (95% CI, 8.7-28.2) increased risk of transplantation relative to <5.0 mg/dl. A median TB <5.0 mg/dl was associated with higher NLS rates relative to ≥5.0 mg/dl, with 79% reaching adulthood with native liver (p < 0.001). CONCLUSIONS: In this large international cohort of ALGS, only 40.3% of children reach adulthood with their native liver. A TB <5.0 mg/dl between 6 and 12 months of age is associated with better hepatic outcomes. These thresholds provide clinicians with an objective tool to assist with clinical decision-making and in the evaluation of therapies.


Subject(s)
Alagille Syndrome , Cholestasis , Hypertension, Portal , Humans , Child , Male , Female , Alagille Syndrome/epidemiology , Retrospective Studies , Hypertension, Portal/etiology
19.
Hepatology ; 77(2): 530-545, 2023 02 01.
Article in English | MEDLINE | ID: mdl-36069569

ABSTRACT

BACKGROUND AND AIMS: Detailed investigation of the biological pathways leading to hepatic fibrosis and identification of liver fibrosis biomarkers may facilitate early interventions for pediatric cholestasis. APPROACH AND RESULTS: A targeted enzyme-linked immunosorbent assay-based panel of nine biomarkers (lysyl oxidase, tissue inhibitor of matrix metalloproteinase (MMP) 1, connective tissue growth factor [CTGF], IL-8, endoglin, periostin, Mac-2-binding protein, MMP-3, and MMP-7) was examined in children with biliary atresia (BA; n = 187), alpha-1 antitrypsin deficiency (A1AT; n = 78), and Alagille syndrome (ALGS; n = 65) and correlated with liver stiffness (LSM) and biochemical measures of liver disease. Median age and LSM were 9 years and 9.5 kPa, respectively. After adjusting for covariates, there were positive correlations between LSM and endoglin (p = 0.04), IL-8 (p < 0.001), and MMP-7 (p < 0.001) in participants with BA. The best prediction model for LSM in BA using clinical and lab measurements had an R2 = 0.437; adding IL-8 and MMP-7 improved R2 to 0.523 and 0.526 (both p < 0.0001). In participants with A1AT, CTGF and LSM were negatively correlated (p = 0.004); adding CTGF to an LSM prediction model improved R2 from 0.524 to 0.577 (p = 0.0033). Biomarkers did not correlate with LSM in ALGS. A significant number of biomarker/lab correlations were found in participants with BA but not those with A1AT or ALGS. CONCLUSIONS: Endoglin, IL-8, and MMP-7 significantly correlate with increased LSM in children with BA, whereas CTGF inversely correlates with LSM in participants with A1AT; these biomarkers appear to enhance prediction of LSM beyond clinical tests. Future disease-specific investigations of change in these biomarkers over time and as predictors of clinical outcomes will be important.
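The model comparisons above rest on R2, the proportion of LSM variance explained by the predictors; as a reminder of what the reported 0.437 to 0.523 improvement measures, a minimal computation with toy numbers (not study data):

```python
def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# A model whose predictions track y more closely leaves less residual
# variance, so its R2 is higher -- the basis for comparing the clinical-only
# model against the model with IL-8 and MMP-7 added.
```

Perfect predictions give R2 = 1.0, while a model that only predicts the mean of y gives R2 = 0.0.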


Subject(s)
Alagille Syndrome , Cholestasis , Elasticity Imaging Techniques , Liver Diseases , Humans , Child , Liver/pathology , Matrix Metalloproteinase 7 , Endoglin , Interleukin-8 , Cholestasis/pathology , Liver Cirrhosis/diagnosis , Liver Cirrhosis/pathology , Liver Diseases/pathology , Biomarkers , Alagille Syndrome/pathology
20.
Scand J Rheumatol ; 52(5): 468-480, 2023 09.
Article in English | MEDLINE | ID: mdl-36315419

ABSTRACT

OBJECTIVE: Dosing of tumour necrosis factor-α inhibitors (TNFis) is not personalized causing interindividual variation in serum drug levels; however, dose optimization is not widely implemented. We hypothesized that some patients are overdosed; thus, drug prescription could be reduced by therapeutic drug monitoring (TDM). METHOD: Independent of disease activity, 239 adults treated for rheumatoid arthritis (n = 99), psoriatic arthritis (n = 48), or spondyloarthritis (n = 92) were recruited for a 48-week prospective, randomized open-label trial. Standard care alone or plus TDM was applied in chronic arthritis patients treated with infliximab (IFX) (n = 81), etanercept (ETN) (n = 79), or adalimumab (ADA) (n = 79). Serum TNFi trough levels assessed at inclusion and every 4 months determined patients within/outside predefined therapeutic intervals, supporting change in prescription or drug switch. The primary endpoint was reduced drug prescription. RESULTS: Compared to standard care, TDM reduced prescribed IFX [-12% (95% confidence interval -20, -3); p = 0.001] and ETN [-15% (-29, 1); p = 0.01], and prolonged the interdosing intervals of ETN [+235% (38, 432); p = 0.02] and ADA [+28% (6, 51); p = 0.04]. Time to drug switch was accelerated (χ2 = 6.03, p = 0.01). No group differences in adverse events, disease activity, or self-reported outcomes were shown, indicating equally sustained remission. CONCLUSIONS: TDM reduced prescription of IFX, ETN, and ADA and identified patients benefiting from accelerated drug switch, thereby minimizing treatment failure, risk of toxicity, and unnecessary adverse events.
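The TDM decision step described in the methods amounts to comparing each serum trough level with a predefined therapeutic interval; a schematic sketch of that rule (the bounds and recommended actions are illustrative assumptions, since the protocol's drug-specific intervals are not given in the abstract):

```python
def tdm_action(trough, low, high):
    """Classify a serum trough level against a therapeutic interval.

    low/high are the drug-specific interval bounds set by the trial
    protocol; the values passed in any call here are illustrative.
    """
    if trough < low:
        return "below interval: consider dose increase or drug switch"
    if trough > high:
        return "above interval: consider dose reduction or longer interdosing interval"
    return "within interval: continue current prescription"
```

For example, a trough below the lower bound triggers the dose-increase/switch branch, which is how TDM can identify patients benefiting from an accelerated drug switch.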


Subject(s)
Antirheumatic Agents , Arthritis, Rheumatoid , Adult , Humans , Tumor Necrosis Factor Inhibitors/therapeutic use , Drug Monitoring , Prospective Studies , Tumor Necrosis Factor-alpha , Adalimumab/therapeutic use , Etanercept/therapeutic use , Infliximab/therapeutic use , Arthritis, Rheumatoid/drug therapy , Prescriptions , Treatment Outcome