Results 1 - 20 of 28
1.
BMC Pediatr ; 24(1): 131, 2024 Feb 19.
Article in English | MEDLINE | ID: mdl-38373918

ABSTRACT

BACKGROUND: The objective of this study was to assess the likelihood of acute appendicitis (AA) in children presenting with abdominal symptoms at the emergency department (ED), based on their prior primary care (PC) consultation history. METHODS: Between February and June 2021, we prospectively enrolled all children presenting at the ED with acute abdominal pain suggestive of possible AA. They were categorized into three groups: those assessed by a PC physician (PG), those brought in by their family without a prior consultation (FG), and those admitted after a PC consultation without being assessed as such (NG). The primary objective was to assess the probability of AA diagnosis using the Pediatric Appendicitis Score (PAS). Secondary objectives included analyzing PAS and C-reactive protein (CRP) levels according to the duration of pain and the final diagnoses. RESULTS: 124 children were enrolled in the study (PG, n = 56; FG, n = 55; NG, n = 13). Among them, 29 patients (23.4%) were diagnosed with AA, with 13 cases (23.2%) from the PG and 14 cases (25.4%) from the FG. The mean PAS scores for AA cases from the PG and FG were 6.69 ± 1.75 and 7.57 ± 1.6, respectively (p = 0.3340). Both PAS scores and CRP levels correlated significantly with AA severity. No cases of AA were observed with PAS scores < 4. CONCLUSIONS: There was no significant difference in PAS scores between patients referred by the PG and those in the FG, even though PAS scores tended to be higher for patients with AA. We propose a new decision-making algorithm for PC practice, which incorporates inflammatory markers and pain duration. TRIAL REGISTRATION: Institutional Ethics Committee registration number: 447-2021-103 (10/01/2021). CLINICAL TRIALS REGISTRATION NUMBER: ClinicalTrials.gov Identifier: NCT04885335 (registered on 13/05/2021).


Subject(s)
Abdomen, Acute , Appendicitis , Child , Humans , Appendicitis/diagnosis , Appendicitis/complications , Prospective Studies , Abdominal Pain/etiology , Abdominal Pain/complications , Leukocyte Count , Acute Disease , Sensitivity and Specificity
2.
Ann Intern Med ; 177(1): 29-38, 2024 01.
Article in English | MEDLINE | ID: mdl-38079634

ABSTRACT

BACKGROUND: Endoscopic resection of adenomas prevents colorectal cancer, but the optimal technique for larger lesions is controversial. Piecemeal endoscopic mucosal resection (EMR) has a low adverse event (AE) rate but a variable recurrence rate necessitating early follow-up. Endoscopic submucosal dissection (ESD) can reduce recurrence but may increase AEs. OBJECTIVE: To compare ESD and EMR for large colonic adenomas. DESIGN: Participant-masked, parallel-group, superiority, randomized controlled trial (ClinicalTrials.gov: NCT03962868). SETTING: Multicenter study involving 6 French referral centers, from November 2019 to February 2021. PARTICIPANTS: Patients with large (≥25 mm) benign colonic lesions referred for resection. INTERVENTION: Patients were randomly assigned by computer 1:1 (stratified by lesion location and center) to ESD or EMR. MEASUREMENTS: The primary end point was 6-month local recurrence (neoplastic tissue on endoscopic assessment and scar biopsy). Secondary end points were technical failure, en bloc R0 resection, and cumulative AEs. RESULTS: In total, 360 patients were randomly assigned to ESD (n = 178) or EMR (n = 182). In the primary analysis set (n = 318 lesions in 318 patients), recurrence occurred after 1 of 161 ESDs (0.6%) and 8 of 157 EMRs (5.1%) (relative risk, 0.12 [95% CI, 0.01 to 0.96]). No recurrence occurred in R0-resected cases (90%) after ESD. AEs occurred more often after ESD than after EMR (35.6% vs. 24.5%; relative risk, 1.4 [CI, 1.0 to 2.0]). LIMITATION: Procedures were performed under general anesthesia during hospitalization, in accordance with the French health system. CONCLUSION: Compared with EMR, ESD reduces the 6-month recurrence rate, obviating the need for systematic early follow-up colonoscopy, at the cost of more AEs. PRIMARY FUNDING SOURCE: French Ministry of Health.


Subject(s)
Adenoma , Colonic Neoplasms , Colorectal Neoplasms , Humans , Colonic Neoplasms/surgery , Colonic Neoplasms/pathology , Colonoscopy/adverse effects , Colonoscopy/methods , Biopsy , Adenoma/surgery , Adenoma/pathology , Treatment Outcome , Colorectal Neoplasms/surgery , Colorectal Neoplasms/pathology , Neoplasm Recurrence, Local , Intestinal Mucosa/pathology , Intestinal Mucosa/surgery , Retrospective Studies
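As a sanity check on the effect size reported above, the relative risk can be reproduced from the raw recurrence counts. A minimal sketch using a Wald-type confidence interval (the trial's own statistical method is not stated in the abstract, so the lower bound only approximately matches the published [0.01, 0.96]):

```python
import math

def relative_risk(a, n1, b, n2):
    """Relative risk of an event in group 1 vs group 2, with a 95% Wald CI.

    a, b   -- event counts in groups 1 and 2
    n1, n2 -- group sizes
    """
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) for the Wald interval
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# 6-month recurrence: 1/161 after ESD vs 8/157 after EMR
rr, lo, hi = relative_risk(1, 161, 8, 157)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # RR = 0.12 (95% CI 0.02 to 0.96)
```

With a single event in the ESD arm the interval is wide but excludes 1, matching the trial's conclusion that ESD reduces recurrence.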
3.
Front Med (Lausanne) ; 10: 1180769, 2023.
Article in English | MEDLINE | ID: mdl-37425298

ABSTRACT

Introduction: Cytomegalovirus (CMV) is the most frequent infectious complication following solid organ transplantation. Torque teno virus (TTV) viremia has been proposed as a biomarker of functional immunity in the management of kidney transplant recipients (KTR). The QuantiFERON®-CMV (QF-CMV) is a commercially available assay that allows the assessment of CD8+ T-cell responses in routine diagnostic laboratories. Methods: In a prospective national multicenter cohort of 64 CMV-seropositive (R+) KTR, we analyzed the value of TTV load and the two markers of the QF-CMV assay [QF-Ag (CMV-specific T-cell responses) and QF-Mg (overall T-cell responses)], alone and in combination, in predicting CMV reactivation (≥3 log10 IU/ml) in the first post-transplant year. We compared previously published cut-offs with specific cut-offs optimized from ROC curves for our population. Results: Using the conventional cut-off (3.45 log10 copies/ml), TTV load at D0 [inclusion visit on the day of transplantation, before induction] or at M1 (1-month post-transplant visit) performed better in predicting CMV viremia control than CMV reactivation. Survival analyses suggest a better performance of our optimized TTV cut-offs (3.78 log10 copies/ml at D0 and 4.23 log10 copies/ml at M1) for risk stratification of CMV reactivation in our R+ KTR cohort. The QF-CMV (QF-Ag = 0.2 IU/ml and QF-Mg = 0.5 IU/ml) also appears to better predict CMV viremia control than CMV reactivation. Moreover, survival analyses suggest that QF-Mg would perform better than QF-Ag in stratifying the risk of CMV reactivation. The use of our optimized QF-Mg cut-off (1.27 IU/ml) at M1 further improved risk stratification of CMV reactivation. Using conventional cut-offs, the combination of TTV load with QF-Ag or with QF-Mg did not improve prediction of CMV viremia control compared with separate analysis of each marker, but it did increase the positive predictive values. The use of our cut-offs slightly improved risk prediction of CMV reactivation. Conclusion: The combination of TTV load with QF-Ag or QF-Mg could be useful in stratifying the risk of CMV reactivation in R+ KTR during the first post-transplant year and could thereby affect the duration of prophylaxis in these patients. Clinical trial registration: ClinicalTrials.gov registry, identifier NCT02064699.
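The "cut-offs optimized from ROC curves" mentioned above are commonly chosen by maximising Youden's J statistic (sensitivity + specificity − 1). The abstract does not state which criterion was used, so the following is a generic sketch on made-up toy data, not the study's data or code:

```python
def youden_cutoff(values, labels):
    """Pick the threshold on a positive-oriented marker (e.g. log10 TTV load)
    that maximises Youden's J = sensitivity + specificity - 1.
    labels: 1 = event (e.g. CMV reactivation), 0 = no event."""
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):  # classify as positive when value >= t
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Toy example (hypothetical log10 TTV loads, not study data)
loads  = [2.5, 3.0, 3.6, 3.9, 4.1, 4.6]
events = [0,   0,   0,   1,   1,   1]
print(youden_cutoff(loads, events))  # (3.9, 1.0)
```

In practice the candidate thresholds come from the cohort's observed marker values, and the resulting cut-off is then validated by survival analysis, as the authors describe.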

4.
Front Microbiol ; 14: 1148319, 2023.
Article in English | MEDLINE | ID: mdl-36998410

ABSTRACT

Objectives: The study aimed to describe the dynamics and risk factors of Gram-negative bacteria (GNB) acquisition in preterm infants. Methods: This prospective multicenter French study included mothers hospitalized for preterm delivery and their newborns, followed until hospital discharge. Maternal feces and vaginal fluids at delivery, and neonatal feces from birth to discharge, were tested for cultivable GNB, potential acquired resistance, and integrons. The primary outcome was the acquisition of GNB and integrons in neonatal feces, and their dynamics, evaluated by survival analysis using the actuarial method. Risk factors were analyzed using Cox models. Results: Two hundred thirty-eight evaluable preterm dyads were included at five centers over 16 months. GNB were isolated in 32.6% of vaginal samples, with 15.4% of strains producing extended-spectrum beta-lactamase (ESBL) or hyperproducing cephalosporinase (HCase), and in 96.2% of maternal feces, with 7.8% ESBL-GNB or HCase-GNB. Integrons were detected in 40.2% of feces and 10.6% of GNB strains. The mean (SD) length of stay of newborns was 39.5 (15.9) days; 4 died in the hospital. At least one infection episode occurred in 36.1% of newborns. The acquisition of GNB and integrons was progressive from birth to discharge. At discharge, half of the newborns had ESBL-GNB or HCase-GNB, independently favored by premature rupture of membranes (hazard ratio [HR], 3.41; 95% confidence interval [CI], 1.71-6.81), and 25.6% had integrons (protective factor: multiple gestation; HR, 0.367; 95% CI, 0.195-0.693). Conclusion: In preterm newborns, the acquisition of GNB, including resistant strains, and of integrons is progressive from birth to discharge. Premature rupture of membranes favored colonization by ESBL-GNB or HCase-GNB.

5.
Int J Nurs Stud ; 135: 104348, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36088731

ABSTRACT

INTRODUCTION: Bowel movements and sleep disturbances in the elderly impact their quality of life and dignity. The management of bowel movements is one of carers' main activities in nursing homes, yet this activity is under-recognized. It is routinely managed with laxatives or anti-diarrheal treatments rather than being tailored to each resident's habits. We hypothesized that implementing a daytime person-centered bowel program for dependent elderly nursing-home residents could reduce nocturnal bowel movements and the sleep disturbance caused by night-time care activities. Our study evaluated the impact of a bowel program on the frequency of nights with bowel movements. METHOD: We conducted a single-center randomized controlled trial in two parallel groups: a control group with usual management versus an experimental group with the implementation of the person-centered bowel program. RESULTS: Fifty dependent elderly residents who had lived in the nursing home for over one month were included. The implementation of the person-centered bowel program significantly reduced the frequency of nights with bowel movements: 12.0 (7.0; 15.5) in the control group versus 3.7 (2.0; 6.0) in the experimental group (p < 0.001). The strategy had no significant impact on laxative intake (p = 0.470). CONCLUSIONS: The introduction of a daytime person-centered bowel program significantly reduces the frequency of nights with bowel movements for dependent nursing-home residents. This person-centered strategy restores a central place to this basic care need. Further studies could explore the impact of this program on respect, dignity, comfort, and night-time rest; it also offers carers a perspective on care that is more respectful of the person. STUDY REGISTRATION: The study was registered at ClinicalTrials.gov (NCT03118401). TWEETABLE ABSTRACT: A daytime bowel program significantly reduces the number of nights with bowel movements for dependent nursing-home residents.


Subject(s)
Laxatives , Quality of Life , Aged , Caregivers , Humans , Laxatives/therapeutic use , Nursing Homes , Patient-Centered Care/methods
6.
Neuromodulation ; 25(4): 624-632, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35227582

ABSTRACT

BACKGROUND: Fibromyalgia is a chronic painful condition with no truly effective treatment. Repetitive transcranial magnetic stimulation (rTMS) has been shown to have a therapeutic effect on pain, but questions remain about the maintenance of this effect over time. Continuing the treatment upon clinical response through maintenance sessions is promising and merits further exploration. MATERIALS AND METHODS: We conducted a randomized, parallel-group, controlled study of 78 patients to evaluate the effect of rTMS vs sham stimulation after a three-week induction treatment, followed by six months of maintenance treatment (sessions every three weeks) in the 22 patients who presented a clinical response to the induction treatment. Clinical response was defined as a ≥30% decrease from the baseline visual analog scale (VAS) score for pain and a Patient Global Impression of Change (PGIC) score >5. The Clinical Global Impression, the Fibromyalgia Impact Questionnaire, the symptom severity score, and Beck's Depression Inventory were also studied. RESULTS: A significant clinical response to rTMS was observed after the induction phase and was maintained over six months, particularly as measured by the PGIC for pain, as well as for the intensity of fatigue and depression, with no adverse effects induced by this method. CONCLUSION: A three-week rTMS treatment that produces a reduction in pain, as evaluated by VAS, should be continued with rTMS maintenance sessions for an additional six months to obtain the best possible long-term effects.


Subject(s)
Fibromyalgia , Transcranial Magnetic Stimulation , Chronic Disease , Fibromyalgia/etiology , Fibromyalgia/therapy , Humans , Pain/etiology , Pain Measurement , Pilot Projects , Transcranial Magnetic Stimulation/methods , Treatment Outcome
7.
BJA Open ; 3: 100024, 2022 Sep.
Article in English | MEDLINE | ID: mdl-37588574

ABSTRACT

Background: Tonsil surgery causes significant and challenging postoperative pain. The Analgesia Nociception Index (ANI) and videopupillometry are two techniques of interest for monitoring nociception in adults and may predict postoperative morphine requirements. We hypothesised that these techniques could predict the need for morphine after tonsillectomy in children. The main objective was to assess the prognostic value of ANI and videopupillometry, measured at the end of surgery, for morphine consumption determined by a Face, Legs, Activity, Cry, Consolability (FLACC) scale score >3 in the Post Anesthesia Care Unit (PACU). Methods: A single-centre, prospective, interventional study was performed in children between 2 and 7 yr old undergoing tonsil surgery. ANI and videopupillometry with tetanic stimulation were measured under general anaesthesia 4 min after the end of the surgical procedure. Each child was evaluated every 10 min in the PACU using the FLACC scale by a nurse blinded to the measurements performed in the operating theatre. Results: Eighty-nine children were analysed, and 39 (44%) received morphine in the PACU. Neither ANI values nor videopupillometry values were predictive of postoperative morphine consumption (areas under the receiver operating characteristic curve 0.54, 95% confidence interval [CI] 0.42-0.65, P=0.57, and 0.52, 95% CI 0.41-0.63, P=0.69, respectively). Neither ANI values nor videopupillometry values correlated with the maximum FLACC scale score in the PACU (ρ=0.04, P=0.71 and ρ=0.06, P=0.57, respectively). Conclusions: Neither ANI nor videopupillometry performed at the end of surgery can predict morphine consumption in the PACU in children undergoing tonsillectomy.

9.
J Neurol Sci ; 420: 117257, 2021 01 15.
Article in English | MEDLINE | ID: mdl-33290920

ABSTRACT

RATIONALE: Hypermetabolism (HM) in amyotrophic lateral sclerosis (ALS) reflects a high energy metabolic level, but this alteration remains controversial. The main objective of the study was to confirm the existence of HM in ALS patients compared with healthy subjects. METHODS: A cohort of ALS patients was compared to a control group without metabolic disorders. The assessment included anthropometric measurements, body composition by bioelectrical impedance analysis, and resting energy expenditure (REE) by indirect calorimetry. HM was defined as a variation > +10% between measured and calculated REE. Statistical analysis used Mann-Whitney and chi-squared tests. Multivariate analysis included logistic regression. RESULTS: 287 patients and 75 controls were included. The metabolic level was higher in ALS patients (1500 kcal/24 h [1290-1693] vs. 1230 kcal/24 h [1000-1455], p < 0.0001), as was the REE/fat-free mass ratio (33.5 kcal/kg/24 h [30.4-37.8] vs. 28.3 kcal/kg/24 h [26.1-33.6], p < 0.0001). 55.0% of ALS patients had HM vs. 13.3% of controls (p < 0.0001). HM was strongly and positively associated with ALS (OR = 9.50 [4.49-20.10], p < 0.0001). CONCLUSIONS: HM in ALS is a reality that affects more than half of patients. This work confirms a very frequent metabolic deterioration in ALS. Identifying HM can allow better adaptation of patients' nutritional intake.


Subject(s)
Amyotrophic Lateral Sclerosis , Body Composition , Calorimetry, Indirect , Energy Metabolism , Healthy Volunteers , Humans
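The hypermetabolism criterion above (measured REE exceeding calculated REE by more than +10%) can be written out directly. The abstract does not name the predictive equation used for calculated REE, so the revised Harris-Benedict formula below is an assumption for illustration, and the patient values are hypothetical:

```python
def predicted_ree_kcal(sex, weight_kg, height_cm, age_yr):
    """Predicted resting energy expenditure (kcal/24 h) from the revised
    Harris-Benedict equations (Roza & Shizgal, 1984).
    NOTE: assumed here for illustration; the study's equation is not stated."""
    if sex == "male":
        return 88.362 + 13.397 * weight_kg + 4.799 * height_cm - 5.677 * age_yr
    return 447.593 + 9.247 * weight_kg + 3.098 * height_cm - 4.330 * age_yr

def is_hypermetabolic(measured_ree, calculated_ree, threshold=0.10):
    """HM as defined in the abstract: (measured - calculated) / calculated > +10%."""
    return (measured_ree - calculated_ree) / calculated_ree > threshold

# Measured REE would come from indirect calorimetry; values here are hypothetical.
calc = predicted_ree_kcal("male", 70, 175, 60)  # ~1525 kcal/24 h
print(is_hypermetabolic(1750, calc))  # True
```

The study's group-level figures are consistent with this criterion: a median measured REE of 1500 kcal/24 h against a prediction near 1290 kcal/24 h is a +16% variation, i.e. hypermetabolic.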
10.
Dig Liver Dis ; 53(2): 231-237, 2021 02.
Article in English | MEDLINE | ID: mdl-33153929

ABSTRACT

BACKGROUND: A surveillance program was performed in colorectal cancer (CRC) patients after surgery to diagnose asymptomatic recurrence. AIMS: To assess whether 18F-FDG positron emission tomography/CT (PET/CT) improved the detection of recurrence during a 3-year follow-up. METHODS: A multicentre, two-arm randomised prospective trial comparing different 36-month follow-up strategies. Complete colonoscopy was performed at baseline and after 3 years, with clinical exams and imaging every 3 months. The conventional arm (A) received carcinoembryonic antigen testing, liver ultrasonography, and alternating lung radiography and computed tomography (CT) scans; the experimental arm (B) received PET/CT. RESULTS: A total of 365 patients with colon (79.4%) or rectal cancer (20.6%), stage II (48.2%) or III (50.8%), were enrolled in this study. At 36 months, intention-to-treat analysis revealed recurrence in 31 (17.2%) patients in arm A and 47 (25.4%) in arm B (p = 0.063). At 3 years, 7 of 31 relapses (22.5%) in arm A had been surgically treated with curative intent, compared to 17 of 47 (36.2%) in arm B (p = 0.25). The rates of recurrence and new cancers were higher in arm B than in arm A (p = 0.038). CONCLUSIONS: PET/CT follow-up every 6 months did not significantly increase the detection of recurrence at 3 years or the rate of surgically treated recurrence compared with conventional follow-up.


Subject(s)
Carcinoembryonic Antigen/blood , Colorectal Neoplasms/diagnostic imaging , Fluorodeoxyglucose F18 , Neoplasm Recurrence, Local/diagnostic imaging , Positron Emission Tomography Computed Tomography/methods , Adult , Aged , Aged, 80 and over , Colorectal Neoplasms/blood , Colorectal Neoplasms/surgery , Female , Follow-Up Studies , France , Humans , Male , Middle Aged , Neoplasm Recurrence, Local/epidemiology , Prospective Studies
11.
Integr Cancer Ther ; 19: 1534735420969818, 2020.
Article in English | MEDLINE | ID: mdl-33228382

ABSTRACT

OBJECTIVES: Physical activity (PA) programs are recommended in breast cancer care, but their modalities remain under discussion. This study aimed to determine the best time to begin a personalized, adapted program based on cardiopulmonary exercise testing. This randomized controlled trial evaluated the effect of home-based adapted PA (APA), performed during or after treatment, on cardiorespiratory fitness (CRF) at 12 months. METHOD: The primary endpoint was peak oxygen consumption (VO2peak) at 12 months (group A vs C and B vs C). Secondary endpoints included the 6-minute walking test, assessment of muscle strength, fatigue, quality of life, anxiety, and depression, and a questionnaire on PA levels. All tests were performed at baseline and at 6 and 12 months. A total of 94 patients with breast cancer were randomized to 3 groups: group A performed 6 months of APA during adjuvant care; group B, 6 months of APA after adjuvant care; and group C, 12 months of APA during and after specific care. The program combined 1 resistance session and 2 aerobic sessions per week. Analysis of variance was used for repeated measures, Student's t-test or the Mann-Whitney U-test for continuous variables, and the χ2 test for binary or categorical variables. RESULTS: The study assessed 81 participants at 6 months and 73 at 12 months. The majority of patients completed more than 85% of the exercise sessions. Baseline VO2peak and secondary outcomes did not differ among the groups. VO2peak increased during the exercise period and decreased during the chemotherapy period without APA, but no significant difference was observed at 12 months. The same pattern was observed for the 6-minute walking test, with a significant difference at 6 months between A+C and B (P = .04) but no difference among the groups at 12 months. In the 3 groups, no decreases in the other studied parameters were noted, except at 6 months in group B, which had not yet started APA. CONCLUSION: Home-based APA in breast cancer patients has a positive effect on CRF and physical function, with no differences according to the timing of the program relative to specific cancer treatment. TRIAL REGISTRATION: ClinicalTrials.gov (NCT01795612). Registered 20 February 2013.


Subject(s)
Breast Neoplasms , Cardiorespiratory Fitness , Breast Neoplasms/drug therapy , Exercise , Exercise Therapy , Female , Humans , Muscle Strength , Physical Fitness , Quality of Life , Treatment Outcome
12.
J Gynecol Obstet Hum Reprod ; 49(8): 101852, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32623065

ABSTRACT

INTRODUCTION: Few studies have investigated the effect of electromagnetic waves on the human fetus, even though mobile phone use is now ubiquitous. The aim of this study was to evaluate the association between mobile phone use by pregnant women and fetal development during pregnancy in the general population. MATERIAL AND METHODS: Data came from the NéHaVi cohort ("prospective follow-up, from intrauterine development to the age of 18 years, of children born in Haute-Vienne"), a prospective, longitudinal, multicenter (three maternity units in Haute-Vienne) observational cohort of children born between April 2014 and April 2017. The main objective was to investigate the association of mobile phone use with fetal growth. Univariate and multivariate models were generated, adjusted for the mother's socioprofessional category and other variables likely to influence fetal growth. RESULTS: 1378 medical charts were analyzed, of which 1368 mothers (99.3%) used their mobile phones during pregnancy. Mean phone time was 29.8 min per day (range: 0.0-240.0 min). After adjustment, newborns whose mothers used their mobile phones for more than 30 min/day were significantly more likely to have an AUDIPOG score ≤ 10th percentile than those whose mothers used their mobile phones for less than 5 min/day during pregnancy (aOR = 1.54 [1.03; 2.31], p = 0.0374). For women using their phones 5-15 min and 15-30 min per day, there was no significant association with an AUDIPOG score ≤ 10th percentile (aOR = 0.98 [0.58; 1.65] and aOR = 1.68 [0.99; 2.82], respectively). CONCLUSION: Using a mobile phone for calls for more than 30 min per day during pregnancy may have a negative impact on fetal growth. A prospective study should be performed to further evaluate this potential link.


Subject(s)
Cell Phone Use/adverse effects , Electromagnetic Radiation , Fetal Development , Adult , Apgar Score , Birth Weight , Female , Fetal Growth Retardation/epidemiology , France/epidemiology , Humans , Infant, Newborn , Longitudinal Studies , Pregnancy , Prenatal Exposure Delayed Effects , Prospective Studies , Time Factors
13.
J Pediatric Infect Dis Soc ; 9(6): 686-694, 2020 Dec 31.
Article in English | MEDLINE | ID: mdl-32068854

ABSTRACT

BACKGROUND: Congenital cytomegalovirus (CMV) remains an important healthcare burden, resulting from primary or secondary infection in pregnant women. Exposure to young children's saliva is a major risk factor, as the prevalence of CMV shedding can reach 34%. METHODS: This cross-sectional, multicenter, nationwide study was conducted in randomly selected day care centers (DCCs) and complemented with a survey among parents and DCCs. All children aged >3 months were eligible. The study measured the prevalence of CMV shedding in children's saliva and described the epidemiology of CMV genotypes. Risk factors for CMV shedding and high viral load were evaluated using multivariable models. RESULTS: A total of 93 DCCs participated. Among the 1770 enrolled children with evaluable samples, the prevalence of CMV shedding was 40% (713/1770; 95% confidence interval, 34.6-46.1), independently associated with age between 12 and 18 months, a history of CMV infection in ≥1 parent, and mid-level income. Prevalence increased with DCC staff workload and the number of attending children. Viral load was ≥5 log-copies CMV/mL in 48% (342/713). Risk factors for higher viral load included age between 12 and 18 months and still being breastfed. The most frequent genotype combinations were gB1-gN4c-gH2 (6.9%), gB1-gN2-gH2 (6.3%), gB4a-gN3a-gH1 (6.3%), and gB1-gN3b-gH2 (5.7%). CMV awareness among parents was low: 72% of mothers and 82% of fathers did not know their serological status, and only 41% knew something about CMV. CONCLUSIONS: CMV shedding was independently associated with risk factors related to the children, the family, and the DCC. Some of these risk factors may inform prevention strategies, including improved information for parents. CLINICAL TRIALS REGISTRATION: NCT01704222.


Subject(s)
Cytomegalovirus Infections , Cytomegalovirus , Child , Child, Preschool , Cross-Sectional Studies , Cytomegalovirus Infections/epidemiology , Day Care, Medical , Female , Humans , Infant , Parents , Pregnancy , Risk Factors , Virus Shedding
14.
World J Biol Psychiatry ; 21(10): 739-747, 2020 12.
Article in English | MEDLINE | ID: mdl-32081048

ABSTRACT

OBJECTIVES: Brain-derived neurotrophic factor (BDNF) has been associated with alcohol dependence and appears to vary after withdrawal, although its link with long-term withdrawal outcome is unknown. We aimed to assess the evolution of BDNF levels during the six months following withdrawal and to determine their association with the status of alcohol consumption. METHODS: Serum BDNF levels of alcohol-dependent patients (n = 248), along with biological and clinical parameters, were determined at the time of alcohol cessation (D0), 14 days (D14) and 28 days (D28) later, and after 2, 4, and 6 months (M2, M4, M6). RESULTS: Abstinence decreased during follow-up and was 31.9% after six months. BDNF levels had increased by 14 days after withdrawal and remained elevated throughout the six-month period, independently of alcohol consumption. Serum BDNF levels evolved over time (p < 0.0001), with a correlation between BDNF and GGT levels. The prescription of baclofen at the time of withdrawal was associated with higher serum BDNF levels throughout the follow-up, and that of anti-inflammatory drugs with lower BDNF levels. CONCLUSIONS: We propose a link between BDNF levels, liver function, and the inflammatory state in the context of alcohol abuse, and not only with alcohol dependence itself.


Subject(s)
Alcoholism , Substance Withdrawal Syndrome , Alcohol Drinking , Brain-Derived Neurotrophic Factor , Humans , Time Factors
15.
World J Hepatol ; 12(12): 1326-1340, 2020 Dec 27.
Article in English | MEDLINE | ID: mdl-33442458

ABSTRACT

BACKGROUND: The recommended monitoring tools for evaluating nucleot(s)ide analogue renal toxicity, such as estimated glomerular filtration rate (eGFR) and phosphatemia, are late markers of proximal tubulopathy. Multiple early markers are available, but no consensus exists on their use. AIM: To determine the 24-month prevalence of subclinical proximal tubulopathy (SPT), as defined by early biomarkers, in treated vs untreated hepatitis B virus (HBV)-monoinfected patients. METHODS: A prospective, non-randomized, multicenter study of HBV-monoinfected patients with few renal comorbidities was conducted. The patients were separated into three groups: naïve, starting entecavir (ETV) treatment, or starting tenofovir disoproxil (TDF) treatment. Data on the early markers of SPT, the eGFR, and phosphatemia were collected quarterly. SPT was defined as a maximal tubular reabsorption of phosphate/eGFR below 0.8 mmol/L and/or a uric acid fractional excretion above 10%. The prevalence and cumulative incidence of SPT at month 24 (M24) were calculated. Quantitative data were analyzed using analyses of variance or Kruskal-Wallis tests, whereas chi-squared or Fisher's exact tests were used for qualitative data. Multivariate analyses were used to adjust for potential confounding factors. RESULTS: Of the 196 patients analyzed, 138 (84 naïve, 28 starting ETV, and 26 starting TDF) had no SPT at inclusion. At M24, the prevalence of SPT was not statistically different between the naïve group and either treated group (21.1% vs 30.7%, P = 0.42, and 50.0% vs 30.7%, P = 0.32, for ETV and TDF, respectively); no patient had an eGFR lower than 50 mL/min/1.73 m² or phosphatemia less than 0.48 mmol/L. In the multivariate analysis, no explanatory variables were identified after adjustment. The cumulative incidence of SPT over 24 months (25.5%, 13.3%, and 52.9% in the naïve, ETV, and TDF groups, respectively) tended to be higher in the TDF group than in the naïve group (hazard ratio: 2.283, P = 0.05). SPT-free survival at M24 was 57.6%, 68.8%, and 23.5% for the naïve, ETV, and TDF groups, respectively. The median survival time without SPT, evaluated only in the TDF group, was 5.9 months. CONCLUSION: The prevalence and incidence of SPT were higher in TDF-treated patients than in naïve patients. SPT in the naïve population suggests that HBV itself can induce renal tubular toxicity.

16.
Int J Nurs Stud Adv ; 2: 100005, 2020 Nov.
Article in English | MEDLINE | ID: mdl-38745906

ABSTRACT

Purpose: To investigate whether the shape of the food plate could affect the conservation of praxis in institutionalised elderly adults with severe Alzheimer's disease or mixed dementia. Patients and methods: We conducted a monocentric, prospective, observational, before-after case-only study in 32 patients with a loss of the ability to self-feed. The primary objective was to assess the change in food praxis, using the Blandford scale, at 3 weeks after changing the food plate. Secondary variables included the impact of the change of diet on food praxis at 6 weeks, the patient's autonomy in food intake evaluated by Tully's Eating Behaviour Scale (EBS), and the enjoyment of eating assessed by Part D of the Alzheimer's Disease-Related Quality of Life (ADRQL) scale at 3 and 6 weeks. Results: At 3 weeks after changing the food plate, we observed a significant decrease in the number of aversive feeding behaviours (Δ = -0.90 ± 2.23; p = 0.03) and improved autonomy in self-feeding (Δ = 1.88 ± 3.36; p = 0.001). There was also an increase in the enjoyment of eating at 3 weeks (Δ = 4.07 ± 13.02), but it was not statistically significant. These results were not maintained at the 6-week timepoint. Conclusion: A simple change in the organisation of care during meals and the use of a familiar object can positively affect the recovery of self-feeding autonomy in patients with severe dementia.

17.
Open Forum Infect Dis ; 6(12): ofz510, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31868865

ABSTRACT

BACKGROUND: In France, pneumococcal vaccination in adults is recommended for risk groups (chronic conditions/immunosuppression). We conducted a study on invasive pneumococcal disease (IPD) in adults to identify factors associated with disease severity and death. METHODS: We included IPD cases, excluding meningitis, from 25 acute care hospitals in 6 regions. We defined severe cases as those with shock, severe sepsis, or intensive care unit admission/mechanical ventilation. We included deaths occurring within 30 days of hospitalization. Infectious disease specialists collected clinical and microbiological data on cases. RESULTS: During 2014-2017, 908 nonmeningitis IPD cases were diagnosed; 48% were severe, 84% had comorbidities, and 21% died. Ninety percent of cases with comorbidities who had previously sought health care were not vaccinated against pneumococcus. Compared with previously healthy cases, the risk of severe IPD was increased by 20% (adjusted risk ratio [aRR], 1.2; 95% confidence interval [CI], 1.0-1.4) in cases with 1-2 chronic diseases and by 30% (aRR, 1.3; 95% CI, 1.0-7.0) in those with >2 chronic diseases. Among risk groups, 13-valent pneumococcal conjugate vaccine (PCV13) serotypes and 23-valent pneumococcal polysaccharide vaccine (PPSV23) non-PCV13 serotypes were more likely to induce severe IPD than nonvaccine serotypes (aRR, 1.5; 95% CI, 1.3-1.9 and aRR, 1.3; 95% CI, 1.0-1.5, respectively). CONCLUSIONS: We observed a cumulative effect of concurrent comorbidities on severe IPD. Vaccine serotypes were more likely to induce severe IPD among risk groups. The missed opportunities for vaccination underscore the need to enhance vaccination in risk groups.

18.
Int J Nurs Stud ; 95: 1-6, 2019 Jul.
Article in English | MEDLINE | ID: mdl-30981953

ABSTRACT

BACKGROUND: In pre-continent children, collection bags are frequently used as a first-line option to obtain a urine specimen. This practice, acknowledged by several guidelines as a UTI screening step, is driven by a perception of the technique as more convenient and less painful. However, our own experience led us to consider bag removal a painful experience. OBJECTIVE: Our aim was to determine whether the use of an oleo-calcareous liniment to aid bag removal reduced the acute pain expressed by young children. METHODS: This prospective, randomized, controlled, single-blind study was carried out in two pediatric emergency departments. Pre-continent children aged 0-36 months admitted with an indication for urine testing were eligible for the study. Urine for dipstick test screening was obtained using a collection bag. At micturition, the patients were randomized into bag removal with (intervention group) or without (control group) liniment. Bag removal was recorded on video in such a manner as to permit independent assessments of pain by two evaluators blinded to group allocation. Pain was assessed using the FLACC scale. FINDINGS: 135 patients were analyzed: 70 in the intervention group and 65 in the control group. The median FLACC scores [interquartile range] for the intervention and control groups, respectively 4.0 [2.0-7.0] and 4.0 [3.0-7.0], did not differ significantly (p = 0.5). A FLACC score ≥4 was obtained for 56% of the patients and a score ≥7 for 28%. CONCLUSION: Removal of urine collection bags caused moderate to severe pain in half of the children included. The use of an oleo-calcareous liniment did not reduce this induced pain.


Subject(s)
Pain/etiology; Urinary Catheterization/adverse effects; Urinary Catheterization/instrumentation; Urine Specimen Collection/methods; Female; Humans; Infant; Infant, Newborn; Male; Prospective Studies; Single-Blind Method
19.
Reg Anesth Pain Med ; 43(6): 596-604, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29672368

ABSTRACT

BACKGROUND AND OBJECTIVES: General anesthesia for breast surgery may be supplemented by using a regional anesthetic technique. We evaluated the efficacy of the first pectoral nerve block (Pecs I) in treating postoperative pain after breast cancer surgery. METHODS: A randomized, double-blind, dual-centered, placebo-controlled trial was performed. One hundred twenty-eight patients scheduled for unilateral breast cancer surgery were recruited. A multimodal analgesic regimen and surgeon-administered local anesthetic infiltration were used for all patients. Ultrasound-guided Pecs I was performed using bupivacaine or saline. The primary outcome was the patient pain score (numerical rating scale [NRS]) in the recovery unit 30 minutes after admission or just before the morphine administration (NRS ≥4/10). The secondary outcomes were postoperative opioid consumption (ie, in the recovery unit and after 24 hours). RESULTS: During recovery, no significant difference in NRS was observed between the bupivacaine (n = 62, 3.0 [1.0-4.0]) and placebo (n = 65, 3.0 [1.0-5.0]) groups (P = 0.55). However, the NRS was statistically significantly different, although not clinically significant, for patients undergoing major surgeries (mastectomies or tumorectomies with axillary clearance) (n = 29, 3.0 [0.0-4.0] vs 4.0 [2.0-5.0], P = 0.04). Morphine consumption during recovery did not differ (1.5 mg [0.0-6.0 mg] vs 3.0 mg [0.0-6.0 mg], P = 0.20), except in the major surgery subgroup (1.5 mg [0.0-6.0 mg] vs 6.0 mg [0.0-12.0 mg], P = 0.016). Intraoperative sufentanil and cumulative morphine consumption up to 24 hours did not differ between the 2 groups. Three patients experienced complications related to the Pecs I. CONCLUSIONS: Pecs I is not better than a saline placebo in the presence of multimodal analgesia for breast cancer surgery. However, its role in extended (major) breast surgery may warrant further investigation. 
CLINICAL TRIAL REGISTRATION: This study was registered at ClinicalTrials.gov, identifier NCT01670448.


Subject(s)
Analgesia/trends; Autonomic Nerve Block/methods; Breast Neoplasms/surgery; Mastectomy/trends; Pain, Postoperative/prevention & control; Thoracic Nerves; Aged; Breast Neoplasms/diagnosis; Breast Neoplasms/drug therapy; Double-Blind Method; Female; Humans; Mastectomy/adverse effects; Middle Aged; Pain, Postoperative/diagnosis
20.
Neuroepidemiology ; 49(1-2): 64-73, 2017.
Article in English | MEDLINE | ID: mdl-28873374

ABSTRACT

BACKGROUND: Amyotrophic Lateral Sclerosis (ALS) is an age-related neurodegenerative disease with unclear characteristics and prognosis in the oldest old (80 years and over). The aim of this study was to compare the oldest old and younger ALS patients in terms of clinical and socio-demographic characteristics, and prognosis. METHODS: Incident ALS cases from the register of ALS in Limousin (FRALim), diagnosed between January 2000 and July 2013, were included. Descriptive and comparative analyses by age group were carried out. For univariate time-to-event analysis, the Kaplan-Meier estimator and log-rank test were used. Univariate and multivariate survival analyses were carried out with Cox's proportional hazards model. RESULTS: Out of 322 patients, 50 (15.5%) were aged 80 or over ("oldest old" ALS) at the time of diagnosis. Among them, the male-to-female ratio was 1.27, and 32.6% had a bulbar onset (not different from subjects aged less than 80 years). With increasing age, there was a worsening of the clinical state of the patients at the time of diagnosis in terms of weight loss, forced vital capacity, ALSFRS-R and manual muscle testing. Access to ALS referral centres decreased with age, and the use of riluzole tended to be lower in the oldest old group. The median survival of oldest old patients was 10 months shorter than that of subjects aged less than 80 years (7.4 vs. 17.4 months). CONCLUSION: The survival of oldest old ALS patients is particularly short. It relates to prognostic features at baseline and to an independent effect of advanced age.


Subject(s)
Amyotrophic Lateral Sclerosis/epidemiology; Aged; Aged, 80 and over; Female; Health Services Accessibility; Humans; Kaplan-Meier Estimate; Male; Middle Aged; Prognosis; Registries; Retrospective Studies