Results 1 - 20 of 577
1.
Nutr Metab Cardiovasc Dis ; 34(8): 1912-1921, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38740537

ABSTRACT

BACKGROUND AND AIM: Coronary artery calcification (CAC) partially explains the excess cardiovascular morbidity and mortality after kidney transplantation. This study aimed to investigate determinants of CAC in stable kidney transplant recipients (KTR) at 12 months post-transplantation. METHODS AND RESULTS: The CAC score was quantified by the Agatston method using non-contrast-enhanced computed tomography, and age- and sex-standardized CAC percentiles were calculated. Univariable multinomial logistic regression was performed to identify potential determinants of CAC; independent determinants were then entered into a multivariable multinomial logistic regression model adjusted for potential confounders. 203 KTR (age 54.0 ± 14.7 years, 61.1% male) were included. Participants were categorized into four groups according to CAC percentile (p = 0 [CAC score = 0], n = 68; 1% ≤ p ≤ 50% [CAC score = 29.0 (4.0-166.0)], n = 31; 50% < p ≤ 75% [CAC score = 101.0 (23.8-348.3)], n = 26; and p > 75% [CAC score = 581.0 (148.0-1652)], n = 83). In the multivariable multinomial logistic regression, patients with a lower phase angle and patients who had received a graft from a deceased donor had a higher risk of being in the >75th CAC percentile. CONCLUSIONS: This study identifies not only metabolic and transplant-related factors, but also phase angle, a composite marker of cell integrity, as an independent determinant of CAC at 12 months after kidney transplantation. It offers new perspectives for future research into the value of bioelectrical impedance analysis in relation to vascular calcification in kidney transplant recipients.
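A minimal sketch of the four-group categorization described above, assuming the age- and sex-standardized CAC percentile has already been computed for each participant; the cut points come from the abstract, and the function name is illustrative:

```python
def cac_percentile_group(p: float) -> str:
    """Assign a participant to one of the four groups used in the study,
    given the age- and sex-standardized CAC percentile p (0-100)."""
    if p == 0:
        return "p = 0"
    elif p <= 50:
        return "1% <= p <= 50%"
    elif p <= 75:
        return "50% < p <= 75%"
    else:
        return "p > 75%"

# Example: a participant at the 82nd percentile falls in the top group,
# the group with the highest risk in the multinomial model.
print(cac_percentile_group(82))  # p > 75%
```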


Subject(s)
Computed Tomography Angiography; Coronary Artery Disease; Kidney Transplantation; Vascular Calcification; Humans; Kidney Transplantation/adverse effects; Male; Middle Aged; Female; Vascular Calcification/diagnostic imaging; Vascular Calcification/epidemiology; Coronary Artery Disease/diagnostic imaging; Coronary Artery Disease/epidemiology; Coronary Artery Disease/diagnosis; Risk Factors; Time Factors; Adult; Aged; Risk Assessment; Treatment Outcome; Coronary Angiography; Predictive Value of Tests; Donor Selection; Tissue Donors
2.
Nutrition ; 121: 112361, 2024 May.
Article in English | MEDLINE | ID: mdl-38367316

ABSTRACT

OBJECTIVE: We investigated the associations of sarcopenia alone, overweight or obesity, and sarcopenic overweight or obesity with COVID-19 hospitalization. METHODS: Participants from the Lifelines COVID-19 cohort who were infected with COVID-19 were included in this study. Sarcopenia was defined as a relative deviation of muscle mass of ≤ -1.0 SD from the sex-specific mean, based on 24-h urinary creatinine excretion. Overweight or obesity was defined as a body mass index ≥ 25 kg/m2. Sarcopenic overweight or obesity was defined as the presence of both overweight or obesity and low muscle mass. COVID-19 hospitalization was self-reported. Logistic regression models were used to analyze the associations of sarcopenia alone, overweight or obesity, and sarcopenic overweight or obesity with COVID-19 hospitalization. RESULTS: Of the 3594 participants infected with COVID-19 who were included in this study, 173 had been admitted to the hospital. Compared with the reference group, individuals with overweight or obesity and those with sarcopenic overweight or obesity were 1.78 times and 2.09 times more likely, respectively, to have been hospitalized for COVID-19, whereas sarcopenia alone did not increase the risk of COVID-19 hospitalization. CONCLUSIONS: In this middle-aged population, sarcopenic overweight or obesity elevated the risk of hospitalization for COVID-19 among those infected more than overweight or obesity alone. These data support the relevance of sarcopenic overweight or obesity as a risk factor beyond the geriatric setting, and suggest it should be considered in risk stratification in future public health and vaccination campaigns.
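A sketch of the exposure classification from the methods, assuming muscle mass has already been expressed as a sex-specific SD score derived from 24-h urinary creatinine excretion; thresholds are taken from the abstract, names are illustrative:

```python
def covid_risk_group(muscle_sd: float, bmi: float) -> str:
    """Classify a participant per the study's definitions:
    sarcopenia = muscle-mass deviation <= -1.0 SD from the sex-specific
    mean; overweight/obesity = BMI >= 25 kg/m2."""
    sarcopenic = muscle_sd <= -1.0
    overweight = bmi >= 25.0
    if sarcopenic and overweight:
        return "sarcopenic overweight/obesity"  # ~2.09x hospitalization odds
    if overweight:
        return "overweight/obesity"             # ~1.78x hospitalization odds
    if sarcopenic:
        return "sarcopenia alone"               # no excess risk found
    return "reference"

print(covid_risk_group(muscle_sd=-1.2, bmi=27.5))  # sarcopenic overweight/obesity
```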


Subject(s)
COVID-19; Sarcopenia; Male; Middle Aged; Female; Humans; Aged; Sarcopenia/complications; Sarcopenia/epidemiology; Overweight/complications; Overweight/epidemiology; Prospective Studies; COVID-19/epidemiology; COVID-19/complications; Obesity/complications; Obesity/epidemiology; Weight Gain; Hospitalization
3.
JPEN J Parenter Enteral Nutr ; 48(1): 93-99, 2024 01.
Article in English | MEDLINE | ID: mdl-37886877

ABSTRACT

BACKGROUND: Many intensive care unit (ICU) survivors suffer long-term health issues that affect their quality of life. Nutrition inadequacy can limit their rehabilitation potential. This study investigates nutrition intake and support during ICU admission and recovery. METHODS: In this prospective cohort study, 81 adult ICU patients with stays ≥48 h were included. Data on dietary intake, feeding strategies, baseline and ICU characteristics, and 1-year outcomes (physical health and readmission rates) were collected. The number of patients achieving 1.2 g/kg/day of protein and 25 kcal/kg/day of energy at 3, 6, and 12 months after ICU admission was recorded. The impact of dietary supplementation during the year was assessed. Baseline characteristics, intake barriers, and the influence of rehabilitation on nutrition intake at 12 months were evaluated, along with the effect of inadequate intake on outcomes. RESULTS: After 12 months, only 10% of 60 patients achieved the 1.2 g/kg/day protein intake, whereas 28% reached the advised 25 kcal/kg/day energy target. Supplementary feeding significantly increased protein intake at 3, 6, and 12 months (P = 0.003, P = 0.012, and P = 0.033, respectively) and energy intake at 3 months (P = 0.003). A positive relation was found between female sex and energy intake at 12 months after ICU admission (β = 4.145; P = 0.043), and taste issues were independently associated with higher protein intake (β = 0.363; P = 0.036). However, achieving upper-quartile protein or energy intake did not translate into improved physical health outcomes. CONCLUSION: Continuous and improved nutrition care is urgently needed to support patients in reaching nutrition adequacy.
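The adequacy check described above reduces to per-kilogram arithmetic against the stated targets (1.2 g/kg/day protein, 25 kcal/kg/day energy); a sketch with illustrative names:

```python
def meets_targets(protein_g: float, energy_kcal: float, weight_kg: float):
    """Return (protein_ok, energy_ok) against the study's daily targets:
    >= 1.2 g protein/kg and >= 25 kcal/kg."""
    protein_ok = protein_g / weight_kg >= 1.2
    energy_ok = energy_kcal / weight_kg >= 25.0
    return protein_ok, energy_ok

# Example: 75 g protein and 1600 kcal for an 80-kg patient fall short
# of both targets (0.94 g/kg and 20 kcal/kg).
print(meets_targets(75, 1600, 80))  # (False, False)
```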


Subject(s)
Energy Intake; Quality of Life; Adult; Humans; Female; Cohort Studies; Prospective Studies; Intensive Care Units; Critical Illness/therapy
4.
Eur J Nutr ; 63(2): 435-443, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37985508

ABSTRACT

PURPOSE: We investigated the associations of socioeconomic position (SEP) with total and type of fish intake in a large general population, and validated whether types of fish intake were differently associated with plasma EPA and DHA in a subset of the population. METHODS: From the Lifelines Cohort Study, 94,246 participants aged 44 ± 13 years were included to test the association of two SEP indicators, i.e., education level and household income level, with dietary intakes of total, oily, lean, fried, and other types of fish. Dietary fish intake was assessed using a food frequency questionnaire. In a subset of 575 participants (mean age: 50 ± 13 years), EPA and DHA levels were measured in plasma phospholipids and triglycerides. Linear regressions were applied and adjusted for relevant covariates. RESULTS: Compared to the high education level, lower education levels were negatively associated with total, oily, lean, and other fish intake (p < 0.001 for all), and positively associated with fried fish intake (β (SE): 0.04 (0.04), p < 0.001 for middle education; 0.07 (0.04), p < 0.001 for low education), independently of relevant covariates. Similar results were observed for income levels. In the subset population, total and oily fish intakes were positively associated with plasma EPA and DHA (p < 0.02 for all). Lean and other fish intakes were positively associated with DHA only (p < 0.008 for all), not EPA, while fried fish intake was associated with neither EPA nor DHA in plasma (p > 0.1 for all). CONCLUSION: Lower SEP was associated with a lower total intake of fish, and of oily and lean fish, but with a higher intake of fried fish. Fried fish intake was not associated with fish-based EPA and DHA in plasma. Hence, SEP-related differences in fish consumption are both quantitative and qualitative.


Subject(s)
Fatty Acids, Omega-3; Animals; Humans; Adult; Middle Aged; Cross-Sectional Studies; Cohort Studies; Diet; Fishes; Educational Status; Docosahexaenoic Acids; Eicosapentaenoic Acid
5.
Nutr Metab Cardiovasc Dis ; 34(2): 455-465, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38160137

ABSTRACT

BACKGROUND AND AIMS: Whether coffee consumption is associated with changes in estimated glomerular filtration rate (eGFR) is unknown. We investigated the relationship between coffee consumption and annual eGFR change in a large Dutch population-based study. METHODS AND RESULTS: This study was performed in 78,346 participants without chronic kidney disease (CKD) in the population-based Lifelines Cohort Study. Coffee consumption was assessed at baseline using food frequency questionnaires. Outcomes were annual eGFR change and a composite kidney outcome (defined as eGFR <60 mL/min per 1.73 m2 or >20% eGFR decline). Multivariable linear and logistic regression analyses were used to evaluate the associations of coffee consumption (categories and cups/day) with kidney outcomes. Overall, 90% of the participants drank coffee daily and 36% drank >2-4 cups/day. Unadjusted mean ± SD annual eGFR change ranged from -2.86 ± 2.96 (for non-coffee drinkers) to -2.35 ± 2.62 (for participants consuming >6 cups/day) mL/min per 1.73 m2. During 3.6 ± 0.9 years of follow-up, 11.1% of participants reached the composite kidney outcome. Compared with non-coffee drinkers, higher coffee consumption was associated with less annual eGFR decline in multivariable models (β [95% CI] ranged from 0.15 [0.07, 0.22] for >0-2 cups/day to 0.29 [0.20, 0.38] for >6 cups/day, P-trend <0.001). Consumption of one more cup of coffee per day was associated with a 3% lower risk of the composite kidney outcome (OR [95% CI], 0.97 [0.96, 0.99]). The inverse association was more pronounced in a subgroup of individuals with diabetes. CONCLUSION: Coffee consumption was inversely associated with annual eGFR decline and CKD risk in a large Dutch population-based cohort.
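A sketch of the two outcome definitions from the methods, assuming baseline and follow-up eGFR (mL/min per 1.73 m2) and follow-up time in years are available; names are illustrative:

```python
def annual_egfr_change(egfr_baseline: float, egfr_followup: float,
                       years: float) -> float:
    """Annual eGFR change in mL/min per 1.73 m2 per year (negative = decline)."""
    return (egfr_followup - egfr_baseline) / years

def composite_kidney_outcome(egfr_baseline: float, egfr_followup: float) -> bool:
    """Composite outcome per the abstract: follow-up eGFR < 60
    or a > 20% decline from baseline."""
    decline_pct = (egfr_baseline - egfr_followup) / egfr_baseline * 100
    return egfr_followup < 60 or decline_pct > 20

# Example: 95 -> 82 over 3.5 years is a change of about -3.7 per year;
# the composite outcome is not met (82 >= 60 and the decline is ~13.7%).
print(round(annual_egfr_change(95, 82, 3.5), 1))  # -3.7
print(composite_kidney_outcome(95, 82))           # False
```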


Subject(s)
Kidney; Renal Insufficiency, Chronic; Humans; Cohort Studies; Risk Factors; Renal Insufficiency, Chronic/diagnosis; Renal Insufficiency, Chronic/epidemiology; Renal Insufficiency, Chronic/prevention & control; Glomerular Filtration Rate
6.
Int J Behav Med ; 2023 Dec 08.
Article in English | MEDLINE | ID: mdl-38066237

ABSTRACT

BACKGROUND: Patients with chronic kidney disease are often requested to self-monitor sodium (i.e., salt) intake, but it is currently unknown how self-monitoring would empower them. This study aims to assess: (1) how frequently self-monitoring tools are used during low-sodium diet self-management interventions; (2) whether self-efficacy (i.e., trust in one's own capability to manage the chronic disease) is associated with self-monitoring frequency; and (3) whether higher self-monitoring frequency is associated with an improvement in self-efficacy over time. METHOD: Data from two multicenter randomized controlled trials (ESMO [n = 151] and SUBLIME [n = 99]) among adult Dutch patients with chronic kidney disease (eGFR ≥ 20-25 mL/min/1.73 m2) were used. In both studies, routine care was compared to a 3-month low-sodium diet self-management intervention with several self-monitoring tools (online food diary, home blood pressure monitor, and urinary sodium measurement device [ESMO only]). Data were collected on the usage frequency of the self-monitoring tools. Frequencies during the interventions were compared between low and high baseline self-efficacy groups using the Mann-Whitney U test and t-test, and associated with changes in self-efficacy during the interventions using Spearman correlation coefficients. RESULTS: Large variations in self-monitoring frequency were observed. In both interventions, usage of self-monitoring tools was highest during the first month, with sharp drops thereafter. The online food diary was the most frequently used tool. In the ESMO intervention, low baseline self-efficacy was associated with higher usage frequency of self-monitoring tools. This finding was not confirmed in the SUBLIME intervention. No significant associations were found between usage frequency of self-monitoring tools and changes in self-efficacy over time. CONCLUSION: Patients with low self-efficacy might benefit most from frequent usage of self-monitoring tools when sufficient guidance and support are provided.

7.
J Intensive Care Soc ; 24(4): 356-363, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37841292

ABSTRACT

Background: Intensive care unit (ICU) survivors often suffer from long-term mental problems and a reduced health-related quality of life (HRQoL). Symptoms of depression, anxiety, and post-traumatic stress disorder may render patients mentally frail post-ICU, resulting in impaired recovery and an increased informal caregiver burden. The aim of this study was to investigate the prevalence of mental frailty up to 12 months after ICU admission and to pinpoint markers for early risk assessment in clinical practice. Methods: A retrospective cohort study (2012-2018) using clinical and post-ICU data of long-stay (≥48 h) ICU patients. Mental frailty was defined as clinically relevant symptoms of depression, anxiety, and/or post-traumatic stress at 12 months after discharge, based on previously validated cut-off values for the HADS (HADS-Anxiety ≥ 8; HADS-Depression ≥ 8) and the TSQ (≥ 6); caregiver strain was assessed with the CSI (≥ 7). Results: The prevalence of mental frailty at 12 months post-ICU among the total group of 239 patients was 38%. Conclusion: A significant proportion of ICU survivors can be identified as mentally frail, which is associated with impaired HRQoL at baseline and post-ICU, and with high caregiver strain. These findings emphasize the need for integrative aftercare programs for both patients and their informal caregivers.
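A sketch of the mental-frailty flag using the validated cut-offs cited above (HADS-Anxiety ≥ 8, HADS-Depression ≥ 8, TSQ ≥ 6), with caregiver strain (CSI ≥ 7) flagged separately; variable names are illustrative:

```python
def mentally_frail(hads_anxiety: int, hads_depression: int, tsq: int) -> bool:
    """Mental frailty per the study: clinically relevant symptoms on any
    of the three instruments at 12 months post-ICU."""
    return hads_anxiety >= 8 or hads_depression >= 8 or tsq >= 6

def high_caregiver_strain(csi: int) -> bool:
    """Caregiver Strain Index at or above the validated cut-off."""
    return csi >= 7

print(mentally_frail(hads_anxiety=5, hads_depression=9, tsq=3))  # True
print(high_caregiver_strain(8))                                  # True
```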

8.
Kidney Med ; 5(10): 100712, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37753249

ABSTRACT

Rationale & Objective: Patients with chronic kidney disease (CKD) not receiving dialysis, including kidney transplant recipients, often experience difficulties with self-management. An important barrier to adherence to self-management recommendations may be the presence of psychological distress, consisting of depressive and anxiety symptoms. We investigated relationships between psychological distress and adherence to self-management recommendations. Study Design: Cross-sectional online questionnaire data collected as part of the E-GOAL study. Setting & Participants: Patients with CKD (estimated glomerular filtration rate, 20-89 mL/min/1.73 m2) were recruited from April 2018 to October 2020 at 4 hospitals in The Netherlands and completed online screening questionnaires. Exposures: Psychological distress, depressive symptoms, and anxiety symptoms. Outcomes: Dietary adherence, physical activity, medication adherence, smoking, body mass index, and a CKD self-management index (i.e., the sum of 5 binary indicators of nonadherence to the recommended self-management factors). Analytical Approach: Adjusted multivariable regression and ordinal logistic regression analyses. Results: In our sample (N = 460), 27.2% of patients reported psychological distress, and 69.8% were nonadherent to 1 or more recommendations. Higher psychological distress was significantly associated with poorer dietary adherence (βadj, -0.13; 95% CI, -0.23 to -0.04), less physical activity (βadj, -0.13; 95% CI, -0.22 to -0.03), and lower medication adherence (βadj, -0.15; 95% CI, -0.24 to -0.05), but not with smoking or body mass index. Findings were similar for depressive symptoms, whereas anxiety was associated only with poorer dietary and medication adherence. Every 1-point higher psychological distress score was also associated with a higher likelihood of being nonadherent to an accumulating number of different recommendations (adjusted OR, 1.04; 95% CI, 1.02-1.07). Limitations: Cross-sectional design, possible residual confounding, and self-report. Conclusions: Many people with CKD experience psychological distress, and most of those who do have difficulty self-managing their CKD. Given the relationship between psychological distress and adherence to CKD self-management recommendations, behavioral interventions are needed to identify and treat psychological distress as a potential barrier to CKD self-management. Plain-Language Summary: This online questionnaire study investigated relationships between psychological distress and self-management among 460 people with chronic kidney disease. Over a quarter reported mild-to-severe psychological distress. Alarmingly, 4 out of 5 patients with psychological distress were also nonadherent to 1 or more self-management recommendations, and higher levels of psychological distress were associated with poorer dietary and medication adherence and lower physical activity. Moreover, patients who suffered from moderate-to-severe distress were more often nonadherent to 3 or more recommendations compared with patients with no or mild distress symptoms. Psychological distress thus appears to be a barrier to self-management. To support patients in managing chronic kidney disease, researchers and health professionals should not overlook patients' mental health.
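The CKD self-management index described under Outcomes is a simple count of binary nonadherence indicators; a sketch under that reading, with illustrative names:

```python
def self_management_index(diet_nonadherent: bool, inactive: bool,
                          medication_nonadherent: bool, smoking: bool,
                          high_bmi: bool) -> int:
    """Sum of 5 binary indicators of nonadherence to the recommended
    self-management factors (0 = adherent to all, 5 = nonadherent to all)."""
    return sum([diet_nonadherent, inactive, medication_nonadherent,
                smoking, high_bmi])

# Example: nonadherent to diet and physical activity only.
print(self_management_index(True, True, False, False, False))  # 2
```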

9.
Nephrol Dial Transplant ; 39(1): 74-83, 2023 Dec 20.
Article in English | MEDLINE | ID: mdl-37418245

ABSTRACT

BACKGROUND: Fatigue and impaired health-related quality of life (HRQoL) are common among kidney transplant recipients (KTR). We hypothesized that both may be partially attributable to poor sleep. METHODS: Cross-sectional and longitudinal data of KTR enrolled in the TransplantLines Biobank and Cohort Study were used. Sleep quality was assessed using the Pittsburgh Sleep Quality Index questionnaire. Individual strength (i.e., a composite of fatigue, concentration, motivation and physical activity), societal participation and HRQoL were assessed using validated questionnaires. RESULTS: We included 872 KTR (39% female, age 56 ± 13 years) and 335 healthy controls. In total, 33% of male KTR and 49% of female KTR reported poor sleep quality, higher than among male and female healthy controls (19% and 28%, respectively, P < .001 for both). In logistic regression analyses, female sex, anxiety, active smoking, low protein intake, physically inactive lifestyle, low plasma magnesium concentration, use of calcineurin inhibitors, non-use of mTOR inhibitors and use of benzodiazepine agonists were associated with poor sleep quality. In adjusted linear regression analyses, poor sleep was strongly and independently associated with lower individual strength [standardized β (st.β) = 0.59, 95% confidence interval (CI) 0.45 to 0.74, P < .001], poorer societal participation (frequency: st.β = -0.17, 95% CI -0.32 to -0.01, P = .04; restrictions: st.β = -0.36, 95% CI -0.51 to -0.21, P < .001; satisfaction: st.β = -0.44, 95% CI -0.59 to -0.28, P < .001) and lower HRQoL (physical: st.β = -0.53, 95% CI -0.68 to -0.38, P < .001; mental: st.β = -0.64, 95% CI -0.78 to -0.50, P < .001). The associations with poorer societal participation and lower HRQoL were strongly mediated by individual strength (P < .001 for all), yet the suggested direct effects of poor sleep quality on HRQoL remained significant (Pphysical = .03, Pmental = .002). Longitudinal data of 292 KTR showed that sleep quality improves after kidney transplantation in males (P < .001), but not in females (P = .9). CONCLUSIONS: Poor sleep quality is common among KTR and may be a potential target to improve fatigue, societal participation and HRQoL in this population.


Subject(s)
Kidney Transplantation; Quality of Life; Humans; Male; Female; Adult; Middle Aged; Aged; Longitudinal Studies; Cohort Studies; Cross-Sectional Studies; Sleep Quality; Fatigue/epidemiology; Fatigue/etiology; Transplant Recipients
10.
Am J Nephrol ; 54(9-10): 425-433, 2023.
Article in English | MEDLINE | ID: mdl-37231776

ABSTRACT

INTRODUCTION: In chronic kidney disease, proteinuria increases urinary copper excretion, inducing oxidative tubular damage and worsening kidney function. We investigated whether this phenomenon occurs in kidney transplant recipients (KTR). In addition, we studied the associations of urinary copper excretion with urinary liver-type fatty acid-binding protein (u-LFABP), a biomarker of oxidative tubular damage, and with death-censored graft failure. METHODS: This prospective cohort study was performed in the Netherlands between 2008 and 2017, including outpatient KTR with a functioning graft for longer than 1 year, who were extensively phenotyped at baseline. Twenty-four-hour urinary copper excretion was measured by inductively coupled plasma mass spectrometry. Multivariable linear and Cox regression analyses were performed. RESULTS: In 693 KTR (57% men, 53 ± 13 years, estimated glomerular filtration rate [eGFR] 52 ± 20 mL/min/1.73 m2), baseline median urinary copper excretion was 23.6 (interquartile range 11.3-15.9) µg/24 h. Urinary protein excretion was positively associated with urinary copper excretion (standardized β = 0.39, p < 0.001), and urinary copper excretion was positively associated with u-LFABP (standardized β = 0.29, p < 0.001). During a median follow-up of 8 years, 109 (16%) KTR developed graft failure. KTR with relatively high copper excretion were at higher risk of long-term graft failure (hazard ratio [HR]: 1.57, 95% confidence interval [CI]: 1.32-1.86 per doubling [log2] of copper excretion, p < 0.001), independent of multiple potential confounders such as eGFR, urinary protein excretion, and time after transplantation. A dose-response relationship was observed over increasing tertiles of copper excretion (HR: 5.03, 95% CI: 2.75-9.19, tertile 3 vs. 1, p < 0.001). u-LFABP was a significant mediator of this association (74% of the effect was indirect, p < 0.001). CONCLUSION: In KTR, urinary protein excretion is positively correlated with urinary copper excretion. In turn, higher urinary copper excretion is independently associated with an increased risk of kidney graft failure, with a substantial mediating effect through oxidative tubular damage. Further studies are warranted to investigate whether copper excretion-targeted interventions could improve kidney graft survival.


Subject(s)
Kidney Transplantation; Male; Humans; Female; Kidney Transplantation/adverse effects; Copper; Prospective Studies; Kidney; Proteinuria/etiology; Transplant Recipients; Risk Factors; Graft Survival
11.
Nephrol Dial Transplant ; 38(10): 2321-2329, 2023 09 29.
Article in English | MEDLINE | ID: mdl-36893803

ABSTRACT

BACKGROUND: Deficiency of the essential trace element selenium is common in kidney transplant recipients (KTR), potentially hampering antioxidant and anti-inflammatory defence. Whether this impacts the long-term outcomes of KTR remains unknown. We investigated the association of urinary selenium excretion, a biomarker of selenium intake, with all-cause mortality, and its dietary determinants. METHODS: In this cohort study, outpatient KTR with a functioning graft for longer than 1 year were recruited (2008-2011). Baseline 24-h urinary selenium excretion was measured by mass spectrometry. Diet was assessed by a 177-item food frequency questionnaire, and protein intake was calculated by the Maroni equation. Multivariable linear and Cox regression analyses were performed. RESULTS: In 693 KTR (43% men, 52 ± 12 years), baseline urinary selenium excretion was 18.8 (interquartile range 15.1-23.4) µg/24 h. During a median follow-up of 8 years, 229 (33%) KTR died. KTR in the first tertile of urinary selenium excretion, compared with those in the third, had more than a 2-fold risk of all-cause mortality [hazard ratio 2.36 (95% confidence interval 1.70-3.28); P < .001], independent of multiple potential confounders including time since transplantation and plasma albumin concentration. The most important dietary determinant of urinary selenium excretion was protein intake (standardized β = 0.49, P < .001). CONCLUSIONS: Relatively low selenium intake is associated with a higher risk of all-cause mortality in KTR, and dietary protein intake is its most important determinant. Further research is required to evaluate the potential benefit of accounting for selenium intake in the care of KTR, particularly among those with low protein intake.
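Protein intake here is derived from 24-h urinary urea nitrogen (UUN) via the Maroni equation. A sketch of its commonly cited form, 6.25 × (UUN + 0.031 × body weight); the abstract does not spell out the study's exact implementation, so treat this as the textbook version:

```python
def maroni_protein_intake(uun_g_per_day: float, weight_kg: float) -> float:
    """Estimated dietary protein intake (g/day) by the Maroni equation:
    6.25 * (24-h urinary urea nitrogen in g/day + 0.031 * weight in kg),
    where 0.031 g N/kg/day approximates non-urea nitrogen losses."""
    return 6.25 * (uun_g_per_day + 0.031 * weight_kg)

# Example: UUN of 10 g/day in a 75-kg patient gives ~77 g protein/day.
print(round(maroni_protein_intake(10, 75), 1))  # 77.0
```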


Subject(s)
Kidney Transplantation; Selenium; Male; Humans; Female; Kidney Transplantation/adverse effects; Cohort Studies; Dietary Proteins; Diet; Transplant Recipients; Risk Factors
12.
Antioxidants (Basel) ; 12(2)2023 Feb 10.
Article in English | MEDLINE | ID: mdl-36830012

ABSTRACT

Kidney transplant recipients (KTR) are at increased risk of cardiovascular mortality. We investigated whether, in KTR, post-transplantation copper status is associated with the risk of cardiovascular mortality, and whether this association is modified by sex. In this cohort study, plasma copper was measured using mass spectrometry in extensively phenotyped KTR with a functioning allograft for >1 year. Cox regression analyses with the inclusion of multiplicative interaction terms were performed. In 660 KTR (53 ± 13 years old, 56% male), the median baseline plasma copper was 15.42 (IQR 13.53-17.63) µmol/L. During a median follow-up of 5 years, 141 KTR died, 53 (38%) due to cardiovascular causes. Higher plasma copper was associated with an increased risk of cardiovascular mortality in the overall KTR population (HR 1.37; 95% CI, 1.07-1.77 per 1-SD increment, p = 0.01). Sex was a significant effect modifier of this association (Pinteraction = 0.01). Among male KTR, higher plasma copper concentration was independently associated with a two-fold higher risk of cardiovascular mortality (HR 2.09; 95% CI, 1.42-3.07 per 1-SD increment, p < 0.001). Among female KTR, this association was absent. This evidence offers a rationale for a sex-specific assessment of copper's role in cardiovascular risk evaluation. Further studies are warranted to elucidate whether copper-targeted interventions may decrease cardiovascular mortality in male KTR.

13.
Am J Kidney Dis ; 82(2): 189-201.e1, 2023 08.
Article in English | MEDLINE | ID: mdl-36801431

ABSTRACT

RATIONALE & OBJECTIVE: Proton pump inhibitors (PPIs) are used frequently by kidney transplant recipients, and prior studies report that they can adversely affect the gut microbiota and the gastrointestinal uptake of micronutrients, in particular iron and magnesium. Altered gut microbiota, iron deficiency, and magnesium deficiency have been implicated in the pathogenesis of chronic fatigue. Therefore, we hypothesized that PPI use may be an important and underappreciated cause of fatigue and reduced health-related quality of life (HRQoL) in this population. STUDY DESIGN: Cross-sectional study. SETTING & PARTICIPANTS: Kidney transplant recipients (≥1 year after transplantation) enrolled in the TransplantLines Biobank and Cohort Study. EXPOSURE: PPI use, PPI type, PPI dosage, and duration of PPI use. OUTCOME: Fatigue and HRQoL, assessed using the validated Checklist Individual Strength 20 Revised questionnaire and the Short Form-36 questionnaire. ANALYTICAL APPROACH: Logistic and linear regression. RESULTS: We included 937 kidney transplant recipients (mean age 56 ± 13 years, 39% female) at a median of 3 (1-10) years after transplantation. PPI use was associated with fatigue severity (regression coefficient 4.02, 95% CI, 2.18 to 5.85, P < 0.001), a higher risk of severe fatigue (OR 2.05, 95% CI, 1.48 to 2.84, P < 0.001), lower physical HRQoL (regression coefficient -8.54, 95% CI, -11.54 to -5.54, P < 0.001), and lower mental HRQoL (regression coefficient -4.66, 95% CI, -7.15 to -2.17, P < 0.001). These associations were independent of potential confounders including age, time since transplantation, history of upper gastrointestinal disease, antiplatelet therapy, and the total number of medications. They were present for all individually assessed PPI types and were dose dependent. Duration of PPI exposure was associated only with fatigue severity. LIMITATIONS: Residual confounding and inability to assess causal relationships. CONCLUSIONS: PPI use is independently associated with fatigue and lower HRQoL among kidney transplant recipients. PPI use might be an easily accessible target for alleviating fatigue and improving HRQoL in this population. Further studies examining the effect of PPI exposure in this population are warranted. PLAIN-LANGUAGE SUMMARY: In this observational study, we investigated the association of proton pump inhibitors with fatigue and health-related quality of life among kidney transplant recipients. Our data showed that proton pump inhibitors were independently associated with fatigue severity, severe fatigue, and lower physical and mental health-related quality of life. These associations were present for all individually assessed proton pump inhibitor types and were dose dependent. While we await future studies on this topic, proton pump inhibitor use might be an easily accessible target for alleviating fatigue and improving health-related quality of life among kidney transplant recipients.


Subject(s)
Kidney Transplantation; Quality of Life; Humans; Female; Adult; Middle Aged; Aged; Male; Cohort Studies; Proton Pump Inhibitors/therapeutic use; Cross-Sectional Studies; Biological Specimen Banks; Transplant Recipients
14.
J Diabetes Complications ; 37(4): 108433, 2023 04.
Article in English | MEDLINE | ID: mdl-36841085

ABSTRACT

AIMS: To investigate the association of baseline diabetic retinopathy (DR) with the risk of developing microalbuminuria, kidney function decline, and cardiovascular events (CVEs) in type 2 diabetes. METHODS: Post-hoc analysis of the PRIORITY study, including 1758 persons with type 2 diabetes and normoalbuminuria followed for a median of 2.5 (IQR: 2.0-3.0) years. DR diagnosis included non-proliferative and proliferative abnormalities, macular oedema, or prior laser treatment. Cox models were fitted to investigate the association of baseline DR presence with the development of persistent microalbuminuria (urinary albumin-creatinine ratio > 30 mg/g), chronic kidney disease (CKD) stage G3 (eGFR < 60 mL/min/1.73 m2), and CVEs. Models were adjusted for relevant risk factors. RESULTS: At baseline, 304 (17.3%) had DR. Compared to persons without DR, they were older (mean ± SD: 62.7 ± 7.7 vs 61.4 ± 8.3 years, p = 0.019), had longer diabetes duration (17.9 ± 8.4 vs 10.6 ± 7.0 years, p < 0.001), and higher HbA1c (62 ± 13 vs 56 ± 12 mmol/mol, p < 0.001). The adjusted hazard ratios for baseline DR and the development of microalbuminuria (n = 197), CKD (n = 166), and CVEs (n = 64) were 1.50 (95% CI: 1.07, 2.11), 0.87 (95% CI: 0.56, 1.34), and 2.61 (95% CI: 1.44, 4.72), respectively, compared with no DR. CONCLUSIONS: Presence of DR in normoalbuminuric type 2 diabetes was associated with an increased risk of developing microalbuminuria and CVEs, but not with kidney function decline.


Subject(s)
Diabetes Mellitus, Type 2; Diabetic Nephropathies; Diabetic Retinopathy; Renal Insufficiency, Chronic; Humans; Diabetes Mellitus, Type 2/complications; Diabetes Mellitus, Type 2/epidemiology; Kidney; Albuminuria/complications; Renal Insufficiency, Chronic/complications; Renal Insufficiency, Chronic/epidemiology; Diabetic Retinopathy/etiology; Diabetic Retinopathy/complications; Glomerular Filtration Rate
15.
Nat Rev Nephrol ; 19(3): 139-140, 2023 03.
Article in English | MEDLINE | ID: mdl-36658404

Subject(s)
Famine; Malnutrition; Humans; Kidney; China
16.
Psychosom Med ; 85(2): 203-215, 2023.
Article in English | MEDLINE | ID: mdl-36662615

ABSTRACT

OBJECTIVE: Psychological distress is common among patients with chronic kidney disease and can interfere with disease self-management. We assessed the effectiveness of the personalized E-GOAL electronic health care pathway with screening and cognitive-behavioral therapy including self-management support, aimed at treating psychological distress and facilitating self-management among people with chronic kidney disease not on dialysis (N = 121). METHODS: The primary outcome of this open two-arm parallel randomized controlled trial in four Dutch hospitals was psychological distress at posttest directly after the intervention and at 3-month follow-up. Secondary outcomes were physical and mental health-related quality of life, self-efficacy, chronic disease self-management, and personalized outcomes, that is, perceived progress compared with the previous time point on functioning (e.g., mood or social functioning) and self-management (e.g., dietary or medication adherence) outcomes prioritized by each individual. RESULTS: Linear mixed-effects analyses showed no significant time-by-group interaction effects for psychological distress, health-related quality of life, self-efficacy, and chronic condition self-management, whereas analyses of covariance showed significantly more perceived progress in the intervention group at posttest on personally prioritized areas of functioning (b = 0.46, 95% confidence interval = 0.07-0.85) and self-management (b = 0.55, 95% confidence interval = 0.16-0.95), with Cohen d values of 0.46 and 0.54 (medium effects), respectively. Effects on personalized outcomes were maintained at follow-up. CONCLUSIONS: Compared with regular care only, the electronic health intervention did not reduce psychological distress, whereas personalized outcomes did improve significantly after the intervention. Future studies could consider personalized outcomes that reflect individually relevant areas and treatment goals, matching person-tailored treatments. TRIAL REGISTRATION: Registered at the Netherlands Trial Register with study number NTR7555 (https://trialsearch.who.int/Trial2.aspx?TrialID=NTR7555).


Subject(s)
Cognitive Behavioral Therapy; Renal Insufficiency, Chronic; Self-Management; Telemedicine; Humans; Quality of Life; Chronic Disease; Renal Insufficiency, Chronic/therapy
17.
Nutrients ; 15(2)2023 Jan 13.
Article in English | MEDLINE | ID: mdl-36678299

ABSTRACT

Background: Several studies have found a U-shaped association between sodium intake and mortality. The increased mortality risk at low sodium intake has raised debate and hampers widespread acceptance of public health campaigns and dietary guidelines on reducing sodium intake. Whether the excess risk can be attributed to low sodium intake alone or to concomitant inadequate intake of other relevant nutrients is unknown. Objective: We investigated whether concomitant low protein intake could explain the lower part of the U-shaped association of sodium intake with all-cause mortality. Methods: We included 1603 individuals aged 60 to 75 years from the gender- and socioeconomic status-balanced prospective Lifelines-MINUTHE cohort study. Using multivariable Cox regression analyses, we investigated the association of sodium intake (24-h urinary sodium excretion) with all-cause mortality, including the interaction with protein intake calculated from the Maroni formula. Results: Mean intakes of sodium and protein were 3.9 ± 1.6 g/day and 1.1 ± 0.3 g/kg/day, respectively. After a median follow-up of 8.9 years, 125 individuals (7.8%) had died. The proportion of participants with insufficient protein intake (<0.8 g/kg/day) was inversely related to sodium intake (i.e., 23.3% in Q1 versus 2.8% in Q4, p < 0.001). We found an increased risk for mortality in both the highest quartile (Q4, >4.7 g/day; hazard ratio (HR) 1.74 (95% confidence interval (CI) 1.03-2.95)) and the lowest two quartiles of sodium intake (Q1, 0.7-2.8 g/day; HR 2.05 (1.16-3.62); p = 0.01 and Q2, 2.8-3.6 g/day; HR 1.85 (1.08-3.20); p = 0.03), compared with the third quartile (Q3, 3.6-4.7 g/day). This U-shaped association was significantly modified by protein intake (Pinteraction = 0.006): the increased mortality risk of low sodium intake was reversed to the lowest mortality risk when protein intake was concomitantly high, whereas it was magnified by concomitant low protein intake. Conclusions: A higher protein intake counteracts the increased mortality risk observed in subjects with a low sodium intake, whereas a joint low intake of sodium and protein is associated with an increased mortality risk, presumably due to poor nutritional status. These findings support guidelines that advocate a lower sodium intake, while highlighting the importance of assessing overall nutritional status among older adults.


Subject(s)
Nutritional Status; Sodium, Dietary; Humans; Aged; Middle Aged; Prospective Studies; Cohort Studies; Sodium, Dietary/adverse effects; Risk Factors; Sodium
18.
Am J Transplant ; 23(4): 520-530, 2023 04.
Article in English | MEDLINE | ID: mdl-36695702

ABSTRACT

Vitamin K deficiency is common among kidney transplant recipients (KTRs) and likely contributes to progressive vascular calcification and stiffness. In this single-center, randomized, double-blind, placebo-controlled trial, we aimed to investigate the effects of vitamin K supplementation on the primary end point, serum calcification propensity (calciprotein particle maturation time, T50), and secondary end points arterial stiffness (pulse wave velocity [PWV]) and vitamin K status in 40 vitamin K-deficient KTRs (plasma dephosphorylated uncarboxylated matrix Gla protein [dp-ucMGP] ≥500 pmol/L). Participants (35% female; age, 57 ± 13 years) were randomized 1:1 to vitamin K2 (menaquinone-7, 360 µg/day) or placebo for 12 weeks. Vitamin K supplementation had no effect on calcification propensity (change in T50 vs baseline +2.3 ± 27.4 minutes) compared with placebo (+0.8 ± 34.4 minutes; Pbetween group = .88) but prevented progression of PWV (change vs baseline -0.06 ± 0.26 m/s) compared with placebo (+0.27 ± 0.43 m/s; Pbetween group = .010). Vitamin K supplementation strongly improved vitamin K status (change in dp-ucMGP vs baseline -385 [-631 to -269] pmol/L) compared with placebo (+39 [-188 to +183] pmol/L; Pbetween group < .001), although most patients remained vitamin K-deficient. In conclusion, vitamin K supplementation did not alter serum calcification propensity but prevented progression of arterial stiffness, suggesting that vitamin K has vascular effects independent of calciprotein particles. These results set the stage for longer-term intervention studies with vitamin K supplementation in KTRs. TRIAL REGISTRY: EU Clinical Trials Register (EudraCT Number: 2019-004906-88) and the Dutch Trial Register (NTR number: NL7687).


Subject(s)
Kidney Transplantation; Vascular Stiffness; Humans; Female; Adult; Middle Aged; Aged; Male; Vitamin K/pharmacology; Kidney Transplantation/adverse effects; Pulse Wave Analysis; Vitamin K 2/therapeutic use; Vitamin K 2/pharmacology; Dietary Supplements; Double-Blind Method
19.
Nephrol Dial Transplant ; 38(2): 491-498, 2023 02 13.
Article in English | MEDLINE | ID: mdl-35175356

ABSTRACT

BACKGROUND: Cognitive impairment is often present shortly after transplantation in kidney transplant recipients (KTR). To date, it is unknown whether these impairments persist in the long term, to what extent they are associated with disease-related variables, and whether they affect societal participation and quality of life (QoL) of KTR. METHOD: This study was part of the TransplantLines Biobank & Cohort Study in the University Medical Center Groningen. A total of 131 KTR with a mean age of 53.6 years (SD = 13.5), transplanted ≥1 year ago (mean 11.2 years, range 1-41.7 years), were included and compared with 306 healthy controls (HC). KTR and HC were well matched; there were no significant differences regarding age, sex and education. All participants were assessed with neuropsychological tests measuring memory, mental speed, attention and executive functioning, and with questionnaires examining societal participation and QoL. RESULTS: Compared with HC, KTR performed significantly worse on memory, mental speed and measures of executive functioning (all P-values <0.05). Moreover, 16% of KTR met the criteria for mild cognitive impairment (MCI), compared with 2.6% of HC. MCI in KTR was not significantly correlated with age- or disease-related variables. Poorer cognitive functioning was significantly related to lower levels of societal participation and to lower QoL (all P-values <0.01). CONCLUSIONS: This study shows long-term cognitive impairments in KTR, which are not related to disease-related variables. Neuropsychological assessment is important to detect these impairments in a timely manner, given their serious negative impact on societal participation and QoL.


Subject(s)
Cognitive Dysfunction; Kidney Transplantation; Humans; Infant; Child, Preschool; Child; Adolescent; Young Adult; Adult; Quality of Life/psychology; Kidney Transplantation/adverse effects; Kidney Transplantation/psychology; Cohort Studies; Cognitive Dysfunction/epidemiology; Cognitive Dysfunction/etiology; Cognition; Transplant Recipients/psychology; Neuropsychological Tests
20.
Nephrol Dial Transplant ; 38(1): 212-221, 2023 Jan 23.
Article in English | MEDLINE | ID: mdl-35731584

ABSTRACT

BACKGROUND: One of the challenges in living kidney donor screening is to estimate remaining kidney function after donation. Here we developed a new model to predict post-donation measured glomerular filtration rate (mGFR) from pre-donation serum creatinine, age and sex. METHODS: In the prospective development cohort (TransplantLines, n = 511), several prediction models were constructed and tested for accuracy, precision and predictive capacity for short- and long-term post-donation 125I-iothalamate mGFR. The model with optimal performance was further tested in specific high-risk subgroups (pre-donation eGFR <90 mL/min/1.73 m2, a declining 5-year post-donation mGFR slope or age >65 years) and validated in internal (n = 509) and external (Mayo Clinic, n = 1087) cohorts. RESULTS: In the development cohort, pre-donation estimated GFR (eGFR) was 86 ± 14 mL/min/1.73 m2 and post-donation mGFR was 64 ± 11 mL/min/1.73 m2. Donors with a pre-donation eGFR ≥90 mL/min/1.73 m2 (present in 43%) had a mean post-donation mGFR of 69 ± 10 mL/min/1.73 m2, and 5% of these donors reached an mGFR <55 mL/min/1.73 m2. A model using pre-donation serum creatinine, age and sex performed optimally, predicting mGFR with good accuracy (mean bias 2.56 mL/min/1.73 m2, R2 = 0.29, root mean square error = 11.61 mL/min/1.73 m2) and precision [bias interquartile range (IQR) 14 mL/min/1.73 m2] in the external validation cohort. This model also performed well in donors with pre-donation eGFR <90 mL/min/1.73 m2 [bias 0.35 mL/min/1.73 m2 (IQR 10)], in donors with a negative post-donation mGFR slope [bias 4.75 mL/min/1.73 m2 (IQR 13)] and in donors >65 years of age [bias 0.003 mL/min/1.73 m2 (IQR 9)]. CONCLUSIONS: We developed a novel post-donation mGFR prediction model based on pre-donation serum creatinine, age and sex.
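The accuracy and precision metrics reported here (mean bias, R2, RMSE, bias IQR) can be computed directly from paired predicted and measured GFR values; a sketch with made-up illustrative data (not the study's):

```python
import math

def validation_metrics(predicted, measured):
    """Mean bias (predicted - measured), root mean square error, and a
    simple interquartile range of the bias, as used to judge accuracy
    and precision of an mGFR prediction model."""
    bias = [p - m for p, m in zip(predicted, measured)]
    mean_bias = sum(bias) / len(bias)
    rmse = math.sqrt(sum(b ** 2 for b in bias) / len(bias))
    q = sorted(bias)
    n = len(q)
    iqr = q[(3 * n) // 4] - q[n // 4]  # crude quartile approximation
    return mean_bias, rmse, iqr

pred = [62.0, 70.5, 58.3, 66.1, 73.2, 61.8, 68.4, 64.0]
meas = [60.0, 74.0, 55.0, 69.0, 70.0, 63.0, 65.0, 66.0]
print(validation_metrics(pred, meas))  # bias ~0.29, RMSE ~2.8, IQR ~5.3
```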


Subject(s)
Iodine Radioisotopes; Kidney Transplantation; Humans; Aged; Glomerular Filtration Rate; Prospective Studies; Creatinine; Kidney; Living Donors