Results 1 - 20 of 17,515
1.
Curr Dev Nutr ; 8(6): 103778, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38952351

ABSTRACT

Background: Fruits and vegetables (FV) are a critical source of nutrients, yet children in the United States are not meeting the Dietary Guidelines for Americans (DGA). The monthly FV cash value benefit (CVB) included in the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC)'s food package to support child FV intake (FVI) received a substantial increase for economic relief during the COVID-19 pandemic. Objectives: To evaluate how an expansion of the monthly WIC CVB to purchase FV for WIC children ages 1-4 y is associated with diversity in FV redeemed, and how changes in redeemed FV are related to FVI. Methods: Caregivers representing 1463 WIC-participating children recruited from Los Angeles County, California, completed surveys during the CVB augmentation (T1: CVB = $9/mo; T2 = $35/mo; T3 = $24/mo). Each redeemed price look-up code (PLU), corresponding to a food item, was assigned to its corresponding MyPlate FV group. Multivariable generalized estimating equation regression models assessed changes in the amount and diversity of FV redemption across MyPlate groups and associations between changes in FV diversity and changes in FVI. Results: Slightly over half of all households were food insecure (55%), half of the children were female (52%), and most were Hispanic (78%). Compared with T1, significant increases in the number of PLUs and dollars redeemed were observed in most MyPlate FV groups. From T1 to T2, significant increases in diversity scores were observed for total fruit (β: 1.6 pts; 95% confidence interval [CI]: 1.4, 1.7), total vegetable (β: 3.6 pts; 95% CI: 3.4, 3.9), and total FV (β: 7.8 pts; 95% CI: 7.4, 8.2). Similarly, increases in diversity scores were observed at T3 compared with T1. Changes in the diversity of FV redeemed were not associated with changes in FVI. Conclusions: During the CVB augmentation, WIC participants redeemed a greater amount and variety of FV according to DGA MyPlate recommendations, supporting its permanent increase.
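The abstract does not give the exact diversity-scoring rules, but the core bookkeeping (mapping each redeemed PLU to a MyPlate FV group and counting distinct items within each group) can be sketched as follows. The PLU-to-group mapping and the use of distinct-PLU counts as the score are illustrative assumptions, not the study's method.

```python
# Hypothetical mapping of PLU codes to MyPlate fruit/vegetable groups.
PLU_TO_MYPLATE = {
    "4011": "fruit",      # bananas
    "4015": "fruit",      # red delicious apples
    "4062": "vegetable",  # cucumber
    "4064": "vegetable",  # tomato
}

def diversity_scores(redeemed_plus):
    """Count distinct PLUs redeemed within each MyPlate group."""
    seen = {}
    for plu in redeemed_plus:
        group = PLU_TO_MYPLATE.get(plu)
        if group is not None:
            seen.setdefault(group, set()).add(plu)
    return {group: len(plus) for group, plus in seen.items()}

print(diversity_scores(["4011", "4011", "4015", "4062"]))
# {'fruit': 2, 'vegetable': 1} -- repeat redemptions do not add diversity
```

A larger CVB would then raise diversity only if households redeem new PLUs or new groups, not merely more of the same items.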

2.
Transl Behav Med ; 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38954835

ABSTRACT

Food security is a health-related social need commonly screened for in hospitals and community settings, but until recently there were no tools to additionally screen for nutrition security. The purpose of this study was to assess the potential advantage of including a one-item brief nutrition security screener (BNSS) alongside the commonly used two-item Hunger Vital Sign (HVS) food security screener for identifying individuals with diet-related health risks. Cross-sectional survey data were collected from April to June 2021. Generalized linear mixed models were used to assess associations between screening status and dietary and health variables. Participants were recruited through community-based organizations across five states (California, Florida, Maryland, North Carolina, and Washington). Participants (n = 435) were, on average, 44.7 years old (SD = 14.5), predominantly women (77%), and racially/ethnically diverse. In adjusted analyses, being in the food insecure and nutrition insecure group (but not the food insecure and nutrition secure or food secure and nutrition insecure groups) was associated with significantly increased odds of self-reported "fair" or "poor" general health [OR = 2.914 (95% CI = 1.521-5.581)], reporting at least one chronic condition [2.028 (1.024-4.018)], and "low" fruit and vegetable intake [2.421 (1.258-4.660)], compared with the food secure and nutrition secure group. These findings support using the HVS and BNSS simultaneously in health-related social needs screening to identify participants at the highest risk for poor dietary and health outcomes, and warrant further investigation into applying these screeners in clinical and community settings.
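A minimal sketch of how the two screeners combine into the four groups compared above. The HVS convention shown (positive if either of its two items is affirmed) is standard; the BNSS result is taken as a precomputed boolean, since the abstract does not give its item wording or cutoff.

```python
def hvs_food_insecure(item1: str, item2: str) -> bool:
    """Hunger Vital Sign: positive if either item is answered
    'often true' or 'sometimes true'."""
    affirmative = {"often true", "sometimes true"}
    return item1 in affirmative or item2 in affirmative

def screening_group(food_insecure: bool, nutrition_insecure: bool) -> str:
    """Map the two screener results to the four comparison groups."""
    groups = {
        (True, True): "food insecure & nutrition insecure",
        (True, False): "food insecure & nutrition secure",
        (False, True): "food secure & nutrition insecure",
        (False, False): "food secure & nutrition secure",
    }
    return groups[(food_insecure, nutrition_insecure)]
```

Only the doubly insecure group carried elevated odds in the adjusted models, which is why the study recommends administering both screeners rather than either alone.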


Food security and nutrition security refer to a household's ability to get enough food and to get food that is good for their health, respectively. Patients at hospitals, or clients who go to food pantries for help, are often asked about their food security status; this is referred to as screening. On the basis of their answers, they may get help such as referral to a food pantry and/or consultation with a dietitian. While there is a standard tool to screen for food security status, until recently there has not been one for nutrition security. We used both the commonly used Hunger Vital Sign (HVS) food security screener and the newly developed brief nutrition security screener to identify food and nutrition security screening status. Being in the food insecure and nutrition insecure group (but not the food insecure and nutrition secure or food secure and nutrition insecure groups) was associated with significantly increased odds of poor dietary and health outcomes. These findings support using both the HVS and the brief nutrition security screener simultaneously in health-related social needs screening to identify participants at the highest risk.

3.
Front Nutr ; 11: 1393596, 2024.
Article in English | MEDLINE | ID: mdl-38962434

ABSTRACT

Purpose: Dietary factors play a crucial role in the development and management of chronic constipation, yet the relationship between dietary protein intake and constipation remains underexplored. This study aims to investigate the association between dietary protein intake and the prevalence of constipation among American adults, with a focus on potential gender differences, using large-scale national data. Materials and methods: Data from 14,048 participants aged 20 and above (7,072 men and 6,976 women) from the National Health and Nutrition Examination Survey (NHANES) 2005-2010 were analyzed. The Bristol Stool Form Scale's types 1 (separate hard lumps, resembling nuts) and 2 (sausage-shaped, but lumpy) were used to define constipation. A 24-h dietary recall was used to measure dietary protein intake. After controlling for covariates, the association between protein consumption and constipation risk was examined using multivariable logistic regression, smooth curve fitting, and tests for gender interaction effects. We then further determined the threshold effect between dietary protein intake and constipation risk. Results: Constipation was present in 7.49% of participants overall, with a higher proportion among women (10.19%) than among men (4.82%). In men, higher protein intake was significantly associated with a lower rate of constipation. In women, however, higher protein intake correlated with an increased risk of constipation, and the gender interaction was significant (P for interaction = 0.0298). These results were corroborated by smooth curve fitting, which also demonstrated a dose-response effect. Further threshold effect analysis showed that the turning points of dietary protein intake differed between male and female participants (119.42 g/day for men; 40.79 g/day for women). Conclusion: The association between dietary protein intake and constipation differed by gender, with a threshold effect. For men, moderately increasing protein intake could be beneficial, whereas for women, exceeding a certain level may increase the risk of constipation. These insights are crucial for guiding gender-specific dietary protein recommendations and have significant clinical implications.

4.
Cancer Diagn Progn ; 4(4): 503-509, 2024.
Article in English | MEDLINE | ID: mdl-38962538

ABSTRACT

Background/Aim: Physical decline is accompanied by malnutrition in patients with advanced cancer, so nutritional care is often provided alongside cancer rehabilitation. However, a limited number of studies have focused on which nutritional index serves as an important marker for identifying patients who need more intensive nutritional support. Patients and Methods: We retrospectively reviewed patients with advanced cancer who received chemotherapy and rehabilitation during hospitalization. In analysis 1, patients were divided into two groups: a Well group, with caloric intake at or above their basal metabolism calculated by the Harris-Benedict equation, and a Poor group, with caloric intake below their basal energy expenditure. The primary endpoint was the proportion of patients whose Eastern Cooperative Oncology Group Performance Status (ECOG PS) or Barthel index (BI) was maintained during rehabilitation. In analysis 2, the cohort was restratified into Responders, whose ECOG PS and BI improved, and Non-responders, comprising the remaining patients. Several nutritional indices were compared between the groups. Results: Eighty-four patients were evaluated in analysis 1, namely 51 Well patients and 33 Poor patients. The ECOG PS-maintained rate was 98% and 91% (p=0.29), and the BI-maintained rate was 100% and 88% (p=0.02) in the Well and Poor groups, respectively. In analysis 2, 72 patients were evaluated after excluding 12 patients who lacked nutritional data after rehabilitation. Caloric intake appeared lower in the Non-responders group than in the Responders group, although the Non-responders' nutritional background tended to be better. Conclusion: Insufficient caloric intake might be a predictive marker of poor outcomes after rehabilitation in patients with advanced cancer.
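The Well/Poor stratification rests on comparing caloric intake with basal energy expenditure (BEE) from the Harris-Benedict equation. Below is a sketch using the original (1919) Harris-Benedict coefficients; the abstract does not state whether the original or revised coefficients were used, so treat the constants as illustrative.

```python
def harris_benedict_bee(weight_kg: float, height_cm: float,
                        age_y: float, sex: str) -> float:
    """Basal energy expenditure (kcal/day), original Harris-Benedict
    equations (1919 coefficients)."""
    if sex == "male":
        return 66.4730 + 13.7516 * weight_kg + 5.0033 * height_cm - 6.7550 * age_y
    return 655.0955 + 9.5634 * weight_kg + 1.8496 * height_cm - 4.6756 * age_y

def intake_group(caloric_intake_kcal: float, bee_kcal: float) -> str:
    """Stratify as in analysis 1: 'Well' if intake covers BEE, else 'Poor'."""
    return "Well" if caloric_intake_kcal >= bee_kcal else "Poor"

bee = harris_benedict_bee(70, 175, 30, "male")  # about 1702 kcal/day
print(intake_group(1800, bee))
```

In practice clinical teams often multiply BEE by activity and stress factors; the sketch keeps the bare comparison the abstract describes.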

5.
Eur Addict Res ; : 1-10, 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38964299

ABSTRACT

INTRODUCTION: Craving is a multifactorial behavior caused by central circuit imbalance. Proposed treatments involve exercise and reduced food intake; however, these treatments frequently fail. This study aimed to investigate the effect of 10 consecutive sessions of anodal transcranial direct current stimulation (tDCS) over the right dorsolateral prefrontal cortex on food craving and eating consumption in women affected by overweight and obesity. METHODS: In a randomized double-blind controlled trial, 50 volunteers were divided into two groups (active-tDCS: n = 25 and sham-tDCS: n = 25). Ten consecutive tDCS sessions (2 mA, for 20 min) were delivered with an F4 anodal-F3 cathodal montage. We evaluated the effects on eating behavior (food craving, uncontrolled eating, emotional eating, and cognitive restriction), food consumption (calories and macronutrients), and anthropometric and body composition variables (weight, body mass index, waist circumference, and body fat percentage). RESULTS: There were no statistically significant differences between groups at baseline regarding sociodemographic and clinical characteristics, and there was no significant time-by-group interaction for any of the variables studied. Treatment with tDCS was well tolerated, with no serious adverse effects. CONCLUSIONS: In women affected by overweight and obesity with food cravings, 10 sessions of F4 (anodal) and F3 (cathodal) tDCS did not produce changes in eating behavior, food consumption, or anthropometric and body composition variables.

6.
Hypertens Res ; 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38965426

ABSTRACT

The contrasting relationships of plant and animal protein intake with blood pressure (BP) may be partially attributed to their differential non-protein (e.g., saturated fat and fibre) and amino acid (AA) compositions. This study determined whether animal and plant protein intake were related to differential metabolomic profiles associated with BP. This study included 1008 adults from the African-PREDICT study (aged 20-30 years). Protein intake was determined using 24-h dietary recalls. Twenty-four-hour ambulatory BP was measured. Amino acids and acylcarnitines were analysed in spot urine samples using liquid chromatography-tandem mass spectrometry-based metabolomics. Participants with a low plant, high animal protein intake had higher SBP (by 3 mmHg, p = 0.011) than those with a high plant, low animal protein intake (low-risk group). We found that the relationships of plant and animal protein intake with 24-h SBP were partially mediated by BMI and saturated fat intake, which were independently associated with SBP. Protein intake was therefore not related to SBP in multiple regression analysis after adjusting for confounders. In the low-risk group, methionine (Std. β = -0.217; p = 0.034), glutamic acid (Std. β = -0.220; p = 0.031), glycine (Std. β = -0.234; p = 0.025), and proline (Std. β = -0.266; p = 0.010) were inversely related to SBP, and beta-alanine (Std. β = -0.277; p = 0.020) to DBP. Ultimately, a diet high in animal and low in plant protein may contribute to higher BP by means of increased BMI and saturated fat intake. Conversely, the higher levels of urinary AAs observed in adults consuming a plant-rich diet may contribute to lower BP.

7.
Asia Pac J Clin Nutr ; 33(3): 405-412, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38965728

ABSTRACT

BACKGROUND AND OBJECTIVES: Whether the distribution of calcium intake across the day is related to the risk of cognitive impairment in adults is still largely unknown. This research aimed to examine the relation of dietary calcium intake at dinner versus breakfast with the risk of cognitive impairment using data from the China Health and Nutrition Survey (CHNS). METHODS AND STUDY DESIGN: A total of 2,099 participants (including 668 with cognitive impairment) in the CHNS (1997-2006) were included. Participants were categorized into 5 groups according to the ratio of dietary calcium intake at dinner to that at breakfast (Δ = dinner/breakfast). After adjustment for a series of confounding factors, Cox hazard regression modelling was performed to assess the relation of Δ with cognitive impairment. Dietary substitution models were used to explore the change in cognitive impairment risk when 5% of dietary calcium intake at dinner was replaced with dietary calcium intake at breakfast. RESULTS: Participants in the highest quintile of Δ showed a greater susceptibility to cognitive impairment than those in the lowest quintile, with an adjusted hazard ratio of 1.38 (95% CI: 1.08-1.76). With total calcium intake held constant, substituting 5% of dietary calcium intake at dinner with calcium intake at breakfast was related to an 8% decrease in the risk of cognitive impairment. CONCLUSIONS: Higher dietary calcium intake at dinner was associated with an increased risk of cognitive impairment, emphasizing the importance of appropriately distributing dietary calcium intake between breakfast and dinner.
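The exposure Δ and the 5% substitution can be written down directly. The code below is an illustrative sketch of those two definitions only; the study's actual substitution analysis was model-based, not a simple arithmetic shift.

```python
def dinner_breakfast_ratio(dinner_ca_mg: float, breakfast_ca_mg: float) -> float:
    """The exposure used in the study: delta = dinner / breakfast calcium."""
    return dinner_ca_mg / breakfast_ca_mg

def substitute_5pct(dinner_ca_mg: float, breakfast_ca_mg: float):
    """Shift 5% of dinner calcium to breakfast while keeping total
    calcium intake fixed -- the substitution the models emulate."""
    shift = 0.05 * dinner_ca_mg
    return dinner_ca_mg - shift, breakfast_ca_mg + shift

# Example: 600 mg at dinner, 200 mg at breakfast -> delta = 3.0;
# the substitution moves 30 mg, leaving the 800 mg total unchanged.
print(dinner_breakfast_ratio(600.0, 200.0))
```

Holding the total fixed is what lets the model attribute the risk change to timing rather than to the amount of calcium consumed.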


Subjects
Breakfast , Calcium, Dietary , Cognitive Dysfunction , Humans , Calcium, Dietary/administration & dosage , Male , Female , China/epidemiology , Middle Aged , Cognitive Dysfunction/epidemiology , Cohort Studies , Adult , Meals , Nutrition Surveys , Aged , Risk Factors , East Asian People
8.
Asia Pac J Clin Nutr ; 33(3): 413-423, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38965729

ABSTRACT

BACKGROUND AND OBJECTIVES: Non-alcoholic fatty liver disease (NAFLD) has become a worldwide public health problem. Current evidence on the association between dietary iron intake and the risk of NAFLD is limited. The present study aimed to investigate the associations of animal-derived dietary iron (ADDI) intake, plant-derived dietary iron (PDDI) intake, and the PDDI:ADDI ratio with NAFLD risk in the U.S. adult population. METHODS AND STUDY DESIGN: This was a repeated cross-sectional study. Data were collected from the National Health and Nutrition Examination Survey (NHANES) 2007-2018. NAFLD was defined as a United States Fatty Liver Index ≥30, and dietary iron intake was assessed through two 24-h dietary recall interviews. Logistic regression and restricted cubic spline models were applied to examine the associations between dietary iron intake from different sources and NAFLD risk. RESULTS: A total of 9478 participants aged ≥20 years were enrolled in the present study. After adjustment for multiple confounding factors, relative to the lowest quartile, the odds ratio (OR) of NAFLD for the highest quartile was 1.01 (95% CI, 0.82-1.24) for ADDI intake, 0.82 (95% CI, 0.64-0.99) for PDDI intake, and 1.00 (95% CI, 0.81-1.24) for the PDDI:ADDI intake ratio. In analyses stratified by sex and age, the significantly negative association of PDDI intake with NAFLD was observed in women and in participants older than 45 years. Dose-response analyses indicated that NAFLD was negatively associated with PDDI intake in a non-linear manner. CONCLUSIONS: PDDI intake was negatively associated with NAFLD in U.S. adults.


Subjects
Iron, Dietary , Non-alcoholic Fatty Liver Disease , Nutrition Surveys , Humans , Non-alcoholic Fatty Liver Disease/epidemiology , Male , Female , Adult , Cross-Sectional Studies , Iron, Dietary/administration & dosage , Middle Aged , Diet/methods , Diet/statistics & numerical data , Young Adult , United States/epidemiology
9.
Article in English | MEDLINE | ID: mdl-38967394

ABSTRACT

Telomere length is closely linked to biological aging, oxidative stress, and the development of cardiovascular diseases. This study aimed to assess the association between dietary selenium intake and telomere length in individuals with hypertension. Data on dietary selenium intake were captured through the National Health and Nutrition Examination Survey (NHANES) computer-assisted dietary interview system (CADI), and telomere length was determined from blood samples of participants in the NHANES database. Data were analyzed using EmpowerStats. Results showed a significant association between dietary selenium intake and telomere length in hypertension, particularly within the female group. In female hypertension cases, a 1 mcg increase in dietary selenium intake corresponded to a telomere length increase of 1.19 bp, even after adjusting for age, race, BMI, marital status, physical activity, energy intake, and stroke history. The relationship between dietary selenium intake and telomere length exhibited a linear pattern in female hypertension patients. This study identified a positive association between dietary selenium intake and telomere length in hypertension, particularly within the female group.

10.
Clin Nutr ESPEN ; 63: 274-282, 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38972038

ABSTRACT

BACKGROUND & AIMS: For children and adolescents undergoing hematopoietic stem cell transplant (HSCT), adequate protein and energy intake is essential to mitigate malnutrition risk. However, little is known about optimal requirements, including adequate dietary protein intake, in this population. We conducted an international benchmarking survey and a scoping review to explore current practices in determining protein requirements (PR) and examine existing evidence for PR and dietary protein intake in pediatric HSCT. METHODS: Twelve pediatric oncology centers were surveyed to elicit current practices in determining PR in pediatric HSCT. A scoping review then collected sources of evidence from six databases (MEDLINE, Embase, CINAHL, PubMed, Cochrane Library and Web of Science) and grey literature (Google Scholar). RESULTS: Survey data revealed variable practices in determining PR for pediatric HSCT patients. Four centers (44%) used the American Society for Parenteral and Enteral Nutrition (ASPEN) Nutrition Support in Pediatric Critically Ill Patient Guidelines 2009, and four (44%) used local guidelines or their national nutrient reference values (NRV). The scoping review included nineteen studies and highlighted a broad range of PR used in this population, from 0.8 to 3.0 g/kg/d. Practices regarding the documentation and frequency of collecting protein intake data varied. Only five studies reported intake relative to the estimated protein requirement (EPR), and in just two studies was the EPR met. No clinical guidelines on PR in pediatric HSCT were identified. CONCLUSIONS: Given the existing gap in evidence, the optimal amount of protein required for children and adolescents undergoing HSCT remains unknown. To establish specific, evidence-based PR guidelines, comprehensive research is needed. Future investigations should prioritize evaluating current clinical practices, assessing the gap between actual protein intake and the EPR, and understanding the relationship between protein intake, protein status, and treatment outcomes. Addressing these research priorities is crucial for bridging the current evidence gap, thereby enabling the development of enhanced and personalized nutritional support for children and adolescents undergoing HSCT.

11.
Poult Sci ; 103(9): 103974, 2024 Jun 18.
Article in English | MEDLINE | ID: mdl-38972283

ABSTRACT

Improving feed utilization is a vital strategy to meet the growing global demand for meat and promote sustainable food production. Over the past few decades, significant improvements in the feed intake (FI) and feed utilization efficiency of broilers have been achieved through advanced breeding procedures, although dynamic changes in FI and their effects on the feed conversion ratio (FCR) have remained unclear. In this study, we measured individual weekly FI and body weight of 274 male broilers to characterize the dynamic FI patterns and investigate their relationship with growth performance. The broilers were from 2 purebred lines and their crossbreed, and measurements were collected from 4 to 6 wk of age. Overall, weekly FI increased continuously from 4 to 6 wk of age, whereas body weight gain (BWG) reached an inflection point in wk 5. The dynamic change in weekly FI followed 3 distinct patterns: pattern 1, a continuous weekly increase in FI; pattern 2, an increase followed by a plateau; and pattern 3, an increase followed by a decrease. The prevalence of these patterns was similar in the purebred and crossbred populations: pattern 2 was most frequent, followed by a moderate proportion of pattern 1 and the lowest proportion of pattern 3. Broilers following pattern 1 displayed significantly better growth performance and feed utilization efficiency than those following pattern 3, emphasizing the importance of maintaining good appetite in the last stage of broiler production. In summary, this study characterized the dynamic patterns of FI and their association with growth performance. Our results offer a new foundation for improving feed utilization efficiency and investigating feeding regulation in broilers.
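With three weekly measurements (wk 4-6), the three FI patterns amount to classifying the second step of the trajectory. A sketch follows, with an assumed relative tolerance for calling a plateau; the paper's exact plateau criterion is not given in the abstract.

```python
def fi_pattern(weekly_fi, tol=0.02):
    """Classify a 3-point weekly feed-intake trajectory (wk 4 -> 6):
    1 = continuous increase, 2 = increase then plateau,
    3 = increase then decrease. `tol` is an assumed relative tolerance
    for treating the second step as flat."""
    first = weekly_fi[1] - weekly_fi[0]
    second = weekly_fi[2] - weekly_fi[1]
    if first <= 0:
        return None  # trajectory does not start with an increase
    if second > tol * weekly_fi[1]:
        return 1
    if second < -tol * weekly_fi[1]:
        return 3
    return 2
```

For instance, `[900, 1100, 1300]` grams/week classifies as pattern 1, while `[900, 1100, 1000]` classifies as pattern 3 (the pattern associated with poorer growth performance above).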

12.
Can J Diabetes ; 2024 Jul 05.
Article in English | MEDLINE | ID: mdl-38972477

ABSTRACT

INTRODUCTION: Evidence suggests that glucose levels in menstruating females with type 1 diabetes change throughout the menstrual cycle, reaching a peak during the luteal phase. The Type 1 Diabetes Exercise Initiative (T1DEXI) study provided the opportunity to assess glycemic metrics between early and late phases of the menstrual cycle, and whether differences could be explained by exercise, insulin, and carbohydrate intake. RESEARCH DESIGN AND METHODS: One hundred sixty-two adult females were included in the analysis. Glycemic metrics, carbohydrate intake, insulin requirements, and exercise habits during the early vs. late phases of the menstrual cycle (i.e., 2-4 days after vs. 2-4 days before the reported menstruation start date) were compared. RESULTS: Mean glucose increased from 8.2±1.5 mmol/L (148±27 mg/dL) during the early follicular phase to 8.6±1.6 mmol/L (155±29 mg/dL) during the late luteal phase (p<0.001). Mean percent time-in-range (3.9-10.0 mmol/L [70-180 mg/dL]) decreased from 73±17% to 70±18% (p=0.002), and median percent time >10.0 mmol/L (>180 mg/dL) increased from 21% to 23% (p<0.001). Median total daily insulin requirements increased from 37.4 units during the early follicular phase to 38.5 units during the late luteal phase (p=0.02), and mean daily carbohydrate consumption increased slightly from 127±47 g to 133±47 g (p=0.05), but the difference in mean glucose during the early follicular vs. late luteal phase was not explained by differences in exercise duration, total daily insulin units, or reported carbohydrate intake. CONCLUSIONS: Glucose levels during the late luteal phase were higher than during the early follicular phase of the menstrual cycle. These glycemic changes suggest that glucose management for females with type 1 diabetes may need to be fine-tuned within the context of their menstrual cycles.
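Percent time-in-range, as reported above, is simply the share of continuous glucose monitor (CGM) readings falling within the 3.9-10.0 mmol/L target. A minimal sketch:

```python
def percent_time_in_range(glucose_mmol, low=3.9, high=10.0):
    """Percent of CGM readings within the target range (inclusive),
    assuming readings are taken at a uniform interval."""
    in_range = sum(low <= g <= high for g in glucose_mmol)
    return 100.0 * in_range / len(glucose_mmol)

# Example: 3 of 4 readings in range -> 75.0%
print(percent_time_in_range([4.0, 5.5, 10.5, 8.0]))
```

The same counting with `g > high` gives the percent time above 10.0 mmol/L, the other CGM metric reported in the abstract.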

13.
Article in English | MEDLINE | ID: mdl-38972782

ABSTRACT

Central ceramides regulate energy metabolism by impacting hypothalamic neurons. This allows ceramides to integrate endocrine signals - such as leptin, ghrelin, thyroid hormones, or estradiol - and to modulate the central control of puberty. In this forum article we discuss recent evidence suggesting that specific ceramide species and neuronal populations are involved in these effects.

14.
JMIR Serious Games ; 12: e52231, 2024 Jun 25.
Article in English | MEDLINE | ID: mdl-38967387

ABSTRACT

Background: Exercise offers substantial health benefits but can induce oxidative stress and inflammation, especially in high-intensity formats such as high-intensity interval exercise (HIIE). Exergaming has become an effective, enjoyable fitness tool for all ages, particularly older adults. Enzyme supplements may enhance exercise performance by improving lactate metabolism and reducing oxidative stress. Objective: This study investigates the efficacy of fruit and vegetable enzyme supplementation in modulating fatigue and enhancing aerobic capacity in older adults following HIIE through exergaming. Methods: The study recruited 16 older adult female participants and allocated them to two groups (enzyme and placebo) matched pairwise on pretest lactate levels to ensure comparability between the groups. Participants engaged in HIIE using Nintendo Switch Ring Fit Adventure, performing 8 sets of 20 seconds of maximum-effort exercise interspersed with 30 seconds of rest, for a total protocol time of 370 seconds. Key metrics assessed included blood lactate levels, heart rate, rating of perceived exertion, and training impulse. Participants in the enzyme group were administered a fruit and vegetable enzyme supplement at a dosage of 30 mL twice daily over a period of 14 days. Results: The enzyme group showed significantly lower blood lactate levels compared to the placebo group, notably after the fourth (mean 4.29, SD 0.67 vs mean 6.34, SD 1.17 mmol/L; P=.001) and eighth (mean 5.84, SD 0.63 vs mean 8.20, SD 1.15 mmol/L; P<.001) exercise sets. This trend continued at 5 minutes (mean 6.85, SD 0.82 vs mean 8.60, SD 1.13 mmol/L; P=.003) and 10 minutes (mean 5.91, SD 1.16 vs mean 8.21, SD 1.27 mmol/L; P=.002) after exercise. Although both groups exceeded 85% of their estimated maximum heart rate during the exercise, enzyme supplementation did not markedly affect the perceived intensity or effort.
Conclusions: The study indicates that fruit and vegetable enzyme supplementation can significantly reduce blood lactate levels in older adults following HIIE through exergaming. This suggests a potential role for these enzymes in modulating lactate production or clearance during and after high-intensity exercise. These findings have implications for developing targeted interventions to enhance exercise tolerance and recovery in older adults.

15.
Biol Trace Elem Res ; 2024 Jul 06.
Article in English | MEDLINE | ID: mdl-38970711

ABSTRACT

Rare earth elements (REEs) are a group of 17 metals: Ce, Dy, Er, Eu, Gd, Ho, La, Lu, Nd, Pr, Pm, Sc, Sm, Tb, Tm, Y and Yb. In the current century, the number of applications of REEs has significantly increased; they are being used as components in high-technology devices of great industrial and economic importance. However, information on the risk of human exposure to REEs, as well as on the potential toxic effects of these elements, is still limited. In general terms, dietary intake is the main route of exposure to metals for non-occupationally exposed individuals, which should also be expected for REEs. The current paper aimed at reviewing the studies, conducted around the world, that focused on determining the levels of REEs in foods, as well as the dietary intake of these elements. Most studies do not suggest a potential health risk for consumers of commonly consumed freshwater and marine species, or from the intake of a number of vegetables, fruits, mushrooms, and various other foodstuffs (honey, tea, rice, etc.). The current estimated daily intake (EDI) of REEs does not seem to be of concern. However, considering the expected wide use of these elements in the coming years, it seems clearly advisable to periodically assess the potential health risk of dietary exposure to REEs, as is already done for well-known toxic elements such as As, Cd, Pb and Hg, among other potentially toxic metals.

17.
Regul Toxicol Pharmacol ; : 105672, 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38968965

ABSTRACT

Nitrosamine drug substance-related impurities, or NDSRIs, can be formed if an active pharmaceutical ingredient (API) has an intrinsic secondary amine that can undergo nitrosation. This is a concern because 1) nitrosamines are potentially highly potent carcinogens, 2) secondary amines in APIs are common, and 3) NDSRIs that might form from such secondary amines will be of unknown carcinogenic potency. Approaches for evaluating NDSRIs include read-across, quantum mechanical modeling of reactivity, in vitro mutation data, and transgenic in vivo mutation data. These approaches were used here to assess NDSRIs that could potentially form from the drugs fluoxetine, duloxetine, and atomoxetine. Based on a read-across informed by modeling of physicochemical properties and mechanistic activation from quantum mechanical modeling, NDSRIs of fluoxetine, duloxetine, and atomoxetine were 10-100-fold less potent compared with highly potent nitrosamines such as NDMA or NDEA. While the NDSRIs were all confirmed to be mutagenic in in vitro (Ames assay) and in vivo (transgenic rodent, TGR) studies, the latter data indicated that the potency of the mutation response was >4400 ng/day for all compounds, an order of magnitude higher than published regulatory limits for these NDSRIs. The approaches described herein can be used qualitatively to better categorize NDSRIs with respect to potency and to inform whether they belong in the ICH M7(R2)-designated Cohort of Concern.

18.
BMC Public Health ; 24(1): 1770, 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-38961413

ABSTRACT

In the UK, people living in disadvantaged communities are less likely than those with higher socio-economic status to have a healthy diet. To address this inequality, it is crucial that scientists, practitioners, and policy makers understand the factors that hinder and assist healthy food choice in these individuals. In this scoping review, we aimed to identify barriers and facilitators to healthy eating among disadvantaged individuals living in the UK. Additionally, we used the Theoretical Domains Framework (TDF) to synthesise results and provide a guide for the development of theory-informed behaviour change interventions. Five databases (CINAHL, Embase, MEDLINE, PsycINFO, and Web of Science) were searched for articles assessing healthy dietary intake of disadvantaged adults living in the UK. A total of 50 papers (34 quantitative; 16 qualitative) were included in this review. Across all studies we identified 78 barriers and 49 facilitators found to either impede or encourage healthy eating. Both barriers and facilitators were most commonly classified under the Environmental Context and Resources TDF domain, with 74% of studies assessing at least one factor pertaining to this domain. Results thus indicate that context-related factors, such as the high cost and limited accessibility of healthy food, rather than personal factors, such as low efficacy in maintaining a healthy lifestyle, drive unhealthy eating in disadvantaged individuals in the UK. We discuss how such factors are largely overlooked in current interventions and propose that more effort should be directed towards implementing interventions that specifically target infrastructures rather than individuals.


Subjects
Healthy Diet, Vulnerable Populations, Humans, United Kingdom, Vulnerable Populations/psychology, Healthy Diet/psychology, Adult
19.
Diabetes Metab ; 50(5): 101554, 2024 Jun 29.
Article in English | MEDLINE | ID: mdl-38950854

ABSTRACT

BACKGROUND: The association between dietary magnesium (Mg) intake and the risk of atherosclerotic cardiovascular disease (ASCVD) remains uncertain. We aimed to examine the associations of dietary Mg intake with the risk of ASCVD events and mortality in individuals with and without type 2 diabetes. METHODS: A total of 149,929 participants (4,603 with type 2 diabetes) from the UK Biobank were included in the analyses. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazards models. Furthermore, interactions of dietary Mg intake with type 2 diabetes status were examined on multiplicative and additive scales. RESULTS: During a median follow-up of 12.0 and 12.1 years, respectively, 7,811 incident ASCVD events and 5,000 deaths (including 599 ASCVD deaths) were documented. Sufficient dietary Mg intake (equal to or greater than the recommended daily intake) was significantly inversely associated with the risk of ASCVD incidence (HR 0.63 [95% CI 0.49; 0.82]), ASCVD mortality (0.45 [0.24; 0.87]), and all-cause mortality (0.71 [0.52; 0.97]) in participants with type 2 diabetes, whereas no significant association was observed in participants without type 2 diabetes (1.01 [0.94; 1.09] for ASCVD incidence; 1.25 [0.93; 1.66] for ASCVD mortality; 0.97 [0.88; 1.07] for all-cause mortality). Interactions of dietary Mg intake with type 2 diabetes status were observed on both multiplicative and additive scales. CONCLUSION: Sufficient dietary Mg intake was significantly associated with lower risks of ASCVD events and mortality in individuals with type 2 diabetes but not in those without. Our findings highlight the importance of dietary Mg intake for reducing modifiable cardiovascular burden in individuals with type 2 diabetes and may inform future personalized dietary guidelines.
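The hazard ratios and confidence intervals reported above come from Cox models, where the HR and its 95% CI are recovered from the estimated log-hazard coefficient and its standard error. A minimal sketch follows; the beta and SE values are hypothetical, chosen only to land near the paper's reported HR of 0.63 for ASCVD incidence.

```python
import math

# Recovering a hazard ratio and 95% CI from a Cox model's log-hazard
# coefficient (beta) and standard error (se). The numbers used below are
# hypothetical illustrations, not the study's fitted values.

def hazard_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Return (HR, lower, upper) for a Cox coefficient at the 95% level."""
    hr = math.exp(beta)
    return hr, math.exp(beta - z * se), math.exp(beta + z * se)

hr, lo, hi = hazard_ratio_ci(beta=-0.462, se=0.134)
print(f"HR {hr:.2f} (95% CI {lo:.2f}; {hi:.2f})")
```

Because the CI is computed on the log scale and then exponentiated, it is asymmetric around the HR, which is why published intervals like 0.63 (0.49; 0.82) are not centered on the point estimate.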

20.
Skin Res Technol ; 30(7): e13829, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38951954

ABSTRACT

BACKGROUND: In the continuing effort to find safe and effective treatments for atopic dermatitis (AD), considerable attention remains on dietary adjustments. Nevertheless, the limited availability of research and conflicting findings in the academic literature make it difficult to establish conclusive recommendations. METHODS: Mendelian randomization (MR) was applied to the most comprehensive genome-wide association study (GWAS) data on tea intake (n = 447,485), green tea intake (n = 64,949), flavored milk intake (n = 64,941), "never eat eggs, dairy, wheat, sugar: wheat products" (n = 461,046), "never eat eggs, dairy, wheat, sugar: sugar or foods/drinks containing sugar" (n = 461,046), "never eat eggs, dairy, wheat, sugar: I eat all of the above" (n = 461,046), and atopic dermatitis (n = 218,467). We used the inverse-variance weighted (IVW) method as the primary method. RESULTS: The IVW analyses demonstrated that increased tea intake was genetically associated with a reduced risk of AD (odds ratio [OR]: 0.646, 95% confidence interval [CI]: 0.430-0.968, p = 0.034). Furthermore, green tea intake was significantly negatively associated with AD in the IVW model (OR: 0.986, 95% CI: 0.975-0.998; p = 0.024). AD risk was also reduced by never eating wheat products (IVW OR: 8.243E-04, 95% CI: 7.223E-06-9.408E-02, p = 0.003). No association with AD was observed for "sugar or foods/drinks containing sugar" or "I eat all of the above". CONCLUSIONS: Our MR study suggests a causal relationship of tea intake, green tea intake, and avoidance of wheat products with atopic dermatitis. Our findings suggest that atopic dermatitis may be prevented and managed by avoiding wheat products while increasing tea and green tea intake.
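The IVW method used as the primary analysis above pools per-SNP Wald ratios (outcome effect divided by exposure effect), weighting each by the inverse of its variance. A minimal fixed-effect sketch follows; the per-SNP values are made up for illustration and do not come from the study's GWAS data.

```python
import math

# Fixed-effect inverse-variance weighted (IVW) pooling of per-SNP causal
# estimates, as used in two-sample Mendelian randomization. Inputs are
# hypothetical per-SNP Wald ratios (log odds of AD per unit of exposure)
# and their standard errors.

def ivw_estimate(betas, ses):
    """Return the IVW pooled estimate and its standard error."""
    weights = [1.0 / se**2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se

betas = [-0.50, -0.35, -0.42]   # hypothetical per-SNP effect estimates
ses = [0.20, 0.15, 0.25]        # hypothetical per-SNP standard errors
beta, se = ivw_estimate(betas, ses)
print(f"IVW OR = {math.exp(beta):.3f}")
```

Exponentiating the pooled log-odds estimate yields the odds ratios reported in the abstract; precisely estimated SNPs (small SEs) dominate the pooled value, which is the defining design choice of the IVW estimator.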


Subjects
Atopic Dermatitis, Diet, Genome-Wide Association Study, Mendelian Randomization Analysis, Atopic Dermatitis/genetics, Humans, Diet/adverse effects, Tea, Eggs, Milk, Triticum/genetics, Dairy Products, Single Nucleotide Polymorphism