Results 1-20 of 2,397
1.
Article in English | MEDLINE | ID: mdl-38903962

ABSTRACT

Objectives: For early gastrointestinal lesions, size is an important factor in the selection of treatment. The virtual scale endoscope (VSE) is a newly developed endoscope that can measure size more accurately than visual estimation. This study aimed to investigate whether VSE measurement is accurate for early gastrointestinal lesions of various sizes and morphologies. Methods: This study prospectively enrolled patients with early gastrointestinal lesions visually estimated to be ≤20 mm in size. Lesion sizes were measured visually and with VSE in the gastrointestinal tract, and finally with a ruler on the endoscopic resection specimens. The primary endpoint was the normalized difference (ND) of VSE measurement. The secondary endpoints were the ND of visual measurement and the difference in variability between the NDs of VSE and visual measurements. ND was calculated as 100 × (measured size - true size) / true size (%), with true size defined as the size measured with the ruler. Results: This study included 60 lesions from April 2022 to December 2022, 20 each in the esophagus, stomach, and colon. Lesion size was 14.0 ± 6.3 mm (mean ± standard deviation). Morphologies were protruded, slightly elevated, and flat or slightly depressed type in 8, 24, and 28 lesions, respectively. The primary endpoint, the ND of VSE measurement, was 0.3 ± 8.8%. For the secondary endpoints, the ND of visual measurement was -1.7 ± 29.3%, and variability was significantly smaller for the ND of VSE measurement than for that of visual measurement (p < 0.001, F-test). Conclusions: VSE measurement is accurate for early gastrointestinal lesions of various sizes and morphologies.
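
As an illustration of the ND formula and the F-test comparison of variability described above, here is a minimal Python sketch; all lesion sizes in it are made-up examples, not study data.

```python
import numpy as np
from scipy import stats

def normalized_difference(measured_mm, true_mm):
    """ND (%) as defined above: 100 * (measured - true) / true."""
    measured_mm = np.asarray(measured_mm, dtype=float)
    true_mm = np.asarray(true_mm, dtype=float)
    return 100.0 * (measured_mm - true_mm) / true_mm

# Hypothetical lesion sizes (mm); ruler measurement is the ground truth.
ruler  = np.array([12.0, 18.0,  8.0, 15.0, 20.0])
vse    = np.array([12.5, 17.6,  8.2, 14.8, 20.6])
visual = np.array([10.0, 21.0,  6.5, 17.0, 16.0])

nd_vse = normalized_difference(vse, ruler)
nd_vis = normalized_difference(visual, ruler)

# Two-sided F-test for equality of variances, i.e. the comparison of the
# variability of the two NDs reported in the abstract.
f_stat = np.var(nd_vse, ddof=1) / np.var(nd_vis, ddof=1)
dfn, dfd = nd_vse.size - 1, nd_vis.size - 1
p_value = 2 * min(stats.f.cdf(f_stat, dfn, dfd), stats.f.sf(f_stat, dfn, dfd))
print(f"ND VSE: {nd_vse.mean():.1f} ± {nd_vse.std(ddof=1):.1f}%, "
      f"ND visual: {nd_vis.mean():.1f} ± {nd_vis.std(ddof=1):.1f}%, "
      f"F = {f_stat:.3f}, p = {p_value:.4f}")
```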

2.
Article in English | MEDLINE | ID: mdl-38976164

ABSTRACT

BACKGROUND: Vitamin D is critical to bone health by regulating intestinal absorption of calcium, whereas proinflammatory cytokines, including IL-1, IL-6, IL-12, and TNF-α, are known to increase bone resorption. We hypothesized that vitamin D and these cytokines at the time of breast cancer diagnosis were predictive of fragility fractures in women receiving aromatase inhibitors (AIs). METHODS: In a prospective cohort of 1,709 breast cancer patients treated with AIs, we measured the levels of 25-hydroxyvitamin D (25OHD), IL-1β, IL-6, IL-12, and TNF-α in baseline blood samples. These biomarkers were analyzed for associations with bone turnover markers (BALP and TRACP), bone regulatory markers (OPG and RANKL), bone mineral density (BMD) close to cancer diagnosis, and risk of fragility fractures during a median of 7.5 years of follow-up. RESULTS: Compared to patients with vitamin D deficiency, patients with sufficient levels had higher bone turnover, lower BMD, and higher fracture risk; the latter became non-significant after controlling for covariates including BMD, and disappeared when patients taking vitamin D supplements or bisphosphonates, or with a history of fracture or osteoporosis, were excluded. There was a non-significant trend of higher levels of IL-1β and TNF-α being associated with higher risk of fracture (highest vs. lowest tertile, IL-1β: adjusted HR = 1.37, 95% CI = 0.94-1.99; TNF-α: adjusted HR = 1.38, 95% CI = 0.96-1.98). CONCLUSIONS: Our results do not support proinflammatory cytokines or vitamin D levels as predictors of the risk of fragility fractures in women receiving AIs for breast cancer.

3.
J Am Med Dir Assoc ; : 105130, 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38972334

ABSTRACT

OBJECTIVE: This study examines the association between childhood starvation and the risk of diabetes in older Chinese adults, and the impact of leisure activity on this association. DESIGN: Prospective cohort study based on the Chinese Longitudinal Healthy Longevity Study (CLHLS), a nationwide cohort study in China. SETTING AND PARTICIPANTS: A total of 4637 older adults aged ≥65 years, all with documented diabetes history, experiences of childhood starvation, and participation in leisure activities, were recruited. METHODS: Childhood starvation exposure was assessed via self-reported responses to a structured questionnaire. Leisure activities were measured by 9 distinct components and categorized into 3 categories: productive activity, recreational activity, and sedentary activity. Diabetes status was determined by self-reported, physician-diagnosed cases during the follow-up period. Nonparametric survival models were employed for analysis. RESULTS: Over an average follow-up of 4.3 years, 215 of 4637 participants (4.6%) reported a confirmed diagnosis of diabetes. Nonparametric survival models showed that those reporting childhood starvation had a higher risk of late-life diabetes [hazard ratio (HR) 1.72, 95% CI 1.21-2.44]. Engaging in productive activity (HR 0.90, 95% CI 0.83-0.99) and recreational activity (HR 0.88, 95% CI 0.77-1.00) was linked with a reduced risk of late-life diabetes. Sedentary activity did not show a significant effect. Further analysis highlighted interaction effects of leisure activities on diabetes risk across demographic and historical exposure subgroups. CONCLUSIONS AND IMPLICATIONS: Engaging in productive and recreational leisure activities was inversely associated with the risk of diabetes in older adults who experienced childhood starvation. Promoting such activities could help mitigate long-term diabetes risk related to early-life nutritional deficiencies.

4.
J Atheroscler Thromb ; 2024 Jul 06.
Article in English | MEDLINE | ID: mdl-38972723

ABSTRACT

AIM: The constellation of cardiovascular disease (CVD) risk factors greatly impacts the lifetime risk (LTR) of incident CVD, but the LTR has not been thoroughly evaluated in the Japanese population. METHODS: We conducted a prospective study of 25,896 individuals 40-69 years old without a history of CVD, enrolled in 1995 (Cohort I) and 1993-1994 (Cohort II) in Japan. CVD risk factors (blood pressure, non-high-density lipoprotein [HDL] cholesterol levels, smoking status, and glucose concentrations) were used to stratify participants by risk. The sex-specific LTRs of incident coronary heart disease, stroke, atherosclerotic CVD, and total CVD were estimated for participants 45 years old in the 4 risk categories using the cumulative incidence rate, adjusted for the competing risk of death. RESULTS: We found clear differences in the LTR of total CVD according to the risk stratification. Individuals with ≥2 of the risk factors of blood pressure ≥140/90 mmHg or treated, non-HDL cholesterol level ≥170 mg/dL or treated, current smoking, and diabetes had substantially higher adjusted LTRs of CVD than the other groups, with an LTR of 26.5% (95% confidence interval, 24.0%-29.0%) for men and 15.3% (13.1%-17.5%) for women at 45 years. The LTR of incident stroke was the highest among CVDs, and the presence of hypertension and diabetes mellitus strongly influenced the LTR of total CVD. CONCLUSION: The impact of risk accumulation on the LTR of CVD was greater in men, and 1 in 4 men with ≥2 major risk factors at 45 years of age developed CVD in their lifetime.
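
Cumulative incidence adjusted for the competing risk of death is commonly estimated with the Aalen-Johansen estimator. Below is a hedged sketch using lifelines; the data and event coding are illustrative, not from the study.

```python
import pandas as pd
from lifelines import AalenJohansenFitter

# Illustrative follow-up data, not study data.
# event_code: 0 = censored, 1 = incident CVD, 2 = death free of CVD (competing risk)
df = pd.DataFrame({
    "years_followed": [10.2, 3.5, 22.0, 15.1, 8.7, 30.0, 12.4, 27.9],
    "event_code":     [1,    2,   0,    1,    2,   0,    1,    0],
})

ajf = AalenJohansenFitter()
ajf.fit(df["years_followed"], df["event_code"], event_of_interest=1)
# Cumulative incidence of CVD that treats death as a competing event rather
# than as censoring (1 - Kaplan-Meier would over-estimate the risk).
print(ajf.cumulative_density_.tail())
```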

5.
Neurocrit Care ; 2024 Jul 09.
Article in English | MEDLINE | ID: mdl-38982001

ABSTRACT

BACKGROUND: The diagnosis of intensive care unit (ICU)-acquired weakness (ICUAW) and critical illness neuromyopathy (CINM) is frequently hampered in routine clinical practice. We evaluated a novel panel of blood-based inflammatory, neuromuscular, and neurovascular biomarkers as an alternative diagnostic approach for ICUAW and CINM. METHODS: Patients admitted to the ICU with a Sequential Organ Failure Assessment score of ≥8 on 3 consecutive days within the first 5 days, as well as healthy controls, were enrolled. The Medical Research Council Sum Score (MRCSS) was calculated, and motor and sensory electroneurography (ENG) for assessment of peripheral nerve function was performed at days 3 and 10. ICUAW was defined by an MRCSS <48 and CINM by pathological ENG alterations, both at day 10. Blood samples were taken at days 3, 10, and 17 for quantitative analysis of 18 different biomarkers (white blood cell count, C-reactive protein, procalcitonin, C-terminal agrin filament, fatty-acid-binding protein 3, growth and differentiation factor 15, syndecan 1, troponin I, interferon-γ, tumor necrosis factor-α, interleukin-1α [IL-1α], IL-1β, IL-4, IL-6, IL-8, IL-10, IL-13, and monocyte chemoattractant protein 1). Results of the biomarker analysis were categorized according to ICUAW and CINM status. Clinical outcome was assessed after 3 months. RESULTS: Between October 2016 and December 2018, 38 critically ill patients, grouped by ICUAW status (18 with and 20 without) and CINM status (18 with and 17 without), as well as ten healthy volunteers, were included. Biomarkers were significantly elevated in critically ill patients compared to healthy controls and correlated with disease severity and 3-month outcome parameters. However, none of the biomarkers enabled discrimination of patients with and without neuromuscular impairment, irrespective of the applied classification. CONCLUSIONS: Blood-based biomarkers are generally elevated in ICU patients but do not identify patients with ICUAW or CINM. TRIAL REGISTRATION: ClinicalTrials.gov identifier: NCT02706314.

6.
Front Oncol ; 14: 1384931, 2024.
Article in English | MEDLINE | ID: mdl-38947887

ABSTRACT

Objective: This study aims to construct a predictive model based on machine learning algorithms to assess the risk of prolonged hospital stay after surgery for colorectal cancer and to analyze preoperative and postoperative factors associated with extended hospitalization. Methods: We prospectively collected clinical data from 83 colorectal cancer patients. The study included 40 variables (39 predictor variables and 1 target variable). Important variables were identified through variable selection with the Lasso regression algorithm, and predictive models were constructed using ten machine learning models: Logistic Regression, Decision Tree, Random Forest, Support Vector Machine, Light Gradient Boosting Machine, KNN, Extreme Gradient Boosting, Categorical Boosting, Artificial Neural Network, and Deep Forest. Model performance was evaluated using Bootstrap ROC curves and calibration curves, with the optimal model selected and further interpreted using the SHAP explainability algorithm. Results: Ten significantly correlated important variables were identified through Lasso regression, validated by 1000 Bootstrap resamplings, and represented through Bootstrap ROC curves. The Logistic Regression (LR) model achieved the highest AUC (AUC = 0.99, 95% CI = 0.97-0.99). The explainable machine learning algorithm revealed that the distance walked on the third day post-surgery was the most important variable for the LR model. Conclusion: This study successfully constructed a model predicting postoperative hospital stay duration using patients' clinical data. This model promises to give healthcare professionals a more precise prediction tool in clinical practice, offering a basis for personalized nursing interventions, thereby improving patient prognosis and quality of life and enhancing the efficiency of medical resource utilization.
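
A sketch of the general pipeline described (Lasso-based variable selection feeding a logistic regression, with a bootstrap estimate of the AUC), using scikit-learn on synthetic data of the same shape (83 patients, 39 predictors). This is an illustration under stated assumptions, not the authors' code.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LassoCV, LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: 83 samples, 39 predictors, binary target.
X, y = make_classification(n_samples=83, n_features=39, n_informative=10,
                           random_state=0)

model = make_pipeline(
    StandardScaler(),
    SelectFromModel(LassoCV(cv=5, random_state=0)),  # Lasso variable selection
    LogisticRegression(max_iter=1000),
)
model.fit(X, y)

# Bootstrap the ROC AUC (1000 resamples, as in the abstract).
rng = np.random.default_rng(0)
aucs = []
for _ in range(1000):
    idx = rng.integers(0, len(y), len(y))
    if len(np.unique(y[idx])) < 2:
        continue  # a valid resample must contain both classes
    aucs.append(roc_auc_score(y[idx], model.predict_proba(X[idx])[:, 1]))
print(f"AUC = {np.mean(aucs):.2f} "
      f"(95% CI {np.percentile(aucs, 2.5):.2f}-{np.percentile(aucs, 97.5):.2f})")
```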

7.
J Autoimmun ; 147: 103259, 2024 May 31.
Article in English | MEDLINE | ID: mdl-38823158

ABSTRACT

BACKGROUND: High salt intake may play a critical role in the etiology of psoriasis. Yet, evidence on the association of high salt intake with risk of psoriasis is limited. OBJECTIVE: To estimate the association between frequency of adding salt to foods and risk of psoriasis. METHODS: We conducted a prospective cohort study of 433,788 participants from the UK Biobank. Hazard ratios (HRs) and their 95% confidence intervals (CIs) for risk of psoriasis in relation to frequency of adding salt to foods were estimated using multivariable Cox proportional hazards models. We further evaluated the joint association of adding salt to foods and genetic susceptibility with risk of psoriasis. We conducted a mediation analysis to assess how much of the effect of adding salt to foods on risk of psoriasis was mediated through several selected mediators. RESULTS: During a median of 14.0 years of follow-up, 4279 incident cases of psoriasis were identified. In the multivariable-adjusted model, a higher frequency of adding salt to foods was significantly associated with an increased risk of psoriasis ("always" versus "never/rarely" adding salt to foods, HR = 1.25, 95% CI: 1.10, 1.41). The observed positive association was generally similar across subgroups. In the joint association analysis, participants with a high genetic risk (above the second tertile) and the highest frequency of adding salt to foods had a 149% higher risk of psoriasis than participants with a low genetic risk (below the first tertile) and the lowest frequency of adding salt to foods (HR = 2.49, 95% CI: 2.05, 3.02). Mediation analysis revealed that 1.8%-3.2% of the positive association between frequency of adding salt and risk of psoriasis was statistically significantly mediated by obesity and inflammatory biomarkers such as C-reactive protein and the systemic immune-inflammation index (all P values < 0.004). CONCLUSIONS: Our study demonstrated a positive association between frequency of adding salt to foods and risk of psoriasis. The positive association was independent of multiple other risk factors and may be partially mediated through obesity and inflammation.
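
For readers unfamiliar with the modelling step, here is a minimal, hypothetical sketch of a multivariable Cox proportional hazards fit of the kind described, using lifelines on simulated data; the column names are assumptions, not UK Biobank fields.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "salt_freq": rng.integers(0, 4, n),        # 0 = never/rarely ... 3 = always
    "age":       rng.normal(56, 8, n).round(),
    "bmi":       rng.normal(27, 4, n).round(1),
})
# Simulated follow-up in which higher salt_freq modestly increases the hazard.
df["follow_up_years"] = rng.exponential(40 / np.exp(0.2 * df["salt_freq"]))
df["psoriasis"] = (df["follow_up_years"] < 14).astype(int)   # incident case
df["follow_up_years"] = df["follow_up_years"].clip(upper=14) # censor at 14 years

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="psoriasis")
cph.print_summary()  # the exp(coef) column gives the hazard ratios
```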

8.
J Alzheimers Dis ; 100(1): 309-320, 2024.
Article in English | MEDLINE | ID: mdl-38875039

ABSTRACT

Background: Conflicting research on retinal biomarkers of Alzheimer's disease and related dementias (AD/ADRD) is likely related to limited sample sizes, study design, and protocol differences. Objective: The prospective Eye Adult Changes in Thought (Eye ACT) study seeks to address these gaps. Methods: Eye ACT participants are recruited from ACT, an ongoing cohort of dementia-free older adults followed biennially until AD/ADRD diagnosis, and undergo visual function and retinal imaging assessment either in clinic or at home. Results: 330 participants had been recruited as of 03/2023. Compared to ACT participants not in Eye ACT (N = 1868), Eye ACT participants (N = 330) are younger (mean age: 70.3 versus 71.2, p = 0.014), newer to ACT (median ACT visits since baseline: 3 versus 4, p < 0.001), have more years of education (17.7 versus 16.2, p < 0.001), and had lower rates of visual impairment (12% versus 22%, p < 0.001). Compared to those seen in clinic (N = 300), Eye ACT participants seen at home (N = 30) are older (77.2 versus 74.9, p = 0.015), more frequently female (60% versus 49%, p = 0.026), and have significantly worse visual acuity (71.1 versus 78.9 Early Treatment Diabetic Retinopathy Study letters, p < 0.001) and contrast sensitivity (-1.9 versus -2.1 mean log units at 3 cycles per degree, p = 0.002). Cognitive scores and retinal imaging measurements are similar between the two groups. Conclusions: Participants assessed at home had significantly worse visual function than those seen in clinic. By including these participants, Eye ACT provides a unique longitudinal cohort for evaluating potential retinal biomarkers of dementia.


Subjects
Alzheimer Disease , Humans , Female , Male , Aged , Prospective Studies , Cohort Studies , Alzheimer Disease/diagnostic imaging , Retina/diagnostic imaging , Aged, 80 and over , Vision Disorders , Middle Aged , Dementia/diagnostic imaging , Tomography, Optical Coherence , Research Design
9.
Article in Japanese | MEDLINE | ID: mdl-38910125

ABSTRACT

Objectives: This study aimed to examine the cut-off point of the Risk Assessment Scale (RAS) for predicting the 9-year risk of functional disability among older Japanese adults. Methods: This prospective, 9-year follow-up study used data from the Sasaguri Genkimon Study in Fukuoka. Of the 2,629 older adults who had no functional disability and participated in the baseline survey in 2011, 2,254 with complete data were included in the analysis. The RAS was assessed using a questionnaire that has shown good predictive and external validity for the 3-year risk of functional disability. The outcome was the incidence of functional disability during follow-up, defined as a new certification of the need for support or care. The cut-off point of the RAS was estimated as the point giving the maximum χ² value in the log-rank test. The predictive validity of the RAS for functional disability was examined using the C-statistic for the total score, and sensitivity and specificity for the cut-off point. Participants were then categorized into high-score and low-score groups according to the cut-off point. The hazard ratio (HR) and 95% confidence interval (95% CI) for the 9-year risk of functional disability in the high-score group relative to the low-score group were calculated using a Cox proportional hazards model. In the multivariate model, the HR was adjusted for living alone, education, economic status, drinking, smoking, and multimorbidity. Results: New functional disability was certified in 647 participants (28.7%) during a median follow-up of 8.75 years. The cut-off point for functional disability was 13/14. The C-statistic was 0.774, and the sensitivity and specificity were 0.726 and 0.712, respectively. Compared with the low-score group (0-13 points), the HR (95% CI) of the high-score group (≥14 points) for incident functional disability over 9 years was 5.50 (4.62-6.54) in the crude model and 4.81 (4.00-5.78) in the multivariate model (P < .001). Conclusion: With its long follow-up of 9 years, this study demonstrated that the 13/14 cut-off point of the RAS is suitable for long-term assessment of functional disability risk. Our results suggest that the 13/14 cut-off point of the RAS is a promising tool for assessing the risk of functional disability over a longer time frame, highlighting its potential for early prevention and intervention.
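
The cut-off search described (the RAS value maximizing the log-rank χ² statistic) can be sketched as follows with lifelines; the data and scoring range are simulated stand-ins, not the Sasaguri Genkimon data.

```python
import numpy as np
import pandas as pd
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({"ras_score": rng.integers(0, 49, n)})
# Simulated time to disability: higher RAS scores fail sooner.
df["years"] = rng.exponential(30 / (1 + 0.1 * df["ras_score"]))
df["disability"] = (df["years"] < 9).astype(int)
df["years"] = df["years"].clip(upper=9)  # administrative censoring at 9 years

# Scan candidate cut-offs; keep the one with the largest log-rank chi-square.
best = max(
    range(1, 48),
    key=lambda c: logrank_test(
        df.loc[df.ras_score <= c, "years"], df.loc[df.ras_score > c, "years"],
        df.loc[df.ras_score <= c, "disability"], df.loc[df.ras_score > c, "disability"],
    ).test_statistic,
)
print(f"cut-off maximizing the log-rank chi-square: {best}/{best + 1}")
```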

10.
BMC Res Notes ; 17(1): 177, 2024 Jun 25.
Article in English | MEDLINE | ID: mdl-38918795

ABSTRACT

OBJECTIVE: To assess first-trimester recruitment and retention of pregnant patients who regularly used cannabis but no other substances, measured by willingness to participate in a research study, completion of self-administered electronic questionnaires, and willingness to provide urine samples during each trimester of pregnancy. We designed and launched a prospective feasibility study titled Cannabis Legalization in Michigan (CALM) - Maternal & Infant Health (MIH) in two Michigan clinics after the recreational use of cannabis became legal for adults 21 years and older. RESULTS: Over half (52%) of the patients asked to participate in CALM-MIH consented to the study. Two-thirds (66%) of screened patients initiated prenatal care during their first trimester of pregnancy, and 50% used cannabis, of whom the majority did not concurrently use other substances. All participants recruited into the prospective study completed the first-trimester questionnaire and provided urine samples. Study retention was 80%, and all participants who completed follow-up assessments were willing to provide urine samples.


Subjects
Cannabis , Feasibility Studies , Humans , Female , Pregnancy , Adult , Prospective Studies , Pregnancy Trimester, First/urine , Patient Selection , Surveys and Questionnaires , Young Adult , Michigan , Prenatal Care/statistics & numerical data
11.
Sex Med ; 12(3): qfae034, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38846267

ABSTRACT

Background: Pelvic floor muscle training (PFMT) has emerged as a potential intervention to improve sexual function after total hysterectomy (TH). Electromyographic (EMG) biofeedback is an adjunct that may improve outcomes. Aim: In this study we aimed to compare EMG biofeedback-assisted PFMT and PFMT alone for improving sexual function in women after TH. Methods: For this prospective study we enrolled women undergoing TH in our hospital between January 2022 and April 2023. Participants were divided according to the treatment they selected: EMG biofeedback-assisted PFMT or PFMT alone. Outcomes: The primary study outcome was the change in sexual function, evaluated with the Female Sexual Function Index (FSFI). Secondary outcomes were the changes in anxiety and depression, evaluated with the Hospital Anxiety and Depression Scale, and in pelvic floor muscle strength, evaluated with the Glazer assessment, from before to after treatment. Results: A total of 73 patients were included, of whom 38 were treated with EMG biofeedback-assisted PFMT. After treatment, sexual function was significantly improved compared to baseline in all patients (all P < .001). The changes in total FSFI scores from before to after treatment were significantly greater in patients receiving EMG biofeedback-assisted PFMT than in those receiving PFMT alone (all P < .05). There were no significant differences between the 2 groups in the changes in the Glazer score and Hospital Anxiety and Depression Scale scores from before to after treatment (both P > .05). Clinical Translation: The results suggest that EMG biofeedback-assisted PFMT may be used to improve the sexual function of patients following TH. Strengths and Limitations: This study is limited by its single-center design, small sample size, lack of randomization, and absence of estrogen monitoring in enrolled participants. Conclusions: EMG biofeedback-assisted PFMT appears to be more effective than PFMT alone in improving sexual function among patients after TH.

12.
Trials ; 25(1): 384, 2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38877566

ABSTRACT

BACKGROUND: In recent years, alternative monitoring approaches, such as risk-based and remote monitoring techniques, have been recommended over traditional on-site monitoring to improve monitoring efficiency. Remote risk-based monitoring (R2BM) is a monitoring technique that combines risk-based and remote monitoring and focuses on the detection of critical data and process errors. Direct data capture (DDC), which collects electronic source data directly, can facilitate R2BM by minimizing the extent of source documents that must be reviewed and reducing the additional workload of R2BM. In this study, we evaluated the effectiveness of R2BM and the synergistic effect of combining R2BM with DDC. METHODS: R2BM was prospectively conducted with eight participants in a randomized clinical trial using a remote monitoring system that uploaded photographs of source documents to a cloud location. Critical data and processes were verified by R2BM and later confirmed in full by on-site monitoring, to evaluate the ability of R2BM to detect critical data and process errors and the workload of uploading photographs for clinical trial staff. In addition, the reduction in the number of uploaded photographs was estimated under the assumption that DDC was introduced for data collection. RESULTS: Of the 4645 data points, 20.9% (n = 973, 95% confidence interval = 19.8-22.2) were identified as critical. All critical data errors, corresponding to 5.4% (n = 53/973, 95% confidence interval = 4.1-7.1) of the critical data, and all critical process errors were detectable by R2BM. The mean number of uploaded photographs and the mean time to upload them per visit per participant were 34.4 ± 11.9 and 26.5 ± 11.8 min (mean ± standard deviation), respectively. Assuming that DDC was introduced for data collection, 45.0% (95% confidence interval = 42.2-47.9) of the photographs uploaded for R2BM could have been eliminated. CONCLUSIONS: R2BM can detect 100% of critical data and process errors without on-site monitoring. Combining R2BM with DDC reduces the workload of R2BM and further improves its efficiency.
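
The intervals reported above are simple binomial proportion CIs. A quick illustration for the 973/4645 critical data points, using the Wilson method (a method choice assumed here; the authors' exact method is not stated):

```python
from statsmodels.stats.proportion import proportion_confint

# 973 critical data points out of 4645 total.
lo, hi = proportion_confint(973, 4645, alpha=0.05, method="wilson")
print(f"{973 / 4645:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```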


Subjects
Photography , Humans , Prospective Studies , Risk Assessment , Workload , Cloud Computing , Data Collection/methods , Female , Male , Data Accuracy , Research Design
13.
Vascular ; : 17085381241258192, 2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38828763

ABSTRACT

OBJECTIVE: To evaluate the short-term clinical outcomes of radiofrequency ablation (RFA) using a radiofrequency (RF) needle device for varicose ulcers. METHODS: From September 2020 to September 2021, a total of 80 patients with varicose ulcers were included in this study. Based on the surgical method, patients were divided into an RF group and a control group, with 40 cases in each. In the RF group, RFA was performed using an RF needle device, and foam sclerotherapy was used for the superficial veins. The control group was treated with conventional high-ligation stripping. Surgical data, hospitalization data, clinical efficacy, and postoperative complications were compared between the two groups. The correlations of red blood cell count (RBC), hemoglobin (HB), and hematocrit (HCT) with ulcer healing time were also analyzed. RESULTS: Compared to the control group, the RF group had a shorter surgery time, shorter hospital stay, and less intraoperative bleeding (p < .05). The VCSS and CIVIQ scores in the RF group were significantly higher than those in the control group (p < .05). Ulcer healing time was shorter in the RF group (χ² = 19.766, p < .001), and the RF group had fewer postoperative complications. RBC, HB, and HCT were positively correlated with ulcer healing time (p < .05). CONCLUSION: RFA with an RF needle device for patients with varicose ulcers showed acceptable short-term clinical outcomes, with less trauma, faster recovery, and fewer complications.

14.
Prax Kinderpsychol Kinderpsychiatr ; 73(4): 311-330, 2024 Jun.
Article in German | MEDLINE | ID: mdl-38840539

ABSTRACT

The Protective Role of Self-Regulation for HRQOL of Adolescents with a Chronic Physical Health Condition. A chronic physical condition comes with many challenges and negatively impacts the health-related quality of life (HRQOL) of those affected. Self-regulation plays an important role in successfully coping with the demands of a chronic condition. In line with a resource-oriented approach, this study aimed to investigate the moderating effect of self-regulation on the relationship between disease severity and HRQOL. For this, 498 adolescents with cystic fibrosis, juvenile idiopathic arthritis, or type-1 diabetes, aged 12-21 years (M = 15.43, SD = 2.07), were recruited through three patient registers. Subjective disease severity, self-regulation (Brief Self-Control Scale), and HRQOL (DISABKIDS Chronic Generic Measure) were examined at two time points (T1 and T2, one year apart). Cross-sectional analysis showed significant effects of subjective disease severity and self-regulation on HRQOL. Prospective analysis, in which HRQOL at T1 was controlled for, revealed that disease severity only predicted emotion-related HRQOL at T2; self-regulation emerged as a predictor for the HRQOL subscales independence, emotion, inclusion, exclusion, and treatment. A significant moderation effect of self-regulation was found on the relationship between disease severity and the HRQOL emotion subscale. Our results highlight the positive impact of self-regulation on quality of life, specifically in the context of chronic conditions, and represent a starting point for prevention and intervention approaches.
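
The moderation analysis described amounts to testing an interaction term between disease severity and self-regulation. Below is a hypothetical sketch with statsmodels; the variable names and data are invented, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 498
df = pd.DataFrame({
    "severity": rng.normal(0, 1, n),   # subjective disease severity (standardized)
    "self_reg": rng.normal(0, 1, n),   # self-regulation score (standardized)
})
# Simulated HRQOL with a built-in buffering interaction.
df["hrqol"] = (50 - 5 * df.severity + 3 * df.self_reg
               + 2 * df.severity * df.self_reg + rng.normal(0, 5, n))

# 'severity * self_reg' expands to both main effects plus their product;
# a significant product term is the moderation effect.
fit = smf.ols("hrqol ~ severity * self_reg", data=df).fit()
print(fit.summary().tables[1])
```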

15.
Indian J Crit Care Med ; 28(5): 512, 2024 May.
Article in English | MEDLINE | ID: mdl-38738204

ABSTRACT

How to cite this article: Vijayakumar M, Selvam V, Renuka MK, Rajagopalan RE. Author Response. Indian J Crit Care Med 2024;28(5):512.

16.
Eur J Nutr ; 2024 May 22.
Article in English | MEDLINE | ID: mdl-38775828

ABSTRACT

BACKGROUND: Live dietary microbes have been hypothesized to promote human health. However, evidence linking the consumption of foods containing live microbes to mortality remains limited. OBJECTIVE: To investigate the association of consumption of foods with medium to high amounts of live microbes with all-cause, cancer-specific, and cardiovascular disease (CVD)-specific mortality. METHODS: The data were obtained from the National Health and Nutrition Examination Survey 1999-2018, linked at baseline to the 2019 National Death Index records. Based on their consumption of foods categorized as having medium or high microbial content (MedHi foods), participants were classified into three groups (G1, lowest, to G3, highest). Kaplan-Meier survival curves and multivariable Cox regression models were used to estimate the association of consumption of MedHi foods with mortality. Population-attributable fractions (PAFs) of consumption of MedHi foods in relation to mortality risk were also estimated. RESULTS: A total of 35,299 adults aged ≥20 years were included in this study. During a median follow-up of 9.67 years, adults in G3 had a 16% (hazard ratio [HR], 0.84; 95% confidence interval [CI], 0.77-0.90) lower risk of all-cause mortality and a 23% (HR, 0.77; 95% CI, 0.67-0.89) lower risk of CVD-specific mortality than adults in G1. The PAF of high (G3) versus intermediate or low (G1 + G2) consumption of MedHi foods was 3.4% for all-cause and 4.3% for CVD-specific mortality. CONCLUSIONS: Consumption of foods with higher microbial content is associated with a reduced risk of all-cause and CVD-specific mortality in US adults.
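
The abstract does not give its PAF formula; a common Levin-type estimate from exposure prevalence and relative hazard looks like the sketch below, with illustrative numbers rather than the study's inputs.

```python
def paf(prevalence_exposed: float, hazard_ratio: float) -> float:
    """Levin-type PAF = p(HR - 1) / (1 + p(HR - 1))."""
    excess = prevalence_exposed * (hazard_ratio - 1.0)
    return excess / (1.0 + excess)

# Hypothetical inputs: 30% of adults in the lower-consumption pattern, with
# HR 1/0.84 (the inverse of the protective HR reported for high consumption).
print(f"PAF = {paf(0.30, 1 / 0.84):.1%}")
```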

17.
Antioxidants (Basel) ; 13(5)2024 May 16.
Article in English | MEDLINE | ID: mdl-38790714

ABSTRACT

BRCA1 mutations predispose women to breast and ovarian cancer. The anticancer effect of zinc is typically linked to its antioxidant properties and its ability to protect cells against oxidative stress. Zinc regulates key processes in cancer development, including DNA repair, gene expression, and apoptosis. We took a blood sample from 989 female BRCA1 mutation carriers who were initially unaffected by cancer and followed them for a mean of 7.5 years thereafter. There were 172 incident cases of cancer, including 121 breast cancers, 29 ovarian cancers, and 22 cancers at other sites. A zinc level in the lowest tertile was associated with a modestly higher risk of ovarian cancer compared to zinc levels in the upper two tertiles (HR = 1.65; 95% CI 0.80 to 3.44; p = 0.18), but this was not significant. Among women with zinc levels in the lowest tertile, the 10-year cumulative risk of ovarian cancer was 6.1%; among those in the top two tertiles, it was 4.7%. There was no significant association between zinc level and breast cancer risk. Our preliminary study does not support an association between serum zinc level and cancer risk in BRCA1 mutation carriers.

18.
BJUI Compass ; 5(5): 476-482, 2024 May.
Article in English | MEDLINE | ID: mdl-38751955

ABSTRACT

Objectives: The aim was to investigate the predictive abilities of preoperative diffusion-weighted MRI (dwMRI) among patients with surgically treated upper tract urothelial carcinoma (UTUC). Materials and methods: Written consent was obtained from all participants in this prospective and ethically approved study. Thirty-five UTUC patients treated with radical surgery were examined with a preoperative dwMRI and prospectively included during 2017-2022. Two radiologists examined the CT scans and dwMRIs for radiological stage, and the apparent diffusion coefficient (ADC) of the tumours on the dwMRI was recorded. The radiologists were blinded to patient history, final histopathology, and each other's readings. The radiological variables were analysed for their ability to predict muscle-invasive disease (MID, T2-T4) and tumour grade at final pathology after radical surgery. Predictive ability was assessed using chi-square tests, Student's t-test, and the area under the receiver operating characteristic (ROC) curve. Agreement between the two radiologists was quantified with the intra-class correlation coefficient (ICC). P-values <0.05 were considered statistically significant. Results: Mean age was 72 years; 20 patients had high-grade tumours and 13 had MID. ADC values on the dwMRI were significantly lower in patients with MID than in patients with non-muscle-invasive disease (930 vs 1189, p < 0.001). The area under the ROC curve (AUC) for predicting MID was 0.88 (CI 0.77-0.99, p < 0.001). ADC values were also significantly lower in patients with high-grade tumours than in those with low-grade tumours (1005 vs 1210, p = 0.002). The ICC for ADC measurements between the two radiologists was 0.93 (CI 0.85-0.96, p < 0.001). Conclusion: Tumour ADC on dwMRI emerges as a potential biomarker for aggressive disease. The results are promising but should be validated in a larger, multicentre study.
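
The discrimination analysis reported (AUC of tumour ADC for predicting MID) can be reproduced in miniature as follows; the ADC values are made up for the example.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

adc = np.array([930, 880, 1010, 1190, 1220, 905, 1150, 1300])  # ADC per tumour
mid = np.array([1,   1,   1,    0,    0,    1,   0,    0])     # 1 = MID (T2-T4)

# Lower ADC indicates muscle invasion, so score with the negated ADC.
print(f"AUC = {roc_auc_score(mid, -adc):.2f}")
```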

19.
Endocrine ; 2024 May 16.
Article in English | MEDLINE | ID: mdl-38753244

ABSTRACT

BACKGROUND: Currently, no specific blood pressure (BP) target has been recommended for normotensive diabetic patients. We investigated the optimal systolic blood pressure (SBP) for lower cardiovascular disease (CVD) risk in normotensive diabetic patients. METHODS: In this 12-year follow-up study of participants in the Kailuan Study, we compared the risk of incident CVD (stroke and myocardial infarction) between SBP levels of 90-119 mmHg and 120-129 mmHg in 3,072 normotensive diabetic participants and 21,532 normotensive non-diabetic participants, respectively. SBP was expressed as mean time-weighted cumulative (MTWC) SBP, calculated from the multiple measurements of SBP during follow-up. Multivariate competing risk regression analyses were used for the analysis. RESULTS: In normotensive diabetic participants, MTWC SBP of 120-129 mmHg was associated with a lower risk of CVD (HR = 0.69 [0.50-0.95]) and myocardial infarction (HR = 0.48 [0.24-0.96]), and with a trend towards lower risk of stroke (HR = 0.80 [0.55-1.16]), compared to MTWC SBP of 90-119 mmHg. Sensitivity analyses confirmed the relationship between lower SBP and increased CVD risk. In contrast, in normotensive non-diabetic participants, MTWC SBP of 90-119 mmHg vs 120-129 mmHg showed no difference in the risk of CVD occurrence (HR = 0.99 [0.83-1.18]). CONCLUSIONS: In normotensive diabetic patients, the higher SBP level (120-129 mmHg) is associated with a lower risk of incident CVD.
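
The abstract does not define MTWC SBP in detail; one plausible reading, a time-weighted (trapezoidal) average of repeated SBP measurements, is sketched below with invented visit data.

```python
import numpy as np

def mtwc_sbp(times_years, sbp_values):
    """Mean time-weighted cumulative SBP: interval means of consecutive
    measurements, weighted by the time elapsed between visits."""
    t = np.asarray(times_years, dtype=float)
    s = np.asarray(sbp_values, dtype=float)
    dt = np.diff(t)                         # time between consecutive visits
    interval_means = (s[:-1] + s[1:]) / 2   # trapezoidal interval averages
    return np.sum(interval_means * dt) / np.sum(dt)

# Hypothetical example: four visits over 12 years of follow-up.
print(f"{mtwc_sbp([0, 4, 8, 12], [118, 124, 121, 127]):.1f} mmHg")
```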

20.
Arch Gerontol Geriatr ; 125: 105466, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38749086

ABSTRACT

BACKGROUND: A higher education level is associated with longer disability-free life expectancy (DFLE). However, evidence is scarce regarding factors that could help eliminate inequality in DFLE across education levels. This study aimed to clarify the association between education and DFLE and to estimate whether DFLE in people with lower education may increase to the level of people with higher education through social participation. METHODS: We analyzed data from 13,849 Japanese people aged 65 years and older who participated in a 13-year prospective study. At baseline, we collected information on education level (low, middle, or high) and social participation. DFLE was defined as the average number of years a person is expected to live without disability. To calculate DFLE for each education level group, the multistate life table method was employed using a Markov model. RESULTS: At age 65, DFLE (95% confidence interval [CI]) in women with low education was 21.3 years (20.8-21.8) without social participation and 24.3 (23.8-24.9) with social participation. In the middle education group, DFLE was 22.1 (21.6-22.6) without social participation and 25.0 (24.6-25.5) with social participation. In the high education group, DFLE was 22.1 (21.5-22.8) without social participation and 25.5 (25.0-26.0) with social participation. Similar results were found for men. CONCLUSIONS: DFLE in people with low or middle education who engaged in social participation was almost the same as in those with high education and social participation, suggesting that disparities in DFLE by education level could be offset by promoting social participation in older adults.
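
A multistate life table with a Markov model can be sketched as a discrete-time projection over three states (disability-free, disabled, dead). The transition probabilities below are hypothetical, not estimates from this study.

```python
import numpy as np

# Annual transition matrix P[i, j] = P(state j next year | state i now).
# Rows/columns: 0 = disability-free, 1 = disabled, 2 = dead (absorbing).
P = np.array([
    [0.96, 0.02, 0.02],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0])  # everyone starts disability-free at age 65
dfle = 0.0
for _ in range(60):                # project forward to age 125
    dfle += state[0]               # accumulate disability-free person-years
    state = state @ P
print(f"DFLE at age 65 ~ {dfle:.1f} years")
```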
