Results 1 - 20 of 39
1.
Menopause ; 29(10): 1184-1188, 2022 10 01.
Article in English | MEDLINE | ID: mdl-36150117

ABSTRACT

OBJECTIVE: The association between serum uric acid levels and the risk of diabetes mellitus in women stratified by menopausal status is not well established. Therefore, we investigated this association in a cohort of Japanese urban women. METHODS: We conducted a prospective cohort study of 3,304 women (1,252 premenopausal and 2,052 postmenopausal), aged 30 to 79 years, with no prior cardiovascular disease or diabetes mellitus, enrolled from a general urban population. A Cox proportional hazards model was used to calculate hazard ratios and 95% confidence intervals (CIs) for incident diabetes mellitus according to serum uric acid quartiles. RESULTS: During a median follow-up of 13.8 years, 219 incident diabetes mellitus cases were diagnosed. The incidence rate per 1,000 person-years was 3.42 in premenopausal women and 6.19 in postmenopausal women. After adjustment for potential risk factors, the multivariable hazard ratios (95% CIs) of the highest versus lowest serum uric acid quartiles were 1.56 (0.77-3.16) in premenopausal women, 2.00 (1.19-3.34) in postmenopausal women, and 1.81 (1.21-2.73) in all women. The interaction by menopausal status was not significant (P = 0.872). The corresponding population attributable fractions (95% CIs) were 13.3% (-8.9% to 31.1%), 19.1% (5.3%-30.9%), and 17.0% (5.6%-27.0%), respectively. CONCLUSIONS: Serum uric acid levels were positively associated with the risk of diabetes mellitus in postmenopausal women but not in premenopausal women. However, the lack of an association in premenopausal women may reflect limited statistical power, so further research is required to confirm this menopausal status-specific association.
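
For illustration only, a minimal sketch of this kind of quartile-based Cox proportional hazards analysis in Python with the lifelines package; the file name, column names, and covariates are assumptions, not the study's data or code.

```python
# Hypothetical sketch of a quartile-based Cox analysis (not the study's code).
import pandas as pd
from lifelines import CoxPHFitter

# Assumed columns: followup_years, diabetes (0/1 event), uric_acid, age, bmi.
df = pd.read_csv("cohort.csv")

# Split serum uric acid into quartiles; Q1 serves as the reference category.
df["ua_quartile"] = pd.qcut(df["uric_acid"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
dummies = pd.get_dummies(df["ua_quartile"], prefix="ua", drop_first=True).astype(float)

model_df = pd.concat([df[["followup_years", "diabetes", "age", "bmi"]], dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="followup_years", event_col="diabetes")
# The exp(coef) columns give the hazard ratios and their 95% CIs per quartile.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```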


Subject(s)
Diabetes Mellitus , Uric Acid , Diabetes Mellitus/epidemiology , Female , Humans , Postmenopause , Premenopause , Prospective Studies , Risk Factors
2.
Acta Diabetol ; 59(12): 1531-1537, 2022 Dec.
Article in English | MEDLINE | ID: mdl-35972542

ABSTRACT

AIM: We aimed to investigate the combined impact of liver enzymes and alcohol consumption on diabetes risk. METHODS: Data from 5,972 non-diabetic participants aged 30-79 years in the Suita study were analyzed. Diabetes incidence was surveyed every 2 years. Current daily alcohol consumption was defined as light drinking (<23.0 g ethanol/day in men and <11.5 g in women), moderate drinking (23.0-45.9 g and 11.5-22.9 g), and heavy drinking (≥46.0 g and ≥23.0 g). The nondrinkers category included both never-drinkers and former drinkers. RESULTS: During a median follow-up of 13 years, 597 incident diabetes cases were diagnosed. Higher levels of γ-glutamyltransferase (GGT), alanine aminotransferase (GPT), and aspartate aminotransferase (GOT) were associated with an increased diabetes risk, and current light drinkers had a lower risk of diabetes than nondrinkers. No sex differences were observed in these associations. Compared to nondrinkers in the lowest quartiles of liver enzymes, nondrinkers and current moderate/heavy drinkers in the highest quartiles had an increased risk of diabetes. However, no association was observed for current light drinkers in the highest quartiles of liver enzymes; the multivariable hazard ratios (95% CIs) in current light drinkers with the highest quartile of liver enzymes were 1.27 (0.68-2.37) for GGT, 1.05 (0.59-1.89) for GPT, and 0.76 (0.40-1.47) for GOT. CONCLUSION: High liver enzymes were associated with an increased diabetes risk. No increased diabetes risk was observed in current light drinkers, even in those with high levels of liver enzymes.


Subject(s)
Diabetes Mellitus , gamma-Glutamyltransferase , Male , Humans , Female , Alanine Transaminase , Risk Factors , Diabetes Mellitus/epidemiology , Diabetes Mellitus/etiology , Aspartate Aminotransferases , Alcohol Drinking/adverse effects , Alcohol Drinking/epidemiology , Liver , Ethanol
3.
Article in English | MEDLINE | ID: mdl-35675977

ABSTRACT

BACKGROUND: Although the age-adjusted incidence and mortality of cancer and cardiovascular disease (CVD) have been decreasing steadily in Japan, both diseases remain major contributors to morbidity and mortality as the population ages. Herein, we aim to provide a prescription of 10 health tips for a long and healthy life, named the "Lifelong Health Support 10 (LHS10)." METHODS: The LHS10 was developed by the preventive medicine specialists at the National Cerebral and Cardiovascular Center in Suita, where it has been used for health guidance to prevent CVD, cancer, and cognitive decline, in addition to their major risk factors such as hypertension, diabetes, and obesity. It consists of the lifestyle modification recommendations of the 2014 Japanese Society of Hypertension guidelines and the 2017 Japan Atherosclerosis Society Guidelines for preventing atherosclerotic CVD, and it is in line with other international lifestyle modification guidelines. In this narrative review, we summarize the results of several Japanese epidemiological studies investigating the association between the LHS10 items and the risk of cancer, CVD, and other chronic diseases including dementia, diabetes, and chronic kidney disease. RESULTS: The LHS10 includes avoiding smoking and secondhand smoke exposure, engaging in physical activity, refraining from excessive alcohol drinking, reducing fried foods and sugary soft drinks, cutting salt in food, consuming more vegetables, fruits, fish, soy foods, and fiber, and maintaining a proper body weight. All items of the LHS10 were shown to reduce the risk of cancer, CVD, and other chronic diseases. CONCLUSIONS: The LHS10 can be a helpful tool for health guidance.


Subject(s)
Cardiovascular Diseases , Hypertension , Neoplasms , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/prevention & control , Humans , Hypertension/prevention & control , Japan/epidemiology , Life Style , Prescriptions , Risk Factors
4.
EPMA J ; 13(1): 77-86, 2022 Mar.
Article in English | MEDLINE | ID: mdl-35273660

ABSTRACT

Background: Short and long sleep durations are common behaviors that could predict several cardiovascular diseases. However, the association between sleep duration and atrial fibrillation (AF) risk is not well established. AF is preventable, and risk-prevention approaches could reduce its occurrence. It is therefore essential to investigate whether sleep duration could predict AF incidence for possible preventive interventions, and to determine how various lifestyle and clinical characteristics affect this association so that such interventions can be personalized. Herein, we investigated the association between sleep duration and AF risk using a prospective cohort study and a meta-analysis of epidemiological evidence. Methods: Data from 6,898 people, aged 30-84 years, in the Suita Study were analyzed. AF was diagnosed during follow-up by ECG, medical records, checkups, and death certificates, while a baseline questionnaire was used to assess sleep duration. Cox regression was used to compute the hazard ratios (HRs) and 95% confidence intervals (CIs) of AF risk for daily sleep of ≤6 h (short sleep), ≥8 h (long sleep), and irregular sleep, including night-shift work, compared with 7 h (moderate sleep). Then, we combined our results with those from other eligible prospective cohort studies in two meta-analyses, one each for short and long sleep. Results: In the Suita Study, within a median follow-up period of 14.5 years, short and irregular sleep, but not long sleep, were associated with an increased risk of AF in the age- and sex-adjusted models: HRs (95% CIs) = 1.36 (1.03, 1.80) and 1.62 (1.16, 2.26), and in the multivariable-adjusted models: HRs (95% CIs) = 1.34 (1.01, 1.77) and 1.63 (1.16, 2.30), respectively. The significant associations between short and irregular sleep and AF risk remained consistent across age, sex, smoking, and drinking groups. However, they were attenuated among overweight and hypertensive participants. In the meta-analyses, short and long sleep durations were associated with AF risk: pooled HRs (95% CIs) = 1.21 (1.02, 1.42) and 1.18 (1.03, 1.35). No signs of significant heterogeneity across studies or publication bias were detected. Conclusion: Short, long, and irregular sleep could be associated with increased AF risk. In the context of predictive, preventive, and personalized medicine, sleep duration should be considered in future AF risk scores to stratify the general population for potential personalized lifestyle-modification interventions. Sleep management services should be considered for AF risk prevention, and these services should be individualized according to clinical characteristics and lifestyle factors. Supplementary Information: The online version contains supplementary material available at 10.1007/s13167-022-00275-4.
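
To make the pooling step concrete, here is a sketch of random-effects (DerSimonian-Laird) pooling of hazard ratios on the log scale; the per-study HRs and CIs below are placeholders, not the values combined in this meta-analysis.

```python
# Random-effects (DerSimonian-Laird) pooling of hazard ratios on the log scale.
# Illustrative sketch only; the HRs and CIs below are placeholders.
import numpy as np

hrs = np.array([1.36, 1.15, 1.22])     # per-study hazard ratios (hypothetical)
ci_low = np.array([1.03, 0.95, 1.01])
ci_high = np.array([1.80, 1.39, 1.47])

y = np.log(hrs)                                    # effects on the log scale
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1 / se**2                                      # inverse-variance weights

# Between-study variance tau^2 (DerSimonian-Laird estimator).
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)                 # Cochran's Q statistic
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_re = 1 / (se**2 + tau2)                          # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled HR = {np.exp(y_re):.2f} "
      f"({np.exp(y_re - 1.96*se_re):.2f}, {np.exp(y_re + 1.96*se_re):.2f})")
```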

5.
Article in English | MEDLINE | ID: mdl-35288490

ABSTRACT

BACKGROUND: A protective role for physical activity against the development of atrial fibrillation (AF) has been suggested. Stair climbing is a readily available form of physical activity that many people practice. Herein, we investigated the association between stair climbing and the risk of AF in a Japanese population. METHODS: In this prospective cohort study, we used data from 6,575 people registered in the Suita Study who were aged 30-84 years and had no history of AF. The frequency of stair climbing was assessed by a baseline questionnaire, while AF was diagnosed during follow-up using a 12-lead ECG, health records, check-ups, and death certificates. We used Cox regression to calculate the hazard ratios and 95% confidence intervals of AF incidence for climbing stairs 20-39%, 40-59%, and ≥60% of the time compared with <20% of the time. RESULTS: Within 91,389 person-years of follow-up, 295 participants developed AF. The incidence of AF across the stair climbing groups <20%, 20-39%, 40-59%, and ≥60% was 3.57, 3.27, 3.46, and 2.63/1,000 person-years, respectively. Stair climbing ≥60% of the time was associated with a reduced risk of AF after adjustment for age and sex: HR 0.69 (95% CI, 0.49-0.96). Further adjustment for lifestyle and medical history did not affect the results: 0.69 (0.49-0.98). CONCLUSION: Frequent stair climbing could protect against AF. From a preventive point of view, stair climbing could be a simple way to reduce AF risk at the population level.
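
As a quick arithmetic check, the overall crude incidence rate implied by the counts reported above (295 events over 91,389 person-years) can be computed directly:

```python
# Crude AF incidence rate from the counts reported in the abstract.
events = 295
person_years = 91_389
rate_per_1000 = events / person_years * 1_000
print(f"{rate_per_1000:.2f} AF cases per 1,000 person-years")  # ~3.23
```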


Subject(s)
Atrial Fibrillation , Stair Climbing , Adult , Aged , Aged, 80 and over , Atrial Fibrillation/diagnosis , Atrial Fibrillation/epidemiology , Atrial Fibrillation/etiology , Humans , Incidence , Middle Aged , Prospective Studies , Risk Factors
6.
J Atheroscler Thromb ; 29(11): 1663-1671, 2022 Nov 01.
Article in English | MEDLINE | ID: mdl-35034920

ABSTRACT

AIMS: This study aimed to investigate the association of mild hypertensive retinopathy with cardiovascular disease (CVD) risk. METHODS: A total of 7,027 residents aged 30-79 years without a history of CVD participated in annual health checkups and retinal photography assessments. Retinal microvascular abnormalities were graded using standard protocols and classified according to the Keith-Wagener-Barker classification. Mild hypertensive retinopathy was defined as grade 1 or 2. A Cox proportional hazards model was used to calculate hazard ratios (HRs) and 95% confidence intervals (CIs) for total CVD and its subtypes according to the presence or absence of mild hypertensive retinopathy. RESULTS: During a median follow-up of 17 years, 351 incident stroke and 247 coronary heart disease (CHD) cases were diagnosed. After adjustment for traditional cardiovascular risk factors, mild hypertensive retinopathy was positively associated with the risk of CVD (multivariable HR=1.24; 95% CI, 1.04-1.49) and stroke (1.28; 1.01-1.62) but not with the risk of CHD (1.19; 0.89-1.58). Generalized arteriolar narrowing and enhanced arteriolar wall reflex were positively associated with CVD risk; the multivariable HRs (95% CIs) were 1.24 (1.00-1.54) and 1.33 (1.02-1.74), respectively. Moreover, mild hypertensive retinopathy was positively associated with stroke risk in normotensive participants. CONCLUSION: Mild hypertensive retinopathy was positively associated with CVD and stroke risk in this urban Japanese population; in particular, generalized arteriolar narrowing and enhanced arteriolar wall reflex were positively associated with CVD risk. These findings suggest that retinal photography could be helpful for cardiovascular risk stratification in primary cardiovascular prevention.


Subject(s)
Cardiovascular Diseases , Coronary Disease , Hypertension , Hypertensive Retinopathy , Retinal Diseases , Stroke , Humans , Cardiovascular Diseases/diagnosis , Cardiovascular Diseases/epidemiology , Cardiovascular Diseases/etiology , Retinal Diseases/complications , Retinal Diseases/diagnosis , Retinal Diseases/epidemiology , Risk Factors , Hypertensive Retinopathy/complications , Hypertensive Retinopathy/diagnosis , Hypertensive Retinopathy/epidemiology , Hypertension/epidemiology , Coronary Disease/complications , Stroke/complications
7.
Cerebrovasc Dis ; 51(3): 323-330, 2022.
Article in English | MEDLINE | ID: mdl-34844243

ABSTRACT

INTRODUCTION: Stroke remains a major cause of death and disability in Japan and worldwide. Detecting individuals at high risk for stroke so that preventive approaches can be applied is recommended. This study aimed to develop a stroke risk prediction model for an urban Japanese population using cardiovascular risk factors. METHODS: We followed 6,641 participants aged 30-79 years with no history of stroke or coronary heart disease. A Cox proportional hazards model estimated the risk of stroke incidence adjusted for potential confounders at the baseline survey. The model's performance was assessed using the receiver operating characteristic curve and the Hosmer-Lemeshow statistic. The internal validity of the risk model was tested using derivation and validation samples. Regression coefficients were used for score calculation. RESULTS: During a median follow-up of 17.1 years, 372 participants developed stroke. A risk model including older age, current smoking, increased blood pressure, impaired fasting blood glucose and diabetes, chronic kidney disease, and atrial fibrillation predicted stroke incidence with an area under the curve of 0.76 and a goodness-of-fit p value of 0.21. The risk model was shown to be internally valid (goodness-of-fit p value in the validation sample = 0.64). On a risk score from 0 to 26, the incidence of stroke for the categories 0-5, 6-7, 8-9, 10-11, 12-13, 14-15, and 16-26 was 1.1%, 2.1%, 5.4%, 8.2%, 9.0%, 13.5%, and 18.6%, respectively. CONCLUSION: We developed a new stroke risk model for the urban general population in Japan. Further research is required to determine the clinical practicality of this model.
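
A common way to build such a point score is to scale each regression coefficient by the smallest one and round to an integer; a sketch under hypothetical coefficients (the published model's actual coefficients and scoring rules may differ):

```python
# Illustrative conversion of Cox regression coefficients into integer risk
# points; the coefficients below are hypothetical, not the published model.
coefs = {
    "age_per_10y": 0.55,
    "current_smoking": 0.40,
    "elevated_bp": 0.62,
    "diabetes_or_ifg": 0.48,
    "chronic_kidney_disease": 0.35,
    "atrial_fibrillation": 0.90,
}

unit = min(coefs.values())  # smallest coefficient corresponds to 1 point
points = {name: round(beta / unit) for name, beta in coefs.items()}

# Example profile: a current smoker with elevated blood pressure.
profile = {"age_per_10y": 0, "current_smoking": 1, "elevated_bp": 1,
           "diabetes_or_ifg": 0, "chronic_kidney_disease": 0,
           "atrial_fibrillation": 0}
score = sum(points[k] * profile[k] for k in points)
print(points)
print("total score:", score)
```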


Subject(s)
Coronary Disease , Stroke , Coronary Disease/diagnosis , Coronary Disease/epidemiology , Humans , Proportional Hazards Models , Risk Assessment , Risk Factors , Stroke/diagnosis , Stroke/epidemiology
9.
J Atheroscler Thromb ; 29(10): 1511-1521, 2022 Oct 01.
Article in English | MEDLINE | ID: mdl-34803086

ABSTRACT

AIM: Weight change can have many health consequences. This study aimed to investigate the association between weight change and mortality risk from total cardiovascular disease (CVD), ischemic heart disease (IHD), and stroke among Japanese adults. METHODS: We used Suita Study data from 4,746 people aged 30-79 years in this prospective cohort study. Weight change was defined as the difference between baseline weight and weight at age 20. We used Cox proportional hazards models to calculate hazard ratios (HRs) and 95% confidence intervals (CIs) of total CVD, IHD, and stroke mortality for 1) participants with a weight change (>10, 5 to 10, -5 to -10, and <-10 kg) compared to those with stable weight (-4.9 to 4.9 kg) and 2) participants who moved from one body mass index category (underweight, normal weight, or overweight) to another compared to those with normal weight at both age 20 and baseline. RESULTS: Within a median follow-up period of 19.9 years, the numbers of deaths from total CVD, IHD, and stroke were 268, 132, and 79, respectively. Weight loss of >10 kg was associated with increased risk of total CVD mortality (HR, 2.07; 95% CI, 1.29-3.32) and stroke mortality (3.02; 1.40-6.52). Moving from normal weight at age 20 to underweight at baseline was associated with increased risk of total CVD, IHD, and stroke mortality: 1.76 (1.12-2.77), 2.10 (1.13-3.92), and 2.25 (1.05-4.83), respectively. CONCLUSION: Weight loss, especially moving from normal weight to underweight, was associated with an increased risk of CVD mortality.


Subject(s)
Cardiovascular Diseases , Myocardial Ischemia , Stroke , Adult , Humans , Proportional Hazards Models , Prospective Studies , Risk Factors , Stroke/complications , Thinness/complications , Weight Loss , Young Adult
10.
J Pharm Health Care Sci ; 7(1): 25, 2021 Aug 01.
Article in English | MEDLINE | ID: mdl-34332639

ABSTRACT

BACKGROUND: Antimicrobial stewardship (AS) is defined as coordinated interventions to improve and measure the appropriate use of antimicrobial agents. However, the resources available for AS differ with the size of the clinical setting, so AS programs based on guidelines need to be selected for implementation in small- to medium-sized hospitals. The present study compared the impact of AS in a 126-bed community hospital between the pre- and post-AS periods. METHODS: The study was performed retrospectively by selecting data on eligible patients from electronic medical records stored in the central database of the hospital. The roles of the AS team included weekly rounds and recommendations on the appropriate use of antimicrobials, and pharmacists working on post-prescription audits and pharmaceutical care at the bedside communicated closely with the AS team to assist with its implementation. As process measurements, the order rate of culture examinations, the rate of de-escalation, the antimicrobial use density (AUD), the days of therapy (DOT), and the AUD/DOT ratio of carbapenems and tazobactam-piperacillin (TAZ/PIPC) were measured. Thirty-day mortality and recurrence rates were examined as clinical outcomes. RESULTS: A total of 535 patients (288 in the pre-AS period and 247 in the post-AS period) were enrolled. The recommendation rate to prescribers significantly increased (p < 0.01) from 10.4% in the pre-AS period to 21.1% in the post-AS period. The order rate of culture examinations increased from 56.3% to 73.3% (p < 0.01). The rate of de-escalation increased from 10.2% to 30.8% (p < 0.05). The AUD of carbapenems and TAZ/PIPC significantly decreased (p < 0.05). The DOT of carbapenems (p < 0.01) and TAZ/PIPC (p < 0.05) also significantly decreased. The AUD/DOT ratio of carbapenems significantly increased from 0.37 to 0.60 (p < 0.01). Thirty-day mortality rates were 11.2% and 14.2%, respectively, and were not significantly different. The 30-day recurrence rate significantly decreased (p < 0.05) from 14.7% to 7.5%. CONCLUSIONS: The implementation of AS in this hospital improved the appropriate use of antimicrobials without negatively affecting clinical outcomes. These results may be attributed to close communication between the AS team and the pharmacists working on post-prescription audits and pharmaceutical care at the bedside.
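
For readers unfamiliar with these metrics, AUD and DOT are commonly expressed per 1,000 patient-days; a sketch with hypothetical inputs (the DDD value is an assumption and should be checked against the current WHO ATC/DDD index):

```python
# Sketch of AUD and DOT as commonly defined (per 1,000 patient-days).
# All input numbers are hypothetical, not this hospital's data.
total_grams = 180.0    # total meropenem dispensed over the period (g)
assumed_ddd = 3.0      # assumed WHO defined daily dose for IV meropenem (g)
therapy_days = 70      # days on which at least one dose was administered
patient_days = 3_000   # total patient-days over the same period

aud = (total_grams / assumed_ddd) / patient_days * 1_000  # DDDs/1,000 pt-days
dot = therapy_days / patient_days * 1_000                 # days/1,000 pt-days
print(f"AUD = {aud:.1f}, DOT = {dot:.1f}, AUD/DOT = {aud / dot:.2f}")
# An AUD/DOT ratio above 1 suggests average daily doses above the DDD;
# below 1 suggests doses below the DDD.
```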

11.
Biomedicines ; 9(4)2021 Apr 08.
Article in English | MEDLINE | ID: mdl-33917863

ABSTRACT

Learning and environmental adaptation increase the likelihood of survival and improve the quality of life. However, it is often difficult to judge optimal behaviors in real life due to highly complex social dynamics and environments. Consequently, many different brain regions and neuronal circuits are involved in decision-making. Many neurobiological studies of decision-making show that behaviors are chosen through coordination among multiple neural network systems, each implementing a distinct set of computational algorithms. Although these processes are commonly abnormal in neurological and psychiatric disorders, the underlying causes remain incompletely elucidated. Machine learning approaches with multidimensional data sets have the potential not only to redefine mental illnesses pathologically but also to improve therapeutic outcomes beyond what DSM/ICD diagnoses allow. Furthermore, measurable endophenotypes could allow for early disease detection, prognosis, and optimal treatment regimens for individuals. In this review, decision-making in real life and in psychiatric disorders is summarized, along with the applications of machine learning in brain imaging studies of psychiatric disorders, and considerations for future clinical translation are outlined. This review also aims to introduce clinicians, scientists, and engineers to the opportunities and challenges in bringing artificial intelligence into psychiatric practice.

12.
Int J Mol Sci ; 21(7)2020 Apr 02.
Article in English | MEDLINE | ID: mdl-32252468

ABSTRACT

Jabara (Citrus jabara Hort. ex Y. Tanaka) is a type of citrus fruit known for its beneficial effect against seasonal allergies. Jabara is rich in the antioxidant narirutin, whose anti-allergy effect has been demonstrated. One disadvantage of consuming Jabara is its bitter flavor; we therefore fermented the fruit to reduce the bitterness and make Jabara easier to consume. Here, we examined whether fermentation alters the anti-allergic properties of Jabara. Suppression of degranulation and cytokine production was observed in mast cells treated with fermented Jabara, and the effect depended on the length of fermentation. We also showed that 5-hydroxymethylfurfural (5-HMF) increases as fermentation progresses and identified it as an active component of fermented Jabara that inhibits mast cell degranulation. Mast cells treated with 5-HMF also exhibited reduced degranulation and cytokine production. In addition, the expression levels of phospho-PLCγ1 and phospho-ERK1/2 were markedly reduced upon FcεRI stimulation. These results indicate that 5-HMF is one of the active components of fermented Jabara involved in the inhibition of mast cell activation.


Subject(s)
Citrus/chemistry , Furaldehyde/analogs & derivatives , Mast Cells/drug effects , Mast Cells/physiology , Plant Extracts/pharmacology , Receptors, IgE/metabolism , Cell Degranulation/drug effects , Cell Degranulation/immunology , Cytokines/genetics , Cytokines/metabolism , Fermentation , Fermented Foods , Furaldehyde/chemistry , Furaldehyde/pharmacology , Immunoglobulin E/immunology , Inflammation Mediators/metabolism , Plant Extracts/chemistry
13.
Prehosp Disaster Med ; 35(1): 69-75, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31818341

ABSTRACT

Over 27,000 people were sickened by Ebola and over 11,000 people died between March of 2014 and June of 2016. The US Centers for Disease Control and Prevention (CDC; Atlanta, Georgia USA) was one of many public health organizations that sought to stop this outbreak. The agency deployed almost 2,000 individuals to West Africa during that timeframe. Deployment to these countries exposed these individuals to a wide variety of dangers, stressors, and risks.

Concerned about the at-risk populations in Africa, and also about the well-being of its professionals who willingly deployed, the CDC did several things to help safeguard the health, safety, and resilience of these team members before, during, and after deployment.

The accompanying special report highlights innovative pre-deployment training initiatives, customized screening processes, and post-deployment outreach efforts intended to protect and support the public health professionals fighting Ebola. Before deploying, CDC team members were expected to participate in both internally created and externally provided trainings. These ranged from pre-deployment briefings, to Preparing for Work Overseas (PFWO) and Public Health Readiness Certificate Program (PHRCP) courses, to Incident Command System (ICS) 100, 200, and 400 courses.

A small subset of non-clinical deployers also participated in a three-day training, designed in collaboration with the Center for the Study of Traumatic Stress (CSTS; Bethesda, Maryland USA), that prepared individuals to assess and address their own well-being and resilience and that of their teammates in the field during a deployment. Participants in this unique training were immersed in a Virtual Reality Environment (VRE) that simulated deployment to one of seven different types of emergencies.

CDC leadership also requested a pre-deployment screening process that helped professionals in the CDC's Occupational Health Clinic (OHC) determine whether individuals were at increased risk of negative outcomes from participating in a rigorous deployment at that time.

When deployers returned from the field, they received personalized invitations to participate in a voluntary, confidential, post-deployment operational debriefing, one-on-one or in a group.

Implementing these approaches gave clinical decision makers more information about the readiness of deployers and gave deployers greater awareness of the kinds of challenges they were likely to face in the field. The post-deployment outreach efforts reminded staff that their contributions were appreciated and that resources were available if they needed help processing any potentially traumatizing experiences.


Subject(s)
Disease Outbreaks , Emergency Responders , Hemorrhagic Fever, Ebola/epidemiology , Inservice Training , Occupational Diseases/prevention & control , Africa, Western/epidemiology , Centers for Disease Control and Prevention, U.S. , Humans , Surveys and Questionnaires , United States
14.
Dysphagia ; 33(1): 26-32, 2018 Feb.
Article in English | MEDLINE | ID: mdl-28856459

ABSTRACT

In Japan, the viscosity of thickened liquids differs among hospitals and nursing homes. To standardize the viscosity of thickened liquids, the dysphagia diet committee of the Japanese Society of Dysphagia Rehabilitation developed the Japanese Dysphagia Diet 2013 (JDD2013). To decide on a definition of thickened liquids, the committee reviewed categories from other countries; in particular, the criteria of the USA and Australia were used as references. The definition has three levels: mildly thick, moderately thick, and extremely thick. A sensory evaluation by health care workers was then carried out to decide the viscosity range of each level, and a draft document was prepared. After collecting public comments, follow-up experiments using water thickened with xanthan gum-based thickeners were performed, and the JDD2013 (Thickened Liquid) was finalized. The JDD2013 (Thickened Liquid) specifies the drinking properties, visual properties, and viscosity values of each level. A shear rate of 50 s⁻¹ was adopted for measuring viscosity with a cone-and-plate viscometer, to duplicate the measurement criteria used in the USA. We also set Line Spread Test values for the JDD2013 to promote the use of the guidelines in clinical practice. We believe the JDD2013 standards will help hospitals and other settings that care for people with dysphagia to use the same thickness levels and the same labels. In the future, the JDD2013 levels will be compared with new international guidelines to promote international understanding of the JDD2013 levels.


Subject(s)
Deglutition Disorders/diet therapy , Deglutition/physiology , Diet , Viscosity , Humans , Japan
15.
J Nutr Sci Vitaminol (Tokyo) ; 63(4): 256-262, 2017.
Article in English | MEDLINE | ID: mdl-28978873

ABSTRACT

Undernutrition caused by difficulty in masticating is a growing concern among the elderly. Soft diets are often served at nursing homes; however, their preparation differs among nursing homes, and improperly modified food texture and consistency may lead to further loss of nutritive value. Therefore, we developed a method to produce a soft diet using chicken. The texture-modified chicken was prepared by boiling a mixture of minced chicken and an additive foodstuff that softened the meat. The best additive foodstuff was determined by testing the cooking process, the size after modification, and the texture. The optimum proportions of each component in the mixture were determined by measuring food texture with a creep meter. Teriyaki chicken was cooked using the texture-modified chicken and provided to a nursing home, and the amount of food intake by elderly residents was subsequently surveyed. This study involved 22 residents (1 man and 21 women; mean age 91.4±5.3 y). Yakifu, made from wheat gluten, was the most suitable additive foodstuff. The hardness of the texture-modified chicken, with proportions of minced chicken, yakifu, and water of 50%, 10%, and 40%, respectively, was under 40,000 N/m². Among subjects who did not fully consume the conventional chicken-thigh dish, intake of the texture-modified chicken was significantly higher. These findings suggest that properly modified food textures could contribute to improving the quality of meals and preventing undernutrition among elderly people with mastication difficulties.


Subject(s)
Chickens , Cooking/methods , Diet/methods , Food , Nursing Homes , Aged, 80 and over , Animals , Deglutition Disorders , Eating , Female , Food Additives , Glutens , Humans , Male , Malnutrition/prevention & control , Mastication , Nutritive Value , Sensation , Tooth Loss
16.
J Texture Stud ; 48(3): 198-204, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28573730

ABSTRACT

Some patients with dysphagia are prone to aspiration of low-viscosity liquids, and thickened liquids are often used in attempts to prevent aspiration. Patients should be given thickened liquids of suitable thickness, and the thickness should be consistent at all times. While rotational and cone-and-plate viscometers are used for the evaluation of thickened liquids, they are expensive, high-precision instruments. A simple and objective evaluation method is thus necessary to control the thickness of liquids. We developed a method to evaluate thickened liquids using funnels and verified the appropriateness of this method. We measured the outflow times of five thickened liquids through funnels. One of the thickened liquids was a commercially available nutritional supplement, another was made with a thickening agent containing guar gum, and the others were made with a thickening agent containing xanthan gum. Four funnels with different stem sizes were tested. We found that the outflow time of a thickened liquid through a funnel depended on its viscosity at a shear rate between 10 and 50 s⁻¹ when the average inner diameter of the stem was in the range of 5.3-9.0 mm and the volume of liquid poured into the funnel was 30 mL. The correlation coefficient between the sensory evaluation value and the outflow time through the funnel with an average stem inner diameter of 5.3 mm was 0.946. Therefore, this method may be useful in hospital and nursing home kitchens for evaluating thickened liquids. PRACTICAL APPLICATIONS: The findings of this study will help develop a new method for the evaluation of thickened liquids. The funnels used in this method are made from polypropylene, which is inexpensive and light. Measuring the outflow time of a thickened liquid through a funnel is simple and yields objective, quantitative data. Although the line spread test (LST) is well known as a simple measurement method, it does not accurately evaluate nutritional supplements or liquids thickened with guar gum-based agents. The funnel method was found to correlate more strongly with sensory evaluation than the LST, making it useful in hospital and nursing home kitchens for evaluating thickened liquids.


Subject(s)
Deglutition , Food, Formulated/analysis , Galactans/analysis , Mannans/analysis , Plant Gums/analysis , Polysaccharides, Bacterial/analysis , Rheology/instrumentation , Adolescent , Adult , Deglutition Disorders/diagnosis , Deglutition Disorders/physiopathology , Deglutition Disorders/therapy , Enteral Nutrition/methods , Equipment Design , Female , Food, Formulated/standards , Galactans/standards , Humans , Judgment , Mannans/standards , Models, Theoretical , Observer Variation , Plant Gums/standards , Polysaccharides, Bacterial/standards , Sensory Thresholds , Solutions , Time Factors , Viscosity , Young Adult
17.
Circ Cardiovasc Qual Outcomes ; 8(2 Suppl 1): S31-8, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25714826

ABSTRACT

BACKGROUND: Prompt recognition of acute myocardial infarction symptoms and timely care-seeking behavior are critical to optimize acute medical therapies. Relatively little is known about the symptom presentation and care-seeking experiences of women aged ≤55 years with acute myocardial infarction, a group shown to have increased mortality risk compared with similarly aged men. Understanding symptom recognition and experiences engaging the healthcare system may reveal opportunities to reduce delays and improve acute care for this population. METHODS AND RESULTS: We conducted a qualitative study using in-depth interviews with 30 women (aged 30-55 years) hospitalized with acute myocardial infarction to explore their experiences with prodromal symptoms and their decision-making process in seeking medical care. Five themes characterized their experiences: (1) prodromal symptoms varied substantially in both nature and duration; (2) participants inaccurately assessed their personal risk of heart disease and commonly attributed symptoms to noncardiac causes; (3) competing and conflicting priorities influenced decisions about seeking acute care; (4) the healthcare system was not consistently responsive to them, resulting in delays in workup and diagnosis; and (5) they did not routinely access primary care, including preventive care for heart disease. CONCLUSIONS: Participants did not accurately assess their cardiovascular risk, reported poor preventive health behaviors, and delayed seeking care for symptoms, suggesting that differences in both prevention and acute care may contribute to young women's elevated acute myocardial infarction mortality relative to men. Identifying factors that promote better cardiovascular knowledge, improved preventive health care, and prompt care-seeking behaviors represents an important target for this population.


Subject(s)
Health Knowledge, Attitudes, Practice , Myocardial Infarction/psychology , Patient Acceptance of Health Care , Recognition, Psychology , Adult , Age Factors , Conflict, Psychological , Female , Health Status Disparities , Healthcare Disparities , Hospitalization , Humans , Interviews as Topic , Middle Aged , Myocardial Infarction/diagnosis , Myocardial Infarction/therapy , Preventive Health Services , Qualitative Research , Risk Assessment , Risk Factors , Sex Factors , Time Factors , Time-to-Treatment
18.
Arch Gerontol Geriatr ; 60(1): 59-61, 2015.
Article in English | MEDLINE | ID: mdl-25440137

ABSTRACT

The causes of falls are multifactorial; however, hip fractures in the elderly could be prevented if accidental falls were predictable. We assessed magnetic resonance images of 38 patients with groin pain after a fall whose fracture could not be detected by plain X-rays, and of 45 patients with no episode of falls. All were over 65 years of age. Fatty degeneration of the gluteus maximus, gluteus medius, gluteus minimus, obturator externus, adductor longus, rectus femoris, and iliopsoas muscles was evaluated by Goutallier's staging. Odds ratios were calculated by logistic regression analysis with falls as the dependent variable and Goutallier's stage, age, and gender as independent variables. The fatty degeneration of the gluteus maximus muscle was generalized, while that of the gluteus minimus muscle was unevenly distributed, especially in the anterior area. The gluteus minimus muscle began its fatty degeneration earlier than the gluteus medius muscle. The odds ratio of falling was 3.2 (95% confidence interval: 1.14-8.94) for the Goutallier stage of the gluteus medius muscle. Fatty degeneration of the gluteus medius muscle plays a crucial role in the stability of the pelvis, including the hip joint. Evaluating fatty streaks in the gluteus minimus muscle could help provide an early indication of those at higher risk of falling.
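
As an illustration of the odds-ratio calculation described above, a minimal logistic regression sketch in Python with statsmodels; the file name, column names, and data are assumptions, not the study's materials.

```python
# Hypothetical sketch of odds ratios via logistic regression (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Assumed columns: fall (0/1), goutallier_stage, age, female (0/1).
df = pd.read_csv("hip_mri.csv")

X = sm.add_constant(df[["goutallier_stage", "age", "female"]])
res = sm.Logit(df["fall"], X).fit(disp=0)

odds_ratios = np.exp(res.params)   # exponentiated coefficients = odds ratios
conf = np.exp(res.conf_int())      # 95% CIs transformed to the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf], axis=1))
```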


Subject(s)
Accidental Falls , Buttocks/physiology , Muscle Strength/physiology , Accident Prevention , Aged , Aged, 80 and over , Buttocks/diagnostic imaging , Female , Hip , Hip Joint/physiology , Humans , Magnetic Resonance Imaging , Male , Muscle, Skeletal/physiology , Predictive Value of Tests , Radiography , Regression Analysis
19.
J Epidemiol Glob Health ; 4(4): 323-5, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25455650

ABSTRACT

OBJECTIVE: To evaluate whether shoe-wearing affords foot protection among school children living in southern Ethiopia. METHODS: Data collectors conducted a standardized foot assessment with children in an elementary school in southern Ethiopia (N=168). RESULTS: 54% of children reported wearing shoes consistently in the prior three days. Children wearing closed-toed shoes showed less adherent soil and less toenail dystrophy than those wearing open-toed sandals. There were no differences by shoe type in signs of foot trauma or heel fissures. CONCLUSIONS: Shoe wearing provided limited foot protection. Interventions are needed to build behavioral skills, including foot washing and wearing appropriate shoes that maximize foot protection.


Subject(s)
Foot Diseases/prevention & control , Rural Population/statistics & numerical data , Shoes/adverse effects , Adolescent , Child , Cross-Sectional Studies , Ethiopia , Female , Foot Diseases/etiology , Humans , Male
20.
PLoS Negl Trop Dis ; 7(4): e2199, 2013.
Article in English | MEDLINE | ID: mdl-23638211

ABSTRACT

BACKGROUND: The role of footwear in protection against a range of Neglected Tropical Diseases (NTDs) is gaining increasing attention. Better understanding of the behaviors that influence footwear use will improve the ability to measure shoe use and will be important for those implementing footwear programs. METHODOLOGY/PRINCIPAL FINDINGS: Using the PRECEDE-PROCEED model, we assessed the social, behavioral, environmental, educational, and ecological needs influencing whether and when children wear shoes in a rural highland Ethiopian community endemic for podoconiosis. Information was gathered from 242 respondents using focus groups, semi-structured interviews, and extended case studies. Shoe-wearing norms were said to be changing, with going barefoot increasingly seen as 'shameful'. Shoes were thought to confer dignity as well as protection against injury and cold. However, many practical and social barriers prevented the desire to wear shoes from being translated into practice. Limited financial resources meant that people could neither purchase more than one pair of shoes, so each pair had to last, nor afford shoes of the preferred quality. As a result of this limited access, shoes were typically preserved for special occasions and might not be provided for children until they reached a certain age. While some barriers (for example, fit of shoe and fear of labeling through use of a certain type of shoe) may apply only to certain diseases, underlying structural barriers related to poverty (for example, price, quality, unsuitability for daily activities, and low risk perception) are likely to be relevant to a range of NTDs. CONCLUSIONS/SIGNIFICANCE: Using well-established conceptual models of health behavior adoption, we identified several barriers to shoe wearing that are amenable to intervention and that we anticipate will benefit those considering NTD prevention through shoe distribution.


Subject(s)
Neglected Diseases/epidemiology , Shoes/adverse effects , Ethiopia/epidemiology , Models, Theoretical , Tropical Climate