Results 1 - 20 of 7,247
1.
Int. j. morphol ; 38(3): 731-736, June 2020. tab, graf
Article in English | LILACS (Americas) | ID: biblio-1098313

ABSTRACT

Regardless of sex or body size, police tasks may require officers to perform changes of direction at speed (change-of-direction speed, CODS) under occupational loads. The purpose of this study was to investigate body composition and CODS in female and male police cadets in both unloaded and occupationally loaded conditions. Body composition and CODS of 51 female (FPC) and 70 male police cadets (MPC) were assessed. Six body composition indices were used: body mass index (BMI), percent body fat (PBF), percent skeletal muscle mass (PSMM), protein-fat index (PFI), index of hypokinesia (IH), and skeletal muscle mass index (SMMI). CODS was assessed with the Illinois Agility Test (IAT) and the IAT performed while carrying a 10-kg load (LIAT). An independent-samples t-test was used to identify differences between the sexes, and regression analysis determined associations between body composition and LIAT. The alpha level was set a priori at p < 0.05. MPC had significantly higher (p < 0.001) BMI, PSMM, PFI, and SMMI and lower PBF and IH than FPC. MPC were also faster in the IAT and LIAT, carrying lower relative loads that had less impact on CODS performance. Body composition was strongly associated with the time to complete the LIAT (R2 = 0.671, p < 0.001). Differences in relative load and body composition influenced CODS performance in both unloaded and loaded conditions. Thus, optimizing body composition by increasing skeletal muscle mass and reducing fat mass could positively influence unloaded and loaded CODS performance and improve elements of police task performance.
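A minimal sketch of the two analyses named in this abstract, an independent-samples t-test between sexes and a multiple regression of loaded-IAT time on body composition indices. The data below are synthetic and the variable names are hypothetical stand-ins for the study's measurements; only the general workflow is illustrated.

```python
# Sketch with synthetic data: Welch t-test between groups and OLS regression R^2,
# mirroring the analyses described in the abstract. Values are not the study's data.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
n_f, n_m = 51, 70
pbf_f = rng.normal(28, 5, n_f)          # percent body fat, female cadets (hypothetical)
pbf_m = rng.normal(18, 4, n_m)          # percent body fat, male cadets (hypothetical)

# Independent-samples t-test (Welch's version, not assuming equal variances)
t, p = stats.ttest_ind(pbf_m, pbf_f, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")

# Multiple regression: loaded-IAT time ~ body composition indices
pbf = np.concatenate([pbf_f, pbf_m])
smmi = rng.normal(8.5, 1.2, n_f + n_m)   # skeletal muscle mass index (hypothetical)
bmi = rng.normal(24, 3, n_f + n_m)
liat = 20 + 0.15 * pbf - 0.6 * smmi + rng.normal(0, 1, n_f + n_m)

X = sm.add_constant(np.column_stack([pbf, smmi, bmi]))
fit = sm.OLS(liat, X).fit()
print(f"R^2 = {fit.rsquared:.3f}")       # the paper reports R^2 = 0.671 on its real data
```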




Subject(s)
Humans , Male , Female , Young Adult , Body Composition , Sex Characteristics , Police , Movement/physiology , Body Mass Index , Logistic Models , Sex Factors , Adipose Tissue , Workload
2.
Int. j. morphol ; 38(2): 322-327, Apr. 2020. tab, graf
Article in Spanish | LILACS (Americas) | ID: biblio-1056442

ABSTRACT



Age estimation is an important aspect of forensic investigations. Different methods in forensic odontology, based on the correlation between age and dental structures, have been described. Cameriere et al. proposed a quantitative method for age estimation in adults from the analysis of the pulp/tooth area ratio, based on the apposition of secondary dentine. The aim of this study was to develop regression models for dental age estimation from the pulp/tooth area ratio of lower canines in a Chilean sample, using digital periapical radiographs (DPR) and applying Cameriere's method. We analyzed 212 DPRs of mandibular canines (86 males and 126 females) with the Image J program to measure the pulp and tooth areas. Age and sex information was recorded for the blindly selected DPRs. Simple linear regression models were developed for age estimation. The coefficient of determination was 27.8 % for R33 and 29.6 % for R44, with mean absolute errors of 11.02 years and 10.37 years, respectively. ANOVA showed no statistically significant differences in the canine pulp/tooth area ratios according to sex (p > 0.05). According to these results, the method of Cameriere et al. is reliable for dental age estimation from the pulp/tooth ratio in adults. However, in the regression models developed for the Chilean population, the fit indicated by the coefficients of determination shows uncertainty between the pulp/tooth area ratio and chronological age in lower canines; therefore, additional methods are suggested for estimating age in this population.
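A brief sketch of the kind of model this abstract describes: a simple linear regression of chronological age on the pulp/tooth area ratio, reporting the coefficient of determination and the mean absolute error. The ratio and age values below are simulated for illustration, not the Chilean sample.

```python
# Sketch with synthetic data: simple OLS of age on pulp/tooth area ratio, with R^2 and MAE,
# in the spirit of the Cameriere-style analysis described above.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
ratio = rng.uniform(0.02, 0.12, 212)                 # pulp/tooth area ratio (e.g., from Image J)
age = 70 - 350 * ratio + rng.normal(0, 11, 212)      # ratio shrinks with age (secondary dentine)

X = sm.add_constant(ratio)
fit = sm.OLS(age, X).fit()
pred = fit.predict(X)
mae = np.mean(np.abs(age - pred))
print(f"R^2 = {fit.rsquared:.3f}, MAE = {mae:.1f} years")
```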


Subject(s)
Humans , Male , Female , Adolescent , Adult , Middle Aged , Aged , Young Adult , Tooth/anatomy & histology , Age Determination by Teeth/methods , Radiography, Dental, Digital/methods , Dental Pulp/anatomy & histology , Tooth/diagnostic imaging , Logistic Models , Chile , Analysis of Variance , Dental Pulp/diagnostic imaging , Age and Sex Distribution , Forensic Dentistry
3.
Int. j. interdiscip. dent. (Print) ; 13(1): 26-29, Apr. 2020. tab, graf
Article in Spanish | LILACS (Americas) | ID: biblio-1114889

ABSTRACT



OBJECTIVE: The aim of the present study was to evaluate the prevalence of early childhood caries (ECC) among children at social risk and to analyze its associated determinants. METHOD: A cross-sectional study was performed with 246 children aged 24 to 71 months, recruited from 13 different slums below the poverty line in Santiago, Chile. An interviewer-administered questionnaire was used to obtain information from the parents on ethnicity, birth weight, mother's age and education, night bottle feeding, tooth brushing, and dental visits. Early childhood caries was defined using the American Academy of Pediatric Dentistry criteria. Multiple logistic regression analysis with a stepwise selection procedure was used to investigate the influence of risk factors on early childhood caries experience. RESULTS: The prevalence of early childhood caries was 63%. Bivariate analyses showed associations among ethnicity, mother's education, bottle feeding at night, dental visits, and caries experience. The final multivariate model showed that children whose mothers had a low level of education were more likely to develop early childhood caries. CONCLUSIONS: The children at social risk studied had a high prevalence of early childhood caries, with the mother's education as the most important determinant.
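A minimal sketch of the multivariate analysis this abstract refers to: a multiple logistic regression of a binary caries outcome on a few candidate risk factors, reported as odds ratios with 95% confidence intervals. The predictors, effect sizes, and data below are hypothetical.

```python
# Sketch with synthetic data: multiple logistic regression for early childhood caries (0/1),
# reporting odds ratios with 95% CIs, analogous to the analysis described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 246
df = pd.DataFrame({
    "low_mother_education": rng.integers(0, 2, n),   # hypothetical binary predictors
    "night_bottle": rng.integers(0, 2, n),
    "dental_visit": rng.integers(0, 2, n),
})
logit_p = -0.3 + 1.1 * df.low_mother_education + 0.5 * df.night_bottle
df["ecc"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["low_mother_education", "night_bottle", "dental_visit"]])
fit = sm.Logit(df["ecc"], X).fit(disp=False)
odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```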


Subject(s)
Humans , Male , Female , Child, Preschool , Dental Caries/epidemiology , Socioeconomic Factors , Logistic Models , Chile/epidemiology , Epidemiology, Descriptive , Prevalence , Cross-Sectional Studies , Multivariate Analysis , Surveys and Questionnaires , Risk Factors , Social Determinants of Health
4.
Braz. J. Psychiatry (São Paulo, 1999, Impr.) ; 42(1): 68-71, Jan.-Feb. 2020. tab, graf
Article in English | LILACS (Americas) | ID: biblio-1055368

ABSTRACT

Objective: Circadian dysregulation plays an important role in the etiology of mood disorders, and an evening chronotype is frequent in these patients. However, prospective studies of the influence of chronotype on mood symptoms have reached unclear conclusions in patients with bipolar disorder (BD). The objective of this study was to investigate the relationship between chronotype and prognostic factors for BD. Methods: At baseline, 80 euthymic BD patients answered a demographic questionnaire and clinical scales to evaluate anxiety, functioning, and chronotype. Circadian preference was measured using the Morningness-Eveningness Questionnaire, in which lower scores indicate eveningness. Mood episodes and hospitalizations were evaluated monthly for 18 months. Results: Among the BD patients, 14 (17.5%) were definitely morning type, 35 (43.8%) moderately morning, 27 (33.7%) intermediate (neither type), and 4 (5%) moderately evening. Eveningness was associated with obesity or overweight (p = 0.03), greater anxiety (p = 0.002), and better functioning (p = 0.01), as well as with mood episodes (p = 0.04), but not with psychiatric hospitalizations (p = 0.82). This group tended toward depressive episodes (p = 0.06), but not (hypo)mania (p = 0.56). Conclusion: This study indicates that an evening chronotype predicts a poor prognosis in BD. It reinforces the relevance of treating rhythm disruptions even during euthymia to improve patient quality of life and prevent mood episodes.


Subject(s)
Humans , Male , Female , Adult , Aged , Young Adult , Anxiety/physiopathology , Bipolar Disorder/physiopathology , Circadian Rhythm/physiology , Prognosis , Psychiatric Status Rating Scales , Quality of Life , Time Factors , Logistic Models , Prospective Studies , Surveys and Questionnaires , Statistics, Nonparametric , Chronobiology Disorders/physiopathology , Hospitalization/statistics & numerical data , Middle Aged
5.
Braz. J. Psychiatry (São Paulo, 1999, Impr.) ; 42(1): 22-26, Jan.-Feb. 2020. tab
Article in English | LILACS (Americas) | ID: biblio-1055359

ABSTRACT

Objective: German psychiatrist Kurt Schneider proposed the concept of first-rank symptoms (FRS) of schizophrenia in 1959. However, their relevance for diagnosis and for predicting treatment response is still unclear, and most studies have investigated FRS in chronic or medicated patients. The present study sought to evaluate whether FRS predict remission, response, or improvement in functionality in antipsychotic-naive first-episode psychosis. Methods: Follow-up study of 100 patients at their first episode of psychosis (FEP), with no previous treatment, assessed at baseline and after 2 months of treatment. Participants were evaluated with the standardized Positive and Negative Syndrome Scale (PANSS) and Global Assessment of Functioning (GAF), and for the presence of FRS. Results: Logistic regression analysis showed that, in this sample, up to three individual FRS predicted remission: voices arguing, voices commenting on one's actions, and thought broadcasting. Conclusion: Specific FRS may predict remission after treatment in FEP patients. This finding could give new importance to Kurt Schneider's classic work by contributing to future updates of diagnostic protocols and improving the estimation of prognosis.


Subject(s)
Humans , Male , Female , Adult , Young Adult , Psychotic Disorders/diagnosis , Psychotic Disorders/drug therapy , Schizophrenia/diagnosis , Schizophrenia/drug therapy , Antipsychotic Agents/therapeutic use , Psychiatric Status Rating Scales , Reference Values , Remission Induction , Logistic Models , Predictive Value of Tests , Follow-Up Studies , Treatment Outcome
6.
Braz. J. Psychiatry (São Paulo, 1999, Impr.) ; 42(1): 72-76, Jan.-Feb. 2020. tab, graf
Article in English | LILACS (Americas) | ID: biblio-1055367

ABSTRACT

Objective: Depression has been associated with hepatitis C, as well as with its treatment with proinflammatory cytokines (i.e., interferon). The new direct-acting antiviral agents (DAAs) have minimal adverse effects and high potency, with a direct inhibitory effect on non-structural viral proteins. We studied the incidence of depression and its associated factors in a real-life prospective cohort of chronic hepatitis C patients treated with the new DAAs. Methods: The sample was recruited from a cohort of 91 patients with hepatitis C, of both sexes, with an advanced level of fibrosis and no HIV coinfection, consecutively enrolled over a 6-month period for DAA treatment; those euthymic at baseline (n=54) were selected. All were evaluated with the depression module of the Patient Health Questionnaire (PHQ-9-DSM-IV) at three time points: baseline, 4 weeks, and end of treatment. Results: The cumulative incidence (95%CI) of major depression and of any depressive disorder during DAA treatment was 13% (6.4-24.4) and 46.3% (33.7-59.4), respectively. No differences were observed between patients with and without cirrhosis or ribavirin treatment (p > 0.05). Risk factors for incident major depression during DAA treatment included a family history of depression (relative risk 9.1 [1.62-51.1]), substance use disorder (11.0 [1.7-73.5]), and baseline PHQ-9 score (2.1 [1.1-3.1]). Conclusions: The findings of this study highlight the importance of screening for new-onset depression among patients receiving new DAAs and identify potential associated risk factors.
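A short sketch of the two quantities reported here: a cumulative incidence proportion with a 95% confidence interval and a crude relative risk for a binary exposure. The counts used below are hypothetical placeholders chosen only to show the arithmetic; the Wilson interval is one reasonable choice of proportion CI.

```python
# Sketch: cumulative incidence with a 95% CI and a crude relative risk (hypothetical counts).
import numpy as np
from statsmodels.stats.proportion import proportion_confint

n_at_risk = 54          # euthymic patients starting DAA treatment
n_depressed = 7         # incident major depression (hypothetical count)

incidence = n_depressed / n_at_risk
lo, hi = proportion_confint(n_depressed, n_at_risk, alpha=0.05, method="wilson")
print(f"Cumulative incidence = {incidence:.1%} (95% CI {lo:.1%}-{hi:.1%})")

# Crude relative risk for a binary exposure (e.g., family history of depression)
exposed_cases, exposed_total = 4, 10          # hypothetical
unexposed_cases, unexposed_total = 3, 44      # hypothetical
rr = (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)
print(f"Relative risk = {rr:.1f}")
```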


Subject(s)
Humans , Male , Female , Adult , Aged , Antiviral Agents/therapeutic use , Hepatitis C/psychology , Hepatitis C/drug therapy , Depressive Disorder/epidemiology , Psychiatric Status Rating Scales , Ribavirin/therapeutic use , Spain/epidemiology , Time Factors , Logistic Models , Incidence , Prospective Studies , Risk Factors , Treatment Outcome , Hepatitis C/epidemiology , Middle Aged
7.
Fisioter. Pesqui. (Online) ; 27(1): 85-92, Jan.-Mar. 2020. tab, graf
Article in Portuguese | LILACS (Americas) | ID: biblio-1090410

ABSTRACT





The objective was to estimate the prevalence of inability to perform tasks of daily living and to identify its association with pain and sociodemographic factors. This is a cross-sectional study with a convenience sample composed of individuals who actively sought healthcare in a small city with complaints of musculoskeletal pain and difficulties in carrying out activities of daily living. The sample comprised 766 individuals. Questionnaires were applied to assess the degree of difficulty in performing activities of daily living and pain (Nordic Musculoskeletal Complaints Questionnaire and Numerical Pain Scale). The prevalences of disability and pain were estimated, and five logistic regression models for disability were constructed considering sex, age, occupation, and the presence and characteristics of pain. The prevalence of some difficulty in performing activities of daily living was 87.6%, of great difficulty 66.1%, and of musculoskeletal pain 67.5%. Individuals were unable to perform 3.6 activities of daily living on average. Pain was the main factor associated with disability (OR 9.9; 95%CI 5.9-16.5), followed by age. Difficulty in performing activities of daily living was associated with pain in the lower limbs occurring more than four days a week, beginning more than five years earlier, and of severe or unbearable intensity during crisis episodes. The prevalences of disability and pain were high, and musculoskeletal pain and age affected functional disability. This study contributes to directing the construction of care actions aimed at minimizing and preventing difficulties in performing daily tasks.


Subject(s)
Humans , Male , Female , Adolescent , Adult , Middle Aged , Aged , Disabled Persons , Musculoskeletal Pain/physiopathology , Musculoskeletal Pain/epidemiology , Socioeconomic Factors , Pain Measurement , Brazil , Activities of Daily Living , Logistic Models , Demography , Chronic Disease/epidemiology , Prevalence , Cross-Sectional Studies , Surveys and Questionnaires , Development Indicators , Family Health Strategy , Delivery of Health Care , Disability Evaluation , Health Policy
8.
Asia Pacific Allergy ; (4): 3-2020.
Article in English | WPRIM (Western Pacific) | ID: wprim-785462

ABSTRACT

BACKGROUND: A reliable objective tool for use as a predictor of asthma control status could assist asthma management. OBJECTIVE: To identify parameters of the forced oscillation technique (FOT) that predict future loss of asthma symptom control. METHODS: Children with well-controlled asthma symptoms, aged 6–12 years, were recruited for a 12-week prospective study. FOT and spirometric measures and their bronchodilator responses were evaluated at baseline. The level of asthma symptom control was evaluated according to the Global Initiative for Asthma. RESULTS: Among 68 recruited children, 41 (60.3%) maintained their asthma control between the two visits (group C-C), and 27 (39.7%) lost their asthma control at the follow-up visit (group C-LC). Baseline FOT parameters, including respiratory resistance at 5 Hz (R5), respiratory resistance at 20 Hz (R20), respiratory reactance at 5 Hz, area of reactance, %predicted R5, and the percentage bronchodilator response (%∆) of R5 and R20, were significantly different between the C-C and C-LC groups. In contrast, only the %∆ of forced vital capacity, forced expiratory volume in 1 second (FEV₁), and FEF25%–75% (forced expiratory flow 25%–75%) differed significantly between groups. Multiple logistic regression analysis revealed that %predicted R5, %∆R5, %predicted FEV₁, and %∆FEV₁ were predictive factors for future loss of asthma control. The following cutoff values demonstrated the best sensitivity and specificity for predicting loss of asthma control: %predicted R5 = 91.28, %∆R5 = 21.2, %predicted FEV₁ = 89.5, and %∆FEV₁ = 7.8. The combination of these parameters predicted the risk of loss of asthma control with an area under the curve of 0.924 and an accuracy of 83.8%. CONCLUSION: Resistance FOT measures have an additive role to spirometric parameters in predicting future loss of asthma control.
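A compact sketch of how a cutoff with the best sensitivity and specificity can be derived for a single predictor, using the ROC curve, the Youden index, and the AUC. The group sizes below match the abstract, but the predictor values are simulated and the cutoff-selection rule (Youden index) is an assumption, since the abstract does not state which criterion was used.

```python
# Sketch with synthetic data: ROC-based cutoff (Youden index), sensitivity/specificity and AUC
# for one predictor such as %predicted R5.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
lost_control = np.r_[np.ones(27), np.zeros(41)]                        # C-LC vs C-C groups
r5_pct_pred = np.r_[rng.normal(100, 12, 27), rng.normal(85, 12, 41)]   # hypothetical values

fpr, tpr, thresholds = roc_curve(lost_control, r5_pct_pred)
youden = tpr - fpr
best = np.argmax(youden)
print(f"AUC = {roc_auc_score(lost_control, r5_pct_pred):.3f}")
print(f"cutoff = {thresholds[best]:.1f}, sensitivity = {tpr[best]:.2f}, "
      f"specificity = {1 - fpr[best]:.2f}")
```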


Subject(s)
Asthma , Child , Follow-Up Studies , Forced Expiratory Volume , Humans , Logistic Models , Prospective Studies , Sensitivity and Specificity , Spirometry , Vital Capacity
9.
Article in English | WPRIM (Western Pacific) | ID: wprim-785429

ABSTRACT

PURPOSE: Patients with secondary hyperparathyroidism are at high risk for developing postoperative hypocalcemia. However, there are limited data regarding predictors of postoperative hypocalcemia in renal failure patients with secondary hyperparathyroidism. This study aimed to determine the clinical presentations of renal hyperparathyroidism and the predictors of early postoperative hypocalcemia after total parathyroidectomy. METHODS: Data of patients with renal hyperparathyroidism who underwent total parathyroidectomy between January 2007 and December 2014 were reviewed retrospectively. Patients were divided into 2 cohort groups according to their serum calcium levels within 24 hours of parathyroidectomy: the hypocalcemia group (calcium levels of 2 mmol/L or less) and the normocalcemia group (calcium levels of more than 2 mmol/L). With the use of multivariable logistic regression analyses, the predictors of early postoperative hypocalcemia after total parathyroidectomy in patients with renal hyperparathyroidism were investigated. RESULTS: Among 68 patients, 56 (82.4%) were symptomatic preoperatively. Fifty patients (73.5%) presented with bone pain and 14 patients (20.6%) had muscle weakness. Early postoperative hypocalcemia occurred in 25 patients (36.8%). Preoperative alkaline phosphatase level was a predictor of early postoperative hypocalcemia (adjusted odds ratio, 1.004; 95% confidence interval, 1.001–1.006; P = 0.002). CONCLUSION: Results from our study show that most patients with renal hyperparathyroidism were symptomatic preoperatively and that the most common clinical presentations were bone pain and muscle weakness. The significant predictor of early postoperative hypocalcemia after total parathyroidectomy was the preoperative alkaline phosphatase level.


Subject(s)
Alkaline Phosphatase , Calcium , Cohort Studies , Humans , Hyperparathyroidism , Hyperparathyroidism, Secondary , Hypocalcemia , Logistic Models , Muscle Weakness , Odds Ratio , Parathyroid Hormone , Parathyroidectomy , Renal Insufficiency , Retrospective Studies
10.
Article in English | WPRIM (Western Pacific) | ID: wprim-785395

ABSTRACT

BACKGROUND: Pyuria seems to be common in chronic kidney disease (CKD), irrespective of urinary tract infection (UTI). It has been hypothesized that sterile pyuria occurs in CKD because of chronic renal parenchymal inflammation. However, there are limited data on whether CKD increases the rate of pyuria or how pyuria in CKD should be interpreted. We investigated the prevalence and characteristics of asymptomatic pyuria (ASP) in CKD via urinary white blood cell (WBC) analysis. METHODS: Urine examination was performed for all stable hemodialysis (HD) and non-dialysis CKD patients of the outpatient clinic (total N=298). Patients with infection symptoms or recent history of antibiotic use were excluded. Urine culture and WBC analysis were performed when urinalysis revealed pyuria. RESULTS: The prevalence of ASP was 30.5% (24.1% in non-dialysis CKD and 51.4% in HD patients). Over 70% of the pyuria cases were sterile. The majority of urinary WBCs were neutrophils, even in sterile pyuria. However, the percentage of neutrophils was significantly lower in sterile pyuria. In multivariate logistic regression analysis, the degree of pyuria, percentage of neutrophils, and presence of urinary nitrites remained independently associated with sterile pyuria. CONCLUSIONS: The prevalence of ASP was higher in CKD patients and increased according to CKD stage. Most ASP in CKD was sterile. Ascertaining the number and distribution of urinary WBCs may be helpful for interpreting ASP in CKD.


Subject(s)
Ambulatory Care Facilities , Humans , Inflammation , Leukocytes , Logistic Models , Neutrophils , Nitrites , Prevalence , Pyuria , Renal Dialysis , Renal Insufficiency, Chronic , Urinalysis , Urinary Tract Infections , Viperidae
11.
Article in English | WPRIM (Western Pacific) | ID: wprim-782506

ABSTRACT

BACKGROUND: To evaluate macular pigment optical density (MPOD) with age in the Korean population using the Macular Pigment Screener II (MPSII®). METHODS: One hundred and twenty-six eyes were retrospectively reviewed. MPOD was measured using the MPSII®, which uses a heterochromatic flicker photometry method, and the estimated values were analyzed. Spearman's correlation test was used to evaluate correlations between MPOD and age, and the association between MPOD and age was determined using simple linear regression analysis. MPOD among the four groups was compared via post hoc analysis with Bonferroni correction, MPOD between the age-related macular degeneration (AMD) group and age-matched healthy subjects was compared via the Mann-Whitney U test, and other risk factors for AMD were identified via logistic regression analysis. RESULTS: Estimated MPOD decreased significantly with increasing age in the general population. In the simple regression analysis, a statistically significant linear model was observed, with estimated MPOD decreasing by 0.005 for each 1-year increase in age. Older subjects (> 50 years) showed lower MPOD than younger subjects (30–49 years). In the healthy population, however, estimated MPOD showed a decreasing trend with age without significant differences according to age after excluding patients with AMD. MPOD was significantly lower in patients with AMD than in aged healthy controls. Furthermore, hypertension, dyslipidemia, and smoking were identified as risk factors for AMD. CONCLUSION: MPOD measured with the MPSII® reflects macular pigment density in healthy individuals and patients with dry AMD. Aging was not significantly associated with low MPOD in the healthy population, but the presence of dry AMD was; thus, low MPOD may be a risk factor for the development of dry AMD. Furthermore, routine screening with the MPSII® for ages 50 and older may help detect low MPOD early and identify individuals who should take supplements.


Subject(s)
Aging , Dyslipidemias , Healthy Volunteers , Humans , Hypertension , Linear Models , Logistic Models , Macular Degeneration , Macular Pigment , Mass Screening , Methods , Photometry , Retrospective Studies , Risk Factors , Smoke , Smoking
12.
Article in English | WPRIM (Western Pacific) | ID: wprim-782495

ABSTRACT

BACKGROUND: Medical staff members are concentrated in the intensive care unit (ICU), and medical residents are essential to operating the ICU. However, the recent trend has been to restrict resident working hours. This restriction may lead to a shortage of ICU staff, and regional academic hospitals may soon face running ICUs without residents. METHODS: We performed a retrospective observational study (intensivist crossover design) of medical patients who were transferred to two ICUs from general wards between September 2017 and February 2019 at one academic hospital. We compared ICU outcomes according to ICU type (an ICU with resident management under high-intensity intensivist staffing vs. an ICU with direct management by intensivists without residents). RESULTS: Of 314 enrolled patients, 70 were primarily managed by residents and 244 were directly managed by intensivists. The latter patients showed lower ICU mortality (29.9% vs. 42.9%, P = 0.042), lower rates of cardiopulmonary resuscitation (CPR) (10.2% vs. 21.4%, P = 0.013), lower rates of continuous renal replacement therapy (CRRT) (24.2% vs. 40.0%, P = 0.009), and more advance care planning decisions before death (87.3% vs. 66.7%, P = 0.013) than the former. Lower ICU mortality (hazard ratio, 1.641; P = 0.035), lower CPR rates (odds ratio [OR], 2.891; P = 0.009), lower CRRT rates (OR, 2.602; P = 0.005), and more advance care planning decisions before death (OR, 4.978; P = 0.007) were also associated with intensivist direct management in the multivariate Cox and logistic regression analyses. CONCLUSION: Intensivist direct management might be associated with better ICU outcomes than resident management under the supervision of an intensivist. Further large-scale prospective randomized trials are required to draw a definitive conclusion.
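A minimal sketch of the two model families this abstract names: a Cox proportional-hazards model for ICU mortality (hazard ratio) and a logistic model for a binary outcome such as CPR (odds ratio). It assumes the lifelines package is available; the data, follow-up times, and event codings below are all hypothetical.

```python
# Sketch with synthetic data: Cox PH model (lifelines) plus a logistic model (statsmodels)
# for a binary ICU outcome, echoing the multivariate analyses described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 314
df = pd.DataFrame({
    "resident_managed": np.r_[np.ones(70), np.zeros(244)].astype(int),
    "time_days": rng.exponential(14, n),   # hypothetical ICU follow-up time
    "died": rng.integers(0, 2, n),         # hypothetical event indicator
    "cpr": rng.integers(0, 2, n),          # hypothetical binary outcome
})

cph = CoxPHFitter()
cph.fit(df[["resident_managed", "time_days", "died"]],
        duration_col="time_days", event_col="died")
print(cph.hazard_ratios_)           # hazard ratio for resident management

X = sm.add_constant(df[["resident_managed"]])
logit = sm.Logit(df["cpr"], X).fit(disp=False)
print(np.exp(logit.params))         # odds ratio for resident management
```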


Subject(s)
Cardiopulmonary Resuscitation , Critical Care , Cross-Over Studies , Humans , Intensive Care Units , Internal Medicine , Internship and Residency , Logistic Models , Medical Staff , Mortality , Observational Study , Organization and Administration , Patients' Rooms , Prospective Studies , Renal Replacement Therapy , Retrospective Studies , Running , Survival Rate
13.
Article in English | WPRIM (Western Pacific) | ID: wprim-782481

ABSTRACT

BACKGROUND: Few studies have examined the relationship between cardiac function and geometry and serum hepcidin levels in patients with chronic kidney disease (CKD). We aimed to identify the relationship between cardiac function and geometry and serum hepcidin levels. METHODS: We reviewed data of 1,897 patients in a large-scale multicenter prospective Korean study. Logistic regression analysis was used to identify the relationship between cardiac function and geometry and serum hepcidin levels. RESULTS: The mean relative wall thickness (RWT) and left ventricular mass index (LVMI) were 0.38 and 42.0 g/m^2.7, respectively. The mean ejection fraction (EF) and early diastolic mitral inflow to annulus velocity ratio (E/e′) were 64.1% and 9.9, respectively. Although EF and E/e′ were not associated with high serum hepcidin, RWT and LVMI were significantly associated with high serum hepcidin levels in univariate logistic regression analysis. In multivariate logistic regression analysis after adjusting for variables related to anemia, bone mineral metabolism, comorbidities, and inflammation, however, only each 0.1-unit increase in RWT was associated with increased odds of high serum hepcidin (odds ratio, 1.989; 95% confidence interval, 1.358–2.916; P < 0.001). In the subgroup analysis, the independent relationship between RWT and high serum hepcidin level was valid only in women and patients with low transferrin saturation (TSAT). CONCLUSION: Although the relationship was not cause-and-effect, increased RWT was independently associated with high serum hepcidin, particularly in women and patients with low TSAT. The relationship between cardiac geometry and serum hepcidin in CKD patients needs to be confirmed in future studies.


Subject(s)
Anemia , Comorbidity , Female , Hepcidins , Humans , Inflammation , Logistic Models , Metabolism , Miners , Prospective Studies , Renal Insufficiency, Chronic , Transferrin
14.
Article in English | WPRIM (Western Pacific) | ID: wprim-782276

ABSTRACT

BACKGROUND: The number of workers in non-standard employment (NSE) is increasing due to industrial change and technological development. Dependent self-employment (DSE), a type of NSE, was created decades ago. Despite the problems associated with this new type of employment, few studies have been conducted on the effects of DSE on health, especially sleep quality. This study aims to determine the relationship between DSE and sleep quality. METHODS: This study analyzed data on 50,250 wage workers from the fifth Korean Working Conditions Survey. Workers who did not respond or refused to answer any questions related to the study variables were excluded, leaving 36,709 participants. A total of 2,287 DSE workers (6.2%) were compared with non-DSE workers (34,422; 93.8%), and multiple logistic regression analyses were applied. RESULTS: DSE status had a significant association with difficulty falling asleep (odds ratio [OR]: 1.331, 95% confidence interval [CI]: 1.178–1.504), difficulty maintaining sleep (OR: 1.279; 95% CI: 1.125–1.455), and extreme fatigue after waking up (OR: 1.331; 95% CI: 1.184–1.496). Multiple logistic regression of the variables for sleep quality in DSE showed a significant association between exposure to physical factors and all types of poor sleep quality, as well as between shift work and both difficulty maintaining sleep and extreme fatigue after waking up. Long working hours and emotional labor were also associated with extreme fatigue after waking up. CONCLUSIONS: This study shows a significant association between DSE and poor sleep quality, especially when workers were exposed to physical risk factors (noise, vibration, abnormal temperature, etc.) and shift work.


Subject(s)
Accidental Falls , Employment , Fatigue , Logistic Models , Risk Factors , Salaries and Fringe Benefits , Vibration
15.
Article in English | WPRIM (Western Pacific) | ID: wprim-782271

ABSTRACT

OBJECTIVES: The aim of this study was to develop machine learning (ML) and initial nursing assessment (INA)-based emergency department (ED) triage to predict adverse clinical outcomes. METHODS: The retrospective study included ED visits between January 2016 and December 2017; the adverse outcome was defined as either intensive care unit admission or death in the emergency room. We trained four classifiers: logistic regression and a deep learning model on the INA and on low-dimensional (LD) INA variables, and logistic regression on the Korea Triage and Acuity Scale (KTAS) and the Sequential Organ Failure Assessment (SOFA). We varied the outcome ratio for external validation. Finally, variables of importance were identified using the random forest model's information gain, and the four most influential variables were used for LD modeling for efficiency. RESULTS: A total of 86,304 patient visits were included, with an overall outcome rate of 3.5%. The area under the curve (AUC) was 76.8 (74.9–78.6) for the KTAS model with logistic regression and 74.0 (72.1–75.9) for the SOFA model, while the AUC values of the INA models were 87.2 (85.9–88.6) and 87.6 (86.3–88.9) with logistic regression and deep learning, respectively, suggesting that the ML- and INA-based triage system predicted the outcomes more accurately. The AUC values for the LD models were 81.2 (79.4–82.9) and 80.7 (78.9–82.5) for logistic regression and deep learning, respectively. CONCLUSIONS: We developed an ML- and INA-based triage system for EDs. The novel system was able to predict clinical outcomes more accurately than the existing triage systems, KTAS and SOFA.
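A short sketch of the model comparison this abstract describes: a logistic regression and a small neural network trained on the same features and compared by AUC on held-out data. The feature matrix, outcome prevalence, network size, and train/test split below are hypothetical stand-ins for the INA variables and the study's validation scheme.

```python
# Sketch with synthetic data: comparing logistic regression and a small neural network by AUC,
# analogous to the triage-model comparison described above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n, p = 5000, 20                       # visits x assessment items (hypothetical)
X = rng.normal(size=(n, p))
y = (rng.random(n) < 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 2.5)))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
nn = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0).fit(X_tr, y_tr)

for name, model in [("logistic regression", lr), ("neural network", nn)]:
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```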


Subject(s)
Area Under Curve , Emergencies , Emergency Service, Hospital , Forests , Humans , Intensive Care Units , Korea , Learning , Logistic Models , Machine Learning , Nursing Assessment , Nursing , Retrospective Studies , Triage
16.
Article in English | WPRIM (Western Pacific) | ID: wprim-782258

ABSTRACT

BACKGROUND/OBJECTIVES: This study aimed to examine differences in weight control practices, beliefs, self-efficacy, and eating behaviors of weight class athletes according to weight control level. SUBJECTS/METHODS: Subjects were weight class athletes from colleges in Gyeong-gi Province. Subjects (n = 182) responded to a questionnaire assessing study variables by self-report, and data on 151 athletes were used for statistical analysis. Subjects were categorized into High vs. Normal Weight Loss (HWL, NWL) groups depending on weight control level. Data were analyzed using t-test, ANCOVA, χ²-test, and multiple logistic regression. RESULTS: Seventy-three percent of subjects were in the HWL group. The two groups showed significant differences in weight control practices such as frequency (P < 0.01), duration and magnitude of weight loss, methods, and satisfaction with weight control (P < 0.001). Multiple logistic regression showed that self-efficacy (OR: 0.846, 95% CI: 0.730, 0.980), eating behaviors during the training period (OR: 1.285, 95% CI: 1.112, 1.485), and eating behaviors during the weight control period (OR: 0.731, 95% CI: 0.620, 0.863) were associated with weight control level. Compared to NWL athletes, HWL athletes agreed more strongly on the disadvantages of rapid weight loss (P < 0.05 – P < 0.01) and perceived less confidence in controlling overeating after matches (P < 0.001) and in making weight within their weight class (P < 0.05). HWL athletes showed more inappropriate eating behaviors than NWL athletes, especially during the weight control period (P < 0.05 – P < 0.001). CONCLUSIONS: Self-efficacy was lower and eating behaviors during the pre-competition period were more inadequate in HWL athletes. Education programs should include strategies to help athletes apply appropriate methods for weight control, increase self-efficacy, and adopt desirable eating behaviors.


Subject(s)
Athletes , Eating , Education , Feeding Behavior , Humans , Hyperphagia , Logistic Models , Weight Loss
17.
Article in English | WPRIM (Western Pacific) | ID: wprim-782257

ABSTRACT

BACKGROUND/OBJECTIVES: Osteoporosis is characterized by low bone mass and results in vulnerability to fracture. Calcium and vitamin D are known to play an important role in bone health. Recently, potassium has been identified as another important factor in skeletal health. We examined the link between potassium intake and bone health in the older Korean adult population. SUBJECTS/METHODS: This retrospective, cross-sectional study included 8,732 men and postmenopausal women over 50 years old who completed the Korea National Health and Nutrition Examination Survey (KNHANES) between 2008 and 2011. Potassium consumption was evaluated using a 24-hour recall method. Bone mineral density (BMD) was measured at three sites (total hip, femur neck, and lumbar spine) by dual-energy X-ray absorptiometry (DEXA). Multinomial logistic regression was used to examine the link between potassium intake and the prevalence of osteoporosis and osteopenia, after controlling for potential confounding variables. RESULTS: BMD of the total femur and Ward's triangle differed significantly according to potassium intake among men (P = 0.031 and P = 0.010, respectively). Women in the top tertile of potassium intake showed higher BMD than those in the bottom tertile at all measurement sites (all P < 0.05). Daily potassium intake was significantly related to a decreased risk of osteoporosis at the lumbar spine in postmenopausal women (odds ratio: 0.68, 95% confidence interval: 0.48-0.96, P trend = 0.031). However, dietary potassium level was not related to the risk of osteoporosis in men. CONCLUSION: The current findings indicate that higher dietary potassium levels have a favorable effect on bone health and the prevention of osteoporosis in older Korean women.
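A minimal sketch of a multinomial logistic regression with a three-level bone-status outcome (normal, osteopenia, osteoporosis) and potassium intake tertile as the exposure, in the spirit of the analysis described above. The outcome coding, covariates, and data are hypothetical.

```python
# Sketch with synthetic data: multinomial logistic regression (statsmodels MNLogit) of a
# three-category bone-status outcome on potassium intake tertile and age.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 2000
tertile = rng.integers(1, 4, n)                  # potassium intake tertile (1 = lowest)
age = rng.normal(65, 8, n)
# hypothetical latent score: higher tertile -> lower chance of osteoporosis
score = 0.04 * (age - 65) - 0.4 * (tertile - 1) + rng.normal(0, 1, n)
bone_status = np.digitize(score, [0.3, 1.0])     # 0 = normal, 1 = osteopenia, 2 = osteoporosis

X = sm.add_constant(pd.DataFrame({"tertile": tertile, "age": age}))
fit = sm.MNLogit(bone_status, X).fit(disp=False)
print(np.exp(fit.params))                        # odds ratios relative to the 'normal' category
```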


Subject(s)
Absorptiometry, Photon , Adult , Bone Density , Bone Diseases, Metabolic , Calcium , Confounding Factors, Epidemiologic , Cross-Sectional Studies , Epidemiology , Female , Femur , Femur Neck , Hip , Humans , Logistic Models , Male , Methods , Nutrition Surveys , Osteoporosis , Population Surveillance , Potassium , Potassium, Dietary , Prevalence , Retrospective Studies , Spine , Vitamin D
18.
Article in English | WPRIM (Western Pacific) | ID: wprim-782256

ABSTRACT

BACKGROUND/OBJECTIVES: A number of studies have examined secular trends in blood lipid profiles using time-series data from national surveys, whereas few studies have investigated the individual-level factors contributing to such trends. The present study aimed to examine secular trends in dietary and other modifiable factors and in the prevalence of hyper-LDL-cholesterolemia (HC), and to evaluate their associations, using time-series data from nationwide surveys. SUBJECTS/METHODS: The study included 41,073 Korean adults aged ≥ 30 years from the 2005, 2007–2009, 2010–2012, 2013–2015, and 2016 Korea National Health and Nutrition Examination Surveys. Stepwise logistic regression analysis was performed to select significant factors associated with HC, which was defined as serum LDL cholesterol levels ≥ 130 mg/dL. RESULTS: The following factors showed a positive association with HC (P < 0.05): for men, higher body mass index (BMI), being married, having an office job, and higher consumption of dairy and vegetable oil products; for women, higher age or BMI, having no job or a non-office job, not being in a low-income household, and higher consumption of dairy products. In the given model, the 2016 survey data showed that a 2 kg/m² reduction in the BMI of obese persons would reduce HC prevalence from 30.8% to 29.3% among men and from 33.6% to 32.5% among women. CONCLUSIONS: Based on these findings, it is suggested that primary prevention programs should advocate maintaining a proper BMI for Korean adults at high risk of HC. However, whether discouraging consumption of dairy and vegetable oil products can reduce HC prevalence warrants further studies with a prospective longitudinal design.


Subject(s)
Adult , Body Mass Index , Cholesterol , Cholesterol, LDL , Dairy Products , Family Characteristics , Female , Humans , Korea , Logistic Models , Male , Prevalence , Primary Prevention , Prospective Studies , Vegetables
19.
Article in English | WPRIM (Western Pacific) | ID: wprim-782254

ABSTRACT

BACKGROUND/OBJECTIVES: Few epidemiological studies have examined the association between fried food intake and hypertension. This study examined whether fried food intake was associated with a higher combined prevalence of prehypertension and hypertension in a cross-sectional analysis of the Filipino Women's Diet and Health Study (FiLWHEL). SUBJECTS/METHODS: This study included a total of 428 women aged 20–57 years who had ever been married to Korean men. Prehypertension was defined as SBP of 120 to < 140 mmHg or DBP of 80 to < 90 mmHg, and hypertension as SBP ≥ 140 mmHg or DBP ≥ 90 mmHg. Fried food intake was assessed using a one-day 24-hour recall. Fried foods were categorized into total, deep/shallow-fried, and pan/stir-fried foods. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated using multivariate logistic regression. RESULTS: The combined prevalence of prehypertension and hypertension was 41.36% in this population. High fried food intake was associated with a high combined prevalence of prehypertension and hypertension. The odds of having prehypertension or hypertension were higher in the 3rd tertile of fried food intake among fried food consumers compared with non-consumers of fried food (OR = 2.46, 95% CI = 1.24, 4.87; P for trend = 0.004). Separate analysis by type of frying showed that deep- and shallow-fried food intake was associated with the combined prevalence of prehypertension and hypertension when comparing the 3rd tertile with non-consumers of fried food (OR = 2.93; 95% CI = 1.57-5.47; P for trend < 0.001). CONCLUSIONS: This study provides evidence that high fried food intake was significantly associated with a high combined prevalence of prehypertension and hypertension among Filipino women married to Korean men.


Subject(s)
Blood Pressure , Cross-Sectional Studies , Diet , Eating , Emigrants and Immigrants , Epidemiologic Studies , Female , Humans , Hypertension , Logistic Models , Male , Odds Ratio , Prehypertension , Prevalence
20.
Article in English | WPRIM (Western Pacific) | ID: wprim-782219

ABSTRACT

BACKGROUND: Use of appropriate antibiotics for the treatment of pneumonia is integral in patients admitted to intensive care units (ICUs). Although it is recommended that empirical treatment regimens be based on the local distribution of pathogens in patients with suspected hospital-acquired pneumonia, few studies have examined patients admitted to ICUs with nursing home-acquired pneumonia (NHAP). We investigated the factors associated with the use of inappropriate antibiotics in patients with pneumonia admitted to the ICU via the emergency room (ER). METHODS: We performed a retrospective cohort study of 83 pneumonia patients with confirmed causative bacteria admitted to ICUs via the ER between March 2015 and May 2017. We compared clinical parameters between patients who received appropriate or inappropriate antibiotics using the Mann-Whitney U, Pearson's chi-square, and Fisher's exact tests, and investigated independent factors associated with inappropriate antibiotic use using multivariate logistic regression. RESULTS: Among the 83 patients, 30 (36.1%) received inappropriate antibiotics. NHAP was more frequent in the inappropriate-antibiotics group than in the appropriate-antibiotics group (96.7% vs. 47.2%, p < 0.001). Methicillin-resistant Staphylococcus aureus was more frequently isolated from individuals in the inappropriate-antibiotics group than in the appropriate-antibiotics group (70.0% vs. 7.5%, p < 0.001). In multivariate analysis, NHAP was independently associated with the use of inappropriate antibiotics in patients with pneumonia admitted to the ICU via the ER. CONCLUSION: NHAP is a risk factor for the use of inappropriate antibiotics in patients with pneumonia admitted to the ICU via the ER.
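A short sketch of the three between-group tests named in this abstract: Mann-Whitney U for a continuous variable and chi-square and Fisher's exact tests for a 2x2 table. The group sizes match the abstract, but the ages and the table counts are hypothetical illustrations only.

```python
# Sketch with synthetic data: nonparametric and categorical tests comparing the
# appropriate- and inappropriate-antibiotics groups described above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
age_appropriate = rng.normal(72, 10, 53)       # appropriate-antibiotics group (hypothetical ages)
age_inappropriate = rng.normal(75, 10, 30)     # inappropriate-antibiotics group (hypothetical ages)
u, p_u = stats.mannwhitneyu(age_appropriate, age_inappropriate)
print(f"Mann-Whitney U p = {p_u:.3f}")

# 2x2 table: NHAP (rows: yes/no) by antibiotic appropriateness (cols: appropriate/inappropriate)
table = np.array([[25, 29],    # hypothetical counts, totals matching the 53/30 group sizes
                  [28, 1]])
chi2, p_chi, dof, _ = stats.chi2_contingency(table)
odds, p_fisher = stats.fisher_exact(table)
print(f"chi-square p = {p_chi:.4f}, Fisher exact p = {p_fisher:.4f}")
```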


Subject(s)
Anti-Bacterial Agents , Bacteria , Cohort Studies , Critical Care , Emergency Service, Hospital , Hospitals, Teaching , Humans , Intensive Care Units , Logistic Models , Methicillin-Resistant Staphylococcus aureus , Multivariate Analysis , Nursing , Pneumonia , Retrospective Studies , Risk Factors