ABSTRACT
Objective: To explore the latent categories of return-to-work self-efficacy in postoperative patients with thyroid cancer and to analyze the associated influencing factors, so as to provide a theoretical basis for precise occupational rehabilitation interventions. Methods: This was a cross-sectional study. Convenience sampling was used to select 257 postoperative patients with thyroid cancer at Zhujiang Hospital of Southern Medical University from May 2022 to July 2023. The General Information Questionnaire, the Return-To-Work Self-Efficacy Questionnaire and the Cancer Fatigue Scale were administered. Latent profile analysis was used to identify the latent categories of return-to-work self-efficacy, and logistic regression and a decision tree were used to analyze the factors influencing category membership. Results: A total of 250 postoperative patients with thyroid cancer were included, 76 males and 174 females, aged (37.91 ± 8.04) years. Return-to-work self-efficacy fell into two latent categories: a low return-to-work self-efficacy group (72.0%, 180/250) and a high return-to-work self-efficacy group (28.0%, 70/250). Logistic regression showed that education, thyrotropin suppressive therapy, cancer-related fatigue and age influenced the latent categories of return-to-work self-efficacy (OR values 0.951-19.820, all P<0.05). The decision tree model showed that education level and cancer-related fatigue were the most important factors (χ2=31.40 and 16.95, both P<0.05). Conclusions: Return-to-work self-efficacy in postoperative patients with thyroid cancer comprised two latent categories, and most patients had low levels. Health care professionals should focus on patients with lower education and cancer-related fatigue, while not neglecting older patients and those with substandard thyrotropin suppressive therapy, and should deliver precise occupational rehabilitation interventions to improve return-to-work self-efficacy and help these patients reintegrate into society.
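As an illustration of the regression step described above (this is not the authors' code), the following minimal Python sketch fits a binary logistic regression on hypothetical predictors of latent-class membership and reports odds ratios with 95% confidence intervals via statsmodels; all variable names and data are invented.

# Minimal sketch (not the authors' code): logistic regression with odds ratios
# for hypothetical predictors of the high return-to-work self-efficacy class.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.integers(22, 60, 250),
    "education_years": rng.integers(6, 20, 250),
    "tsh_suppression_ok": rng.integers(0, 2, 250),
    "cancer_related_fatigue": rng.uniform(0, 10, 250),
    "high_rtw_se": rng.integers(0, 2, 250),   # 1 = high self-efficacy class
})

X = sm.add_constant(df[["age", "education_years", "tsh_suppression_ok",
                        "cancer_related_fatigue"]])
fit = sm.Logit(df["high_rtw_se"], X).fit(disp=0)

# Odds ratios with 95% confidence intervals, mirroring the OR values in the abstract.
ci = np.exp(fit.conf_int())
odds_ratios = pd.DataFrame({"OR": np.exp(fit.params),
                            "CI_low": ci.iloc[:, 0],
                            "CI_high": ci.iloc[:, 1],
                            "p": fit.pvalues})
print(odds_ratios)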
ABSTRACT
Objective: To understand the occurrence and predictive factors of depressive symptoms among multi-ethnic middle school students in Yunnan Province, so as to provide a referential framework for schools to carry out targeted mental health education. Methods: From October to December 2022, 8 500 first-grade students from 23 middle schools were selected from 11 minority areas in Yunnan Province by cluster random sampling. Demographic information and data on the students' lifestyles were collected by questionnaire, and the Children's Depression Inventory (CDI) was used to evaluate depressive symptoms. The Chi-square test was used in the univariate analysis to compare differences in the detection rate of depressive symptoms among first-grade middle school students, and a decision tree model of depressive symptoms was established using the Chi-squared Automatic Interaction Detector (CHAID). Results: The detection rate of depressive symptoms among first-grade students from multi-ethnic middle schools in Yunnan Province was 28.26%. In the decision tree model, academic stress (χ2=469.08) entered at the first level; breakfast behaviors entered at the second level (low/moderate academic stress: χ2=155.49; severe academic stress: χ2=105.24); and the number of close friends (low/moderate academic stress and consuming breakfast 0-2 days weekly: χ2=23.15; low/moderate academic stress and consuming breakfast 3-4 days weekly: χ2=14.99; severe academic stress and consuming breakfast 0-2 days weekly: χ2=29.26; severe academic stress and consuming breakfast 3-4 days weekly: χ2=20.15), ethnicity (χ2=78.22) and drinking (χ2=50.36) entered at the third level (all P<0.01). Conclusions: The study identifies academic stress, breakfast behaviors, number of close friends, drinking and ethnicity as predictive factors of depressive symptoms among multi-ethnic middle school students in Yunnan Province. Schools should develop targeted strategies for preventing and managing depressive symptoms in middle school students so as to reduce their occurrence.
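The CHAID algorithm used above chooses, at each node, the predictor with the most significant chi-square association with the outcome. The sketch below illustrates that first-level choice with scipy; the column names are hypothetical, and the category-merging and Bonferroni-correction steps of full CHAID are omitted.

# Illustrative sketch of how CHAID picks its first split: cross-tabulate each
# candidate categorical predictor against depressive symptoms and keep the
# variable with the largest chi-square statistic. Column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

def best_chaid_split(df, outcome, predictors):
    results = {}
    for var in predictors:
        table = pd.crosstab(df[var], df[outcome])
        chi2, p, dof, _ = chi2_contingency(table)
        results[var] = (chi2, p)
    # Full CHAID would also merge non-significantly different categories and
    # apply a Bonferroni correction; this sketch only ranks the raw statistics.
    return sorted(results.items(), key=lambda kv: kv[1][0], reverse=True)

# Usage (hypothetical column names):
# ranking = best_chaid_split(students, "depressive_symptoms",
#                            ["academic_stress", "breakfast_days", "close_friends",
#                             "ethnicity", "drinking"])
# print(ranking[0])   # e.g. ('academic_stress', (469.08, ...)) at the first level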
ABSTRACT
Abstract Introduction Cardiovascular disease (CVD) is the leading cause of death globally, accounting for a high proportion of hospitalizations and costs. In view of this, it is essential to understand the main CVDs in patients admitted to hospital emergency services and the role of physiotherapists, in order to plan and direct health services, highlight the physiotherapist's participation, and encourage specific physiotherapy training in the context of tertiary care. Objective To outline the profile of cardiovascular emergencies and to evaluate physiotherapy practice in adult patients in the emergency department of a hospital in the interior of the state of São Paulo. Methods This was an observational study which analyzed 1,256 on-call records over a period of eight months. The data collected included age, gender, cardiovascular diagnostic hypothesis and the physiotherapy treatment carried out. Results A total of 75 patients with cardiovascular emergencies were included, the most prevalent of which were: heart failure (n = 21), acute coronary syndrome (n = 14), acute myocardial infarction (n = 13), bradyarrhythmia (n = 6) and hypertensive crisis (n = 5). Regarding physiotherapeutic actions and their applications, the most frequent were invasive mechanical ventilation management (n = 34), lung re-expansion maneuvers (n = 17), orotracheal intubation assistance (n = 17), non-invasive mechanical ventilation (n = 14), bronchial hygiene maneuvers (n = 12), kinesiotherapy (n = 10) and out-of-bed sitting (n = 10). Conclusion Heart failure and acute coronary syndrome were the cardiovascular diseases that most often led to admission to the hospital emergency department, and procedures with an emphasis on the respiratory system were the most frequently applied.
Resumo Introdução As doenças cardiovasculares (DCV) representam a principal causa de morte global, destacando-se em internações e gastos. Diante disso, é essencial compreender as principais DCV em pacientes admitidos em serviços de emergência hospitalar e a atuação do fisioterapeuta para planejamento e direcionamento dos serviços de saúde e para denotar a participação e incentivar formações fisioterapêuticas específicas no contexto da atenção terciária. Objetivo Traçar o perfil de emergências cardiovasculares e avaliar a atuação fisioterapêutica em pacientes adultos de serviço de emergência de um hospital no interior do estado de São Paulo. Métodos Trata-se de um estudo observacional, em que foram analisadas 1.256 fichas de passagem de plantão, no período de oito meses. Os dados coletados foram idade, sexo, hipótese diagnóstica cardiovascular e tratamento fisioterapêutico realizado. Resultados Foram incluídos 75 pacientes que apresentavam o perfil de emergências cardiovasculares, sendo as mais prevalentes: insuficiência cardíaca (n = 21), síndrome coronariana aguda (n = 14), infarto agudo do miocárdio (n = 13), bradiarritmia (n = 6) e crise hipertensiva (n = 5). Em relação à atuação fisioterapêutica e suas aplicações, as mais frequentes foram manejo da ventilação mecânica invasiva (n = 34), manobras de reexpansão pulmonar (n = 17), auxílio à intubação orotraqueal (n = 17), ventilação mecânica não invasiva (n = 14), manobras de higiene brônquica (n = 12), cinesioterapia (n = 10) e sedestação (n = 10). Conclusão A insuficiência cardíaca e a síndrome coronariana aguda foram as doenças cardiovasculares que mais ocasionaram internação no serviço de emergência hospitalar e as condutas com ênfase no aparelho respiratório foram as mais aplicadas.
ABSTRACT
INTRODUCTION: Gastric cancer (GC) is the fifth most commonly diagnosed neoplasia and the third leading cause of cancer-related deaths. A substantial number of patients already exhibit an advanced GC stage at diagnosis; therefore, the search for biomarkers contributes to the improvement and development of therapies. OBJECTIVE: This study aimed to identify potential GC biomarkers using in silico tools. METHODS: Gastric tissue microarray data available in the Gene Expression Omnibus and The Cancer Genome Atlas Program were extracted. We applied statistical tests in the search for differentially expressed genes between tumoral and non-tumoral adjacent tissue samples. The selected genes were submitted to an in-house tool for analyses of functional enrichment, survival rate, histological and molecular classifications, and clinical follow-up data. A decision tree analysis was performed to evaluate the predictive power of the potential biomarkers. RESULTS: In total, 39 differentially expressed genes were found, mostly involved in extracellular structure organization, extracellular matrix organization, and angiogenesis. The genes SLC7A8, LY6E, and SIDT2 showed potential as diagnostic biomarkers, considering the differential expression results coupled with the high predictive power of the decision tree models. Moreover, GC samples showed lower SLC7A8 and SIDT2 expression, whereas LY6E expression was higher. SIDT2 demonstrated a potential prognostic role for the diffuse type of GC, given the higher patient survival rate observed with lower expression of this gene. CONCLUSION: Our study outlines novel biomarkers for GC that may have a key role in tumor progression. Nevertheless, complementary in vitro analyses are still needed to further support their potential.
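A rough sketch of the kind of decision-tree check used to gauge the predictive power of candidate biomarkers: one shallow tree per gene, scored by cross-validated ROC AUC. The expression values and labels below are random placeholders, not GEO/TCGA data.

# Sketch: per-gene shallow decision trees scored with cross-validated ROC AUC.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
expression = {"SLC7A8": rng.normal(size=200), "LY6E": rng.normal(size=200),
              "SIDT2": rng.normal(size=200)}
is_tumor = rng.integers(0, 2, 200)   # 1 = tumor, 0 = adjacent non-tumoral tissue

for gene, values in expression.items():
    tree = DecisionTreeClassifier(max_depth=2, random_state=0)
    auc = cross_val_score(tree, values.reshape(-1, 1), is_tumor,
                          cv=5, scoring="roc_auc").mean()
    print(f"{gene}: mean cross-validated AUC = {auc:.2f}")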
Subject(s)
Stomach Neoplasms/diagnosis, Biomarkers, Tumor, Computational Biology, Prognosis, Computer Simulation, Gene Expression, Tissue Array Analysis
ABSTRACT
Objective: To investigate predictive factors for successful endovascular recanalization in patients with non-acute symptomatic internal carotid artery occlusion (SICAO), to develop a decision tree model using the Classification and Regression Tree (CART) algorithm, and to evaluate the predictive performance of the model. Methods: Patients with non-acute SICAO who received endovascular therapy at 8 comprehensive stroke centers in China were retrospectively included and randomly assigned to a training set and a validation set. In the training set, the least absolute shrinkage and selection operator (LASSO) algorithm was used to screen important variables, and a decision tree prediction model was constructed with the CART algorithm. The model was evaluated in the validation set using the receiver operating characteristic (ROC) curve, the Hosmer-Lemeshow goodness-of-fit test and the confusion matrix. Results: A total of 511 patients with non-acute SICAO were included and randomly divided into a training set (n=357) and a validation set (n=154) at a 7:3 ratio. The successful recanalization rates after endovascular therapy were 58.8% and 58.4%, respectively, with no statistically significant difference (χ2=0.007, P=0.936). A CART decision tree model consisting of 5 variables, 5 layers and 9 classification rules was constructed from the six non-zero-coefficient variables selected by LASSO regression. The predictive factors for successful recanalization included fewer occluded segments, a proximal tapered stump, ASITN/SIR collateral grading of 1-2, ischemic stroke, and a time from the most recent event to endovascular therapy of 1-30 days. ROC analysis showed that the area under the curve of the decision tree model in the training set was 0.810 (95% confidence interval 0.764-0.857), and the optimal cut-off value for predicting successful recanalization was 0.71. The area under the curve in the validation set was 0.763 (95% confidence interval 0.687-0.839); accuracy was 70.1%, precision 81.4%, sensitivity 63.3%, and specificity 79.7%. The Hosmer-Lemeshow test showed P>0.05 in both sets. Conclusion: The decision tree model constructed from the type of ischemic event, the time from the latest event to endovascular therapy, proximal stump morphology, the number of occluded segments, and the ASITN/SIR collateral grading can effectively predict successful recanalization after endovascular therapy for non-acute SICAO.
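A schematic Python sketch of the two-stage workflow described above (LASSO screening followed by a CART tree); the data, penalty strength and tree depth are illustrative assumptions rather than the study's settings.

# Sketch: L1-penalized logistic regression to screen variables, then a CART tree
# on the retained predictors. Feature values and labels are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.normal(size=(357, 12))    # training set, 12 candidate variables
y = rng.integers(0, 2, 357)       # 1 = successful recanalization

# Stage 1: the LASSO-type penalty drives uninformative coefficients to zero.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_[0] != 0)
if selected.size == 0:            # fallback so the sketch always runs
    selected = np.arange(X.shape[1])

# Stage 2: CART on the selected variables; max_depth=5 mirrors the five-layer tree.
cart = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X[:, selected], y)
print("selected variable indices:", selected)
print("training AUC:", roc_auc_score(y, cart.predict_proba(X[:, selected])[:, 1]))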
ABSTRACT
Objective: To establish a decision tree model for pediatric complicated appendicitis (CA) based on the Pediatric Appendicitis Score (PAS) combined with inflammatory indicators, and to evaluate its clinical application in pediatric practice. Methods: The clinical data of 544 children diagnosed with appendicitis in Children's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine from January 2018 to December 2021 were retrospectively analyzed. According to postoperative pathology, the children were divided into an uncomplicated appendicitis group and a CA group. Independent risk factors for CA were screened by univariate and multivariate logistic regression analysis, and these parameters were included to establish the decision tree model. The accuracy of the decision tree model was verified by the receiver operating characteristic (ROC) curve. Results: Binary logistic regression analysis indicated that the PAS, C-reactive protein (CRP) and neutrophil to lymphocyte ratio (NLR) were independent risk factors for complicated appendicitis in children (all P<0.05). PAS, CRP and NLR were included as covariables to construct the decision tree model and a binary logistic regression model for predicting CA. The decision tree demonstrated an overall accuracy of 79.2% with a sensitivity of 86.7% and specificity of 71.9%, and achieved an area under the curve (AUC) of 0.821 (95% CI: 0.786-0.857). The binary logistic regression model had a sensitivity of 79.6% and specificity of 69.1%, with an overall accuracy of 75.1% and an AUC of 0.808 (95% CI: 0.770-0.845). Conclusions: The decision tree model based on the PAS combined with CRP and NLR is a simple, intuitive and effective tool that can provide pediatric emergency physicians with a reliable basis for the diagnosis of pediatric CA.
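The abstract quotes AUC values with 95% confidence intervals; one common way to obtain such an interval (not necessarily the method used in this study) is a bootstrap over the evaluation set, sketched below with made-up predictions.

# Sketch: bootstrap 95% confidence interval for an AUC. `y_true` (CA = 1) and
# `prob` (model-predicted probability) are placeholders, not study data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
y_true = rng.integers(0, 2, 544)
prob = np.clip(y_true * 0.3 + rng.uniform(0, 0.7, 544), 0, 1)   # fake predictions

boot_aucs = []
for _ in range(2000):
    idx = rng.integers(0, len(y_true), len(y_true))   # resample with replacement
    if len(np.unique(y_true[idx])) < 2:               # need both classes present
        continue
    boot_aucs.append(roc_auc_score(y_true[idx], prob[idx]))

low, high = np.percentile(boot_aucs, [2.5, 97.5])
print(f"AUC = {roc_auc_score(y_true, prob):.3f} (bootstrap 95% CI {low:.3f}-{high:.3f})")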
ABSTRACT
Objective: To analyze the factors influencing delayed nausea and vomiting in patients with primary liver cancer after transarterial chemoembolization, based on a logistic regression model and a decision tree model. Methods: This was a cross-sectional study. A total of 236 patients with primary liver cancer treated with transarterial chemoembolization in The Second Affiliated Hospital of Air Force Military Medical University from March 2021 to June 2022 were selected by convenience sampling as the research subjects. Factors related to delayed nausea and vomiting were collected, logistic regression and decision tree models were established, and the differences between the two models were compared. Results: The incidence of delayed nausea and vomiting in patients with primary liver cancer after transarterial chemoembolization was 45.34% (107/236). The logistic regression model showed that age, anxiety, sleep disorder, emetic risk level of chemotherapeutic drugs, embolic agent type, and pain 24 hours after surgery were influencing factors of delayed nausea and vomiting (all P<0.05). The decision tree model showed that age, sleep disorder, emetic risk level of chemotherapeutic drugs, embolic agent type, and pain 24 hours after surgery were influencing factors (all P<0.05). The classification accuracy rates of the logistic regression model, the decision tree model and the combined diagnosis of the two models were 72.9%, 71.2% and 72.0%, respectively; the areas under the ROC curve were 0.778, 0.781 and 0.806, respectively, with no significant difference (all P>0.05). Conclusions: The logistic regression and decision tree analyses of the factors influencing delayed nausea and vomiting in patients with primary liver cancer after transarterial chemoembolization are highly consistent, and the two models can be combined to provide a more comprehensive reference for assessment and intervention by medical staff.
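A minimal sketch of the "combined diagnosis" idea mentioned above: average the predicted probabilities of the two fitted models and score the ensemble with a single AUC. The objects logit, tree, X_te and y_te are assumed to exist already and are hypothetical.

# Sketch: unweighted average of two models' predicted probabilities.
from sklearn.metrics import roc_auc_score

def combined_auc(logit, tree, X_te, y_te):
    p_logit = logit.predict_proba(X_te)[:, 1]
    p_tree = tree.predict_proba(X_te)[:, 1]
    p_combined = (p_logit + p_tree) / 2.0   # simple unweighted average
    return roc_auc_score(y_te, p_combined)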
ABSTRACT
Esta tese objetivou identificar estratégias de triagem para infecção latente por Mycobacterium tuberculosis (Mtb), a ILTB, em profissionais de saúde que viabilizem o uso mais eficiente dos recursos disponíveis. No Brasil, recomenda-se que os profissionais de saúde, um dos grupos de risco para a ILTB, realizem triagem periódica para detecção da infecção e, para aqueles que apresentarem conversão aos testes de diagnóstico, indica-se o tratamento preventivo da tuberculose (TB), o TPT, uma vez que pessoas com conversão recente apresentam elevada chance de progressão para a doença. Desenvolvemos, no primeiro artigo, um modelo preditivo para identificar profissionais de saúde com maior probabilidade de resultado negativo para dois testes de diagnóstico da ILTB, a partir de uma análise secundária de dados publicados anteriormente de 708 profissionais de saúde da atenção primária, de cinco capitais brasileiras, submetidos à prova tuberculínica (PT) e ao Quantiferon®-TB Gold in-tube (QFT-IT®). Construímos um modelo preditivo utilizando árvore de classificação e regressão (CART, classification and regression tree). A avaliação do desempenho foi realizada por meio da análise receiver operating characteristics (ROC) e area under the curve (AUC). Utilizou-se o mesmo banco de dados para validação cruzada do modelo. Entre os 708 profissionais de saúde, 247 (34,9%) apresentaram resultado negativo para os testes. A CART identificou que os médicos e agentes comunitários de saúde apresentaram chances duas vezes maiores de testes negativos (probabilidade = 0,60) do que os enfermeiros e técnicos/auxiliares de enfermagem (probabilidade = 0,28) naqueles com menos de 5,5 anos de atuação na atenção primária. Na validação cruzada, a acurácia do modelo preditivo foi de 68% [intervalo de confiança de 95% (IC95%) 65-71], AUC de 62% (IC95% 58-66), especificidade de 78% (IC95% 74-81) e sensibilidade de 44% (IC95% 38-50). Apesar do baixo poder preditivo do modelo, a CART permitiu identificar subgrupos com maior probabilidade de terem ambos os testes negativos. No segundo artigo, analisou-se a razão de custo-efetividade de dois testes de sensibilidade cutânea baseados em antígenos específicos do Mtb (Diaskintest® e C-TST®) e a do QFT-Plus® para o diagnóstico da ILTB, comparadas com a estratégia diagnóstica atual (PT), entre profissionais de saúde. Desenvolveu-se um modelo analítico de decisão, representado por coorte hipotética de 100.000 profissionais de saúde, de ambos os sexos, com resultado negativo para a PT no ano anterior, horizonte temporal de cinco anos, na perspectiva do Sistema Único de Saúde. Avaliaram-se três regimes de tratamento para a ILTB: três meses de doses semanais de rifapentina (900 mg) e isoniazida (900 mg) (3HP), e seis e nove meses de doses diárias de isoniazida (300 mg) (6H e 9H, respectivamente). Aplicou-se taxa de desconto de 5% na efetividade, medida em casos de TB ativa evitados, e nos custos das estratégias de triagem e de tratamento avaliados, estimados em dólares americanos (US$) com taxa média anual de 2021 de acordo com o Banco Central (US$ 1 = 5,39 reais). Foram realizadas análises de sensibilidade determinística univariada e probabilística. Os testes Diaskintest®, C-TST® e QFT-Plus® apresentam maior especificidade (0,98, 0,98 e 0,97, respectivamente). Os custos com QFT-Plus® foram maiores devido aos equipamentos, mão de obra e ao custo do kit.
O Diaskintest® foi o teste mais econômico (US$ 7.042, US$ 5.781 e US$ 18.892 por caso de TB ativa evitado para os regimes de tratamento com 3HP, 6H e 9H, respectivamente), inclusive nas análises de sensibilidade. No cenário nacional, o Diaskintest® foi o teste de melhor custo-efetividade para avaliação anual dos profissionais de saúde.
This thesis aimed to identify screening strategies for tuberculosis infection (TBI) in healthcare workers (HCWs) that enable the most efficient use of available resources. Investigation of TBI in HCWs is recommended in Brazil as part of the worker's pre-employment and periodic (annual) health visits. HCWs with a first tuberculin skin test (TST) induration < 10 mm are invited to repeat the test in one to three weeks to assess the booster effect (induration size increment of 10 mm). Those with a persistent TST < 10 mm undergo a one-step TST every 12 months, and tuberculosis preventive treatment (TPT) is recommended when conversion (a 10 mm increment over the latest induration size) occurs. In the first manuscript, we developed a predictive model to identify HCWs best targeted for TBI screening. We carried out a secondary analysis of previously published results of 708 HCWs working in primary care services in five Brazilian State capitals who underwent two TBI tests: the tuberculin skin test and Quantiferon®-TB Gold in-tube. We used a classification and regression tree (CART) model to predict HCWs with negative results for both tests. The performance of the model was evaluated using the receiver operating characteristics (ROC) curve and the area under the curve (AUC), cross-validated using the same dataset. Among the 708 HCWs, 247 (34.9%) had negative results for both tests. CART identified that physicians and community health agents were twice as likely to be uninfected (probability = 0.60) as registered nurses and nursing aides (probability = 0.28) when working less than 5.5 years in the primary care setting. In cross-validation, the predictive accuracy was 68% [95% confidence interval (95%CI): 65-71], the AUC was 62% (95%CI 58-66), specificity was 78% (95%CI 74-81), and sensitivity was 44% (95%CI 38-50). Despite the low predictive power of this model, CART allowed the identification of subgroups with a higher probability of having both tests negative. In the second manuscript, we analyzed the cost-effectiveness of two TB antigen-based skin tests (TBST) using the recombinant ESAT-6 and CFP-10 immunogens (Diaskintest® and C-TST®) and of QFT-Plus® for TBI diagnosis, compared with the current standard of care, the TST, among HCWs in Brazil. A state-transition Markov model was created, simulating a hypothetical cohort of 100,000 HCWs of both sexes with a negative TST result in the previous year over five annual cycles, from the perspective of the Brazilian public health system. TBI treatment scenarios included 3 months of weekly doses of rifapentine (900 mg) and isoniazid (900 mg) (3HP) and 6 or 9 months of daily doses of isoniazid (300 mg) (6H and 9H, respectively). Effects [cases of tuberculosis disease (TBD) averted] and costs for screening and treating TBI were discounted at 5%, and the incremental cost-effectiveness per TBD averted was calculated. The Diaskintest®, C-TST® and QFT-Plus® tests had higher specificity (0.98, 0.98 and 0.97, respectively). Costs with QFT-Plus® were higher due to equipment, labor and the cost of the kit per test. Diaskintest® was the most cost-effective test (US$ 7,042, US$ 5,781, and US$ 18,892 per case of TBD averted for the 3HP, 6H, and 9H treatment regimens, respectively), including in the sensitivity analyses. In the Brazilian scenario, Diaskintest® is the most cost-effective test for sequential testing of HCWs.
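To make the state-transition (Markov) structure concrete, the following highly simplified cohort sketch follows 100,000 HCWs over five annual cycles; every transition probability, cost and state label below is a placeholder, not a parameter from the thesis.

# Toy Markov cohort sketch: states {uninfected, TBI, TB disease, treated TBI},
# five annual cycles, 5% discount rate. All numbers are illustrative.
import numpy as np

states = ["uninfected", "tbi", "tb_disease", "treated_tbi"]
# Rows = current state, columns = next state (illustrative annual probabilities).
P = np.array([[0.97, 0.03, 0.00, 0.00],
              [0.00, 0.89, 0.01, 0.10],
              [0.00, 0.00, 1.00, 0.00],
              [0.00, 0.00, 0.002, 0.998]])
cohort = np.array([100_000.0, 0.0, 0.0, 0.0])
annual_cost = np.array([5.0, 20.0, 500.0, 50.0])   # cost per person-year in each state, US$
discount = 0.05

total_cost = 0.0
for cycle in range(5):
    cohort = cohort @ P
    total_cost += (cohort * annual_cost).sum() / (1 + discount) ** (cycle + 1)

tb_cases = cohort[states.index("tb_disease")]
print(f"discounted cost: US$ {total_cost:,.0f}; TB disease cases after 5 years: {tb_cases:,.0f}")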
Subject(s)
Technology Assessment, Biomedical, Cost-Benefit Analysis, Health Personnel, Latent Tuberculosis/diagnosis
ABSTRACT
ABSTRACT Introduction: With the continuous development of society and the continuous improvement in economic conditions, the willingness of Chinese people to participate in sports is also trending upward. However, how to reduce sports injuries as much as possible during exercise is a key concern for athletes in the sports world. Objective: To simulate the relationship between joint motion amplitude (JMA) and motion damage (MD) via a rough set decision-making algorithm, in order to help avoid MD. Methods: Based on the rough set decision algorithm, JMA and MD models were constructed and a motion data decision table was established. Joint change parameters and constraint conditions were set, and the joint change parameters were analyzed. In addition, the changing parameters, feature strength, and algorithm partition accuracy of the simulation model were analyzed. Results: The feature strength and the division accuracy of the rough set decision algorithm both showed good accuracy, and the model constructed with this method describes the relationship between JMA and MD well. Conclusion: The proposed rough set decision algorithm can describe the relationship between JMA and MD scientifically and effectively, providing a reference for sports practice. Level of evidence II; Therapeutic studies - investigation of treatment results.
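For readers unfamiliar with rough sets, the toy sketch below computes the indiscernibility classes and the lower/upper approximations of an "injury" decision class from an invented decision table; it illustrates the notions behind such a decision algorithm, not the study's actual model.

# Toy rough-set sketch: equivalence classes over discretised condition attributes
# and lower/upper approximations of the "injury" decision class.
from collections import defaultdict

# (joint_amplitude_level, load_level) -> condition attributes; last item = injury?
table = [(("high", "high"), 1), (("high", "high"), 1), (("high", "low"), 0),
         (("low", "high"), 1), (("low", "high"), 0), (("low", "low"), 0)]

classes = defaultdict(list)
for idx, (conditions, _) in enumerate(table):
    classes[conditions].append(idx)          # indiscernibility classes

injured = {i for i, (_, d) in enumerate(table) if d == 1}
lower, upper = set(), set()
for members in classes.values():
    s = set(members)
    if s <= injured:                         # class entirely inside the concept
        lower |= s
    if s & injured:                          # class overlapping the concept
        upper |= s

print("lower approximation (certainly injured):", sorted(lower))
print("upper approximation (possibly injured):", sorted(upper))
print("approximation accuracy:", len(lower) / len(upper))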
RESUMO Introdução: Com o desenvolvimento contínuo da sociedade e a melhoria contínua do nível econômico, a disposição do povo chinês para a prática de esportes também apresenta uma tendência ascendente. No entanto, como reduzir os danos ao esporte tanto quanto possível durante o exercício deve ser uma questão importante de particular preocupação para os atletas do mundo dos esportes. Objetivo: o objetivo foi discutir a simulação da relação entre amplitude de movimento articular (JMA) e dano de movimento (MD) por meio de um algoritmo de tomada de decisão de conjunto aproximado, para evitar MD. Com base no algoritmo de decisão de conjunto aproximado, os modelos JMA e MD foram construídos e uma tabela de decisão de dados de movimento foi estabelecida. Métodos: os parâmetros de mudança da junta e as condições de restrição foram definidos, e os parâmetros de mudança da junta foram analisados. Além disso, foram analisados os parâmetros de alteração, a força do recurso e a precisão da partição do algoritmo do modelo de simulação. Resultados: A força do recurso e a precisão da divisão do algoritmo de decisão do conjunto aproximado mostraram boa precisão. O modelo construído por esse método pode descrever bem a relação entre JMA e MD. Conclusão: O algoritmo de decisão de conjunto aproximado proposto pode descrever a relação entre JMA e MD de forma científica e eficaz, o que forneceu valor de referência para a área de esportes. Nível de evidência II; Estudos terapêuticos- investigação dos resultados do tratamento.
RESUMEN Introducción: Con el desarrollo continuo de la sociedad y la mejora continua del nivel económico, la disposición del pueblo chino a participar en deportes también está mostrando una tendencia al alza. Sin embargo, cómo reducir el daño deportivo tanto como sea posible durante el ejercicio debería ser un tema candente de especial preocupación para los atletas en el mundo del deporte. Objetivo: Su objetivo era discutir la simulación de la relación entre la amplitud del movimiento articular (JMA) y el daño por movimiento (MD) a través de un algoritmo de toma de decisiones de conjunto aproximado, para evitar MD. Con base en el algoritmo de decisión de conjunto aproximado, se construyeron modelos JMA y MD, y se estableció una tabla de decisión de datos de movimiento. Métodos: Se establecieron los parámetros de cambio de la articulación y las condiciones de restricción, y se analizaron los parámetros de cambio de la articulación. Además, se analizaron los parámetros cambiantes, la fuerza de la característica y la precisión de la partición del algoritmo del modelo de simulación. Resultados: La fuerza de la característica y la precisión de la división del algoritmo de decisión de conjunto aproximado mostraron una buena precisión. El modelo construido por tal método puede describir bien la relación entre JMA y MD. Conclusión: El algoritmo de decisión de conjunto aproximado propuesto puede describir la relación entre JMA y MD de manera científica y efectiva, lo que proporcionó un valor de referencia para el campo de los deportes. Nivel de evidencia II; Estudios terapéuticos- investigación de los resultados del tratamiento.
Subject(s)
Humans, Exercise Movement Techniques, Joints/physiology, Athletic Injuries/prevention & control, Algorithms
ABSTRACT
Objective To establish a lung cancer risk prediction model using data mining technology, to compare the performance of decision tree C5.0 and artificial neural networks in the risk prediction model, and to explore the value of data mining techniques in lung cancer risk prediction. Methods We collected the data of 180 patients with lung cancer and 240 patients with benign lung lesions, covering 17 variables of risk factors and clinical symptoms. Decision tree C5.0 and artificial neural network models were established and their prediction performance compared. Results A total of 420 valid samples were collected and split into training and testing sets at a ratio of 7:3. In the testing set, the accuracy, sensitivity, specificity, Youden index, positive predictive value, negative predictive value and AUC of the artificial neural network model were 65.3%, 61.7%, 73.3%, 0.350, 54.9%, 73.1% and 0.675 (95%CI: 0.628-0.720), respectively; those of the decision tree C5.0 model were 61.0%, 47.8%, 80.4%, 0.282, 35.3%, 80.6% and 0.641 (95%CI: 0.593-0.687). Conclusion The artificial neural network model is superior to the decision tree C5.0 model in overall performance and has potential application value in lung cancer risk prediction.
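A schematic comparison in the spirit of the study: scikit-learn does not ship C5.0, so a CART-style tree stands in for it here, and MLPClassifier stands in for the artificial neural network; the 17 predictor variables and labels are synthetic.

# Sketch: decision tree vs. small neural network, reporting AUC, sensitivity,
# specificity and the Youden index on a held-out test set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(3)
X = rng.normal(size=(420, 17))
y = rng.integers(0, 2, 420)               # 1 = lung cancer, 0 = benign lesion
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {"decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
          "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                          random_state=0)}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]
    tn, fp, fn, tp = confusion_matrix(y_te, prob >= 0.5).ravel()
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    print(f"{name}: AUC={roc_auc_score(y_te, prob):.3f} "
          f"sensitivity={sens:.3f} specificity={spec:.3f} Youden={sens + spec - 1:.3f}")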
ABSTRACT
ABSTRACT: Objectives: Healthcare workers (HCWs) have a high risk of acquiring tuberculosis infection (TBI). However, annual testing is resource-consuming. We aimed to develop a predictive model to identify HCWs best targeted for TBI screening. Methodology: We conducted a secondary analysis of previously published results of 708 HCWs working in primary care services in five Brazilian State capitals who underwent two TBI tests: the tuberculin skin test and Quantiferon®-TB Gold in-tube. We used a classification and regression tree (CART) model to predict HCWs with negative results for both tests. The performance of the model was evaluated using the receiver operating characteristics (ROC) curve and the area under the curve (AUC), cross-validated using the same dataset. Results: Among the 708 HCWs, 247 (34.9%) had negative results for both tests. CART identified that physicians and community health agents were twice as likely to be uninfected (probability = 0.60) as registered nurses and nursing aides (probability = 0.28) when working less than 5.5 years in the primary care setting. In cross-validation, the predictive accuracy was 68% [95% confidence interval (95%CI): 65-71], the AUC was 62% (95%CI 58-66), specificity was 78% (95%CI 74-81), and sensitivity was 44% (95%CI 38-50). Conclusion: Despite the low predictive power of this model, CART allowed the identification of subgroups with a higher probability of having both tests negative. The inclusion of new information related to TBI risk may contribute to the construction of a model with greater predictive power using the same CART technique.
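The subgroup probabilities quoted above (0.60 vs. 0.28) are the class proportions stored in the leaves of the fitted CART model. The sketch below shows how such probabilities and the corresponding readable rules can be extracted; the data and feature names are invented stand-ins, not the study variables.

# Sketch: reading leaf probabilities and rules from a fitted CART model.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)
X = np.column_stack([rng.integers(0, 2, 708),        # 1 = physician/community agent
                     rng.uniform(0, 30, 708)])       # years working in primary care
y = rng.integers(0, 2, 708)                          # 1 = both TBI tests negative

cart = DecisionTreeClassifier(max_depth=2, min_samples_leaf=30, random_state=0)
cart.fit(X, y)

# Probability of "both tests negative" for a hypothetical new HCW profile.
new_hcw = np.array([[1, 4.0]])                       # physician, 4 years in primary care
print("P(both tests negative):", cart.predict_proba(new_hcw)[0, 1])

# Human-readable tree, analogous to the CART rules reported in the study.
print(export_text(cart, feature_names=["physician_or_cha", "years_primary_care"]))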
RESUMO: Objetivos: Desenvolver um modelo preditivo para identificar profissionais de saúde com maior probabilidade de resultado negativo para dois testes de diagnóstico da infecção latente por Mycobacterium tuberculosis (ILTB). Métodos: Foi realizada uma análise secundária dos resultados publicados anteriormente de 708 profissionais de saúde da atenção primária, de cinco capitais brasileiras, submetidos à prova tuberculínica e ao Quantiferon®-TB Gold in-tube. Um modelo preditivo com árvore de classificação e regressão (CART, Classification and regression tree) foi construído. A avaliação do desempenho foi realizada por meio da análise receiver operating characteristics (ROC) e area under the curve (AUC). Utilizamos o mesmo banco de dados para validação cruzada do modelo. Resultados: Entre os 708 profissionais de saúde, 247 (34,9%) apresentaram resultado negativo para os testes. A CART identificou que os médicos e agentes comunitários de saúde apresentaram duas vezes mais chances de não estarem infectados (probabilidade = 0,60) que os enfermeiros e técnicos/auxiliares de enfermagem (probabilidade = 0,28) nos casos com menos de 5,5 anos de atuação na atenção primária. Na validação cruzada, a acurácia do modelo preditivo foi de 68% [intervalo de confiança de 95% (IC95%) 65 - 71)], AUC de 62% (IC95% 58 - 66), especificidade de 78% (IC95% 74 - 81) e sensibilidade de 44% (IC95% 38 - 50). Conclusão: Apesar do baixo poder preditivo do modelo, a CART permitiu identificar subgrupos com maior probabilidade de terem ambos os testes negativos. A inclusão de novas informações relacionadas ao risco de ILTB pode contribuir para a construção de um modelo com maior poder preditivo utilizando a mesma técnica.
Subject(s)
Humans, Tuberculosis/diagnosis, Tuberculosis/epidemiology, Brazil/epidemiology, Cross-Sectional Studies, Risk Factors, Health Personnel
ABSTRACT
ABSTRACT Objective To explore an artificial intelligence approach based on gradient-boosted decision trees for prediction of all-cause mortality at an intensive care unit, comparing its performance to a recent logistic regression system in the literature, and a logistic regression model built on the same platform. Methods A gradient-boosted decision trees model and a logistic regression model were trained and tested with the Medical Information Mart for Intensive Care database. The 1-hour resolution physiological measurements of adult patients, collected during 5 hours in the intensive care unit, consisted of eight routine clinical parameters. The study addressed how the models learn to categorize patients to predict intensive care unit mortality or survival within 12 hours. The performance was evaluated with accuracy statistics and the area under the Receiver Operating Characteristic curve. Results The gradient-boosted trees yielded an area under the Receiver Operating Characteristic curve of 0.89, compared to 0.806 for the logistic regression. The accuracy was 0.814 for the gradient-boosted trees, compared to 0.782 for the logistic regression. The diagnostic odds ratio was 17.823 for the gradient-boosted trees, compared to 9.254 for the logistic regression. The Cohen's kappa, F-measure, Matthews correlation coefficient, and markedness were higher for the gradient-boosted trees. Conclusion The discriminatory power of the gradient-boosted trees was excellent. The gradient-boosted trees outperformed the logistic regression regarding intensive care unit mortality prediction. The high diagnostic odds ratio and markedness values for the gradient-boosted trees are important in the context of the studied unbalanced dataset.
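A sketch of the comparison described above, with the less common metrics written out explicitly: the diagnostic odds ratio and markedness are derived from the confusion matrix. Synthetic data replace the MIMIC physiological measurements, and the classifier settings are assumptions rather than the study's configuration.

# Sketch: gradient-boosted trees vs. logistic regression, with diagnostic odds
# ratio (DOR) and markedness computed from the confusion matrix.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(6)
X = rng.normal(size=(5000, 8 * 5))   # 8 routine parameters x 5 hourly samples
y = rng.integers(0, 2, 5000)         # 1 = death within 12 hours

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
for name, model in [("gradient-boosted trees", GradientBoostingClassifier(random_state=0)),
                    ("logistic regression", LogisticRegression(max_iter=1000))]:
    prob = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    tn, fp, fn, tp = confusion_matrix(y_te, prob >= 0.5).ravel()
    dor = (tp * tn) / max(fp * fn, 1)                 # diagnostic odds ratio
    markedness = tp / (tp + fp) + tn / (tn + fn) - 1  # PPV + NPV - 1
    print(f"{name}: AUC={roc_auc_score(y_te, prob):.3f} "
          f"DOR={dor:.2f} markedness={markedness:.3f}")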
RESUMO Objetivo Explorar uma abordagem de inteligência artificial baseada em árvores de decisão impulsionadas por gradiente para previsão de mortalidade por todas as causas em unidade de terapia intensiva, comparando seu desempenho com um sistema de regressão logística recente na literatura e um modelo de regressão logística construído na mesma plataforma. Métodos Foram desenvolvidos um modelo de árvores impulsionadas por gradiente e um modelo de regressão logística, treinados e testados com o banco de dados Medical Information Mart for Intensive Care. As medidas fisiológicas de pacientes adultos com resolução de 1 hora, coletadas durante 5 horas na unidade de terapia intensiva, consistiram em oito parâmetros clínicos de rotina. Estudou-se como os modelos aprendem a categorizar os pacientes para prever a mortalidade ou a sobrevida, em unidades de terapia intensiva, em 12 horas. O desempenho foi avaliado por meio de estatísticas de acurácia e pela área sob a curva Característica de Operação do Receptor. Resultados As árvores impulsionadas por gradiente produziram área sob a curva Característica de Operação do Receptor de 0,89, em comparação com 0,806 para a regressão logística. A acurácia foi de 0,814 para as árvores impulsionadas por gradiente, em comparação com 0,782 para a regressão logística. A razão de chances de diagnóstico foi de 17,823 para as árvores impulsionadas por gradiente, em comparação a 9,254 para a regressão logística. O kappa de Cohen, a medida F, o coeficiente de correlação de Matthews e a marcação foram maiores para as árvores impulsionadas por gradiente. Conclusão O poder discriminatório das árvores impulsionadas por gradiente foi excelente. As árvores impulsionadas por gradiente superaram a regressão logística em relação à previsão de mortalidade em unidade de terapia intensiva. A alta razão de chances de diagnóstico e os valores de marcação para as árvores impulsionadas por gradiente são importantes no contexto do conjunto de dados não balanceados estudado.
Subject(s)
Humans, Adult, Artificial Intelligence, Machine Learning, Logistic Models, ROC Curve, Hospital Mortality, Intensive Care Units
ABSTRACT
Abstract Objectives This study aimed to investigate patterns and risk factors related to the feasibility of achieving technical quality and periapical healing in non-surgical root canal retreatment, using regression and data mining methods. Methodology This retrospective observational study included 321 consecutive patients presenting for root canal retreatment. Patients were treated by graduate students following standard protocols. Data on medical history, diagnosis, treatment, and follow-up visit variables were collected from physical records and periapical radiographs and transferred to an electronic chart database. Basic statistics were tabulated, and univariate and multivariate analytical methods were used to identify risk factors for technical quality and periapical healing. Decision trees were generated to predict technical quality and periapical healing patterns using the J48 algorithm in the Weka software. Results The technical outcome was satisfactory in 65.20% of the cases, and periapical healing was observed in 80.50%. Several factors were related to technical quality, including severity of root curvature and altered root canal morphology (p<0.05). The mean follow-up period was 4.05 years. Periapical lesion area, tooth type, and apical resorption proved to be significantly associated with retreatment failure (p<0.05). Data mining analysis suggested that apical root resorption might prevent satisfactory technical outcomes even in teeth with straight root canals, and that large periapical lesions and poor root filling quality in the primary endodontic treatment might be related to healing failure. Conclusion Frequent patterns and factors affecting the technical outcomes of endodontic retreatment included root canal morphological features and the alterations resulting from primary endodontic treatment. Healing outcomes were mainly associated with the extent of pathological damage caused by apical periodontitis in dental and periapical tissues. To determine treatment predictability, we suggest considering patterns that include clinical and radiographic features of apical periodontitis and the technical quality of the primary endodontic treatment.
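The study used Weka's J48 (a C4.5 implementation); as a rough stand-in, the sketch below grows an entropy-based scikit-learn tree on invented clinical and radiographic predictors and prints its rules, illustrating the kind of pattern output described above.

# Sketch: entropy-criterion tree (C4.5-like) with printed rules. All predictor
# names and values are hypothetical, not the study's dataset.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)
features = ["severe_root_curvature", "apical_resorption", "lesion_area_mm2",
            "poor_primary_filling"]
X = np.column_stack([rng.integers(0, 2, 321), rng.integers(0, 2, 321),
                     rng.uniform(0, 50, 321), rng.integers(0, 2, 321)])
y = rng.integers(0, 2, 321)   # 1 = periapical healing at follow-up

j48_like = DecisionTreeClassifier(criterion="entropy", max_depth=3,
                                  min_samples_leaf=15, random_state=0)
j48_like.fit(X, y)
print(export_text(j48_like, feature_names=features))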
Subject(s)
Humans, Periapical Periodontitis, Dental Pulp Cavity/diagnostic imaging, Root Canal Therapy, Retrospective Studies, Retreatment, Data Mining
ABSTRACT
ABSTRACT Objective: To generate a Classification Tree for the correct inference of the Nursing Diagnosis Fluid Volume Excess (00026) in chronic renal patients on hemodialysis. Method: Methodological, cross-sectional study with patients undergoing renal treatment. The data were collected through interviews and physical evaluation, using an instrument with sociodemographic variables, related factors, associated conditions and defining characteristics of the diagnosis studied. The classification trees were generated by the Chi-Square Automatic Interaction Detection method, which is based on the Chi-square test. Results: A total of 127 patients participated, of whom 79.5% (101) presented the diagnosis studied. The trees included the elements "Excessive sodium intake" and "Input exceeds output", which were significant for the occurrence of the event: the probability of the diagnosis in the presence of these elements was 0.87 and 0.94, respectively, and the prediction accuracy of the trees was 63% and 74%, respectively. Conclusion: The construction of the trees allowed quantification of the probability of the occurrence of Fluid Volume Excess (00026) in the studied population, and the elements "Excessive sodium intake" and "Input exceeds output" were considered predictors of this diagnosis in the sample.
RESUMEN Objetivo: Generar un Árbol de Clasificación para la inferencia correcta del Diagnóstico de Enfermería Volumen de Líquido Excesivo (00026) en pacientes renales crónicos que hacen hemodiálisis. Método: Se trata de un estudio metodológico transversal con pacientes en tratamiento renal. Los datos se recogieron mediante entrevistas y evaluación física, utilizando un instrumento con variables sociodemográficas, factores relacionados, condición asociada y características definidoras del diagnóstico estudiado. Los árboles de clasificación se generaron por el método Detección de Interacción Automática del Chi-cuadrado, basado en la prueba del Chi-cuadrado. Resultados: Participaron 127 pacientes, de los cuales el 79,5% (101) presentaba el diagnóstico mencionado; los árboles incluían los elementos "Ingestión excesiva de sodio" e "Ingestión superior a la eliminación", ambos significativos para el acaecimiento del evento. Los pacientes con estos indicadores tenían probabilidades de presentar el diagnóstico de 0,87 y 0,94, y la capacidad de predicción de los árboles era del 63% y 74%, respectivamente. Conclusión: La construcción de los árboles ha permitido cuantificar la probabilidad del acaecimiento del Volumen de Líquido Excesivo (00026) en la población estudiada. Los elementos "Ingestión excesiva de sodio" e "Ingestión superior a la eliminación" están considerados como premonitores del referido diagnóstico en la muestra.
RESUMO Objetivo: Gerar Árvore de Classificação para correta inferência do Diagnóstico de Enfermagem Volume de Líquido Excessivo (00026) em pacientes renais crônicos hemodialíticos. Método: Estudo metodológico, transversal, com pacientes em tratamento renal. Os dados foram coletados por meio de entrevista e avaliação física, utilizando instrumento com variáveis sociodemográficas, fatores relacionados, condição associada e características definidoras do Diagnóstico estudado. As árvores de classificação foram geradas pelo método Chi-Square Automation Interaction Detection, que se baseou no teste do Qui-quadrado. Resultados: Participaram 127 pacientes. Apresentaram o referido diagnóstico 79,5% (101), e as árvores incluíram os elementos "Ingesta excessiva de sódio" e "Ingestão maior que a eliminação" significativos para ocorrência do evento. Os pacientes com esses indicadores tiveram probabilidade de apresentar o diagnóstico de 0.87 e 0.94, e a capacidade de predição das árvores foi de 63% e 74%, respectivamente. Conclusão: A construção das árvores permitiu quantificar a probabilidade de ocorrência de Volume de Líquido Excessivo (00026) na população estudada. Os elementos "Ingesta excessiva de sódio" e "Ingestão maior que a eliminação" foram considerados preditores do referido diagnóstico na amostra.
Subject(s)
Nursing Diagnosis, Decision Making, Renal Insufficiency, Chronic, Decision Trees, Classification, Validation Study
ABSTRACT
A leptospirose se relaciona a problemas de saneamento ambiental, com incremento de casos em períodos de inundações. Levando-se em consideração as questões relacionadas a mudanças climáticas, as inundações tendem a um aumento. As inundações não atingem as populações de maneira homogênea, em geral os menos favorecidos em termos socioeconômicos são os mais acometidos. Para saber se o número de inundações aumentaria a incidência de leptospirose e sua relação com as variáveis contextuais, utilizou-se dados socioeconômicos, ambientais e de ocorrência da doença no nível municipal. Os municípios que tinham problemas no esgotamento sanitário apresentaram maior risco para a ocorrência da leptospirose. O total de inundações adquirida a partir da decretação pela autoridade municipal constituiu um importante marcador de risco para a ocorrência de leptospirose. A modelagem de árvore de regressão mostrou-se útil para estimar a ocorrência de leptospirose no Brasil.
Leptospirosis is related to problems with environmental sanitation, and the incidence tends to increase during flood periods. Considering issues related to climate change, floods can be expected to increase. Floods do not affect populations homogeneously: communities with worse socioeconomic conditions tend to be impacted more heavily. In order to determine whether the number of floods increases the incidence of leptospirosis, and how it relates to contextual variables, the study used socioeconomic, environmental, and disease occurrence data at the municipal (county) level. Municipalities with deficient sewage disposal showed a higher risk of leptospirosis occurrence. The total number of floods, as recorded in emergency decrees issued by the municipal authorities, was an important risk marker for leptospirosis incidence. Regression tree modeling proved useful for estimating leptospirosis incidence in Brazil.
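A municipal-level regression tree of the kind described above can be sketched as follows; the sanitation and flood covariates, sample size and incidence values are invented, and the printed splits play the role of the risk markers discussed in the text.

# Sketch: DecisionTreeRegressor relating hypothetical municipal covariates to
# leptospirosis incidence, with the fitted splits printed as rules.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(8)
features = ["pct_households_without_sewerage", "flood_decrees", "pct_low_income"]
X = np.column_stack([rng.uniform(0, 100, 5570),     # roughly the number of Brazilian municipalities
                     rng.poisson(1.0, 5570),
                     rng.uniform(0, 100, 5570)])
incidence = rng.gamma(2.0, 1.0, 5570)               # cases per 100,000 inhabitants

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=100, random_state=0)
tree.fit(X, incidence)
print(export_text(tree, feature_names=features))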
La leptospirosis se relaciona con problemas de saneamiento ambiental, así como con el incremento de casos en períodos de inundaciones. Teniendo en consideración las cuestiones relacionadas con el cambio climático, las inundaciones tienden a aumentar. Las inundaciones no afectan a las poblaciones de manera homogénea, en general, los menos favorecidos en términos socioeconómicos son los más afectados. Para saber si el número de inundaciones aumentaría la incidencia de leptospirosis, y su relación con variables contextuales, se utilizaron datos socioeconómicos, ambientales y de ocurrencia de la enfermedad en el nivel municipal. Los municipios que poseían problemas en el alcantarillado sanitario presentaron un mayor riesgo para la ocurrencia de leptospirosis. El total de inundaciones sufridas a partir de su reconocimiento oficial por parte de la autoridad municipal constituyó un importante marcador de riesgo para la ocurrencia de leptospirosis. El modelo de árbol de regresión se mostró útil para estimar la ocurrencia de leptospirosis en Brasil.
Subject(s)
Humans, Floods, Leptospirosis/epidemiology, Brazil/epidemiology, Cities/epidemiology, Data Mining
ABSTRACT
Abstract The aim of this study was to construct a predictive model that uses classification tree statistical analysis to predict the occurrence of temporomandibular disorder, by dividing the sample into groups of high and low risk for the development of the disease. The use of predictive statistical approaches that facilitate the process of recognizing and/or predicting the occurrence of temporomandibular disorder is of interest to the scientific community, for the purpose of providing patients with more adequate solutions in each case. This was a cross-sectional analytical population-based study that involved a sample of 776 individuals who had sought medical or dental attendance at the Family Health Units in Recife, PE, Brazil. The sample was submitted to anamnesis using the instrument Research Diagnostic Criteria for Temporomandibular Disorders. The data were inserted into the software Statistical Package for the Social Sciences 20.0 and analyzed by the Pearson Chi-square test for bivariate analysis, and by the classification tree method for the multivariate analysis. Temporomandibular disorder could be predicted by orofacial pain, age and depression. The high-risk group was composed of individuals with orofacial pain, those between the ages of 25 and 59 years and those who presented depression. The low risk group was composed of individuals without orofacial pain. The authors were able to conclude that the best predictor for temporomandibular disorder was orofacial pain, and that the predictive model proposed by the classification tree could be applied as a tool for simplifying decision making relative to the occurrence of temporomandibular disorder.
Resumo O objetivo deste estudo foi construir um modelo preditivo que utiliza a análise estatística de árvore de classificação para predizer a ocorrência de disfunção temporomandibular, dividindo a amostra em grupos de alto e baixo risco para o desenvolvimento da doença. A utilização de abordagens estatísticas preditivas que facilitem o processo de reconhecimento e / ou previsão da ocorrência de disfunção temporomandibular é de interesse da comunidade científica, com o objetivo de fornecer aos pacientes soluções mais adequadas a cada caso. Trata-se de um estudo transversal analítico de base populacional que envolveu uma amostra de 776 indivíduos que procuraram atendimento médico ou odontológico nas Unidades de Saúde da Família de Recife, PE, Brasil. A amostra foi submetida à anamnese por meio do instrumento Research Diagnostic Criteria for Temporomandibular Disorders. Os dados foram inseridos no software Statistical Package for the Social Sciences 20.0 e analisados pelo teste Qui-quadrado de Pearson para análise bivariada e pelo método de árvore de classificação para análise multivariada. A desordem temporomandibular pode ser prevista pela presença da dor orofacial, idade e depressão. O grupo de alto risco foi composto por indivíduos com dor orofacial, entre 25 e 59 anos e que apresentavam depressão. O grupo de baixo risco foi composto por indivíduos sem dor orofacial. Os autores puderam concluir que o melhor preditor para a disfunção temporomandibular foi a dor orofacial e que o modelo preditivo proposto pela árvore de classificação pode ser aplicado como ferramenta para simplificar a tomada de decisão em relação à ocorrência de disfunção temporomandibular.
Subject(s)
Humans, Adult, Middle Aged, Facial Pain, Temporomandibular Joint Disorders, Brazil, Cross-Sectional Studies, Risk Factors
ABSTRACT
Background: Medical researchers are developing different non-invasive methods for the early detection of Neurodegenerative Diseases (NDDs), when pharmacological interventions are still possible to prevent further disease progression. NDDs are associated with degradation of the complex gait dynamics and motor activity. The classification of gait data using machine learning techniques can assist physicians in the early diagnosis of these neural disorders, when the clinical manifestation of the disease is not yet apparent. Aims: The present study was undertaken to classify control and NDD subjects using decision tree based classifiers (Random Forest (RF), J48 and REPTree). Methodology: The data used in the study comprise 16 control, 20 Huntington's Disease (HD), 15 Parkinson's Disease (PD), and 13 Amyotrophic Lateral Sclerosis (ALS) subjects, taken from the publicly available PhysioNet database. The age range of the control subjects was 20-74 years, of the HD subjects 36-70, of the PD subjects 44-80, and of the ALS subjects 29-71. There were 13 attributes associated with the data. Important features/attributes were selected using the correlation-based feature selection with subset evaluation (CFS) method. Three tree based machine learning algorithms (RF, J48 and REPTree) were used to classify the control and NDD subjects. The performance of the classifiers was evaluated using Precision, Recall, F-Measure, MAE and RMSE. Results: To evaluate the performance of the tree based classifiers, two different data settings, i.e. complete features and selected features, were used. In classifying control vs HD subjects, RF provided the most robust separation, with classification accuracies of 84.79% using complete features and 83.94% using selected features. In classifying control vs PD subjects and control vs ALS subjects, RF also provided the best separation, with classification accuracies of 86.51% and 94.95%, respectively, using complete features and 85.19% and 93.64%, respectively, using selected features. Conclusion: The variability analysis of physiological signals provides a valuable non-invasive tool for quantifying the dynamics of physiological systems in healthy subjects and for examining the alterations in their controlling mechanisms with aging and disease. It is concluded that the selected features encode adequate information about the neural control of gait. Moreover, the selected features, along with tree based machine learning algorithms, can play a vital role in the early detection of NDDs, when pharmacological interventions are still possible
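A schematic version of the pipeline described above: a simple correlation filter stands in for Weka-style CFS subset evaluation, followed by a Random Forest scored with Precision, Recall and F-Measure; the gait features below are synthetic, not the PhysioNet recordings.

# Sketch: correlation-based feature filtering then Random Forest classification
# of control vs. HD subjects. All feature values are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_recall_fscore_support

rng = np.random.default_rng(9)
X = rng.normal(size=(36, 13))       # 16 controls + 20 HD subjects, 13 gait attributes
y = np.array([0] * 16 + [1] * 20)   # 1 = Huntington's disease

# Filter step: keep attributes whose absolute correlation with the class exceeds
# a threshold (true CFS additionally penalises redundancy between features).
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
selected = np.flatnonzero(corr > 0.1)
if selected.size == 0:              # fallback so the sketch always runs
    selected = np.arange(X.shape[1])

X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y, test_size=0.3,
                                          random_state=0, stratify=y)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
precision, recall, f1, _ = precision_recall_fscore_support(y_te, rf.predict(X_te),
                                                           average="binary")
print(f"precision={precision:.2f} recall={recall:.2f} F-measure={f1:.2f}")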
ABSTRACT
ABSTRACT Objectives: To describe a predictive model of hospitalization frequency for children and adolescents with chronic disease. Methods: A decision tree-based model was built using a database of 141 children and adolescents with chronic disease admitted to a federal public hospital; 18 variables were included and the frequency of hospitalization was defined as the outcome. Results: The decision tree obtained in this study correctly classified 80.85% of the participants. Reading the model showed that situations of greater vulnerability, such as unemployment, low income, and limited or absent family involvement in care, were predictors of a higher frequency of hospitalization. Conclusions: The model suggests that nursing professionals should adopt prevention actions for modifiable factors and that authorities should invest in health promotion for non-modifiable factors. It also strengthens the debate about differentiated care for these patients.
RESUMEN Objetivos: Describir un modelo predictor de frecuencia de internación hospitalaria para niños y adolescentes con enfermedades crónicas. Métodos: Se construyó un modelo basado en árboles de decisión, utilizando un banco de datos de 141 niños y adolescentes con enfermedades crónicas internados en hospital público federal. Para elaborar el modelo fueron consideradas 18 variables, la frecuencia de internación fue definida como desenlace. Resultados: Se obtuvo un árbol de decisiones capaz de clasificar correctamente al 80,85% de los participantes. La lectura del modelo permitió entender que las situaciones de mayor vulnerabilidad, como desempleo, bajos ingresos, restricciones y ausencia de compromiso familiar para el cuidado, actuaron como predictoras de mayor frecuencia de internación hospitalaria. Conclusiones: El modelo sugiere a la enfermería y equipo acciones preventivas para aquellos factores modificables, e inversión en promoción de salud para los factores no modificables; fortaleciendo también el debate sobre el cuidado diferenciado para esta población.
RESUMO Objetivos: Descrever um modelo preditor de frequência de internação hospitalar para crianças e adolescentes com doença crônica. Métodos: Foi construído um modelo baseado em árvore de decisão, a partir do banco de dados de 141 crianças e adolescentes, com doença crônica, internados em um hospital público federal. Para construção do modelo, foram incluídas 18 variáveis e a frequência de internação foi definida como desfecho. Resultados: Obteve-se uma árvore de decisão capaz de classificar corretamente 80,85% dos participantes. A leitura do modelo proporcionou o entendimento de que as situações de maior vulnerabilidade, como desemprego, baixa renda, restrições e ausência de envolvimento da família no cuidado, foram preditoras da maior frequência de internação hospitalar. Conclusões: O modelo sugere à enfermagem e equipe ações de prevenção para os fatores modificáveis e investimentos em promoção à saúde para os fatores não modificáveis e fortalece o debate sobre o cuidado diferenciado para esse público.
ABSTRACT
Crop harvest scheduling and profit and loss prediction require strategies that estimate crop yield. This work aimed to investigate the contribution of phenological variables to cotton boll yield using path analysis and remote sensing techniques, and to generate a decision tree model that helps predict cotton boll yield. The sampling field was installed in Chapadão do Céu, in an area of 90 ha. The following phenological variables were evaluated at 30 sample points: plant height at 26, 39, 51, 68, 82, 107, 128, and 185 days after emergence (DAE); number of floral buds at 68, 81, 107, 128, and 185 DAE; number of bolls at 185 DAE; Rededge vegetation index at 23, 35, 53, 91, and 168 DAE; and cotton boll yield. The main variables that can be used to predict cotton boll yield are the number of floral buds (at 107 days after emergence) and the Rededge vegetation index (at 53 and 91 days after emergence). To obtain higher cotton boll yields, the Rededge vegetation index must be greater than 39 at 53 days after emergence, and the plant must present at least 14 floral buds at 107 days after emergence.
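The yield thresholds quoted above can be recovered from a shallow regression tree, as sketched below; the 30 sample values are invented, and the printed split points play the role of the Rededge and floral-bud thresholds reported in the study.

# Sketch: shallow regression tree whose split points act as yield thresholds.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(10)
features = ["rededge_53dae", "floral_buds_107dae"]
X = np.column_stack([rng.uniform(20, 60, 30),        # 30 sample points
                     rng.integers(5, 25, 30)])
yield_kg_ha = 2000 + 40 * X[:, 0] + 60 * X[:, 1] + rng.normal(0, 200, 30)

tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=5, random_state=0)
tree.fit(X, yield_kg_ha)
print(export_text(tree, feature_names=features))     # splits play the role of thresholds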
O escalonamento de colheitas e a previsão de ganhos e perdas requerem estratégias que estimam a produtividade das culturas. Este trabalho teve como objetivo investigar a contribuição de variáveis fenológicas utilizando técnicas de análise de trilha e sensoriamento remoto sobre a produtividade de algodão em caroço e gerar um modelo utilizando árvores de decisão que ajudam a prever esta variável. O campo de amostragem foi instalado em Chapadão do Céu, em uma área de 90 ha. As seguintes variáveis fenológicas foram avaliadas em 30 pontos amostrais: altura das plantas aos 26, 39, 51, 68, 82, 107, 128 e 185 dias após a emergência (DAE); número de gemas florais aos 68, 81, 107, 128 e 185 DAE; número de cápsulas a 185 DAE; Índice de vegetação Rededge em 23, 35, 53, 91 e 168 DAE; e produção de algodão em caroço. As principais variáveis que podem ser utilizadas para prever a produção de caroço de algodão são o número de gemas florais (aos 107 dias após a emergência) e o índice de vegetação de Rededge (aos 53 e 91 dias após a emergência). Para obter maiores produtividades de algodão, o índice de vegetação de Rededge deve ser superior a 39 aos 53 dias após a emergência e a planta deve apresentar pelo menos 14 gemas florais aos 107 dias após a emergência.