Results 1 - 7 of 7
1.
Med Biol Eng Comput ; 59(6): 1325-1337, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33987805

ABSTRACT

This work presents a novel and promising approach to the clinical management of acute stroke. Using machine learning techniques, we have developed accurate real-time models for diagnosis and prediction from hemodynamic data. These models can diagnose the stroke subtype within 30 min of monitoring, predict exitus (death) within the first 3 h of monitoring, and predict stroke recurrence within just 15 min of monitoring. Patients with difficult access to a CT scanner, as well as all patients arriving at the stroke unit of a specialized hospital, will benefit from these positive results. The real-time models achieve the following performance: stroke diagnosis with around 98% precision (97.8% sensitivity, 99.5% specificity), exitus prediction with 99.8% precision (99.8% sensitivity, 99.9% specificity), and stroke recurrence prediction with 98% precision (98% sensitivity, 99% specificity). Graphical abstract: the complete process, from patient monitoring to the use of the collected data to generate models.
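As a rough illustration of the kind of real-time model the abstract describes, the sketch below classifies synthetic 30-min hemodynamic summaries with a nearest-centroid rule. The feature pair (mean heart rate, mean arterial pressure), the numeric values, and the two-subtype setup are all assumptions for illustration, not the paper's actual pipeline.

```python
import math
import random

random.seed(0)

# Hypothetical 30-min summary features per patient window:
# (mean heart rate, mean arterial pressure). Values are synthetic.
def make_windows(center, n):
    return [(random.gauss(center[0], 3), random.gauss(center[1], 4))
            for _ in range(n)]

train = {"ischemic": make_windows((78, 105), 20),
         "hemorrhagic": make_windows((92, 125), 20)}

def centroid(points):
    # Component-wise mean of a list of feature vectors.
    return tuple(sum(c) / len(points) for c in zip(*points))

centroids = {label: centroid(pts) for label, pts in train.items()}

def classify(window):
    # Nearest-centroid rule: assign the subtype whose centroid is closest.
    return min(centroids, key=lambda lab: math.dist(window, centroids[lab]))
```

A real deployment would replace the two invented features with the monitored hemodynamic variables and the nearest-centroid rule with whichever classifier validates best.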


Subjects
Stroke , Humans , Machine Learning , Stroke/diagnosis , Tomography, X-Ray Computed
2.
BMC Bioinformatics ; 20(1): 491, 2019 Oct 11.
Article in English | MEDLINE | ID: mdl-31601182

ABSTRACT

BACKGROUND: The analysis of health and medical data is crucial for improving diagnostic precision, treatment, and prevention, and machine learning techniques play a key role in this field. However, health data acquired from digital machines have high dimensionality, and not all of these data are relevant for a particular disease. Primary Progressive Aphasia (PPA) is a neurodegenerative syndrome comprising several specific diseases, and it is a good model on which to implement machine learning analyses. In this work, we applied five feature selection algorithms to identify the set of relevant features from 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) images of the main areas affected by PPA in patient records. In addition, we ran classification and clustering algorithms before and after the feature selection process to contrast both results with those obtained in a previous work. We aimed to find the best classifier and the most relevant features using the WEKA tool, in order to propose a framework for automatic diagnostic support. The dataset contains data from 150 FDG-PET imaging studies of 91 patients with a clinical prognosis of PPA, who were examined twice, and 28 controls. Our method comprises six stages: (i) feature extraction, (ii) expert knowledge supervision, (iii) classification, (iv) comparison of classification results for feature selection, (v) clustering after feature selection, and (vi) comparison of clustering results with those obtained in a previous work. RESULTS: Experimental tests confirmed the clustering results of a previous work. Although the classification results for some algorithms are not decisive for precisely reducing the feature set, Principal Component Analysis (PCA) exhibited similar or even better performance when compared to using all features.
CONCLUSIONS: Although reducing the dimensionality does not guarantee a general improvement, the feature set is almost halved and the results are better or quite similar. Finally, it is interesting that these results expose a finer-grained classification of patients according to the neuroanatomy of their disease.
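The paper's exact pipeline is not published in the abstract, but the filter side of feature selection can be sketched in a few lines: rank brain regions by the variance of their (hypothetical) normalized uptake values across studies and drop near-constant ones, since they carry little discriminative information. Region names and numbers below are invented.

```python
from statistics import pvariance

# Hypothetical normalized FDG-PET uptake per region (one list per region,
# one value per imaging study). Purely illustrative figures.
data = {
    "left_temporal":  [0.42, 0.31, 0.55, 0.29, 0.48],
    "right_temporal": [0.40, 0.33, 0.52, 0.30, 0.46],
    "occipital":      [0.50, 0.50, 0.51, 0.49, 0.50],  # nearly constant
}

def rank_by_variance(features):
    # Simple variance filter: regions whose uptake barely varies across
    # studies are ranked last and are candidates for removal.
    return sorted(features, key=lambda k: pvariance(features[k]), reverse=True)

ranking = rank_by_variance(data)
```

Wrapper methods (or PCA, as the abstract reports) would go further by scoring feature subsets against the classifier itself rather than in isolation.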


Subjects
Computational Biology/methods , Machine Learning , Neurodegenerative Diseases/classification , Aphasia, Primary Progressive/classification , Aphasia, Primary Progressive/diagnosis , Aphasia, Primary Progressive/diagnostic imaging , Female , Humans , Male , Neurodegenerative Diseases/diagnosis , Neurodegenerative Diseases/diagnostic imaging , Positron-Emission Tomography
3.
IEEE Trans Nanobioscience ; 16(8): 727-743, 2017 12.
Article in English | MEDLINE | ID: mdl-28504945

ABSTRACT

3-D network-on-chip (NoC) systems are gaining popularity among integrated circuit (IC) manufacturers because of reduced latency, heterogeneous integration of technologies on a single chip, high yield, and lower interconnect power consumption. However, the addition of functional units in the z-direction has resulted in higher on-chip temperatures and the appearance of local hotspots on the die. The increase in temperature degrades the performance, lifetime, and reliability of 3-D ICs and increases their maintenance cost. To keep the heat within an acceptable limit, floorplanning is the widely accepted solution: a proper arrangement of functional units across different layers can lead to a uniform thermal distribution in the chip. For systems with a high density of elements, however, a few hotspots cannot be eliminated by floorplanning alone. To overcome this limitation, liquid microchannel cooling has emerged as an efficient and scalable solution for 3-D NoCs. In this paper, we propose a novel hybrid algorithm combining floorplanning and liquid microchannel placement to alleviate the hotspots in high-density systems. A mathematical model is proposed to deal with heat transfer due to diffusion and convection. The proposed approach is independent of topology. Three different topologies, 3-D stacked homogeneous mesh, 3-D stacked heterogeneous mesh, and 3-D stacked ciliated mesh architectures, are considered to check the effectiveness of the proposed algorithm in hotspot reduction. A thermal comparison is made with and without the proposed thermal management approach for the above architectures. A significant reduction in on-chip temperature is observed when the proposed thermal management approach is applied.
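A minimal sketch of the kind of heat-transfer model the abstract mentions: explicit finite-difference diffusion on a small tiled die, with a liquid microchannel modeled as a cell held at coolant temperature (a Dirichlet sink standing in for convection). All coefficients, temperatures, and grid sizes are illustrative, not the paper's model.

```python
ALPHA = 0.2          # diffusion coefficient per step (stability needs <= 0.25)
COOLANT = (2, 2)     # cell hosting the liquid microchannel
T_COOL = 40.0        # coolant temperature, degrees C

def step(grid):
    # One explicit finite-difference diffusion step; boundary cells are
    # held fixed, the coolant cell is clamped after each update.
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            lap = (grid[i-1][j] + grid[i+1][j] +
                   grid[i][j-1] + grid[i][j+1] - 4 * grid[i][j])
            new[i][j] = grid[i][j] + ALPHA * lap
    new[COOLANT[0]][COOLANT[1]] = T_COOL
    return new

grid = [[60.0] * 5 for _ in range(5)]   # die at 60 C everywhere...
grid[1][1] = 95.0                       # ...except one local hotspot
for _ in range(50):
    grid = step(grid)

peak = max(max(row) for row in grid)    # hotspot smoothed well below 95 C
```

The hybrid algorithm in the paper would additionally search over functional-unit placements and microchannel positions; this sketch only shows how a candidate placement's thermal profile could be evaluated.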


Subjects
Algorithms , Electrical Equipment and Supplies , Models, Genetic , Nanotechnology/methods , Biomimetics , Equipment Design , Thermodynamics
4.
J Biomed Inform ; 62: 136-47, 2016 08.
Article in English | MEDLINE | ID: mdl-27260782

ABSTRACT

The prediction of symptomatic crises in chronic diseases makes it possible to act before the symptoms occur, for example by taking drugs to avoid the symptoms or by activating medical alarms. The prediction horizon is in this case an important parameter, as it must accommodate the pharmacokinetics of the medication or the response time of medical services. This paper presents a study of the prediction limits of a chronic disease with symptomatic crises: migraine. For that purpose, this work develops a methodology to build predictive migraine models and to improve these predictions beyond the limits of the initial models. The maximum prediction horizon is analyzed, and its dependency on the selected features is studied. A strategy for model selection is proposed to tackle the trade-off between conservative but robust predictive models and less accurate predictions with longer horizons. The obtained results show a prediction horizon close to 40 min, which is in the time range of the drug pharmacokinetics. Experiments have been performed in a realistic scenario where input data were acquired in an ambulatory clinical study through the deployment of a non-intrusive wireless body sensor network. Our results provide an effective methodology for selecting the future horizon in the development of prediction algorithms for diseases with symptomatic crises.
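The horizon-selection trade-off described above can be sketched very simply: evaluate the (hypothetical) validation error of the predictive model at increasing horizons and keep the longest horizon whose error stays under a tolerance. The error figures and the tolerance below are invented for illustration.

```python
# Hypothetical validation error of a crisis-prediction model, keyed by
# prediction horizon in minutes. Longer horizons are less accurate.
errors = {10: 0.05, 20: 0.08, 30: 0.12, 40: 0.19, 50: 0.34, 60: 0.55}

def max_horizon(errors, tol):
    # Longest horizon whose error is still acceptable, or None if none is.
    ok = [h for h, e in errors.items() if e <= tol]
    return max(ok) if ok else None

horizon = max_horizon(errors, tol=0.20)
```

With the invented numbers the selected horizon is 40 min, matching the range the abstract reports as compatible with drug pharmacokinetics.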


Subjects
Algorithms , Chronic Disease , Computer Simulation , Forecasting , Humans , Symptom Assessment
5.
Sensors (Basel) ; 15(7): 15419-42, 2015 Jun 30.
Article in English | MEDLINE | ID: mdl-26134103

ABSTRACT

Migraine is one of the most widespread neurological disorders, and its medical treatment represents a high percentage of the costs of health systems. In some patients, characteristic symptoms that precede the headache appear. However, these symptoms are nonspecific, and their prediction horizon is unknown and highly variable; hence, they are almost useless for prediction and cannot be used to advance the intake of drugs early enough to be effective and neutralize the pain. To solve this problem, this paper sets up a realistic monitoring scenario where hemodynamic variables from real patients are monitored in ambulatory conditions with a wireless body sensor network (WBSN). The acquired data are used to evaluate the predictive capabilities of several modeling approaches and their robustness against noise and sensor failures. The obtained results encourage the development of per-patient models based on state-space models (N4SID) that are capable of providing average forecast windows of 47 min with a low rate of false positives.
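As a deliberately simplified stand-in for the subspace (N4SID) state-space models the abstract names, the sketch below fits a first-order autoregressive model to a toy surrogate of a monitored hemodynamic signal by ordinary least squares, then rolls the prediction forward. The synthetic sinusoidal signal and the AR(1) choice are assumptions for illustration only.

```python
import math

# Toy surrogate for a slowly oscillating hemodynamic signal.
signal = [50 + 10 * math.sin(0.1 * t) for t in range(200)]

def fit_ar1(y):
    # Ordinary least squares for y[t+1] = a * y[t] + b.
    x, t = y[:-1], y[1:]
    mx, mt = sum(x) / len(x), sum(t) / len(t)
    a = (sum((xi - mx) * (ti - mt) for xi, ti in zip(x, t)) /
         sum((xi - mx) ** 2 for xi in x))
    b = mt - a * mx
    return a, b

a, b = fit_ar1(signal)

def forecast(y0, steps):
    # Iterate the fitted model forward to obtain a multi-step prediction.
    y = y0
    for _ in range(steps):
        y = a * y + b
    return y

pred = forecast(signal[-1], 10)
```

N4SID would instead identify a multi-dimensional latent state from the full multivariate WBSN record, which is what makes the 47-min average forecast window plausible; an AR(1) only captures one lag of one signal.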


Subjects
Migraine Disorders/diagnosis , Models, Statistical , Monitoring, Ambulatory/methods , Remote Sensing Technology/methods , Algorithms , Electrocardiography, Ambulatory , Equipment Design , Female , Hemodynamics , Humans , Migraine Disorders/physiopathology , Reproducibility of Results , Skin Temperature
6.
J Biomed Inform ; 48: 183-92, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24407050

ABSTRACT

Chronic patients must rigorously control diverse factors in their lives: diet, physical activity, medical tests, and blood glucose levels are some of them. This is a hard task, because some of these controls are performed very often; for instance, some diabetics measure their glucose levels several times a day, and patients with chronic renal disease, a progressive loss of renal function, should strictly control their blood pressure and diet. In order to facilitate this task for both the patient and the physician, we have developed a web application for the control of chronic diseases, which we have specialized for diabetes. This system, called glUCModel, improves the communication and interaction between patients and doctors and, ultimately, the quality of life of the former. Through a web application, patients can upload their personal and medical data, which are stored in a centralized database; doctors can then consult this information and have better control over patient records. glUCModel also presents three novelties in disease management: a recommender system, an e-learning course, and a module for the automatic generation of glucose-level models. The recommender system uses case-based reasoning to provide automatic recommendations to the patient, based on the recorded data and the physician's preferences, to improve their habits and knowledge about the disease. The e-learning course provides patients with a space to consult information about the illness and to assess their own knowledge of the disease. Blood glucose levels are modeled by means of evolutionary computation, making it possible to predict glucose levels from the particular features of each patient. glUCModel was developed as a system where a web layer allows users to access it from any device connected to the Internet, such as desktop computers, tablets, or mobile phones.
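The abstract does not specify which evolutionary algorithm glUCModel uses for glucose-level modeling, so the sketch below shows only the general idea with a (1+1) evolution strategy: mutate a model parameter, keep the candidate if it fits the readings better. The exponential return-to-baseline model, its parameters, and the data are all invented for illustration.

```python
import math
import random

random.seed(1)

# Hypothetical glucose trace (mg/dL): exponential return to baseline after
# a meal peak. K_TRUE is the "unknown" decay rate the optimizer recovers.
BASELINE, PEAK, K_TRUE = 90.0, 160.0, 0.05

def model(k, t):
    return BASELINE + (PEAK - BASELINE) * math.exp(-k * t)

data = [(t, model(K_TRUE, t)) for t in range(0, 120, 10)]

def error(k):
    # Sum of squared residuals between the candidate model and the readings.
    return sum((model(k, t) - g) ** 2 for t, g in data)

# (1+1) evolution strategy: Gaussian mutation, greedy survivor selection.
k, best = 0.5, error(0.5)
for _ in range(300):
    cand = abs(k + random.gauss(0.0, 0.05))
    e = error(cand)
    if e < best:
        k, best = cand, e
```

A per-patient model as described in the paper would evolve richer structures (e.g. several parameters, or whole expressions) against that patient's recorded data rather than a single decay rate.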


Subjects
Blood Glucose/analysis , Diabetes Mellitus/blood , Diabetes Mellitus/therapy , Algorithms , Chronic Disease , Computer Simulation , Databases, Factual , Humans , Medical Informatics , Models, Theoretical , Monitoring, Physiologic/methods , Reproducibility of Results , Self Care , Software
7.
Sensors (Basel) ; 12(8): 10659-77, 2012.
Article in English | MEDLINE | ID: mdl-23112621

ABSTRACT

Ubiquitous sensor network deployments, such as those found in smart cities and ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the use of data centers. Research has paid much attention to the energy consumption of sensor nodes in WSN infrastructures; however, it is the supercomputing facilities that present the higher economic and environmental impact, due to their very high power consumption, and this problem has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These allocation policies, although non-optimal, reduce the energy consumed by the whole infrastructure and the total execution time.
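A minimal greedy sketch of application-aware workload assignment in the spirit of the abstract: each task is placed on the cheapest node (energy per unit of work) that still has capacity, so light tasks drain onto idle low-power WSN nodes before the data center is used. Node names, capacities, and energy figures are illustrative assumptions.

```python
nodes = [  # (name, capacity in work units, energy cost per work unit)
    ("wsn_idle_a", 2, 1.0),
    ("wsn_idle_b", 2, 1.2),
    ("edge_gw",    5, 3.0),
    ("datacenter", 100, 8.0),
]

def assign(tasks):
    # Greedy heuristic: smallest tasks first, each on the cheapest node
    # with enough remaining capacity. Not optimal, but fast and simple.
    free = {name: cap for name, cap, _ in nodes}
    cost = {name: c for name, _, c in nodes}
    placement, energy = {}, 0.0
    for task, demand in sorted(tasks.items(), key=lambda kv: kv[1]):
        name = min((n for n in free if free[n] >= demand),
                   key=lambda n: cost[n])
        free[name] -= demand
        placement[task] = name
        energy += demand * cost[name]
    return placement, energy

placement, energy = assign({"t1": 1, "t2": 1, "t3": 2, "t4": 6})
```

Here the three light tasks land on the low-power WSN nodes and only the heavy task reaches the data center, which is exactly the redistribution effect the paper exploits to cut total energy.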


Subjects
Artificial Intelligence , Computer Communication Networks , Conservation of Energy Resources , Algorithms , Cities