1.
Sensors (Basel) ; 23(14)2023 Jul 16.
Article in English | MEDLINE | ID: mdl-37514736

ABSTRACT

Continuous monitoring of patients involves collecting and analyzing sensory data from a multitude of sources. To reduce communication overhead, ensure data privacy and security, limit data loss, and use resources efficiently, processing and analytics are moved close to where the data are located (e.g., the edge). However, data quality (DQ) can be degraded by imprecise or malfunctioning sensors, dynamic changes in the environment, transmission failures, or delays. It is therefore crucial to monitor data quality and detect problems as early as possible, so that they do not mislead clinical judgments and lead to the wrong course of action. In this article, a novel approach called federated data quality profiling (FDQP) is proposed to assess the quality of data at the edge. FDQP is inspired by federated learning (FL) and serves as a condensed document, or guide, for node data quality assurance. A formal FDQP model is developed to capture the quality dimensions specified in the data quality profile (DQP). The approach uses federated feature selection to improve classifier precision, ranking features by criteria such as feature value, outlier percentage, and missing-data percentage. To evaluate the FDQP model, extensive experiments were run on a fetal dataset split across edge nodes, under a carefully chosen set of scenarios. The results demonstrate that FDQP improves DQ and, in turn, the accuracy of machine learning models built on the federated patient similarity network (FPSN).
Our profiling algorithm exchanges lightweight profiles instead of processing full data at the edge, achieving high data quality while improving efficiency. Overall, FDQP is an effective method for assessing data quality in edge computing environments, and we believe the approach can be applied to scenarios beyond patient monitoring.
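The feature-ranking step described above can be sketched as follows. This is a minimal illustration, not the paper's exact profiling algorithm: the function names, the 3-sigma outlier rule, and the equal weighting of missing and outlier percentages are all assumptions made for the example.

```python
# Sketch: rank features by simple data-quality criteria (missing-value
# percentage and outlier percentage). Lower combined share ranks first.

def missing_pct(values):
    """Fraction of None entries in a feature column."""
    return sum(v is None for v in values) / len(values)

def outlier_pct(values, k=3.0):
    """Fraction of values more than k standard deviations from the mean."""
    nums = [v for v in values if v is not None]
    if len(nums) < 2:
        return 0.0
    mean = sum(nums) / len(nums)
    std = (sum((x - mean) ** 2 for x in nums) / len(nums)) ** 0.5
    if std == 0:
        return 0.0
    return sum(abs(x - mean) > k * std for x in nums) / len(nums)

def rank_features(columns):
    """Rank features: lower combined missing+outlier share ranks first."""
    scores = {name: missing_pct(col) + outlier_pct(col)
              for name, col in columns.items()}
    return sorted(scores, key=scores.get)
```

In a federated setting, each edge node would compute these per-feature statistics locally and exchange only the resulting profile, not the raw data.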


Subjects
Algorithms , Data Accuracy , Humans , Awareness , Communication , Power, Psychological
2.
J Pers Med ; 12(5)2022 May 10.
Article in English | MEDLINE | ID: mdl-35629190

ABSTRACT

Precision medicine can be defined as the comparison of a new patient with existing patients that have similar characteristics; this comparison is referred to as patient similarity. Several deep learning models have been used to build and apply patient similarity networks (PSNs). However, the challenges related to data heterogeneity and dimensionality make it difficult to use a single model to reduce data dimensionality and capture the features of diverse data types. In this paper, we propose a multi-model PSN that considers heterogeneous static and dynamic data. The combination of deep learning models and PSN allows ample clinical evidence and information extraction against which similar patients can be compared. We use bidirectional encoder representations from transformers (BERT) to analyze the contextual data and generate word embeddings, from which semantic features are captured using a convolutional neural network (CNN). Dynamic data are analyzed using a long short-term memory (LSTM)-based autoencoder, which reduces data dimensionality and preserves the temporal features of the data. We propose a data fusion approach combining temporal and clinical narrative data to estimate patient similarity. Our experiments showed that the model achieves higher classification accuracy in determining various patient health outcomes than traditional classification algorithms.
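The fusion and network-construction step can be illustrated schematically. This is a deliberately simplified sketch: the deep models (BERT/CNN, LSTM autoencoder) are replaced by precomputed embedding vectors, and the weighting scheme, cosine metric, and threshold are assumptions for the example, not the paper's method.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def fuse(static_emb, temporal_emb, w=0.5):
    """Late fusion: weighted concatenation of a static (narrative) embedding
    and a temporal (time-series) embedding for one patient."""
    return [w * x for x in static_emb] + [(1 - w) * x for x in temporal_emb]

def similarity_network(patients, threshold=0.8):
    """Build PSN edges between patient pairs whose fused embeddings
    have cosine similarity at or above the threshold."""
    fused = {pid: fuse(s, t) for pid, (s, t) in patients.items()}
    ids = sorted(fused)
    return {(a, b): cosine(fused[a], fused[b])
            for i, a in enumerate(ids) for b in ids[i + 1:]
            if cosine(fused[a], fused[b]) >= threshold}
```

In the paper's pipeline the two input vectors would come from the CNN over BERT embeddings and the LSTM autoencoder, respectively; here they are stand-in lists of floats.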

3.
IEEE Access ; 9: 74044-74067, 2021.
Article in English | MEDLINE | ID: mdl-34812394

ABSTRACT

Cardiovascular disease (CVD) is the leading cause of death globally and is increasing at an alarming rate, according to the American Heart Association's Heart Attack and Stroke Statistics-2021. This increase has been further exacerbated by the current coronavirus (COVID-19) pandemic, increasing the pressure on existing healthcare resources. Smart and Connected Health (SCH) is a viable solution to these prevalent healthcare challenges. It can reshape the course of healthcare to be more strategic, preventive, and custom-designed, making it more effective with value-added services. This research classifies state-of-the-art SCH technologies via a thorough literature review and analysis, comprehensively defining SCH features and identifying the enabling-technology challenges in SCH adoption. We also propose an architectural model that captures the technological aspect of the SCH solution, its environment, and its primary stakeholders, serving as a reference model for SCH acceptance and implementation. We present a COVID-19 case study illustrating how countries have tackled the pandemic differently by leveraging different SCH technologies, such as big data, cloud computing, the Internet of Things, artificial intelligence, robotics, blockchain, and mobile applications. In combating the pandemic, SCH has been used efficiently at different stages, including disease diagnosis, virus detection, individual monitoring, tracking, controlling, and resource allocation. Furthermore, this review highlights the challenges to SCH acceptance, as well as potential research directions for better patient-centric healthcare.

4.
Comput Methods Programs Biomed ; 166: 137-154, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30415713

ABSTRACT

BACKGROUND AND OBJECTIVES: Mobile and ubiquitous devices are everywhere, generating an exorbitant amount of data. New generations of healthcare systems use mobile devices to continuously collect large amounts of different types of data from patients with chronic diseases. The challenge with such Mobile Big Data is how to meet the growing performance demands of the mobile resources handling these tasks while minimizing their energy consumption. METHODS: This research proposes a scalable architecture for processing Mobile Big Data. The architecture is built around three new algorithms for the effective use of resources in mobile data processing and analytics: mobile resources optimization, mobile analytics customization, and mobile offloading. The mobile resources optimization algorithm monitors resources and automatically switches off unused network connections and application services whenever resources are limited. The mobile analytics customization algorithm attempts to save energy by customizing the analytics processes through data-aware schemes. Finally, the mobile offloading algorithm uses heuristics to decide intelligently whether to process data locally or delegate it to a cloud back-end server. RESULTS: The three algorithms were tested using Android-based mobile devices on real electroencephalography (EEG) data streams retrieved from sensors and an online data bank. Results show that the three combined algorithms are effective in optimizing the resources of mobile devices when handling, processing, and analyzing EEG data. CONCLUSION: We developed an energy-efficient model for Mobile Big Data that addresses key limitations in mobile device processing and analytics, reducing execution time and conserving limited battery resources.
This was supported by the development of three new algorithms for effective resource use, energy saving, parallel processing, and analytics customization.
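The offloading decision described above can be sketched as a simple cost comparison. The inputs, the linear time model, and the battery cutoff are illustrative assumptions, not the paper's actual heuristics:

```python
def should_offload(input_bytes, local_ops_per_byte, local_speed_ops,
                   remote_speed_ops, bandwidth_bytes_per_s, battery_frac,
                   battery_cutoff=0.2):
    """Return True if delegating the task to the cloud is estimated to be
    faster than local processing, or if the battery is below a cutoff
    (in which case offloading is preferred to save energy)."""
    # Estimated time to process everything on the device.
    local_time = input_bytes * local_ops_per_byte / local_speed_ops
    # Estimated time to ship the data plus process it on the server.
    remote_time = (input_bytes / bandwidth_bytes_per_s
                   + input_bytes * local_ops_per_byte / remote_speed_ops)
    if battery_frac < battery_cutoff:
        return True
    return remote_time < local_time
```

For compute-heavy jobs on a slow device the transfer cost is amortized and offloading wins; for small jobs the network round trip dominates and local processing wins.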


Subjects
Big Data , Medical Informatics/methods , Telemedicine/methods , Algorithms , Cloud Computing , Computers , Electroencephalography , Electronic Data Processing , Humans , Medical Informatics/standards , Reproducibility of Results , Software , Telemedicine/standards
5.
Comput Methods Programs Biomed ; 149: 79-94, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28802332

ABSTRACT

BACKGROUND AND OBJECTIVES: Recent advances in miniature biomedical sensors, mobile smartphones, wireless communications, and distributed computing technologies provide promising techniques for developing mobile health systems. Such systems are capable of reliably monitoring epileptic seizures, which are classified as chronic conditions. Three challenging issues arise in this context regarding the transformation, compression, storage, and visualization of the big data that result from continuous recording of epileptic seizures using mobile devices. METHODS: In this paper, we address these challenges by developing three new algorithms to process and analyze big electroencephalography data rigorously and efficiently. The first algorithm transforms the standard European Data Format (EDF) into standard JavaScript Object Notation (JSON) and compresses the transformed JSON data, reducing file size and transfer time and increasing the effective network transfer rate. The second algorithm collects and stores the compressed files generated by the transformation and compression algorithm; collection is performed on the fly after files are decompressed. The third algorithm provides relevant real-time interaction with the signal data by prospective users, featuring visualization of single or multiple signal channels on a smartphone and querying of data segments. RESULTS: We tested and evaluated the effectiveness of our approach through a software architecture model implementing a mobile health system to monitor epileptic seizures. The experimental findings from 45 experiments are promising and efficiently satisfy the approach's objectives at the price of linear scaling.
Moreover, the size of compressed JSON files and transfer times are reduced by 10% and 20%, respectively, while the average total time is reduced by a notable 67% across all experiments. CONCLUSIONS: Our approach yields efficient algorithms in terms of processing time, memory usage, and energy consumption while maintaining high scalability. It efficiently supports data partitioning and parallelism on the MapReduce platform, which can help in monitoring and automatic detection of epileptic seizures.
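The transform-and-compress pipeline can be sketched with the standard library. This is a minimal illustration of the idea (signal channels serialized to compact JSON, then gzip-compressed); the real system parses binary EDF headers and records, which is omitted here:

```python
import gzip
import json

def signals_to_json_bytes(channels):
    """Serialize a dict of channel-name -> sample list into compact JSON."""
    return json.dumps(channels, separators=(",", ":")).encode("utf-8")

def compress(json_bytes):
    """Gzip-compress the serialized signal data for transfer."""
    return gzip.compress(json_bytes)

def compression_ratio(channels):
    """Compressed size as a fraction of the raw JSON size."""
    raw = signals_to_json_bytes(channels)
    return len(compress(raw)) / len(raw)
```

Because EEG streams are highly repetitive, gzip typically shrinks the JSON substantially, which is what makes on-the-fly transfer from a mobile device practical.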


Subjects
Electroencephalography , Electronic Data Processing , Telemedicine , Algorithms , Data Compression , Humans , Information Storage and Retrieval , Prospective Studies
6.
IEEE J Biomed Health Inform ; 21(2): 349-360, 2017 03.
Article in English | MEDLINE | ID: mdl-26863682

ABSTRACT

Monitoring heart disease often requires frequent measurements of electrocardiogram (ECG) signals at different times of day and in different situations (e.g., traveling or exercising). This can only be implemented using mobile devices, to cope with the mobility of patients under monitoring and thus support continuous monitoring practices. However, these devices are energy-constrained, have limited computing resources (e.g., CPU speed and memory), and may lose network connectivity, which makes maintaining the continuity of a monitoring episode very challenging. In this paper, we propose a mobile monitoring solution that copes with these challenges by trading off, on the fly, resource availability, battery level, and network intermittency. We first divide the whole process into several subtasks such that each subtask can be executed sequentially on the server or on the mobile device, or in parallel on both. We then develop a mathematical model that considers all the constraints and uses dynamic programming to obtain the best execution path (i.e., where each substep should run). The solution guarantees an optimal execution time while considering device battery availability, execution and transmission time, and network availability. We conducted a series of experiments to evaluate our approach on key monitoring tasks, from preprocessing through classification and prediction. The results showed that our approach gives the lowest running time for any combination of factors, including processing speed, input size, and network bandwidth. Compared with several greedy but non-optimal solutions, our approach was at least 10 times faster and consumed 90% less energy.
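The dynamic-programming idea can be sketched for the sequential case. This is a simplified model under stated assumptions: each subtask has a known local and remote execution time, moving the intermediate result between devices before a step costs a known transfer time, the input starts on the mobile, and battery and parallel execution are ignored; none of the cost values are from the paper.

```python
def best_execution_path(local, remote, transfer):
    """DP sketch: subtask i runs on the mobile (cost local[i]) or the
    server (cost remote[i]); moving the intermediate result between the
    two devices before step i adds transfer[i]. Input starts on the
    mobile. Returns (minimum total time, placement list of 'm'/'s')."""
    n = len(local)
    # Best time through step 0, ending on the mobile ('m') or server ('s');
    # running step 0 remotely requires shipping the raw input first.
    cost = {"m": local[0], "s": transfer[0] + remote[0]}
    back = [{"m": None, "s": None}]
    for i in range(1, n):
        new_cost, new_back = {}, {}
        for dst, step in (("m", local[i]), ("s", remote[i])):
            best_src = min("ms", key=lambda src:
                           cost[src] + (transfer[i] if src != dst else 0))
            move = transfer[i] if best_src != dst else 0
            new_cost[dst] = cost[best_src] + move + step
            new_back[dst] = best_src
        cost, back = new_cost, back + [new_back]
    # Pick the cheaper end state and walk the back-pointers to recover
    # where each subtask ran.
    end = min("ms", key=cost.get)
    path = [end]
    for i in range(n - 1, 0, -1):
        path.append(back[i][path[-1]])
    return cost[end], path[::-1]
```

For example, a pipeline whose middle step is expensive locally but cheap remotely will route that step through the server whenever the two transfers cost less than the local computation saved.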


Subjects
Data Mining/methods , Signal Processing, Computer-Assisted , Telemedicine/methods , Algorithms , Electrocardiography, Ambulatory/methods , Humans , Machine Learning