Results 1 - 8 of 8
1.
Heliyon ; 9(9): e19195, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37681141

ABSTRACT

The COVID-19 pandemic has had far-reaching consequences globally, including a significant loss of lives, escalating unemployment, economic instability, deteriorating mental well-being, social conflict, and even political discord. Vaccination, recognized as a pivotal measure in mitigating the adverse effects of COVID-19, has evoked a diverse range of sentiments worldwide. In particular, many users on social media platforms have expressed concerns about vaccine availability and potential side effects. It is therefore imperative for governmental authorities and senior health policy strategists to understand the public's perspectives on vaccine mandates in order to implement their vaccination initiatives effectively. Despite the critical importance of understanding the factors that drive COVID-19 vaccine sentiment, the existing literature offers limited research on the subject. This paper presents an innovative methodology that harnesses Twitter data to extract sentiment pertaining to COVID-19 vaccination using Artificial Intelligence techniques such as sentiment analysis, entity detection, linear regression, and logistic regression. The proposed methodology was applied and tested on live Twitter feeds containing COVID-19 vaccine-related tweets spanning February 14, 2021, to April 2, 2023. Notably, the approach processed tweets in 45 languages originating from over 100 countries, enabling users to select from an extensive scenario space of approximately 3.55 × 10^249 possible scenarios. By selecting specific scenarios, the methodology identified numerous determinants of vaccine sentiment across iOS, Android, and Windows platforms. Compared with previous studies in the literature, the presented solution emerges as the most robust in detecting the fundamental drivers of vaccine sentiment, and it tracks vaccination sentiment over a substantially longer period, exceeding 24 months.
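The abstract names logistic regression as one of its sentiment-analysis building blocks. The sketch below shows the general idea on a toy bag-of-words model; the vocabulary, example tweets, and labels are all invented for illustration, and the paper's actual pipeline (entity detection, multilingual handling) is far richer.

```python
import math

# Hypothetical vocabulary of sentiment-bearing words
VOCAB = ["safe", "effective", "worried", "side", "scared"]

def featurize(tweet):
    """Bag-of-words counts over the toy vocabulary."""
    words = tweet.lower().split()
    return [words.count(w) for w in VOCAB]

def train(samples, labels, lr=0.5, epochs=200):
    """Logistic regression fitted by stochastic gradient descent on the log-loss."""
    w, b = [0.0] * len(VOCAB), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = p - y                       # gradient of the log-loss w.r.t. z
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, tweet):
    z = sum(wi * xi for wi, xi in zip(w, featurize(tweet))) + b
    return 1 if z > 0 else 0                  # 1 = positive vaccine sentiment

tweets = ["vaccine is safe and effective", "worried about side effects",
          "safe vaccine", "scared of side effects"]
labels = [1, 0, 1, 0]                         # invented training labels
w, b = train([featurize(t) for t in tweets], labels)
print(predict(w, b, "the vaccine seems safe"))   # 1
```

On this separable toy set the learned weights become positive for "safe"/"effective" and negative for the worry-related words, which is the mechanism by which such a model surfaces sentiment drivers.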

2.
J Thorac Cardiovasc Surg ; 148(5): 1850-1855.e2, 2014 Nov.
Article in English | MEDLINE | ID: mdl-24655903

ABSTRACT

OBJECTIVES: To update the Australian System for Cardiac Operative Risk Evaluation (AusSCORE) model for preoperative estimation of 30-day mortality risk after isolated coronary artery bypass grafting in the Australian population. METHODS: Data were collected by the Australian and New Zealand Society of Cardiac and Thoracic Surgeons registry from 2001 to 2011 in 25 hospitals. A total of 31,250 patients underwent isolated coronary artery bypass grafting, and the outcome was 30-day mortality. A total of 2154 (6.9%) patients had one or more missing values; these were imputed under a missing-completely-at-random assumption, and logistic regression with a generalized estimating equation was used to address within-hospital variance. Bootstrapping methods were used to construct and validate the updated model (AusSCORE II). The model was also validated on an out-of-sample cohort of 4700 patients who underwent bypass surgery in 2012. RESULTS: The average age of the patients was 65.6±12.9 years and 78.6% were male. Thirteen variables were selected in the updated model. The bootstrap discrimination and calibration of the AusSCORE II were very good (receiver operating characteristic [ROC], 82.0%; calibration slope, 0.987). The overall observed/AusSCORE II-predicted mortality was 1.63%, compared with the original AusSCORE-predicted mortality of 1.01%. Validation of the AusSCORE II on the out-of-sample data also showed high performance (ROC, 84.5%; Hosmer-Lemeshow P value, .7654). CONCLUSIONS: The AusSCORE II model provides improved prediction of 30-day mortality and successfully stratifies patient risk. The model will be useful for improving preoperative consultation regarding risk stratification in terms of 30-day mortality.
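Two of the validation steps named above, the ROC statistic and bootstrap resampling, can be sketched in a few lines. The labels and scores below are toy data, not registry data, and this is only the rank (Mann-Whitney) formulation of the ROC area, not the paper's full GEE-based model fit.

```python
import random

def roc_auc(labels, scores):
    # Rank (Mann-Whitney) formulation: fraction of (positive, negative)
    # pairs ranked correctly, with ties counted as half.
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auc(labels, scores, n_boot=200, seed=0):
    """Average AUC over bootstrap resamples, to gauge its stability."""
    rng = random.Random(seed)
    n = len(labels)
    aucs = []
    for _ in range(n_boot):
        sample = [rng.randrange(n) for _ in range(n)]   # resample with replacement
        ls = [labels[i] for i in sample]
        ss = [scores[i] for i in sample]
        if 0 < sum(ls) < len(ls):                       # both classes must be present
            aucs.append(roc_auc(ls, ss))
    return sum(aucs) / len(aucs)

labels = [0, 0, 0, 1, 0, 1, 1, 0, 1, 1]    # toy 30-day mortality outcomes
scores = [0.1, 0.2, 0.3, 0.35, 0.4, 0.6, 0.7, 0.3, 0.8, 0.9]  # toy model risks
print(round(roc_auc(labels, scores), 2))   # 0.96
```

A reported ROC of 82.0% corresponds to a `roc_auc` value of 0.82 under this formulation.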


Subjects
Coronary Artery Bypass/mortality, Decision Support Techniques, Aged, Aged 80 and over, Australia/epidemiology, Chi-Square Distribution, Coronary Artery Bypass/adverse effects, Female, Humans, Logistic Models, Male, Middle Aged, New Zealand/epidemiology, Odds Ratio, Patient Selection, Predictive Value of Tests, Registries, Reproducibility of Results, Risk Assessment, Risk Factors, Time Factors, Treatment Outcome
3.
J Med Syst ; 35(6): 1349-58, 2011 Dec.
Article in English | MEDLINE | ID: mdl-20703779

ABSTRACT

Compression is often required for wireless cardiovascular monitoring because of the large size of electrocardiography (ECG) signals and the limited bandwidth of the Internet. However, with present ECG-based biometric techniques, compressed ECG must be decompressed before human identification can be performed. This additional decompression step introduces a significant processing delay for the identification task, which becomes an obvious burden if a hospital must handle trillions of compressed ECG packets per hour. Even if a hospital can afford expensive infrastructure to tame such processing, identification preceded by decompression is prohibitive for small intermediate nodes in a multihop network. In this paper, we report a technique by which a person can be identified directly from his or her compressed ECG. The technique completely obviates the decompression step and therefore makes biometric identification far less demanding for the smaller nodes in a multihop network. The biometric template created by this new technique is smaller than existing ECG-based biometric templates as well as other forms of biometrics such as face, fingerprint, and retina (up to 8302 times smaller than a face template and 9 times smaller than an existing ECG-based biometric template). The smaller template substantially reduces the one-to-many matching time for biometric recognition, resulting in a faster biometric authentication mechanism.
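The abstract's speed argument rests on one-to-many matching of small templates. The sketch below only illustrates that matching step with hypothetical fixed-length feature vectors and Euclidean distance; the paper's actual templates are derived from the compressed ECG itself, which is not reproduced here.

```python
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, enrolled, threshold=1.0):
    """One-to-many matching: return the closest enrolled identity,
    or None if no template is within the (illustrative) threshold."""
    best_id, best_d = None, float("inf")
    for person_id, template in enrolled.items():
        d = distance(probe, template)
        if d < best_d:
            best_id, best_d = person_id, d
    return best_id if best_d <= threshold else None

# Hypothetical enrolled templates (small vectors, hence fast matching)
enrolled = {
    "alice": [0.12, 0.80, 0.33, 0.05],
    "bob":   [0.90, 0.10, 0.60, 0.40],
}
print(identify([0.11, 0.82, 0.30, 0.06], enrolled))   # alice
```

Since the cost of each comparison grows with template length, shrinking the template by a factor of 9 shrinks the matching loop's work roughly proportionally, which is the intuition behind the paper's speed claim.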


Subjects
Cardiovascular Diseases/diagnosis, Data Compression/methods, Electrocardiography/instrumentation, Patient Identification Systems/methods, Algorithms, Biometry, Humans, Physiologic Monitoring, Time Factors
4.
IEEE Trans Inf Technol Biomed ; 15(1): 33-9, 2011 Jan.
Article in English | MEDLINE | ID: mdl-21097383

ABSTRACT

Use of compressed ECG for fast and efficient telecardiology is crucial, as ECG signals are enormously large. However, conventional ECG diagnosis algorithms require compressed ECG packets to be decompressed before diagnosis can be performed. This added decompression step for every ECG packet introduces unnecessary delay, which is undesirable for cardiovascular disease (CVD) patients. In this paper, we demonstrate an innovative technique that performs real-time classification of CVD. With this real-time classification, emergency personnel or the hospital can automatically be notified via SMS/MMS/e-mail when a life-threatening cardiac abnormality of the CVD-affected patient is detected. Our proposed system initially uses data mining techniques such as attribute selection (i.e., selecting only a few features from the compressed ECG) and expectation-maximization (EM)-based clustering. These data mining techniques, running on a hospital server, generate a set of constraints representing each abnormality. The patient's mobile phone then receives this set of constraints and employs a rule-based system that can identify each abnormal beat in real time. Our experiments on 50 MIT-BIH ECG entries reveal that the proposed approach can detect cardiac abnormalities (e.g., ventricular flutter/fibrillation, premature ventricular contraction, atrial fibrillation) with 97% accuracy on average. This innovative data mining technique on compressed ECG packets enables faster identification of cardiac abnormality directly from the compressed ECG, helping to build an efficient telecardiology diagnosis system.
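The phone-side rule-based step described above can be sketched as follows. The constraints here are hand-written ranges on hypothetical beat features; in the paper they are mined on the hospital server from compressed ECG via attribute selection and EM clustering.

```python
# Illustrative per-abnormality constraints: (label, {feature: (lo, hi)}).
# Feature names and ranges are invented for the sketch, not from the paper.
RULES = [
    ("ventricular_flutter", {"rate": (240, 350), "qrs_width": (0.12, 0.30)}),
    ("atrial_fibrillation", {"rate": (100, 175), "rr_variability": (0.15, 1.0)}),
    ("normal",              {"rate": (60, 100),  "qrs_width": (0.06, 0.12)}),
]

def classify_beat(features):
    """Label a beat with the first rule whose constraints it satisfies.
    A missing feature (NaN) fails every range check, so that rule is skipped."""
    for label, constraints in RULES:
        if all(lo <= features.get(name, float("nan")) <= hi
               for name, (lo, hi) in constraints.items()):
            return label
    return "unclassified"

print(classify_beat({"rate": 72, "qrs_width": 0.08}))       # normal
print(classify_beat({"rate": 130, "rr_variability": 0.3}))  # atrial_fibrillation
```

Because each beat is checked against a handful of interval constraints, the per-beat cost is constant, which is what makes real-time classification feasible on a phone.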


Subjects
Cardiac Arrhythmias/diagnosis, Data Mining/methods, Electrocardiography/methods, Computer-Assisted Signal Processing, Telemetry/methods, Algorithms, Cardiac Arrhythmias/physiopathology, Cluster Analysis, Computer Communication Networks, Handheld Computers, Factual Databases, Hospital Information Systems, Humans, Theoretical Models, Physiologic Monitoring, Automated Pattern Recognition
5.
Article in English | MEDLINE | ID: mdl-21096293

ABSTRACT

To counter the threat of cardiovascular disease (CVD) related deaths, the use of mobile-phone-based computational platforms, body sensors, and wireless communications is proliferating. Because mobile phones have limited computational resources, existing PC-based complex CVD detection algorithms are often unsuitable for wireless telecardiology applications. Moreover, if existing electrocardiography (ECG) based CVD detection algorithms are adopted for mobile telecardiology, processing delays arise from their computational complexity. For a CVD-affected patient, however, even a few seconds of delay can be fatal, since cardiovascular cell damage is a totally irrecoverable process. This paper proposes a fast and efficient mechanism for CVD detection from the ECG signal. Unlike existing ECG-based CVD diagnosis systems that detect anomalies from hundreds of sample points, the proposed mechanism identifies cardiac abnormality from only 5 sample points. Accordingly, in our experiments the proposed mechanism is up to 3 times faster than existing techniques. Because of its lower computational burden, the proposed mechanism is ideal for wireless telecardiology applications running on mobile phones.
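The abstract does not say which 5 sample points are used. Purely as a sketch of the idea, the check below assumes they are the P, Q, R, S, and T fiducial amplitudes of one beat and flags values outside hypothetical normal bands; the thresholds are illustrative, not clinical.

```python
def is_abnormal(samples):
    """samples = (P, Q, R, S, T) amplitudes in mV.
    All bands below are invented for illustration."""
    p, q, r, s, t = samples
    if not (0.5 <= r <= 2.5):   # R peak outside an assumed normal band
        return True
    if (r - s) > 3.0:           # exaggerated QRS excursion
        return True
    if t < 0:                   # inverted T wave
        return True
    return False

print(is_abnormal((0.1, -0.1, 1.2, -0.3, 0.3)))  # False: within all bands
print(is_abnormal((0.1, -0.2, 3.4, -0.4, 0.3)))  # True: R peak too high
```

Whatever the paper's actual rules, checking a fixed handful of samples takes constant time per beat, which is the source of the claimed speedup over methods that scan hundreds of samples.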


Subjects
Cardiovascular Diseases/diagnosis, Cell Phone/instrumentation, Electrocardiography/methods, Physiologic Monitoring/instrumentation, Physiologic Monitoring/methods, Telemedicine/methods, Cardiovascular Diseases/diagnostic imaging, Cardiovascular Diseases/physiopathology, Heart Rate/physiology, Humans, Time Factors, Ultrasonography, Wireless Technology
6.
J Med Syst ; 33(2): 121-32, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19397097

ABSTRACT

With cardiovascular disease the number one killer of the modern era, the electrocardiogram (ECG) is collected, stored, and transmitted in greater frequency than ever before. In reality, however, ECG is rarely transmitted and stored in a secured manner. Recent research shows that an eavesdropper can reveal a person's identity and cardiovascular condition from an intercepted ECG. Therefore, ECG data must be anonymized before transmission over the network and stored as such in medical repositories. To achieve this, this paper first presents a new ECG feature detection mechanism, which was compared against existing cross-correlation (CC) based template matching algorithms. Two types of CC methods were used for comparison. Whereas the CC-based approaches had 40% and 53% misclassification rates, the proposed detection algorithm produced no misclassifications. Second, a new ECG obfuscation method was designed and implemented on 15 subjects, using added noise corresponding to each of the ECG features. The obfuscated ECG can be freely distributed over the internet without the need for encryption, since the original features required to identify the patient's personal information remain concealed. Only authorized personnel possessing a secret key will be able to reconstruct the original ECG from the obfuscated ECG. The distributed obfuscated ECG would appear as a regular, unencrypted ECG. Therefore, traditional decryption techniques, including a powerful brute-force attack, are useless against this obfuscation.
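The obfuscation idea can be sketched as keyed noise: pseudorandom noise derived deterministically from a secret key is added to the signal, so only a key holder can regenerate and subtract the identical noise. This is a minimal illustration; the paper adds noise per detected ECG feature rather than uniformly, and `random.Random` is used here only as a stand-in noise source, not a cryptographic one.

```python
import random

def keyed_noise(key, n, amplitude=0.5):
    """Deterministic noise stream derived from the secret key."""
    rng = random.Random(key)
    return [rng.uniform(-amplitude, amplitude) for _ in range(n)]

def obfuscate(signal, key):
    noise = keyed_noise(key, len(signal))
    return [x + e for x, e in zip(signal, noise)]

def reconstruct(obfuscated, key):
    # Regenerate the identical noise from the key and subtract it.
    noise = keyed_noise(key, len(obfuscated))
    return [x - e for x, e in zip(obfuscated, noise)]

ecg = [0.0, 0.1, 1.2, -0.3, 0.2, 0.0]            # toy ECG samples
masked = obfuscate(ecg, key="secret-key")
restored = reconstruct(masked, key="secret-key")
print(all(abs(a - b) < 1e-12 for a, b in zip(ecg, restored)))  # True
```

Because the masked signal still looks like an ordinary (if noisy) ECG trace, nothing marks it as protected, which is the property the abstract relies on to defeat ciphertext-targeted attacks.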


Subjects
Computer Security/standards, Confidentiality, Electrocardiography/instrumentation, Electrocardiography/methods, Internet, Computer-Assisted Signal Processing, Algorithms, Computer-Assisted Diagnosis/methods, Humans, Automated Pattern Recognition, Reproducibility of Results
7.
J Med Syst ; 33(6): 475-83, 2009 Dec.
Article in English | MEDLINE | ID: mdl-20052899

ABSTRACT

With the advent of high-speed internet, bandwidth-consuming video conferencing applications will rapidly become attractive to e-patients seeking real-time video consultations with e-doctors. In a conventional system, a patient connects to a known server in a medical center of his or her choice. If the server (i.e., a server via which a medical consultant communicates with a patient) is busy, the patient must wait until it becomes free. Such a system is not efficient: many patients at medical centers with busy servers may either have to wait a long time or are simply turned away, and patients may also leave when they become impatient. Not only do patients suffer from server unavailability; medical service providers also incur revenue losses from lost patients. To counter these problems, we propose a distributed, cooperative Video Consultation on Demand (VCoD) system in which servers are located in many different medical centers, in neighbourhoods close to patient concentrations. In such a cooperative system, if patients find their nearby servers under heavy load, they are automatically directed to the least-loaded servers using an efficient server-selection method (also called anycasting). Simple numerical analysis shows that this not only maximizes revenue for medical service providers by reducing the number of lost patients but also improves average response time for e-patients.
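The server-selection (anycasting) step reduces to a small policy decision. The sketch below picks the least-loaded server and breaks ties by network distance; the server names, loads, and distances are hypothetical, and a real deployment would use measured load and latency.

```python
def select_server(servers):
    """servers: list of (name, load, distance).
    Choose the lowest load; among equals, the nearest."""
    return min(servers, key=lambda s: (s[1], s[2]))[0]

servers = [
    ("centre-A", 0.90, 10),   # nearby but heavily loaded
    ("centre-B", 0.30, 40),
    ("centre-C", 0.30, 25),   # same load as centre-B, but closer
]
print(select_server(servers))  # centre-C
```

Routing away from the busy nearby server (centre-A) is exactly the redirection the abstract describes: the patient is served sooner, and centre-A sheds load instead of turning patients away.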


Subjects
Health Services Accessibility, Information Storage and Retrieval/methods, Internet, Remote Consultation/methods, Humans, Videotape Recording
8.
Conf Proc IEEE Eng Med Biol Soc ; 2005: 2817-20, 2005.
Article in English | MEDLINE | ID: mdl-17282828

ABSTRACT

DNA sequences are generally very long chains of sequentially linked nucleotides. There are four different nucleotides, and combinations of these make up the sequence information in the files held by data sources. When a user searches for any sequence of an organism, a compressed sequence file can be sent from the data source to the user and decompressed at the client end, reducing transmission time over the Internet. This paper proposes a compression algorithm that provides a moderately high compression rate with minimal decompression time. We also compare a number of different compression techniques for achieving efficient delivery from an intelligent genomic search agent over the Internet.
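A useful baseline when comparing DNA compression techniques, and one that also has near-zero decompression cost, is packing each of the four nucleotides into 2 bits (a 4x reduction over one byte per base). This sketch is only that baseline for A/C/G/T-only sequences, not the algorithm proposed in the paper.

```python
# 2-bit code per nucleotide and its inverse
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASE = {v: k for k, v in CODE.items()}

def compress(seq):
    bits = 0
    for ch in seq:
        bits = (bits << 2) | CODE[ch]
    # Prepend a sentinel 1-bit so leading 'A's (00) survive the int round-trip
    bits |= 1 << (2 * len(seq))
    return bits.to_bytes((2 * len(seq)) // 8 + 1, "big")

def decompress(data, n):
    """Recover an n-base sequence; n must be transmitted alongside the data."""
    bits = int.from_bytes(data, "big")
    return "".join(BASE[(bits >> (2 * (n - 1 - i))) & 0b11] for i in range(n))

seq = "ACGTTGCA"
packed = compress(seq)
print(decompress(packed, len(seq)) == seq)   # True: lossless round trip
print(len(packed), "bytes vs", len(seq))     # 3 bytes vs 8
```

Decompression here is a simple bit-shift loop, illustrating the trade-off the abstract targets: higher-ratio schemes (e.g., dictionary or statistical coders) pay for their ratio with more client-side decompression work.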
