1.
Bioengineering (Basel) ; 11(2)2024 Feb 11.
Article in English | MEDLINE | ID: mdl-38391661

ABSTRACT

The objective of this study was to evaluate the effectiveness of machine learning classification techniques applied to nerve conduction studies (NCS) of motor and sensory signals for the automatic diagnosis of carpal tunnel syndrome (CTS). Two methodologies were tested. In the first methodology, motor signals recorded from the patients' median nerve were transformed into time-frequency spectrograms using the short-time Fourier transform (STFT). These spectrograms were then used as input to a deep two-dimensional convolutional neural network (CONV2D) for classification into two categories: patients and controls. In the second methodology, sensory signals from the patients' median and ulnar nerves were subjected to multilevel wavelet decomposition (MWD), and statistical and non-statistical features were extracted from the decomposed signals. These features were used to train and test classifiers. The classification target was set to three categories: normal subjects (controls), patients with mild CTS, and patients with moderate to severe CTS, based on conventional electrodiagnosis results. The classification analysis demonstrated that both methodologies surpassed previous attempts at automatic CTS diagnosis. The classification models using the motor signals transformed into time-frequency spectrograms exhibited excellent performance, with an average accuracy of 94%. Similarly, the classifiers based on the sensory signals and the features extracted from multilevel wavelet decomposition distinguished between controls, patients with mild CTS, and patients with moderate to severe CTS with an accuracy of 97.1%. The findings highlight the efficacy of incorporating machine learning algorithms into the diagnostic process of NCS, providing a valuable tool for clinicians in the diagnosis and management of neuropathies such as CTS.
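As a hedged illustration of the first methodology, the STFT step that turns a 1-D recording into a spectrogram "image" suitable for a CONV2D classifier can be sketched as follows. The sampling rate and window parameters here are illustrative assumptions, not the values used in the paper:

```python
import numpy as np
from scipy.signal import stft

def signal_to_spectrogram(x, fs=10000.0, nperseg=256, noverlap=192):
    """Turn a 1-D signal into a normalized log-magnitude time-frequency image."""
    f, t, Z = stft(x, fs=fs, nperseg=nperseg, noverlap=noverlap)
    S = 20 * np.log10(np.abs(Z) + 1e-12)             # dB scale; epsilon avoids log(0)
    S = (S - S.min()) / (S.max() - S.min() + 1e-12)  # scale to [0, 1] for the CNN
    return S  # shape: (frequency bins, time frames)

rng = np.random.default_rng(0)
img = signal_to_spectrogram(rng.standard_normal(4096))
```

Each recording yields one such image; a stack of them, with a channel axis added, is the kind of input a two-dimensional convolutional network consumes.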

2.
Bioengineering (Basel) ; 9(12)2022 Dec 14.
Article in English | MEDLINE | ID: mdl-36551006

ABSTRACT

Even though non-steroidal anti-inflammatory drugs are the most effective treatment for inflammatory conditions, they have been linked to negative side effects. A promising approach to mitigating potential risks is the development of new compounds able to combine anti-inflammatory with antioxidant activity, enhancing activity while reducing toxicity. The implication of reactive oxygen species in inflammatory conditions has been extensively studied, based on the pro-inflammatory properties of the generated free radicals. Drugs with dual activity (i.e., inhibiting inflammation-related enzymes such as LOX-3 and scavenging free radicals such as DPPH) could find various therapeutic applications, for example in cardiovascular or neurodegenerative disorders. The challenge we embarked on using deep learning was the creation of appropriate classification and regression models to discriminate pharmacological activity and selectivity, as well as to discover future compounds with dual activity prior to synthesis. An accurate filter algorithm was established, based on knowledge from compounds already evaluated in vitro, that can separate compounds with low, moderate, or high activity. In this study, we constructed a customized, highly effective one-dimensional convolutional neural network (CONV1D), with accuracy scores up to 95.2%, able to identify dual-active compounds, i.e., LOX-3 inhibitors and DPPH scavengers, as an indication of simultaneous anti-inflammatory and antioxidant activity. Additionally, we created a highly accurate regression model that predicted the exact effectiveness values of a set of recently synthesized compounds with anti-inflammatory activity, scoring a root mean square error of 0.8. Finally, we were able to observe how those newly synthesized compounds differentiate from each other with respect to a specific pharmacological target, using deep learning algorithms.

3.
Bioengineering (Basel) ; 8(11)2021 Nov 10.
Article in English | MEDLINE | ID: mdl-34821747

ABSTRACT

Recent literature has revealed a long discussion about the importance and necessity of nerve conduction studies in carpal tunnel syndrome management. The purpose of this study was to investigate the possibility of automatically detecting median nerve mononeuropathy, and of supporting decision making about carpal tunnel syndrome, based on the common electrodiagnostic criteria used in everyday clinical practice as well as on new features selected on physiological and mathematical grounds. The study included 38 volunteers, examined prospectively. Machine learning techniques were used to combine the examined characteristics for a stable and accurate diagnosis. Automatic electrodiagnosis reached an accuracy of 95% compared to the standard neurophysiological diagnosis of the physicians with nerve conduction studies, and 89% compared to the clinical diagnosis. The results show that the automatic detection of carpal tunnel syndrome is possible and can be employed in decision making, excluding human error. It is also shown that the novel features investigated can be used for the detection of the syndrome, complementary to the commonly used ones, increasing the accuracy of the method.

4.
Entropy (Basel) ; 23(6)2021 Jun 16.
Article in English | MEDLINE | ID: mdl-34208771

ABSTRACT

Aims: Bubble entropy (bEn) is an entropy metric with a limited dependence on parameters. bEn does not directly quantify the conditional entropy of the series; rather, it assesses the change in entropy of the ordering of portions of its samples of length m when adding an extra element. The analytical formulation of bEn for autoregressive (AR) processes shows that, for this class of processes, the relation between the first autocorrelation coefficient and bEn changes for odd and even values of m. While this is not an issue per se, it triggered ideas for further investigation. Methods: Using theoretical considerations on the expected values for AR processes, we examined a two-steps-ahead estimator of bEn, which considered the cost of ordering two additional samples. We first compared it with the original bEn estimator on a simulated series. Then, we tested it on real heart rate variability (HRV) data. Results: The experiments showed that both examined alternatives exhibited comparable discriminating power. However, for values of 10

5.
Stud Health Technol Inform ; 273: 255-257, 2020 Sep 04.
Article in English | MEDLINE | ID: mdl-33087622

ABSTRACT

Smart devices, including the popular smart watches, often collect information on the heart beat rhythm and transmit it to a central server for storage or further processing. The battery life of the mobile device or smart watch is a factor that significantly limits the amount of data collected, transmitted, and finally processed. Some devices choose to transmit the mean heart rate over relatively long periods of time to save power. Heart Rate Variability (HRV) analysis gives useful information about the human heart by examining only the heart rate time series. Its discriminating capability is affected by the amount of information available to process; ideally, the whole RR interval time series should be used. We investigate here how this discriminating capability is affected when the analysis is based on mean heart rate values transmitted over relatively long time periods. We show that useful information can still be obtained, and that the discriminating power remains remarkable, even when the amount of available data is relatively small.
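The kind of coarse, battery-friendly data stream this paper analyzes can be simulated by collapsing an RR-interval series into per-window mean heart rates. A minimal sketch, in which the 60 s window length is an illustrative assumption:

```python
import numpy as np

def windowed_mean_hr(rr_ms, window_s=60.0):
    """Collapse an RR-interval series (ms) into mean heart rate (bpm) per window,
    mimicking a device that only transmits averaged values to save battery."""
    t = np.cumsum(rr_ms) / 1000.0            # beat times in seconds
    hr = 60000.0 / np.asarray(rr_ms)         # instantaneous HR in bpm
    n_win = int(t[-1] // window_s) + 1
    idx = np.minimum((t // window_s).astype(int), n_win - 1)
    sums = np.bincount(idx, weights=hr, minlength=n_win)
    counts = np.bincount(idx, minlength=n_win)
    return sums / np.maximum(counts, 1)      # guard against empty windows

rr = np.full(600, 800.0)   # 600 beats at a constant 800 ms interval (75 bpm)
means = windowed_mean_hr(rr)
```

HRV analysis would then operate on `means` rather than on the full RR series, which is exactly the reduced-information setting the abstract investigates.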


Subjects
Heart, Heart Rate, Humans
6.
Adv Exp Med Biol ; 1194: 457, 2020.
Article in English | MEDLINE | ID: mdl-32468562

ABSTRACT

Cancer research has yielded tremendous gains over the last two decades, with remarkable results addressing this major worldwide public health problem. Continuous technological developments and persistent research have led to significant progress in targeted therapies. This paper focuses on mathematical models that best describe the development of malignant tumours induced in experimental animals of a particular species following chemical carcinogenesis with a complete carcinogen known as 3,4-benzopyrene. The purpose of this work is to study the phenomenon of chemical carcinogenesis and the inhibition and growth of malignant tumours.


Subjects
Carcinogenesis, Computer Simulation, Models, Biological, Neoplasms, Animals, Carcinogenesis/pathology, Carcinogens, Disease Models, Animal, Neoplasms/chemically induced, Neoplasms/physiopathology, Neoplasms/prevention & control
7.
Entropy (Basel) ; 20(1)2018 Jan 13.
Article in English | MEDLINE | ID: mdl-33265148

ABSTRACT

Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regularity/complexity of a time series. On the other hand, it is a computationally expensive method which may require a large amount of time when used with long series or with a large number of signals. The computationally intensive part is the similarity check between points in m-dimensional space. In this paper, we propose new algorithms, or extend already proposed ones, aiming to compute Sample Entropy quickly. All algorithms return exactly the same value for Sample Entropy, and no approximation techniques are used. We compare and evaluate them using cardiac inter-beat (RR) time series. We investigate three algorithms. The first one is an extension of the k-d trees algorithm, customized for Sample Entropy. The second one is an extension of an algorithm initially proposed for Approximate Entropy, again customized for Sample Entropy, but also improved to deliver even faster results. The last one is a completely new algorithm, presenting the fastest execution times for specific values of m, r, time series length, and signal characteristics. These algorithms are compared with the straightforward implementation, directly resulting from the definition of Sample Entropy, in order to give a clear image of the speedups achieved. All algorithms assume the classical approach to the metric, in which the maximum norm is used. The key idea of the last two suggested algorithms is to avoid unnecessary comparisons by detecting them early. We use the term unnecessary to refer to those comparisons for which we know a priori that they will fail the similarity check. The number of avoided comparisons is proved to be very large, resulting in a correspondingly large reduction of execution time, making them the fastest algorithms available today for the computation of Sample Entropy.
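For reference, the "straightforward implementation, directly resulting from the definition" that those fast algorithms are benchmarked against looks roughly like this. The defaults m = 2 and r = 0.2·SD are the common conventions, assumed here rather than taken from the paper:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Direct SampEn from the definition (max-norm similarity, no self-matches).
    O(N^2) in the series length -- the baseline the fast algorithms speed up."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    N = len(x)

    def count_matches(mm):
        # Use N - m templates for both lengths so the counts are comparable
        emb = np.array([x[i:i + mm] for i in range(N - m)])
        c = 0
        for i in range(len(emb)):
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)  # max-norm distances
            c += np.sum(d <= r)
        return c

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

se_flat = sample_entropy(np.ones(60))                                    # fully regular
se_noise = sample_entropy(np.random.default_rng(1).standard_normal(300)) # irregular
```

A perfectly regular series yields SampEn 0, while white noise yields a clearly positive value; the papers' fast variants return these exact same numbers, only faster.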

8.
IEEE Trans Biomed Eng ; 64(11): 2711-2718, 2017 11.
Article in English | MEDLINE | ID: mdl-28182552

ABSTRACT

Objective: A critical point in any definition of entropy is the selection of the parameters employed to obtain an estimate in practice. We propose a new definition of entropy aiming to reduce the significance of this selection. Methods: We call the new definition Bubble Entropy. Bubble Entropy is based on permutation entropy, where the vectors in the embedding space are ranked. We use the bubble sort algorithm for the ordering procedure and instead count the number of swaps performed for each vector. Doing so, we create a more coarse-grained distribution and then compute the entropy of this distribution. Results: Experimental results with both real and synthetic HRV signals showed that bubble entropy presents remarkable stability and exhibits increased descriptive and discriminating power compared to all other definitions, including the most popular ones. Conclusion: The definition proposed is almost free of parameters. The most common ones are the scale factor r and the embedding dimension m. In our definition, the scale factor is totally eliminated and the importance of m is significantly reduced. The proposed method presents increased stability and discriminating power. Significance: After the extensive use of some entropy measures in physiological signals, typical values for their parameters have been suggested, or at least widely used. However, the parameters are still there, application- and dataset-dependent, influencing the computed value and affecting the descriptive power. Reducing their significance or eliminating them alleviates the problem, decoupling the method from the data and the application, and eliminating subjective factors.
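A compact sketch of the construction described above — swap counts from bubble sort, then an entropy of their distribution at embedding sizes m and m+1, with a normalization — might look like the following. Treat it as an illustration of the idea (the Rényi-2 entropy and normalization term are assumptions of this sketch), not a reference implementation:

```python
import numpy as np

def swap_counts(x, m):
    """Bubble-sort swap count for each embedded vector of length m."""
    counts = []
    for i in range(len(x) - m + 1):
        v = list(x[i:i + m])
        swaps = 0
        for j in range(m - 1):              # plain bubble sort, counting swaps
            for k in range(m - 1 - j):
                if v[k] > v[k + 1]:
                    v[k], v[k + 1] = v[k + 1], v[k]
                    swaps += 1
        counts.append(swaps)
    return np.array(counts)

def bubble_entropy(x, m=10):
    """Entropy of the swap-count distribution at sizes m and m+1, normalized."""
    def renyi2(c):
        p = np.bincount(c) / len(c)          # distribution of swap counts
        return -np.log(np.sum(p ** 2))       # second-order (Renyi-2) entropy
    return (renyi2(swap_counts(x, m + 1)) - renyi2(swap_counts(x, m))) / np.log((m + 1) / (m - 1))

ben_trend = bubble_entropy(np.arange(100.0))  # monotone series: zero swaps everywhere
ben_noise = bubble_entropy(np.random.default_rng(2).standard_normal(400), m=6)
```

Note how r never appears: ranking replaces the amplitude-threshold similarity check, which is the sense in which the scale factor is "totally eliminated".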


Subjects
Computer Simulation, Entropy, Signal Processing, Computer-Assisted, Adult, Aged, Algorithms, Electrocardiography, Heart Failure/physiopathology, Heart Rate/physiology, Humans, Middle Aged
9.
Ann Noninvasive Electrocardiol ; 21(5): 508-18, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27038287

ABSTRACT

BACKGROUND: Deceleration capacity (DC) of heart rate has proved to be an independent mortality predictor in post-myocardial infarction patients. The original method (DCorig) may produce negative values (9% in our analyzed sample). We aimed to improve the method and to investigate whether DC also predicts arrhythmic mortality. METHODS: Time series from 221 heart failure patients were analyzed with DCorig and a new variant, DCsgn, in which decelerations are characterized based on windows of four consecutive beats rather than on anchors. After 41.2 months, 69 patients experienced sudden cardiac death (SCD) surrogate end points, while 61 died. RESULTS: (SCD+ vs. SCD- group) DCorig: 3.7 ± 1.6 ms versus 4.6 ± 2.6 ms (P = 0.020) and DCsgn: 4.9 ± 1.7 ms versus 6.1 ± 2.2 ms (P < 0.001). After Cox regression (gender, age, left ventricular ejection fraction, filtered QRS, NSVT≥1/24h, VPBs≥240/24h, mean 24-h QTc, and each DC index added to the model separately), DCsgn (continuous) was an independent SCD predictor (hazard ratio [H.R.]: 0.742, 95% confidence interval [C.I.]: 0.631-0.871, P < 0.001). DCsgn ≤ 5.373 (dichotomous) presented an H.R. of 1.815 for SCD (95% C.I.: 1.080-3.049, P = 0.024); areas under the receiver operating characteristic (ROC) curves (AUC): 0.62 (DCorig) and 0.66 (DCsgn), P = 0.190 (chi-square). Results for the deceased versus alive group: DCorig: 3.2 ± 2.0 ms versus 4.8 ± 2.4 ms (P < 0.001) and DCsgn: 4.6 ± 1.4 ms versus 6.2 ± 2.2 ms (P < 0.001). In Cox regression, DCsgn (continuous) presented an H.R. of 0.686 (95% C.I.: 0.546-0.862, P = 0.001) and DCsgn ≤ 5.373 (dichotomous) presented an H.R. of 2.443 for total mortality (TM) (95% C.I.: 1.269-4.703, P = 0.008). AUC/ROC: 0.71 (DCorig) and 0.73 (DCsgn), P = 0.402. CONCLUSIONS: DC predicts both SCD and TM. DCsgn avoids the negative values, improving the method, although not to a statistically significant degree.


Subjects
Arrhythmias, Cardiac/mortality, Arrhythmias, Cardiac/physiopathology, Heart Failure/mortality, Heart Failure/physiopathology, Heart Rate Determination/methods, Heart Rate/physiology, Aged, Death, Sudden, Cardiac, Deceleration, Echocardiography, Electrocardiography, Female, Humans, Male, Middle Aged, Prospective Studies
10.
Article in English | MEDLINE | ID: mdl-26737009

ABSTRACT

R-R interval signals that come from different subjects are typically aligned and averaged according to the horological starting time of the recordings. We argue that the horological time is a faulty alignment criterion and provide evidence in the form of a new alignment method. Our main motivation is that the human heart rate (HR) rhythm follows a circadian cycle, whose pattern can vary among different classes of people. We propose two novel alignment algorithms that consider the HR circadian rhythm, the Puzzle Piece Alignment Algorithm (PPA) and the Event Based Alignment Algorithm (EBA). First, we convert the R-R interval signal into a series of time windows and compute the mean HR per window. Then our algorithms search for matching circadian patterns to align the signals. We conduct experiments using R-R interval signals extracted from two databases in the Physionet Data Bank. Both algorithms are able to align the signals with respect to the circadian rhythmicity of HR. Furthermore, our findings confirm the presence of more than one pattern in the circadian HR rhythm. We suggest an automatic classification of signals according to the three most prominent patterns.
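The pattern-based alignment idea can be illustrated — this is not the authors' PPA or EBA algorithms themselves — by aligning two per-window mean-HR profiles with the circular shift that maximizes their correlation. The 1 h windows and sinusoidal profile are assumptions of this demo:

```python
import numpy as np

def best_circular_shift(profile_a, profile_b):
    """Circular shift of profile_b (in windows) that best matches profile_a."""
    a = profile_a - profile_a.mean()
    b = profile_b - profile_b.mean()
    corrs = [np.dot(a, np.roll(b, s)) for s in range(len(a))]
    return int(np.argmax(corrs))

hours = np.arange(24)
base = 60 + 10 * np.sin(2 * np.pi * hours / 24)  # synthetic circadian mean-HR profile
early = np.roll(base, -5)                        # same pattern occurring 5 h earlier
shift = best_circular_shift(base, early)
```

Aligning on the recovered shift, rather than on the horological start time, is what lets recordings with the same circadian pattern but different start times be averaged coherently.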


Subjects
Circadian Rhythm/physiology, Heart Rate/physiology, Adult, Aged, Algorithms, Automation, Data Interpretation, Statistical, Databases, Factual, Female, Humans, Male, Middle Aged, Pattern Recognition, Automated, Signal Processing, Computer-Assisted
13.
J Trauma Acute Care Surg ; 73(3): 625-8, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22929493

ABSTRACT

BACKGROUND: Venous thromboembolism (VTE) is a significant risk in trauma patients. Although low-molecular weight heparin (LMWH) is effective in VTE prophylaxis, its use for patients with traumatic intracranial hemorrhage remains controversial. The purpose of this study was to evaluate the safety of LMWH for VTE prophylaxis in blunt intracranial injury. METHODS: We conducted a retrospective multicenter study of LMWH chemoprophylaxis on patients with intracranial hemorrhage caused by blunt trauma. Patients with brain Abbreviated Injury Scale score of 3 or higher, age 18 years or older, and at least one repeated head computed tomographic scan were included. Patients with previous VTE; on preinjury anticoagulation; hospitalized for less than 48 hours; on heparin for VTE prophylaxis; or required emergent thoracic, abdominal, or vascular surgery at admission were excluded. Patients were divided into two groups: those who received LMWH and those who did not. The primary outcome was progression of intracranial hemorrhage on repeated head computed tomographic scan. RESULTS: The study included 1,215 patients, of which 220 patients (18.1%) received LMWH and 995 (81.9%) did not. Hemorrhage progression occurred in 239 of 995 control subjects and 93 of 220 LMWH patients (24% vs. 42%, p < 0.001). Hemorrhage progression occurred in 32 patients after initiating LMWH (14.5%). Nine of these patients (4.1%) required neurosurgical intervention for hemorrhage progression. CONCLUSION: Patients receiving LMWH were at higher risk for hemorrhage progression. We were unable to demonstrate safety of LMWH for VTE prophylaxis in patients with brain injury. The risk of using LMWH may exceed its benefit. LEVEL OF EVIDENCE: Therapeutic study, level IV.


Subjects
Brain Injuries/complications, Hemorrhage/chemically induced, Heparin, Low-Molecular-Weight/administration & dosage, Venous Thromboembolism/prevention & control, Adult, Aged, Anticoagulants/administration & dosage, Anticoagulants/adverse effects, Brain Injuries/diagnosis, Brain Injuries/therapy, Case-Control Studies, Female, Follow-Up Studies, Hemorrhage/epidemiology, Heparin, Low-Molecular-Weight/adverse effects, Hospital Mortality, Humans, Incidence, Injury Severity Score, Male, Middle Aged, Primary Prevention/methods, Reference Values, Retrospective Studies, Risk Assessment, Safety Management, Societies, Medical, Survival Analysis, Trauma Centers, Treatment Outcome, Venous Thromboembolism/epidemiology, Venous Thromboembolism/etiology, Wounds, Nonpenetrating/complications, Wounds, Nonpenetrating/diagnosis, Wounds, Nonpenetrating/therapy
14.
Injury ; 43(9): 1355-61, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22560130

ABSTRACT

Despite the establishment of evidence-based guidelines for the resuscitation of critically injured patients who have sustained cardiopulmonary arrest, rapid decisions regarding patient salvageability in these situations remain difficult even for experienced physicians. Regardless, survival is limited after traumatic cardiopulmonary arrest. One applicable, well-described resuscitative technique is the emergency department thoracotomy-a procedure that, when applied correctly, is effective in saving small but significant numbers of critically injured patients. By understanding the indications, technical details, and predictors of survival along with the inherent risks and costs of emergency department thoracotomy, the physician is better equipped to make rapid futile versus salvageable decisions for this most severely injured subset of patients.


Subjects
Cardiopulmonary Resuscitation/statistics & numerical data, Emergency Medical Services, Heart Arrest/surgery, Thoracotomy/statistics & numerical data, Wounds, Nonpenetrating/surgery, Wounds, Penetrating/surgery, Adolescent, Adult, Cardiopulmonary Resuscitation/methods, Child, Child, Preschool, Female, Heart Arrest/etiology, Humans, Infant, Male, Practice Guidelines as Topic, Thoracotomy/methods, United States/epidemiology, Wounds, Nonpenetrating/complications, Wounds, Penetrating/complications, Young Adult
16.
IEEE Trans Inf Technol Biomed ; 16(4): 615-22, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22106154

ABSTRACT

The accurate diagnosis of diseases with high prevalence rates, such as Alzheimer's disease, Parkinson's disease, diabetes, breast cancer, and heart disease, is one of the most important biomedical problems, and its effective management is imperative. In this paper, we present a new method for the automated diagnosis of diseases based on an improvement of the random forests classification algorithm. More specifically, the dynamic determination of the optimum number of base classifiers composing the random forest is addressed. The proposed method differs from most of the methods reported in the literature, which follow an overproduce-and-choose strategy, where the members of the ensemble are selected from a pool of classifiers known a priori. In our case, the number of classifiers is determined during the growing procedure of the forest. Additionally, the proposed method produces an ensemble that is not only accurate but also diverse, ensuring the two important properties that should characterize an ensemble classifier. The method is based on an online fitting procedure and is evaluated using eight biomedical datasets and five versions of the random forests algorithm (40 cases). The method decided the number of trees correctly in 90% of the test cases.


Subjects
Algorithms, Decision Trees, Diagnosis, Computer-Assisted, Disease/classification, Databases, Factual, Humans
17.
J Biomed Inform ; 43(2): 307-20, 2010 Apr.
Article in English | MEDLINE | ID: mdl-19883796

ABSTRACT

The aim of this work is to present an automated method that assists in the diagnosis of Alzheimer's disease and also supports monitoring of the progression of the disease. The method is based on features extracted from data acquired during an fMRI experiment. It consists of six stages: (a) preprocessing of fMRI data, (b) modeling of fMRI voxel time series using a Generalized Linear Model, (c) feature extraction from the fMRI data, (d) feature selection, (e) classification using classical and improved variations of the Random Forests algorithm and Support Vector Machines, and (f) conversion of the trees of the Random Forest to rules which have physical meaning. The method is evaluated using a dataset of 41 subjects. The results indicate the validity of the method in the diagnosis (accuracy 94%) and monitoring of Alzheimer's disease (accuracy 97% and 99%).


Subjects
Alzheimer Disease/diagnosis, Decision Trees, Linear Models, Magnetic Resonance Imaging/methods, Adolescent, Aged, Aged, 80 and over, Algorithms, Alzheimer Disease/classification, Databases, Factual, Disease Progression, Female, Humans, Male, Reproducibility of Results, Young Adult
18.
IEEE Trans Inf Technol Biomed ; 13(4): 512-8, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19273030

ABSTRACT

In this study, heartbeat time series are classified using support vector machines (SVMs). Statistical methods and signal analysis techniques are used to extract features from the signals. The SVM classifier compares favorably to other neural network-based classification approaches under leave-one-out cross validation. The performance of the SVM with respect to other state-of-the-art classifiers is also confirmed by the classification of signals presenting a very low signal-to-noise ratio. Finally, the influence of the number of features on the classification rate was investigated for two real datasets. The first dataset consists of long-term ECG recordings of young and elderly healthy subjects. The second dataset consists of long-term ECG recordings of normal subjects and subjects suffering from coronary artery disease.


Subjects
Artificial Intelligence, Heart Rate/physiology, Signal Processing, Computer-Assisted, Adult, Aged, Aged, 80 and over, Algorithms, Coronary Artery Disease/physiopathology, Electrocardiography, Humans, Male, Models, Statistical
19.
Comput Methods Programs Biomed ; 91(1): 48-54, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18423927

ABSTRACT

The approximate entropy (ApEn) is a measure of system complexity. The implementation of the method is computationally expensive and requires execution time proportional to the square of the size of the input signal. We propose here a fast algorithm which speeds up the computation of approximate entropy by detecting early those vectors that are not similar and excluding them from the similarity test. Experimental analysis with various biomedical signals revealed a significant improvement in execution times.
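A hedged sketch of the early-rejection principle: under the max norm, a pair of template vectors is non-similar as soon as any single coordinate differs by more than r, so the comparison can stop at the first failing coordinate instead of computing a full distance. This illustrates the principle, not the paper's exact algorithm:

```python
import numpy as np

def apen_similar_fraction(x, m, r):
    """Fraction of template pairs similar under the max norm, aborting each
    comparison at the first coordinate whose difference already exceeds r."""
    N = len(x)
    n_templates = N - m + 1
    counts = np.zeros(n_templates)
    for i in range(n_templates):
        for j in range(n_templates):
            for k in range(m):                      # bail out on first failing coordinate
                if abs(x[i + k] - x[j + k]) > r:
                    break
            else:                                   # no coordinate failed: similar pair
                counts[i] += 1
    return counts / n_templates

def approximate_entropy(x, m=2, r_factor=0.2):
    """Direct ApEn (self-matches included, as in the classical definition)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    phi = lambda mm: np.mean(np.log(apen_similar_fraction(x, mm, r)))
    return phi(m) - phi(m + 1)

apen_flat = approximate_entropy(np.ones(60))                                    # regular
apen_noise = approximate_entropy(np.random.default_rng(3).standard_normal(150)) # irregular
```

The early `break` is what the paper generalizes: most candidate pairs are rejected after examining only a few coordinates, which is where the measured speedup comes from.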


Subjects
Algorithms, Diagnosis, Computer-Assisted/methods, Entropy, Models, Biological, Models, Statistical, Signal Processing, Computer-Assisted, Computer Simulation, Data Interpretation, Statistical, Time Factors
20.
Comput Biol Med ; 37(5): 642-54, 2007 May.
Article in English | MEDLINE | ID: mdl-16904097

ABSTRACT

The goal of this paper is to examine the classification capabilities of various prediction and approximation methods and to suggest which are most likely to be suitable for the clinical setting. Various prediction and approximation methods are applied in order to detect and extract those which provide the best differentiation between control and patient data, as well as between members of different age groups. The prediction methods are local linear prediction, local exponential prediction, the delay times method, autoregressive prediction, and neural networks. Approximation is computed with local linear approximation, least squares approximation, neural networks, and the wavelet transform. These methods are chosen because each has a different physical basis and thus extracts and uses time series information in a different way.


Subjects
Heart Rate/physiology, Adult, Age Factors, Aged, Coronary Disease/physiopathology, Electrocardiography/classification, Electrocardiography/statistics & numerical data, Electrocardiography/trends, Electrocardiography, Ambulatory/classification, Electrocardiography, Ambulatory/statistics & numerical data, Electrocardiography, Ambulatory/trends, Female, Forecasting, Fourier Analysis, Heart Failure/physiopathology, Humans, Least-Squares Analysis, Linear Models, Male, Middle Aged, Neural Networks, Computer, Time Factors