Results 1 - 8 of 8
1.
J Healthc Eng; 2017: 9580385, 2017.
Article in English | MEDLINE | ID: mdl-29065671

ABSTRACT

Noninvasive medical procedures are usually preferable to their invasive counterparts. Screening for anemia through the palpebral conjunctiva is a convenient noninvasive procedure, and it can be automated to reduce medical cost. We propose an anemia-screening approach that uses a Kalman filter (KF) and a regression method. The traditional KF is typically applied to time-dependent data; here, we modify it for the time-independent data common in medical applications. We compute the mean value of the red component of the palpebral conjunctiva image as the recognition feature and use a penalized regression algorithm to find a nonlinear curve that best fits the feature values against the corresponding hemoglobin (Hb) concentrations. To evaluate the proposed approach and several related approaches, we propose a risk evaluation scheme in which the Hb spectrum is divided into high-risk, low-risk, and doubtful intervals for anemia. The doubtful interval contains the Hb threshold, say 11 g/dL, that separates anemia from nonanemia, and a suspect sample is one that falls in this interval. For screening purposes, we would like as few suspect samples as possible. The experimental results show that the modified KF significantly reduces the number of suspect samples for all the approaches considered here.
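As a rough illustration of the two components this abstract describes, the sketch below combines a Kalman-style recursive estimator for a time-independent (static) quantity with a ridge-penalized polynomial regression that maps the red-channel feature to Hb concentration. All function names, the polynomial degree, and the noise and penalty parameters are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def static_kalman(measurements, process_var=1e-5, meas_var=1.0):
    """Recursively fuse repeated noisy measurements of a static quantity."""
    x, p = measurements[0], 1.0          # initial state estimate and its variance
    for z in measurements[1:]:
        p = p + process_var              # predict (the state is assumed constant)
        k = p / (p + meas_var)           # Kalman gain
        x = x + k * (z - x)              # update with the new measurement
        p = (1.0 - k) * p
    return x

def fit_penalized_poly(red_means, hb_values, degree=3, lam=0.1):
    """Ridge-penalized polynomial fit of Hb against the red-channel feature."""
    X = np.vander(np.asarray(red_means, dtype=float), degree + 1)   # design matrix
    I = np.eye(X.shape[1])
    w = np.linalg.solve(X.T @ X + lam * I, X.T @ np.asarray(hb_values, dtype=float))
    return w                              # coefficients, highest degree first

def predict_hb(red_mean, w):
    """Estimate Hb concentration from a (filtered) red-channel feature value."""
    return float(np.polyval(w, red_mean))
```

In use, repeated feature measurements of one subject would be fused with static_kalman before being passed to predict_hb with coefficients learned by fit_penalized_poly.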


Subjects
Anemia/diagnosis; Conjunctiva/diagnostic imaging; Eye; Algorithms; Hemoglobins/analysis; Humans
2.
Comput Methods Programs Biomed; 137: 125-135, 2016 Dec.
Article in English | MEDLINE | ID: mdl-28110719

ABSTRACT

Examining the hemoglobin level of blood is an important way to diagnose anemia, but it requires drawing blood and running a blood test. Examining the color distribution of the palpebral conjunctiva is a standard procedure for diagnosing anemia that requires no blood test. However, since color perception is not always consistent across examiners, we attempt to imitate the physical examination of the palpebral conjunctiva so that computers can identify anemic patients automatically and consistently in a screening process. In this paper we propose two algorithms for anemia diagnosis: the first is intended to be simple and fast, while the second is more sophisticated and robust, providing options for different applications. The first algorithm is a simple two-stage classifier. In the first stage, we use a thresholding decision technique based on a feature called the high hue rate (HHR), extracted from the HSI color space. In the second stage, a feature called the pixel value in the middle (PVM), extracted from the RGB color space, is proposed, followed by a minimum-distance classifier based on the Mahalanobis distance. In the second algorithm, we consider 18 candidate features, including a newly added entropy feature, several improved features from the first algorithm, and 13 features proposed in a previous work. We use correlation and simple statistics to select three relatively independent features (entropy, binarized HHR, and the PVM of the G component) for classification with a support vector machine or an artificial neural network. Finally, we evaluate the classification performance of the proposed algorithms in terms of sensitivity, specificity, and Kappa value. The experimental results show relatively good performance and demonstrate the feasibility of our approach, which may encourage further follow-up studies.
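The following sketch illustrates only the structure of the first (two-stage) algorithm: an HHR-style hue-rate threshold followed by a Mahalanobis minimum-distance classifier on a PVM-style feature. The hue band, the use of the per-channel median as PVM, the decision direction, and all thresholds are assumptions, not the paper's exact definitions.

```python
import colorsys
import numpy as np

def high_hue_rate(rgb_pixels, hue_lo=0.9, hue_hi=1.0):
    """Fraction of conjunctiva pixels whose hue falls in an assumed 'high hue' band.
    rgb_pixels: array of shape (N, 3) with 8-bit RGB values."""
    hues = np.array([colorsys.rgb_to_hsv(*p)[0] for p in rgb_pixels / 255.0])
    return float(np.mean((hues >= hue_lo) & (hues <= hue_hi)))

def pvm(rgb_pixels):
    """Assumed PVM feature: the per-channel median ('pixel value in the middle')."""
    return np.median(rgb_pixels, axis=0)

def mahalanobis(x, mean, inv_cov):
    d = x - mean
    return float(np.sqrt(d @ inv_cov @ d))

def classify(rgb_pixels, hhr_thresh, anemic_stats, normal_stats):
    """Stage 1: threshold on HHR; Stage 2: minimum Mahalanobis distance on PVM.
    anemic_stats / normal_stats are (mean, inverse covariance) pairs per class."""
    if high_hue_rate(rgb_pixels) < hhr_thresh:
        return "anemic"                       # assumed decision direction
    x = pvm(rgb_pixels)
    d_anemic = mahalanobis(x, *anemic_stats)
    d_normal = mahalanobis(x, *normal_stats)
    return "anemic" if d_anemic < d_normal else "non-anemic"
```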


Subjects
Anemia, Iron-Deficiency/diagnosis; Conjunctiva/pathology; Image Processing, Computer-Assisted; Humans; Neural Networks, Computer; Support Vector Machine
3.
IEEE Trans Inf Technol Biomed; 13(5): 818-21, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19447732

ABSTRACT

Hospitals and medical centers produce an enormous number of digital medical images every day, especially in the form of image sequences, which require considerable storage space. One solution is lossless compression. Among available methods, JPEG-LS has excellent coding performance; however, it compresses each picture independently with intracoding and does not exploit the interframe correlation between pictures. This paper therefore proposes a method that combines JPEG-LS with interframe coding using motion vectors to improve on the compression performance of JPEG-LS alone. Since the interframe correlation between two adjacent images in a medical image sequence is usually not as high as that in a general video sequence, the interframe coding is activated only when the interframe correlation is sufficiently high. On six capsule endoscope image sequences, the proposed method achieves average compression gains of 13.3% and 26.3% over JPEG-LS and JPEG2000 alone, respectively. Similarly, for an MRI image sequence, coding gains of 77.5% and 86.5% are obtained, respectively.
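A hedged sketch of the correlation-gated mode decision described above follows: a frame is coded in inter mode (JPEG-LS applied to the motion-compensated residual) only when it correlates strongly enough with the previous frame, and in intra mode otherwise. The JPEG-LS encoder is treated as an assumed external callable (jpegls_encode), frame dimensions are assumed to be multiples of the block size, and the block size, search range, and threshold are illustrative.

```python
import numpy as np

def frame_correlation(a, b):
    """Pearson correlation between two grayscale frames."""
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def motion_compensate(prev, cur, block=16, search=8):
    """Brute-force block matching; returns the motion-compensated prediction of cur."""
    h, w = cur.shape
    pred = np.zeros_like(cur)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            target = cur[y:y + block, x:x + block].astype(np.int32)
            best, best_err = None, None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = prev[yy:yy + block, xx:xx + block].astype(np.int32)
                        err = np.abs(target - cand).sum()
                        if best_err is None or err < best_err:
                            best, best_err = cand, err
            pred[y:y + block, x:x + block] = best
    return pred

def encode_frame(prev, cur, jpegls_encode, corr_thresh=0.9):
    """Inter coding (residual of the motion-compensated prediction) only when the
    interframe correlation is high enough; otherwise intra coding of the frame."""
    if prev is not None and frame_correlation(prev, cur) >= corr_thresh:
        residual = cur.astype(np.int16) - motion_compensate(prev, cur).astype(np.int16)
        return "inter", jpegls_encode(residual)
    return "intra", jpegls_encode(cur)
```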


Subjects
Capsule Endoscopy/methods; Data Compression/methods; Magnetic Resonance Imaging/methods; Databases, Factual; Diagnostic Imaging/methods; Humans
4.
IEEE Trans Biomed Eng; 52(3): 539-43, 2005 Mar.
Article in English | MEDLINE | ID: mdl-15759584

ABSTRACT

In a prior work, a wavelet-based vector quantization (VQ) approach was proposed for the lossy compression of electrocardiogram (ECG) signals. In this paper, we investigate and fix its coding inefficiency in lossless compression and extend it to support both lossy and lossless compression in a unified coding framework. The well-known 9/7 filters and 5/3 integer filters are used to implement the wavelet transform (WT) for lossy and lossless compression, respectively. The codebook updating mechanism, originally designed for lossy compression, is modified to support lossless compression as well. In addition, a new, cost-effective coding strategy is proposed to enhance the coding efficiency of set partitioning in hierarchical trees (SPIHT) for the less significant bits of a WT coefficient. ECG records from the MIT/BIH Arrhythmia and European ST-T databases are used as test data. In terms of lossless coding efficiency, experimental results show that the proposed codec improves on the direct SPIHT approach and on the prior work by about 33% and 26%, respectively.
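For the lossless path, the key ingredient named above is the reversible 5/3 integer wavelet transform. The sketch below shows one lifting level of the standard LeGall 5/3 scheme for a 1-D signal of length at least two, with a simple symmetric boundary rule; it is an illustration of that well-known transform, not the paper's full codec.

```python
def lift_53_forward(x):
    """One level of the LeGall 5/3 integer lifting transform (1-D, reversible)."""
    x = [int(v) for v in x]
    s, d = x[0::2], x[1::2]                      # even (approx.) and odd (detail) samples
    for i in range(len(d)):                      # predict: d[i] -= floor((s[i] + s[i+1]) / 2)
        right = s[i + 1] if i + 1 < len(s) else s[i]
        d[i] -= (s[i] + right) >> 1
    for i in range(len(s)):                      # update: s[i] += floor((d[i-1] + d[i] + 2) / 4)
        left = d[i - 1] if i > 0 else d[0]
        cur = d[i] if i < len(d) else d[-1]
        s[i] += (left + cur + 2) >> 2
    return s, d

def lift_53_inverse(s, d):
    """Exactly undo lift_53_forward, recovering the original integer samples."""
    s, d = list(s), list(d)
    for i in range(len(s)):                      # undo the update step
        left = d[i - 1] if i > 0 else d[0]
        cur = d[i] if i < len(d) else d[-1]
        s[i] -= (left + cur + 2) >> 2
    for i in range(len(d)):                      # undo the predict step
        right = s[i + 1] if i + 1 < len(s) else s[i]
        d[i] += (s[i] + right) >> 1
    out = []
    for i in range(len(s)):                      # re-interleave even and odd samples
        out.append(s[i])
        if i < len(d):
            out.append(d[i])
    return out
```

Because both steps are integer lifting operations, the inverse reproduces the original samples exactly, which is what makes the lossless branch possible.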


Subjects
Algorithms; Arrhythmias, Cardiac/diagnosis; Arrhythmias, Cardiac/physiopathology; Data Compression/methods; Electrocardiography/methods; Signal Processing, Computer-Assisted; Humans; Reproducibility of Results; Sensitivity and Specificity
5.
IEEE Trans Med Imaging; 23(11): 1417-29, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15554129

ABSTRACT

The enormous amount of data in volumetric medical images (VMI) creates transmission and storage problems that can be addressed with compression. For the lossy compression of a very long VMI sequence, automatically preserving the diagnostic features in the reconstructed images is essential. The proposed wavelet-based adaptive vector quantizer incorporates a distortion-constrained codebook replenishment (DCCR) mechanism to meet a user-defined quality demand expressed as a peak signal-to-noise ratio. Combining a codebook updating strategy with the well-known set partitioning in hierarchical trees (SPIHT) technique, the DCCR mechanism provides an excellent coding gain. Experimental results show that the proposed approach is superior to pure SPIHT and to JPEG2000 in terms of coding performance. We also propose an iterative fast-searching algorithm that finds the desired signal quality along an energy-quality curve instead of a traditional rate-distortion curve. The algorithm performs the quality control quickly, smoothly, and reliably.
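A minimal sketch of the distortion-constrained codebook replenishment idea follows: each input vector is coded by its nearest codevector unless the resulting distortion would violate the budget implied by the user-defined PSNR target, in which case the vector itself is transmitted and added to the codebook. The threshold conversion and the replacement policy are assumptions for illustration, not the paper's DCCR design.

```python
import numpy as np

def psnr_to_mse(psnr_db, peak=255.0):
    """Convert a target PSNR (dB) into the maximum allowed mean squared error."""
    return peak ** 2 / (10.0 ** (psnr_db / 10.0))

def encode_with_replenishment(vectors, codebook, target_psnr=40.0):
    """Vector-quantize 'vectors'; replenish the codebook whenever the per-vector
    MSE exceeds the distortion budget implied by the PSNR target."""
    max_mse = psnr_to_mse(target_psnr)
    codebook = [np.asarray(c, dtype=float) for c in codebook]
    stream = []                                    # (is_new_codevector, payload) pairs
    for v in map(np.asarray, vectors):
        dists = [float(np.mean((v - c) ** 2)) for c in codebook]
        best = int(np.argmin(dists))
        if dists[best] <= max_mse:
            stream.append((False, best))           # send only the codebook index
        else:
            codebook.append(v.astype(float))       # replenish: send the vector itself
            stream.append((True, v))
        # a real codec would also age or evict codevectors to bound the codebook size
    return stream, codebook
```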


Subjects
Algorithms; Data Compression/methods; Diagnostic Imaging/methods; Image Interpretation, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Medical Records Systems, Computerized; Quality Assurance, Health Care/methods; Head/anatomy & histology; Humans; Magnetic Resonance Imaging/methods; Quality Control; Reproducibility of Results; Sensitivity and Specificity; Signal Processing, Computer-Assisted
6.
IEEE Trans Biomed Eng; 49(7): 671-80, 2002 Jul.
Article in English | MEDLINE | ID: mdl-12083301

ABSTRACT

In this paper, we propose a novel vector quantizer (VQ) in the wavelet domain for the compression of electrocardiogram (ECG) signals. A vector called a tree vector (TV) is first formed in a novel structure, in which the wavelet transform (WT) coefficients of the vector are arranged in the order of a hierarchical tree. The TVs extracted from the various WT subbands are then collected into a single codebook. This is an advantage over traditional WT-VQ methods, where multiple codebooks are needed and are usually designed separately because the numerical ranges of coefficient values differ considerably across WT subbands. Finally, a distortion-constrained codebook replenishment mechanism, in which codevectors can be updated dynamically, is incorporated into the VQ to guarantee reliable quality of the reconstructed ECG waveforms. With the proposed approach, both the visual quality and the objective quality in terms of the percent root-mean-square difference (PRD) are excellent even at very low bit rates. For all 48 records of Lead II ECG data in the MIT/BIH database, an average PRD of 7.3% at 146 b/s is obtained. For the same test data, the proposed method outperforms many recently published methods, including the best of them, set partitioning in hierarchical trees.
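The sketch below illustrates only the tree-vector grouping step: in a 1-D dyadic wavelet decomposition, each coarse-band coefficient is collected together with its descendants in the finer subbands using the usual parent-child index relation. The subband layout, ordering, and depth are assumptions; codebook training and the replenishment mechanism are not shown.

```python
import numpy as np

def tree_vectors(subbands):
    """subbands: list [d_L, d_(L-1), ..., d_1] of detail bands ordered from the
    coarsest to the finest, where each band is twice as long as the previous one.
    Returns one tree vector per coarsest-band coefficient."""
    vectors = []
    coarsest = subbands[0]
    for i in range(len(coarsest)):
        tv, indices = [coarsest[i]], [i]
        for band in subbands[1:]:
            # each parent index j has children 2j and 2j+1 in the next finer band
            indices = [c for j in indices for c in (2 * j, 2 * j + 1)]
            tv.extend(band[k] for k in indices)
        vectors.append(np.array(tv))
    return vectors
```

For example, detail bands of lengths 2, 4, and 8 yield two tree vectors of seven coefficients each, which would then be quantized against one shared codebook.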


Subjects
Algorithms; Electrocardiography/methods; Electrocardiography/statistics & numerical data; Models, Cardiovascular; Signal Processing, Computer-Assisted; Computer Simulation; Data Interpretation, Statistical; Databases, Factual; Feasibility Studies; Humans; Sensitivity and Specificity
7.
IEEE Trans Inf Technol Biomed; 6(1): 46-53, 2002 Mar.
Article in English | MEDLINE | ID: mdl-11936596

ABSTRACT

A data-hiding technique called the "bipolar multiple-number base" was developed to provide authentication, integration, and confidentiality for an electronic patient record (EPR) transmitted among hospitals through the Internet. The proposed technique can hide EPR-related data, such as diagnostic reports, electrocardiograms, and digital signatures from doctors or a hospital, inside a mark image. The mark image could be the mark of a hospital, used to identify the origin of an EPR, and the digital signatures of the doctors and the hospital can be used to authenticate the EPR. Different types of medical data can thus be integrated into the same mark image. Confidentiality is achieved because the hidden EPR-related data and digital signatures can be decrypted only with an exact copy of the original mark image. The experimental results validate the integrity and the invisibility of the hidden EPR-related data, and the technique allows all of the hidden data to be separated and restored perfectly by authorized users.
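The abstract does not spell out the bipolar multiple-number base embedding itself, so the sketch below conveys only the general flavour under stated assumptions: each cover pixel carries one digit in its own base, embedded by the smallest positive or negative (bipolar) adjustment that makes the pixel value, modulo that base, equal the digit. It is an illustrative stand-in, not the authors' scheme.

```python
import numpy as np

def embed_digits(cover, digits, bases):
    """Hide digits[i] (0 <= digits[i] < bases[i]) in cover pixel i (8-bit grayscale)."""
    stego = cover.astype(np.int32).copy().ravel()
    for i, (d, b) in enumerate(zip(digits, bases)):
        r = int(stego[i]) % b
        delta = (d - r) % b                       # smallest non-negative adjustment
        if delta > b - delta:                     # the negative (bipolar) move is smaller
            delta -= b
        stego[i] = np.clip(stego[i] + delta, 0, 255)
        # clipping at 0 or 255 may corrupt the digit; a real scheme handles this case
    return stego.reshape(cover.shape).astype(np.uint8)

def extract_digits(stego, bases):
    """Recover the hidden digits from the stego image."""
    flat = stego.astype(np.int32).ravel()
    return [int(flat[i]) % b for i, b in enumerate(bases)]
```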


Subjects
Confidentiality; Medical Records Systems, Computerized; Systems Integration; Algorithms
8.
IEEE Trans Biomed Eng; 49(3): 233-9, 2002 Mar.
Article in English | MEDLINE | ID: mdl-11876287

ABSTRACT

For the compression of medical signals such as the electrocardiogram (ECG), excellent reconstruction quality of a highly compressed signal can be obtained with a wavelet-based approach. The most widely used objective quality criterion for a compressed ECG is the percent root-mean-square difference (PRD). In this paper, given a user-specified PRD, an algorithm is proposed that meets the PRD demand by searching for an appropriate bit rate automatically, smoothly, and quickly in a wavelet-based compressor. The bit-rate search is modeled as a root-finding problem for a one-dimensional function, where an unknown rate-distortion curve represents the function and the desired rate is the root to be found. A solution derived from the root-finding methods of numerical analysis is proposed and incorporated into a well-known wavelet-based coding strategy, set partitioning in hierarchical trees. ECG signals taken from the MIT/BIH database are tested, and excellent results are obtained in terms of convergence speed, quality variation, and coding performance.
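As a hedged illustration of casting the rate search as one-dimensional root finding, the sketch below uses plain bisection on f(rate) = PRD(rate) - target, assuming PRD decreases monotonically with the bit rate. The paper's actual search strategy is more refined, and encode_decode is an assumed callable standing in for a wavelet codec such as SPIHT.

```python
import numpy as np

def prd(original, reconstructed):
    """Percent root-mean-square difference between two ECG signals."""
    original = np.asarray(original, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2) /
                           np.sum(original ** 2))

def find_rate_for_prd(encode_decode, signal, target_prd,
                      rate_lo=0.05, rate_hi=4.0, tol=0.05, max_iter=20):
    """encode_decode(signal, rate) -> reconstructed signal (assumed callable).
    Returns a bit rate whose PRD is within tol of the target."""
    for _ in range(max_iter):
        rate = 0.5 * (rate_lo + rate_hi)
        err = prd(signal, encode_decode(signal, rate)) - target_prd
        if abs(err) <= tol:
            return rate
        if err > 0:          # distortion still too high: spend more bits
            rate_lo = rate
        else:                # distortion below target: fewer bits suffice
            rate_hi = rate
    return rate
```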


Subjects
Algorithms; Electrocardiography; Signal Processing, Computer-Assisted; Humans