Results 1 - 20 of 25
1.
Med Phys ; 43(2): 833-42, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26843244

ABSTRACT

PURPOSE: The assessment of early therapeutic response to anticancer therapy is vital for treatment planning and patient management in the clinic. With the development of personalized treatment planning, assessing early treatment response, especially before any anatomically apparent post-treatment changes, has become an urgent clinical need. Positron emission tomography (PET) imaging plays an important role in clinical oncology for tumor detection, staging, and therapy response assessment. Many studies on therapy response involve interpretation of differences between two PET images, usually in terms of standardized uptake values (SUVs). However, the quantitative accuracy of this measurement is limited. This work proposes a statistically robust approach for therapy response assessment based on a Gaussian random field (GRF), providing a statistically more meaningful scale for evaluating therapy effects. METHODS: The authors propose a new criterion for therapeutic assessment that incorporates image noise into the traditional SUV method. An analytical method based on approximate expressions of the Fisher information matrix was applied to model the variance of individual pixels in reconstructed images. A zero-mean, unit-variance GRF under the null hypothesis (no response to therapy) was obtained by normalizing each pixel of the post-therapy image with the mean and standard deviation of the pretherapy image. The performance of the proposed method was evaluated by Monte Carlo simulation using XCAT phantoms (128 × 128 pixels) with lesions of various diameters (2-6 mm), multiple tumor-to-background contrasts (3-10), and different changes in intensity (6.25%-30%). Receiver operating characteristic curves and the corresponding areas under the curve were computed for both the proposed method and the traditional methods whose figure of merit is the percentage change in SUVs. A formula for false positive rate (FPR) estimation was developed for the proposed therapy response assessment, utilizing a local-average method based on the random field. The accuracy of the estimation was validated in terms of Euler distance and correlation coefficient. RESULTS: The performance of therapy response assessment is significantly improved by the introduction of variance, with a higher area under the curve (97.3%) than SUVmean (91.4%) and SUVmax (82.0%). In addition, the FPR estimate serves as a good prediction of the specificity of the proposed method, consistent with the simulation outcome (correlation coefficient of ∼1). CONCLUSIONS: The authors developed a method to evaluate therapy response from PET images, which were modeled as a Gaussian random field. The digital phantom simulations demonstrated that the proposed method achieves a large reduction in statistical variability by incorporating knowledge of the variance of the original Gaussian random field. The proposed method has the potential to enable prediction of early treatment response and shows promise for application in clinical practice. In future work, the authors will report on the robustness of the estimation theory for clinical therapy response evaluation, which pertains to binary discrimination tasks at a fixed location in the image, such as the detection of small, weak lesions.
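A minimal sketch of the normalization step described in this abstract, assuming the pre-therapy pixel-wise mean and standard deviation have already been estimated (for example, from the Fisher-information-based variance model); the array names and the z-threshold are illustrative, not the authors' implementation.

```python
import numpy as np

def therapy_response_zmap(pre_mean, pre_std, post_image):
    """Normalize a post-therapy PET image by the pre-therapy mean and standard
    deviation, giving a zero-mean, unit-variance Gaussian random field under the
    null hypothesis of no response (a sketch, not the authors' exact code)."""
    return (post_image - pre_mean) / np.maximum(pre_std, 1e-12)

# Hypothetical usage: flag pixels whose change exceeds a z threshold
rng = np.random.default_rng(0)
pre_mean = rng.uniform(1.0, 2.0, (128, 128))   # pretherapy image (mean)
pre_std = 0.1 * np.sqrt(pre_mean)              # assumed pixel-wise noise model
post = pre_mean + rng.normal(0.0, 1.0, (128, 128)) * pre_std
zmap = therapy_response_zmap(pre_mean, pre_std, post)
print("fraction of pixels with |z| > 3:", np.mean(np.abs(zmap) > 3.0))
```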


Subjects
Image Processing, Computer-Assisted ; Positron-Emission Tomography ; Monte Carlo Method ; Normal Distribution ; Precision Medicine ; Signal-To-Noise Ratio ; Treatment Outcome
2.
Biomed Eng Online ; 13: 90, 2014 Jun 30.
Article in English | MEDLINE | ID: mdl-24981916

ABSTRACT

BACKGROUND: The inter-patient classification schema and the Association for the Advancement of Medical Instrumentation (AAMI) standards are important to the construction and evaluation of automated heartbeat classification systems. The majority of previously proposed methods that take these two aspects into consideration use the same features and classification method for all classes of heartbeats, and their performance is often unsatisfactory for the ventricular ectopic beat (VEB) and supraventricular ectopic beat (SVEB). METHODS: Based on the different characteristics of VEB and SVEB, a novel hierarchical heartbeat classification system was constructed to improve the classification performance for these two classes by using different features and classification methods for each. First, random projection and a support vector machine (SVM) ensemble were used to detect VEB. Then, the RR-interval ratio was compared with a predetermined threshold to detect SVEB. The optimal parameters for the classification models were selected on the training set and applied to the independent testing set to assess the final performance of the classification system. The effect of different lead configurations on the classification results was also evaluated. RESULTS: The performance of this classification system was notably superior to that of other methods. The VEB detection sensitivity was 93.9% with a positive predictive value of 90.9%, and the SVEB detection sensitivity was 91.1% with a positive predictive value of 42.2%. In addition, the classification process was relatively fast. CONCLUSIONS: A hierarchical heartbeat classification system based on the inter-patient data division was proposed to detect VEB and SVEB. It demonstrated better classification performance than existing methods and can be regarded as a promising system for detecting VEB and SVEB in unknown patients in clinical practice.
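A hedged sketch of the hierarchy described above, using scikit-learn's random projection, a bagged SVM as a stand-in for the paper's SVM ensemble, and a placeholder RR-interval ratio threshold; the feature dimensions, threshold value, and training data are all hypothetical.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import BaggingClassifier
from sklearn.random_projection import GaussianRandomProjection
from sklearn.pipeline import make_pipeline

# Stage 1: VEB detector: random projection of beat features followed by an
# SVM ensemble (bagging is used here as a stand-in for the paper's ensemble).
veb_detector = make_pipeline(
    GaussianRandomProjection(n_components=30, random_state=0),
    BaggingClassifier(SVC(kernel="rbf", C=1.0), n_estimators=10, random_state=0),
)

def classify_beat(features, rr_prev, rr_local_mean, rr_ratio_threshold=0.85):
    """Hierarchical decision: VEB first, then SVEB via an RR-interval ratio
    compared with a predetermined threshold (threshold value is illustrative)."""
    if veb_detector.predict(features.reshape(1, -1))[0] == 1:
        return "VEB"
    if rr_prev / rr_local_mean < rr_ratio_threshold:   # premature beat
        return "SVEB"
    return "other"

# Hypothetical training data: 200 beats x 100 morphological features
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 100))
y_train = rng.integers(0, 2, size=200)   # 1 = VEB, 0 = non-VEB
veb_detector.fit(X_train, y_train)
print(classify_beat(rng.normal(size=100), rr_prev=0.60, rr_local_mean=0.80))
```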


Subjects
Electrocardiography/methods ; Heart Rate ; Signal Processing, Computer-Assisted ; Statistics as Topic/methods ; Humans ; Support Vector Machine ; Ventricular Premature Complexes/physiopathology
3.
Biomed Eng Online ; 13: 72, 2014 Jun 05.
Article in English | MEDLINE | ID: mdl-24903422

ABSTRACT

BACKGROUND: Left bundle branch block (LBBB) and right bundle branch block (RBBB) not only mask electrocardiogram (ECG) changes that reflect diseases but also indicate important underlying pathology. The timely detection of LBBB and RBBB is critical in the treatment of cardiac diseases. Inter-patient heartbeat classification is based on independent training and testing sets to construct and evaluate a heartbeat classification system. Therefore, a heartbeat classification system with a high performance evaluation possesses a strong predictive capability for unknown data. The aim of this study was to propose a method for inter-patient classification of heartbeats to accurately detect LBBB and RBBB from the normal beat (NORM). METHODS: This study proposed a heartbeat classification method through a combination of three different types of classifiers: a minimum distance classifier constructed between NORM and LBBB; a weighted linear discriminant classifier between NORM and RBBB based on Bayesian decision making using posterior probabilities; and a linear support vector machine (SVM) between LBBB and RBBB. Each classifier was used with matching features to obtain better classification performance. The final types of the test heartbeats were determined using a majority voting strategy through the combination of class labels from the three classifiers. The optimal parameters for the classifiers were selected using cross-validation on the training set. The effects of different lead configurations on the classification results were assessed, and the performance of these three classifiers was compared for the detection of each pair of heartbeat types. RESULTS: The study results showed that a two-lead configuration exhibited better classification results compared with a single-lead configuration. The construction of a classifier with good performance between each pair of heartbeat types significantly improved the heartbeat classification performance. The results showed a sensitivity of 91.4% and a positive predictive value of 37.3% for LBBB and a sensitivity of 92.8% and a positive predictive value of 88.8% for RBBB. CONCLUSIONS: A multi-classifier ensemble method was proposed based on inter-patient data and demonstrated a satisfactory classification performance. This approach has the potential for application in clinical practice to distinguish LBBB and RBBB from NORM of unknown patients.
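The majority-voting combination of the three pairwise class labels might look like the following sketch; the tie-handling rule is an assumption, not taken from the paper.

```python
from collections import Counter

def vote_heartbeat(label_norm_lbbb, label_norm_rbbb, label_lbbb_rbbb):
    """Combine the class labels of the three pairwise classifiers (NORM vs LBBB,
    NORM vs RBBB, LBBB vs RBBB) by majority voting; how ties are resolved is an
    assumption made for this sketch."""
    votes = Counter([label_norm_lbbb, label_norm_rbbb, label_lbbb_rbbb])
    label, count = votes.most_common(1)[0]
    if count >= 2:
        return label
    return "unclassified"   # all three classifiers disagree

# Example: NORM-vs-LBBB says NORM, the other two classifiers both say RBBB
print(vote_heartbeat("NORM", "RBBB", "RBBB"))   # -> RBBB
```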


Subjects
Bundle-Branch Block/physiopathology ; Electrocardiography/methods ; Signal Processing, Computer-Assisted ; Statistics as Topic/methods ; Bayes Theorem ; Discriminant Analysis ; Humans ; Support Vector Machine
4.
IEEE Trans Biomed Eng ; 60(11): 3226-37, 2013 Nov.
Article in English | MEDLINE | ID: mdl-23846434

ABSTRACT

CHENFIT-AMP is a novel nonlinear strategy that combines the fitting (gain prescription) and amplification (gain implementation) procedures for cochlear hearing loss. The fitting part of CHENFIT-AMP prescribes gain for outer hair cell (OHC) and inner hair cell (IHC) loss, respectively. The gain for OHC loss varies with the cochlear gain, which is determined by the amount of OHC loss and the input level. The gain for IHC loss varies only with the amount of IHC loss and is limited to a constant if there is a "dead region." The amplification part of CHENFIT-AMP is responsible for estimating the input level and cochlear gain based on Chen's loudness model. CHENFIT-AMP is evaluated with four typical audiograms and nine individual audiograms. A widely used nonlinear fitting procedure, NAL-NL2, is evaluated to compare prescription results with CHENFIT-AMP; a standard nonlinear amplification algorithm, multichannel compression (MCC), with parameters provided by NAL-NL2, is also evaluated to compare amplification results with CHENFIT-AMP. For long-term average speech spectrum (LTASS) inputs, CHENFIT-AMP generally prescribes gain similar to NAL-NL2 for the typical audiograms; however, the gain prescribed by CHENFIT-AMP is more individualized than that of NAL-NL2 for the individual audiograms, especially when the audiograms have large deviations in slope. For LTASS-shaped noise input, the gain implemented by MCC with parameters provided by NAL-NL2 cannot completely realize the gain prescribed by NAL-NL2. For speech sentence inputs, average subject ratings indicated that amplification by CHENFIT-AMP was preferred and led to a louder percept than MCC with parameters from NAL-NL2.


Subjects
Hearing Aids ; Hearing Loss, Sensorineural/therapy ; Nonlinear Dynamics ; Signal Processing, Computer-Assisted ; Humans ; Loudness Perception/physiology
5.
Sensors (Basel) ; 13(4): 5167-80, 2013 Apr 18.
Article in English | MEDLINE | ID: mdl-23598502

ABSTRACT

A research prototype CT scanner is currently under development in our lab, and one of the key components in this project is the CT detector. This paper describes the design and performance evaluation of the modular CT detector unit for our proposed scanner. It consists of a Photodiode Array Assembly, which captures incident X-ray photons and converts their energy into electrical current, and a mini Data Acquisition System, which performs current integration and converts the analog signal into digital samples. Multiple detector units can easily be tiled together to form a complete CT detector. Experiments were conducted to characterize detector performance at both the single-unit level and the system level. The noise level, linearity, and uniformity of the proposed detector unit are reported, and initial imaging studies are also presented that demonstrate the potential application of the proposed detector unit in actual CT scanners.


Subjects
Tomography Scanners, X-Ray Computed ; Tomography, X-Ray Computed/instrumentation ; Artifacts ; Humans ; Phantoms, Imaging ; Principal Component Analysis ; Radiographic Image Enhancement
6.
Comput Med Imaging Graph ; 36(1): 11-24, 2012 Jan.
Article in English | MEDLINE | ID: mdl-21555207

ABSTRACT

Quantification of coronary arterial stenoses is useful for the diagnosis of several coronary heart diseases. Being noninvasive, economical, and informative, computed tomographic angiography (CTA) has become a common modality for monitoring disease status and treatment effects. Here, we present a new method for detecting and quantifying coronary arterial stenosis in CTA using a fuzzy distance transform (FDT) approach and a new coherence analysis of observed data for computing the expected local diameter. FDT allows local depth to be computed at each image point in the presence of partial voluming and thus eliminates the need for binarization, which is commonly associated with the introduction of additional errors. In the current method, coronary arterial stenoses are detected and their severities quantified by analyzing FDT values along the medial axis of an arterial tree obtained by skeletonization. A new skeletal pruning algorithm has been developed to improve the quality of the medial axes and thereby enhance the accuracy of stenosis detection and quantification. Further, we have developed a new method to estimate the "expected diameter" along a given arterial branch using a new coherence analysis of observed diameter values along the branch. The overall method consists of the following steps: (1) fuzzy segmentation of the coronary artery in CTA, (2) FDT computation of the coronary arteries, (3) medial axis computation, (4) estimation of observed and expected diameters along the arteries, and (5) detection of stenoses and quantification of arterial blockage. The performance of this method has been quantitatively evaluated on a realistic coronary artery phantom dataset with randomly simulated stenoses, and the results have been compared with a binary distance transform-based algorithm and a conventional binary algorithm. The method has also been applied to a clinical CTA dataset from thirteen heart patients, and the results have been compared with an expert's quantitative assessment of stenoses. Results of the phantom experiment indicate that the new method (error: 0.53%) is significantly more accurate than both the binary distance transform-based (error: 2.11%) and conventional binary (error: 3.71%) methods. The results of the clinical study indicate that the new FDT-based method (kappa coefficient = 87.9%) agrees strongly with the expert's assessments and, in this respect, outperforms the other two methods (kappa coefficients = 75.2% and 69.5%).
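A sketch of the final quantification step, assuming observed diameters (e.g., derived from FDT values along the medial axis) and expected diameters (from the coherence analysis) are already available; the severity formula and detection threshold shown here are illustrative, not the paper's exact values.

```python
import numpy as np

def stenosis_severity(observed_diameter, expected_diameter, threshold=0.3):
    """Percent luminal narrowing along a branch medial axis, computed from
    observed diameters and the expected diameters estimated by coherence
    analysis. The 30% detection threshold is only an example."""
    severity = 1.0 - np.asarray(observed_diameter) / np.asarray(expected_diameter)
    stenotic = severity > threshold
    return severity, stenotic

# Hypothetical branch: expected diameter tapers smoothly, observed dips locally
expected = np.linspace(4.0, 3.0, 50)            # mm
observed = expected.copy()
observed[20:25] *= 0.5                          # simulated 50% stenosis
severity, flags = stenosis_severity(observed, expected)
print("max narrowing: %.0f%%" % (100 * severity.max()))
```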


Subjects
Coronary Angiography/methods ; Coronary Stenosis/diagnostic imaging ; Fuzzy Logic ; Imaging, Three-Dimensional/methods ; Pattern Recognition, Automated/methods ; Radiographic Image Interpretation, Computer-Assisted/methods ; Tomography, X-Ray Computed/methods ; Algorithms ; Humans ; Radiographic Image Enhancement/methods ; Reproducibility of Results ; Sensitivity and Specificity
7.
Microsc Res Tech ; 75(3): 388-96, 2012 Mar.
Article in English | MEDLINE | ID: mdl-21898668

ABSTRACT

Morphological observation and analysis of the cerebral microvascular network is an essential way to study cerebral function. Automated labeling of cerebral microvasculature in microscopy images is one of the key steps in the quantitative analysis of the microvascular network in brain mantle specimens. In this work, an automated image processing approach based on a curvilinear structure detector is applied to label and analyze the microvasculature in the images. A steerable filter is also introduced to resolve detection ambiguity in branching regions. Vascular morphology analysis, such as average microvascular density, is then performed after image processing. Validation demonstrated that the results of the proposed approach are satisfactory. The proposed method is finally applied to the study of cerebral microvascular dysfunction induced by γ-ray irradiation.


Subjects
Cerebral Cortex/blood supply ; Image Processing, Computer-Assisted/methods ; Algorithms ; Animals ; Cerebral Cortex/radiation effects ; Cerebrovascular Circulation/radiation effects ; Gamma Rays/adverse effects ; Mice ; Mice, Inbred ICR ; Microvessels/pathology ; Microvessels/radiation effects ; Software
8.
Hear Res ; 282(1-2): 69-80, 2011 Dec.
Article in English | MEDLINE | ID: mdl-21983133

ABSTRACT

A model for calculating auditory excitation patterns and loudness for steady sounds for normal hearing is extended to deal with cochlear hearing loss. The filters used in the model have a double ROEX-shape, the gain of the narrow active filter being controlled by the output of the broad passive filter. It is assumed that the hearing loss at each audiometric frequency can be partitioned into a loss due to dysfunction of outer hair cells (OHCs) and a loss due to dysfunction of inner hair cells (IHCs). OHC loss is modeled by decreasing the maximum gain of the active filter, which results in increased absolute threshold, reduced compressive nonlinearity and reduced frequency selectivity. IHC loss is modeled by a level-dependent attenuation of excitation level, which results in elevated absolute threshold. The magnitude of OHC loss and IHC loss can be derived from measures of loudness recruitment and the measured absolute threshold, using an iterative procedure. The model accurately fits loudness recruitment data obtained using subjects with unilateral or highly asymmetric cochlear hearing loss who were required to make loudness matches between tones presented alternately to the two ears. With the same parameters, the model predicted loudness matches between narrowband and broadband sound reasonably well, reflecting loudness summation. The model can also predict when a dead region is present.


Subjects
Cochlea/physiopathology ; Hearing Loss, Sensorineural/physiopathology ; Hearing Loss, Sensorineural/psychology ; Loudness Perception ; Models, Neurological ; Pattern Recognition, Physiological ; Acoustic Stimulation ; Audiometry ; Auditory Threshold ; Cochlea/pathology ; Hair Cells, Auditory, Inner/pathology ; Hair Cells, Auditory, Outer/pathology ; Hearing Loss, Sensorineural/pathology ; Humans
9.
Hear Res ; 282(1-2): 204-15, 2011 Dec.
Article in English | MEDLINE | ID: mdl-21851853

ABSTRACT

A new method for calculating auditory excitation patterns and loudness for steady sounds is described. The method is based on a nonlinear filterbank in which each filter is the sum of a broad passive filter and a sharp active filter. All filters have a rounded-exponential shape. For each center frequency (CF), the gain of the active filter is controlled by the output of the passive filter. The parameters of the model were derived from large sets of previously published notched-noise masking data obtained from human subjects. Excitation patterns derived using the new filterbank include the effects of basilar membrane compression. Loudness can be calculated as the area under the excitation pattern when plotted in intensity-like units on an ERB(N)-number (Cam) scale; no transformation from excitation to specific loudness is required. The method predicts the standard equal-loudness contours and loudness as a function of bandwidth with good accuracy. With some additional assumptions, the method also gives reasonably accurate predictions of partial loudness.
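The ERB_N-number (Cam) scale referred to above follows the standard Glasberg and Moore formula, and loudness is then the area under the excitation pattern on that scale; the sketch below uses a generic numerical integration and a made-up excitation pattern rather than the paper's filterbank output, and omits any calibration to sones.

```python
import numpy as np

def erb_number(f_hz):
    """ERB_N-number (Cam) scale of Glasberg and Moore:
    Cam = 21.4 * log10(0.00437 * f + 1), with f in Hz."""
    return 21.4 * np.log10(0.00437 * np.asarray(f_hz) + 1.0)

def loudness_from_excitation(cam, excitation_intensity_like):
    """Loudness taken as the area under the excitation pattern plotted in
    intensity-like units on the Cam scale (numerical sketch only)."""
    return np.trapz(excitation_intensity_like, cam)

# Hypothetical excitation pattern peaking near 1 kHz
cfs = np.linspace(50.0, 15000.0, 500)                  # center frequencies, Hz
cams = erb_number(cfs)
excitation = np.exp(-0.5 * ((cams - erb_number(1000.0)) / 2.0) ** 2)
print("relative loudness:", loudness_from_excitation(cams, excitation))
```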


Subjects
Auditory Pathways/physiology ; Basilar Membrane/physiology ; Loudness Perception ; Models, Biological ; Nonlinear Dynamics ; Acoustic Stimulation ; Auditory Threshold ; Humans ; Noise/adverse effects ; Perceptual Masking ; Reproducibility of Results
10.
Comput Methods Biomech Biomed Engin ; 14(4): 371-8, 2011 Apr.
Article in English | MEDLINE | ID: mdl-21442495

ABSTRACT

Alzheimer's disease (AD) is considered one of the most common age-associated neurodegenerative disorders, affecting millions of elderly people worldwide. Combining protein-protein interaction (PPI) network analysis with gene expression studies provides better insight into AD. In this work, a computational approach was developed to identify protein signaling pathways between amyloid precursor proteins and tau proteins, which are well known to be important in AD. First, a modified LA-SEN method, called network-constrained regularisation analysis, was applied to microarray data from a transgenic mouse model and AD patients. Protein pathways were then constructed using an integer linear programming model to integrate the microarray data and the PPI database. Important AD pathways, including some cancer-related pathways, were ultimately identified.


Subjects
Alzheimer Disease/metabolism ; Gene Expression Profiling ; Proteins/metabolism ; Signal Transduction ; Algorithms ; Alzheimer Disease/genetics ; Amyloid beta-Protein Precursor/genetics ; Amyloid beta-Protein Precursor/metabolism ; Animals ; Glycogen Synthase Kinase 3/genetics ; Glycogen Synthase Kinase 3/metabolism ; Linear Models ; Mice ; Mice, Inbred C57BL ; Mice, Transgenic ; Proteins/genetics ; tau Proteins/genetics ; tau Proteins/metabolism
11.
IEEE Trans Biomed Eng ; 57(12): 2876-83, 2010 Dec.
Article in English | MEDLINE | ID: mdl-20833597

ABSTRACT

Fluorescence molecular tomography (FMT) plays an important role in studying physiological and pathological processes of small animals in vivo at the molecular level. However, this technique suffers from relatively low spatial resolution. To compensate for this limitation, there has been strong demand for providing functional and morphological analysis at the same time. In this paper, we propose a hybrid full-angle free-space FMT and X-ray micro-cone-beam computed tomography (micro-CBCT) prototype system that provides both functional and anatomical images. During the whole acquisition, the two subsystems acquire projection images (fluorescence and CT) synchronously to maintain a consistent body position without moving the animals. The acquired datasets are intrinsically co-registered in a common coordinate system with identified geometry. Tomographic fluorescence and CT images are reconstructed using normalized Born-based spatial regularization and the Feldkamp-Davis-Kress method, respectively. Experimental results from both a phantom and an in vivo mouse preliminarily validate the accuracy and performance of the integrated system.


Subjects
Image Processing, Computer-Assisted/methods ; Spectrometry, Fluorescence/methods ; X-Ray Microtomography/methods ; Algorithms ; Animals ; Equipment Design ; Mice ; Mice, Nude ; Models, Animal ; Spectrometry, Fluorescence/instrumentation ; X-Ray Microtomography/instrumentation
12.
J Neurosci Methods ; 190(2): 299-309, 2010 Jul 15.
Article in English | MEDLINE | ID: mdl-20580743

ABSTRACT

High-content neuron image processing is considered an important method for quantitative neurobiological studies. The main goal of this paper is to provide automatic image processing approaches for neuron images in high-content screening studies of neuronal mechanisms. In the nuclei channel, all nuclei are segmented and detected by applying a gradient-vector-field-based watershed. The neuronal nuclei are then selected based on the soma regions detected in the neurite channel. For neurite images, we propose a novel neurite centerline extraction approach using an improved line-pixel detection technique. The proposed neurite tracing method detects curvilinear structures more accurately than existing methods. Finally, an interface called NeuriteIQ, based on the proposed algorithms, is developed for application in high-content screening.


Subjects
Image Processing, Computer-Assisted/methods ; Neurites ; Neurons/cytology ; Algorithms ; Amyloid beta-Peptides/metabolism ; Animals ; Automation ; Cell Nucleus ; Cells, Cultured ; Linear Models ; Mice ; Mice, Inbred C57BL ; Neurons/metabolism ; Normal Distribution ; Software ; Software Design
13.
Microsc Res Tech ; 73(2): 109-18, 2010 Feb.
Article in English | MEDLINE | ID: mdl-19697431

ABSTRACT

Neurons that come to populate the six-layered cerebral cortex are born deep within the developing brain at the surface of the embryonic cerebral ventricles. Detecting these neurons is very important for studying the histogenesis of the brain and the abnormal migration that has been linked to cognitive deficits, mental retardation, and motor disorders. Labeled cells in brain sections were visualized by immunocytochemical examination, and the image data were documented as microscopic pictures. Accurate automatic neuron labeling is therefore a prerequisite, replacing time-consuming manual labeling. In this article, a fully automated image processing approach is proposed to detect all the stained neurons in microscopic images. First, darkly stained neurons are extracted by thresholding the blue channel of the image. Then a modified fuzzy c-means clustering method, called alternative fuzzy c-means, is applied to achieve higher classification accuracy in extracting the constraint factor. Finally, a watershed based on gradient vector flow is applied to the constraint factor image to segment all the neurons, including clustered neurons. The results demonstrate that the proposed method can be a useful tool in neuron image analysis.
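A simplified sketch of such a pipeline, with Otsu thresholding and a distance-transform watershed standing in for the paper's alternative fuzzy c-means and gradient-vector-flow watershed; the test image is synthetic.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

def segment_stained_neurons(rgb_image):
    """Threshold the blue channel to pick up darkly stained neurons, then split
    touching cells with a distance-transform watershed (stand-ins for the
    paper's alternative fuzzy c-means and GVF watershed)."""
    blue = rgb_image[..., 2].astype(float)
    mask = blue < threshold_otsu(blue)                 # dark staining = low blue values
    distance = ndi.distance_transform_edt(mask)
    regions = ndi.label(mask)[0]
    peaks = peak_local_max(distance, labels=regions, min_distance=5)
    markers = np.zeros_like(regions)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=mask)

# Hypothetical image: two overlapping dark blobs on a bright background
img = np.full((64, 64, 3), 220, dtype=np.uint8)
yy, xx = np.mgrid[:64, :64]
for cy, cx in [(28, 26), (36, 38)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 100] = 60
labels = segment_stained_neurons(img)
print("neurons found:", labels.max())
```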


Subjects
Automation/methods ; Brain/cytology ; Cell Movement ; Image Processing, Computer-Assisted/methods ; Microscopy/methods ; Neurons/physiology ; Animals ; Female ; Male ; Mice ; Mice, Inbred ICR ; Pregnancy ; Staining and Labeling/methods
14.
IEEE NIH Life Sci Syst Appl Workshop ; 2009: 128-132, 2009 Apr 01.
Article in English | MEDLINE | ID: mdl-20585413

ABSTRACT

We report a multifactorial analysis of candidate mechanisms of Alzheimer's disease utilizing high content analysis, gene expression microarrays, and a linear regression model to integrate neuronal imaging data with hippocampal gene expression data. Our analysis led to the identification of several genes that may contribute to different image traits or phenotypes in amyloid-beta (Aβ)-injured neurons. Gene networks and biological pathways for these genes were further analyzed, revealing several novel pathways that may contribute to amyloid plaque-triggered neurite loss.

15.
Magn Reson Med ; 60(3): 650-60, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18727083

ABSTRACT

Parallel MRI (pMRI) achieves imaging acceleration by partially substituting gradient-encoding steps with spatial information contained in the component coils of the acquisition array. Variable-density subsampling in pMRI was previously shown to yield improved two-dimensional (2D) imaging in comparison to uniform subsampling, but has yet to be used routinely in clinical practice. In an effort to reduce acquisition time for 3D fast spin-echo (3D-FSE) sequences, this work explores a specific nonuniform sampling scheme for 3D imaging, subsampling along two phase-encoding (PE) directions on a rectilinear grid. We use two reconstruction methods, 2D-GRAPPA-Operator and 2D-SPACE RIP, and present a comparison between them. We show that high-quality images can be reconstructed using both techniques. To evaluate the proposed sampling method and reconstruction schemes, results via simulation, phantom study, and in vivo 3D human data are shown. We find that fewer artifacts can be seen in the 2D-SPACE RIP reconstructions than in 2D-GRAPPA-Operator reconstructions, with comparable reconstruction times.


Subjects
Echo-Planar Imaging/methods ; Echo-Planar Imaging/standards ; Magnetic Resonance Imaging/methods ; Magnetic Resonance Imaging/standards ; Adult ; Algorithms ; Artifacts ; Brain/anatomy & histology ; Humans ; Male ; Phantoms, Imaging ; Radiographic Image Enhancement
16.
Comput Methods Programs Biomed ; 88(2): 131-43, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17919766

ABSTRACT

For automated visualization and quantification of artery diseases, the accurate determination of the arterial centerline is a prerequisite. Existing tracking-based approaches usually suffer from inaccuracy, inflexion, and discontinuity in the extracted centerlines, and they may even fail in complicated situations. In this paper, an improved algorithm for coronary arterial centerline extraction is proposed, which incorporates a new tracking direction updating scheme, a self-adaptive magnitude of linear extrapolation, and a dynamic-size search window for matched filtering. A simulation study is conducted to determine the optimal weighting factor used to combine geometrical topology information and intensity distribution information into the proposed tracking direction. Synthetic and clinical examples, representing some difficult situations that may occur in coronary angiograms, are presented. Results show that the proposed algorithm outperforms conventional methods: by adopting the proposed algorithm, centerlines are successfully extracted in these complicated situations and with satisfactory accuracy.
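One plausible form of the tracking direction update, blending the extrapolated (geometric) direction with the matched-filter (intensity) direction through a weighting factor; the simple linear blend and the value of w are illustrative, and the paper selects the weighting factor by a simulation study.

```python
import numpy as np

def update_tracking_direction(prev_direction, filter_direction, w=0.6):
    """Combine the geometrical-topology direction (e.g., linear extrapolation of
    previous centerline points) with the direction of the strongest matched-filter
    response, using a weighting factor w (value shown is only an example)."""
    prev_direction = np.asarray(prev_direction, dtype=float)
    filter_direction = np.asarray(filter_direction, dtype=float)
    d = (w * prev_direction / np.linalg.norm(prev_direction)
         + (1.0 - w) * filter_direction / np.linalg.norm(filter_direction))
    return d / np.linalg.norm(d)

# Example: previous direction mostly horizontal, filter suggests a slight turn
print(update_tracking_direction([1.0, 0.0], [0.8, 0.6]))
```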


Subjects
Algorithms ; Blood Vessels ; Coronary Angiography/methods ; China ; Coronary Angiography/standards ; Humans
17.
Doc Ophthalmol ; 112(3): 201-7, 2006 May.
Article in English | MEDLINE | ID: mdl-16779498

ABSTRACT

PURPOSE: To increase the accuracy of classifying subjects with or without early diabetic retinopathy by analyzing multifocal electroretinogram (mfERG) responses. METHOD: The mfERG was recorded for 14 control subjects (Normal) and 26 non-insulin-dependent diabetic patients; 16 of the patients had no apparent retinopathy (NDR), and the other 10 had mild to moderate non-proliferative retinopathy (NPDR). The first-order kernel (K1) and the first slice of the second-order kernel (K21) of the mfERG were summed across the field and exported as trace arrays. The feature subset was selected from the kernel arrays using an inter-intra distance criterion and a sequential forward search strategy. Based on the selected features, Fisher's linear classifiers were trained and the classification error rates were obtained. RESULT: With one feature selected, the error rates of the classifiers were greater than 23% between the NDR and NPDR subjects. When six features were selected to train the classifiers, the classification error rate decreased to 6.5% between Normal and NDR, 0 between Normal and NPDR, and 9.6% between NDR and NPDR. The inter-intra distance criterion function was calculated for each feature in the K1 and K21 arrays. According to the ranking of the criterion values, in early DR deformations should be easier to find at the front limb of peaks N2 and P1 on the K1 trace, and at the front limb of peak N2 on the K21 trace. In conclusion, mfERG responses, combined with pattern classification methods, have great potential for exploring diabetic retinopathy.
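A sketch of the feature selection and classification chain, assuming a between-class to within-class scatter ratio as the inter-intra distance criterion (the paper's exact definition may differ) and scikit-learn's LDA as the Fisher-type linear classifier; the feature matrix is synthetic.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def inter_intra_distance(X, y):
    """Ratio of between-class to within-class scatter for the selected feature
    columns (one plausible form of the inter-intra distance criterion)."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = sum(np.sum(y == c) * np.sum((X[y == c].mean(axis=0) - overall_mean) ** 2)
                  for c in classes)
    within = sum(np.sum((X[y == c] - X[y == c].mean(axis=0)) ** 2) for c in classes)
    return between / (within + 1e-12)

def sequential_forward_selection(X, y, n_features):
    """Greedy sequential forward search maximizing the criterion above."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_features:
        best = max(remaining, key=lambda j: inter_intra_distance(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical mfERG feature matrix: 40 subjects x 60 waveform samples
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 60))
y = np.repeat([0, 1], 20)
X[y == 1, :5] += 1.0                       # make the first 5 samples informative
cols = sequential_forward_selection(X, y, n_features=6)
clf = LinearDiscriminantAnalysis().fit(X[:, cols], y)   # Fisher-type linear classifier
print("selected features:", cols, "training accuracy:", clf.score(X[:, cols], y))
```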


Subjects
Diabetic Retinopathy/diagnosis ; Retina/physiopathology ; Adult ; Aged ; Diabetic Retinopathy/physiopathology ; Diagnosis, Differential ; Electroretinography/methods ; Female ; Humans ; Male ; Middle Aged ; Models, Theoretical ; Severity of Illness Index
18.
Vision Res ; 46(8-9): 1292-301, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16380147

ABSTRACT

Psychophysics of reading with limited numbers of pixels has received increasing attention as the envisioned visual prosthesis offers a possibility to restore some useful reading ability to the blind through the pixelized vision it generates. This paper systematically studied how several important parameters of pixelized vision affect reading performance. A closed-circuit television reading platform with digital image processing capacities was developed to convert images of printed text into pixelized patterns made up of discrete dots. Reading rates in six normally sighted subjects were measured under different combinations of pixel number, window width, and angular subtense of pixel array. The results showed that reading is possible with as few as 6 x 6 binary pixels, at 15 words/min. It was also found that for a given array of pixels, maximum reading rates occur at a specific medium window width, due to a tradeoff between window width and character sampling resolution. It was also observed that pixelized reading exhibits more significant scale dependence than normal vision. Reading rates were decreased by increasing the angular subtense of the pixel array while keeping other parameters fixed. We hope these results will be helpful to the design of visual prosthesis for the rehabilitation of reading abilities.
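A minimal sketch of how a text image can be reduced to an n × n pattern of binary dots by block averaging and thresholding; the block scheme and threshold are assumptions, not the CCTV platform's actual processing.

```python
import numpy as np

def pixelize(gray_image, n_pixels=6, threshold=0.5):
    """Reduce a grayscale text image (values in [0, 1]) to an n x n array of
    binary 'dots' by block averaging and thresholding, as a stand-in for the
    pixelized patterns used in simulated prosthetic vision."""
    h, w = gray_image.shape
    ys = np.linspace(0, h, n_pixels + 1).astype(int)
    xs = np.linspace(0, w, n_pixels + 1).astype(int)
    blocks = np.array([[gray_image[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                        for j in range(n_pixels)] for i in range(n_pixels)])
    return (blocks > threshold).astype(int)

# Hypothetical 60x60 "character": a bright vertical bar on a dark background
img = np.zeros((60, 60))
img[:, 25:35] = 1.0
print(pixelize(img, n_pixels=6))
```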


Subjects
Blindness/rehabilitation ; Computer Graphics ; Reading ; Sensory Aids ; Adult ; Humans ; Image Processing, Computer-Assisted ; Psychophysics ; Visual Perception
19.
Conf Proc IEEE Eng Med Biol Soc ; 2005: 4568-71, 2005.
Article in English | MEDLINE | ID: mdl-17281256

ABSTRACT

About 25% of epilepsy patients suffer from medically intractable epileptic seizures. Many studies have shown that electroencephalogram (EEG) biofeedback therapy has exciting potential for seizure control. In this paper, five patients with intractable epilepsy were trained to increase the production of sensorimotor-rhythm (12-15 Hz) activity and decrease the production of slow theta (4-7 Hz) activity. Nonlinear analyses are proposed to evaluate the effect of biofeedback training. In all five patients, the complexity and approximate entropy of the EEG increased significantly (P < 0.05) after the biofeedback treatment (about one month).
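Approximate entropy, one of the nonlinear measures used above, can be computed with the standard Pincus definition; the parameter choices m = 2 and r = 0.2 × std are common defaults rather than the paper's settings.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy (ApEn) of a 1-D signal, following Pincus's
    definition: ApEn = phi(m) - phi(m+1), with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])          # embedded vectors
        # Chebyshev distance between all pairs of embedded vectors
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)                          # match fractions
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# A regular signal should have lower ApEn than a random one
t = np.arange(500)
rng = np.random.default_rng(0)
print("sine ApEn:  %.3f" % approximate_entropy(np.sin(0.1 * t)))
print("noise ApEn: %.3f" % approximate_entropy(rng.normal(size=500)))
```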

20.
Conf Proc IEEE Eng Med Biol Soc ; 2005: 5223-6, 2005.
Article in English | MEDLINE | ID: mdl-17281426

ABSTRACT

To investigate the effects of phosphene array irregularity in visual prostheses, a model of phosphene positional uncertainty was devised, and two different image down-sampling schemes, adapting and non-adapting, were proposed. Using a simulation system, visual acuity tests were given to four normally sighted subjects under seven degrees of array irregularity and the two down-sampling schemes. With the irregularity index increasing from 0 to 0.6, visual acuity fell monotonically over a range of 0.22 logMAR (logarithm of the minimum angle of resolution) for the adapting scheme and 0.47 logMAR for the non-adapting scheme. Comparing the two down-sampling schemes, non-adapting down-sampling afforded higher visual acuity at low array irregularity, whereas the adapting approach did so at higher irregularity. Head movements were observed to play a significant role in the acuity tests.
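A hedged sketch of one way to model phosphene positional uncertainty and the two down-sampling schemes: non-adapting sampling assumes a regular grid, while adapting sampling uses the jittered positions; the jitter model and all parameters are assumptions, not the paper's implementation.

```python
import numpy as np

def phosphene_positions(n=10, irregularity=0.4, rng=None):
    """Regular n x n grid in [0, 1]^2 plus Gaussian positional jitter whose
    standard deviation scales with the irregularity index (a simple model of
    positional uncertainty chosen for this sketch)."""
    rng = rng or np.random.default_rng(0)
    base = (np.stack(np.meshgrid(np.arange(n), np.arange(n)), -1) + 0.5) / n
    jitter = rng.normal(0.0, irregularity / (2.0 * n), size=base.shape)
    return base, np.clip(base + jitter, 0.0, 1.0)

def sample_image(image, coords):
    """Nearest-neighbour sampling of a grayscale image at normalized coords."""
    h, w = image.shape
    r = np.clip((coords[..., 1] * h).astype(int), 0, h - 1)
    c = np.clip((coords[..., 0] * w).astype(int), 0, w - 1)
    return image[r, c]

rng = np.random.default_rng(1)
img = rng.random((100, 100))
regular, jittered = phosphene_positions(irregularity=0.6, rng=rng)
non_adapting = sample_image(img, regular)    # ignores where phosphenes really are
adapting = sample_image(img, jittered)       # samples at the displaced positions
print(non_adapting.shape, adapting.shape)
```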
