Results 1 - 2 of 2
1.
Sensors (Basel); 19(10), 2019 May 21.
Article in English | MEDLINE | ID: mdl-31117299

ABSTRACT

In X-ray tomography image reconstruction, one of the most successful approaches is statistical: an ℓ2-norm fidelity function combined with a regularization function using an ℓp norm, 1 < p < 2. Among these, one technique stands out for both its results and its computational performance: alternating minimization of an objective function with an ℓ2 fidelity term and a regularization term in which a discrete gradient transform (DGT) sparse representation is minimized by total variation (TV). This work proposes an improvement to the reconstruction process: adding a bilateral edge-preserving (BEP) regularization term to the objective function. BEP is a noise-reduction method whose purpose is to adaptively eliminate noise in the initial phase of reconstruction. Adding BEP improves the optimization of the fidelity term and, as a consequence, improves the result of the TV minimization of the DGT. For reconstructions with a limited number of projections (low-dose reconstruction), the proposed method can achieve higher peak signal-to-noise ratio (PSNR) and structural similarity index measure (SSIM) scores because it controls noise better in the initial processing phase.
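The objective described combines an ℓ2 fidelity term with a TV penalty on the DGT, plus the proposed BEP term applied early in the reconstruction. The Python sketch below illustrates one way such a scheme could be wired together; the system matrix A, the step size, the weight lam, the bilateral parameters, and the choice to apply the bilateral pass only during the first quarter of the iterations are all illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def tv_gradient(x, eps=1e-8):
    """(Sub)gradient of a smoothed isotropic total variation of image x."""
    # Forward differences = discrete gradient transform (DGT); the last
    # row/column are replicated so the boundary difference is zero.
    dx = np.diff(x, axis=1, append=x[:, -1:])
    dy = np.diff(x, axis=0, append=x[-1:, :])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    px, py = dx / mag, dy / mag
    # Negative divergence of the normalized gradient field gives the
    # TV subgradient (boundary handling is approximate in this sketch).
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return -div

def bilateral_pass(x, sigma_s=1.0, sigma_r=0.1, radius=2):
    """One bilateral-filter pass: edge-preserving noise reduction (BEP-like)."""
    out = np.zeros_like(x)
    norm = np.zeros_like(x)
    for i in range(-radius, radius + 1):
        for j in range(-radius, radius + 1):
            shifted = np.roll(np.roll(x, i, axis=0), j, axis=1)
            # Spatial closeness times intensity closeness: large intensity
            # jumps (edges) get small weights and are preserved.
            w = np.exp(-(i**2 + j**2) / (2 * sigma_s**2)
                       - (shifted - x)**2 / (2 * sigma_r**2))
            out += w * shifted
            norm += w
    return out / norm

def reconstruct(A, b, shape, n_iter=200, step=1e-3, lam=0.1):
    """Gradient steps on ||A x - b||^2 + lam * TV(x), alternated with a
    bilateral (BEP-like) denoising pass in the early iterations only."""
    x = np.zeros(shape)
    for k in range(n_iter):
        residual = A @ x.ravel() - b
        grad = (A.T @ residual).reshape(shape) + lam * tv_gradient(x)
        x -= step * grad
        if k < n_iter // 4:  # assumed: BEP only in the initial phase
            x = bilateral_pass(x)
    return x
```

Restricting the bilateral pass to the early iterations mirrors the abstract's claim that BEP adaptively removes noise in the initial phase, leaving the later iterations to the fidelity and TV terms.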

2.
Neural Netw; 35: 70-81, 2012 Nov.
Article in English | MEDLINE | ID: mdl-22954480

ABSTRACT

This paper proposes a technique, called Evolving Probabilistic Neural Network (ePNN), with several attractive features: incremental learning, an evolving architecture, the capacity to learn continually throughout its existence, and the ability to use each training sample only once, with no reprocessing. A series of experiments was performed on publicly available data sets; the results indicate that ePNN is superior or equal to the other incremental neural networks evaluated in this paper. The results also demonstrate the advantage of ePNN's small architecture and show that its architecture is more stable than those of the other incremental networks evaluated. ePNN thus appears to be a promising alternative when a fast-learning system and a fast classifier with low computational cost are needed.
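The key properties claimed for ePNN are single-presentation training and an architecture that grows only when needed. The sketch below shows a generic incremental probabilistic-neural-network classifier in that spirit: per-class Gaussian prototypes, a running-mean update for nearby samples, and a new prototype spawned when no existing one is close enough. The distance threshold, kernel width, and update rule are illustrative assumptions, not the paper's actual equations.

```python
import numpy as np

class IncrementalPNN:
    """Toy evolving PNN: per-class Gaussian prototypes, one-pass updates."""

    def __init__(self, threshold=1.0, sigma=0.5):
        self.threshold = threshold  # assumed: spawn a prototype beyond this distance
        self.sigma = sigma          # assumed: Gaussian kernel width
        self.prototypes = {}        # class label -> list of (center, count)

    def partial_fit(self, x, y):
        """Single-pass update: each sample is used once, never reprocessed."""
        x = np.asarray(x, dtype=float)
        protos = self.prototypes.setdefault(y, [])
        if protos:
            dists = [np.linalg.norm(x - c) for c, _ in protos]
            i = int(np.argmin(dists))
            if dists[i] < self.threshold:
                c, n = protos[i]
                protos[i] = ((n * c + x) / (n + 1), n + 1)  # running-mean update
                return
        protos.append((x, 1))  # evolve the architecture: add a prototype

    def predict(self, x):
        """Pick the class whose count-weighted Gaussian mixture scores highest."""
        x = np.asarray(x, dtype=float)
        def score(protos):
            return sum(n * np.exp(-np.linalg.norm(x - c)**2 / (2 * self.sigma**2))
                       for c, n in protos)
        return max(self.prototypes, key=lambda y: score(self.prototypes[y]))

# One-pass training on a toy stream, then classification:
clf = IncrementalPNN()
for x, y in [([0.0, 0.0], "a"), ([0.2, 0.1], "a"), ([3.0, 3.0], "b")]:
    clf.partial_fit(x, y)
print(clf.predict([0.1, 0.0]))  # -> "a"
```

Because each sample either refines an existing prototype or adds a new one in a single step, the model never revisits past data, which is what keeps this family of classifiers cheap to train.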


Subjects
Algorithms, Artificial Intelligence, Fuzzy Logic, Nerve Net, Neural Networks (Computer), Computer Simulation