1.
IEEE Trans Image Process ; 20(3): 657-69, 2011 Mar.
Article in English | MEDLINE | ID: mdl-20876024

ABSTRACT

This article proposes a new algorithm to compute the projection onto the set of images whose total variation is bounded by a constant. The projection is computed through a dual formulation that is solved by first-order non-smooth optimization methods. This yields an iterative algorithm that applies iterative soft thresholding to the dual vector field, and for which we establish a convergence rate on the primal iterates. The projection can then be used as a building block in a variety of applications, such as solving inverse problems under a total variation constraint or synthesizing textures. Numerical results illustrate the usefulness and applicability of our TV projection algorithm on various examples, including denoising, texture synthesis, inpainting, deconvolution and tomography problems. We also show that our projection algorithm compares favorably with state-of-the-art TV projection methods in terms of convergence speed.
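
As a rough illustration of the idea (not the authors' exact scheme), the sketch below runs forward-backward splitting on the dual vector field for an anisotropic TV ball: the primal iterate is recovered as y + div(p), and the prox step on the dual reduces to an l1-ball projection, i.e., soft thresholding with a data-dependent threshold. All function names, the toy ramp image, and the step size 1/8 (the usual bound for the forward-difference gradient) are our own choices.

```python
import numpy as np

def grad(x):
    """Forward-difference discrete gradient of a 2-D image (zero at the last row/column)."""
    gx = np.zeros_like(x)
    gy = np.zeros_like(x)
    gx[:-1, :] = x[1:, :] - x[:-1, :]
    gy[:, :-1] = x[:, 1:] - x[:, :-1]
    return np.stack([gx, gy])

def div(p):
    """Discrete divergence, the negative adjoint of grad (<grad x, p> = -<x, div p>)."""
    px, py = p[0], p[1]
    dx = np.zeros_like(px)
    dy = np.zeros_like(py)
    dx[0, :] = px[0, :]
    dx[1:-1, :] = px[1:-1, :] - px[:-2, :]
    dx[-1, :] = -px[-2, :]
    dy[:, 0] = py[:, 0]
    dy[:, 1:-1] = py[:, 1:-1] - py[:, :-2]
    dy[:, -1] = -py[:, -2]
    return dx + dy

def project_l1_ball(z, radius):
    """Euclidean projection onto {v : ||v||_1 <= radius}: soft thresholding with a
    data-dependent threshold found from the sorted magnitudes."""
    v = np.abs(z)
    if v.sum() <= radius:
        return z
    u = np.sort(v.ravel())[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, u.size + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1.0)
    return np.sign(z) * np.maximum(v - theta, 0.0)

def tv_ball_projection(y, tau, n_iter=500):
    """Approximate projection of image y onto {x : ||grad x||_1 <= tau} (anisotropic
    TV ball), computed by forward-backward splitting on the dual vector field p."""
    p = np.zeros((2,) + y.shape)
    step = 1.0 / 8.0                      # 1 / ||grad||^2 for the 2-D forward-difference gradient
    for _ in range(n_iter):
        x = y + div(p)                    # primal iterate recovered from the dual field
        q = p + step * grad(x)            # explicit gradient step on the smooth dual term
        # prox of (step*tau)*||.||_inf via the Moreau identity: z - P_{l1 ball}(z)
        p = q - project_l1_ball(q, step * tau)
    return y + div(p)

# toy usage: shrink the total variation of a noisy ramp image to half its value
rng = np.random.default_rng(0)
img = np.tile(np.linspace(0, 1, 64), (64, 1)) + 0.1 * rng.standard_normal((64, 64))
proj = tv_ball_projection(img, tau=0.5 * np.abs(grad(img)).sum())
print(np.abs(grad(img)).sum(), np.abs(grad(proj)).sum())  # the TV budget is (approximately) met
```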


Subject(s)
Algorithms , Image Processing, Computer-Assisted/methods , Models, Theoretical , Phantoms, Imaging , Tomography
2.
IEEE Trans Image Process ; 18(2): 310-21, 2009 Feb.
Article in English | MEDLINE | ID: mdl-19131301

ABSTRACT

We propose an image deconvolution algorithm for data contaminated by Poisson noise. The image to restore is assumed to be sparsely represented in a dictionary of waveforms such as the wavelet or curvelet transforms. Our key contributions are as follows. First, we handle the Poisson noise properly by using the Anscombe variance-stabilizing transform, leading to a nonlinear degradation equation with additive Gaussian noise. Second, the deconvolution problem is formulated as the minimization of a convex functional with a data-fidelity term reflecting the noise properties and a nonsmooth sparsity-promoting penalty on the image representation coefficients (e.g., the l1-norm). An additional term is included in the functional to ensure positivity of the restored image. Third, a fast iterative forward-backward splitting algorithm is proposed to solve the minimization problem. We derive existence and uniqueness conditions for the solution and establish convergence of the iterative algorithm. Finally, a GCV-based model selection procedure is proposed to select the regularization parameter objectively. Experiments demonstrate the striking benefits gained from taking the Poisson statistics of the noise into account. The results also suggest that sparse-domain regularization may be tractable in many deconvolution applications with Poisson noise, such as astronomy and microscopy.
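
As a rough sketch of the forward-backward machinery described above (not the authors' algorithm), the code below shows the Anscombe VST and a plain ISTA loop with soft thresholding and a positivity constraint for a 1-D circular deconvolution. To keep it short, the sparsifying dictionary is the identity rather than a wavelet or curvelet transform, and the fidelity term is plainly quadratic on the stabilized data rather than the exact nonlinear degradation model the paper works with. All names and the synthetic example are our own.

```python
import numpy as np

def anscombe(x):
    """Anscombe variance-stabilizing transform: Poisson counts -> roughly unit-variance Gaussian."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_deconv(z, h, lam, n_iter=200):
    """Forward-backward (ISTA) iterations for min_x 0.5*||h * x - z||^2 + lam*||x||_1, x >= 0,
    where * is circular convolution; the identity is used as the sparsifying dictionary
    purely to keep the sketch short."""
    H = np.fft.rfft(h, n=z.size)
    Hx = lambda x: np.fft.irfft(np.fft.rfft(x) * H, n=z.size)
    Ht = lambda r: np.fft.irfft(np.fft.rfft(r) * np.conj(H), n=z.size)
    step = 1.0 / np.max(np.abs(H)) ** 2          # 1/L, with L the Lipschitz constant of the data term
    x = np.zeros_like(z)
    for _ in range(n_iter):
        x = soft_threshold(x - step * Ht(Hx(x) - z), step * lam)
        x = np.maximum(x, 0.0)                   # positivity, as in the functional described above
    return x

# usage sketch on synthetic Poisson counts of a blurred spike train
rng = np.random.default_rng(1)
truth = np.zeros(256); truth[[40, 90, 200]] = [30.0, 50.0, 20.0]
kernel = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2); kernel /= kernel.sum()
blurred = np.fft.irfft(np.fft.rfft(truth) * np.fft.rfft(kernel, n=256), n=256)
counts = rng.poisson(np.maximum(blurred, 0.0))
estimate = ista_deconv(anscombe(counts.astype(float)), kernel, lam=0.5)
```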


Subject(s)
Algorithms , Artifacts , Image Interpretation, Computer-Assisted/methods , Data Interpretation, Statistical , Image Enhancement/methods , Models, Statistical , Poisson Distribution , Reproducibility of Results , Sensitivity and Specificity
3.
IEEE Trans Image Process ; 17(7): 1093-108, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18586618

ABSTRACT

In order to denoise Poisson count data, we introduce a variance-stabilizing transform (VST) applied to a filtered discrete Poisson process, yielding a near-Gaussian process with asymptotically constant variance. This new transform, which can be viewed as an extension of the Anscombe transform to filtered data, is simple, fast, and efficient in (very) low-count situations. We combine this VST with the filter banks of wavelets, ridgelets and curvelets, leading to multiscale VSTs (MS-VSTs) and nonlinear decomposition schemes. The noise-contaminated coefficients of these MS-VST-modified transforms are then asymptotically normally distributed with known variances. A classical hypothesis-testing framework is adopted to detect the significant coefficients, and a sparsity-driven iterative scheme properly reconstructs the final estimate. A range of examples shows the power of this MS-VST approach for recovering important structures of various morphologies in (very) low-count images. These results also demonstrate that the MS-VST approach is competitive with many existing denoising methods.
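
The MS-VST couples a stabilizing transform with each scale of a filter bank; the snippet below is only a numerical check of our own (not the paper's method) that the classical Anscombe transform 2*sqrt(x + 3/8) brings the standard deviation of Poisson counts close to 1, and that it drifts away from 1 at very low intensity, which is precisely the regime the MS-VST is designed to handle.

```python
import numpy as np

rng = np.random.default_rng(0)
for lam in [1.0, 5.0, 20.0, 100.0]:
    counts = rng.poisson(lam, size=200_000)
    raw_std = counts.std()                               # grows like sqrt(lam)
    vst_std = (2.0 * np.sqrt(counts + 3.0 / 8.0)).std()  # close to 1 once lam is not too small
    print(f"lambda={lam:6.1f}  std(counts)={raw_std:5.2f}  std(Anscombe)={vst_std:4.2f}")
```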


Subject(s)
Algorithms , Artifacts , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Computer Simulation , Models, Statistical , Poisson Distribution , Reproducibility of Results , Sensitivity and Specificity
4.
IEEE Trans Image Process ; 16(11): 2675-81, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17990744

ABSTRACT

In a recent paper, a method called morphological component analysis (MCA) was proposed to separate the texture from the natural part in images. MCA relies on an iterative thresholding algorithm, using a threshold that decreases linearly toward zero over the iterations. This paper shows how MCA convergence can be drastically improved by exploiting the mutual incoherence of the dictionaries associated with the different components. The modified MCA algorithm is then compared to basis pursuit (BP); experiments show that the MCA and BP solutions are similar in terms of sparsity, as measured by the l1 norm, but MCA is much faster and can handle large-scale data sets.
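
As a toy stand-in for MCA's iterative thresholding (not the paper's implementation), the sketch below separates a 1-D signal into a spike component, sparse in the canonical basis, and an oscillatory component, sparse in an orthonormal DCT, with a threshold that decreases linearly toward zero; the two dictionaries are mutually incoherent, which is what makes the alternating thresholding work. Function names and the synthetic mixture are our own.

```python
import numpy as np
from scipy.fft import dct, idct

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def mca(y, n_iter=100):
    """Minimal morphological component analysis: split y into a spike part (sparse in the
    canonical basis) and an oscillatory part (sparse in an orthonormal DCT), using a
    threshold that decreases linearly toward zero."""
    spikes = np.zeros_like(y)
    texture = np.zeros_like(y)
    t0 = np.abs(y).max()
    for k in range(n_iter):
        thresh = t0 * (1.0 - k / n_iter)                       # linear decay toward zero
        spikes = soft(y - texture, thresh)                     # update spikes, texture fixed
        coeffs = soft(dct(y - spikes, norm='ortho'), thresh)   # update texture in the DCT dictionary
        texture = idct(coeffs, norm='ortho')
    return spikes, texture

# usage sketch on a synthetic spikes + single-DCT-atom mixture
rng = np.random.default_rng(2)
n = 256
spikes_true = np.zeros(n); spikes_true[rng.choice(n, 5, replace=False)] = rng.uniform(2, 4, 5)
atom = np.zeros(n); atom[24] = 6.0
texture_true = idct(atom, norm='ortho')
spikes_hat, texture_hat = mca(spikes_true + texture_true)
```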


Subject(s)
Algorithms , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Artificial Intelligence , Principal Component Analysis , Reproducibility of Results , Sensitivity and Specificity
5.
IEEE Trans Image Process ; 14(2): 231-40, 2005 Feb.
Article in English | MEDLINE | ID: mdl-15700528

ABSTRACT

A novel Bayesian nonparametric estimator in the wavelet domain is presented. In this approach, a prior model designed to capture the sparseness of the wavelet expansion is imposed on the wavelet coefficients. Seeking probability models for the marginal densities of the wavelet coefficients, the new family of Bessel K form (BKF) densities is shown to fit the observed histograms very well. Exploiting this prior, we design a Bayesian nonlinear denoiser and derive a closed-form expression for it. We then compare the BKF prior to others that have been introduced in the literature, such as the generalized Gaussian density (GGD) and the alpha-stable models, for which no analytical form of the corresponding Bayesian denoiser is available. Specifically, the BKF model turns out to be a good compromise between these two extreme cases (hyperbolic tails for the alpha-stable and exponential tails for the GGD). Moreover, we demonstrate a high degree of match between observed and estimated prior densities using the BKF model. Finally, a comparative study shows the effectiveness of our denoiser, which clearly outperforms classical shrinkage and thresholding wavelet-based techniques.
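
The closed-form BKF denoiser itself involves special functions, so the sketch below only illustrates, as our own toy and assuming the usual Gaussian scale-mixture representation of the BKF family, two properties the modeling relies on: BKF samples can be drawn as a Gaussian with Gamma-distributed variance, and the parameters (p, c) can be recovered from the sample variance and kurtosis via Var = p*c and kurtosis = 3 + 3/p.

```python
import numpy as np

def sample_bkf(p, c, size, rng):
    """Draw Bessel K form (BKF) samples via the Gaussian scale-mixture representation:
    x = sqrt(tau) * z with tau ~ Gamma(shape=p, scale=c) and z ~ N(0, 1)."""
    tau = rng.gamma(shape=p, scale=c, size=size)
    return np.sqrt(tau) * rng.standard_normal(size)

def fit_bkf_moments(x):
    """Moment-based estimates of the BKF parameters, using Var = p*c and kurtosis = 3 + 3/p."""
    var = x.var()
    kurt = np.mean(x ** 4) / var ** 2
    p = 3.0 / max(kurt - 3.0, 1e-12)      # heavier-than-Gaussian tails imply kurtosis > 3
    c = var / p
    return p, c

rng = np.random.default_rng(3)
coeffs = sample_bkf(p=0.7, c=2.0, size=500_000, rng=rng)
print(fit_bkf_moments(coeffs))            # should be close to (0.7, 2.0)
```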


Subject(s)
Algorithms , Artificial Intelligence , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Information Storage and Retrieval/methods , Pattern Recognition, Automated/methods , Bayes Theorem , Computer Graphics , Computer Simulation , Models, Statistical , Numerical Analysis, Computer-Assisted , Reproducibility of Results , Sensitivity and Specificity , Signal Processing, Computer-Assisted , Subtraction Technique , User-Computer Interface