Results 1 - 18 of 18
1.
IEEE Trans Image Process ; 25(10): 4704-4718, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27448356

ABSTRACT

In this paper, we address the problem of unsupervised change detection on two or more coregistered images of the same object or scene acquired at several time instants. We propose a novel empirical-Bayesian approach based on a false discovery rate formulation for statistical inference on local patch-based samples. This alternative error metric makes it possible to efficiently adjust the family-wise error rate in the considered large-scale testing problem. The designed change detector operates in an unsupervised manner under the assumption of a limited amount of change in the analyzed imagery. The detection relies on various statistical features, which enable the detector to address application-specific detection problems provided an appropriate ad hoc choice of features. In particular, we demonstrate the use of rank-based statistics: the Wilcoxon and Cramér-von Mises statistics for image pairs, and the multisample Levene statistic for short image sequences. Experiments with remotely sensed radar, dermatological, and still-camera surveillance imagery demonstrate the accuracy and flexibility of the proposed method.
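
As an illustration of the kind of patch-based testing with false discovery rate control described above, the following minimal Python sketch applies a Wilcoxon rank-sum test to co-located patches of two coregistered images and thresholds the p-values with the Benjamini-Hochberg procedure. The patch size, FDR level, and toy images are illustrative assumptions, not the paper's configuration.

import numpy as np
from scipy.stats import mannwhitneyu


def change_map_fdr(img1, img2, patch=8, q=0.05):
    """Flag changed patches at FDR level q using patch-wise rank-sum tests."""
    h, w = img1.shape
    ny, nx = h // patch, w // patch
    pvals = np.ones((ny, nx))
    for i in range(ny):
        for j in range(nx):
            a = img1[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch].ravel()
            b = img2[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch].ravel()
            # Wilcoxon rank-sum (Mann-Whitney U) test on the two patch samples
            pvals[i, j] = mannwhitneyu(a, b, alternative='two-sided').pvalue
    # Benjamini-Hochberg step-up thresholding of the sorted p-values
    p_sorted = np.sort(pvals.ravel())
    m = p_sorted.size
    passed = p_sorted[p_sorted <= q * np.arange(1, m + 1) / m]
    cutoff = passed.max() if passed.size else 0.0
    return pvals <= cutoff          # boolean per-patch change map


# Toy usage: a bright square added to the second image should be flagged.
rng = np.random.default_rng(0)
im1 = rng.normal(0.0, 1.0, (128, 128))
im2 = im1 + rng.normal(0.0, 0.1, (128, 128))
im2[32:64, 32:64] += 3.0
print(change_map_fdr(im1, im2).sum(), "changed patches")
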

2.
Phys Med Biol ; 60(9): 3415-31, 2015 May 07.
Article in English | MEDLINE | ID: mdl-25856087

ABSTRACT

The presence of illumination variation in dermatological images has a negative impact on the automatic detection and analysis of cutaneous lesions. This paper proposes a new illumination modeling and chromophore identification method to correct lighting variation in skin lesion images and to extract melanin and hemoglobin concentrations of human skin, based on an adaptive bilateral decomposition and a weighted polynomial curve fitting guided by a multi-layered skin model. Unlike state-of-the-art approaches based on Lambert's law, the proposed method considers both the specular and diffuse reflection of the skin, which enables it to handle highlights and strong shading effects that commonly appear in skin color images captured in uncontrolled environments. The derived melanin and hemoglobin indices, which relate directly to pathological tissue conditions, tend to be less influenced by external imaging factors and are more effective in describing pigmentation distributions. Experiments show that the proposed method gives better visual results and superior lesion segmentation when compared with two other illumination correction algorithms, both designed specifically for dermatological images. For computer-aided diagnosis of melanoma, sensitivity reaches 85.52% when using our chromophore descriptors, which is 8-20% higher than that obtained with other color descriptors. This demonstrates the benefit of the proposed method for automatic skin disease analysis.


Subject(s)
Algorithms , Image Enhancement/methods , Melanoma/pathology , Optical Imaging/methods , Humans
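
As a loose, hedged illustration of illumination correction in skin images, the sketch below fits a quadratic polynomial surface to the log-brightness of a grayscale image and divides it out. It is a simplified stand-in only: the paper's adaptive bilateral decomposition, weighting scheme, specular/diffuse separation, and chromophore (melanin/hemoglobin) extraction are not reproduced, and all names and parameters are illustrative.

import numpy as np


def correct_illumination(gray, eps=1e-6):
    """Remove a smooth multiplicative illumination field (quadratic surface model)."""
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    xx = xx / w
    yy = yy / h
    # Quadratic polynomial basis in the normalized image coordinates
    A = np.stack([np.ones_like(xx), xx, yy, xx * yy, xx**2, yy**2], axis=-1).reshape(-1, 6)
    log_img = np.log(gray + eps).ravel()
    coef, *_ = np.linalg.lstsq(A, log_img, rcond=None)
    illum = (A @ coef).reshape(h, w)                   # estimated log-illumination
    reflect = np.exp(log_img.reshape(h, w) - illum)    # illumination-corrected component
    return reflect / reflect.max()


# Toy usage: a flat skin-like patch under a left-to-right shading gradient
rng = np.random.default_rng(1)
flat = np.clip(rng.normal(0.6, 0.05, (64, 64)), 0.0, 1.0)
shade = np.linspace(0.4, 1.0, 64)[None, :] * np.ones((64, 1))
corrected = correct_illumination(flat * shade)
print(corrected.std() < (flat * shade).std())          # shading-induced variation should shrink
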
3.
IEEE Trans Image Process ; 22(10): 3791-806, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23799694

ABSTRACT

Parameter estimation of probability density functions is one of the major steps in statistical image and signal processing. In this paper, we explore several properties and limitations of the recently proposed method-of-logarithmic-cumulants (MoLC) parameter estimation approach, an alternative to the classical maximum likelihood (ML) and method-of-moments (MoM) approaches. We derive a general sufficient condition for strong consistency of the MoLC estimates, an important asymptotic property of any statistical estimator. This result enables us to demonstrate the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar (SAR) image processing. We then derive analytical conditions of applicability of MoLC to samples from the distribution families in our selection. Finally, we conduct various synthetic and real-data experiments to assess the comparative properties, applicability, and small-sample performance of MoLC, notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast, yet not universally applicable, alternative to MoM, and that it becomes especially useful when the direct ML approach turns out to be infeasible.


Subject(s)
Image Processing, Computer-Assisted/methods , Models, Statistical , Algorithms , Computer Simulation , Humans , Radar , Ultrasonography
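
The following sketch illustrates the MoLC idea for one of the simplest cases, the ordinary gamma distribution: for a gamma law with shape k and scale t, the first two log-cumulants are psi(k) + ln(t) and psi'(k), so the shape can be recovered by inverting the trigamma function and the scale follows in closed form. This is only a minimal example, not the paper's treatment of the generalized gamma or K families; sample sizes and parameter values are illustrative.

import numpy as np
from scipy.special import psi, polygamma
from scipy.optimize import brentq
from scipy.stats import gamma


def molc_gamma(x):
    """Estimate (shape, scale) of a gamma-distributed sample via log-cumulants."""
    logx = np.log(x)
    c1, c2 = logx.mean(), logx.var()                 # first two sample log-cumulants
    # Invert the trigamma function: psi'(shape) = c2
    shape = brentq(lambda k: polygamma(1, k) - c2, 1e-3, 1e4)
    scale = np.exp(c1 - psi(shape))                  # from c1 = psi(shape) + ln(scale)
    return shape, scale


# Check against a known ground truth
rng = np.random.default_rng(2)
sample = gamma.rvs(a=3.0, scale=2.0, size=20000, random_state=rng)
print(molc_gamma(sample))                            # approximately (3.0, 2.0)
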
4.
IEEE Trans Image Process ; 22(2): 561-72, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23008256

ABSTRACT

In this paper, we combine amplitude and texture statistics of synthetic aperture radar (SAR) images for the purpose of model-based classification. Within a finite mixture model, we bring together Nakagami densities to model the class amplitudes and a 2D autoregressive texture model with t-distributed regression error to model the class textures. A non-stationary multinomial logistic latent class label model is used as a mixture density to obtain spatially smooth class segments. The classification expectation-maximization algorithm is used to estimate the class parameters and to classify the pixels, and the integrated classification likelihood criterion determines the number of classes in the model. We present land-cover classification results, in both supervised and unsupervised settings, on TerraSAR-X and COSMO-SkyMed data.
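
A heavily stripped-down sketch of the amplitude part of such a model follows: classification EM for a two-class mixture of Nakagami densities, alternating hard assignment with per-class refitting. The paper's autoregressive texture model, t-distributed regression errors, multinomial logistic label field, and model-order selection are all omitted; the initialization and toy data are illustrative assumptions.

import numpy as np
from scipy.stats import nakagami


def cem_nakagami(amp, n_iter=10):
    """Two-class classification EM with Nakagami amplitude densities."""
    labels = (amp > np.median(amp)).astype(int)                 # crude initialization
    priors = np.array([0.5, 0.5])
    params = []
    for _ in range(n_iter):
        params = []
        for c in (0, 1):
            nu, _, scale = nakagami.fit(amp[labels == c], floc=0)   # M-step per class
            params.append((nu, scale))
        # C-step: assign each sample to the class with the highest posterior
        logpost = np.stack([
            np.log(priors[c]) + nakagami.logpdf(amp, params[c][0], scale=params[c][1])
            for c in (0, 1)
        ])
        labels = logpost.argmax(axis=0)
        priors = np.bincount(labels, minlength=2) / labels.size
    return labels, params


# Toy usage with two simulated amplitude populations
rng = np.random.default_rng(3)
amp = np.concatenate([
    nakagami.rvs(1.0, scale=1.0, size=3000, random_state=rng),
    nakagami.rvs(4.0, scale=3.0, size=3000, random_state=rng),
])
_, fitted = cem_nakagami(amp)
print(fitted)                                                   # roughly (1.0, 1.0) and (4.0, 3.0)
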

5.
IEEE Trans Pattern Anal Mach Intell ; 32(1): 135-47, 2010 Jan.
Article in English | MEDLINE | ID: mdl-19926904

ABSTRACT

We present a new approach for building reconstruction from a single Digital Surface Model (DSM). It treats buildings as assemblages of simple urban structures extracted from a library of 3D parametric blocks (like a LEGO set). First, the 2D supports of the urban structures are extracted either interactively or automatically. Then, 3D blocks are placed on the 2D supports using a Gibbs model that controls both the block assemblage and the fit to the data. A Bayesian decision finds the optimal configuration of 3D blocks using a Markov Chain Monte Carlo sampler associated with original proposition kernels. The method has been validated on multiple data sets covering a wide resolution range, such as 0.7 m satellite and 0.1 m aerial DSMs, and provides 3D representations of complex buildings and dense urban areas at various levels of detail.

6.
Appl Opt ; 48(22): 4437-48, 2009 Aug 01.
Article in English | MEDLINE | ID: mdl-19649049

ABSTRACT

We propose an alternate minimization algorithm for estimating the point-spread function (PSF) of a confocal laser scanning microscope and the specimen fluorescence distribution. A three-dimensional separable Gaussian model is used to restrict the PSF solution space, and a constraint on the specimen is used to favor the stabilization and convergence of the algorithm. Simulation results show that the PSF can be estimated with a high degree of accuracy, and results on real data show better deconvolution than with a full theoretical PSF model.


Subject(s)
Microscopy, Confocal/instrumentation , Microscopy, Confocal/methods , Optics and Photonics , Algorithms , Arabidopsis/metabolism , Bayes Theorem , Computer Simulation , Equipment Design , Models, Statistical , Models, Theoretical , Normal Distribution , Poisson Distribution , Reproducibility of Results
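
A hedged, 2D-only sketch of the alternating idea described above: the object is updated with Richardson-Lucy steps for a fixed Gaussian PSF, and the PSF width is then refit by a bounded 1D search on the data misfit. The paper works in 3D with a separable Gaussian model and a specimen constraint that are not reproduced here; kernel size, iteration counts, and toy data are illustrative.

import numpy as np
from scipy.signal import fftconvolve
from scipy.optimize import minimize_scalar


def gaussian_psf(sigma, size=15):
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()


def rl_step(x, y, psf):
    """One Richardson-Lucy multiplicative update."""
    est = np.maximum(fftconvolve(x, psf, mode='same'), 1e-12)
    return x * fftconvolve(y / est, psf[::-1, ::-1], mode='same')


def blind_deconv(y, sigma0=2.0, outer=5, inner=10):
    """Alternate between object updates and a 1D refit of the PSF width."""
    x, sigma = np.maximum(y, 1e-6), sigma0
    for _ in range(outer):
        psf = gaussian_psf(sigma)
        for _ in range(inner):
            x = rl_step(x, y, psf)
        misfit = lambda s: np.sum((fftconvolve(x, gaussian_psf(s), mode='same') - y) ** 2)
        sigma = minimize_scalar(misfit, bounds=(0.5, 6.0), method='bounded').x
    return x, sigma


# Toy usage: a grid of point sources blurred by a Gaussian and corrupted by Poisson noise
rng = np.random.default_rng(4)
truth = np.zeros((64, 64))
truth[20:45:6, 20:45:6] = 50.0
blurred = np.maximum(fftconvolve(truth, gaussian_psf(2.5), mode='same'), 0.0)
observed = rng.poisson(blurred).astype(float)
restored, sigma_hat = blind_deconv(observed)
print(round(sigma_hat, 2))                                      # should be close to 2.5
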
7.
IEEE Trans Image Process ; 18(10): 2303-15, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19546039

ABSTRACT

We propose a new Bayesian method for detecting the regions of object displacements in aerial image pairs. We use a robust but coarse 2D image registration algorithm, so our main challenge is to eliminate the registration errors from the extracted change map. We introduce a three-layer Markov random field (L3MRF) model which integrates information from two different features and ensures connected, homogeneous regions in the segmented images. The method is validated on real aerial photos.


Subject(s)
Algorithms , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Pattern Recognition, Automated/methods , Photogrammetry/methods , Subtraction Technique , Artificial Intelligence , Markov Chains , Motion , Reproducibility of Results , Sensitivity and Specificity
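
The sketch below is a much simplified stand-in for the three-layer model: a single-layer binary Potts MRF over a per-pixel difference feature, optimized with a synchronous ICM-style update. The fusion of two feature layers, which is the point of the L3MRF, is not reproduced; the smoothness weight and toy data are illustrative.

import numpy as np


def icm_change_map(diff, beta=1.5, n_iter=5):
    """Binary Potts-MRF segmentation of a difference feature via ICM-style updates."""
    labels = (diff > diff.mean() + diff.std()).astype(int)
    mu = [diff[labels == c].mean() for c in (0, 1)]
    sd = [diff[labels == c].std() + 1e-6 for c in (0, 1)]
    for _ in range(n_iter):
        padded = np.pad(labels, 1, mode='edge')
        ones = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                padded[1:-1, :-2] + padded[1:-1, 2:])                 # '1'-labelled 4-neighbours
        cost = []
        for c in (0, 1):
            data = 0.5 * ((diff - mu[c]) / sd[c]) ** 2 + np.log(sd[c])   # Gaussian data term
            disagree = ones if c == 0 else 4 - ones                      # Potts smoothness term
            cost.append(data + beta * disagree)
        labels = np.argmin(np.stack(cost), axis=0)
    return labels


# Toy usage: an absolute-difference image with one genuinely changed block
rng = np.random.default_rng(5)
d = np.abs(rng.normal(0.0, 1.0, (128, 128)))
d[40:80, 40:80] += 3.0
print(icm_change_map(d).sum(), "changed pixels")
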
8.
IEEE Trans Image Process ; 18(8): 1830-43, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19447707

ABSTRACT

In this paper, we present a novel multiscale texture model and a related algorithm for the unsupervised segmentation of color images. Elementary textures are characterized by their spatial interactions with neighboring regions along selected directions. Such interactions are modeled, in turn, by means of a set of Markov chains, one for each direction, whose parameters are collected in a feature vector that synthetically describes the texture. Based on the feature vectors, the textures are then recursively merged, giving rise to larger and more complex textures that appear at different scales of observation: accordingly, the model is named Hierarchical Multiple Markov Chain (H-MMC). The Texture Fragmentation and Reconstruction (TFR) algorithm addresses the unsupervised segmentation problem based on the H-MMC model. The "fragmentation" step finds the elementary textures of the model, while the "reconstruction" step defines the hierarchical image segmentation based on a probabilistic measure (texture score) that takes into account both region scale and inter-region interactions. The performance of the proposed method was assessed on the Prague segmentation benchmark, based on mosaics of real natural textures, and also tested on real-world natural and remote sensing images.


Subject(s)
Algorithms , Image Processing, Computer-Assisted/methods , Markov Chains , Pattern Recognition, Automated/methods , Cluster Analysis , Models, Statistical
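
The sketch below illustrates only the directional Markov-chain feature idea: the image is quantized into a few gray levels and, for each of four directions, a transition probability matrix between neighboring pixels is estimated and flattened into a feature vector. The hierarchical merging, texture score, and the fragmentation/reconstruction stages of TFR are not shown; the quantization level and toy texture are illustrative.

import numpy as np


def markov_chain_features(img, levels=8):
    """Directional transition-matrix texture features (simplified sketch)."""
    edges = np.quantile(img, np.linspace(0, 1, levels + 1)[1:-1])
    q = np.digitize(img, edges)                                 # gray levels 0..levels-1
    h, w = q.shape
    feats = []
    for dy, dx in [(0, 1), (1, 0), (1, 1), (1, -1)]:            # E, S, SE, SW neighbours
        src = q[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)].ravel()
        dst = q[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)].ravel()
        T = np.zeros((levels, levels))
        np.add.at(T, (src, dst), 1.0)                           # directional co-occurrence counts
        T /= np.maximum(T.sum(axis=1, keepdims=True), 1.0)      # row-normalized transition matrix
        feats.append(T.ravel())
    return np.concatenate(feats)


# Toy usage on a horizontally correlated random texture
rng = np.random.default_rng(6)
texture = rng.normal(0.0, 1.0, (64, 64)).cumsum(axis=1)
print(markov_chain_features(texture).shape)                     # (4 * levels**2,)
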
9.
IEEE Trans Pattern Anal Mach Intell ; 30(1): 105-19, 2008 Jan.
Article in English | MEDLINE | ID: mdl-18000328

ABSTRACT

This work presents a framework for automatic feature extraction from images using stochastic geometry. Features in images are modeled as realizations of a spatial point process of geometrical shapes. This framework allows the incorporation of a priori knowledge of the spatial distribution of features. More specifically, we present a model based on the superposition of a process of segments and a process of rectangles. The former is dedicated to the detection of linear networks of discontinuities, while the latter aims at segmenting homogeneous areas. An energy is defined that favors connections of segments and alignments of rectangles, as well as relevant interactions between both types of objects. Estimation is performed by minimizing the energy using a simulated annealing algorithm. The proposed model is applied to the analysis of Digital Elevation Models (DEMs), raster data representing the altimetry of dense urban areas. We present results on real data provided by the IGN (French National Geographic Institute), consisting of low-quality DEMs of various types.


Subject(s)
Algorithms , Artificial Intelligence , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Pattern Recognition, Automated/methods , Signal Processing, Computer-Assisted , Information Storage and Retrieval/methods , Reproducibility of Results , Sensitivity and Specificity
10.
Article in English | MEDLINE | ID: mdl-18003522

ABSTRACT

In this paper, we propose a method for the iterative restoration of fluorescence Confocal Laser Scanning Microscope (CLSM) images and parametric estimation of the acquisition system's Point Spread Function (PSF). The CLSM is an optical fluorescence microscope that scans a specimen in 3D and uses a pinhole to reject most of the out-of-focus light. However, image quality suffers from two basic physical limitations: the diffraction-limited nature of the optical system and the reduced amount of light detected by the photomultiplier cause blur and photon-counting noise, respectively. These images can hence benefit from post-processing restoration methods based on deconvolution. An efficient method for parametric blind image deconvolution involves the simultaneous estimation of the specimen's 3D distribution of fluorescent sources and the microscope PSF. By using a model of the physical image acquisition process of the microscope, we reduce the number of free parameters describing the PSF and introduce constraints. Because the parameters of the PSF may vary during the course of experimentation, they have to be estimated directly from the observed data. An a priori model of the specimen is further applied to stabilize the alternate minimization algorithm and drive it toward convergence.


Subject(s)
Image Processing, Computer-Assisted , Microscopy, Confocal , Algorithms
11.
Appl Opt ; 46(10): 1819-29, 2007 Apr 01.
Article in English | MEDLINE | ID: mdl-17356626

ABSTRACT

We comprehensively study the least-squares Gaussian approximations of the diffraction-limited 2D-3D paraxial-nonparaxial point-spread functions (PSFs) of the wide field fluorescence microscope (WFFM), the laser scanning confocal microscope (LSCM), and the disk scanning confocal microscope (DSCM). The PSFs are expressed using the Debye integral. Under an L(infinity) constraint imposing peak matching, optimal and near-optimal Gaussian parameters are derived for the PSFs. With an L1 constraint imposing energy conservation, an optimal Gaussian parameter is derived for the 2D paraxial WFFM PSF. We found that (1) the 2D approximations are all very accurate; (2) no accurate Gaussian approximation exists for 3D WFFM PSFs; and (3) with typical pinhole sizes, the 3D approximations are accurate for the DSCM and nearly perfect for the LSCM. All the Gaussian parameters derived in this study are in explicit analytical form, allowing their direct use in practical applications.


Subject(s)
Algorithms , Image Interpretation, Computer-Assisted/methods , Microscopy, Fluorescence/methods , Computer Simulation , Image Enhancement/methods , Models, Biological , Models, Statistical , Normal Distribution , Radiation Dosage , Radiometry , Reproducibility of Results , Scattering, Radiation , Sensitivity and Specificity
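
In the spirit of the study above, the sketch below numerically fits the width of a peak-matched Gaussian to the 2D paraxial wide-field PSF (the Airy pattern) by least squares, with a radial weight accounting for 2D integration. It does not reproduce the paper's closed-form optimal parameters or the 3D and confocal cases; the sampling range is an illustrative choice.

import numpy as np
from scipy.special import j1
from scipy.optimize import minimize_scalar

v = np.linspace(1e-6, 10.0, 2000)                  # normalized radial optical coordinate
airy = (2.0 * j1(v) / v) ** 2                      # 2D paraxial wide-field PSF, peak = 1


def lsq_gap(sigma):
    gauss = np.exp(-v ** 2 / (2.0 * sigma ** 2))   # peak-matched Gaussian (also 1 at the origin)
    return np.sum(v * (gauss - airy) ** 2)         # radially weighted squared error


sigma_opt = minimize_scalar(lsq_gap, bounds=(0.1, 5.0), method='bounded').x
print(round(sigma_opt, 3))                         # width of the best-fitting Gaussian
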
12.
IEEE Trans Image Process ; 15(9): 2686-93, 2006 Sep.
Article in English | MEDLINE | ID: mdl-16948313

ABSTRACT

Synthetic aperture radar (SAR) images are inherently affected by a signal-dependent noise known as speckle, which is due to the coherence of the radar wave. In this paper, we propose a novel adaptive despeckling filter and derive a maximum a posteriori (MAP) estimator for the radar cross section (RCS). We first employ a logarithmic transformation to change the multiplicative speckle into additive noise. We model the RCS using the recently introduced heavy-tailed Rayleigh density function, which was derived under the assumption that the real and imaginary parts of the received complex signal are best described by the alpha-stable family of distributions. We estimate the model parameters from noisy observations by means of second-kind statistics theory, which relies on the Mellin transform. Finally, we compare the proposed algorithm with several classical speckle filters applied to actual SAR images. Experimental results show that the homomorphic MAP filter based on the heavy-tailed Rayleigh prior for the RCS is among the best for speckle removal.


Subject(s)
Algorithms , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Radar , Computer Simulation , Imaging, Three-Dimensional/methods , Information Storage and Retrieval/methods , Likelihood Functions , Models, Statistical
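
Only the homomorphic framework is sketched here: the multiplicative speckle is turned into additive noise by a log transform, an additive-noise filter is applied, and the result is mapped back by exponentiation. A plain Wiener filter stands in for the paper's MAP estimator with the heavy-tailed Rayleigh prior, which is not reproduced; the window size and toy speckle simulation are illustrative.

import numpy as np
from scipy.signal import wiener


def homomorphic_despeckle(amplitude, win=5):
    """Log-transform, filter the now-additive noise, and map back."""
    log_img = np.log(amplitude + 1e-6)          # multiplicative speckle -> additive noise
    filtered = wiener(log_img, mysize=win)      # stand-in for the paper's MAP estimator
    return np.exp(filtered)


# Toy single-look speckle: amplitude = clean reflectivity * Rayleigh-distributed noise
rng = np.random.default_rng(7)
clean = np.ones((128, 128))
clean[40:90, 40:90] = 4.0
speckled = clean * rng.rayleigh(scale=1.0 / np.sqrt(2.0), size=clean.shape)
despeckled = homomorphic_despeckle(speckled)
print(despeckled[0:30, 0:30].std() < speckled[0:30, 0:30].std())   # noise should shrink
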
13.
IEEE Trans Image Process ; 15(6): 1429-42, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16764268

ABSTRACT

In the context of remotely sensed data analysis, an important problem is the development of accurate models for the statistics of the pixel intensities. Focusing on synthetic aperture radar (SAR) data, this modeling process turns out to be a crucial task, for instance, for classification or for denoising purposes. In this paper, an innovative parametric estimation methodology for SAR amplitude data is proposed that adopts a generalized Gaussian (GG) model for the complex SAR backscattered signal. A closed-form expression for the corresponding amplitude probability density function (PDF) is derived and a specific parameter estimation algorithm is developed in order to deal with the proposed model. Specifically, the recently proposed "method-of-log-cumulants" (MoLC) is applied, which stems from the adoption of the Mellin transform (instead of the usual Fourier transform) in the computation of characteristic functions and from the corresponding generalization of the concepts of moment and cumulant. For the developed GG-based amplitude model, the resulting MoLC estimates turn out to be numerically feasible and are also analytically proved to be consistent. The proposed parametric approach was validated by using several real ERS-1, XSAR, E-SAR, and NASA/JPL airborne SAR images, and the experimental results prove that the method models the amplitude PDF better than several previously proposed parametric models for backscattering phenomena.


Subject(s)
Algorithms , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Information Storage and Retrieval/methods , Models, Statistical , Computer Simulation , Normal Distribution , Reproducibility of Results , Sensitivity and Specificity
14.
Microsc Res Tech ; 69(4): 260-6, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16586486

ABSTRACT

Confocal laser scanning microscopy is a powerful and popular technique for 3D imaging of biological specimens. Although confocal microscopy images are much sharper than standard epifluorescence ones, they are still degraded by residual out-of-focus light and by Poisson noise due to photon-limited detection. Several deconvolution methods have been proposed to reduce these degradations, including the Richardson-Lucy iterative algorithm, which computes a maximum likelihood estimate adapted to Poisson statistics. As this algorithm tends to amplify noise, regularization constraints based on prior knowledge of the data have to be applied to stabilize the solution. Here, we propose to combine the Richardson-Lucy algorithm with a regularization constraint based on Total Variation, which suppresses unstable oscillations while preserving object edges. We show on simulated and real images that this constraint improves the deconvolution results compared with the unregularized Richardson-Lucy algorithm, both visually and quantitatively.


Subject(s)
Algorithms , Image Processing, Computer-Assisted , Imaging, Three-Dimensional , Microscopy, Confocal/methods
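
A minimal 2D sketch of a total-variation-regularized Richardson-Lucy iteration of the kind described above: the multiplicative RL update is divided by a term involving the divergence of the normalized gradient of the current estimate. The regularization weight, PSF, and toy data are illustrative assumptions, and the 3D case is not handled.

import numpy as np
from scipy.signal import fftconvolve


def div_normalized_grad(x, eps=1e-8):
    """Divergence of the normalized gradient field of x (curvature-like term)."""
    gy, gx = np.gradient(x)
    norm = np.sqrt(gx ** 2 + gy ** 2 + eps)
    return np.gradient(gy / norm, axis=0) + np.gradient(gx / norm, axis=1)


def rl_tv(y, psf, lam=0.002, n_iter=50):
    """Richardson-Lucy iterations with a total-variation regularization term."""
    x = np.full_like(y, y.mean())
    psf_m = psf[::-1, ::-1]                                   # mirrored PSF
    for _ in range(n_iter):
        est = np.maximum(fftconvolve(x, psf, mode='same'), 1e-12)
        rl = fftconvolve(y / est, psf_m, mode='same')         # standard RL correction factor
        reg = np.maximum(1.0 - lam * div_normalized_grad(x), 1e-3)
        x = x * rl / reg                                      # TV-regularized multiplicative update
    return x


# Toy usage: a bright square blurred by a Gaussian kernel with Poisson noise
rng = np.random.default_rng(8)
truth = np.zeros((64, 64))
truth[24:40, 24:40] = 100.0
kernel = np.exp(-np.arange(-7, 8) ** 2 / 8.0)
psf = np.outer(kernel, kernel)
psf /= psf.sum()
observed = rng.poisson(np.maximum(fftconvolve(truth, psf, mode='same'), 0.0) + 1.0).astype(float)
print(rl_tv(observed, psf).max())
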
15.
IEEE Trans Pattern Anal Mach Intell ; 27(10): 1568-79, 2005 Oct.
Article in English | MEDLINE | ID: mdl-16237992

ABSTRACT

This paper addresses the problem of unsupervised extraction of line networks (for example, road or hydrographic networks) from remotely sensed images. We model the target line network by an object process in which the objects correspond to interacting line segments. The prior model, called "Quality Candy," is designed to exploit as fully as possible the topological properties of the network under consideration, while the radiometric properties of the network are modeled using a data term based on statistical tests. Two techniques are used to compute this term: one more accurate, the other more efficient. A calibration technique is used to choose the model parameters. Optimization is done via simulated annealing using a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm, whose convergence is accelerated by appropriate proposal kernels. The results obtained on satellite and aerial images are quantitatively evaluated against manual extractions. A comparison with the results obtained using a previous model, called the "Candy" model, shows the benefit of adding quality coefficients to the interactions in the prior density. The relevance of an offline computation of the data potential is shown, in particular when a proposal kernel based on this computation is added to the RJMCMC algorithm.


Subject(s)
Algorithms , Artificial Intelligence , Environmental Monitoring/methods , Image Interpretation, Computer-Assisted/methods , Information Storage and Retrieval/methods , Pattern Recognition, Automated/methods , Signal Processing, Computer-Assisted , Image Enhancement/methods , Numerical Analysis, Computer-Assisted
16.
IEEE Trans Image Process ; 13(4): 527-33, 2004 Apr.
Article in English | MEDLINE | ID: mdl-15376587

ABSTRACT

Synthetic aperture radar (SAR) imagery has found important applications due to its clear advantages over optical satellite imagery, one of them being the ability to operate in various weather conditions. However, due to the physics of the radar imaging process, SAR images contain unwanted artifacts in the form of a granular appearance called speckle. The assumptions of the classical SAR image generation model lead to a Rayleigh distribution for the histogram of the SAR image. However, some experimental data, such as images of urban areas, show impulsive characteristics that correspond to underlying heavy-tailed distributions, which are clearly non-Rayleigh. Alternative distributions have been suggested, such as the Weibull, log-normal, and K-distribution, with varying degrees of success depending on the application. Recently, an alternative model, namely the alpha-stable distribution, has been suggested for modeling radar clutter. In this paper, we show that the amplitude distribution of a complex wave whose real and imaginary components are alpha-stable distributed is a generalization of the Rayleigh distribution. We demonstrate that this amplitude distribution is a mixture of Rayleighs, as is the K-distribution, in accordance with earlier work on modeling SAR images which showed that almost all successful SAR image models can be expressed as mixtures of Rayleighs. We also present parameter estimation techniques based on negative-order moments for the new model. Finally, we test the performance of the model on urban images and compare it with other models such as the Rayleigh, Weibull, and K-distribution.
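
The following sketch only simulates the model discussed above: the real and imaginary channels are drawn from a symmetric alpha-stable law, the amplitude is formed, and its upper tail is compared with that of a median-matched Rayleigh reference to make the heavier tail visible. The paper's negative-order-moment estimators are not implemented; the characteristic exponent, dispersion, and sample size are illustrative.

import numpy as np
from scipy.stats import levy_stable, rayleigh

rng = np.random.default_rng(9)
alpha, disp = 1.5, 1.0                                    # characteristic exponent and dispersion
re = levy_stable.rvs(alpha, 0.0, scale=disp, size=20000, random_state=rng)
im = levy_stable.rvs(alpha, 0.0, scale=disp, size=20000, random_state=rng)
amp = np.hypot(re, im)                                    # amplitude of the complex return

# Rayleigh reference matched to the sample median (a robust fit)
ray_scale = np.median(amp) / np.sqrt(2.0 * np.log(2.0))
t = 5.0 * np.median(amp)
print((amp > t).mean(), rayleigh.sf(t, scale=ray_scale))  # empirical tail is much heavier
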

17.
IEEE Trans Image Process ; 13(4): 613-21, 2004 Apr.
Article in English | MEDLINE | ID: mdl-15376594

ABSTRACT

The deconvolution of blurred and noisy satellite images is an ill-posed inverse problem, which can be regularized within a Bayesian context by using an a priori model of the reconstructed solution. Since real satellite data show spatially variant characteristics, we propose here to use an inhomogeneous model. We use the maximum likelihood estimator (MLE) to estimate its parameters and we show that the MLE computed on the corrupted image is not suitable for image deconvolution because it is not robust to noise. We then show that the estimation is correct only if it is made from the original image. Since this image is unknown, we need to compute an approximation of sufficiently good quality to provide useful estimation results. Such an approximation is provided by a wavelet-based deconvolution algorithm. Thus, a hybrid method is first used to estimate the space-variant parameters from this image and then to compute the regularized solution. The obtained results on high resolution satellite images simultaneously exhibit sharp edges, correctly restored textures, and a high SNR in homogeneous areas, since the proposed technique adapts to the local characteristics of the data.


Subject(s)
Algorithms , Environmental Monitoring/methods , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Information Storage and Retrieval/methods , Pattern Recognition, Automated , Spacecraft , Artificial Intelligence , Computer Simulation , Feedback , Models, Statistical , Normal Distribution , Reproducibility of Results , Sensitivity and Specificity , Signal Processing, Computer-Assisted
18.
IEEE Trans Image Process ; 11(3): 188-200, 2002.
Article in English | MEDLINE | ID: mdl-18244623

ABSTRACT

In this paper, we derive analytic expressions for the phase correlation of downsampled images. We show that for downsampled images the signal power in the phase correlation is not concentrated in a single peak, but rather in several coherent peaks, mostly adjacent to each other. These coherent peaks correspond to the polyphase transform of a filtered unit impulse centered at the point of registration. The analytic results provide a closed-form solution to subpixel translation estimation and are used for detailed error analysis. Excellent results are obtained for subpixel translation estimation of images of different types and across different spectral bands.
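
A minimal sketch of integer-pixel phase correlation is given below: the normalized cross-power spectrum is inverted and the peak location is mapped to a signed shift. The paper's analytic treatment of the multiple coherent peaks produced by downsampling and its closed-form subpixel solution are not reproduced; the toy images and shift are illustrative.

import numpy as np


def phase_correlation(img1, img2, eps=1e-12):
    """Estimate the integer-pixel translation that maps img2 onto img1."""
    F1, F2 = np.fft.fft2(img1), np.fft.fft2(img2)
    cross = F1 * np.conj(F2)
    surface = np.fft.ifft2(cross / (np.abs(cross) + eps)).real   # phase correlation surface
    peak = np.unravel_index(surface.argmax(), surface.shape)
    # Map the peak location to a signed (dy, dx) shift
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, surface.shape))


# Toy usage with a circularly shifted random image
rng = np.random.default_rng(10)
base = rng.normal(0.0, 1.0, (128, 128))
moved = np.roll(base, shift=(5, -9), axis=(0, 1))
print(phase_correlation(moved, base))                            # approximately (5, -9)
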
