Results 1 - 20 of 22
1.
Inf Process Med Imaging ; 19: 553-65, 2005.
Article in English | MEDLINE | ID: mdl-17354725

ABSTRACT

Having accurate left ventricle (LV) segmentations across a cardiac cycle provides useful quantitative (e.g. ejection fraction) and qualitative information for diagnosis of certain heart conditions. Existing LV segmentation techniques are founded mostly upon algorithms for segmenting static images. In order to exploit the dynamic structure of the heart in a principled manner, we approach the problem of LV segmentation as a recursive estimation problem. In our framework, LV boundaries constitute the dynamic system state to be estimated, and a sequence of observed cardiac images constitute the data. By formulating the problem as one of state estimation, the segmentation at each particular time is based not only on the data observed at that instant, but also on predictions based on past segmentations. This requires a dynamical system model of the LV, which we propose to learn from training data through an information-theoretic approach. To incorporate the learned dynamic model into our segmentation framework and obtain predictions, we use ideas from particle filtering. Our framework uses a curve evolution method to combine such predictions with the observed images to estimate the LV boundaries at each time. We demonstrate the effectiveness of the proposed approach on a large set of cardiac images. We observe that our approach provides more accurate segmentations than those from static image segmentation techniques, especially when the observed data are of limited quality.


Subject(s)
Artificial Intelligence , Heart Ventricles/anatomy & histology , Heart Ventricles/diagnostic imaging , Image Interpretation, Computer-Assisted/methods , Pattern Recognition, Automated/methods , Subtraction Technique , Ventricular Function, Left/physiology , Algorithms , Computer Simulation , Humans , Image Enhancement/methods , Imaging, Three-Dimensional/methods , Magnetic Resonance Imaging/methods , Models, Cardiovascular , Reproducibility of Results , Sensitivity and Specificity , Tomography, X-Ray Computed/methods
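
As a rough illustration of the recursive estimation loop described in this abstract, the sketch below runs a toy particle filter over low-dimensional shape coefficients. The linear dynamics (A, Q), the likelihood function, and the two "frames" are all invented for illustration; the paper learns its dynamic model from training data and scores predictions with a curve evolution data term.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical learned linear dynamics for low-dimensional shape
    # coefficients (A and Q here are invented, not from the paper).
    A = np.array([[0.95, 0.10], [-0.10, 0.95]])
    Q = 0.01 * np.eye(2)

    def likelihood(x, obs):
        # Stand-in for the curve-evolution data term that scores a
        # predicted boundary against the observed frame.
        return np.exp(-0.5 * np.sum((x - obs) ** 2) / 0.1)

    n = 500
    particles = rng.normal(0.0, 1.0, size=(n, 2))
    weights = np.full(n, 1.0 / n)

    for obs in [np.array([0.5, 0.2]), np.array([0.6, 0.1])]:  # toy "frames"
        # Predict: push particles through the learned dynamics.
        particles = particles @ A.T + rng.multivariate_normal(np.zeros(2), Q, n)
        # Update: reweight by agreement with the observed frame.
        weights *= np.array([likelihood(p, obs) for p in particles])
        weights /= weights.sum()
        # Resample to avoid weight degeneracy.
        idx = rng.choice(n, n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)

    print("posterior mean shape coefficients:", particles.mean(axis=0))
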
2.
Med Image Anal ; 8(4): 429-45, 2004 Dec.
Article in English | MEDLINE | ID: mdl-15567707

ABSTRACT

This paper presents extensions which improve the performance of the shape-based deformable active contour model presented earlier in [IEEE Conf. Comput. Vision Pattern Recog. 1 (2001) 463] for medical image segmentation. In contrast to that previous work, the segmentation framework that we present in this paper allows multiple shapes to be segmented simultaneously in a seamless fashion. To achieve this, multiple signed distance functions are employed as the implicit representations of the multiple shape classes within the image. A parametric model for this new representation is derived by applying principal component analysis to the collection of these multiple signed distance functions. By deriving a parametric model in this manner, we obtain a coupling between the multiple shapes within the image and hence effectively capture the co-variations among the different shapes. The parameters of the multi-shape model are then calculated to minimize a single mutual information-based cost criterion for image segmentation. The use of a single cost criterion further enhances the coupling between the multiple shapes, as the deformation of any given shape depends, at all times, upon every other shape, regardless of their proximity. We find that the resulting algorithm effectively utilizes the co-dependencies among the different shapes to aid in the segmentation process. It captures a wide range of shape variability despite being a parametric shape model, and it is robust to large amounts of additive noise. We demonstrate the utility of this segmentation framework by applying it to a medical application: the segmentation of the prostate gland, the rectum, and the internal obturator muscles for MR-guided prostate brachytherapy.


Subject(s)
Algorithms , Brain Mapping/methods , Image Interpretation, Computer-Assisted/methods , Pattern Recognition, Automated , Prostate/pathology , Computer Simulation , Humans , Image Enhancement/methods , Imaging, Three-Dimensional , Male , Models, Biological , Models, Statistical , Pelvis/pathology , Principal Component Analysis , Subtraction Technique
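
The coupled multi-shape representation can be sketched as PCA over stacked signed distance functions. In the toy example below, two disks that shift together stand in for co-varying training segmentations, and a single mode coefficient then deforms both shapes jointly. All geometry is fabricated for illustration; real signed distance maps would come from segmented training images.

    import numpy as np

    n_train, grid = 10, 32
    rng = np.random.default_rng(1)

    def disk_sdf(cx, cy, r):
        # Signed distance to a disk: negative inside, positive outside.
        y, x = np.mgrid[0:grid, 0:grid]
        return np.hypot(x - cx, y - cy) - r

    examples = []
    for _ in range(n_train):
        shift = rng.normal(0, 1.5)
        sdf_a = disk_sdf(10 + shift, 16, 6)   # shape 1
        sdf_b = disk_sdf(22 + shift, 16, 5)   # shape 2 co-varies with shape 1
        examples.append(np.concatenate([sdf_a.ravel(), sdf_b.ravel()]))

    X = np.array(examples)
    mean = X.mean(axis=0)
    # PCA by SVD of the centered data; rows of Vt are coupled eigenshapes
    # spanning both signed distance functions at once.
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)

    # A new multi-shape configuration is the mean plus a few modes; one
    # coefficient alpha deforms both shapes together.
    alpha = 1.0
    coupled = mean + alpha * s[0] / np.sqrt(n_train) * Vt[0]
    sdf_a_new = coupled[:grid * grid].reshape(grid, grid)
    print("shape 1 area (pixels inside zero level set):", int((sdf_a_new < 0).sum()))
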
3.
Inf Process Med Imaging ; 18: 185-97, 2003 Jul.
Article in English | MEDLINE | ID: mdl-15344457

ABSTRACT

This paper presents extensions which improve the performance of the shape-based deformable active contour model presented earlier in [9]. In contrast to that work, the segmentation framework that we present in this paper allows multiple shapes to be segmented simultaneously in a seamless fashion. To achieve this, multiple signed distance functions are employed as the implicit representations of the multiple shape classes within the image. A parametric model for this new representation is derived by applying principal component analysis to the collection of these multiple signed distance functions. By deriving a parametric model in this manner, we obtain a coupling between the multiple shapes within the image and hence effectively capture the co-variations among the different shapes. The parameters of the multi-shape model are then calculated to minimize a single mutual information-based cost functional for image segmentation. The use of a single cost criterion further enhances the coupling between the multiple shapes, as the deformation of any given shape depends, at all times, upon every other shape, regardless of their proximity. We demonstrate the utility of this algorithm by applying it to the segmentation of the prostate gland, the rectum, and the internal obturator muscles for MR-guided prostate brachytherapy.


Subject(s)
Algorithms , Artificial Intelligence , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Pattern Recognition, Automated , Prostate/pathology , Subtraction Technique , Computer Simulation , Humans , Image Enhancement/methods , Male , Models, Biological , Models, Statistical , Pelvis/pathology , Principal Component Analysis
4.
IEEE Trans Image Process ; 10(8): 1169-86, 2001.
Article in English | MEDLINE | ID: mdl-18255534

ABSTRACT

In this work, we first address the problem of simultaneous image segmentation and smoothing by approaching the Mumford-Shah paradigm from a curve evolution perspective. In particular, we let a set of deformable contours define the boundaries between regions in an image where we model the data via piecewise smooth functions and employ a gradient flow to evolve these contours. Each gradient step involves solving an optimal estimation problem for the data within each region, connecting curve evolution and the Mumford-Shah functional with the theory of boundary-value stochastic processes. The resulting active contour model offers a tractable implementation of the original Mumford-Shah model (i.e., without resorting to elliptic approximations which have traditionally been favored for greater ease in implementation) to simultaneously segment and smoothly reconstruct the data within a given image in a coupled manner. Various implementations of this algorithm are introduced to increase its speed of convergence. We also outline a hierarchical implementation of this algorithm to handle important image features such as triple points and other multiple junctions. Next, by generalizing the data fidelity term of the original Mumford-Shah functional to incorporate a spatially varying penalty, we extend our method to problems in which data quality varies across the image and to images in which sets of pixel measurements are missing. This more general model leads us to a novel PDE-based approach for simultaneous image magnification, segmentation, and smoothing, thereby extending the traditional applications of the Mumford-Shah functional which only considers simultaneous segmentation and smoothing.
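
A minimal sketch of the region-based curve evolution idea follows, using the piecewise-constant simplification: two region constants c1 and c2 stand in for the optimal smooth estimates that the paper solves for inside each region at every gradient step, and a Laplacian approximates the length penalty.

    import numpy as np

    rng = np.random.default_rng(2)
    img = np.zeros((64, 64))
    img[20:44, 20:44] = 1.0
    img += 0.3 * rng.normal(size=img.shape)

    # Level-set function, negative inside the evolving contour.
    y, x = np.mgrid[0:64, 0:64]
    phi = np.hypot(x - 32, y - 32) - 25.0

    dt = 0.5
    for _ in range(100):
        inside, outside = phi < 0, phi >= 0
        c1, c2 = img[inside].mean(), img[outside].mean()
        # Data fidelity: a pixel moves inside when it matches c1 better.
        force = (img - c1) ** 2 - (img - c2) ** 2
        # Length penalty approximated by Laplacian smoothing of phi.
        lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
               np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4 * phi)
        phi += dt * (force + 0.2 * lap)

    print("pixels segmented as foreground:", int((phi < 0).sum()))
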

5.
IEEE Trans Image Process ; 9(2): 256-66, 2000.
Article in English | MEDLINE | ID: mdl-18255392

ABSTRACT

We introduce a family of first-order multidimensional ordinary differential equations (ODEs) with discontinuous right-hand sides and demonstrate their applicability in image processing. An equation belonging to this family behaves as an inverse diffusion everywhere except at local extrema, where some stabilization is introduced. For this reason, we call these equations "stabilized inverse diffusion equations" (SIDEs). Existence and uniqueness of solutions, as well as stability, are proven for SIDEs. A SIDE in one spatial dimension may be interpreted as a limiting case of a semi-discretized Perona-Malik equation. In an experiment, SIDEs are shown to suppress noise while sharpening edges present in the input signal. Their application to image segmentation is also demonstrated.
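
The limiting 1-D SIDE admits a region-evolution reading: only local extrema move, and neighboring regions merge when their values meet. The sketch below implements that reading with the limiting force F(v) = sgn(v); the step size, merge tolerance, and iteration cap are illustrative choices, not from the paper.

    import numpy as np

    rng = np.random.default_rng(3)
    signal = np.concatenate([np.zeros(40), np.ones(40)]) + 0.2 * rng.normal(size=80)

    # Regions of constant value, each with a mass (length); one per sample
    # initially. Extrema move at a rate inversely proportional to mass.
    values = signal.astype(float).tolist()
    masses = [1.0] * len(values)
    dt, tol = 0.02, 0.05

    for _ in range(5000):
        if len(values) <= 2:
            break
        n = len(values)
        rates = []
        for i in range(n):
            left = np.sign(values[i] - values[i - 1]) if i > 0 else 0.0
            right = np.sign(values[i + 1] - values[i]) if i < n - 1 else 0.0
            rates.append((right - left) / masses[i])
        values = [v + dt * r for v, r in zip(values, rates)]
        # Stabilization: merge neighboring regions whose values have met.
        mv, mm = [values[0]], [masses[0]]
        for v, m in zip(values[1:], masses[1:]):
            if abs(v - mv[-1]) < tol:
                mv[-1] = (mv[-1] * mm[-1] + v * m) / (mm[-1] + m)
                mm[-1] += m
            else:
                mv.append(v)
                mm.append(m)
        values, masses = mv, mm

    print(len(values), "regions left; values:", [round(v, 2) for v in values])
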

6.
IEEE Trans Image Process ; 9(3): 456-68, 2000.
Article in English | MEDLINE | ID: mdl-18255416

ABSTRACT

This paper addresses the problem of both segmenting and reconstructing a noisy signal or image. The work is motivated by large problems arising in certain scientific applications, such as medical imaging. Two objectives for a segmentation and denoising algorithm are laid out: it should be computationally efficient and capable of generating statistics for the errors in the reconstruction and estimates of the boundary locations. The starting point for the development of a suitable algorithm is a variational approach to segmentation (Shah 1992). This paper then develops a precise statistical interpretation of a one dimensional (1-D) version of this variational approach to segmentation. The 1-D algorithm that arises as a result of this analysis is computationally efficient and capable of generating error statistics. A straightforward extension of this algorithm to two dimensions would incorporate recursive procedures for computing estimates of inhomogeneous Gaussian Markov random fields. Such procedures require an unacceptably large number of operations. To meet the objective of developing a computationally efficient algorithm, the use of previously developed multiscale statistical methods is investigated. This results in the development of an algorithm for segmenting and denoising which is not only computationally efficient but also capable of generating error statistics, as desired.
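
The flavor of a 1-D recursive estimator that also returns error statistics can be conveyed with a scalar Kalman filter plus a Rauch-Tung-Striebel smoothing pass over a random-walk signal model. This is a simplification, not the paper's segmentation algorithm, but it shows how per-sample error variances fall out of the recursion at no extra cost.

    import numpy as np

    rng = np.random.default_rng(4)
    truth = np.concatenate([np.zeros(50), 2 * np.ones(50)])
    y = truth + 0.5 * rng.normal(size=100)

    q, r = 0.1, 0.25                         # process / measurement noise variances
    n = len(y)
    xf = np.zeros(n); pf = np.zeros(n)       # filtered mean / variance
    xp = np.zeros(n); pp = np.zeros(n)       # predicted mean / variance
    x, p = 0.0, 10.0
    for t in range(n):
        xp[t], pp[t] = x, p + q              # predict
        k = pp[t] / (pp[t] + r)              # Kalman gain
        x = xp[t] + k * (y[t] - xp[t])       # update
        p = (1 - k) * pp[t]
        xf[t], pf[t] = x, p

    xs = xf.copy(); ps = pf.copy()           # backward (smoothing) pass
    for t in range(n - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
        ps[t] = pf[t] + g * g * (ps[t + 1] - pp[t + 1])

    print("rmse:", np.sqrt(np.mean((xs - truth) ** 2)), "mean error var:", ps.mean())
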

7.
IEEE Trans Image Process ; 7(6): 825-37, 1998.
Article in English | MEDLINE | ID: mdl-18276296

ABSTRACT

In this paper, we investigate the problems of anomaly detection and localization from noisy tomographic data. These are characteristic of a class of problems that cannot be optimally solved because they involve hypothesis testing over hypothesis spaces with extremely large cardinality. Our multiscale hypothesis testing approach addresses the key issues associated with this class of problems. A multiscale hypothesis test is a hierarchical sequence of composite hypothesis tests that discards large portions of the hypothesis space with minimal computational burden and zooms in on the likely true hypothesis. For the anomaly detection and localization problems, hypothesis zooming corresponds to spatial zooming: anomalies are successively localized to finer and finer spatial scales. The key challenges we address include how to hierarchically divide a large hypothesis space and how to process the data at each stage of the hierarchy to decide which parts of the hypothesis space deserve more attention. For the latter, we pose and solve a nonlinear optimization problem for a decision statistic that maximally disambiguates composite hypotheses. At no additional computational cost, our optimized statistic shows substantial improvement over conventional approaches. We provide examples that demonstrate this and quantify how much performance is sacrificed by the use of a suboptimal method as compared to that achievable if the optimal approach were computationally feasible.
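
Hypothesis zooming can be sketched as a coarse-to-fine search: split the current interval, score each half with a composite statistic, and keep the more likely half, so the hypothesis space shrinks geometrically. The within-interval energy statistic below is a simple placeholder for the optimized decision statistic the paper derives.

    import numpy as np

    rng = np.random.default_rng(5)
    n = 1024
    signal = 0.8 * rng.normal(size=n)
    signal[600:620] += 3.0                      # the anomaly to localize

    lo, hi = 0, n
    while hi - lo > 32:
        mid = (lo + hi) // 2
        # Composite statistic per half: mean energy, which grows when the
        # half contains the anomaly.
        left = np.mean(signal[lo:mid] ** 2)
        right = np.mean(signal[mid:hi] ** 2)
        lo, hi = (lo, mid) if left > right else (mid, hi)

    print(f"anomaly localized to [{lo}, {hi})")
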

8.
IEEE Trans Image Process ; 6(1): 7-20, 1997.
Article in English | MEDLINE | ID: mdl-18282875

ABSTRACT

We present efficient multiscale approaches to the segmentation of natural clutter, specifically grass and forest, and to the enhancement of anomalies in synthetic aperture radar (SAR) imagery. The methods we propose exploit the coherent nature of SAR sensors. In particular, they take advantage of the characteristic statistical differences in imagery of different terrain types, as a function of scale, due to radar speckle. We employ a class of multiscale stochastic processes that provide a powerful framework for describing random processes and fields that evolve in scale. We build models representative of each category of terrain of interest (i.e., grass and forest) and employ them to direct decisions on pixel classification, segmentation, and anomaly detection. The scale-autoregressive nature of our models allows extremely efficient calculation of likelihoods for different terrain classifications over windows of SAR imagery. We subsequently use these likelihoods as the basis for both image pixel classification and grass-forest boundary estimation. In addition, anomaly enhancement is possible with minimal additional computation. Specifically, the residuals produced by our models in predicting SAR imagery from coarser scale images are theoretically uncorrelated. As a result, potentially anomalous pixels and regions are enhanced and pinpointed by noting regions whose residuals display a high level of correlation throughout scale. We evaluate the performance of our techniques through testing on 0.3-m resolution SAR data gathered with Lincoln Laboratory's millimeter-wave SAR.
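
The scale-autoregressive idea can be sketched by predicting each level of an image pyramid from its parent and inspecting the residuals: under a matched clutter model the residuals are roughly white, so an anomaly stands out where residual energy persists across scale. The coefficients here are fit globally per scale, not per terrain class as in the paper, and the speckle model is a crude stand-in.

    import numpy as np

    rng = np.random.default_rng(6)
    img = rng.gamma(shape=1.0, scale=1.0, size=(64, 64))   # speckle-like clutter
    img[30:34, 30:34] += 4.0                               # a small anomaly

    def coarsen(a):
        return 0.25 * (a[::2, ::2] + a[1::2, ::2] + a[::2, 1::2] + a[1::2, 1::2])

    residual_stack = []
    fine = img
    for _ in range(3):
        coarse = coarsen(fine)
        parent = np.kron(coarse, np.ones((2, 2)))          # parent value per child
        a = np.sum(fine * parent) / np.sum(parent ** 2)    # least-squares AR coeff.
        residual_stack.append(np.abs(fine - a * parent))
        fine = coarse

    # Accumulate residual evidence on the finest grid; the anomaly stands out.
    score = residual_stack[0].copy()
    for k, r in enumerate(residual_stack[1:], start=1):
        score += np.kron(r, np.ones((2 ** k, 2 ** k)))
    print("peak residual score at:", np.unravel_index(score.argmax(), score.shape))
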

9.
IEEE Trans Image Process ; 6(11): 1517-29, 1997.
Article in English | MEDLINE | ID: mdl-18282910

ABSTRACT

Recently, a class of multiscale stochastic models has been introduced in which random processes and fields are described by scale-recursive dynamic trees. A major advantage of this framework is that it leads to an extremely efficient, statistically optimal algorithm for least-squares estimation. In certain applications, however, estimates based on the types of multiscale models previously proposed may not be adequate, as they have tended to exhibit a visually distracting blockiness. We eliminate this blockiness by discarding the standard assumption that distinct nodes on a given level of the multiscale process correspond to disjoint portions of the image domain; instead, we allow a correspondence to overlapping portions of the image domain. We use these so-called overlapping-tree models for both modeling and estimation. In particular, we develop an efficient multiscale algorithm for generating sample paths of a random field whose second-order statistics match a prespecified covariance structure, to any desired degree of fidelity. Furthermore, we demonstrate that under easily satisfied conditions, we can "lift" a random field estimation problem to one defined on an overlapped tree, resulting in an estimation algorithm that is computationally efficient, directly produces estimation error covariances, and eliminates blockiness in the reconstructed imagery without any sacrifice in the resolution of fine-scale detail.

10.
IEEE Trans Image Process ; 6(3): 463-78, 1997.
Article in English | MEDLINE | ID: mdl-18282941

ABSTRACT

We use a natural pixel-type representation of an object, originally developed for incomplete data tomography problems, to construct nearly orthonormal multiscale basis functions. The nearly orthonormal behavior of the multiscale basis functions results in a system matrix, relating the input (the object coefficients) and the output (the projection data), which is extremely sparse. In addition, the coarsest scale elements of this matrix capture any ill conditioning in the system matrix arising from the geometry of the imaging system. We exploit this feature to partition the system matrix by scales and obtain a reconstruction procedure that requires inversion of only a well-conditioned and sparse matrix. This enables us to formulate a tomographic reconstruction technique from incomplete data wherein the object is reconstructed at multiple scales or resolutions. In the case of noisy projection data, we extend our multiscale reconstruction technique to explicitly account for noise by calculating maximum a posteriori probability (MAP) multiscale reconstruction estimates based on a certain self-similar prior on the multiscale object coefficients. The multiscale reconstruction framework presented here can find application in the regularization of imaging problems where the projection data are incomplete, irregular, and noisy, and in object feature recognition directly from projection data.

11.
IEEE Trans Med Imaging ; 15(1): 92-101, 1996.
Article in English | MEDLINE | ID: mdl-18215892

ABSTRACT

The authors represent the standard ramp filter operator of the filtered-back-projection (FBP) reconstruction in different bases composed of Haar and Daubechies compactly supported wavelets. The resulting multiscale representation of the ramp-filter matrix operator is approximately diagonal. The accuracy of this diagonal approximation becomes better as wavelets with larger numbers of vanishing moments are used. This wavelet-based representation enables the authors to formulate a multiscale tomographic reconstruction technique in which the object is reconstructed at multiple scales or resolutions. A complete reconstruction is obtained by combining the reconstructions at different scales. The authors' multiscale reconstruction technique has the same computational complexity as the FBP reconstruction method. It differs from other multiscale reconstruction techniques in that (1) the object is defined through a one-dimensional multiscale transformation of the projection domain, and (2) the authors explicitly account for noise in the projection data by calculating maximum a posteriori probability (MAP) multiscale reconstruction estimates based on a chosen fractal prior on the multiscale object coefficients. The computational complexity of this MAP solution is also the same as that of the FBP reconstruction. This result is in contrast to commonly used methods of statistical regularization, which result in computationally intensive optimization algorithms.
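
The near-diagonality claim is easy to probe numerically: build the circulant ramp-filter matrix, conjugate it by an orthonormal wavelet transform, and measure the off-diagonal energy. The sketch uses the Haar basis (one vanishing moment); the paper's Daubechies wavelets, with more vanishing moments, yield a better diagonal approximation.

    import numpy as np

    n = 64

    def haar_matrix(n):
        # Orthonormal Haar wavelet transform matrix, built recursively.
        if n == 1:
            return np.array([[1.0]])
        h = haar_matrix(n // 2)
        top = np.kron(h, [1.0, 1.0]) / np.sqrt(2)                 # averages
        bot = np.kron(np.eye(n // 2), [1.0, -1.0]) / np.sqrt(2)   # details
        return np.vstack([top, bot])

    # Circulant ramp filter: |omega| frequency response applied via the DFT.
    freqs = np.abs(np.fft.fftfreq(n))
    ramp = np.real(np.fft.ifft(np.fft.fft(np.eye(n), axis=0) * freqs[:, None],
                               axis=0))

    W = haar_matrix(n)
    M = W @ ramp @ W.T
    off = M - np.diag(np.diag(M))
    print("off-diagonal energy fraction:",
          np.linalg.norm(off) ** 2 / np.linalg.norm(M) ** 2)
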

12.
IEEE Trans Image Process ; 5(3): 459-70, 1996.
Article in English | MEDLINE | ID: mdl-18285131

ABSTRACT

We describe a variational framework for the tomographic reconstruction of an image from the maximum likelihood (ML) estimates of its orthogonal moments. We show how these estimated moments and their (correlated) error statistics can be computed directly and linearly from given noisy and possibly sparse projection data. Moreover, thanks to the consistency properties of the Radon transform, this two-step approach (moment estimation followed by image reconstruction) can be viewed as a statistically optimal procedure. Furthermore, by focusing on the important role played by the moments of projection data, we immediately see the close connection between tomographic reconstruction of nonnegative valued images and the problem of nonparametric estimation of probability densities given estimates of their moments. Taking advantage of this connection, our proposed variational algorithm is based on the minimization of a cost functional composed of a term measuring the divergence between a given prior estimate of the image and the current estimate of the image, and a second quadratic term based on the error incurred in the estimation of the moments of the underlying image from the noisy projection data. We show that an iterative refinement of this algorithm leads to a practical method for solving the highly complex, equality-constrained divergence minimization problem, and that this iterative refinement results in superior reconstructions of images from very noisy data as compared with the classical filtered back-projection (FBP) algorithm.
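
The linear moment link underlying the method: the k-th moment of the projection at angle theta is a fixed linear combination of the image's order-k geometric moments. The sketch below recovers mass and centroid from noisy projection moments by least squares; the divergence-minimization reconstruction step itself is not shown, and all noise levels are invented.

    import numpy as np

    rng = np.random.default_rng(7)
    grid = 64
    y, x = np.mgrid[0:grid, 0:grid] - grid / 2.0
    img = (np.hypot(x - 5, y + 3) < 12).astype(float)   # off-center disk

    angles = np.linspace(0, np.pi, 12, endpoint=False)
    m0s, rows, rhs = [], [], []
    for th in angles:
        t = x * np.cos(th) + y * np.sin(th)              # projection coordinate
        m0s.append(img.sum() + rng.normal(0, 5.0))       # noisy 0th proj. moment
        # 1st projection moment = cos(th)*Mx + sin(th)*My for image moments Mx, My.
        rows.append([np.cos(th), np.sin(th)])
        rhs.append((t * img).sum() + rng.normal(0, 20.0))

    mass = np.mean(m0s)                                  # mass: average over angles
    (Mx, My), *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    print("centroid estimate:", (Mx / mass, My / mass), "true: (5, -3)")
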

13.
IEEE Trans Med Imaging ; 14(2): 249-58, 1995.
Article in English | MEDLINE | ID: mdl-18215828

ABSTRACT

The estimation of dynamically evolving ellipsoids from noisy lower-dimensional projections is examined. In particular, this work describes a model-based approach using geometric reconstruction and recursive estimation techniques to obtain a dynamic estimate of left-ventricular ejection fraction from a gated set of planar myocardial perfusion images. The proposed approach differs from current ejection fraction estimation techniques both in the imaging modality used and in the subsequent processing which yields a dynamic ejection fraction estimate. For this work, the left ventricle is modeled as a dynamically evolving three-dimensional (3-D) ellipsoid. The left-ventricular outline observed in the myocardial perfusion images is then modeled as a dynamic, two-dimensional (2-D) ellipse, obtained as the projection of the former 3-D ellipsoid. These data are processed in two ways: first, as a 3-D dynamic ellipsoid reconstruction problem; second, each view is considered as a 2-D dynamic ellipse estimation problem, and the 3-D ejection fraction is then obtained by combining the effective 2-D ejection fractions of each view. The approximating ellipsoids are reconstructed using a Rauch-Tung-Striebel smoothing filter, which produces an ejection fraction estimate that is more robust to noise since it is based on the entire data set; in contrast, traditional ejection fraction estimates are based on only two frames of data. Further, numerical studies of the sensitivity of this approach to unknown dynamics and projection geometry are presented, providing a rational basis for specifying system parameters. This investigation includes estimation of ejection fraction from both simulated and real data.
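
The geometry can be conveyed with a toy pulsating ellipsoid whose noisy volume observations are smoothed over the whole gated cycle before the ejection fraction is taken. A periodic moving average stands in for the paper's Rauch-Tung-Striebel smoother, and all dimensions and noise levels are invented.

    import numpy as np

    rng = np.random.default_rng(8)
    frames = 16
    phase = 2 * np.pi * np.arange(frames) / frames
    a = 30 + 5 * np.cos(phase)          # ellipsoid semi-axes (mm), beating
    b = 20 + 4 * np.cos(phase)
    c = 20 + 4 * np.cos(phase)

    volume = 4.0 / 3.0 * np.pi * a * b * c
    observed = volume * (1 + 0.1 * rng.normal(size=frames))   # noisy volumes

    # Smooth using the whole cycle (periodic moving average) before taking EF.
    kernel = np.ones(3) / 3.0
    smoothed = np.convolve(np.tile(observed, 3), kernel, mode="same")[frames:2 * frames]

    ef_raw = (observed.max() - observed.min()) / observed.max()
    ef_smooth = (smoothed.max() - smoothed.min()) / smoothed.max()
    ef_true = (volume.max() - volume.min()) / volume.max()
    print(f"EF true {ef_true:.2f}, raw {ef_raw:.2f}, smoothed {ef_smooth:.2f}")
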

14.
IEEE Trans Image Process ; 4(2): 194-207, 1995.
Article in English | MEDLINE | ID: mdl-18289971

ABSTRACT

A class of multiscale stochastic models based on scale-recursive dynamics on trees has previously been introduced. Theoretical and experimental results have shown that these models provide an extremely rich framework for representing processes which are intrinsically multiscale, e.g., 1/f processes, as well as 1-D Markov processes and 2-D Markov random fields. Moreover, efficient optimal estimation algorithms have been developed for these models by exploiting their scale-recursive structure. The authors exploit this structure to develop a computationally efficient and parallelizable algorithm for likelihood calculation. They illustrate one possible application to texture discrimination and demonstrate that likelihood-based methods using the algorithm achieve performance comparable to that of Gaussian Markov random field based techniques, which in general are prohibitively complex computationally.

15.
IEEE Trans Image Process ; 3(1): 41-64, 1994.
Article in English | MEDLINE | ID: mdl-18291908

ABSTRACT

A new approach to regularization methods for image processing is introduced and developed using as a vehicle the problem of computing dense optical flow fields in an image sequence. The solution of the new problem formulation is computed with an efficient multiscale algorithm. Experiments on several image sequences demonstrate the substantial computational savings that can be achieved due to the fact that the algorithm is noniterative and in fact has a per pixel computational complexity that is independent of image size. The new approach also has a number of other important advantages. Specifically, multiresolution flow field estimates are available, allowing great flexibility in dealing with the tradeoff between resolution and accuracy. Multiscale error covariance information is also available, which is of considerable use in assessing the accuracy of the estimates. In particular, these error statistics can be used as the basis for a rational procedure for determining the spatially-varying optimal reconstruction resolution. Furthermore, if there are compelling reasons to insist upon a standard smoothness constraint, the new algorithm provides an excellent initialization for the iterative algorithms associated with the smoothness constraint problem formulation. Finally, the usefulness of the approach should extend to a wide variety of ill-posed inverse problems in which variational techniques seeking a "smooth" solution are generally used.

16.
IEEE Trans Image Process ; 3(6): 773-88, 1994.
Article in English | MEDLINE | ID: mdl-18296246

ABSTRACT

In the computation of dense optical flow fields, spatial coherence constraints are commonly used to regularize otherwise ill-posed problem formulations, providing spatial integration of data. We present a temporal, multiframe extension of the dense optical flow estimation formulation proposed by Horn and Schunck (1981) in which we use a temporal coherence constraint to yield the optimal fusing of data from multiple frames of measurements. Conceptually, standard Kalman filtering algorithms are applicable to the resulting multiframe optical flow estimation problem, providing a solution that is sequential and recursive in time. Experiments are presented to demonstrate that the resulting multiframe estimates are more robust to noise than those provided by the original, single-frame formulation. In addition, we demonstrate cases where the aperture problem of motion vision cannot be resolved satisfactorily without the temporal integration of data enabled by the proposed formulation. Practically, the large matrix dimensions involved in the problem prohibit exact implementation of the optimal Kalman filter. To overcome this limitation, we present a computationally efficient, yet near-optimal approximation of the exact filtering algorithm. This approximation has a precise interpretation as the sequential estimation of a reduced-order spatial model for the optical flow estimation error process at each time step and arises from an estimation-theoretic treatment of the filtering problem. Experiments also demonstrate the efficacy of this near-optimal filter.
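
A crude stand-in for the multiframe formulation: the Horn-Schunck update augmented with a quadratic temporal penalty that pulls the flow toward the previous frame's estimate (u0, v0). The weights, test images, and displacement below are all fabricated for illustration; the paper's filter propagates full error statistics, which this sketch does not.

    import numpy as np

    def hs_multiframe(I1, I2, u0, v0, alpha=1.0, beta=0.1, iters=300):
        # Horn-Schunck iteration with an added term
        # beta * ((u - u0)^2 + (v - v0)^2) enforcing temporal coherence.
        Iy, Ix = np.gradient(I1)
        It = I2 - I1
        u, v = u0.copy(), v0.copy()
        A = alpha + beta
        for _ in range(iters):
            ubar = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                           np.roll(u, 1, 1) + np.roll(u, -1, 1))
            vbar = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                           np.roll(v, 1, 1) + np.roll(v, -1, 1))
            mu = (alpha * ubar + beta * u0) / A   # blended prior mean
            mv = (alpha * vbar + beta * v0) / A
            t = (Ix * mu + Iy * mv + It) / (A + Ix ** 2 + Iy ** 2)
            u, v = mu - Ix * t, mv - Iy * t
        return u, v

    x = np.arange(32)
    X, Y = np.meshgrid(x, x)
    frame1 = 3 * np.sin(np.pi * X / 8) + 3 * np.cos(np.pi * Y / 8)
    frame2 = np.roll(frame1, 1, axis=1)   # pattern translated 1 pixel right
    zeros = np.zeros((32, 32))
    u, v = hs_multiframe(frame1, frame2, zeros, zeros)
    print("mean horizontal flow (true value 1.0):", round(float(u.mean()), 2))
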

17.
IEEE Trans Image Process ; 2(3): 401-16, 1993.
Article in English | MEDLINE | ID: mdl-18296226

ABSTRACT

The authors describe and demonstrate a hierarchical reconstruction algorithm for use in noisy and limited-angle or sparse-angle tomography. The algorithm estimates an object's mass, center of mass, and convex hull from the available projections, and uses this information, along with fundamental mathematical constraints, to estimate a full set of smoothed projections. The mass and center of mass estimates are made using a least squares estimator derived from the principles of consistency of the Radon transform. The convex hull estimate is produced by first estimating the positions of support lines of the object from each available projection and then estimating the overall convex hull using prior shape information. Estimating the position of two support lines from a single projection is accomplished using a generalized likelihood ratio technique for estimating jumps in linear systems. Results for simulated objects in a variety of measurement situations are shown, and several possible extensions to this work are discussed.
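
The support-line construction can be sketched directly: each projection vanishes outside an interval whose endpoints define two support lines of the object, and intersecting the resulting slabs over all view angles bounds the convex hull. The threshold test below is a simple placeholder for the paper's generalized likelihood ratio estimator of the support positions.

    import numpy as np

    rng = np.random.default_rng(10)
    grid = 64
    y, x = np.mgrid[0:grid, 0:grid] - grid / 2.0
    obj = ((np.abs(x - 4) < 10) & (np.abs(y + 2) < 6)).astype(float)  # a box

    angles = np.linspace(0, np.pi, 18, endpoint=False)
    hull = np.ones_like(obj, dtype=bool)
    for th in angles:
        t = x * np.cos(th) + y * np.sin(th)
        # Projection: histogram of object mass along t, plus noise.
        bins = np.arange(-grid / 2, grid / 2 + 1)
        proj, _ = np.histogram(t[obj > 0], bins=bins)
        proj = proj + rng.normal(0, 0.5, size=proj.shape)
        support = np.nonzero(proj > 2.0)[0]
        t_lo, t_hi = bins[support[0]], bins[support[-1] + 1]
        # Keep only points between the two support lines for this angle.
        hull &= (t >= t_lo) & (t <= t_hi)

    print("true area:", int(obj.sum()), "hull-estimate area:", int(hull.sum()))
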

18.
J Electrocardiol ; 23 Suppl: 102-10, 1990.
Article in English | MEDLINE | ID: mdl-2090726

ABSTRACT

The authors describe their perspective on the modeling of cardiac rhythms as a component of cardiac arrhythmia signal-processing algorithms. They emphasize that these models serve a specific end purpose, and that the only aspects of cardiac behavior captured by the models are those relevant to the development of the signal-processing algorithms. The approach is to use statistics to describe ranges of cardiac behavior that share some common feature with respect to the purpose of the signal processing. The statistical approach has the advantage that, coupled with a statistical performance criterion, it specifies an optimal signal-processing algorithm. These optimal algorithms are often computationally intractable, however, especially for real-time use in instruments. Approximations are therefore crucial. The mathematical form of the model is then important since, even if two forms generate identical statistics, the approximations that are natural in different forms can be quite different. Two different mathematical formulations are described, stochastic Petri nets and interacting Markov chains, and the different types of approximately optimal signal-processing algorithms that are natural in these two frameworks are discussed.


Subject(s)
Algorithms , Arrhythmias, Cardiac/diagnosis , Computer Simulation , Electrocardiography , Models, Cardiovascular , Models, Theoretical , Signal Processing, Computer-Assisted , Heart Block/diagnosis , Humans , Markov Chains
19.
Comput Biomed Res ; 22(2): 136-59, 1989 Apr.
Article in English | MEDLINE | ID: mdl-2721167

ABSTRACT

We describe a methodology for modeling heart rhythms observed in electrocardiograms. In particular, we present a procedure to derive simple dynamic models that capture the cardiac mechanisms which control the particular timing sequences of P and R waves characteristic of different arrhythmias. By treating the cardiac electrophysiology at an aggregate level, simple network models of the wave generating system under a variety of diseased conditions can be developed. These network models are then systematically converted to stochastic Petri nets which offer a compact mathematical framework to express the dynamics and statistical variability of the wave generating mechanisms. Models of several arrhythmias are included in order to illustrate the methodology.


Subject(s)
Arrhythmias, Cardiac/physiopathology , Electrocardiography , Heart Conduction System/physiopathology , Models, Cardiovascular , Humans , Stochastic Processes
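
The aggregate wave-generator idea can be caricatured as a tiny timed stochastic Petri net: one transition fires P waves at random intervals, a conduction transition fires an R wave after a delay, and conduction occasionally blocks, giving a crude second-degree-block flavor. All timing parameters and the block probability below are invented for illustration, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(11)

    t, events = 0.0, []
    for _ in range(12):                       # a dozen beats
        t += rng.normal(0.8, 0.05)            # SA-node firing interval (s)
        events.append((round(t, 2), "P"))
        if rng.random() > 0.2:                # 20% chance AV conduction blocks
            events.append((round(t + rng.normal(0.16, 0.01), 2), "R"))

    for time, wave in sorted(events):
        print(f"{time:6.2f}s  {wave} wave")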