Results 1 - 20 of 25
1.
Comput Biol Med ; 99: 53-62, 2018 08 01.
Article in English | MEDLINE | ID: mdl-29886261

ABSTRACT

Detecting and classifying cardiac arrhythmias is critical to the diagnosis of patients with cardiac abnormalities. In this paper, a novel approach based on deep learning methodology is proposed for the classification of single-lead electrocardiogram (ECG) signals. We demonstrate the application of the Restricted Boltzmann Machine (RBM) and deep belief networks (DBN) for ECG classification, following detection of ventricular and supraventricular heartbeats using single-lead ECG. The effectiveness of the proposed algorithm is illustrated using real ECG signals from the widely used MIT-BIH database. Simulation results demonstrate that, with a suitable choice of parameters, RBM and DBN can achieve high average recognition accuracies for ventricular ectopic beats (93.63%) and supraventricular ectopic beats (95.57%) at a low sampling rate of 114 Hz. Experimental results indicate that classifiers built on this deep learning framework achieve state-of-the-art performance at lower sampling rates and with simpler features than traditional methods. Features extracted at a sampling rate of 114 Hz, combined with deep learning, provided sufficient discriminatory power for the classification task. Thus, our proposed deep neural network algorithm demonstrates that deep learning-based methods offer accurate ECG classification and could potentially be extended to other physiological signal classifications, such as arterial blood pressure (ABP), electromyography (EMG), and heart rate variability (HRV) studies.
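As a rough illustration of this kind of pipeline, the sketch below stacks scikit-learn's BernoulliRBM transformers in front of a logistic-regression classifier, a common DBN-style arrangement. The data, layer sizes, and hyperparameters are placeholders, not the paper's actual configuration.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy data standing in for fixed-length beat windows resampled at 114 Hz,
# scaled to [0, 1] as BernoulliRBM expects; labels mark ectopic vs. normal.
rng = np.random.default_rng(0)
X = rng.random((500, 64))
y = rng.integers(0, 2, 500)

# Two stacked RBMs act as greedily pretrained feature extractors
# (a DBN-style stack); logistic regression is the final classifier.
dbn = Pipeline([
    ("rbm1", BernoulliRBM(n_components=100, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=50, learning_rate=0.05, n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
dbn.fit(X, y)
print("training accuracy:", dbn.score(X, y))
```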


Subject(s)
Arrhythmias, Cardiac/physiopathology , Databases, Factual , Deep Learning , Electrocardiography , Signal Processing, Computer-Assisted , Humans
2.
Proc IEEE Int Symp Biomed Imaging ; 2015: 1284-1287, 2015 Apr.
Article in English | MEDLINE | ID: mdl-28101301

ABSTRACT

Automated profiling of nuclear architecture in histology sections can potentially help predict clinical outcomes. However, the task is challenging because of nuclear pleomorphism and varying cellular states (e.g., cell fate, cell cycle), which are compounded by batch effects (e.g., variations in fixation and staining). Present methods for nuclear segmentation are based on human-designed features that may not effectively capture intrinsic nuclear architecture. In this paper, we propose a novel approach, called sparsity-constrained convolutional regression (SCCR), for nuclei segmentation. Specifically, given raw image patches and the corresponding annotated binary masks, our algorithm jointly learns a bank of convolutional filters and a sparse linear regressor, where the former is used for feature extraction and the latter produces a likelihood of each pixel belonging to a nuclear region or the background. During classification, the pixel label is determined simply by thresholding the likelihood map. The method has been evaluated on a benchmark dataset collected from The Cancer Genome Atlas (TCGA). Experimental results demonstrate that our method outperforms traditional nuclei segmentation algorithms and achieves competitive performance compared to the state-of-the-art algorithm built upon human-designed features with biological prior knowledge.
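The sketch below captures the structure of this approach, convolutional features feeding a sparse linear regressor whose likelihood map is thresholded, but it fixes a random filter bank rather than learning filters and regressor jointly as the paper does; all names and sizes are illustrative.

```python
import numpy as np
from scipy.signal import convolve2d
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
img = rng.random((64, 64))            # stand-in for a raw histology patch
mask = (img > 0.7).astype(float)      # stand-in for an annotated binary mask

# A small filter bank; the paper learns these jointly with the regressor,
# whereas this sketch fixes them randomly for brevity.
filters = [rng.standard_normal((5, 5)) for _ in range(8)]
feats = np.stack(
    [convolve2d(img, f, mode="same", boundary="symm") for f in filters], axis=-1
)

# Sparse linear regressor: per-pixel features -> nuclear-region likelihood.
reg = Lasso(alpha=1e-3, max_iter=10000)
reg.fit(feats.reshape(-1, len(filters)), mask.ravel())
likelihood = reg.predict(feats.reshape(-1, len(filters))).reshape(img.shape)

# Classification reduces to thresholding the likelihood map.
pred_mask = likelihood > 0.5
```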

3.
IEEE J Biomed Health Inform ; 19(2): 508-19, 2015 Mar.
Article in English | MEDLINE | ID: mdl-24846672

ABSTRACT

Recent results in telecardiology show that compressed sensing (CS) is a promising tool to lower energy consumption in wireless body area networks for electrocardiogram (ECG) monitoring. However, the performance of current CS-based algorithms, in terms of compression rate and reconstruction quality of the ECG, still falls short of the performance attained by state-of-the-art wavelet-based algorithms. In this paper, we propose to exploit the structure of the wavelet representation of the ECG signal to boost the performance of CS-based methods for compression and reconstruction of ECG signals. More precisely, we incorporate prior information about the wavelet dependencies across scales into the reconstruction algorithms and exploit the high fraction of common support of the wavelet coefficients of consecutive ECG segments. Experimental results utilizing the MIT-BIH Arrhythmia Database show that significant performance gains, in terms of compression rate and reconstruction quality, can be obtained by the proposed algorithms compared to current CS-based methods.
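A minimal sketch of the underlying idea, weighted l1 reconstruction in which coefficients on the previous segment's support are penalized less, is given below. It runs on a synthetic sparse vector rather than actual wavelet coefficients of ECG segments, and the weights and thresholds are illustrative choices, not the paper's algorithms.

```python
import numpy as np

def soft(x, t):
    """Elementwise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(0)
n, m, k = 256, 80, 10
x_true = np.zeros(n)
supp = rng.choice(n, k, replace=False)
x_true[supp] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = Phi @ x_true

# Prior support, e.g. the wavelet support of the previous ECG segment;
# coefficients believed active get a smaller threshold (weight < 1).
prior_support = supp[: k // 2]
w = np.ones(n)
w[prior_support] = 0.2

# Weighted ISTA: x <- soft(x + Phi^T (y - Phi x) / L, lam * w / L)
L = np.linalg.norm(Phi, 2) ** 2
x, lam = np.zeros(n), 0.05
for _ in range(300):
    x = soft(x + Phi.T @ (y - Phi @ x) / L, lam * w / L)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```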


Subject(s)
Data Compression/methods , Electrocardiography/methods , Algorithms , Databases, Factual , Humans , Remote Sensing Technology , Wavelet Analysis , Wireless Technology
4.
Stud Health Technol Inform ; 196: 479-85, 2014.
Article in English | MEDLINE | ID: mdl-24732560

ABSTRACT

Providing real-time, interactive, immersive surgical training has been a key research area in telemedicine. Earlier approaches mainly adopted videotaped training, which can only show imagery from a fixed viewpoint. Recent advances in commodity 3D imaging have enabled a new paradigm for immersive surgical training by acquiring nearly complete 3D reconstructions of actual surgical procedures. However, unlike 2D videotaping, which can easily stream data in real time, existing 3D-imaging-based solutions require pre-capturing and processing the data; training with the data must therefore be conducted offline, after acquisition. In this paper, we present a new real-time immersive 3D surgical training system. Our solution builds upon the recent multi-Kinect surgical training system [1], which can acquire and display high-fidelity 3D surgical procedures using only a small number of Microsoft Kinect sensors. On top of this system, we build a client-server model for real-time streaming. On the server side, we efficiently fuse the Kinect data acquired from different viewpoints, then compress and stream the data to the client. On the client side, we build an interactive space-time navigator that allows remote users (e.g., trainees) to witness the surgical procedure in real time, as if they were present in the room.
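The sketch below shows one simple way to realize such a streaming layer: length-prefixed, zlib-compressed frames over TCP using only the Python standard library plus NumPy. The frame format and function names are hypothetical stand-ins for the system's actual fusion and compression pipeline.

```python
import socket, struct, zlib
import numpy as np

def send_frame(sock: socket.socket, frame: np.ndarray) -> None:
    """Server side: compress one fused frame and send it length-prefixed."""
    payload = zlib.compress(frame.tobytes(), level=1)   # fast, low-latency setting
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_frame(sock: socket.socket, shape, dtype=np.uint16) -> np.ndarray:
    """Client side: read one length-prefixed frame and decompress it."""
    (size,) = struct.unpack("!I", _recv_exact(sock, 4))
    raw = zlib.decompress(_recv_exact(sock, size))
    return np.frombuffer(raw, dtype=dtype).reshape(shape)

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf
```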


Subject(s)
Surgical Procedures, Operative/education , Telemedicine/methods , Virtual Reality , Humans , Time Factors
5.
Stud Health Technol Inform ; 184: 161-7, 2013.
Article in English | MEDLINE | ID: mdl-23400150

ABSTRACT

Surgical training plays an important role in helping residents develop critical skills, yet providing effective surgical training remains a challenging task. Existing videotaped training instructions can only show imagery from a fixed viewpoint, lacking both depth perception and interactivity. We present a new portable immersive surgical training system capable of acquiring and displaying high-fidelity 3D reconstructions of actual surgical procedures. Our solution utilizes a set of Microsoft Kinect sensors to simultaneously recover the participants, the surgical environment, and the surgical scene itself. We then develop a space-time navigator that allows trainees to witness and explore a prior procedure as if they were there. Preliminary feedback from residents indicates that our system is much more effective than a conventional videotaped system.


Subject(s)
Actigraphy/instrumentation , Biofeedback, Psychology/instrumentation , Computer-Assisted Instruction/instrumentation , Imaging, Three-Dimensional/instrumentation , Surgery, Computer-Assisted/instrumentation , Transducers , User-Computer Interface , Colorimetry/instrumentation , Educational Measurement/methods , Equipment Design , Equipment Failure Analysis , Humans
6.
Stud Health Technol Inform ; 173: 186-92, 2012.
Article in English | MEDLINE | ID: mdl-22356984

ABSTRACT

Surgery simulation is playing an increasing role in medical education. A long-standing problem in this area is how to integrate fast yet realistic haptic feedback into the system. In this paper, we propose an algorithm to accelerate the recently proposed volume-based haptic feedback approach. Unlike existing techniques that require separately scanning along all three axes, we scan the volume only once, along one axis, and recover the penetration information along the other two from geometric constraints and heuristics. This significantly reduces the computational cost and doubles the haptic refresh rate, substantially improving the stability of the haptic feedback.


Subject(s)
Algorithms , Elastic Modulus/physiology , Feedback , Touch Perception , Computer Simulation , Humans , Surgical Procedures, Operative
7.
Stud Health Technol Inform ; 173: 193-9, 2012.
Article in English | MEDLINE | ID: mdl-22356985

ABSTRACT

Surgery simulation plays an important role in surgery planning, surgeon training, and telemedicine. A long-standing problem in this area is how to couple coherent visual illustration with deformation. In this paper, we present a new non-photorealistic surgery simulation system that combines force visualization with dynamic pencil-stroke illustration. We estimate the elastic force field in real time and integrate it with the contact force to form a combined force map. Our rendering module then dynamically computes the principal directions on deforming organ models and applies color-coded, pencil-style strokes to the model to illustrate deformations. We implement these modules on the GPU using NVIDIA's CUDA. Our system can faithfully and coherently reveal the geometric deformation of organs under the force field.


Subject(s)
Computer Simulation , Elasticity Imaging Techniques/methods , Surgical Procedures, Operative , Humans , Models, Anatomic
8.
Stud Health Technol Inform ; 163: 224-30, 2011.
Article in English | MEDLINE | ID: mdl-21335793

ABSTRACT

In surgical procedures, haptic interaction provides surgeons with indispensable information for accurately locating the surgical target. This is especially critical when visual feedback cannot provide sufficient information and tactile interrogation, such as palpating a region of tissue, is required to locate a specific underlying tumor. However, in most current surgery simulators the haptic interaction model is simplified to a contact sphere or rod, leaving haptic feedback for arbitrarily shaped intersections between target tissue and surgical instruments unreliable. In this paper, a novel haptic feedback algorithm is introduced for generating feedback forces in surgery simulations. The proposed algorithm first employs three layered depth images (LDIs) to sample the 3D objects along the X, Y, and Z directions. A second pass scans the two sampled meshes and computes their penetration volume. Based on the principle that the interaction force should minimize the penetration volume, the haptic feedback force is derived directly. Additionally, a post-processing technique is developed to render distinct physical tissue properties across different interaction areas. The proposed approach requires no pre-processing and is applicable to both rigid and deformable objects.
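The sketch below illustrates the stated principle, a feedback force that acts to reduce the penetration volume, on boolean occupancy grids; the finite-difference gradient and stiffness constant are illustrative simplifications of the LDI-based computation.

```python
import numpy as np

def penetration_volume(a: np.ndarray, b: np.ndarray) -> float:
    """Overlap volume of two boolean occupancy grids (unit voxels)."""
    return float(np.logical_and(a, b).sum())

def feedback_force(a: np.ndarray, b: np.ndarray, k: float = 1.0) -> np.ndarray:
    """Force on object b that decreases the penetration volume.

    Central differences of the overlap volume w.r.t. shifting b by one
    voxel along each axis approximate the volume gradient; the force
    points down-gradient, scaled by a stiffness k. (np.roll wraps at the
    boundary, which is acceptable for interior contacts in a sketch.)
    """
    grad = np.zeros(3)
    for axis in range(3):
        vp = penetration_volume(a, np.roll(b, +1, axis=axis))
        vm = penetration_volume(a, np.roll(b, -1, axis=axis))
        grad[axis] = (vp - vm) / 2.0
    return -k * grad
```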


Subject(s)
Biofeedback, Psychology/physiology , Connective Tissue/physiology , Connective Tissue/surgery , Models, Biological , Surgery, Computer-Assisted/methods , Touch/physiology , User-Computer Interface , Computer Simulation , Elastic Modulus/physiology , Hardness/physiology , Humans
9.
Stud Health Technol Inform ; 163: 691-5, 2011.
Article in English | MEDLINE | ID: mdl-21335882

ABSTRACT

We are developing agents for positron emission tomography (PET) imaging of cancer gene mRNA expression and software to fuse mRNA PET images with anatomical computerized tomography (CT) images to enable volumetric (3D) haptic (touch-and-feel) simulation of pancreatic cancer and surrounding organs prior to surgery in a particular patient. We have identified a novel ligand specific for epidermal growth factor receptor (EGFR) to direct PET agent uptake specifically into cancer cells, and created a volumetric haptic surgical simulation of human pancreatic cancer reconstructed from patient CT data. Young's modulus and the Poisson ratio for each tissue will be adjusted to fit the experience of participating surgeons.


Subject(s)
Imaging, Three-Dimensional/methods , Models, Biological , Molecular Imaging/methods , Neoplasms/diagnostic imaging , Neoplasms/surgery , Surgery, Computer-Assisted/methods , User-Computer Interface , Computer Simulation , Drug Design , Humans , Positron-Emission Tomography/methods , Radiopharmaceuticals/chemical synthesis
10.
Article in English | MEDLINE | ID: mdl-22254667

ABSTRACT

An innovative electrocardiogram compression algorithm is presented in this paper. The proposed method is based on matrix completion, a new paradigm in signal processing that seeks to recover a low-rank matrix from a small number of observations. The low-rank matrix is obtained by normalizing the electrocardiogram records. Using matrix completion, the ECG data matrix is recovered from a small number of entries, yielding high compression ratios comparable to those obtained by existing compression techniques. The proposed scheme offers a low-complexity encoder, good tolerance to quantization noise, and good reconstruction quality.
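For concreteness, the sketch below implements plain singular value thresholding (SVT), a standard matrix-completion solver, on a toy low-rank matrix; the paper does not specify this particular solver, and the parameters here are illustrative.

```python
import numpy as np

def svt_complete(M, mask, tau=5.0, step=1.2, iters=300):
    """Singular value thresholding for low-rank matrix completion.

    M    : observed matrix (unobserved entries arbitrary)
    mask : boolean array, True where entries were kept by the encoder
    """
    Y = np.zeros_like(M)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt   # shrink singular values
        Y += step * mask * (M - X)                       # enforce observed entries
    return X

# Toy use: a rank-2 "normalized ECG" matrix observed on 40% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 60))
mask = rng.random(M.shape) < 0.4
M_hat = svt_complete(M * mask, mask)
print("relative error:", np.linalg.norm(M_hat - M) / np.linalg.norm(M))
```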


Subject(s)
Algorithms , Data Compression/methods , Diagnosis, Computer-Assisted/methods , Electrocardiography/methods , Signal Processing, Computer-Assisted , Humans , Reproducibility of Results , Sensitivity and Specificity
11.
IEEE Trans Biomed Eng ; 57(10): 2402-12, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20409985

ABSTRACT

Repolarization alternans, or T-wave alternans (TWA), is a subject of great interest, as it has been shown to be a risk stratifier for sudden cardiac death. Because TWA consists of subtle, nonvisible variations of the ST-T complex, its detection becomes more difficult in noisy environments, such as stress testing or Holter recordings. In this paper, a technique based on empirical mode decomposition (EMD) is proposed to separate the useful information of the ST-T complex from noise and artifacts. The useful part of the signal is identified by studying complexity in the EMD domain by means of the Hjorth descriptors. The result is a robust technique for extracting the trend of the ST-T complex. The method is evaluated with the spectral method (SM) over several public-domain databases containing ECGs sampled at different frequencies. The results show that the SM combined with the proposed technique outperforms the traditional SM by more than 2 dB. Moreover, the technique is robust in that it introduces no additional distortion to the detector under noiseless conditions.
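The Hjorth descriptors used here to score complexity are simple to compute from a signal's successive derivatives; the sketch below gives the standard definitions (activity, mobility, complexity). The EMD-domain selection logic itself is not reproduced.

```python
import numpy as np

def hjorth(x: np.ndarray):
    """Hjorth descriptors of a 1-D signal: activity, mobility, complexity."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x                                   # signal power
    mobility = np.sqrt(var_dx / var_x)                 # mean-frequency proxy
    complexity = np.sqrt(var_ddx / var_dx) / mobility  # deviation from a sine
    return activity, mobility, complexity
```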


Subject(s)
Electrocardiography/methods , Models, Cardiovascular , Signal Processing, Computer-Assisted , Algorithms , Artifacts , Computer Simulation , Databases, Factual , Heart Ventricles/physiopathology , Humans , Nonlinear Dynamics
12.
Comput Biol Med ; 38(1): 1-13, 2008 Jan.
Article in English | MEDLINE | ID: mdl-17669389

ABSTRACT

The electrocardiogram (ECG) is widely used for the diagnosis of heart diseases. Good-quality ECGs are utilized by physicians for interpretation and identification of physiological and pathological phenomena. In real situations, however, ECG recordings are often corrupted by artifacts. Two dominant artifacts present in ECG recordings are (1) high-frequency noise caused by electromyogram-induced noise, power-line interference, or mechanical forces acting on the electrodes, and (2) baseline wander (BW), which may be due to respiration or to motion of the patient or the instruments. These artifacts severely limit the utility of recorded ECGs and thus need to be removed for better clinical evaluation. Several methods have been developed for ECG enhancement. In this paper, we propose a new ECG enhancement method based on the recently developed empirical mode decomposition (EMD). The proposed EMD-based method is able to remove both high-frequency noise and BW with minimal signal distortion. The method is validated through experiments on the MIT-BIH databases. Both quantitative and qualitative results are given. The simulations show that the proposed EMD-based method provides very good results for denoising and BW removal.
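A minimal sketch of EMD-based denoising follows, using the third-party PyEMD package (an assumed dependency) on a synthetic signal; the crude keep/drop rule for IMFs stands in for the paper's actual selection criteria.

```python
import numpy as np
from PyEMD import EMD   # assumed dependency: the EMD-signal package

# Synthetic stand-in for a corrupted ECG: useful content plus
# high-frequency noise plus slow baseline wander.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
clean = np.sin(2 * np.pi * 1.2 * t)
noisy = clean + 0.2 * rng.standard_normal(t.size) + 0.5 * np.sin(2 * np.pi * 0.05 * t)

emd = EMD()
emd.emd(noisy)
imfs, residue = emd.get_imfs_and_residue()

# Crude selection rule standing in for the paper's criteria: discard the
# first IMF (high-frequency noise) and the residue (baseline wander).
denoised = imfs[1:].sum(axis=0)
```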


Subject(s)
Artifacts , Electrocardiography/methods , Signal Processing, Computer-Assisted , Algorithms , Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/physiopathology , Databases, Factual , Diagnosis, Computer-Assisted/methods , Humans , Reproducibility of Results
13.
IEEE Trans Neural Syst Rehabil Eng ; 15(2): 310-21, 2007 Jun.
Article in English | MEDLINE | ID: mdl-17601201

ABSTRACT

Methods to automatically convert graphics into raised-line images have recently been investigated. In this paper, concepts from previous research are extended to the vector-graphics case, producing tactile pictures in which important features are emphasized. The proposed algorithm extracts object boundaries and employs a classification process, based on a graphic's hierarchical structure, to determine critical outlines. A single parameter is introduced into the classification process, enabling users to tailor graphics to their own preferences. The resulting outlines are printed with a Braille printer to produce tactile output: critical outlines are embossed with raised dots at the maximum height, while other lines and details are embossed at a lower height. Psychophysical experiments covering discrimination, identification, and comprehension are used to evaluate and compare the proposed algorithm. Results indicate that the proposed method outperforms other methods in all three tasks. The results also show that emphasizing important features significantly increases comprehension of tactile graphics, validating the proposed method's effectiveness in conveying visual information.


Subject(s)
Algorithms , Computer Graphics , Image Interpretation, Computer-Assisted/methods , Sensory Aids , Signal Processing, Computer-Assisted , Touch , User-Computer Interface , Vision Disorders/rehabilitation , Computer Peripherals
14.
IEEE Trans Med Imaging ; 26(5): 712-27, 2007 May.
Article in English | MEDLINE | ID: mdl-17518065

ABSTRACT

Speckle is a multiplicative noise that degrades ultrasound images. Recent advances in ultrasound instrumentation and portable ultrasound devices necessitate more robust despeckling techniques, for both routine clinical practice and teleconsultation. Methods previously proposed for speckle reduction suffer from two major limitations: (1) noise attenuation is not sufficient, especially in smooth and background areas; (2) existing methods do not sufficiently preserve or enhance edges; they only inhibit smoothing near edges. In this paper, we propose a novel technique that reduces speckle more effectively than previous methods while jointly enhancing edge information, rather than merely inhibiting smoothing. The proposed method models the speckle with the Rayleigh distribution and adopts a robust maximum-likelihood estimation approach. The resulting estimator is statistically analyzed through first- and second-moment derivations. A tuning parameter that arises naturally in the estimation equation is analyzed, and an adaptive method utilizing the instantaneous coefficient of variation is proposed to adjust it. To further tailor performance, a weighted version of the proposed estimator is introduced to exploit the varying statistics of the input samples. Finally, the proposed method is evaluated and compared to well-accepted methods through simulations on synthetic and real ultrasound data.
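For orientation, under the Rayleigh model the maximum-likelihood estimate of the scale parameter from amplitude samples is sigma_hat = sqrt(mean(x^2) / 2). The sketch below applies that plain windowed estimate, which is only a non-robust, non-adaptive baseline, not the paper's weighted, robust estimator.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def rayleigh_ml_despeckle(img: np.ndarray, win: int = 7) -> np.ndarray:
    """Windowed ML estimate of the Rayleigh scale parameter.

    For amplitude samples x ~ Rayleigh(sigma), the ML estimate is
    sigma_hat = sqrt(mean(x^2) / 2); estimating it per window gives a
    simple baseline despeckled image.
    """
    local_mean_sq = uniform_filter(img.astype(float) ** 2, size=win)
    return np.sqrt(local_mean_sq / 2.0)
```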


Subject(s)
Algorithms , Artifacts , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Ultrasonography, Prenatal/methods , Computer Simulation , Humans , Likelihood Functions , Models, Biological , Models, Statistical , Reproducibility of Results , Sensitivity and Specificity
15.
IEEE Trans Biomed Eng ; 54(4): 766-9, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17405386

ABSTRACT

Most recent electrocardiogram (ECG) compression approaches based on the wavelet transform are implemented using the discrete wavelet transform. By contrast, wavelet packets (WP) are not extensively used, although they provide an adaptive decomposition for representing signals. In this paper, we present a thresholding-based method to encode ECG signals using WP. The compressor was designed with two main goals: (1) the scheme should be simple enough to allow real-time implementation; (2) the reconstructed signal should be as similar as possible to the original. The proposed scheme is versatile in that it requires neither QRS detection nor a priori signal information, and it can thus be applied to any ECG. Results show that WP perform efficiently and can now be considered an alternative in ECG compression applications.
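A minimal sketch of wavelet-packet thresholding using the PyWavelets package follows; the wavelet, depth, and keep-fraction are illustrative choices rather than the paper's design, and the toy signal merely stands in for an ECG.

```python
import numpy as np
import pywt

def wp_compress(ecg, wavelet="db4", level=4, keep=0.10):
    """Zero all but the largest `keep` fraction of wavelet-packet coefficients."""
    wp = pywt.WaveletPacket(data=ecg, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    coeffs = np.concatenate([n.data for n in nodes])
    thr = np.quantile(np.abs(coeffs), 1.0 - keep)   # global magnitude threshold
    for n in nodes:
        wp[n.path] = np.where(np.abs(n.data) >= thr, n.data, 0.0)
    return wp.reconstruct(update=False)

# Toy usage on a synthetic beat-like signal; PRD is a common ECG quality metric.
t = np.linspace(0, 1, 1024)
ecg = np.sin(2 * np.pi * 5 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)
rec = wp_compress(ecg)
print("PRD (%):", 100 * np.linalg.norm(ecg - rec[: ecg.size]) / np.linalg.norm(ecg))
```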


Subject(s)
Algorithms , Artifacts , Data Compression/methods , Electrocardiography/methods , Signal Processing, Computer-Assisted , Feasibility Studies , Humans , Reproducibility of Results , Sensitivity and Specificity
16.
IEEE Trans Image Process ; 15(12): 3636-54, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17153940

ABSTRACT

The rank information of samples is widely utilized in nonlinear signal processing algorithms. The recently developed fuzzy transformation theory introduces the concept of fuzzy ranks, which incorporates sample spread (or sample diversity) information into the sample ranking framework. A fuzzy rank thus reflects a sample's rank as well as its similarity to the other samples (namely, joint rank order and spread), and can be utilized to improve the performance of conventional rank-order-based filters. In this paper, the well-known lower-upper-middle (LUM) filters are generalized using fuzzy ranks, yielding the class of fuzzy-rank LUM (F-LUM) filters. Statistical and deterministic properties of the F-LUM filters are derived, showing that F-LUM smoothers have impulsive-noise removal capability similar to that of LUM smoothers while better preserving image details, and that F-LUM sharpeners can enhance strong edges while simultaneously preserving small variations. The performance of the F-LUM filters is evaluated on image impulsive-noise removal, sharpening, and edge-detection preprocessing. The experimental results show that the F-LUM smoothers achieve a better tradeoff between noise removal and detail preservation than the LUM smoothers, and that the F-LUM sharpeners sharpen image edges without amplifying noise or distorting fine details. The joint smoothing and sharpening operation of the general F-LUM filters also proved superior in the edge-detection preprocessing application. In conclusion, the simplicity and versatility of the F-LUM filters and their advantages over conventional LUM filters are desirable in many practical applications, showing that utilizing fuzzy ranks in filter generalization is a promising methodology.
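For reference, the crisp LUM smoother that the F-LUM filters generalize can be written in a few lines, as below; the fuzzy version replaces the hard order statistics with fuzzy-rank counterparts, which this sketch does not attempt.

```python
import numpy as np

def lum_smooth(window: np.ndarray, k: int) -> float:
    """Crisp LUM smoother on an odd-length window.

    The output is the center sample clamped between the k-th lower and
    k-th upper order statistics: k = 1 gives the identity filter, and
    k = (N + 1) // 2 gives the median filter.
    """
    s = np.sort(window)
    center = window[len(window) // 2]
    lower, upper = s[k - 1], s[len(window) - k]
    return float(np.clip(center, lower, upper))
```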


Subject(s)
Algorithms , Artifacts , Fuzzy Logic , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Information Storage and Retrieval/methods , Signal Processing, Computer-Assisted , Numerical Analysis, Computer-Assisted , Reproducibility of Results , Sensitivity and Specificity
17.
IEEE Trans Image Process ; 15(11): 3294-310, 2006 Nov.
Article in English | MEDLINE | ID: mdl-17076391

ABSTRACT

Quadratic Volterra filters are effective in image sharpening applications. The linear combination of polynomial terms, however, yields poor performance in noisy environments. Weighted median (WM) filters, in contrast, are well known for their outlier suppression and detail preservation properties. The WM sample selection methodology is naturally extended to the quadratic sample case, yielding a filter structure referred to as the quadratic weighted median (QWM), which exploits the higher-order statistics of the observed samples while remaining robust to outliers arising in the higher-order statistics of the environment noise. Through statistical analysis of higher-order samples, it is shown that, although the parent Gaussian distribution is light-tailed, the higher-order terms exhibit heavy-tailed distributions. The optimal combination of the terms contributing to a quadratic system, i.e., cross and square terms, is approached from a maximum-likelihood perspective, which yields WM processing of these terms. The proposed QWM filter structure is analyzed through determination of its output variance and breakdown probability; the studies show that the QWM exhibits lower variance and breakdown probability, indicating the robustness of the proposed structure. The performance of the QWM filter is tested on constant regions, edges, and real images, and compared to its weighted-sum dual, the quadratic Volterra filter. The simulation results show that the proposed method simultaneously suppresses noise and enhances image details. Compared with the quadratic Volterra sharpener, the QWM filter exhibits superior qualitative and quantitative performance in noisy image sharpening.
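The weighted median at the core of this structure is the value minimizing the weighted sum of absolute deviations; a minimal sketch follows. In the QWM it would be applied to the linear, cross, and square terms of the observation window, which this fragment does not reproduce.

```python
import numpy as np

def weighted_median(x: np.ndarray, w: np.ndarray) -> float:
    """Weighted median: the beta minimizing sum_i w_i * |beta - x_i|.

    Returned as the smallest sorted sample whose cumulative weight
    reaches half of the total weight.
    """
    order = np.argsort(x)
    xs, ws = x[order], w[order]
    cum = np.cumsum(ws)
    return float(xs[np.searchsorted(cum, 0.5 * ws.sum())])
```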


Subject(s)
Algorithms , Filtration/methods , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Information Storage and Retrieval/methods , Models, Statistical , Computer Simulation , Reproducibility of Results , Sensitivity and Specificity , Stochastic Processes
18.
IEEE Trans Image Process ; 15(7): 1900-15, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16830911

ABSTRACT

Partition-based weighted sum (P-WS) filtering is an effective method for processing nonstationary signals, especially those with regularly occurring structures, such as images. P-WS filters were originally formulated as hard-partition weighted sum (HP-WS) filters and were successfully applied to image denoising, though that formulation relied on intuitive arguments to generate the filter class. Here we present a statistical analysis that justifies the use of weighted-sum filters after observation-space partitioning. The HP-WS filters, however, are nondifferentiable, so an analytical solution for their global optimization is difficult to obtain. A two-stage suboptimal training procedure has been reported in the literature, but prior to this research no evaluation of the optimality of that approach had been reported. Here, a genetic algorithm (GA) HP-WS optimization procedure is developed which shows, in simulations, that the simpler two-stage training procedure yields near-optimal results. Also developed in this paper are soft-partition weighted sum (SP-WS) filters, which utilize soft, or fuzzy, partitions that yield a differentiable filtering operation, enabling gradient-based optimization procedures. Image denoising simulation results are presented comparing HP-WS and SP-WS filters, their optimization procedures, and wavelet-based image denoising. These results show that P-WS filters generally outperform traditional and wavelet-based image filters, and that SP-WS filters with soft partitioning not only allow simple optimization but also yield improved performance.
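A hedged sketch of the two-stage HP-WS idea follows, with k-means standing in for the partitioning step and ridge regression for the per-partition weighted-sum filters; the paper's actual partitioning and training procedures may differ.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

def train_hp_ws(windows: np.ndarray, targets: np.ndarray, n_parts: int = 8):
    """Two-stage HP-WS training: partition the observation space, then
    fit one weighted-sum (linear) filter per partition."""
    km = KMeans(n_clusters=n_parts, n_init=10, random_state=0).fit(windows)
    filters = []
    for p in range(n_parts):
        idx = km.labels_ == p
        filters.append(Ridge(alpha=1e-3).fit(windows[idx], targets[idx]))
    return km, filters

def apply_hp_ws(km, filters, window: np.ndarray) -> float:
    p = km.predict(window[None, :])[0]               # hard partition assignment
    return float(filters[p].predict(window[None, :])[0])

# Toy usage: 3x3 observation windows with a synthetic denoising target.
rng = np.random.default_rng(0)
windows = rng.standard_normal((2000, 9))
targets = windows.mean(axis=1)
km, filters = train_hp_ws(windows, targets)
print(apply_hp_ws(km, filters, windows[0]))
```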


Subject(s)
Algorithms , Artifacts , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Information Storage and Retrieval/methods , Models, Statistical , Signal Processing, Computer-Assisted , Computer Graphics , Computer Simulation , Filtration/methods , Numerical Analysis, Computer-Assisted , Stochastic Processes
19.
Appl Opt ; 45(12): 2697-706, 2006 Apr 20.
Article in English | MEDLINE | ID: mdl-16633419

ABSTRACT

Soft-partition-weighted-sum (Soft-PWS) filters are a class of spatially adaptive moving-window filters for signal and image restoration. Their performance is shown to be promising. However, optimization of the Soft-PWS filters has received only limited attention. Earlier work focused on a stochastic-gradient method that is computationally prohibitive in many applications. We describe a novel radial basis function interpretation of the Soft-PWS filters and present an efficient optimization procedure. We apply the filters to the problem of noise reduction. The experimental results show that the Soft-PWS filter outperforms the standard partition-weighted-sum filter and the Wiener filter.


Subject(s)
Algorithms , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Signal Processing, Computer-Assisted , Filtration/methods , Information Storage and Retrieval/methods , Reproducibility of Results , Sensitivity and Specificity
20.
IEEE Trans Image Process ; 15(4): 910-27, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16579378

ABSTRACT

The spatial and rank (SR) orderings of samples play a critical role in most signal processing algorithms. The recently introduced fuzzy ordering theory generalizes traditional, or crisp, SR ordering concepts and defines the fuzzy (spatial) samples, fuzzy order statistics, fuzzy spatial indexes, and fuzzy ranks. Here, we introduce a more general concept, the fuzzy transformation (FZT), which refers to the mapping of the crisp samples, order statistics, and SR ordering indexes to their fuzzy counterparts. We establish the element invariant and order invariant properties of the FZT. These properties indicate that fuzzy spatial samples and fuzzy order statistics constitute the same set and, under commonly satisfied membership function conditions, the sample rank order is preserved by the FZT. The FZT also possesses clustering and symmetry properties, which are established through analysis of the distributions and expectations of fuzzy samples and fuzzy order statistics. These properties indicate that the FZT incorporates sample diversity into the ordering operation, which can be utilized in the generalization of conventional filters. Here, we establish the fuzzy weighted median (FWM), fuzzy lower-upper-middle (FLUM), and fuzzy identity filters as generalizations of their crisp counterparts. The advantage of the fuzzy generalizations is illustrated in the applications of DCT coded image deblocking, impulse removal, and noisy image sharpening.
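As a loose illustration of fuzzy ranks, the sketch below softens the hard comparison used in crisp ranking with an assumed Gaussian membership function, so nearly equal samples share rank mass; the exact membership functions and definitions of the fuzzy transformation theory are not reproduced here.

```python
import numpy as np

def fuzzy_ranks(x: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Illustrative fuzzy ranks (assumed Gaussian membership).

    The crisp rank of x_i counts samples x_j <= x_i; here that hard count
    is softened so close-by samples contribute fractionally, folding
    sample spread into the ordering, in the spirit of the FZT.
    """
    diff = x[:, None] - x[None, :]                # pairwise differences x_i - x_j
    mu = np.exp(-(diff ** 2) / (2 * sigma ** 2))  # similarity in [0, 1]
    soft_leq = np.where(diff > 0, 1.0, mu)        # soften the "x_j <= x_i" test
    return soft_leq.sum(axis=1)

x = np.array([0.0, 0.1, 5.0])
print(fuzzy_ranks(x, sigma=0.5))   # the close pair shares fractional rank mass
```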


Subject(s)
Algorithms , Fuzzy Logic , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Pattern Recognition, Automated/methods , Computer Graphics , Computer Simulation , Information Storage and Retrieval/methods , Models, Statistical , Numerical Analysis, Computer-Assisted , Signal Processing, Computer-Assisted