1.
J Med Imaging (Bellingham) ; 8(Suppl 1): 019801, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33409337

ABSTRACT

[This corrects the article DOI: 10.1117/1.JMI.8.S1.S16001.].

2.
J Med Imaging (Bellingham) ; 8(Suppl 1): S16001, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33313340

ABSTRACT

Purpose: The goal of this research is to develop innovative methods of acquiring simultaneous multidimensional molecular images of several different physiological random processes (PRPs) that might all be active in a particular disease such as COVID-19. Approach: Our study is part of an ongoing effort at the University of Arizona to derive biologically accurate yet mathematically tractable models of the objects of interest in molecular imaging and of the images they produce. In both cases, the models are fully stochastic, in the sense that they provide ways to estimate any estimable property of the object or image. The mathematical tool we use for images is the characteristic function, which can be calculated if the multivariate probability density function for the image data is known. For objects, which are functions of continuous variables rather than discrete pixels or voxels, the characteristic function becomes infinite dimensional, and we refer to it as the characteristic functional. Results: Several innovative mathematical results are derived, in particular for simultaneous imaging of multiple PRPs. The applications of these methods to cancers that disrupt the mammalian target of rapamycin signaling pathway and to COVID-19 are then discussed qualitatively. One reason for choosing these two problems is that they both involve lipid rafts. Conclusions: We found that it was necessary to employ a new algorithm for energy estimation to do simultaneous single-photon emission computerized tomography imaging of a large number of different tracers. With this caveat, however, we expect to be able to acquire and analyze an unprecedented amount of molecular imaging data for an individual COVID-19 patient.

3.
J Imaging Sci Technol ; 64(6): 604081-6040811, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33967570

ABSTRACT

The performance of a convolutional neural network (CNN) on an image texture detection task is investigated as a function of linear image processing and the number of training images. Performance is quantified by the area under the receiver operating characteristic (ROC) curve (AUC). The Ideal Observer (IO) maximizes AUC but depends on high-dimensional image likelihoods. In many cases, the CNN performance can approximate the IO performance. This work demonstrates counterexamples where a full-rank linear transform degrades the CNN performance below the IO in the limit of large quantities of training data and network layers. A subsequent linear transform changes the images' correlation structure, improves the AUC, and again demonstrates the CNN dependence on linear processing. Compression strictly decreases or maintains the IO detection performance, while it can increase the CNN performance, especially for small quantities of training data. Results indicate an optimal compression ratio for the CNN based on task difficulty, compression method, and number of training images.
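The AUC figure of merit used in the abstract above can be computed directly from observer test statistics. A minimal sketch (the scores are invented for illustration, not data from the paper), using the Mann-Whitney relation AUC = P(signal score > noise score):

```python
# Empirical AUC from observer test statistics via the Mann-Whitney
# relation: AUC = P(score_signal > score_noise), ties counted as 1/2.
def empirical_auc(signal_scores, noise_scores):
    n_pairs = len(signal_scores) * len(noise_scores)
    wins = sum(
        1.0 if s > n else 0.5 if s == n else 0.0
        for s in signal_scores
        for n in noise_scores
    )
    return wins / n_pairs

# Perfectly separated scores give AUC = 1; identical distributions give 0.5.
print(empirical_auc([2.0, 3.0], [0.0, 1.0]))  # → 1.0
print(empirical_auc([1.0, 2.0], [1.0, 2.0]))  # → 0.5
```

This rank-based estimate is exactly the area obtained by tracing the empirical ROC curve, which is why it serves as a nonparametric AUC estimator.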

4.
Sci Rep ; 7(1): 15807, 2017 Nov 17.
Article in English | MEDLINE | ID: mdl-29150683

ABSTRACT

Null functions of an imaging system are functions in the object space that give exactly zero data. Hence, they represent the intrinsic limitations of the imaging system. Null functions exist in all digital imaging systems, because these systems map continuous objects to discrete data. However, the emergence of detectors that measure continuous data, e.g., particle-processing (PP) detectors, has the potential to eliminate null functions. PP detectors process signals produced by each particle and estimate particle attributes, which include two position coordinates and three components of momentum, as continuous variables. We consider Charged-Particle Emission Tomography (CPET), which relies on data collected by a PP detector to reconstruct the 3D distribution of a radioisotope that emits alpha or beta particles, and show empirically that the null functions are significantly reduced for alpha particles if at least three attributes are measured, and for beta particles if all five attributes are measured.

5.
Med Phys ; 44(6): 2478-2489, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28370094

ABSTRACT

PURPOSE: Conventional charged-particle imaging techniques, such as autoradiography, provide only two-dimensional (2D) ex vivo images of thin tissue slices. In order to get volumetric information, images of multiple thin slices are stacked. This process is time consuming and prone to distortions, as registration of 2D images is required. We propose a direct three-dimensional (3D) autoradiography technique, which we call charged-particle emission tomography (CPET). This 3D imaging technique enables imaging of thick tissue sections, thus increasing laboratory throughput and eliminating distortions due to registration. CPET also has the potential to enable in vivo charged-particle imaging with a window chamber or an endoscope. METHODS: Our approach to charged-particle emission tomography uses particle-processing detectors (PPDs) to estimate attributes of each detected particle. The attributes we estimate include location, direction of propagation, and/or the energy deposited in the detector. Estimated attributes are then fed into a reconstruction algorithm to reconstruct the 3D distribution of charged-particle-emitting radionuclides. Several setups to realize PPDs are designed. Reconstruction algorithms for CPET are developed. RESULTS: Reconstruction results from simulated data showed that a PPD enables CPET if the PPD measures more attributes than just the position from each detected particle. Experiments showed that a two-foil charged-particle detector is able to measure the position and direction of incident alpha particles. CONCLUSIONS: We proposed a new volumetric imaging technique for charged-particle-emitting radionuclides, which we have called charged-particle emission tomography (CPET). We also proposed a new class of charged-particle detectors, which we have called particle-processing detectors (PPDs). When a PPD is used to measure the direction and/or energy attributes along with the position attributes, CPET is feasible.


Subject(s)
Algorithms , Imaging, Three-Dimensional , Tomography, Emission-Computed
6.
J Med Imaging (Bellingham) ; 3(2): 023502, 2016 Apr.
Article in English | MEDLINE | ID: mdl-27175376

ABSTRACT

The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control versus the probability of normal-tissue complications as the overall radiation dose level is varied, e.g., by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. This paper shows how TOC can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy, AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. The mathematical analogy between response of observers to images and the response of tumors to distributions of a chemotherapy drug is exploited to obtain linear discriminant functions from which AUTOC can be calculated. Methods for using mathematical models of drug delivery and tumor response with imaging data to estimate patient-specific parameters that are needed for calculation of AUTOC are outlined. The implications of this viewpoint for clinical trials are discussed.
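The AUTOC computation outlined above can be sketched numerically: sweep the overall dose level, trace tumor-control probability against normal-tissue-complication probability, and integrate. The logistic dose-response curves and all parameters below are invented for illustration; they stand in for the radiobiological or drug-response models the paper refers to:

```python
import math

# Illustrative logistic dose-response curves (parameters are invented,
# not taken from the paper): both TCP and NTCP rise with dose.
def tcp(dose):   # probability of tumor control
    return 1.0 / (1.0 + math.exp(-(dose - 50.0) / 5.0))

def ntcp(dose):  # probability of normal-tissue complications
    return 1.0 / (1.0 + math.exp(-(dose - 65.0) / 7.0))

# Sweep the overall dose level to trace the TOC curve, then integrate
# TCP against NTCP with the trapezoidal rule to get AUTOC.
doses = [d * 0.5 for d in range(0, 301)]   # 0 .. 150 dose units
xs = [ntcp(d) for d in doses]              # horizontal axis of TOC
ys = [tcp(d) for d in doses]               # vertical axis of TOC
autoc = sum(0.5 * (ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i])
            for i in range(len(xs) - 1))
print(round(autoc, 3))
```

Because the illustrative TCP curve rises at lower doses than the NTCP curve, the TOC curve hugs the top of the unit square and AUTOC is close to 1; a therapy with no such dose window would give AUTOC near 0.5.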

7.
Opt Eng ; 55(1)2016 Jan.
Article in English | MEDLINE | ID: mdl-32139948

ABSTRACT

The statistics of detector outputs produced by an imaging system are derived from basic radiometric concepts and definitions. We show that a fundamental way of describing a photon-limited imaging system is in terms of a Poisson random process in spatial, angular, and wavelength variables. We begin the paper by recalling the concept of radiance in geometrical optics, radiology, physical optics, and quantum optics. The propagation and conservation laws for radiance in each of these domains are reviewed. Building upon these concepts, we distinguish four categories of imaging detectors that all respond in some way to the incident radiance, including the new category of photon-processing detectors (capable of measuring radiance on a photon-by-photon basis). This allows us to rigorously show how the concept of radiance is related to the statistical properties of detector outputs and to the information content of a single detected photon. A Monte-Carlo technique, which is derived from the Boltzmann transport equation, is presented as a way to estimate probability density functions to be used in reconstruction from photon-processing data.

8.
Phys Med Biol ; 60(18): 7359-85, 2015 Sep 21.
Article in English | MEDLINE | ID: mdl-26350439

ABSTRACT

Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and implemented for graphics processing units (GPUs). Further, this approach leverages another important advantage of PP systems, namely the possibility to perform photon-by-photon real-time reconstruction. We demonstrate the application of the approach to perform reconstruction in a simulated 2D SPECT system. The results help to validate and demonstrate the utility of the proposed method and show that PP systems can help overcome the aliasing artifacts that are otherwise intrinsically present in PC systems.


Subject(s)
Algorithms , Image Processing, Computer-Assisted/methods , Nuclear Medicine , Phantoms, Imaging , Photons , Tomography, Emission-Computed, Single-Photon/methods , Computer Simulation , Humans , Signal Processing, Computer-Assisted
9.
Proc SPIE Int Soc Opt Eng ; 9412, 2015 Feb 21.
Article in English | MEDLINE | ID: mdl-26166931

ABSTRACT

There are two basic sources of uncertainty in cancer chemotherapy: how much of the therapeutic agent reaches the cancer cells, and how effective it is in reducing or controlling the tumor when it gets there. There is also a concern about adverse effects of the therapy drug. Similarly in external-beam radiation therapy or radionuclide therapy, there are two sources of uncertainty: delivery and efficacy of the radiation absorbed dose, and again there is a concern about radiation damage to normal tissues. The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control vs. the probability of normal-tissue complications as the overall radiation dose level is varied, e.g. by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. The TOC can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. In this paper we discuss the potential of using mathematical models of drug delivery and tumor response with imaging data to estimate AUTOC for chemotherapy, again for a single patient. This approach provides a basis for truly personalized therapy and for rigorously assessing and optimizing the therapy regimen for the particular patient. A key role is played by Emission Computed Tomography (PET or SPECT) of radiolabeled chemotherapy drugs.

10.
Proc SPIE Int Soc Opt Eng ; 9193, 2014 Aug 17.
Article in English | MEDLINE | ID: mdl-27478293

ABSTRACT

A fundamental way of describing a photon-limited imaging system is in terms of a Poisson random process in spatial, angular and wavelength variables. The mean of this random process is the spectral radiance. The principle of conservation of radiance then allows a full characterization of the noise in the image (conditional on viewing a specified object). To elucidate these connections, we first review the definitions and basic properties of radiance as defined in terms of geometrical optics, radiology, physical optics and quantum optics. The propagation and conservation laws for radiance in each of these domains are reviewed. Then we distinguish four categories of imaging detectors that all respond in some way to the incident radiance, including the new category of photon-processing detectors. The relation between the radiance and the statistical properties of the detector output is discussed and related to task-based measures of image quality and the information content of a single detected photon.

11.
Article in English | MEDLINE | ID: mdl-26347396

ABSTRACT

We introduce and discuss photon-processing detectors and we compare them with photon-counting detectors. By estimating a relatively small number of attributes for each collected photon, photon-processing detectors may help understand and solve a fundamental theoretical problem of any imaging system based on photon-counting detectors, namely null functions. We argue that photon-processing detectors can improve task performance by estimating position, energy, and time of arrival for each collected photon. We consider a continuous-to-continuous linear operator to relate the object being imaged to the collected data, and discuss how this operator can be analyzed to derive properties of the imaging system. Finally, we derive an expression for the characteristic functional of an imaging system that produces list-mode data.
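The characteristic functional mentioned in the last sentence has a well-known closed form when the list-mode data arise from a Poisson random process. As a point of orientation (this is the standard textbook result, not the paper's specific derivation), for a Poisson point process with intensity λ(x) and the linear data functional u = Σₙ ξ(xₙ):

```latex
% Characteristic functional of a Poisson point process with
% intensity \lambda(\mathbf{x}), evaluated on a test function \xi:
\Psi[\xi] \;=\;
\exp\!\left\{ \int \lambda(\mathbf{x})
  \left[ e^{\,i\,\xi(\mathbf{x})} - 1 \right] \mathrm{d}\mathbf{x} \right\}
```

Setting ξ to a constant recovers the ordinary characteristic function of the Poisson-distributed total number of counts, which is the sanity check connecting the functional back to the finite-dimensional case.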

12.
J Opt Soc Am A Opt Image Sci Vis ; 29(6): 1003-16, 2012 Jun 01.
Article in English | MEDLINE | ID: mdl-22673432

ABSTRACT

A theoretical framework for detection or discrimination tasks with list-mode data is developed. The object and imaging system are rigorously modeled via three random mechanisms: randomness of the object being imaged, randomness in the attribute vectors, and, finally, randomness in the attribute vector estimates due to noise in the detector outputs. By considering the list-mode data themselves, the theory developed in this paper yields a manageable expression for the likelihood of the list-mode data given the object being imaged. This, in turn, leads to an expression for the optimal Bayesian discriminant. Figures of merit for detection tasks via the ideal and optimal linear observers are derived. A concrete example discusses detection performance of the optimal linear observer for the case of a known signal buried in a random lumpy background.


Subject(s)
Models, Theoretical , Photons , Optical Phenomena , Poisson Distribution , Quality Control , Stochastic Processes
13.
Proc SPIE Int Soc Opt Eng ; 8447, 2012 Jul 01.
Article in English | MEDLINE | ID: mdl-26347393

ABSTRACT

The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets with masses less than 13 times that of Jupiter have been imaged to date. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO, has a predicted contrast performance roughly a thousand times poorer than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March 2012.

14.
IEEE Trans Nucl Sci ; 57(3): 1077-1084, 2010 Jun 01.
Article in English | MEDLINE | ID: mdl-20824155

ABSTRACT

A fast search algorithm capable of operating in multi-dimensional spaces is introduced. As a sample application, we demonstrate its utility in the 2D and 3D maximum-likelihood position-estimation problem that arises in the processing of PMT signals to derive interaction locations in compact gamma cameras. We demonstrate that the algorithm can be parallelized in pipelines, and thereby efficiently implemented in specialized hardware, such as field-programmable gate arrays (FPGAs). A 2D implementation of the algorithm is achieved in Cell/BE processors, resulting in processing speeds above one million events per second, which is a 20× increase in speed over a conventional desktop machine. Graphics processing units (GPUs) are used for a 3D application of the algorithm, resulting in processing speeds of nearly 250,000 events per second which is a 250× increase in speed over a conventional desktop machine. These implementations indicate the viability of the algorithm for use in real-time imaging applications.
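The multi-dimensional fast search described above can be illustrated with a contracting-grid sketch: evaluate the objective on a coarse grid, re-center a smaller grid on the best point, and repeat. The objective below is a made-up smooth surrogate for a log-likelihood, not a real camera model of PMT signals:

```python
# Toy contracting-grid maximizer: each iteration evaluates f on an
# n x n grid spanning the current search box, moves the center to the
# best grid point, and shrinks the box. All parameters are illustrative.
def contracting_grid_max(f, lo, hi, n=5, iters=20, shrink=0.5):
    center = [(l + h) / 2.0 for l, h in zip(lo, hi)]
    span = [(h - l) / 2.0 for l, h in zip(lo, hi)]
    for _ in range(iters):
        best, best_val = center, f(center)
        steps = [2 * i / (n - 1) - 1 for i in range(n)]  # -1 .. 1
        for sx in steps:
            for sy in steps:
                p = [center[0] + sx * span[0], center[1] + sy * span[1]]
                v = f(p)
                if v > best_val:
                    best, best_val = p, v
        center = best
        span = [s * shrink for s in span]  # contract the grid
    return center

# Surrogate objective with its peak at (1.2, -0.7).
f = lambda p: -((p[0] - 1.2) ** 2 + (p[1] + 0.7) ** 2)
est = contracting_grid_max(f, [-5.0, -5.0], [5.0, 5.0])
print([round(x, 3) for x in est])  # → [1.2, -0.7]
```

The appeal for hardware implementation is that every grid evaluation within an iteration is independent, so the inner double loop maps naturally onto parallel pipelines.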

15.
IEEE Nucl Sci Symp Conf Rec (1997) ; 2010: 2643-2647, 2010 Oct.
Article in English | MEDLINE | ID: mdl-21841906

ABSTRACT

Current thick detectors used in medical imaging allow recording many attributes, such as the 3D location of interaction within the scintillation crystal and the amount of energy deposited. An efficient way of dealing with these data is by storing them in list-mode (LM). To reconstruct the data, maximum-likelihood expectation-maximization (MLEM) is efficiently applied to the list-mode data, resulting in the list-mode maximum-likelihood expectation-maximization (LMMLEM) reconstruction algorithm. In this work, we consider a PET system consisting of two thick detectors facing each other. PMT outputs are collected for each coincidence event and are used to perform 3D maximum-likelihood (ML) position estimation of the location of interaction. The mathematical properties of the ML estimation allow accurate modeling of the detector blur and provide a theoretical framework for the subsequent estimation step, namely the LMMLEM reconstruction. Indeed, a rigorous statistical model for the detector output can be obtained from calibration data and used in the calculation of the conditional probability density functions for the interaction location estimates. Our implementation of the 3D ML position estimation takes advantage of graphics processing unit (GPU) hardware and permits accurate real-time estimates of the position of interaction. The LMMLEM algorithm is then applied to the list of position estimates, and the 3D radiotracer distribution is reconstructed on a voxel grid.
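The MLEM update at the heart of this reconstruction can be sketched on a toy binned system. The 2 × 2 system matrix and noiseless data below are invented for illustration; the list-mode variant in the paper replaces the sum over detector bins with a sum over list entries:

```python
# Minimal binned MLEM sketch: the classic multiplicative update
# lam_j <- (lam_j / s_j) * sum_i H_ij * g_i / (H lam)_i,
# where s_j is the sensitivity of voxel j. Toy 2x2 system, noiseless data.
def mlem(H, g, n_iter=200):
    n_vox = len(H[0])
    sens = [sum(H[i][j] for i in range(len(H))) for j in range(n_vox)]
    lam = [1.0] * n_vox                      # uniform initial estimate
    for _ in range(n_iter):
        proj = [sum(H[i][j] * lam[j] for j in range(n_vox))
                for i in range(len(H))]      # forward projection
        lam = [lam[j] / sens[j] *
               sum(H[i][j] * g[i] / proj[i] for i in range(len(H)))
               for j in range(n_vox)]        # multiplicative update
    return lam

H = [[0.9, 0.1], [0.1, 0.9]]                 # 2 detector bins, 2 voxels
truth = [4.0, 1.0]
g = [sum(H[i][j] * truth[j] for j in range(2)) for i in range(2)]
recon = mlem(H, g)
print([round(x, 3) for x in recon])          # → [4.0, 1.0]
```

With noiseless, consistent data and a well-conditioned system the iteration recovers the true activity exactly; the update also preserves non-negativity, which is why MLEM is the workhorse for emission tomography.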

16.
Opt Express ; 17(13): 10946-58, 2009 Jun 22.
Article in English | MEDLINE | ID: mdl-19550494

ABSTRACT

Detection of signals in noisy images is necessary in many applications, including astronomy and medical imaging. The optimal linear observer for performing a detection task, called the Hotelling observer in the medical literature, can be regarded as a generalization of the familiar prewhitening matched filter. Performance on the detection task is limited by randomness in the image data, which stems from randomness in the object, randomness in the imaging system, and randomness in the detector outputs due to photon and readout noise, and the Hotelling observer accounts for all of these effects in an optimal way. If multiple temporal frames of images are acquired, the resulting data set is a spatio-temporal random process, and the Hotelling observer becomes a spatio-temporal linear operator. This paper discusses the theory of the spatio-temporal Hotelling observer and estimation of the required spatio-temporal covariance matrices. It also presents a parallel implementation of the observer on a cluster of Sony PLAYSTATION 3 gaming consoles. As an example, we consider the use of the spatio-temporal Hotelling observer for exoplanet detection.
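For known data covariance K and mean signal difference Δs, the Hotelling observer described above reduces to the prewhitened template w = K⁻¹Δs, applied as the linear test statistic t(g) = wᵀg, with detectability SNR² = ΔsᵀK⁻¹Δs. A toy two-pixel sketch with illustrative numbers:

```python
# Hotelling (prewhitening matched filter) observer in two "pixels".
def solve2(K, b):
    # Solve the 2x2 system K x = b by Cramer's rule.
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(b[0] * K[1][1] - b[1] * K[0][1]) / det,
            (K[0][0] * b[1] - K[1][0] * b[0]) / det]

K = [[2.0, 0.5], [0.5, 1.0]]        # covariance of the image data
ds = [1.0, 0.5]                     # mean signal difference, delta-s
w = solve2(K, ds)                   # Hotelling template w = K^{-1} ds
snr2 = ds[0] * w[0] + ds[1] * w[1]  # Hotelling SNR^2

def t(g):                           # linear test statistic t(g) = w . g
    return w[0] * g[0] + w[1] * g[1]

print(round(snr2, 4))               # → 0.5714
print(round(t(ds), 4) == round(snr2, 4))  # t(ds) equals SNR^2 by construction
```

In the spatio-temporal setting of the paper, g is the stacked vector of all frames and K becomes the spatio-temporal covariance matrix, but the algebra is identical.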


Subject(s)
Image Interpretation, Computer-Assisted , Optics and Photonics , Pattern Recognition, Automated/methods , Algorithms , Artificial Intelligence , Computer Graphics , Diagnostic Imaging/methods , Image Interpretation, Computer-Assisted/methods , Models, Statistical , Programming Languages , Signal Processing, Computer-Assisted , Software , Time Factors , User-Computer Interface , Video Games
17.
IEEE Nucl Sci Symp Conf Rec (1997) ; 2009: 4072, 2009 Oct 24.
Article in English | MEDLINE | ID: mdl-21278803

ABSTRACT

The scintillation detectors commonly used in SPECT and PET imaging and in Compton cameras require estimation of the position and energy of each gamma-ray interaction. Ideally, this process would yield images with no spatial distortion and the best possible spatial resolution. In addition, especially for Compton cameras, the computation must yield the best possible estimate of the energy of each interacting gamma ray. These goals can be achieved by use of maximum-likelihood (ML) estimation of the event parameters, but in the past the search for an ML estimate has not been computationally feasible. Now, however, graphics processing units (GPUs) make it possible to produce optimal, real-time estimates of position and energy, even from scintillation cameras with a large number of photodetectors. In addition, the mathematical properties of ML estimates make them very attractive for use as list entries in list-mode ML image reconstruction. This two-step ML process (using ML estimation once to get the list data and again to reconstruct the object) allows accurate modeling of the detector blur and, potentially, considerable improvement in reconstructed spatial resolution.

18.
IEEE Nucl Sci Symp Conf Rec (1997) ; 2008: 5548-5551, 2008 Oct.
Article in English | MEDLINE | ID: mdl-26778913

ABSTRACT

In this paper, we consider a prototype of an adaptive SPECT system, and we use simulation to objectively assess the system's performance with respect to a conventional, non-adaptive SPECT system. Objective performance assessment is investigated for a clinically relevant task: the detection of tumor necrosis at a known location and in a random lumpy background. The iterative maximum-likelihood expectation-maximization (MLEM) algorithm is used to perform image reconstruction. We carried out human observer studies on the reconstructed images and compared the probability of correct detection when the data are generated with the adaptive system as opposed to the non-adaptive system. Task performance is also assessed by using a channelized Hotelling observer, and the area under the receiver operating characteristic curve is the figure of merit for the detection task. Our results show a large performance improvement of adaptive systems versus non-adaptive systems and motivate further research in adaptive medical imaging.

19.
J Opt Soc Am A Opt Image Sci Vis ; 24(12): B13-24, 2007 Dec.
Article in English | MEDLINE | ID: mdl-18059905

ABSTRACT

The ideal linear discriminant or Hotelling observer is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves.


Subject(s)
Astronomy/methods , Expert Systems , Linear Models , Signal Detection, Psychological , Artifacts , Astronomy/instrumentation , Astronomy/statistics & numerical data , Equipment Design , Extraterrestrial Environment , Humans , Observer Variation , Optics and Photonics/instrumentation , Pattern Recognition, Automated , Photogrammetry/instrumentation , Photogrammetry/methods , Poisson Distribution , ROC Curve , Reproducibility of Results , Stochastic Processes , Task Performance and Analysis
20.
Proc SPIE Int Soc Opt Eng ; 6272: 62721W, 2006 Jan 01.
Article in English | MEDLINE | ID: mdl-20890393

ABSTRACT

In objective or task-based assessment of image quality, figures of merit are defined by the performance of some specific observer on some task of scientific interest. This methodology is well established in medical imaging but is just beginning to be applied in astronomy. In this paper we survey the theory needed to understand the performance of ideal or ideal-linear (Hotelling) observers on detection tasks with adaptive-optical data. The theory is illustrated by discussing its application to detection of exoplanets from a sequence of short-exposure images.
