Results 1 - 13 of 13
1.
Preprint in English | medRxiv | ID: ppmedrxiv-22271399

ABSTRACT

Background: In October 2020, the National Cancer Institute (NCI) Serological Sciences Network (SeroNet) was established to study the immune response to COVID-19 and "to develop, validate, improve, and implement serological testing and associated technologies." SeroNet comprises 25 participating research institutions partnering with the Frederick National Laboratory for Cancer Research (FNLCR) and the SeroNet Coordinating Center. Since its inception, SeroNet has supported collaborative development and sharing of COVID-19 serological assay procedures and has set forth plans for assay harmonization.

Methods: To facilitate collaboration and procedure sharing, a detailed survey was sent to collate comprehensive assay details and performance metrics on COVID-19 serological assays within SeroNet. In addition, FNLCR established a protocol to calibrate SeroNet serological assays to reference standards, such as the U.S. SARS-CoV-2 serology standard reference material and the First WHO International Standard (IS) for anti-SARS-CoV-2 immunoglobulin (20/136), to facilitate harmonization of assay reporting units and cross-comparison of study data.

Results: SeroNet institutions reported development of a total of 27 ELISA methods, 13 multiplex assays, and 9 neutralization assays, and use of 12 different commercial serological methods. FNLCR developed a standardized protocol for SeroNet institutions to calibrate these diverse serological assays to reference standards.

Conclusions: SeroNet institutions have established a diverse array of COVID-19 serological assays to study the immune response to SARS-CoV-2 virus and vaccines. Calibrating these assays to harmonize results reporting will facilitate future pooled data analyses and cross-study comparisons.
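
The calibration described here is, at its simplest, a unit conversion anchored to a reference standard. The sketch below is illustrative only: it assumes a single conversion factor and dilutional linearity, and all readings are invented; 1000 BAU/mL is the unitage assigned to the WHO IS 20/136.

```python
# Illustrative sketch of harmonizing assay units to a reference standard.
# Assumes dilutional linearity, so one factor maps in-house AU/mL to
# WHO binding antibody units (BAU/mL). All readings below are hypothetical.

def calibration_factor(assigned_bau_per_ml, measured_au_per_ml):
    """Factor derived by assaying the reference standard itself."""
    return assigned_bau_per_ml / measured_au_per_ml

def to_bau(sample_au_per_ml, factor):
    """Convert a sample's in-house units to harmonized BAU/mL."""
    return sample_au_per_ml * factor

# The WHO IS 20/136 (assigned 1000 BAU/mL) hypothetically reads 2500 AU/mL
# on an in-house assay, giving a factor of 0.4.
factor = calibration_factor(1000.0, 2500.0)
harmonized = [to_bau(x, factor) for x in (250.0, 1250.0, 5000.0)]
```

With every lab reporting in BAU/mL this way, results from different assays become directly comparable, which is the point of the harmonization protocol.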

2.
Preprint in English | medRxiv | ID: ppmedrxiv-21261592

ABSTRACT

Background: SARS-CoV-2 seroprevalence studies have largely focused on adults, but little is known about spread in children. We determined SARS-CoV-2 seroprevalence in children and adolescents from Arkansas over the first year of the COVID-19 pandemic.

Methods: We tested remnant serum samples from children aged 1-18 years who visited Arkansas hospitals or clinics for non-COVID-19-related reasons from April 2020 through April 2021 for SARS-CoV-2 antibodies. We used univariable and multivariable regression models to determine the association between seropositivity and participant characteristics.

Results: Among 2400 participants, seroprevalence rose from 7.9% in April/May 2020 (95% CI, 4.9-10.9%) to 25.8% in April 2021 (95% CI, 22.2-29.3%). Hispanic and Black children were significantly more likely to be seropositive than white children in multiple sampling periods.

Conclusions: By spring 2021, most children in Arkansas had not been infected with SARS-CoV-2. With the emergence of SARS-CoV-2 variants, recognition of the long-term effects of COVID-19, and the lack of an authorized pediatric SARS-CoV-2 vaccine, these results highlight the importance of including children in SARS-CoV-2 public health, clinical care, and research strategies. These findings are important for state and local officials as they consider measures to limit SARS-CoV-2 spread in schools and daycares for the 2021-2022 school year.
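
The interval estimates quoted above can be reproduced in miniature with a standard binomial (Wald) confidence interval; the counts below are hypothetical, not the study's actual per-period numbers.

```python
import math

# Wald 95% confidence interval for a seroprevalence point estimate.
# Counts are hypothetical; the study's per-period sample sizes differ.

def wald_ci(positives, n, z=1.96):
    p = positives / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# E.g. 155 seropositive of 600 sampled gives roughly 25.8% (about 22.3-29.3%).
p, lo, hi = wald_ci(155, 600)
```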

3.
Pediatr Blood Cancer ; 67(11): e28665, 2020 11.
Article in English | MEDLINE | ID: mdl-32827342

ABSTRACT

Recent clinical trials have moved iodine-131 (I-131) metaiodobenzylguanidine (MIBG) therapy into frontline management of high-risk neuroblastoma. With this expansion, it is reasonable to anticipate the need for intensive care level resuscitations. Radiation exposure remains the greatest risk to health care professionals managing these patients. We combined shock simulation scenario data with actual radiation dosimetry data to create a care model allowing for aggressive, prolonged in situ resuscitation of a critically ill pediatric patient after I-131 MIBG administration. This model will maintain a critical care provider's radiation level below 10% of the annual occupational dose limit (5 mSv, 500 mrem) per patient managed.
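
The dose-budget arithmetic behind the 10% figure is simple to sketch; the bedside dose rate and duration below are hypothetical illustration values, not the study's dosimetry data.

```python
# Back-of-envelope check of the per-patient dose budget described above:
# 10% of the 50 mSv annual occupational limit is 5 mSv (500 mrem).
# The dose rate and duration are hypothetical illustration values.

ANNUAL_LIMIT_MSV = 50.0
PER_PATIENT_BUDGET_MSV = 0.10 * ANNUAL_LIMIT_MSV

def cumulative_dose_msv(dose_rate_msv_per_h, hours):
    return dose_rate_msv_per_h * hours

# A provider at a hypothetical bedside dose rate of 2 mSv/h could sustain
# a 2-hour resuscitation and stay within the per-patient budget.
dose = cumulative_dose_msv(2.0, 2.0)
within_budget = dose <= PER_PATIENT_BUDGET_MSV
```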


Subjects
3-Iodobenzylguanidine/adverse effects, Critical Illness/therapy, Iodine Radioisotopes/adverse effects, Statistical Models, Neuroblastoma/radiotherapy, Patient-Centered Care/standards, Radiation Exposure/standards, 3-Iodobenzylguanidine/administration & dosage, Child, Critical Care/standards, Critical Illness/epidemiology, Female, Humans, Intravenous Infusions, Iodine Radioisotopes/administration & dosage, Michigan/epidemiology, Prognosis, Radiopharmaceuticals/administration & dosage, Radiopharmaceuticals/adverse effects, Radiotherapy Dosage
4.
PLoS One ; 9(11): e111625, 2014.
Article in English | MEDLINE | ID: mdl-25390888

ABSTRACT

Today, while many researchers focus on improving the regularization term in iterative reconstruction (IR) algorithms, less attention is paid to improving the fidelity term. In this paper, we hypothesize that improving the fidelity term will further improve IR image quality in low-dose scanning, which typically produces more noise. The purpose of this paper is to systematically test and examine the role of high-fidelity system models using raw data in the performance of an iterative image reconstruction approach that minimizes an energy functional. We first isolated the fidelity term and analyzed the importance of using focal spot area modeling, flying focal spot location modeling, and active detector area modeling, as opposed to flying focal spot motion alone. We then compared images using different permutations of all three factors. Next, we tested the ability of the fidelity terms to retain signals upon application of the regularization term with all three factors. We then compared the differences between images generated by the proposed method and filtered back-projection. Lastly, we compared images of low-dose in vivo data using filtered back-projection, Iterative Reconstruction in Image Space, and the proposed method using raw data. The initial comparison of difference maps of the reconstructed images showed that the focal spot area model and the active detector area model also have significant impacts on image quality. Upon application of the regularization term, images generated using all three factors substantially decreased model-mismatch error, artifacts, and noise. When the images generated by the proposed method were tested, conspicuity greatly increased, noise standard deviation decreased by 90% in homogeneous regions, and resolution also greatly improved. In conclusion, improving the fidelity term to model clinical scanners is essential to generating higher-quality images in low-dose imaging.
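
The fidelity-plus-regularization energy at the heart of this framing can be shown on a toy problem. This sketch is not the paper's scanner model: A is a random matrix standing in for the system model, and the regularizer is plain Tikhonov.

```python
import numpy as np

# Toy energy-minimization reconstruction: minimize the fidelity term
# ||A x - y||^2 plus a Tikhonov regularization term lam * ||x||^2.
# A is a random stand-in for a scanner's system model, not a clinical one.

rng = np.random.default_rng(0)
n_meas, n_pix = 40, 20
A = rng.standard_normal((n_meas, n_pix))              # "system model"
x_true = rng.standard_normal(n_pix)                   # "object"
y = A @ x_true + 0.05 * rng.standard_normal(n_meas)   # noisy raw data

lam = 0.1
# The quadratic energy has a closed-form minimizer (normal equations):
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_pix), A.T @ y)
rel_err = float(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

A more faithful fidelity term (here, a better A) lowers the model-mismatch component of the error, which is the paper's central claim.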


Subjects
Computer-Assisted Radiographic Image Interpretation/methods, X-Ray Computed Tomography/methods, Algorithms, Artifacts, Cone-Beam Computed Tomography/methods, Humans, Computer-Assisted Image Processing, Theoretical Models, Motion (Physics), Imaging Phantoms, Radiation Doses, Reproducibility of Results, Computer-Assisted Signal Processing
5.
IEEE Trans Image Process ; 17(5): 645-56, 2008 May.
Article in English | MEDLINE | ID: mdl-18390371

ABSTRACT

In this paper, we present a complete and practical algorithm for the approximation of level-set-based curve evolution suitable for real-time implementation. In particular, we propose a two-cycle algorithm that approximates level-set-based curve evolution without the need to solve partial differential equations (PDEs). Our algorithm is applicable to a broad class of evolution speeds that can be viewed as composed of a data-dependent term and a curve-smoothness regularization term. We achieve curve evolution corresponding to such evolution speeds by separating the evolution process into two cycles: one cycle for the data-dependent term and a second cycle for the smoothness regularization, the latter derived from a Gaussian filtering process. In both cycles, the evolution is realized through a simple element-switching mechanism between two linked lists that implicitly represent the curve using an integer-valued level-set function. By careful construction, all the key evolution steps require only integer operations. As a consequence, we obtain significant computational speedups compared with exact PDE-based approaches while retaining excellent agreement with those methods on problems of practical engineering interest. In particular, the resulting algorithm is fast enough for real-time video processing applications, which we demonstrate through several image segmentation and video tracking experiments.
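
A heavily simplified sketch of the PDE-free evolution idea: pixels flip in or out of the region according to the sign of a data speed. The real algorithm's two linked lists (Lin/Lout), integer level-set values, and Gaussian smoothing cycle are all omitted, and the speed here is a bare intensity threshold.

```python
import numpy as np

# Toy PDE-free curve evolution: grow/shrink a region by flipping pixels
# according to the sign of a threshold-based data speed. A caricature of
# the two-list element-switching algorithm (no lists, no smoothing cycle).

def evolve(img, thresh, seed, n_iter=100):
    inside = np.zeros(img.shape, dtype=bool)
    inside[seed] = True
    speed_pos = img > thresh          # speed > 0: pixel wants to be inside
    for _ in range(n_iter):
        # 4-neighborhood of the current region
        nb = np.zeros_like(inside)
        nb[1:, :] |= inside[:-1, :]
        nb[:-1, :] |= inside[1:, :]
        nb[:, 1:] |= inside[:, :-1]
        nb[:, :-1] |= inside[:, 1:]
        grow = nb & ~inside & speed_pos      # switch_in analogue
        shrink = inside & ~speed_pos         # switch_out analogue (data term)
        new_inside = (inside | grow) & ~shrink
        if np.array_equal(new_inside, inside):
            break                            # converged
        inside = new_inside
    return inside

# Segment a bright disk on a dark background from a seed at its center.
yy, xx = np.mgrid[0:40, 0:40]
disk = ((yy - 20) ** 2 + (xx - 20) ** 2) <= 100
img = np.where(disk, 1.0, 0.0)
mask = evolve(img, 0.5, (20, 20))
```

Only boolean operations on pixels adjacent to the region drive each sweep; the full list-based version keeps those boundary pixels in explicit lists so each step costs only the boundary length, in integer arithmetic, which is what enables real-time rates.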


Assuntos
Algoritmos , Inteligência Artificial , Interpretação de Imagem Assistida por Computador/métodos , Reconhecimento Automatizado de Padrão/métodos , Gravação em Vídeo/métodos , Sistemas Computacionais , Aumento da Imagem/métodos , Reprodutibilidade dos Testes , Sensibilidade e Especificidade
6.
IEEE Trans Image Process ; 17(5): 757-66, 2008 May.
Article in English | MEDLINE | ID: mdl-18390380

ABSTRACT

In this paper, we develop a new unified approach for laser radar range anomaly suppression, range profiling, and segmentation. This approach combines an object-based hybrid scene model for representing the range distribution of the field and a statistical mixture model for the range data measurement noise. The image segmentation problem is formulated as a minimization problem which jointly estimates the target boundary together with the target region range variation and background range variation directly from the noisy and anomaly-filled range data. This formulation allows direct incorporation of prior information concerning the target boundary, target ranges, and background ranges into an optimal reconstruction process. Curve evolution techniques and a generalized expectation-maximization algorithm are jointly employed as an efficient solver for minimizing the objective energy, resulting in a coupled pair of object and intensity optimization tasks. The method directly and optimally extracts the target boundary, avoiding a suboptimal two-step process involving image smoothing followed by boundary extraction. Experiments are presented demonstrating that the proposed approach is robust to anomalous pixels (missing data) and capable of producing accurate estimation of the target boundary and range values from noisy data.
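
The measurement-noise mixture can be illustrated in one dimension: a Gaussian component around the true range plus a uniform anomaly component, fitted by EM. This stands in for only the statistical half of the paper's coupled curve-evolution/EM machinery, and all values are synthetic.

```python
import numpy as np

# Toy 1-D version of the range-data mixture: good returns ~ N(true_range,
# sigma^2), anomalies ~ Uniform(r_min, r_max). EM estimates the range and
# each measurement's probability of being a good return. Synthetic data.

rng = np.random.default_rng(1)
true_range, sigma = 50.0, 1.0
r_min, r_max = 0.0, 100.0
good = true_range + sigma * rng.standard_normal(180)
anomalies = rng.uniform(r_min, r_max, 20)     # dropouts / false returns
z = np.concatenate([good, anomalies])

mu, w = float(np.median(z)), 0.9              # init: range, P(good return)
for _ in range(30):
    # E-step: responsibility of the Gaussian ("good") component
    g = w * np.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    u = (1 - w) / (r_max - r_min)
    resp = g / (g + u)
    # M-step: re-estimate the range and the mixture weight
    mu = float(np.sum(resp * z) / np.sum(resp))
    w = float(resp.mean())
```

The responsibilities `resp` play the role of the anomaly-suppression weights: far-off returns contribute almost nothing to the range estimate.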


Assuntos
Algoritmos , Inteligência Artificial , Aumento da Imagem/métodos , Interpretação de Imagem Assistida por Computador/métodos , Lasers , Reconhecimento Automatizado de Padrão/métodos , Radar , Artefatos , Simulação por Computador , Imageamento Tridimensional/métodos , Armazenamento e Recuperação da Informação/métodos , Funções Verossimilhança , Modelos Estatísticos , Reprodutibilidade dos Testes , Sensibilidade e Especificidade
7.
J Opt Soc Am A Opt Image Sci Vis ; 24(12): 3762-71, 2007 Dec.
Article in English | MEDLINE | ID: mdl-18059929

ABSTRACT

Spectral self-interference microscopy (SSM) relies on the balanced collection of light traveling two different paths from the sample to the detector, one direct and the other indirect from a reflecting substrate. The resulting spectral interference effects allow nanometer-scale axial localization of isolated emitters. To produce spectral fringes the difference between the two optical paths must be significant. Consequently, to ensure that both contributions are in focus, a low-numerical-aperture objective lens must be used, giving poor lateral resolution. Here this limitation is overcome using a 4Pi apparatus to produce the requisite two paths to the detector. The resulting instrument generalizes both SSM and 4Pi microscopy and allows a quantification of SSM resolution (rather than localization precision). Specifically, SSM is shown to be subject to the same resolution constraints as 4Pi microscopy.
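
The axial-localization principle can be caricatured with an idealized fringe model: if the direct and reflected paths differ by twice the emitter height d (medium index n), the spectrum is modulated roughly as cos(2π·2nd·k) in wavenumber k, so the fringe period encodes d. This toy model ignores the reflection phase, the spectral envelope, and the collection optics, and all values are invented.

```python
import numpy as np

# Idealized spectral self-interference fringes: cos(2*pi * 2*n*d * k) in
# wavenumber k = 1/lambda encodes the emitter height d above the reflector.
# Ignores reflection phase and the emission envelope; values illustrative.

n_medium = 1.0
d_true = 10e-6                          # emitter height above the mirror (m)
k = np.linspace(1.5e6, 1.9e6, 2048)     # wavenumbers across the band (1/m)
spectrum = 1.0 + np.cos(2 * np.pi * (2 * n_medium * d_true) * k)

# Recover the optical path 2*n*d from the dominant fringe frequency.
s = spectrum - spectrum.mean()
amps = np.abs(np.fft.rfft(s))
freqs = np.fft.rfftfreq(k.size, d=k[1] - k[0])
path_hat = float(freqs[int(np.argmax(amps))])
d_hat = path_hat / (2 * n_medium)
```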


Assuntos
Aumento da Imagem/instrumentação , Microscopia de Interferência/instrumentação , Desenho de Equipamento , Análise de Falha de Equipamento , Imageamento Tridimensional/instrumentação , Lasers Semicondutores , Lentes , Valores de Referência , Refratometria/instrumentação , Refratometria/métodos , Sensibilidade e Especificidade
8.
J Opt Soc Am A Opt Image Sci Vis ; 24(11): 3587-99, 2007 Nov.
Article in English | MEDLINE | ID: mdl-17975585

ABSTRACT

A theoretical and numerical analysis of spectral self-interference microscopy (SSM) is presented with the goal of expanding the realm of SSM applications. In particular, this work is intended to enable SSM imaging in low-signal applications such as single-molecule studies. A comprehensive electromagnetic model for SSM is presented, allowing arbitrary forms of the excitation field, detection optics, and tensor sample response. An evanescently excited SSM system, analogous to total internal reflection microscopy, is proposed and investigated through Monte Carlo simulations. Nanometer-scale axial localization for single-emitter objects is demonstrated, even in low-signal environments. The capabilities of SSM in imaging more general objects are also considered, specifically imaging arbitrary fluorophore distributions and two-emitter objects. A data-processing method is presented that makes SSM robust to noise and uncertainties in the detected spectral envelope.


Assuntos
Interpretação de Imagem Assistida por Computador , Microscopia de Fluorescência/métodos , Microscopia de Interferência/métodos , Modelos Teóricos , Corantes Fluorescentes/química
9.
IEEE Trans Image Process ; 15(3): 582-91, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16519345

ABSTRACT

The problem of determining the location and orientation of straight lines in images is of great importance in computer vision and image processing. Traditionally, the Hough transform (a special case of the Radon transform) has been widely used to solve this problem for binary images. In this paper, we pose the detection of straight lines in gray-scale images as an inverse problem. Our formulation is based on the inverse Radon operator, which relates the parameters determining the location and orientation of the lines to the noisy input image. The advantage of this formulation is that line detection can then be approached within a regularization framework, enhancing the performance of the Hough-based line detector through the incorporation of prior information in the form of regularization. We discuss the types of regularizers that are useful for this problem and derive efficient computational schemes to solve the resulting optimization problems, enabling their use in large applications. Finally, we show how our approach can alternatively be viewed as finding an optimal representation of the noisy image in terms of elements chosen from a dictionary of lines. This interpretation relates Hough-based line finding to the body of work on adaptive signal representation.
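
The dictionary interpretation in the last sentence is easy to make concrete. The sketch below uses a dictionary of horizontal lines only and a ridge (quadratic) regularizer, far simpler than the regularizers the paper develops.

```python
import numpy as np

# Line detection as regularized inversion over a dictionary of line images.
# Toy setup: horizontal lines only, ridge regularizer; the paper's
# dictionary and regularizers are far richer.

n = 16
def line_image(row):
    img = np.zeros((n, n))
    img[row, :] = 1.0
    return img

# Dictionary matrix: one column per candidate line image.
A = np.stack([line_image(r).ravel() for r in range(n)], axis=1)

rng = np.random.default_rng(2)
y = line_image(5).ravel() + 0.1 * rng.standard_normal(n * n)  # noisy image

lam = 1.0
coef = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
detected_row = int(np.argmax(coef))
```

The dominant coefficient identifies the line; a sparsity-promoting regularizer would sharpen this further when several lines are present.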


Assuntos
Algoritmos , Inteligência Artificial , Aumento da Imagem/métodos , Interpretação de Imagem Assistida por Computador/métodos , Reconhecimento Automatizado de Padrão/métodos , Armazenamento e Recuperação da Informação/métodos , Análise Numérica Assistida por Computador , Reprodutibilidade dos Testes , Sensibilidade e Especificidade
10.
Inf Process Med Imaging ; 19: 345-56, 2005.
Article in English | MEDLINE | ID: mdl-17354708

ABSTRACT

In this paper we develop a multi-object prior shape model for use in curve evolution-based image segmentation. Our prior shape model is constructed from a family of shape distributions (cumulative distribution functions) of features related to the shape. Shape distribution-based object representations possess several desired properties, such as robustness, invariance, and good discriminative and generalizing properties. Further, our prior can capture information about the interaction between multiple objects. We incorporate this prior in a curve evolution formulation for shape estimation. We apply this methodology to problems in medical image segmentation.
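
A common choice of shape distribution is the empirical CDF of pairwise distances between boundary points; whether this matches the paper's exact feature set is not stated here, so treat the sketch as generic. Normalizing by the mean distance gives the scale invariance mentioned above.

```python
import numpy as np

# A generic shape-distribution descriptor: the empirical CDF of pairwise
# distances between boundary points, normalized by the mean distance so
# the descriptor is scale-invariant. A sketch of the idea, not the
# paper's exact feature set.

def shape_cdf(points, bins=32):
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    d = d[np.triu_indices(len(points), k=1)]   # unique pairs only
    d = d / d.mean()                           # scale invariance
    hist, _ = np.histogram(d, bins=bins, range=(0.0, 4.0))
    return np.cumsum(hist) / hist.sum()

t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.c_[np.cos(t), np.sin(t)]
cdf_a = shape_cdf(circle)
cdf_b = shape_cdf(3.0 * circle)    # same shape at 3x the scale
gap = float(np.abs(cdf_a - cdf_b).max())
```

Identical CDFs for the two circles illustrate the invariance property; dissimilar shapes produce visibly different CDFs, which is what makes the descriptor usable as a prior.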


Assuntos
Inteligência Artificial , Encéfalo/anatomia & histologia , Aumento da Imagem/métodos , Interpretação de Imagem Assistida por Computador/métodos , Imageamento Tridimensional/métodos , Imageamento por Ressonância Magnética/métodos , Reconhecimento Automatizado de Padrão/métodos , Algoritmos , Humanos , Reprodutibilidade dos Testes , Sensibilidade e Especificidade
11.
Opt Express ; 12(17): 4150-6, 2004 Aug 23.
Article in English | MEDLINE | ID: mdl-19483958

ABSTRACT

The use of pupil-plane filters in microscopes has been proposed as a method of producing superresolution. Here it is shown that pupil-plane filters cannot increase the support of the transfer function for a large class of optical systems, implying that resolution cannot be improved solely by adding pupil-plane filters to an instrument. However, pupil filters can improve signal-to-noise performance and modify transfer-function zero crossing positions, as demonstrated through a confocal fluorescence example.

12.
IEEE Trans Image Process ; 12(1): 44-57, 2003.
Article in English | MEDLINE | ID: mdl-18237878

ABSTRACT

In this paper, we develop a new approach to tomographic reconstruction problems based on geometric curve evolution techniques. We use a small set of texture coefficients to represent the object and background inhomogeneities and a contour to represent the boundary of multiple connected or unconnected objects. Instead of reconstructing pixel values on a fixed rectangular grid, we then find a reconstruction by jointly estimating these unknown contours and texture coefficients of the object and background. By designing a new "tomographic flow", the resulting problem is recast into a curve evolution problem and an efficient algorithm based on level set techniques is developed. The performance of the curve evolution method is demonstrated using examples with noisy limited-view Radon transformed data and noisy ground-penetrating radar data. The reconstruction results and computational cost are compared with those of conventional, pixel-based regularization methods. The results indicate that the curve evolution methods achieve improved shape reconstruction and have potential computation and memory advantages over conventional regularized inversion methods.
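
The contrast with pixel-based reconstruction can be seen in a one-parameter caricature: estimate a shape parameter directly from projection data by minimizing the data misfit, instead of solving for pixel values on a grid. The disk, noise level, and grid search below are invented stand-ins for the paper's contour evolution.

```python
import numpy as np

# Shape-based reconstruction in miniature: recover a centered disk's
# radius directly from noisy parallel-beam projection data by minimizing
# the data misfit over candidate radii (a stand-in for evolving a contour).

def disk_projection(radius, s):
    """Chord length of a centered unit-density disk at detector offset s."""
    return 2.0 * np.sqrt(np.maximum(radius ** 2 - s ** 2, 0.0))

rng = np.random.default_rng(3)
s = np.linspace(-1.0, 1.0, 101)
data = disk_projection(0.6, s) + 0.02 * rng.standard_normal(s.size)

candidates = np.linspace(0.1, 0.9, 81)
misfit = [float(np.sum((disk_projection(r, s) - data) ** 2))
          for r in candidates]
r_hat = float(candidates[int(np.argmin(misfit))])
```

Because the unknown is one geometric parameter rather than thousands of pixels, the search is cheap and the recovered boundary is sharp, echoing the computation and shape-reconstruction advantages reported above.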

13.
IEEE Trans Med Imaging ; 21(11): 1402-12, 2002 Nov.
Article in English | MEDLINE | ID: mdl-12575877

ABSTRACT

Characterizing the brain's response to a stimulus from functional magnetic resonance imaging data is a major challenge because the response time delay may differ from one stimulus phase to the next and from pixel to pixel. To enhance detectability, this work introduces a curve evolution approach that provides separate estimates of the response time shifts at each phase of the stimulus on a pixel-by-pixel basis. The approach relies on a parsimonious but simple model that is nonlinear in the time shifts of the response relative to the stimulus and linear in the gains. To use the response time shift estimates effectively in a subspace detection framework, we implement a robust hypothesis test based on a Laplacian noise model. The algorithm provides a pixel-by-pixel functional characterization of the brain's response. Results based on experimental data show that response time shift estimates, when properly implemented, enhance detectability without sacrificing robustness.
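
The nonlinear-in-shift, linear-in-gain structure invites a simple estimation scheme: grid-search the shift while scoring with an L1 misfit, the natural objective under a Laplacian noise model. This is an illustration of the idea, not the authors' algorithm; the response shape and noise are synthetic.

```python
import numpy as np

# Per-pixel response time-shift estimation by grid search with an L1
# (Laplacian maximum-likelihood) misfit. Synthetic toy response; not the
# authors' model or data.

def response(t, shift):
    """Toy delayed, smoothed response to a stimulus at time `shift`."""
    u = t - shift
    return np.where(u > 0, u * np.exp(-u), 0.0)

rng = np.random.default_rng(4)
t = np.linspace(0.0, 20.0, 200)
true_shift = 3.0
y = response(t, true_shift) + rng.laplace(0.0, 0.02, t.size)

shifts = np.arange(0.0, 6.01, 0.1)
errs = [float(np.abs(response(t, s) - y).sum()) for s in shifts]
shift_hat = float(shifts[int(np.argmin(errs))])
```

Under Laplacian noise the L1 score is the maximum-likelihood objective, and it is less perturbed by outlier samples than a squared-error score, matching the robustness argument above.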


Assuntos
Mapeamento Encefálico/métodos , Potenciais Evocados/fisiologia , Aumento da Imagem/métodos , Imageamento por Ressonância Magnética/métodos , Modelos Neurológicos , Tempo de Reação/fisiologia , Adulto , Algoritmos , Encéfalo/anatomia & histologia , Encéfalo/fisiologia , Potenciais Evocados Visuais , Humanos , Masculino , Neurônios/fisiologia , Controle de Qualidade , Reprodutibilidade dos Testes , Sensibilidade e Especificidade , Processos Estocásticos