1.
Phys Eng Sci Med ; 2024 Jun 17.
Article in English | MEDLINE | ID: mdl-38884673

ABSTRACT

To propose a style transfer model for multi-contrast magnetic resonance imaging (MRI) with a cycle-consistent generative adversarial network (CycleGAN), and to evaluate the image quality and the prognosis prediction performance for glioblastoma (GBM) patients from the extracted radiomics features. Style transfer models between T1-weighted (T1w) and T2-weighted (T2w) MRI images were constructed with CycleGAN using the BraTS dataset and validated with The Cancer Genome Atlas Glioblastoma Multiforme (TCGA-GBM) dataset. Imaging features were then extracted from real and synthesized images and transformed to rad-scores by least absolute shrinkage and selection operator (LASSO)-Cox regression; prognostic performance was estimated by the Kaplan-Meier method. For the agreement between real and synthesized MRI images, the MI, RMSE, PSNR, and SSIM were 0.991 ± 2.10 × 10⁻⁴, 2.79 ± 0.16, 40.16 ± 0.38, and 0.995 ± 2.11 × 10⁻⁴ for T2w, and 0.992 ± 2.63 × 10⁻⁴, 2.49 ± 6.89 × 10⁻², 40.51 ± 0.22, and 0.993 ± 3.40 × 10⁻⁴ for T1w, respectively. Survival time differed significantly between the good- and poor-prognosis groups for both real and synthesized T2w (p < 0.05), but not for real or synthesized T1w. Moreover, survival did not differ significantly between real and synthesized images for either contrast, in both the good- and poor-prognosis groups. These results indicate that the synthesized images can be used for prognosis prediction. The proposed prognostic model using CycleGAN could reduce the cost and time of image scanning, facilitating outcome prediction from multi-contrast images.
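As a sketch of how the agreement metrics above are computed, the following shows RMSE and PSNR between a real and a synthesized slice (MI and SSIM require windowed or histogram statistics and are omitted here). This is a minimal illustration in NumPy, assuming co-registered 8-bit grayscale slices; the function names and toy arrays are not from the paper.

```python
import numpy as np

def rmse(real, synth):
    """Root-mean-square error between two images of equal shape."""
    diff = real.astype(np.float64) - synth.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

def psnr(real, synth, data_range=255.0):
    """Peak signal-to-noise ratio in dB; higher means the images are closer."""
    err = rmse(real, synth)
    if err == 0.0:
        return float("inf")
    return float(20.0 * np.log10(data_range / err))

# Toy example (not from the paper): a uniform "real" slice and a
# "synthesized" slice with a single pixel perturbed by 10 grey levels.
real = np.full((64, 64), 100, dtype=np.uint8)
synth = real.copy()
synth[0, 0] = 110
print(rmse(real, synth))         # → 0.15625 (sqrt(10² / 4096))
print(round(psnr(real, synth)))  # ≈ 64 dB
```

Reported PSNR values around 40 dB, as in the abstract, correspond to an RMSE of roughly 2.5 grey levels on an 8-bit scale, consistent with the RMSE figures quoted alongside them.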

2.
Phys Eng Sci Med ; 46(1): 313-323, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36715853

ABSTRACT

This study aims to synthesize fluid-attenuated inversion recovery (FLAIR) and diffusion-weighted images (DWI) from T1- and T2-weighted magnetic resonance imaging (MRI) images with a deep conditional adversarial network. A total of 1980 images of 102 patients were split into two datasets: 1470 images (68 patients) in a training set and 510 images (34 patients) in a test set. The prediction framework was a convolutional neural network with a generator and a discriminator. T1-weighted, T2-weighted, and composite images were used as inputs. The digital imaging and communications in medicine (DICOM) images were converted to 8-bit red-green-blue images: the red and blue channels of the composite images were assigned the 8-bit grayscale pixel values of the T1-weighted images, and the green channel those of the T2-weighted images. The predicted FLAIR and DWI images depicted the same subjects as the inputs. In the results, the prediction model with composite MRI input images showed the smallest relative mean absolute error (rMAE) and largest mutual information (MI) for the DWI image, and the largest relative mean-square error (rMSE), relative root-mean-square error (rRMSE), and peak signal-to-noise ratio (PSNR) for the FLAIR image. For the FLAIR image, the prediction model with T2-weighted input images generated more accurate syntheses than that with T1-weighted inputs. The proposed image synthesis framework can improve the versatility and quality of multi-contrast MRI without extra scans. The composite input MRI image contributes to synthesizing the multi-contrast MRI image efficiently.
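The channel-packing scheme described above (T1w into red and blue, T2w into green) can be sketched as follows. This is a hypothetical helper, assuming the two contrasts are already co-registered, equally sized, 8-bit slices; the function name and toy data are illustrative, not from the paper.

```python
import numpy as np

def make_composite(t1w, t2w):
    """Pack two 8-bit grayscale MRI slices into one RGB composite:
    red and blue channels carry T1w, the green channel carries T2w."""
    if t1w.shape != t2w.shape:
        raise ValueError("T1w and T2w slices must be co-registered and equally sized")
    # axis=-1 stacks the three channels last, giving an (H, W, 3) image.
    return np.stack([t1w, t2w, t1w], axis=-1).astype(np.uint8)

# Toy 4x4 slices (not real MRI data).
t1w = np.arange(16, dtype=np.uint8).reshape(4, 4)
t2w = (255 - t1w).astype(np.uint8)
comp = make_composite(t1w, t2w)
print(comp.shape)  # → (4, 4, 3)
```

Packing both contrasts into one 3-channel image lets a standard RGB conditional-GAN generator consume them jointly without architectural changes, which is presumably why the composite input helps the synthesis.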


Subjects
Deep Learning , Humans , Magnetic Resonance Imaging/methods , Neural Networks, Computer , Signal-To-Noise Ratio