Results 1 - 2 of 2
1.
Biomed Phys Eng Express; 8(3), 2022 Mar 15.
Article in English | MEDLINE | ID: mdl-35253656

ABSTRACT

Objective: To quantify the benefit of adaptive radiotherapy over non-adaptive radiotherapy, it is useful to extract and compare dosimetric features of patient treatments in both scenarios. This requires Image-Guided Radiotherapy (IGRT) matching of baseline planning to adaptive fraction imaging, followed by extraction of relevant dose metrics. Performing this manually and retrospectively for multiple patients can be impractical. Approach: Here we present an algorithm for automatic IGRT matching of baseline planning with fraction imaging and for automated dosimetric feature extraction from adaptive and non-adaptive treatment plans, thereby allowing comparison of the two scenarios. Given structure and dose Digital Imaging and Communications in Medicine (DICOM) files from baseline and adaptive fractions, this workflow can be run in an entirely automated way via scripting solutions. We validate this algorithm against the results of manual IGRT matching and demonstrate automated dosimetric feature extraction. Lastly, we combine these two scripting solutions to extract daily adaptive and non-adaptive radiotherapy dosimetric features from an initial cohort of patients treated on an MRI-guided linear accelerator (MR-LINAC). Results: Automated feature extraction and IGRT matching were successful and comparable to the results obtained by a manual operator. We have therefore demonstrated a method for straightforward analysis of patients treated on an adaptive radiotherapy platform. Significance: We believe that this scripting solution can be used to quantify the benefits of adaptive therapy and to compare adaptive therapy against various non-adaptive IGRT scenarios (e.g. 6-degree-of-freedom couch rotation).


Subject(s)
Radiotherapy, Image-Guided; Radiotherapy, Intensity-Modulated; Humans; Radiotherapy Dosage; Radiotherapy Planning, Computer-Assisted/methods; Radiotherapy, Image-Guided/methods; Radiotherapy, Intensity-Modulated/methods; Retrospective Studies
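The record above does not include source code, but the workflow it describes is built on scripted handling of dose and structure DICOM files. As a rough illustration only (not the authors' pipeline), the following Python sketch uses pydicom to read an RTDOSE file and compute simple whole-grid dose metrics; the file path and metric choices are placeholders, and the IGRT matching and structure-based feature extraction central to the paper are deliberately omitted.

# Minimal sketch (not the authors' code): simple dose metrics from an
# RTDOSE DICOM file using pydicom. Structure masking and IGRT matching,
# which the published workflow performs, are not shown here.
import numpy as np
import pydicom


def dose_grid_in_gy(rtdose_path):
    """Load an RTDOSE file and return the dose grid in Gy."""
    ds = pydicom.dcmread(rtdose_path)
    # RTDOSE stores integer pixel data; DoseGridScaling converts it to Gy.
    return ds.pixel_array * float(ds.DoseGridScaling)


def simple_dose_features(dose):
    """Whole-grid summary metrics; a real workflow would mask by structure."""
    return {
        "D_max_Gy": float(dose.max()),
        "D_mean_Gy": float(dose.mean()),
        "V_20Gy_voxels": int(np.count_nonzero(dose >= 20.0)),
    }


if __name__ == "__main__":
    # "fraction_rtdose.dcm" is a placeholder path, not taken from the paper.
    features = simple_dose_features(dose_grid_in_gy("fraction_rtdose.dcm"))
    print(features)

In practice, comparing adaptive and non-adaptive scenarios would repeat this kind of extraction per fraction for each plan and report the metrics side by side.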
2.
Comput Methods Programs Biomed; 158: 113-122, 2018 May.
Article in English | MEDLINE | ID: mdl-29544777

ABSTRACT

BACKGROUND AND OBJECTIVES: Medical image analysis and computer-assisted intervention problems are increasingly being addressed with deep-learning-based solutions. Established deep-learning platforms are flexible but do not provide specific functionality for medical image analysis, and adapting them for this domain requires substantial implementation effort. Consequently, there has been substantial duplication of effort and incompatible infrastructure developed across many research groups. This work presents the open-source NiftyNet platform for deep learning in medical imaging. The ambition of NiftyNet is to accelerate and simplify the development of these solutions, and to provide a common mechanism for disseminating research outputs for the community to use, adapt and build upon. METHODS: The NiftyNet infrastructure provides a modular deep-learning pipeline for a range of medical imaging applications, including segmentation, regression, image generation and representation learning. Components of the NiftyNet pipeline, including data loading, data augmentation, network architectures, loss functions and evaluation metrics, are tailored to, and take advantage of, the idiosyncrasies of medical image analysis and computer-assisted intervention. NiftyNet is built on the TensorFlow framework and supports features such as TensorBoard visualization of 2D and 3D images and computational graphs by default. RESULTS: We present three illustrative medical image analysis applications built using the NiftyNet infrastructure: (1) segmentation of multiple abdominal organs from computed tomography; (2) image regression to predict computed tomography attenuation maps from brain magnetic resonance images; and (3) generation of simulated ultrasound images for specified anatomical poses. CONCLUSIONS: The NiftyNet infrastructure enables researchers to rapidly develop and distribute deep-learning solutions for segmentation, regression, image generation and representation learning, or to extend the platform to new applications.


Subject(s)
Diagnostic Imaging/methods; Machine Learning; Abdomen/diagnostic imaging; Brain/diagnostic imaging; Computer Simulation; Databases, Factual; Diagnostic Imaging/instrumentation; Humans; Image Processing, Computer-Assisted/instrumentation; Image Processing, Computer-Assisted/methods; Magnetic Resonance Imaging; Neural Networks, Computer; Ultrasonography
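NiftyNet applications are configuration-driven and launched from the command line (e.g. net_segment for segmentation). The Python sketch below shows, for orientation only, how an INI-style segmentation configuration of the kind NiftyNet consumes might be written with configparser; the section and option names, the dense_vnet network, the data paths and the class count are recalled assumptions and should be checked against the NiftyNet documentation rather than treated as the library's definitive interface.

# Sketch only: writing a NiftyNet-style segmentation config with configparser.
# Option names follow NiftyNet's documented examples as best recalled here;
# verify them against the NiftyNet documentation before use.
import configparser

config = configparser.ConfigParser()

# Input data sections: one per image source, plus one for the labels.
config["ct"] = {
    "path_to_search": "./data/ct",        # placeholder path
    "filename_contains": "ct",
    "spatial_window_size": "(96, 96, 96)",
}
config["label"] = {
    "path_to_search": "./data/labels",    # placeholder path
    "filename_contains": "seg",
    "spatial_window_size": "(96, 96, 96)",
}

config["SYSTEM"] = {"model_dir": "./models/abdominal_demo"}
config["NETWORK"] = {"name": "dense_vnet", "batch_size": "1"}
config["TRAINING"] = {"loss_type": "Dice", "max_iter": "10000"}
config["SEGMENTATION"] = {"image": "ct", "label": "label", "num_classes": "9"}

with open("segmentation_demo.ini", "w") as f:
    config.write(f)

# Training and inference would then be launched from the shell, for example:
#   net_segment train -c segmentation_demo.ini
#   net_segment inference -c segmentation_demo.ini

This configuration-file design is what lets the platform reuse one pipeline (data loading, augmentation, network, loss, evaluation) across the segmentation, regression and image-generation applications described in the abstract.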