Results 1 - 3 of 3
1.
J Appl Clin Med Phys ; 22(12): 168-176, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34783427

ABSTRACT

PURPOSE: The dual-energy CT (DECT) LiverVNC application class in the Siemens Syngo.via software has been used to perform non-iodine material decompositions. However, the LiverVNC application is designed with an optional size-specific calibration based on iodine measurements. This work investigates the effects of this iodine-based size-specific calibration on non-iodine material decomposition and benchmarks alternative methods for size-specific calibrations. METHODS: Calcium quantification was performed with split-filter and sequential-scanning DECT techniques on the Siemens SOMATOM Definition Edge CT scanner. Images were acquired of the Gammex MECT abdomen and head phantom containing calcium inserts with concentrations ranging from 50-300 mgCa/ml. Several workflows were explored investigating the effects of size-specific dual-energy ratios (DERs) and the beam hardening correction (BHC) function in the LiverVNC application. Effects of image noise were also investigated by varying CTDIvol and using iterative reconstruction (ADMIRE). RESULTS: With the default BHC activated, Syngo.via underestimated the calcium concentrations in the abdomen for sequential-scanning acquisitions, leaving residual calcium in the virtual non-contrast images and underestimating calcium in the enhancement images for all DERs. Activation of the BHC with split-filter images resulted in a calcium over- or underestimation depending on the DER. With the BHC inactivated, the use of a single DER led to an under- or overestimate of calcium concentration depending on phantom size and DECT modality. Optimal results were found with BHC inactivated using size-specific DERs. CTDIvol levels and ADMIRE had no significant effect on results. CONCLUSION: When performing non-iodine material decomposition in the LiverVNC application class, it is important to understand the implications of the BHC function and to account for patient size appropriately. 
The BHC in the LiverVNC application is specific to iodine and leads to inaccurate quantification of other materials. The inaccuracies can be overcome by deactivating the BHC function and using size-specific DERs, which provided the most accurate calcium quantification.
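The size-specific DER workflow described above rests on a two-material model. As a minimal sketch (not the Syngo.via implementation, whose internals are proprietary), assume each voxel's low- and high-energy HU values are the sum of a water-like base that is roughly energy-independent and a contrast-material component whose low/high enhancement ratio equals the DER; the function and the numbers below are purely illustrative:

```python
def vnc_decompose(hu_low, hu_high, der):
    """Two-material decomposition sketch: split an HU pair into a
    water-like base (assumed equal at both energies) plus a material
    component whose low/high enhancement ratio equals the DER.

    Returns (virtual non-contrast value, low-energy enhancement)."""
    # Solve hu_low - hu_high = enh_low - enh_high = (der - 1) * enh_high
    enh_high = (hu_low - hu_high) / (der - 1.0)  # material signal, high energy
    enh_low = der * enh_high                      # material signal, low energy
    vnc = hu_low - enh_low                        # virtual non-contrast value
    return vnc, enh_low

# Illustrative: pure enhancement on a 0-HU base with DER = 2
vnc, enh = vnc_decompose(100.0, 50.0, 2.0)  # vnc = 0.0, enh = 100.0
```

With a mismatched (non-size-specific) DER, the material signal is split incorrectly between the two components, which is consistent with the residual calcium observed in the VNC images above.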


Subject(s)
Iodine; Calibration; Humans; Phantoms, Imaging; Tomography Scanners, X-Ray Computed; Tomography, X-Ray Computed
2.
J Appl Clin Med Phys ; 21(8): 249-255, 2020 Aug.
Article in English | MEDLINE | ID: mdl-32410336

ABSTRACT

PURPOSE: Accurate liver tumor delineation is crucial for radiation therapy, but liver tumor volumes are difficult to visualize with conventional single-energy CT. This work investigates the use of split-filter dual-energy CT (DECT) for liver tumor visibility by quantifying contrast and contrast-to-noise ratio (CNR). METHODS: Split-filter DECT contrast-enhanced scans of 20 liver tumors, including cholangiocarcinomas, hepatocellular carcinomas, and liver metastases, were acquired. Analysis was performed on the arterial and venous phases of mixed 120 kVp-equivalent images and virtual monoenergetic images (VMIs) at 57 keV and 40 keV. Gross target volume (GTV) contrast and CNR were calculated. RESULTS: For the arterial phase, liver GTV contrast was 12.1 ± 10.0 HU for the mixed images and 43.1 ± 32.3 HU for the 40 keV VMIs (P < 0.001). Image noise increased on average by 116% for the 40 keV VMIs compared to the mixed images. The average CNR did not change significantly (1.6 ± 1.5, 1.7 ± 1.4, and 2.4 ± 1.7 for the mixed, 57 keV, and 40 keV VMIs, respectively; P > 0.141). For individual cases, however, CNR increases of up to 607% were measured for the 40 keV VMIs compared to the mixed image. Venous phase 40 keV VMIs demonstrated an average increase of 35.4 HU in GTV contrast and a 121% increase in image noise. Average CNR values were also not statistically different, but for individual cases CNR increases of up to 554% were measured for the 40 keV VMIs compared to the mixed image. CONCLUSIONS: Liver tumor contrast was significantly improved using split-filter DECT 40 keV VMIs compared to mixed images. On average, there was no statistical difference in CNR between the mixed images and VMIs, but for individual cases, CNR was greatly increased for the 57 keV and 40 keV VMIs. Therefore, although not universally successful for our patient cohort, split-filter DECT VMIs may provide substantial gains in tumor visibility for certain liver cases in radiation therapy treatment planning.
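The contrast and CNR metrics quantified above can be computed from ROI statistics. A minimal sketch, assuming contrast is the GTV-minus-liver mean HU difference and noise is estimated as the background standard deviation (the exact ROI definitions in the study may differ); all voxel values here are synthetic:

```python
import numpy as np

def contrast_and_cnr(gtv_roi, liver_roi):
    """GTV contrast (HU) and contrast-to-noise ratio from two ROI arrays."""
    contrast = gtv_roi.mean() - liver_roi.mean()  # mean HU difference
    noise = liver_roi.std(ddof=1)                 # background noise estimate
    return contrast, abs(contrast) / noise

# Synthetic ROIs: hypothetical enhancing GTV vs. liver background
rng = np.random.default_rng(0)
gtv = rng.normal(100.0, 15.0, 500)    # GTV voxels (HU)
liver = rng.normal(60.0, 15.0, 500)   # liver background voxels (HU)
contrast, cnr = contrast_and_cnr(gtv, liver)
```

This also illustrates the trade-off reported above: a low-keV VMI can raise the contrast term and the noise term at once, leaving CNR unchanged on average.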


Subject(s)
Carcinoma, Hepatocellular; Liver Neoplasms; Radiography, Dual-Energy Scanned Projection; Carcinoma, Hepatocellular/diagnostic imaging; Carcinoma, Hepatocellular/radiotherapy; Contrast Media; Humans; Liver Neoplasms/diagnostic imaging; Liver Neoplasms/radiotherapy; Radiographic Image Interpretation, Computer-Assisted; Signal-To-Noise Ratio; Tomography, X-Ray Computed
3.
Med Phys ; 45(12): 5564-5576, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30273996

ABSTRACT

PURPOSE: This work seeks to investigate new methods to determine the absorbed dose to water from kilovoltage x rays. Current methods are based on measurements in air and rely on correction factors to account for differences between the photon spectrum in air and at depth in phantom, between the photon spectra of the calibration beam and the beam of interest, or in the radiation absorption properties of air and water. This work aims to determine the absorbed dose to water in the NIST-matched x-ray beams at the University of Wisconsin Accredited Dosimetry Calibration Laboratory (UWADCL). This will facilitate the calibration of detectors in terms of dose to water, allowing a simpler determination of dose to water in clinical kilovoltage x-ray beams. MATERIALS AND METHODS: A model of the moderately filtered x-ray beams at the UWADCL was created using the BEAMnrc user code of the EGSnrc Monte Carlo code system. This model was validated against measurements, and the dose to water per unit air kerma was calculated in a custom-built water tank. Using this value and the highly precise air-kerma measurement made at the UWADCL, the dose to water was determined in the water tank for the x-ray beams of interest. The dose to water was also determined using the formalism defined in the report of AAPM Task Group 61 (TG-61), and using a method that employs a 60Co absorbed dose-to-water calibration coefficient together with a beam quality correction factor to account for differences in beam quality between the 60Co calibration beam and the kilovoltage x-ray beam of interest. The dose-to-water values determined by these different methods were then compared. RESULTS: The BEAMnrc models used in this work produced simulations of transverse and depth-dose profiles that agreed with measurements under a gamma test with 2%/2 mm criteria.
The dose to water as determined from the different methods used here agreed within 3.5% at the surface of the water tank and within 1.8% at a depth of 2 cm in phantom. The dose-to-water values all agreed within the associated uncertainties of the methods used in this work. Both the Monte Carlo-based method and the 60Co-based method had a lower uncertainty than the TG-61 methodology for all of the x-ray beams used in this work. CONCLUSION: Two new dose-determination methods were used to determine the dose to water in the NIST-matched x-ray beams at the UWADCL, and they showed good agreement with previously established techniques. Due to the improved Monte Carlo calculation techniques used in this work, both methods have lower uncertainties compared to TG-61. The methods presented in this work compare favorably with calorimetry-based standards established at other institutions.
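The 2%/2 mm gamma test used to validate the simulated profiles can be sketched in one dimension. This is a generic global-normalization gamma index, not the specific implementation used in the study; the depth-dose curve below is synthetic:

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions_mm, dose_tol=0.02, dist_tol_mm=2.0):
    """1D gamma index with global normalization and 2%/2 mm default criteria.

    For each reference point, gamma is the minimum over all evaluated points
    of sqrt(dose_diff_term^2 + distance_term^2); gamma <= 1 is a pass."""
    dmax = ref_dose.max()                         # global normalization dose
    gamma = np.empty_like(ref_dose)
    for i, (x_r, d_r) in enumerate(zip(positions_mm, ref_dose)):
        dd = (eval_dose - d_r) / (dose_tol * dmax)  # dose-difference term
        dx = (positions_mm - x_r) / dist_tol_mm     # distance-to-agreement term
        gamma[i] = np.sqrt(dd**2 + dx**2).min()
    return gamma

# Synthetic depth-dose-like curve; evaluated copy scaled by 1% (within 2%)
x = np.linspace(0.0, 50.0, 101)      # depth (mm)
ref = 100.0 * np.exp(-x / 30.0)      # reference dose profile
ev = 1.01 * ref                      # evaluated profile, 1% high
g = gamma_1d(ref, ev, x)
pass_rate = (g <= 1.0).mean()        # pass_rate = 1.0
```

Brute-force search over all points is adequate for a 1D sketch; production tools restrict the search radius and interpolate between points.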


Subject(s)
Cobalt Radioisotopes; Monte Carlo Method; Radiometry/methods; Uncertainty; X-Rays