Results 1 - 20 of 93
1.
Radiat Res ; 201(5): 487-498, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38471523

ABSTRACT

In gene expression (GE) studies, housekeeping genes (HKGs) are required for normalization purposes. In large-scale inter-laboratory comparison studies, significant differences in dose estimates are reported and divergent HKGs are employed by the teams. Among them, the 18S rRNA HKG is known for its robustness. However, the high abundance of 18S rRNA copy numbers requires dilution, which is time-consuming and a possible source of errors. This study was conducted to identify the most promising HKGs showing the least GE variance after radiation exposure. In the screening stage of this study, 35 HKGs were analyzed. This included selected HKGs (ITFG1, MRPS5, and DPM1) used in large-scale biodosimetry studies which were not covered on an additionally employed pre-designed 96-well platform comprising another 32 HKGs used for different exposures. Altogether 41 samples were examined, including 27 ex vivo X-ray irradiated blood samples (0, 0.5, 4 Gy), six X-irradiated samples (0, 0.5, 5 Gy) from two cell lines (U118, A549), as well as eight non-irradiated tissue samples to encompass multiple biological entities. In the independent validation stage, the most suitable candidate genes were examined in another 257 blood samples, taking advantage of already stored material originating from three studies. These comprised 100 blood samples from ex vivo X-ray irradiated (0-4 Gy) healthy donors, 68 blood samples from 5.8 Gy irradiated (cobalt-60) Rhesus macaques (RM) (LD29/60) collected 0-60 days postirradiation, and 89 blood samples from chemotherapy-(CTx) treated breast tumor patients. CTx and radiation-induced GE changes in previous studies appeared comparable. RNA was isolated, converted into cDNA, and GE was quantified employing TaqMan assays and quantitative RT-PCR. We calculated the standard deviation (SD) and the interquartile range (IQR) as measures of GE variance using raw cycle threshold (Ct) values and ranked the HKGs accordingly. Dose-, time-, age-, and sex-dependent GE changes were examined employing the parametric t-test and the non-parametric Kruskal-Wallis test, as well as linear regression analysis. Generally, similar ranking results evolved using either SD or IQR GE measures of variance, indicating a tight distribution of GE values. PUM1 and PGK1 showed the lowest variance among the first ten most suitable genes in the screening phase. MRPL19 revealed low variance among the first ten most suitable genes in the screening phase only for blood and cells, but certain comparisons indicated a weak association of MRPL19 with dose (P = 0.02-0.09). In the validation phase, these results could be confirmed. Here, IQR Ct values from, e.g., X-irradiated blood samples were 0.6 raw Ct values for PUM1 and PGK1, which is considered to represent GE differences as expected due to methodological variance. Overall, the GE variance of both genes was comparable to or lower than that of 18S rRNA. Compared with the IQR GE values of PUM1 and PGK1, twofold-fivefold increased values were calculated for the biodosimetry HKG HPRT1, and comparable values were calculated for the biodosimetry HKGs ITFG1, MRPS5, and DPM1. Significant dose-dependent associations were found for ITFG1 and MRPS5 (P = 0.001-0.07) and were largely absent or weak (P = 0.02-0.07) for HPRT1 and DPM1. In summary, PUM1 and PGK1 appeared most promising for radiation exposure studies among the 35 HKGs examined, considering GE variance and adverse associations of GE with dose.
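The core of the screening analysis described above is a simple ranking of candidate housekeeping genes by the spread (SD and IQR) of their raw Ct values. The sketch below illustrates that ranking step; the gene names are taken from the abstract, but the Ct values, sample size and distributions are invented placeholders, not the study data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw Ct values per candidate housekeeping gene (one value per sample);
# real data would come from qRT-PCR runs across doses, donors and tissues.
ct_values = {
    "PUM1":     rng.normal(24.0, 0.3, 41),
    "PGK1":     rng.normal(23.5, 0.3, 41),
    "MRPL19":   rng.normal(25.1, 0.5, 41),
    "HPRT1":    rng.normal(27.8, 1.2, 41),
    "18S rRNA": rng.normal(9.0, 0.4, 41),
}

def spread(cts):
    """Return (SD, IQR) of raw Ct values as measures of GE variance."""
    q75, q25 = np.percentile(cts, [75, 25])
    return np.std(cts, ddof=1), q75 - q25

# Rank genes by IQR: the smaller the spread, the more stable the housekeeping gene.
for gene in sorted(ct_values, key=lambda g: spread(ct_values[g])[1]):
    sd, iqr = spread(ct_values[gene])
    print(f"{gene:9s} SD = {sd:.2f}  IQR = {iqr:.2f}")
```

In the study itself this ranking was done separately for blood, cell lines and tissues and complemented by tests for dose, time, age and sex dependence.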


Subject(s)
Genes, Essential , Phosphoglycerate Kinase , RNA-Binding Proteins , Radiation Exposure , Adult , Animals , Female , Humans , Male , Middle Aged , Dose-Response Relationship, Radiation , Genes, Essential/radiation effects , Radiation Exposure/adverse effects , Radiometry , RNA, Ribosomal, 18S/genetics , RNA, Ribosomal, 18S/radiation effects , RNA-Binding Proteins/genetics , RNA-Binding Proteins/radiation effects , Macaca mulatta , Phosphoglycerate Kinase/genetics , Phosphoglycerate Kinase/radiation effects
2.
Radiat Res ; 201(5): 504-513, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38471521

ABSTRACT

Increased radiological and nuclear threats require preparedness. Our earlier work identified a set of four genes (DDB2, FDXR, POU2AF1 and WNT3), which predicts severity of the hematological acute radiation syndrome (H-ARS) within the first three days postirradiation. In this study of 41 Rhesus macaques (Macaca mulatta, 27 males, 14 females) irradiated with 5.8-7.2 Gy (LD29-50/60), including some treated with gamma-tocotrienol (GT3, a radiation countermeasure), we independently validated these genes as predictors in both sexes and examined them after three days. At the Armed Forces Radiobiology Research Institute/Uniformed Services University of the Health Sciences, peripheral whole blood (1 ml) of Rhesus macaques was collected into PAXgene® Blood RNA tubes pre-irradiation and at 1, 2, 3, 35 and 60 days postirradiation, and stored at -80°C for internal experimental analyses. Leftover tubes from these already ongoing studies were kindly provided to the Bundeswehr Institute of Radiobiology. RNA was isolated (QIAsymphony), converted into cDNA, and quantitative RT-PCR was performed for further gene expression (GE) studies. Differential gene expression (DGE) was measured relative to the pre-irradiation Rhesus macaque samples. Within the first three days postirradiation, we found results similar to human data: 1. FDXR and DDB2 were up-regulated, FDXR up to 3.5-fold, and DDB2 up to 13.5-fold in the median; 2. POU2AF1 appeared down-regulated around tenfold in nearly all Rhesus macaques; 3. Contrary to human data, DDB2 was more up-regulated than FDXR, and the difference of the fold change (FC) ranged between 2.4 and 10, while the median fold changes of WNT3, except on days 1 and 35, were close to 1. Nevertheless, 46% of the Rhesus macaques showed down-regulated WNT3 on day one postirradiation, which decreased to 12.2% on day 3 postirradiation. Considering the extended phase, there was a trend towards decreased fold changes at day 35, with median fold changes ranging from 0.7 for DDB2 to 0.1 for POU2AF1, and on day 60 postirradiation, DGE in surviving animals was close to pre-exposure values for all four genes. In conclusion, the diagnostic significance for radiation-induced H-ARS severity prediction of FDXR, DDB2, and POU2AF1 was confirmed in this Rhesus macaque model. However, DDB2 showed higher GE values than FDXR. As shown in previous studies, the diagnostic significance of WNT3 could not be reproduced in Rhesus macaques; this could be due to the choice of animal model and methodological challenges.
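The fold changes quoted above are differential gene expression values relative to each animal's pre-irradiation sample. The abstract does not spell out the normalization formula, so the sketch below uses the widely applied 2^-ΔΔCt approach as an assumption, with invented Ct values purely for illustration.

```python
def fold_change_ddct(ct_target, ct_ref, ct_target_pre, ct_ref_pre):
    """2^-ddCt fold change of a target gene versus a reference gene,
    relative to the matched pre-irradiation sample."""
    d_ct_post = ct_target - ct_ref           # normalize the postirradiation sample
    d_ct_pre = ct_target_pre - ct_ref_pre    # normalize the pre-irradiation sample
    return 2.0 ** -(d_ct_post - d_ct_pre)

# Invented Ct values for one animal on day 1 postirradiation (FDXR vs. a reference gene):
fc = fold_change_ddct(ct_target=24.1, ct_ref=11.9, ct_target_pre=25.9, ct_ref_pre=12.0)
print(f"FDXR fold change: {fc:.1f}")  # ~3.2-fold up-regulation; values < 1 indicate down-regulation
```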


Subject(s)
Acute Radiation Syndrome , Macaca mulatta , Animals , Male , Female , Acute Radiation Syndrome/blood , Acute Radiation Syndrome/genetics
3.
Radiat Res ; 201(5): 499-503, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38471522

ABSTRACT

Despite the large variety of high-voltage semiconductor components for medium and high voltage switching and pulse-forming applications as well as for high-power high-frequency generation, the use of vacuum electron tubes still prevails to a considerable degree. Due to the common design incorporating a high-energy electron beam that is finally dumped into an anode or a resonator cavity, these tubes are also considered sources of X rays produced as bremsstrahlung and characteristic radiation, which are referred to as parasitic X rays. Here, three types of vacuum electron tubes with glass housings (diode, tetrode, and thyratron) are investigated. They are predominantly operated in the high voltage range below 30 kV and are not subject to licensing laws. The measurements of the dose rate and X-ray spectra were performed in the laboratory without the complex electrical circuitry usually used in making practical measurements for occupational radiation protection. For the diode tube, where parasitic X-ray emission is observed only in reverse operation as a blocking diode, a broad distribution of dose rates among electrically equivalent specimens was observed. This is attributed to field emission from the electrodes. For the tetrode and the thyratron tubes, field emission from the electrodes is identified as the dominant mechanism for the generation of parasitic X rays. Thus, technical radiation protection must focus on shielding of the glass tube rather than optimization of the electrical circuitry.


Subject(s)
Electrons , Glass , Occupational Exposure , X-Rays , Vacuum , Glass/chemistry , Occupational Exposure/analysis , Radiation Dosage , Radiation Protection/instrumentation
4.
Radiat Res ; 201(5): 523-534, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38499035

ABSTRACT

As the great majority of gene expression (GE) biodosimetry studies have been performed using blood as the preferred source of tissue, searching for simple and less-invasive sampling methods is important when considering biodosimetry approaches. Knowing that whole saliva contains an ultrafiltrate of blood and white blood cells, it is expected that the findings in blood can also be found in saliva. This human in vivo study aims to examine radiation-induced GE changes in saliva for biodosimetry purposes and to predict radiation-induced disease, which is still poorly characterized. Furthermore, we examined whether transcriptional biomarkers in blood can also be found equivalently in saliva. Saliva and blood samples were collected in parallel from radiotherapy (RT)-treated patients suffering from head and neck cancer (n = 8) undergoing fractionated partial-body irradiations (1.8 Gy/fraction and 50-70 Gy total dose). Samples were taken 12-24 h before the first irradiation and ideally 24 and 48 h, as well as 5 weeks, after radiotherapy onset. Due to the low quality and quantity of the isolated RNA, samples from one patient had to be excluded from further analysis, leaving a total of 24 saliva and 24 blood samples from 7 patients eligible for analysis. Using qRT-PCR, 18S rRNA and 16S rRNA (the ratio being a surrogate for the relative human RNA/bacterial burden), four housekeeping genes and nine mRNAs previously identified as radiation responsive in blood-based studies were detected. Significant GE associations with absorbed dose were found for five genes after the 2nd radiotherapy fraction, shown by, e.g., the increase of CDKN1A (2.0-fold, P = 0.017) and FDXR (1.9-fold, P = 0.002). After the 25th radiotherapy fraction, however, all four genes (FDXR, DDB2, POU2AF1, WNT3) predicting ARS (acute radiation syndrome) severity, as well as further genes (including CCNG1 [median fold change (FC) = 0.3, P = 0.013] and GADD45A [median FC = 0.3, P = 0.031]) appeared significantly downregulated (FC = 0.3, P = 0.01-0.03). A significant association of CCNG1, POU2AF1, HPRT1, and WNT3 (P = 0.006-0.04) with acute or late radiotoxicity could be shown before the onset of these clinical outcomes. In an established set of four genes predicting acute health effects in blood, the response in saliva samples was similar to the expected up- (FDXR, DDB2) or downregulation (POU2AF1, WNT3) in blood for up to 71% of the measurements. Comparing GE responses (PHPT1, CCNG1, CDKN1A, GADD45A, SESN1) in saliva and blood samples, there was a significant linear association between the saliva and blood response of CDKN1A (R2 = 0.60, P = 0.0004). However, the GE pattern of other genes differed between saliva and blood. In summary, the current human in vivo study (I) reveals significant radiation-induced GE associations of five transcriptional biomarkers in salivary samples, (II) suggests genes predicting diverse clinical outcomes such as acute and late radiotoxicity as well as ARS severity, and (III) supports the view that the blood-based GE response can be reflected in saliva samples, indicating that saliva is a "mirror of the body" for certain but not all genes and that, thus, saliva-specific studies are required for each gene of interest identified in blood.
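One quantitative result above is the linear association between the saliva and blood response of CDKN1A (R2 = 0.60, P = 0.0004). A minimal sketch of such a paired regression is shown below; the paired fold-change values are invented placeholders, not the patient data, and serve only to illustrate how slope, R2 and P value are obtained.

```python
import numpy as np
from scipy import stats

# Hypothetical paired CDKN1A fold changes measured in blood and saliva from the
# same patients and time points (placeholder numbers, not the study data).
blood  = np.array([1.1, 1.8, 2.3, 0.4, 0.9, 1.5, 2.0, 0.3])
saliva = np.array([1.0, 1.6, 2.6, 0.5, 1.1, 1.3, 1.8, 0.4])

res = stats.linregress(blood, saliva)
print(f"slope = {res.slope:.2f}, R^2 = {res.rvalue**2:.2f}, P = {res.pvalue:.5f}")
```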


Subject(s)
Saliva , Humans , Saliva/radiation effects , Saliva/metabolism , Male , Middle Aged , Female , Aged , Radiometry , Head and Neck Neoplasms/radiotherapy , Adult , Dose-Response Relationship, Radiation
5.
Radiat Res ; 201(5): 514-522, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38514385

ABSTRACT

In times of war, radiological/nuclear emergency scenarios have become a reemphasized threat. However, there are challenges in transferring whole-blood samples to laboratories for specialized diagnostics using RNA. This project aims to miniaturize the unwieldy conventional RNA extraction process, with its stationary technical equipment, using a microfluidic-based slide (MBS) for point-of-care diagnostics. The MBS is thought to be a preliminary step toward the development of a so-called lab-on-a-chip microfluidic device. An MBS would enable early and fast field care combined with gene expression (GE) analysis for the prediction of hematologic acute radiation syndrome (HARS) severity or identification of RNA microbes. Whole blood samples from ten healthy donors were irradiated with 0, 0.5 and 4 Gy, simulating different ARS severity degrees. RNA quality and quantity obtained with a preliminary MBS were compared with a conventional column-based (CB) RNA extraction method. GE of four HARS severity-predicting radiation-induced genes (FDXR, DDB2, POU2AF1 and WNT3) was examined employing qRT-PCR. Compared to the CB method, twice as much total RNA from whole blood could be extracted using the MBS (6.6 ± 3.2 µg vs. 12.0 ± 5.8 µg) in half the extraction time, and all MBS RNA extracts appeared DNA-free, in contrast to the CB method (30% were contaminated with DNA). Using the MBS, RNA quality [RNA integrity number equivalent (RINe)] values decreased about threefold (3.3 ± 0.8 vs. 9.0 ± 0.4), indicating severe RNA degradation, while the expected high-quality RINe values ≥ 8 were found using the CB method. However, normalized cycle threshold (Ct) values, as well as radiation-induced GE fold changes, appeared comparable for all genes utilizing both methods, indicating that no RNA degradation took place. In summary, the preliminary MBS showed promising features such as: 1. halving the RNA extraction time without the burden of heavy technical equipment (e.g., a centrifuge); 2. absence of DNA contamination in contrast to CB RNA extraction; 3. reduction in the blood volume required, because of twice the biological output of RNA; and 4. equal GE performance compared to CB, thus increasing its appeal for later semi-automatic parallel field applications.


Subject(s)
Point-of-Care Systems , RNA , Humans , RNA/isolation & purification , RNA/blood , RNA/genetics , Lab-On-A-Chip Devices , Acute Radiation Syndrome/blood , Acute Radiation Syndrome/etiology , Acute Radiation Syndrome/diagnosis , Acute Radiation Syndrome/genetics
6.
Radiat Res ; 201(5): 384-395, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38282135

ABSTRACT

Radiosensitivity differs among humans and possibly among closely related nonhuman primates. The reasons for variation in radiosensitivity are not well known. In an earlier study, we examined gene expression (GE) before irradiation in the peripheral blood of male (n = 62) and female (n = 60) rhesus macaques (n = 122), which did or did not survive (up to 60 days) after whole-body exposure of 7.0 Gy (LD66/60). Eight genes (CHD5, CHI3L1, DYSF, EPX, IGF2BP1, LCN2, MBOAT4, SLC22A4) revealed significant associations with survival. Access to a second rhesus macaque cohort (males = 40, females = 23, total n = 63), irradiated with 5.8-7.2 Gy (LD29-50/60) and in part treated with gamma-tocotrienol (GT3, a radiation countermeasure), allowed us to validate these gene expression changes independently. Total RNA was isolated from whole blood samples and examined by quantitative RT-PCR in a 96-well format. Cycle threshold (Ct) values normalized to 18S rRNA were analyzed for their association with survival. Regardless of the species-specific TaqMan assay, similar results were obtained. Two genes (CHD5 and CHI3L1) out of eight revealed a significant association with survival in the second cohort, while only CHD5 (involved in DNA damage response and proliferation control) showed mean gene expression changes in the same direction for both cohorts. No expected association of CHD5 GE with dose, treatment, or sex could be established. Instead, we observed significant associations for those comparisons comprising pre-exposure samples with CHD5 Ct values ≤ 11 (total n = 17). CHD5 Ct values ≤ 11 in these comparisons were mainly associated with increased frequencies (61-100%) of non-survivors, a trend which, depending on the sample numbers, reached significance (P = 0.03) in males and, accordingly, in females. This was also reflected by a logistic regression model including all available samples from both cohorts comprising CHD5 measurements (n = 104, odds ratio 1.38, 95% CI 1.07-1.79, P = 0.01). However, this association was driven by males (odds ratio 1.62, 95% CI 1.10-2.38, P = 0.01) and by CHD5 Ct values ≤ 11, since removing low CHD5 Ct values from this model rendered it insignificant (P = 0.19). A second male subcohort comprising high CHD5 Ct values ≥ 14.4 in both cohorts (n = 5) appeared associated with survival. Removing these high CHD5 Ct values rendered the model borderline significant (P = 0.051). Based on the probability function of the receiver operating characteristic (ROC) curves, death and survival could be predicted for 8 (12.3%) and 5 (7.7%) of 65 pre-exposure RNA measurements in males, with negative and positive predictive values ranging between 85-100%. An associated odds ratio reflected a 62% elevated risk for dying or surviving per unit change (Ct value) in gene expression, considering the before-mentioned CHD5 thresholds in RNA copy numbers. In conclusion, we identified two subsets of male animals characterized by increased (Ct values ≤ 11) and decreased (Ct values ≥ 14.4) CHD5 GE copy numbers before radiation exposure, which, independently of the cohort, radiation exposure or treatment, appeared to predict death or survival in males.
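The abstract describes a logistic regression of survival on pre-exposure CHD5 Ct values, summarized by an odds ratio per Ct unit and by ROC-based predictive values. The sketch below shows that type of analysis with invented data; the thresholds, odds ratios and predictive values reported in the abstract come from the actual cohorts, not from this toy example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical pre-exposure CHD5 Ct values and outcome (1 = died) for male animals;
# all numbers are placeholders, not the study data.
ct = np.concatenate([rng.normal(10.5, 0.6, 20),   # low Ct = high CHD5 copy number
                     rng.normal(13.0, 0.8, 30)])
died = np.concatenate([np.ones(20), rng.integers(0, 2, 30)]).astype(int)

X = ct.reshape(-1, 1)
model = LogisticRegression().fit(X, died)

# Odds ratio per one-Ct decrease (i.e. per roughly twofold more CHD5 RNA copies).
odds_ratio = np.exp(-model.coef_[0, 0])
auc = roc_auc_score(died, model.decision_function(X))
print(f"odds ratio per Ct decrease: {odds_ratio:.2f}, ROC AUC: {auc:.2f}")
```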


Subject(s)
Macaca mulatta , Radiation Tolerance , Animals , Male , Female , Radiation Tolerance/genetics , Cohort Studies , Gene Expression Regulation/radiation effects , Dose-Response Relationship, Radiation , Whole-Body Irradiation
7.
Radiat Res ; 199(6): 535-555, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37310880

ABSTRACT

Tools for radiation exposure reconstruction are required to support the medical management of radiation victims in radiological or nuclear incidents. Different biological and physical dosimetry assays can be used for various exposure scenarios to estimate the dose of ionizing radiation a person has absorbed. Regular validation of the techniques through inter-laboratory comparisons (ILC) is essential to guarantee high quality results. In the current RENEB inter-laboratory comparison, the performance quality of established cytogenetic assays [dicentric chromosome assay (DCA), cytokinesis-block micronucleus assay (CBMN), stable chromosomal translocation assay (FISH) and premature chromosome condensation assay (PCC)] was tested in comparison to molecular biological assays [gamma-H2AX foci (gH2AX), gene expression (GE)] and physical dosimetry-based assays [electron paramagnetic resonance (EPR), optically or thermally stimulated luminescence (LUM)]. Three blinded, coded samples (e.g., blood, enamel or mobile phones) were exposed to 0, 1.2 or 3.5 Gy X-ray reference doses (240 kVp, 1 Gy/min). These doses roughly correspond to clinically relevant groups of unexposed to low exposed (0-1 Gy), moderately exposed (1-2 Gy, no severe acute health effects expected) and highly exposed individuals (>2 Gy, requiring early intensive medical care). In the frame of the current RENEB inter-laboratory comparison, samples were sent to 86 specialized teams in 46 organizations from 27 nations for dose estimation and identification of the three clinically relevant groups. The time for sending early crude reports and more precise reports was documented for each laboratory and assay where possible. The quality of dose estimates was analyzed with three different levels of granularity: 1. by calculating the frequency of correctly reported clinically relevant dose categories, 2. by determining the number of dose estimates within the uncertainty intervals recommended for triage dosimetry (±0.5 Gy or ±1.0 Gy for doses <2.5 Gy or >2.5 Gy), and 3. by calculating the absolute difference (AD) of estimated doses relative to the reference doses. In total, 554 dose estimates were submitted within the 6-week period given before the exercise was closed. For samples processed with the highest priority, the earliest dose estimates/categories were reported within 5-10 h of receipt for GE, gH2AX, LUM and EPR, within 2-3 days for DCA and CBMN, and within 6-7 days for the FISH assay. For the unirradiated control sample, the categorization in the correct clinically relevant group (0-1 Gy) as well as the allocation to the triage uncertainty interval was, with the exception of a few outliers, successfully performed for all assays. For the 3.5 Gy sample the percentage of correct classifications to the clinically relevant group (≥2 Gy) was between 89-100% for all assays, with the exception of gH2AX. For the 1.2 Gy sample, an exact allocation to the clinically relevant group was more difficult and 0-50% or 0-48% of the estimates were wrongly classified into the lowest or highest dose categories, respectively. For the irradiated samples, the correct allocation to the triage uncertainty intervals varied considerably between assays for the 1.2 Gy (29-76%) and 3.5 Gy (17-100%) samples. While a systematic shift towards higher doses was observed for the cytogenetic-based assays, extreme outliers exceeding the reference doses 2-6 fold were observed for the EPR, FISH and GE assays. These outliers were related to the particular material examined (tooth enamel for the EPR assay, reported as kerma in enamel; when converted into the proper quantity, i.e. kerma in air, expected dose estimates could be recalculated in most cases), the level of experience of the teams (FISH) and methodological uncertainties (GE). This was the first RENEB ILC where everything, from blood sampling to irradiation and shipment of the samples, was organized and realized at the same institution, for several biological and physical retrospective dosimetry assays. Almost all assays appeared comparably applicable for the identification of unexposed and highly exposed individuals and the allocation to medically relevant groups, with the latter requiring medical support for the acute radiation scenario simulated in this exercise. However, extreme outliers or a systematic shift of dose estimates have been observed for some assays. Possible reasons will be discussed in the assay-specific papers of this special issue. In summary, this ILC clearly demonstrates the need to conduct regular exercises to identify research needs, but also to identify technical problems and to optimize the design of future ILCs.
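The three levels of granularity used above to judge the quality of dose estimates (correct clinical category, triage uncertainty interval, absolute difference) are straightforward to express in code. The sketch below applies them to a handful of invented estimates for the 1.2 Gy reference sample; none of the numbers are exercise results.

```python
def clinical_category(dose_gy):
    """0-1 Gy: unexposed/low, 1-2 Gy: moderate, >2 Gy: high exposure (as in the abstract)."""
    if dose_gy <= 1.0:
        return "low"
    return "moderate" if dose_gy <= 2.0 else "high"

def within_triage_interval(estimate_gy, reference_gy):
    """Triage criterion: +/-0.5 Gy for reference doses < 2.5 Gy, +/-1.0 Gy above."""
    tolerance = 0.5 if reference_gy < 2.5 else 1.0
    return abs(estimate_gy - reference_gy) <= tolerance

reference = 1.2                          # Gy, one of the blinded reference doses
estimates = [0.9, 1.4, 2.1, 0.4, 1.7]    # hypothetical reported estimates in Gy

for est in estimates:
    print(f"{est:.1f} Gy  category_ok={clinical_category(est) == clinical_category(reference)}"
          f"  interval_ok={within_triage_interval(est, reference)}"
          f"  AD={abs(est - reference):.2f} Gy")
```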


Subject(s)
Biological Assay , Blood Specimen Collection , Retrospective Studies , Cytokinesis , Electron Spin Resonance Spectroscopy
8.
Arch Toxicol ; 97(6): 1577-1598, 2023 06.
Article in English | MEDLINE | ID: mdl-37022444

ABSTRACT

Uranium and thorium are heavy metals, and all of their isotopes are radioactive, so it is impossible to study chemical effects entirely independently of the radiation effects. In the present study, we tried to compare the chemo- and radiotoxicity of both metals, taking into account deterministic radiation damage reflected by acute radiation sickness and stochastic radiation damage leading to long-term health impairments (e.g., tumor induction). We first performed a literature search on acute median lethal doses that may be expected to be caused by chemical effects, as even acute radiation sickness, as a manifestation of acute radiotoxicity, occurs with latency. By simulations based on the biokinetic models of the International Commission on Radiological Protection and using the Integrated Modules for Bioassay Analysis software, we determined the amounts of uranium at different enrichment grades and of thorium-232 leading to a short-term red bone marrow equivalent dose of 3.5 Sv, considered to cause 50% lethality in humans. Different intake pathways for incorporation were considered, and values were compared to the mean lethal doses by chemotoxicity. To assess stochastic radiotoxicity, we calculated the uranium and thorium amounts leading to a committed effective dose of 200 mSv, which is often considered critical. Mean lethal values for uranium and thorium are of the same order of magnitude, so the data do not provide evidence for substantial differences in acute chemical toxicity. When comparing radiotoxicity, the reference units (activity in Bq or weight in g) must always be taken into account. The mean lethal equivalent dose to the red bone marrow of 3.5 Sv is reached by lower activities of thorium compared to uranium in soluble compounds. However, for uranium as well as thorium-232, acute radiation sickness is expected only after incorporation of amounts exceeding the mean lethal doses by chemotoxicity. Thus, acute radiation sickness is not a relevant clinical issue for either metal. Concerning stochastic radiation damage, thorium-232 is more radiotoxic than uranium when the same activities are incorporated. Using weight units for comparison shows that, for soluble compounds, thorium-232 is more radiotoxic than low-enriched uranium in the case of ingestion, but even more radiotoxic than high-enriched uranium after inhalation or intravenous administration. For insoluble compounds, the situation differs, as the stochastic radiotoxicity of thorium-232 ranges between that of depleted and natural uranium. For acute effects, the chemotoxicity of uranium, even at high enrichment grades, as well as that of thorium-232, exceeds deterministic radiotoxicity. Simulations show that thorium-232 is more radiotoxic than uranium when expressed in activity units. If the comparison is based on weight units, the rankings depend on the uranium enrichment grades and the route of intake.


Subject(s)
Radiation Injuries , Uranium , Humans , Thorium/toxicity , Thorium/analysis , Uranium/toxicity , Uranium/analysis , Dose-Response Relationship, Radiation
9.
Radiat Res ; 199(6): 583-590, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37057978

ABSTRACT

Translocation analysis using fluorescence in situ hybridization (FISH) is the method of choice for dose assessment in cases of chronic or past exposure to ionizing radiation. Although it is a widespread technique, the number of FISH-based inter-laboratory comparisons is small, unlike for dicentrics. For this reason, although the current Running the European Network of Biological and Physical retrospective Dosimetry (RENEB) inter-laboratory comparison 2021 was designed as a fast response to a real emergency scenario, it was considered a good opportunity to perform an inter-laboratory comparison using the FISH technique to gain further experience. The Bundeswehr Institute of Radiobiology provided peripheral blood samples from one healthy human volunteer. Three test samples were irradiated with blinded doses of 0, 1.2, and 3.5 Gy, respectively. Samples were then sent to the seven participating laboratories. The FISH technique was applied according to the standard procedure of each laboratory. Both the frequency of translocations and the estimated dose for each sample were sent to the coordinator using a special scoring sheet for FISH. All participants sent their results in due time. However, although it was initially requested to send the results based on the full analysis, evaluating 500 equivalent cells, most laboratories only sent the results based on triage, with a smaller number of analyzed cells. In the triage analysis, there was great heterogeneity in the number of equivalent cells scored. In contrast, for the full analysis, this number was more homogeneous. For all three samples, one laboratory showed outlier yields compared to the other laboratories. Excluding these results, in the triage analysis, the frequency of translocations in sample no. 1 ranged from 0 to 0.013 translocations per cell, and for samples no. 2 and no. 3 the genomic mean frequencies were 0.27 ± 0.03 and 1.47 ± 0.14, with coefficients of variation of 0.29 and 0.23, respectively. Considering only results obtained in the triage analysis for sample no. 1, all laboratories except one classified this sample as the non-irradiated one. For sample no. 2, excluding the outlier value, the mean reported dose was 1.74 ± 0.16 Gy, indicating a mean deviation of about 0.5 Gy from the delivered dose of 1.2 Gy. For sample no. 3 the mean estimated dose was 4.21 ± 0.21 Gy, indicating a mean deviation of about 0.7 Gy from the delivered dose of 3.5 Gy. In the frame of RENEB, this is the second FISH-based inter-laboratory comparison. The whole exercise was planned as a response to an emergency; therefore, a triage analysis was requested for all the biomarkers except for FISH. Although a full analysis was initially requested for FISH, most of the laboratories reported only a triage-based result. The main reason is that it was not clearly stated what was required before starting the exercise. Results show that most of the laboratories successfully discriminated unexposed and irradiated samples from each other without any overlap. Good agreement in the observed frequencies of translocations was found, but there was a tendency to overestimate the delivered doses. Efforts to improve the harmonization of this technique and subsequent exercises to elucidate the reason for this trend should be promoted.
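The abstract reports genomic mean translocation frequencies, their coefficients of variation across laboratories, and dose estimates derived from laboratory-specific calibration curves. The sketch below reproduces those two computational steps with invented numbers; in particular, the linear-quadratic calibration coefficients are placeholders, not any laboratory's actual curve.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical genomic translocation frequencies reported by several labs for one sample.
freqs = np.array([0.24, 0.29, 0.31, 0.27, 0.25, 0.28])
mean, sd = freqs.mean(), freqs.std(ddof=1)
print(f"mean frequency = {mean:.3f}, CV = {sd / mean:.2f}")

# Dose estimation by inverting a lab-specific linear-quadratic calibration curve
# Y = c + alpha*D + beta*D^2 (coefficients invented for illustration only).
c, alpha, beta = 0.005, 0.02, 0.05

def dose_from_yield(y):
    return brentq(lambda d: c + alpha * d + beta * d**2 - y, 0.0, 10.0)

print(f"estimated dose: {dose_from_yield(mean):.2f} Gy")
```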


Subject(s)
Radiometry , Translocation, Genetic , Humans , In Situ Hybridization, Fluorescence/methods , Retrospective Studies , Radiometry/methods , Biological Assay/methods , Chromosome Aberrations
10.
Radiat Res ; 199(6): 598-615, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37057982

ABSTRACT

Early and high-throughput individual dose estimates are essential following large-scale radiation exposure events. In the context of the Running the European Network for Biodosimetry and Physical Dosimetry (RENEB) 2021 exercise, gene expression assays were conducted and their corresponding performance for dose assessment is presented in this publication. Three blinded, coded whole blood samples from healthy donors were exposed to 0, 1.2 and 3.5 Gy X-ray doses (240 kVp, 1 Gy/min) using the X-ray source Yxlon. These exposures correspond to clinically relevant groups of unexposed, low dose (no severe acute health effects expected) and high dose exposed individuals (requiring early intensive medical health care). Samples were sent to eight teams for dose estimation and identification of clinically relevant groups. For quantitative reverse transcription polymerase chain reaction (qRT-PCR) and microarray analyses, samples were lysed, stored at -20°C and shipped on wet ice. RNA isolations and assays were run in each laboratory according to locally established protocols. The time-to-result for both rough early and more precise later reports has been documented where possible. Accuracy of dose estimates was calculated as the difference between estimated and reference doses summed over all doses (summed absolute difference, SAD) and by determining the number of correctly reported dose estimates, defined as within ±0.5 Gy for reference doses <2.5 Gy and within ±1.0 Gy for reference doses >3 Gy, as recommended for triage dosimetry. We also examined the allocation of dose estimates to clinically/diagnostically relevant exposure groups. Altogether, 105 dose estimates were reported by the eight teams, and the earliest report times on dose categories and estimates were 5 h and 9 h, respectively. The coefficient of variation for 85% of all 436 qRT-PCR measurements did not exceed 10%. One team reported dose estimates that systematically deviated several-fold from the reference doses, and these outliers were excluded from further analysis. Teams employing a combination of several genes generated about two times lower median SADs (0.8 Gy) than dose estimates based on single genes only (1.7 Gy). When considering the uncertainty intervals for triage dosimetry, dose estimates of all teams together were correctly reported in 100% of the 0 Gy, 50% of the 1.2 Gy and 50% of the 3.5 Gy exposed samples. The order of dose estimates (from lowest to highest) corresponding to the three dose categories (unexposed, low dose and highest exposure) was correctly reported by all teams and all chosen genes or gene combinations. Furthermore, if teams reported no exposure or an exposure >3.5 Gy, it was always correctly allocated to the unexposed and the highly exposed group, while low exposed (1.2 Gy) samples sometimes could not be discriminated from highly (3.5 Gy) exposed samples. All teams used FDXR, and 78.1% of correct dose estimates used FDXR as one of the predictors. Still, the accuracy of reported dose estimates based on FDXR differed considerably among teams, with one team's SAD (0.5 Gy) being comparable to the dose accuracy employing a combination of genes. Using the workflow of this reference team, we performed additional experiments after the exercise on residual RNA and cDNA sent by six teams to the reference team. All samples were processed similarly with the intention of improving the accuracy of dose estimates when employing the same workflow. Re-evaluated dose estimates improved for half of the samples and worsened for the others. In conclusion, this inter-laboratory comparison exercise (1) enabled identification of technical problems and corrections in preparation for future events, (2) confirmed the early and high-throughput capabilities of gene expression, (3) emphasized different biodosimetry approaches using either only FDXR or a gene combination, (4) indicated some improvements in dose estimation with FDXR when employing a similar methodology, which requires further research for a final conclusion, and (5) underlined the applicability of gene expression for identification of unexposed and highly exposed samples, supporting medical management in radiological or nuclear scenarios.
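The accuracy metrics defined above (SAD and the triage uncertainty criterion) can be expressed in a few lines. The sketch below applies them to one hypothetical team's estimates for the three blinded samples; the numbers are placeholders, not the exercise results.

```python
import numpy as np

reference = np.array([0.0, 1.2, 3.5])   # Gy, blinded reference doses
estimates = np.array([0.1, 1.9, 2.3])   # Gy, hypothetical estimates from one team

# Summed absolute difference (SAD) over all samples.
sad = np.abs(estimates - reference).sum()

# Triage criterion: within +/-0.5 Gy for reference doses < 2.5 Gy, +/-1.0 Gy above.
tolerance = np.where(reference < 2.5, 0.5, 1.0)
n_correct = int((np.abs(estimates - reference) <= tolerance).sum())

print(f"SAD = {sad:.1f} Gy, correct estimates: {n_correct}/{len(reference)}")
```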


Subject(s)
Radiation Exposure , Radiometry , Humans , Dose-Response Relationship, Radiation , Radiometry/methods , Radiation Exposure/adverse effects , Radiation Exposure/analysis , Biological Assay/methods , Gene Expression
11.
Radiat Res ; 199(6): 571-582, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37057983

ABSTRACT

The goal of the RENEB inter-laboratory comparison 2021 exercise was to simulate a large-scale radiation accident involving a network of biodosimetry labs. Labs were required to perform their analyses using different biodosimetric assays in triage mode scoring and to rapidly report estimated radiation doses to the organizing institution. This article reports the results obtained with the cytokinesis-block micronucleus assay. Three test samples were exposed to blinded X-ray doses of 0, 1.2 and 3.5 Gy (240 kVp, 13 mA, ∼75 keV, 1 Gy/min). These doses belong to three triage categories of clinical relevance: a low dose category, for no exposure or exposures below 1 Gy, requiring no direct treatment of subjects; a medium dose category, with doses ranging from 1 to 2 Gy; and a high dose category, for exposures to doses higher than 2 Gy, with the latter two requiring increasing medical attention. After irradiation, the test samples (no. 1, no. 2 and no. 3) were sent by the organizing laboratory to 14 centers participating in the micronucleus assay exercise. Laboratories were asked to set up micronucleus cultures and to perform the micronucleus assay in triage mode, scoring 500 binucleated cells manually, or 1,000 binucleated cells in automated/semi-automated mode. One laboratory received no blood samples, but scored pictures from another lab. Based on their calibration curves, laboratories had to provide estimates of the administered doses. The accuracy of the reported dose estimates was further analyzed by the micronucleus assay lead. The micronucleus assay allowed classification of samples in the corresponding clinical triage categories (low, medium, high dose category) in 88% of cases (manual scoring, 88%; semi-automated scoring, 100%; automated scoring, 73%). Agreement between scoring laboratories, assessed by calculating Fleiss' kappa, was excellent (100%) for semi-automated scoring, good (83%) for manual scoring and poor (53%) for fully automated scoring. Correct classification into triage scoring dose intervals (reference dose ±0.5 Gy for doses ≤2.5 Gy, or reference dose ±1 Gy for doses >2.5 Gy), recommended for triage biodosimetry, was obtained in 79% of cases (manual scoring, 73%; semi-automated scoring, 100%; automated scoring, 67%). The percentage of dose estimates whose 95% confidence intervals included the reference dose was 58% (manual scoring, 48%; semi-automated scoring, 72%; automated scoring, 60%). For the irradiated samples no. 2 and no. 3, a systematic shift towards higher dose estimations was observed. This was also noticed with the other cytogenetic assays in this intercomparison exercise. Accuracy of the rapid triage modality could be maintained when the number of manually scored cells was scaled down to 200 binucleated cells. In conclusion, the micronucleus assay, preferably performed in a semi-automated or manual scoring mode, is a reliable technique to perform rapid biodosimetry analysis in large-scale radiation emergencies.
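Inter-laboratory agreement above is quantified with Fleiss' kappa. A minimal, self-contained implementation is sketched below and applied to an invented table of how labs might assign the three blinded samples to the triage dose categories; the counts are placeholders, and the abstract's agreement figures (expressed there as percentages) were not produced from this toy table.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for a subjects x categories table of rating counts,
    where each row sums to the same number of raters."""
    counts = np.asarray(counts, dtype=float)
    n_raters = counts.sum(axis=1)[0]
    p_j = counts.sum(axis=0) / counts.sum()                 # category proportions over all ratings
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))  # per-subject agreement
    p_bar, p_e = p_i.mean(), np.square(p_j).sum()
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 3 blinded samples classified by 10 labs into the
# triage categories low / medium / high dose (placeholder counts).
table = [
    [9, 1, 0],   # sample no. 1 (0 Gy)
    [1, 7, 2],   # sample no. 2 (1.2 Gy)
    [0, 2, 8],   # sample no. 3 (3.5 Gy)
]
print(f"Fleiss' kappa = {fleiss_kappa(table):.2f}")
```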


Subject(s)
Cytokinesis , Radioactive Hazard Release , Humans , Dose-Response Relationship, Radiation , Cytokinesis/radiation effects , Micronucleus Tests/methods , Biological Assay/methods , Radiometry/methods
12.
Radiat Res ; 199(6): 616-627, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37084254

ABSTRACT

Inter-laboratory exercises are important tools within the European network for biological dosimetry and physical retrospective dosimetry (RENEB) to validate and improve the performance of member laboratories and to ensure an operational network with high quality standards for dose estimations in case of a large-scale radiological or nuclear event. In addition to the RENEB inter-laboratory comparison 2021, several inter-laboratory comparisons have been performed in the frame of RENEB for a number of assays in recent years. This publication gives an overview of RENEB inter-laboratory comparisons for biological dosimetry assays in the past and a final summary of the challenges and lessons learnt from the RENEB inter-laboratory comparison 2021. In addition, the dose estimates of all RENEB inter-laboratory comparisons since 2013 that have been conducted for the dicentric chromosome assay, the most established and applied assay, are compared and discussed.


Subject(s)
Radiation Exposure , Radiation Monitoring , Radiation Exposure/analysis , Retrospective Studies , Biological Assay , Laboratories
13.
Radiat Res ; 199(1): 17-24, 2023 01 01.
Article in English | MEDLINE | ID: mdl-36445953

ABSTRACT

Radiation-induced gene expression (GE) changes can be used for early and high-throughput biodosimetry within the first three days postirradiation. However, is the method applicable in situations such as the Alexander Litvinenko case or the Goiania accident, where diagnosis occurred in a prefinal health stage? We aimed to characterize gene expression changes in a prefinal health stage of lethally irradiated male and female rhesus macaques. Peripheral blood was drawn pre-exposure and at the prefinal stage of male and female animals, which did not survive whole-body exposure to 700 cGy (LD66/60). RNA samples originated from a blinded, randomized Good Laboratory Practice study comprising altogether 142 irradiated rhesus macaques, of which 60 animals and their blood samples (15 samples per time point and sex) were used for this analysis. We evaluated GE of 34 genes widely used in biodosimetry and prediction of hematological acute radiation syndrome (H-ARS) severity, employing quantitative real-time polymerase chain reaction (qRT-PCR). These genes were run in duplicate and triplicate, and altogether 96 measurements per time point and sex could be performed. In addition, 18S ribosomal RNA (rRNA) was measured to depict the ribosome/transcriptome status as well as for normalization purposes, and 16S rRNA was evaluated as a surrogate for bacteremia. Mean differential gene expression (DGE) was calculated for each gene and sex, including all replicate measurements and using pre-exposure samples as the reference. Of the 34 genes, altogether 27 appeared expressed. Pre-exposure samples revealed no signs of bacteremia, and 18S rRNA GE was in the normal range in all 30 samples. Regarding prefinal samples, 46.7% and 40% of animals appeared infected in females and males, respectively, and for almost all males this was associated with 18S rRNA values outside the normal range. The total number of detectable GE measurements was reduced sixfold (females) and 15-fold (males) in prefinal relative to pre-exposure samples, and was about tenfold lower in 80% of prefinal compared to pre-exposure samples (P < 0.0001). An overall 11-fold (median) downregulation in prefinal compared to pre-exposure samples was identified for most of the 27 genes, and even FDXR appeared 4-14-fold downregulated, in contrast to the pronounced up-regulation reported in previous work. This pattern of overall downregulation of almost all genes and the rapid reduction of detectable genes at a prefinal stage was found in uninfected animals with normal-range 18S rRNA values as well. In conclusion, in a prefinal stage after lethal radiation exposure, the ribosome/transcriptome status remains present (based on normal-range 18S rRNA values) in 60-67% of animals, but whole-transcriptome activity in general appears silenced and cannot be used for biodosimetry purposes; it may, however, serve as an indicator of an emerging prefinal health stage.


Subject(s)
Bacteremia , Transcriptome , Animals , Male , Female , Macaca mulatta , RNA, Ribosomal, 18S , RNA, Ribosomal, 16S , Gene Expression Profiling
14.
Arch Toxicol ; 96(11): 2947-2965, 2022 11.
Article in English | MEDLINE | ID: mdl-35922584

ABSTRACT

In the case of nuclear incidents, radioiodine may be released. After incorporation, it accumulates in the thyroid and enhances the risk of thyroidal dysfunction and cancer occurrence by internal irradiation. Pregnant women and children are particularly vulnerable. Therefore, thyroidal protection by administering a large dose of stable (non-radioactive) iodine, blocking radioiodide uptake into the gland, is essential in these subpopulations. However, a quantitative estimation of the protection conferred to the maternal and fetal thyroids in the different stages of pregnancy is difficult. We started from an established biokinetic model for radioiodine in pregnancy using first-order kinetics. As the uptake of iodide into the thyroid and several other tissues is mediated by a saturable active transport, we integrated an uptake mechanism described by Michaelis-Menten kinetics. This permits simulating the competition between stable and radioactive iodide at the membrane carrier site, one of the protective mechanisms. The Wolff-Chaikoff effect, as the other protective mechanism, was simulated by adding a total net uptake block for iodide into the thyroid, becoming active when the gland is saturated with iodine. The model's validity was confirmed by comparing predicted values with results from other models and sparse empirical data. According to our model, in the case of radioiodine exposure without thyroid blocking, the thyroid equivalent dose in the maternal gland increases by about 45% within the first weeks of pregnancy and remains in the same range until term. Beginning in the 12th week of pregnancy, the equivalent dose in the fetal thyroid disproportionately increases over time and amounts to three times the dose of the maternal gland at term. The protection of the maternal and fetal glands increases concomitantly with the amount of stable iodine administered to the mother simultaneously with acute radioiodine exposure. The dose-effect curves reflecting the combined thyroidal protection by the competition at the membrane carrier site and the Wolff-Chaikoff effect in the mother are characterized by a mean effective dose (ED50) of roughly 1.5 mg throughout pregnancy. In the case of the fetal thyroid, the mean effective doses for thyroid blocking, taking into account only the competition at the carrier site, are numerically lower than in the mother. Additionally taking into account the Wolff-Chaikoff effect, the dose-effect curves for thyroidal protection in the fetus show a shift to the left over time, with a mean effective dose of 12.9 mg in the 12th week of pregnancy decreasing to 0.5 mg at term. In any case, according to our model, the usually recommended dose of 100 mg stable iodine given at the time of acute radioiodine exposure confers a very high level of thyroidal protection to the maternal and fetal glands over pregnancy. For ethical reasons, the possibilities of experimental studies on thyroid blocking in pregnant women are extremely limited. Furthermore, results from animal studies are associated with the uncertainties related to the translation of the data to humans. Thus, model-based simulations may be a valuable tool for gaining better insight into the efficacy of thyroidal protection and improving preparedness planning for uncommon nuclear or radiological emergencies.
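One of the two protective mechanisms modeled above is the competition of stable and radioactive iodide for the saturable (Michaelis-Menten) carrier. The sketch below is a deliberately minimal two-compartment toy model of that competition, not the publication's biokinetic model: all parameter values are invented, the Wolff-Chaikoff uptake block and the pregnancy-specific compartments are omitted, and the output only illustrates the qualitative blocking effect of a large stable iodine dose.

```python
import numpy as np
from scipy.integrate import solve_ivp

VMAX, KM, K_EL = 50.0, 4.0, 0.1   # ug/h, ug, 1/h -- invented toy parameters

def rates(t, y):
    s, r = y[0], y[1]                 # plasma stable iodide, plasma radioiodide (ug)
    total = s + r
    # Competitive Michaelis-Menten uptake: each species is taken up in proportion
    # to its share of the iodide competing for the carrier.
    uptake_s = VMAX * s / (KM + total)
    uptake_r = VMAX * r / (KM + total)
    return [-uptake_s - K_EL * s,     # plasma stable iodide
            -uptake_r - K_EL * r,     # plasma radioiodide
            uptake_s,                 # thyroid stable iodide
            uptake_r]                 # thyroid radioiodide

# Fraction of a 1-ug radioiodide intake ending up in the thyroid within 48 h,
# without and with 100 mg of stable iodine given at the same time.
for stable_mg in (0.0, 100.0):
    sol = solve_ivp(rates, (0.0, 48.0), [stable_mg * 1000.0, 1.0, 0.0, 0.0], rtol=1e-6)
    print(f"stable iodine {stable_mg:5.1f} mg -> radioiodide fraction in thyroid: {sol.y[3, -1]:.3f}")
```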


Subject(s)
Iodine , Thyroid Gland , Animals , Child , Female , Fetus , Humans , Iodides/metabolism , Iodine/pharmacology , Iodine Radioisotopes , Mothers , Pregnancy , Thyroid Gland/metabolism
15.
Sci Rep ; 12(1): 2312, 2022 02 10.
Article in English | MEDLINE | ID: mdl-35145126

ABSTRACT

Isolation of RNA from whole saliva, a non-invasive and easily accessible biofluid that is an attractive alternative to blood for high-throughput biodosimetry of radiological/nuclear victims, might be of clinical significance for the prediction and diagnosis of disease. In a previous analysis of 12 human samples, we identified two challenges to measuring gene expression from total RNA: (1) the fraction of human RNA in whole saliva was low and (2) the bacterial contamination was overwhelming. To overcome these challenges, we performed selective cDNA synthesis for human RNA species only by employing poly(A)+-tail primers followed by qRT-PCR. In the current study, this approach was independently validated on 91 samples from 61 healthy donors. Additionally, we used the ratio of human to bacterial RNA to adjust the input RNA so that equal amounts of human RNA were included across all samples before cDNA synthesis, which ensured comparable analysis using the same human base input material. Furthermore, we examined relative levels of ten known housekeeping genes and assessed inter- and intra-individual differences in 61 salivary RNA isolates, while considering effects of demographic factors (e.g. sex, age), epidemiological factors comprising social habits (e.g. alcohol, cigarette consumption), oral hygiene (e.g. flossing, mouthwash), previous radiological diagnostic procedures (e.g. number of CT scans) and saliva collection time (circadian periodicity). Total human RNA amounts appeared significantly associated with age only (P ≤ 0.02). None of the chosen housekeeping genes showed significant circadian periodicity, and they either did not associate or were only weakly associated with the 24 confounders examined, with one exception: 60% of genes were altered by mouthwash. ATP6, ACTB and B2M represented the genes with the highest mean baseline expression (Ct values ≤ 30) and were detected in all samples. Combining these housekeeping genes for normalization purposes did not decrease inter-individual variance, but increased robustness. In summary, our work addresses critical confounders and provides important information for the successful examination of gene expression in human whole saliva.


Subject(s)
Gene Expression , Genes, Essential , RNA/isolation & purification , Saliva/metabolism , Adult , DNA Contamination , DNA, Complementary , Female , Gene Expression Profiling/methods , Humans , Male , Middle Aged , RNA, Bacterial , Real-Time Polymerase Chain Reaction , Young Adult
16.
J Radiol Prot ; 42(1)2022 Jan 25.
Article in English | MEDLINE | ID: mdl-34492641

ABSTRACT

Radiation-induced biological changes occurring within hours and days after irradiation can potentially be used either for exposure reconstruction (retrospective dosimetry) or for the prediction of subsequently occurring acute or chronic health effects. The advantage of molecular protein or gene expression (GE) (mRNA) markers lies in their capability for early (1-3 days after irradiation), high-throughput and point-of-care diagnosis, as required for the prediction of the acute radiation syndrome (ARS) in radiological or nuclear scenarios. These molecular markers in most cases respond differently depending on exposure characteristics such as radiation quality, dose and dose rate and, most importantly, over time. Changes over time are particularly challenging and demand certain strategies to deal with them. With this review, we provide an overview of, and focus on, already identified and used mRNA GE and protein markers of the peripheral blood related to the ARS. These molecules are examined in light of the characteristics of an 'ideal' biomarker (e.g. easy accessibility, early response, signal persistence) and their degree of validation. Finally, we present strategies for the use of these markers, considering challenges such as their variation over time, and future developments regarding e.g. the origin of samples, point-of-care and high-throughput diagnosis.


Subject(s)
Acute Radiation Syndrome , Radiometry , Acute Radiation Syndrome/diagnosis , Biomarkers , Humans , Retrospective Studies
17.
Eur J Nucl Med Mol Imaging ; 49(5): 1447-1455, 2022 04.
Article in English | MEDLINE | ID: mdl-34773472

ABSTRACT

AIM: The aim of this study was to provide a systematic approach to characterize DNA damage induction and repair in isolated peripheral blood mononuclear cells (PBMCs) after internal ex vivo irradiation with [131I]NaI. In this approach, we tried to mimic ex vivo the irradiation of patient blood in the first hours after radioiodine therapy. MATERIAL AND METHODS: Blood of 33 patients from two centres was collected immediately before radioiodine therapy of differentiated thyroid cancer (DTC) and split into two samples. One sample served as a non-irradiated control. The second sample was exposed to ionizing radiation by adding 1 ml of [131I]NaI solution to 7 ml of blood, followed by incubation at 37 °C for 1 h. PBMCs of both samples were isolated, split into three parts each, and fixed in 70% ethanol and stored at -20 °C (i) directly (0 h) after irradiation, (ii) 4 h and (iii) 24 h after irradiation and culture in RPMI medium. After immunofluorescence staining, microscopically visible co-localizing γ-H2AX + 53BP1 foci were scored in 100 cells per sample as biomarkers for radiation-induced double-strand breaks (DSBs). RESULTS: Thirty-two of 33 blood samples could be analysed. The mean absorbed dose to the blood in all irradiated samples was 50.1 ± 2.3 mGy. For all time points (0 h, 4 h, 24 h), the average number of γ-H2AX + 53BP1 foci per cell was significantly different when compared to baseline and the other time points. The average number of radiation-induced foci (RIF) per cell after irradiation was 0.72 ± 0.16 at t = 0 h, 0.26 ± 0.09 at t = 4 h and 0.04 ± 0.09 at t = 24 h. A monoexponential fit of the mean values of the three time points provided a decay rate of 0.25 ± 0.05 h⁻¹, which is in good agreement with data obtained from external irradiation with γ- or X-rays. CONCLUSION: This study provides novel data about ex vivo DSB repair in internally irradiated PBMCs of patients before radionuclide therapy. Our findings show, in a large patient sample, that efficient repair occurs after internal irradiation with 50 mGy absorbed dose, and that the induction and repair rate after 131I exposure is comparable to that of external irradiation with γ- or X-rays.
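The decay rate quoted in the results was obtained by a monoexponential fit to the mean foci counts at the three time points. A minimal sketch of such a fit, using the mean RIF values stated in the abstract, is shown below; the fitting details (weighting, error treatment) of the original analysis are not specified there, so this is only an assumed unweighted least-squares variant.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.0, 4.0, 24.0])       # hours after irradiation
rif = np.array([0.72, 0.26, 0.04])   # mean radiation-induced foci per cell (from the abstract)

def monoexp(t, n0, k):
    """Monoexponential decay of foci: N(t) = N0 * exp(-k * t)."""
    return n0 * np.exp(-k * t)

(n0, k), _ = curve_fit(monoexp, t, rif, p0=(0.7, 0.2))
print(f"N0 = {n0:.2f} foci/cell, decay rate k = {k:.2f} per hour")
# k should come out close to the decay rate reported in the abstract.
```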


Subject(s)
Histones , Iodine Radioisotopes , DNA Damage , DNA Repair , Dose-Response Relationship, Radiation , Histones/metabolism , Humans , Iodine Radioisotopes/therapeutic use , Leukocytes, Mononuclear/metabolism
18.
Arch Toxicol ; 95(7): 2335-2350, 2021 07.
Article in English | MEDLINE | ID: mdl-34003340

ABSTRACT

Radioactive iodine released in nuclear accidents may accumulate in the thyroid and, by irradiation, enhances the risk of cancer. Radioiodine uptake into the gland can be inhibited by large doses of stable iodine or perchlorate. Daily nutritional iodine intake may impact thyroid physiology, so that the radiological doses absorbed by the thyroid, as well as thyroid blocking efficacy, may differ in Japanese, with their very iodine-rich diet, compared to Caucasians. Based on established biokinetic-dosimetric models for the thyroid, we derived the parameters for Caucasians and Japanese to quantitatively compare the effects of radioiodine exposure and the protective efficacy of thyroid blocking by stable iodine at the officially recommended dosages (100 mg in Germany, 76 mg in Japan) or by perchlorate. The maximum transport capacity for iodine uptake into the thyroid is lower in Japanese compared to Caucasians. For the same radioiodine exposure pattern, the radiological equivalent thyroid dose is substantially lower in Japanese in the absence of thyroid blocking treatments. In the case of acute radioiodine exposure, stable iodine is less potent in Japanese (ED50 = 41.6 mg) than in Caucasians (ED50 = 2.7 mg) and confers less thyroid protection at the recommended dosages because of a delayed responsiveness to iodine saturation of the gland (Wolff-Chaikoff effect). Perchlorate (ED50 = 10 mg in Caucasians) at a dose of 1000 mg has roughly the same thyroid blocking effect as 100 mg iodine in Caucasians, whereas it confers much better protection than 76 mg iodine in Japanese. For prolonged exposures, a single dose of iodine offers substantially lower protection than after acute radioiodine exposure in both groups. Repetitive daily iodine administrations improve efficacy without reaching the levels seen after acute radioiodine exposure and achieve only slightly better protection in Japanese than in Caucasians. However, in the case of continuous radioiodine exposure, daily doses of 1000 mg perchlorate achieve a high protective efficacy in Caucasians as well as Japanese (> 0.98). In Caucasians, iodine (100 mg) and perchlorate (1000 mg) at the recommended dosages seem to be alternatives in the case of acute radioiodine exposure, whereas perchlorate has a higher protective efficacy in the case of longer-lasting radioiodine exposures. In Japanese, considering protective efficacy, preference should be given to perchlorate in acute as well as prolonged radioiodine exposure scenarios.


Subject(s)
Iodine , Thyroid Neoplasms , Humans , Iodine Radioisotopes/adverse effects , Japan , Perchlorates/toxicity , Thyroid Neoplasms/prevention & control
19.
Sci Rep ; 11(1): 9756, 2021 05 07.
Article in English | MEDLINE | ID: mdl-33963206

ABSTRACT

Large-scale radiation emergency scenarios involving protracted low dose rate radiation exposure (e.g. a hidden radioactive source in a train) necessitate the development of high throughput methods for providing rapid individual dose estimates. During the RENEB (Running the European Network of Biodosimetry) 2019 exercise, four EDTA-blood samples were exposed to an Iridium-192 source (1.36 TBq, Tech-Ops 880 Sentinal) at varying distances and geometries. This resulted in protracted doses ranging between 0.2 and 2.4 Gy using dose rates of 1.5-40 mGy/min and exposure times of 1 or 2.5 h. Blood samples were exposed in thermo bottles that maintained temperatures between 39 and 27.7 °C. After exposure, EDTA-blood samples were transferred into PAXGene tubes to preserve RNA. RNA was isolated in one laboratory and aliquots of four blinded RNA were sent to another five teams for dose estimation based on gene expression changes. Using an X-ray machine, samples for two calibration curves (first: constant dose rate of 8.3 mGy/min and 0.5-8 h varying exposure times; second: varying dose rates of 0.5-8.3 mGy/min and 4 h exposure time) were generated for distribution. Assays were run in each laboratory according to locally established protocols using either a microarray platform (one team) or quantitative real-time PCR (qRT-PCR, five teams). The qRT-PCR measurements were highly reproducible with coefficient of variation below 15% in ≥ 75% of measurements resulting in reported dose estimates ranging between 0 and 0.5 Gy in all samples and in all laboratories. Up to twofold reductions in RNA copy numbers per degree Celsius relative to 37 °C were observed. However, when irradiating independent samples equivalent to the blinded samples but increasing the combined exposure and incubation time to 4 h at 37 °C, expected gene expression changes corresponding to the absorbed doses were observed. Clearly, time and an optimal temperature of 37 °C must be allowed for the biological response to manifest as gene expression changes prior to running the gene expression assay. In conclusion, dose reconstructions based on gene expression measurements are highly reproducible across different techniques, protocols and laboratories. Even a radiation dose of 0.25 Gy protracted over 4 h (1 mGy/min) can be identified. These results demonstrate the importance of the incubation conditions and time span between radiation exposure and measurements of gene expression changes when using this method in a field exercise or real emergency situation.


Subject(s)
Blood Cells/metabolism , Gamma Rays/adverse effects , Gene Expression Regulation/radiation effects , Laboratories , Radiation Dosage , Radiation Exposure , X-Rays/adverse effects , Dose-Response Relationship, Radiation , Humans , Reproducibility of Results