3.
Health Phys ; 60 Suppl 1: 45-100, 1991.
Article in English | MEDLINE | ID: mdl-2004918

ABSTRACT

This report was prepared by a working group established by the Oak Ridge Associated Universities (ORAU) for the purpose of assessing the current capabilities of bioassay methods that can be used to determine the occurrence and magnitude of a previous internal deposition of one or more radionuclides. The first five sections discuss general features of the use of in-vitro bioassay samples to achieve this purpose. The remainder of the report is focused on the possible use of urine bioassay procedures to detect and quantify internal depositions of radionuclides that may have occurred in United States occupation troops in Hiroshima or Nagasaki, Japan, prior to 1 July 1946, or to personnel who participated in atmospheric nuclear weapons tests conducted between 1945 and 1962. Theoretical calculations were made to estimate the quantities of various radionuclides produced in a 20-kiloton (kt) nuclear detonation that might still be present in measurable quantities in people today if they were exposed 25 to 40 y ago. Two radionuclides that emerged as good choices for this type of bioassay analysis were 90Sr, which emits beta particles, and 239,240Pu, which emits alpha particles. The current status and future prospects of chemical procedures for analyzing in-vitro urine bioassay samples for these two radionuclides were examined to determine the minimum amounts that could be detected with current methods and how much one might expect the sensitivity of detection to improve in the near future. Most routine 239,240Pu bioassay analyses involve detection by alpha spectrometry. The current minimum detectable amount (MDA) is about 0.74 mBq L-1 (20 fCi L-1), but this could be lowered to 74 µBq L-1 (2 fCi L-1). An MDA of 0.74 mBq L-1 (20 fCi L-1) is adequate for routine bioassay analyses but is too high to detect most uptakes of 239,240Pu that may have occurred 25 to 40 y ago.
Two methods under development that are, or can be, much more sensitive than alpha spectrometry for 239Pu, and hence have lower MDAs, are fission track analysis and mass spectrometry. Currently, the fission track analysis method has an MDA of about 19 µBq L-1, and this may eventually be lowered to 1.9 µBq L-1 (0.005 fCi L-1). The current MDA for 239Pu by mass spectrometry is about 7.4 mBq L-1 (200 fCi L-1), but the potential exists that it could be lowered to a value of about 0.37 µBq L-1 (0.01 fCi L-1). (ABSTRACT TRUNCATED AT 400 WORDS)
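The MDAs above are quoted in mixed SI (Bq) and conventional (Ci) units. A minimal sketch of the conversion, using the defining relation 1 Ci = 3.7e10 Bq (so 1 fCi = 37 µBq), and applying it to the µBq figures as printed in the text:

```python
# Convert activity concentrations between fCi and µBq per litre.
# 1 Ci = 3.7e10 Bq, hence 1 fCi = 3.7e-5 Bq = 37 µBq.
BQ_PER_FCI = 3.7e10 * 1e-15  # Bq per fCi

def fci_to_ubq(fci):
    """Convert an activity concentration from fCi to µBq."""
    return fci * BQ_PER_FCI * 1e6

def ubq_to_fci(ubq):
    """Convert an activity concentration from µBq to fCi."""
    return ubq * 1e-6 / BQ_PER_FCI

# MDAs per litre of urine, in µBq, as quoted in the text:
mdas = {
    "alpha spectrometry, current":    740.0,   # 0.74 mBq L-1 (20 fCi L-1)
    "alpha spectrometry, projected":   74.0,   # 2 fCi L-1
    "fission track, current":          19.0,
    "fission track, projected":         1.9,
    "mass spectrometry, current":    7400.0,   # 7.4 mBq L-1 (200 fCi L-1)
    "mass spectrometry, projected":     0.37,  # 0.01 fCi L-1
}

for method, ubq in sorted(mdas.items(), key=lambda kv: kv[1]):
    print(f"{method:30s} {ubq:9.2f} µBq/L = {ubq_to_fci(ubq):8.3f} fCi/L")
```

Sorting by MDA makes the ordering of the methods' sensitivities explicit: projected mass spectrometry is the most sensitive, current mass spectrometry the least.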


Subject(s)
Radioisotopes/analysis , Radiometry/methods , Feces/chemistry , Humans , Nuclear Warfare , Occupational Exposure , Radiation Dosage , Radioactive Fallout , Radioisotopes/urine , Tissue Distribution
4.
Health Phys ; 60 Suppl 1: 103-9, 1991.
Article in English | MEDLINE | ID: mdl-2004917

ABSTRACT

Hematologic changes following whole-body exposure to gamma or x-ray radiation have been used to estimate dose. The usefulness of this biological indicator is limited because these cells recover with time, making it unsuitable for estimating dose years after exposure. The same is true for spermatogenic indicators; the recovery and restoration of sperm numbers and fertility make this biological indicator impractical for assessing radiation dose decades after radiation exposure. As noted in the text of the report, immunological concepts are in a state of rapid development, and it is possible that improved methods for applying immunologic procedures as biological indicators of radiation may be developed in the future. However, at this time, immunological indicators are not useful, even in an early time period, for quantitating radiation dose after total-body irradiation. A semiquantitative effect is observable in the early phase after total-body irradiation, over a period of days to weeks, but few data are available to indicate whether any of the immunological parameters can be indicative of dose when the test is applied several years after radiation exposure. More detailed information regarding immunological indicators for estimating radiation dose has been summarized elsewhere (Wasserman 1986). There is good agreement that ionizing radiation causes biochemical changes in the body; however, attempts to apply these changes to provide a reliable biological dosimetry system have not been particularly successful. The status of this research has been summarized by Gerber (1986). One of the difficulties has been the problem of establishing clear dose-effect relationships in humans. The lack of specificity of the response to radiation is another problem. Additional problems are due to the strict time dependency of biochemical changes and the limited duration of the changes during the postexposure period.
Information on biochemical indicators is based on animal experiments; human experience is limited to a relatively few accidental human exposures and investigations involving patients undergoing radiation therapy. It appears that none of the biochemical indicators studied are currently useful for radiation dosimetry. Even if further developed, it is questionable whether or not biochemical indicators could be of use in estimating radiation dose received years and decades prior to the assay.


Subject(s)
Radiation Dosage , Radiometry , Blood/radiation effects , Body Burden , Gamma Rays , Humans , Lymphocytes/immunology , Lymphocytes/radiation effects , Male , Spermatogenesis/radiation effects , X-Rays
5.
Health Phys ; 60 Suppl 1: 7-42, 1991.
Article in English | MEDLINE | ID: mdl-1900815

ABSTRACT

This report discusses the principles, techniques, and application of whole-body counting with respect to previous radiation exposure. Whole-body counting facilities are located nationwide and have a wide range of capabilities. A listing of these facilities is provided in Appendix A. However, only a few facilities are truly state-of-the-art and have the sophisticated capabilities required to attempt detection of low-level activity in vivo. Measurements made many years after exposure can be extremely difficult to interpret. The precision and accuracy of resulting dose estimates are functions of such factors as the assumptions made concerning intake, time since intake, radionuclide metabolism, and level of intake. The indiscriminate application of metabolic models to current body contents or minimum detectable amounts of radionuclides with relatively short effective half-lives (such as 137Cs) can lead to absurd results when used as a basis for calculating intakes 25 to 40 y ago. Skull counting for 90Sr-90Y and 239,240Pu can set upper limits on possible uptakes and radiation doses, but in the case of 239,240Pu, the limits are rather high. In both cases, the accuracy of the limits depends on the metabolic models used in the calculations. These models (ICRP 1979) were developed to set safety standards for the intakes of radionuclides by workers and are not intended to be used to back-calculate uptakes and radiation doses from measurements made long after the uptake. There are, therefore, large uncertainties in any conclusions derived from these calculations. The experience gained over the years with whole- and partial-body counting has consistently shown that these techniques are of little use in determining body contents of radionuclides resulting from exposure to weapons debris decades earlier.
The development of new detectors such as an array of lithium-drifted silicon devices offers some hope of lowering the minimum detectable amount (MDA) for Pu and Am, but such detectors are still several years from routine application and do not represent current state-of-the-art. Furthermore, it is doubtful that such improvements will be sufficient to meet the need of assessing radiation exposures that occurred decades earlier.
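The "absurd results" from back-calculating with a short effective half-life can be made concrete with a little arithmetic. A minimal sketch, assuming an effective half-life for 137Cs on the order of 110 days (an illustrative value, not one taken from the report): back-extrapolating a measured body content over decades multiplies it by exp(lambda_eff * t), which grows astronomically.

```python
import math

# Back-extrapolation of a body content measured today to an inferred
# initial uptake, using a single-compartment effective half-life.
T_EFF_DAYS = 110.0  # assumed effective half-life for 137Cs in an adult
LAMBDA_EFF = math.log(2) / T_EFF_DAYS  # effective decay constant, per day

def inferred_uptake(body_content_bq, years_since_uptake):
    """Inferred initial uptake (Bq) from today's body content (Bq)."""
    t_days = years_since_uptake * 365.25
    return body_content_bq * math.exp(LAMBDA_EFF * t_days)

# Back-extrapolating even a modest 100-Bq detection limit 40 y implies
# an initial "uptake" of roughly 1e42 Bq: a physically absurd figure,
# which is exactly the point made in the text.
print(f"{inferred_uptake(100.0, 40):.3e} Bq")
```

The exponential amplification factor (2 raised to roughly the 133rd power over 40 y) is why metabolic models built for prompt occupational monitoring cannot be turned around to reconstruct decades-old intakes.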


Subject(s)
Whole-Body Counting , Accidents , Humans , Nuclear Reactors , Nuclear Warfare , Occupational Exposure , Radiation Dosage , Radioactive Fallout , Whole-Body Counting/instrumentation , Whole-Body Counting/methods
6.
Health Phys ; 59(5): 511-4, 1990 Nov.
Article in English | MEDLINE | ID: mdl-2211110

ABSTRACT

The Department of Health and Human Services (DHHS) was directed by Congress to assess the risk of thyroid cancer from 131I associated with fallout from the atmospheric testing of nuclear weapons at the Nevada Test Site. The National Cancer Institute (NCI) was requested by DHHS to address Public Law 97-414, Section 7 (a), which directs DHHS to "(1) conduct scientific research and prepare analyses necessary to develop valid and credible assessments of the risks of thyroid cancer that are associated with thyroid doses of Iodine 131; (2)...develop...methods to estimate the thyroid doses of Iodine 131 that are received by individuals from nuclear bomb fallout; (and) (3)...develop...assessments of the exposure to Iodine 131 that the American people received from the Nevada atmospheric nuclear bomb tests." In addition, the University of Utah, under contract with the NCI, is carrying out a study to determine if the incidence of thyroid disease and leukemia among identified populations in Utah may be related to exposure from fallout originating at the Nevada Test Site.


Subject(s)
Environmental Exposure , Nuclear Warfare , Radioactive Fallout , United States Dept. of Health and Human Services , Humans , Leukemia, Radiation-Induced/epidemiology , Nevada/epidemiology , Thyroid Diseases/epidemiology , Thyroid Diseases/etiology , United States , Utah/epidemiology
7.
Health Phys ; 59(5): 627-36, 1990 Nov.
Article in English | MEDLINE | ID: mdl-2211120

ABSTRACT

One part of the methodology used by the National Cancer Institute to assess radiation exposures to Americans is determining the United States population's consumption, during the 1950s, of milk contaminated with 131I from atmospheric nuclear weapons tests conducted at the Nevada Test Site. In order to make these estimates for locations throughout the United States, it is necessary to determine the pasture intake by cows and the distribution of the milk produced for human consumption at the times when the weapons were tested. Since the milk industry has undergone many changes during the past 35 y, historical records and information must be used. The methodology developed to estimate the intake of contaminated pasture by dairy cows, milk production, and milk distribution on a county basis for the continental U.S. during the 1950s is described in detail. The relevant data on milk consumption by humans are also discussed.


Subject(s)
Environmental Exposure , Food Contamination, Radioactive/analysis , Iodine Radioisotopes/analysis , Milk , Nuclear Warfare , Radioactive Fallout , Animals , Dairying , Humans , Milk/statistics & numerical data , Nevada , United States
8.
Health Phys ; 59(5): 659-68, 1990 Nov.
Article in English | MEDLINE | ID: mdl-2145245

ABSTRACT

A methodology is being developed to estimate the exposure of Americans to 131I originating from atmospheric nuclear weapons tests carried out at the Nevada Test Site (NTS) during the 1950s and early 1960s. Since very few direct environmental measurements of 131I were made at that time, the assessment must rely on estimates of 131I deposition based on meteorological modeling and on measurements of total beta activity from the radioactive fallout deposited on gummed-film collectors that were located across the country. The most important source of human exposure from fallout 131I was due to the ingestion of cows' milk. The overall methodology used to assess the 131I concentration in milk and the 131I intake by people on a county basis for the most significant atmospheric tests is presented and discussed. Certain aspects of the methodology are discussed in a more detailed manner in companion papers also presented in this issue. This work is carried out within the framework of a task group established by the National Cancer Institute.
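The deposition-to-milk chain the methodology relies on can be sketched as a simple transfer calculation. All parameter values below are illustrative assumptions for the sketch, not the task group's actual coefficients: a ground deposition, an interception fraction onto pasture, a cow's dry-matter intake, a milk transfer coefficient, and an effective half-life on pasture combining weathering with 131I decay.

```python
import math

# Illustrative deposition-to-milk transfer chain for 131I.
DEP = 1000.0        # deposition on pasture, Bq per m^2 (assumed)
INTERCEPT = 0.5     # fraction of deposit retained on pasture (assumed)
YIELD = 0.3         # standing pasture biomass, kg dry per m^2 (assumed)
COW_INTAKE = 10.0   # pasture eaten by a dairy cow, kg dry per day (assumed)
FM = 0.01           # milk transfer coefficient for iodine, d per L (assumed)
T_EFF_DAYS = 5.0    # effective half-life on pasture: weathering + decay (assumed)

def milk_concentration(days_after_deposition):
    """131I concentration in milk (Bq/L), simple equilibrium model."""
    pasture_bq_per_kg = DEP * INTERCEPT / YIELD
    pasture_bq_per_kg *= math.exp(
        -math.log(2) * days_after_deposition / T_EFF_DAYS
    )
    return pasture_bq_per_kg * COW_INTAKE * FM

print(f"{milk_concentration(0):.1f} Bq/L at time of deposition")
print(f"{milk_concentration(10):.1f} Bq/L ten days later")
```

The county-level assessment described in the abstract effectively repeats a chain like this for each test and each county, replacing the assumed constants with historically reconstructed deposition, pasture, and milk-distribution data.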


Subject(s)
Computer Simulation , Environmental Exposure , Food Contamination, Radioactive , Iodine Radioisotopes , Milk , Nuclear Warfare , Radioactive Fallout , Animal Feed , Animals , Humans , Meta-Analysis as Topic , Meteorological Concepts , Nevada , United States
9.
Mutat Res ; 196(2): 103-59, 1988 Sep.
Article in English | MEDLINE | ID: mdl-3047567

ABSTRACT

The estimation of the magnitude of a dose of ionizing radiation to which an individual has been exposed (or of the plausibility of an alleged exposure) from chromosomal aberration frequencies determined in peripheral blood lymphocyte cultures is a well-established methodology, having first been employed over 25 years ago. The cytogenetics working group has reviewed the accumulated data and the possible applicability of the technique to the determination of radiation doses to which American veterans might have been exposed as participants in nuclear weapons tests in the continental U.S.A. or the Pacific Atolls during the late 1940s and 1950s or as members of the Occupation Forces entering Hiroshima or Nagasaki shortly after the nuclear detonations there. The working group believes that with prompt peripheral blood sampling, external doses to individuals of the order of about 10 rad (or less if the exposure was to high-LET radiation) can accurately be detected and measured. It also believes that exposures of populations to doses of the order of maximum permissible occupational exposures can also be detected (but only in populations; not in an individual). Large exposures of populations can also be detected even several decades after their exposure, but only in the case of populations, and of large doses (of the order of 100 to several hundred rad). The working group does not believe that cytogenetic measurements can detect internal doses from fallout radionuclides in individuals unless these are very large. The working group has approached the problem of detection of small doses (less than or equal to 10 or so rad) sampled decades after the exposure of individuals by using a Bayesian statistical approach. 
Only a preliminary evaluation of this approach was possible, but it is clear that it could provide a formal statement of the likelihood that any given observation of a particular number of chromosomal aberrations in a sample of any particular number of lymphocytes actually indicates an exposure to any given dose of radiation. It is also clear that aberration frequencies (and consequently doses) would have to be quite high before much confidence could be given to either exposure or dose estimation by this method, given the approximately 3 decades of elapsed time between the exposures and any future blood sampling.(ABSTRACT TRUNCATED AT 400 WORDS)
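The Bayesian approach sketched above can be illustrated in a few lines: treat the number of aberrations observed in n scored lymphocytes as Poisson with a dose-dependent yield, and compute a posterior over a grid of doses. The linear-quadratic yield coefficients, background rate, and flat prior below are illustrative assumptions, not values from the working group's report.

```python
import math

# Illustrative Bayesian dose estimation from an aberration count.
BG = 0.001    # background aberrations per cell (assumed)
ALPHA = 0.03  # linear yield coefficient, per cell per Gy (assumed)
BETA = 0.06   # quadratic yield coefficient, per cell per Gy^2 (assumed)

def posterior(k, n, doses):
    """Posterior over doses for k aberrations in n cells, flat prior."""
    log_liks = []
    for d in doses:
        mu = n * (BG + ALPHA * d + BETA * d * d)  # expected count at dose d
        # Poisson log-likelihood, constant terms dropped
        log_liks.append(k * math.log(mu) - mu)
    m = max(log_liks)  # subtract max before exponentiating, for stability
    weights = [math.exp(ll - m) for ll in log_liks]
    z = sum(weights)
    return [w / z for w in weights]

doses = [i * 0.05 for i in range(1, 81)]  # grid from 0.05 to 4.0 Gy
p = posterior(12, 500, doses)             # 12 aberrations in 500 cells
map_dose = doses[p.index(max(p))]
print(f"MAP dose estimate: {map_dose:.2f} Gy")
```

This is the "formal statement of likelihood" the text describes: the posterior quantifies how strongly a given aberration count in a given number of cells supports any candidate dose. With decades of elapsed time, aberration-bearing lymphocytes decline, so the effective yield coefficients shrink and the posterior flattens, which is why only quite high frequencies (and doses) would be estimated with confidence.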


Subject(s)
Abnormalities, Radiation-Induced/diagnosis , Chromosome Aberrations , Environmental Exposure , Abnormalities, Radiation-Induced/genetics , Humans