Results 1 - 20 of 23
2.
Psychopharmacology (Berl) ; 239(12): 3793-3804, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36308562

ABSTRACT

RATIONALE: Kratom derives from Mitragyna speciosa (Korth.), a tropical tree in the genus Mitragyna (Rubiaceae) that also includes the coffee tree. Kratom leaf powders, tea-like decoctions, and commercial extracts are taken orally, primarily for health and well-being, by millions of people globally. Others take kratom to eliminate opioid use for analgesia and to manage opioid withdrawal and use disorder. There is debate over the possible respiratory depressant overdose risk of the primary active alkaloid, mitragynine, a partial µ-opioid receptor agonist that does not signal through β-arrestin, the primary opioid respiratory depressant pathway. OBJECTIVES: Compare the respiratory effects of oral mitragynine to those of oral oxycodone in rats, using the study design previously published by US Food and Drug Administration (FDA) scientists for evaluating the respiratory effects of opioids (Xu et al., Toxicol Rep 7:188-197, 2020). METHODS: Blood gases, observable signs, and mitragynine pharmacokinetics were assessed for 12 h after 20, 40, 80, 240, and 400 mg/kg oral mitragynine isolate and 6.75, 60, and 150 mg/kg oral oxycodone hydrochloride. FINDINGS: Oxycodone administration produced significant dose-related respiratory depressant effects and pronounced sedation, with one death each at 60 and 150 mg/kg. Mitragynine did not yield significant dose-related respiratory depressant or life-threatening effects. Sedative-like effects, milder than those produced by oxycodone, were evident at the highest mitragynine dose. Maximum oxycodone and mitragynine plasma concentrations were dose related. CONCLUSIONS: Consistent with mitragynine's pharmacology, which includes partial µ-opioid receptor agonism with little recruitment of the β-arrestin pathway that mediates opioid respiratory depression, mitragynine produced no evidence of respiratory depression at doses many times higher than those known to be taken by humans.


Subject(s)
Mitragyna , Plant Extracts , Secologanin Tryptamine Alkaloids , Animals , Rats , Analgesics, Opioid/pharmacology , Mitragyna/chemistry , Oxycodone/pharmacology , Plant Extracts/pharmacology , Receptors, Opioid , Secologanin Tryptamine Alkaloids/pharmacology
3.
Exp Biol Med (Maywood) ; 247(1): 1-75, 2022 01.
Article in English | MEDLINE | ID: mdl-34783606

ABSTRACT

There is an evolving and increasing need to utilize emerging cellular, molecular, and in silico technologies and novel approaches for the safety assessment of food, drugs, and personal care products. The convergence of these emerging technologies is also enabling rapid advances and approaches that may affect regulatory decisions and approvals. Although emerging technologies may allow rapid advances in regulatory decision making, there is concern that they have not been thoroughly evaluated to determine whether they are ready for regulatory application, singly or in combination. The pace of these combined technical advances may outstrip the ability to assess their fitness for purpose and to permit routine application of the new methods for regulatory purposes. Strategies are therefore needed to evaluate the new technologies and determine which are ready for regulatory use. Applying these potentially faster, more accurate, and more cost-effective approaches remains an important goal, but without a clear strategy for evaluating emerging technologies rapidly and appropriately, their value may go unrecognized or their adoption may be delayed. It is important for the regulatory science field to keep up with research in these technically advanced areas and to understand the science behind the new approaches. Regulators must understand the critical quality attributes of these novel approaches and learn from one another's experience so that workforces can be trained for emerging global regulatory challenges. Moreover, it is essential that the regulatory community work with technology developers to harness their collective capabilities toward a strategy for evaluating these new assessment tools.


Subject(s)
Biomedical Research , Computer Simulation , Humans
4.
Regul Toxicol Pharmacol ; 122: 104897, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33639256

ABSTRACT

Benzoic acid (BA) was administered in the diet to male and female Sprague Dawley Crl:CD(SD) rats in an OECD Test Guideline 443 Extended One-Generation Reproductive Toxicity (EOGRT) study to test for effects that may occur as a result of pre- and postnatal exposure. The study included cohorts of F1 offspring to evaluate potential effects of benzoic acid on reproduction, the developing immune system, and the developing nervous system, including learning and memory assessments. Benzoic acid was incorporated in the diet at concentrations of 0, 7,500, 11,500, and 15,000 mg/kg diet (ppm). These concentrations were selected on the basis of preliminary studies and, given average food consumption, were intended to supply BA doses of approximately 0, 500, 750, and 1000 mg/kg bw/day. To avoid exceeding these target dose levels, the dietary concentrations were adjusted (based on historical control body weight and food consumption data) to maintain the target mg/kg bw/day doses during those life periods when food intake per unit of body weight increases, namely to support milk production by females (gestation and lactation) and rapid pup growth (post-weaning). In the parental (F0) generation, survival, clinical observations, organ weights, pathology, hematology, serum chemistry, urinalysis, and bile acids were unaffected by BA administration, as were reproductive parameters. In the F1 generation, survival, growth and developmental landmarks, organ weights, pathology, immunotoxicity assessments, and neurotoxicity and neurobehavioral parameters such as auditory startle response, locomotor activity, and learning and memory were unaffected by BA administration, as were clinical pathology (hematology, serum chemistry, urinalysis, bile acids, and thyroid hormones) and reproductive performance. Similarly, no adverse effects or systemic toxicity were observed in the F2 generation. Overall, the highest dietary concentration (15,000 ppm), providing approximately 1000 mg/kg bw/day, was the NOAEL for benzoic acid in this EOGRT study.
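As a rough illustration of the diet-to-dose arithmetic summarized above, the sketch below (Python) converts a dietary concentration in mg/kg diet (ppm) to a dose in mg/kg bw/day and back-calculates the concentration needed to hold a target dose when food intake per unit body weight rises. The body weight and food consumption figures are hypothetical placeholders, not data from this study.

```python
# Standard feeding-study conversion:
#   dose (mg/kg bw/day) = concentration (mg/kg diet) * food (kg/day) / bw (kg)
# All animal weights and intakes below are hypothetical placeholders.

def dose_from_diet(ppm: float, food_g_per_day: float, bw_g: float) -> float:
    """Convert a dietary concentration (mg/kg diet, i.e., ppm) to a dose
    in mg/kg body weight per day."""
    return ppm * (food_g_per_day / 1000.0) / (bw_g / 1000.0)

def ppm_for_target_dose(target_dose: float, food_g_per_day: float,
                        bw_g: float) -> float:
    """Back-calculate the dietary concentration needed to hit a target dose,
    as done when food intake rises during gestation, lactation, and
    post-weaning growth."""
    return target_dose * (bw_g / 1000.0) / (food_g_per_day / 1000.0)

# A hypothetical adult rat eating ~20 g/day at ~300 g body weight:
print(dose_from_diet(15_000, 20, 300))     # 1000.0 mg/kg bw/day
# If intake per unit body weight triples (e.g., lactation), the dietary
# concentration must fall to hold the dose at 1000 mg/kg bw/day:
print(ppm_for_target_dose(1000, 60, 300))  # 5000.0 ppm
```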


Subject(s)
Benzoic Acid/pharmacology , Food Preservatives/pharmacology , Genitalia/drug effects , Animals , Body Weight , Dose-Response Relationship, Drug , Female , Male , No-Observed-Adverse-Effect Level , Organ Size/drug effects , Rats , Rats, Sprague-Dawley
5.
Risk Anal ; 40(S1): 2218-2230, 2020 11.
Article in English | MEDLINE | ID: mdl-33135225

ABSTRACT

Before the founding of the Society for Risk Analysis (SRA) in 1980, food safety in the United States had long been a concern, but systematic methods to assess food-related risks were lacking. In 1906, the U.S. Congress passed, and President Roosevelt signed, the Pure Food and Drug Act and the Meat Inspection Act to regulate food safety at the federal level. These laws followed the publication of multiple reports of food contamination, culminating in Upton Sinclair's novel The Jungle, which highlighted food-safety and worker abuses in the meatpacking industry. Later in the 20th century, important developments in agricultural and food technology greatly increased food production, but concern over chemical exposures from agricultural and other practices led to major amendments to federal food laws, including the Delaney Clause, aimed specifically at cancer-causing chemicals. Subsequently, when quantitative risk assessment methods were given greater scientific status in a seminal National Research Council report, food safety risk assessment became more systematized. Over the last 40 years, food safety research has increased understanding of a range of health effects from foodborne chemicals, and technological developments have improved U.S. food safety from farm to fork by offering new ways to manage risks. We discuss the history of food safety and the role risk analysis has played in its evolution, starting from over a century ago but focusing on the last 40 years. While we focus on chemical risk assessment in the United States, we also discuss microbial risk assessment and international food safety.


Subject(s)
Food Safety , Risk Assessment/history , Carcinogens/analysis , Food Contamination/analysis , History, 20th Century , United States , United States Food and Drug Administration
6.
Am J Clin Nutr ; 112(5): 1390-1403, 2020 11 11.
Article in English | MEDLINE | ID: mdl-33022704

ABSTRACT

Folate, an essential nutrient found naturally in foods in a reduced form, is present in dietary supplements and fortified foods in an oxidized synthetic form (folic acid). There is widespread agreement that maintaining adequate folate status is critical to prevent diseases due to folate inadequacy (e.g., anemia, birth defects, and cancer). However, there are concerns about potential adverse effects of excess folic acid intake and/or elevated folate status, with the original concern focused on exacerbation of the clinical effects of vitamin B-12 deficiency and its role in neurocognitive health. More recently, animal and observational studies have suggested potential adverse effects on cancer risk, birth outcomes, and other diseases. Observations indicating adverse effects from excess folic acid intake, elevated folate status, and unmetabolized folic acid (UMFA) remain inconclusive; the data do not provide the evidence needed to affect public health recommendations. Moreover, strong biological and mechanistic premises connecting elevated folic acid intake, UMFA, and/or high folate status to adverse health outcomes are lacking. However, the body of evidence on potential adverse health outcomes indicates the need for comprehensive research to clarify these issues and bridge knowledge gaps. Three key research questions encompass the additional research needed to establish whether high folic acid or total folate intake contributes to disease risk: 1) Does UMFA affect biological pathways leading to adverse health effects? 2) Does elevated folate status resulting from any form of folate intake affect vitamin B-12 function and its roles in sustaining health? 3) Does elevated folate intake, regardless of form, affect biological pathways leading to adverse health effects other than those linked to vitamin B-12 function? This article summarizes the proceedings of an August 2019 NIH expert workshop focused on addressing these research areas.


Subject(s)
Folic Acid/administration & dosage , Adolescent , Adult , Child , Child, Preschool , Dietary Supplements , Dose-Response Relationship, Drug , Humans , Middle Aged , United States
7.
Dose Response ; 17(1): 1559325818824934, 2019.
Article in English | MEDLINE | ID: mdl-30783394

ABSTRACT

By the 1970s, federal regulatory agencies had been charged with enforcing a host of new laws requiring them to establish the controls on human chemical exposures necessary to protect health. The agencies relied upon a methodology introduced in the 1950s to identify safe levels of exposure to chemicals known to display toxicity. During the two decades prior to the 1970s, federal authorities had come to treat carcinogens as distinct from other toxic agents and to regard them as unsafe at any level of exposure, yet no systematic methods had been developed to deal with the rapidly increasing number of carcinogens. Beginning in the mid-1970s, some scientists and policy makers in regulatory agencies, including the present author, proposed adopting emerging quantitative methods to evaluate the risks of carcinogens and introduced new notions of safety based on explicit consideration of risk. Quantitative risk assessment rose to prominence in the decade reviewed in this article (1974-1984) and began to replace unsystematic approaches that provided no view of how well health would be protected under various regulatory controls. This article offers the author's recollections of that important decade.

8.
Regul Toxicol Pharmacol ; 92: 472-490, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29158043

ABSTRACT

Shortly after the International Agency for Research on Cancer (IARC) determined that formaldehyde causes leukemia, the United States Environmental Protection Agency (EPA) released its Draft IRIS Toxicological Review of Formaldehyde ("Draft IRIS Assessment"), also concluding that formaldehyde causes leukemia. Peer review of the Draft IRIS Assessment by a National Academy of Sciences committee noted that "causal determinations are not supported by the narrative provided in the draft" (NRC 2011). The committee offered recommendations for improving the Draft IRIS Assessment and identified several important research gaps. In the six years since the NRC peer review, significant new science has been published. We identify and summarize key recommendations made by the NRC and map them to this new science, including extended analyses of epidemiological studies, updates of earlier occupational cohort studies, toxicological experiments using a sensitive mouse strain, mechanistic studies examining the role of exogenous versus endogenous formaldehyde in bone marrow, and several critical reviews. With few exceptions, the new findings are consistently negative, and integration of all available evidence challenges the earlier conclusions that formaldehyde causes leukemia. Given formaldehyde's commercial importance, environmental ubiquity, and endogenous production, accurate hazard classification and careful evaluation of whether formaldehyde exposure from occupational, residential, and consumer products causes leukemia are critical.


Subject(s)
Formaldehyde/toxicity , Leukemia/chemically induced , Leukemia/etiology , Animals , Bone Marrow/drug effects , Humans , Occupational Exposure/adverse effects , Risk Assessment , United States , United States Environmental Protection Agency
9.
Regul Toxicol Pharmacol ; 89: 165-185, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28756014

ABSTRACT

This report evaluates the scientific literature on caffeine with respect to potential cardiovascular outcomes, specifically relative risks of total cardiovascular disease (CVD), coronary heart disease (CHD), and acute myocardial infarction (AMI); effects on arrhythmia, heart failure, sudden cardiac arrest, stroke, blood pressure, and hypertension; and other biomarkers of effect, including heart rate, cerebral blood flow, cardiac output, plasma homocysteine levels, serum cholesterol levels, electrocardiogram (EKG) parameters, heart rate variability, endothelial/platelet function, and plasma/urine catecholamine levels. Caffeine intake has been associated with a range of reversible and transient physiological effects broadly, and cardiovascular effects specifically. This report attempts to delineate the caffeine intake levels at which cardiovascular effects appear among various subpopulations. The available literature suggests that the cardiovascular effects experienced by caffeine consumers at levels up to 600 mg/day are in most cases mild, transient, and reversible, with no lasting adverse effect. The point at which caffeine intake may cause harm to the cardiovascular system is not readily identifiable, in part because data on the effects of daily intakes greater than 600 mg are limited. However, the evidence considered within this review suggests that typical moderate caffeine intake is not associated with increased risks of total cardiovascular disease; arrhythmia; heart failure; blood pressure changes among regular coffee drinkers; or hypertension in baseline populations.


Subject(s)
Caffeine/pharmacology , Cardiovascular Diseases/chemically induced , Cardiovascular System/drug effects , Central Nervous System Stimulants/pharmacology , Hemodynamics/physiology , Caffeine/adverse effects , Central Nervous System Stimulants/adverse effects , Coffee , Hemodynamics/drug effects , Humans
10.
Am J Clin Nutr ; 105(1): 249S-285S, 2017 01.
Article in English | MEDLINE | ID: mdl-27927637

ABSTRACT

Dietary Reference Intakes (DRIs) are used in Canada and the United States in planning and assessing diets of apparently healthy individuals and population groups. The approaches used to establish DRIs on the basis of classical nutrient deficiencies and/or toxicities have worked well. However, it has proved to be more challenging to base DRI values on chronic disease endpoints; deviations from the traditional framework were often required, and in some cases, DRI values were not established for intakes that affected chronic disease outcomes despite evidence that supported a relation. The increasing proportions of elderly citizens, the growing prevalence of chronic diseases, and the persistently high prevalence of overweight and obesity, which predispose to chronic disease, highlight the importance of understanding the impact of nutrition on chronic disease prevention and control. A multidisciplinary working group sponsored by the Canadian and US government DRI steering committees met from November 2014 to April 2016 to identify options for addressing key scientific challenges encountered in the use of chronic disease endpoints to establish reference values. The working group focused on 3 key questions: 1) What are the important evidentiary challenges for selecting and using chronic disease endpoints in future DRI reviews, 2) what intake-response models can future DRI committees consider when using chronic disease endpoints, and 3) what are the arguments for and against continuing to include chronic disease endpoints in future DRI reviews? This report outlines the range of options identified by the working group for answering these key questions, as well as the strengths and weaknesses of each option.


Subject(s)
Chronic Disease , Diet , Nutrition Assessment , Nutritional Requirements , Nutritional Status , Recommended Dietary Allowances , Aged , Canada , Chronic Disease/prevention & control , Humans , Obesity/complications , Reference Values , United States
11.
Regul Toxicol Pharmacol ; 74: 81-92, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26702789

ABSTRACT

This report evaluates the scientific literature on caffeine with respect to potential central nervous system (CNS) effects, specifically effects on sleep, anxiety, and aggression/risk-taking. Caffeine has been the subject of more scientific safety studies than any other food ingredient. It is important, therefore, to evaluate new studies in the context of this large existing body of knowledge. The safety of caffeine is best described in narrative form and is not usefully expressed as a "bright line" numerical value such as an "acceptable daily intake" (ADI). Caffeine intake has been associated with a range of reversible physiological effects, in a few studies at levels of less than 100 mg in sensitive individuals. It is also clear that many people can tolerate much greater intakes - perhaps up to 600-800 mg/day or more - without experiencing such effects. The reasons for this variability in response are described in this report. Based on all the available evidence, there is no reason to believe that experiencing such effects from caffeine intake has any significant or lasting effect on health. The point at which caffeine intake may cause harm to the CNS is not readily identifiable, in part because data on the effects of daily intakes greater than 600 mg are limited. Effects of caffeine on risk-taking and aggressive behavior in young people have received considerable publicity, yet these are the most difficult effects to study because of ethical concerns and limitations in the ability to design appropriate studies. At present, the weight of available evidence does not support these concerns, but this should not preclude ongoing careful monitoring of the scientific literature.


Subject(s)
Caffeine/adverse effects , Central Nervous System Stimulants/adverse effects , Central Nervous System/drug effects , Neurotoxicity Syndromes/etiology , Aggression/drug effects , Animals , Anxiety/chemically induced , Anxiety/physiopathology , Anxiety/psychology , Caffeine/pharmacokinetics , Central Nervous System/physiopathology , Central Nervous System Stimulants/pharmacokinetics , Dose-Response Relationship, Drug , Humans , Neurotoxicity Syndromes/diagnosis , Neurotoxicity Syndromes/physiopathology , Neurotoxicity Syndromes/psychology , Recommended Dietary Allowances , Risk Assessment , Risk-Taking , Sleep/drug effects , Sleep Wake Disorders/chemically induced , Sleep Wake Disorders/physiopathology , Sleep Wake Disorders/psychology , Substance Withdrawal Syndrome/etiology , Substance Withdrawal Syndrome/physiopathology , Substance Withdrawal Syndrome/psychology
12.
Crit Rev Toxicol ; 43(8): 661-70, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23902349

ABSTRACT

A recent study (Zhang et al., 2010) reported results, attributed to aneuploidy in circulating stem cells, that have been characterized as providing potential support for proposed mechanisms by which formaldehyde could affect bone marrow. A critical review of the study, as well as a reanalysis of the underlying data, was performed, and the results of this reanalysis suggested that factors other than formaldehyde exposure may have contributed to the effects reported. In addition, although the authors stated in their paper that "all scorable metaphase spreads on each slide were analyzed, and a minimum of 150 cells per subject was scored," this protocol was not followed. In fact, the protocol for evaluating the presence of monosomy 7 or trisomy 8 was followed for three or fewer samples from exposed workers and six or fewer samples from non-exposed workers. Moreover, the assays used (CFU-GM) do not actually measure the proposed events in the primitive cells involved in the development of acute myeloid leukemia. Evaluation of these data indicates that the aneuploidy measured could not have arisen in vivo but rather arose during in vitro culture. The results of our critical review and reanalysis of the data, in combination with recent toxicological and mechanistic studies, do not support a mechanism for a causal association between formaldehyde exposure and myeloid or lymphoid malignancies.


Subject(s)
Formaldehyde/toxicity , Leukemia, Myeloid, Acute/pathology , Occupational Exposure/analysis , Animals , Carcinogens/toxicity , Chromosome Deletion , Chromosomes, Human, Pair 7/drug effects , Chromosomes, Human, Pair 7/genetics , Chromosomes, Human, Pair 8/drug effects , Chromosomes, Human, Pair 8/genetics , DNA Damage/drug effects , Disease Models, Animal , Humans , Leukemia, Myeloid, Acute/etiology , Leukemia, Myeloid, Acute/genetics , Stem Cells/drug effects , Stem Cells/pathology , Trisomy/genetics
15.
Toxicol Sci ; 131(1): 1-8, 2013 Jan.
Article in English | MEDLINE | ID: mdl-22874419

ABSTRACT

In 2009, the National Research Council (NRC) released the latest in a series of advisory reports on human health risk assessment, titled Science and Decisions: Advancing Risk Assessment. This wide-ranging report made a number of recommendations related to risk assessment practice at the U.S. Environmental Protection Agency that could both influence and be influenced by evolving toxicological practice. In particular, Science and Decisions emphasized the scientific and operational necessity of a new approach for dose-response modeling; addressed the recurring challenge of defaults in risk assessment and the question of when research results can be used in place of defaults; and reinforced the value of cumulative risk assessment, which would require enhanced understanding of the joint influence of chemical and nonchemical stressors on health outcomes. The objective of this article is to summarize key messages from Science and Decisions, both as a stand-alone report and in comparison with another recent NRC report, Toxicity Testing in the 21st Century: A Vision and a Strategy. Although these reports have many conclusions in common and reinforce similar themes, there are important differences that merit careful consideration, such as the move away from apical endpoints in Toxicity Testing and the emphasis on benefit-cost analyses and related decision tools in Science and Decisions that would be strengthened by quantification of apical endpoints. Moving risk assessment forward will require toxicologists to wrestle with the implications of Science and Decisions from a toxicological perspective.


Subject(s)
Decision Making , Risk Assessment/methods , Toxicity Tests/methods , Toxicology/methods , Dose-Response Relationship, Drug , Models, Theoretical , Risk Assessment/statistics & numerical data , Risk Assessment/trends , Toxicity Tests/statistics & numerical data , Toxicity Tests/trends , Toxicology/statistics & numerical data , Toxicology/trends , United States , United States Environmental Protection Agency
16.
Open Epidemiol J ; 4: 3-29, 2011.
Article in English | MEDLINE | ID: mdl-31341519

ABSTRACT

The field of environmental public health is at an important crossroads. Our current biomonitoring efforts document widespread exposure to a host of chemicals for which toxicity information is lacking. At the same time, advances in the fields of genomics, proteomics, metabolomics, genetics, and epigenetics are yielding volumes of data at a rapid pace. Our ability to detect chemicals in biological and environmental media has far outpaced our ability to interpret their health relevance, and as a result, the environmental risk paradigm, in its current state, is antiquated and ill-equipped to make the best use of these new data. In light of new scientific developments and the pressing need to characterize the public health burdens of chemicals, it is imperative to reinvigorate the use of environmental epidemiology in chemical risk assessment. Two case studies of chemical assessments from the Environmental Protection Agency Integrated Risk Information System database are presented to illustrate opportunities where epidemiologic data could have been used in place of experimental animal data in dose-response assessment, or where different approaches, techniques, or studies could have been employed to better utilize existing epidemiologic evidence. Based on the case studies and what can be learned from recent scientific advances and improved approaches to utilizing human data for dose-response estimation, recommendations are provided for the disciplines of epidemiology and risk assessment for enhancing the role of epidemiologic data in hazard identification and dose-response assessment.

17.
Risk Anal ; 30(7): 1028-36, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20497395

ABSTRACT

At the request of the U.S. Environmental Protection Agency (EPA), the National Research Council (NRC) recently completed a major report, Science and Decisions: Advancing Risk Assessment, that is intended to strengthen the scientific basis, credibility, and effectiveness of risk assessment practices and subsequent risk management decisions. The report describes the challenges faced by risk assessment and the need to consider improvements in both the technical analyses of risk assessments (i.e., the development and use of scientific information to improve risk characterization) and the utility of risk assessments (i.e., making assessments more relevant and useful for risk management decisions). The report tackles a number of topics relating to improvements in the process, including the design and framing of risk assessments, uncertainty and variability characterization, selection and use of defaults, unification of cancer and noncancer dose-response assessment, cumulative risk assessment, and the need to increase EPA's capacity to address these improvements. This article describes and summarizes the NRC report, with an eye toward its implications for risk assessment practices at EPA.


Subject(s)
Risk Assessment/methods , Dose-Response Relationship, Drug , Humans , No-Observed-Adverse-Effect Level , Risk Assessment/statistics & numerical data , Risk Assessment/trends , Risk Management , United States , United States Environmental Protection Agency
18.
Crit Rev Toxicol ; 40(5): 422-84, 2010 May.
Article in English | MEDLINE | ID: mdl-20377306

ABSTRACT

Triclosan (2,4,4'-trichloro-2'-hydroxy-diphenyl ether) is an antibacterial compound that has been used in consumer products for about 40 years. The tolerability and safety of triclosan have been evaluated in human volunteers, with little indication of toxicity or sensitization. Although information from chronic human use of personal care products is not available, triclosan has been extensively studied in laboratory animals. In chronic oncogenicity studies in mice, rats, and hamsters, treatment-related tumors were found only in the liver of male and female mice. Application of the Human Relevance Framework suggested that these tumors arose by way of peroxisome proliferator-activated receptor alpha (PPARalpha) activation, a mode of action not considered relevant to humans. Consequently, a benchmark dose (BMDL10) of 47 mg/kg/day was developed based on kidney toxicity in the hamster. Estimates of intake from the use of representative personal care products by men, women, and children were derived in two ways: (1) using known or assumed triclosan levels in various consumer products and assumed usage patterns (product-based estimates); and (2) using upper-bound measured urinary triclosan levels from human volunteers, based on data from the Centers for Disease Control and Prevention (biomonitoring-based estimates). For the product-based estimates, the margins of safety (MOS) for the combined intake estimates from the use of all triclosan-containing products considered were approximately 1000, 730, and 630 for men, women, and children, respectively. The MOS values calculated from the biomonitoring-based intake estimates were 5200, 6700, and 11,750 for men, women, and children, respectively. Based on these results, exposure to triclosan in consumer products is not expected to cause adverse health effects in children or adults who use these products as intended.
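As a minimal sketch of the margin-of-safety arithmetic described above (Python): the BMDL10 of 47 mg/kg/day is taken from the abstract, while the intake values are illustrative back-calculations from the MOS figures quoted for men, not numbers reported in the paper.

```python
# Margin of safety: MOS = point of departure / estimated daily intake,
# both in mg/kg bw/day. Intake values below are illustrative,
# back-calculated from the abstract's MOS figures for men; they are
# not reported data.

BMDL10 = 47.0  # mg/kg/day, kidney toxicity in the hamster (from the abstract)

def margin_of_safety(pod: float, estimated_intake: float) -> float:
    """Divide a point of departure (e.g., a BMDL10) by the estimated daily
    intake; larger values indicate a wider safety margin."""
    return pod / estimated_intake

intakes = {
    "product-based": 0.047,        # -> MOS ~ 1000
    "biomonitoring-based": 0.009,  # -> MOS ~ 5200
}
for label, intake in intakes.items():
    print(f"{label}: MOS ~ {margin_of_safety(BMDL10, intake):,.0f}")
```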


Subject(s)
Benchmarking , Cosmetics/toxicity , Triclosan/therapeutic use , Adult , Animals , Child , Clinical Trials as Topic , Consumer Product Safety , Cricetinae , Environmental Health , Female , Humans , Male , Mice , PPAR alpha , Rats , Soaps
19.
Crit Rev Food Sci Nutr ; 49(8): 708-17, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19690996

ABSTRACT

The methodology used to establish tolerable upper intake levels (ULs) for nutrients borrows heavily from the risk assessment methods used by toxicologists. Empirical data are used to identify intake levels associated with adverse effects, and uncertainty factors (UFs) are applied to establish ULs, which in turn inform public health decisions and standards. The use of UFs reflects a lack of knowledge about the biological events that underlie the response to intake of a given nutrient, and about the sources of variability in that response. In this paper, the Key Events Dose-Response Framework (KEDRF) is used to systematically consider the major biological steps that lead from intake of preformed vitamin A to excess systemic levels, and subsequently to increased risk of adverse effects. Each step is examined with regard to the factors that influence whether there is progression toward the adverse effect of concern. The role of homeostatic mechanisms is discussed, along with the types of research needed to improve understanding of the dose-response for vitamin A. This initial analysis illustrates the potential of the KEDRF as a useful analytical tool for integrating current knowledge about dose-response, generating questions that will focus future research efforts, and clarifying how improved knowledge and data could reduce reliance on UFs.
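As a hedged illustration of the uncertainty-factor methodology the abstract describes, the sketch below (Python) derives a UL by dividing a point of departure by the product of uncertainty factors; every numeric value is a hypothetical placeholder, not a vitamin A reference value.

```python
# UL derivation as described above: an intake level associated with adverse
# effects (e.g., a NOAEL or LOAEL) divided by the product of uncertainty
# factors (UFs). All numbers are hypothetical placeholders.
from math import prod

def derive_ul(point_of_departure: float, ufs: list[float]) -> float:
    """UL = point of departure / product of UFs, in the same units as the
    point of departure (e.g., ug/day)."""
    return point_of_departure / prod(ufs)

# A hypothetical NOAEL of 3000 ug/day with UFs of 10 (interindividual
# variability) and 1.5 (database limitations):
print(derive_ul(3000.0, [10.0, 1.5]))  # 200.0 ug/day
```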


Subject(s)
Vitamin A Deficiency/metabolism , Vitamin A/administration & dosage , Vitamin A/adverse effects , Algorithms , Drug Overdose , Homeostasis , Humans , Intestinal Mucosa/metabolism , Liver/metabolism , Vitamin A/metabolism
20.
Int J Toxicol ; 26(1): 3-12, 2007.
Article in English | MEDLINE | ID: mdl-17365141

ABSTRACT

Quantitative approaches to evaluating the risks of chemical toxicity entered the lives of toxicologists in the mid-1970s, and the continuing interaction of toxicology and risk assessment has benefited both disciplines. I will summarize the origins of that interaction, the reasons for it, and the difficult course it has followed. In doing so, I will set the stage for a discussion of how the type of thinking that informs risk-based decision making benefits the continuing development of the science of toxicology. There will continue to be societal pressure for the development of reliable knowledge about the public health importance of the enormous variety of chemical exposures we all incur, from conception to death. Risk assessment is the framework used to organize and convey that knowledge, and toxicology is the principal discipline used to give scientific substance to that framework. Social acceptance of every manifestation of the modern chemical age requires high assurance that public health is not threatened, and that assurance depends upon continued improvement of these two mutually dependent disciplines.


Subject(s)
Awards and Prizes , Toxicology/history , History, 20th Century , History, 21st Century , Risk Assessment , Societies, Scientific