Results 1 - 13 of 13
1.
Crit Rev Toxicol ; 54(4): 252-289, 2024 04.
Article in English | MEDLINE | ID: mdl-38753561

ABSTRACT

INTRODUCTION: Causal epidemiology for regulatory risk analysis seeks to evaluate how removing or reducing exposures would change disease occurrence rates. We define interventional probability of causation (IPoC) as the change in probability of a disease (or other harm) occurring over a lifetime or other specified time interval that would be caused by a specified change in exposure, as predicted by a fully specified causal model. We define the closely related concept of causal assigned share (CAS) as the predicted fraction of disease risk that would be removed or prevented by a specified reduction in exposure, holding other variables fixed. Traditional approaches used to evaluate the preventable risk implications of epidemiological associations, including population attributable fraction (PAF) and the Bradford Hill considerations, cannot reveal whether removing a risk factor would reduce disease incidence. We argue that modern formal causal models, coupled with causal artificial intelligence (CAI) and realistically partial and imperfect knowledge of underlying disease mechanisms, show great promise for determining and quantifying IPoC and CAS for exposures and diseases of practical interest. METHODS: We briefly review key CAI concepts and terms and then apply them to define IPoC and CAS. We present steps to quantify IPoC using a fully specified causal Bayesian network (BN) model. Useful bounds for quantitative IPoC and CAS calculations are derived for a two-stage clonal expansion (TSCE) model for carcinogenesis and illustrated by applying them to benzene and formaldehyde based on available epidemiological and partial mechanistic evidence. RESULTS: Causal BN models for benzene and risk of acute myeloid leukemia (AML) incorporating mechanistic, toxicological and epidemiological findings show that prolonged high-intensity exposure to benzene can increase risk of AML (IPoC of up to 7 x 10^-5, CAS of up to 54%). 
By contrast, no causal pathway leading from formaldehyde exposure to increased risk of AML was identified, consistent with much previous mechanistic, toxicological and epidemiological evidence; therefore, the IPoC and CAS for formaldehyde-induced AML are likely to be zero. CONCLUSION: We conclude that the IPoC approach can differentiate between likely and unlikely causal factors and can provide useful upper bounds for IPoC and CAS for some exposures and diseases of practical importance. For causal factors, IPoC can help to estimate the quantitative impacts on health risks of reducing exposures, even in situations where mechanistic evidence is realistically incomplete and individual-level exposure-response parameters are uncertain. This illustrates the strength that can be gained for causal inference by using causal models to generate testable hypotheses and then obtaining toxicological data to test the hypotheses implied by the models-and, where necessary, refine the models. This virtuous cycle provides additional insight into causal determinations that may not be available from weight-of-evidence considerations alone.
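The IPoC and CAS concepts summarized above lend themselves to a small worked illustration. The sketch below is not the paper's benzene model: the two-link network (exposure to internal dose to disease) and every probability in it are hypothetical numbers, chosen only to show how interventional probabilities from a fully specified causal Bayesian network yield IPoC and CAS.

```python
# Toy causal Bayesian network: Exposure -> InternalDose -> Disease.
# All probabilities are hypothetical illustrations, not values from the paper.

p_high_dose = {"none": 0.01, "high": 0.60}            # P(high internal dose | exposure)
p_disease_given_dose = {"low": 0.001, "high": 0.005}  # lifetime P(disease | dose)

def p_disease(do_exposure):
    """Interventional P(disease) when exposure is set by intervention."""
    p_hi = p_high_dose[do_exposure]
    return (p_hi * p_disease_given_dose["high"]
            + (1 - p_hi) * p_disease_given_dose["low"])

ipoc = p_disease("high") - p_disease("none")  # added lifetime risk caused by exposure
cas = ipoc / p_disease("high")                # fraction of an exposed person's risk removable
print(f"IPoC = {ipoc:.2e}, CAS = {cas:.0%}")
```

Because the network is fully specified, both quantities fall out of two interventional queries; with partial knowledge, the paper's approach instead derives bounds on them.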


Subjects
Benzene; Formaldehyde; Leukemia, Myeloid, Acute; Humans; Benzene/toxicity; Leukemia, Myeloid, Acute/epidemiology; Leukemia, Myeloid, Acute/chemically induced; Formaldehyde/toxicity; Causality; Probability; Risk Assessment; Environmental Exposure; Risk Factors
2.
Crit Rev Toxicol ; 51(2): 95-116, 2021 02.
Article in English | MEDLINE | ID: mdl-33853483

ABSTRACT

Are dose-response relationships for benzene and health effects such as myelodysplastic syndrome (MDS) and acute myeloid leukemia (AML) supra-linear, with disproportionately high risks at low concentrations, e.g. below 1 ppm? To investigate this hypothesis, we apply recent mode of action (MoA) and mechanistic information and modern data science techniques to quantify air benzene-urinary metabolite relationships in a previously studied data set for Tianjin, China factory workers. We find that physiologically based pharmacokinetics (PBPK) models and data for Tianjin workers show approximately linear production of benzene metabolites for air benzene (AB) concentrations below about 15 ppm, with modest sublinearity at low concentrations (e.g. below 5 ppm). Analysis of the Tianjin worker data using partial dependence plots reveals that production of metabolites increases disproportionately with increases in air benzene (AB) concentrations above 10 ppm, exhibiting steep sublinearity (J shape) before becoming saturated. As a consequence, estimated cumulative exposure is not an adequate basis for predicting risk. Risk assessments must consider the variability of exposure concentrations around estimated exposure concentrations to avoid over-estimating risks at low concentrations. The same average concentration for a specified duration is disproportionately risky if it has higher variance. Conversely, if chronic inflammation via activation of inflammasomes is a critical event for induction of MDS and other health effects, then sufficiently low concentrations of benzene are predicted not to cause increased risks of inflammasome-mediated diseases, no matter how long the duration of exposure. Thus, we find no evidence that the dose-response relationship is supra-linear at low doses; instead sublinear or zero excess risk at low concentrations is more consistent with the data. 
A combination of physiologically based pharmacokinetic (PBPK) modeling, Bayesian network (BN) analysis and inference, and partial dependence plots appears a promising and practical approach for applying current data science methods to advance benzene risk assessment.
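The partial dependence idea used in this analysis is easy to reproduce on toy data. In the sketch below, the "fitted" metabolite model `f`, its saturation constant, and the worker covariates are illustrative stand-ins, not values from the Tianjin study; the point is only the mechanics of averaging model output over the observed distribution of the other covariates.

```python
# Minimal partial dependence sketch for a hypothetical fitted model f(ab, age).

def f(ab_ppm, age):
    # Toy "fitted" model: metabolite output saturating in air benzene, weak age effect.
    return 100 * ab_ppm / (1 + 0.05 * ab_ppm) + 0.1 * age

workers = [(1, 25), (3, 40), (8, 35), (15, 50)]  # (air benzene ppm, age), hypothetical

def partial_dependence(ab_ppm):
    # Hold air benzene fixed; average the model over the observed other covariates.
    return sum(f(ab_ppm, age) for _, age in workers) / len(workers)

curve = {ab: round(partial_dependence(ab), 2) for ab in (1, 5, 10, 15)}
print(curve)
```

Plotting `curve` against concentration is the partial dependence plot; flattening at high concentrations is how saturation shows up in such a plot.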


Subjects
Benzene/toxicity; Environmental Exposure/statistics & numerical data; Environmental Pollutants/toxicity; Bayes Theorem; China; Dose-Response Relationship, Drug; Humans; Leukemia, Myeloid, Acute; Myelodysplastic Syndromes; Risk Assessment
3.
N Engl J Med ; 381(21): 2074, 2019 11 21.
Article in English | MEDLINE | ID: mdl-31747736
4.
Chem Biol Interact ; 278: 242-255, 2017 Dec 25.
Article in English | MEDLINE | ID: mdl-28882553

ABSTRACT

Two apparently contradictory findings in the literature on low-dose human metabolism of benzene are as follows. First, metabolism is approximately linear at low concentrations, e.g., below 10 ppm. This is consistent with decades of quantitative modeling of benzene pharmacokinetics and dose-dependent metabolism. Second, measured benzene exposure and metabolite concentrations for occupationally exposed benzene workers in Tianjin, China show that dose-specific metabolism (DSM) ratios of metabolite concentrations per ppm of benzene in air decrease steadily with benzene concentration, with the steepest decreases below 3 ppm. This has been interpreted as indicating that metabolism at low concentrations of benzene is highly nonlinear. We reexamine the data using non-parametric methods. Our main conclusion is that both findings are correct; they are not contradictory. Low-concentration metabolism can be linear, with metabolite concentrations proportional to benzene concentrations in air, and yet DSM ratios can still decrease with benzene concentrations. This is because a ratio of random variables can be negatively correlated with its own denominator even if the mean of the numerator is proportional to the denominator. Interpreting DSM ratios that decrease with air benzene concentrations as evidence of nonlinear metabolism is therefore unwarranted when plots of metabolite concentrations against benzene ppm in air show approximately straight-line relationships between them, as in the Tianjin data. Thus, an apparent contradiction that has fueled heated discussions in the recent literature can be resolved by recognizing that highly nonlinear, decreasing DSM ratios are consistent with linear metabolism.
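The ratio artifact described above (a DSM ratio that falls with its own denominator even though metabolism is strictly linear) can be reproduced in a few lines. All parameters below are illustrative: metabolite output is made exactly proportional to true exposure, and the measured air benzene concentration carries lognormal measurement error.

```python
# Metabolite production is exactly linear in true exposure, yet the DSM ratio
# (metabolites per ppm of *measured* air benzene) still trends downward with
# measured concentration, purely because of noise in the denominator.
import random

random.seed(0)
k = 50.0  # metabolite output per ppm of true exposure (linear, no threshold)

obs = []
for _ in range(10_000):
    true_ppm = random.uniform(0.5, 10.0)
    measured_ab = true_ppm * random.lognormvariate(0.0, 0.4)  # noisy measurement
    metabolite = k * true_ppm                                  # strictly linear
    obs.append((measured_ab, metabolite / measured_ab))        # (AB, DSM ratio)

def pearson(pairs):
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x, _ in pairs)
    vy = sum((y - my) ** 2 for _, y in pairs)
    return cov / (vx * vy) ** 0.5

r = pearson(obs)
print(f"corr(AB, DSM) = {r:.2f}")  # clearly negative despite linear metabolism
```

This is exactly the non-contradiction the paper points out: the declining ratio is a property of ratios of random variables, not evidence of nonlinear metabolism.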


Subjects
Benzene/metabolism; Occupational Exposure/analysis; Acetylcysteine/analogs & derivatives; Acetylcysteine/analysis; Adult; Air Pollution, Indoor/analysis; Bayes Theorem; Benzene/analysis; Catechols/urine; Creatinine/urine; Environmental Monitoring; Female; Humans; Hydroquinones/urine; Linear Models; Male; Middle Aged; Phenol/metabolism; Phenol/urine; Statistics, Nonparametric; Toluene/analysis; Young Adult
5.
Regul Toxicol Pharmacol ; 90: 185-196, 2017 Nov.
Article in English | MEDLINE | ID: mdl-28866267

ABSTRACT

IARC has begun using ToxCast/Tox21 data to represent the key characteristics of carcinogens used to organize and weigh mechanistic evidence in cancer hazard determinations, and this implicit inference approach is also being considered by USEPA. To determine how well ToxCast/Tox21 data can explicitly predict cancer hazard, this approach was evaluated with statistical analyses and machine learning prediction algorithms. Substances USEPA previously classified as having cancer hazard potential were designated as positives, and substances not posing a carcinogenic hazard were designated as negatives. Then ToxCast/Tox21 data were analyzed both with and without adjusting for the cytotoxicity burst effect commonly observed in such assays. Using the same assignments as IARC of ToxCast/Tox21 assays to the seven key characteristics of carcinogens, the ability to predict cancer hazard for each key characteristic, alone or in combination, was found to be no better than chance. Hence, we have little scientific confidence in IARC's inference models derived from current ToxCast/Tox21 assays for key characteristics to predict cancer. This finding supports the need for a more rigorous mode-of-action, pathway-based framework to organize, evaluate, and integrate mechanistic evidence with animal toxicity, epidemiological investigations, and knowledge of exposure and dosimetry to evaluate potential carcinogenic hazards and risks to humans.


Subjects
Carcinogens/toxicity; Data Interpretation, Statistical; High-Throughput Screening Assays; Models, Statistical; Neoplasms/classification; Algorithms; Animals; Carcinogenicity Tests; Humans; Machine Learning; Neoplasms/chemically induced; Risk Assessment/methods; United States; United States Environmental Protection Agency
7.
Regul Toxicol Pharmacol ; 66(3): 336-46, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23707535

ABSTRACT

Recent studies have indicated that reducing particulate pollution would substantially reduce average daily mortality rates, prolonging lives, especially among the elderly (age ≥ 75). These benefits are projected by statistical models of significant positive associations between fine particulate matter (PM2.5) levels and daily mortality rates. We examine the empirical correspondence between changes in average PM2.5 levels and temperatures from 1999 to 2000, and corresponding changes in average daily mortality rates, in each of 100 U.S. cities in the National Morbidity, Mortality, and Air Pollution Study (NMMAPS) database, which has extensive PM2.5, temperature, and mortality data for those two years. Increases in average daily temperatures appear to significantly reduce average daily mortality rates, as expected from previous research. Unexpectedly, reductions in PM2.5 do not appear to cause any reductions in mortality rates. PM2.5 and mortality rates are both elevated on cold winter days, creating a significant positive statistical relation between their levels, but we find no evidence that reductions in PM2.5 concentrations cause reductions in mortality rates. For all concerned, it is crucial to use causal relations, rather than statistical associations, to project the changes in human health risks due to interventions such as reductions in particulate air pollution.
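The confounding pattern described in this abstract can be illustrated with simulated data. In the sketch below, cold weather raises both PM2.5 and mortality while PM2.5 has, by construction, no causal effect on mortality at all; a regression of mortality on PM2.5 nevertheless yields a positive slope. Every coefficient is hypothetical.

```python
# Season (temperature) confounds the PM2.5-mortality association:
# both are driven by cold weather; mortality has NO PM2.5 term.
import math
import random

random.seed(1)
days = []
for d in range(730):  # two simulated years of daily data
    temp_f = 55 + 25 * math.sin(2 * math.pi * d / 365)    # seasonal temperature
    pm25 = 20 - 0.15 * temp_f + random.gauss(0, 2)         # higher when cold
    mortality = 120 - 0.4 * temp_f + random.gauss(0, 3)    # higher when cold only
    days.append((pm25, mortality))

def slope(pairs):
    # Ordinary least squares slope of y on x.
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    return (sum((x - mx) * (y - my) for x, y in pairs)
            / sum((x - mx) ** 2 for x, _ in pairs))

print(f"association slope: {slope(days):.2f}")  # positive, yet the causal effect is zero
```

Adjusting for temperature in this simulated world would drive the PM2.5 coefficient toward zero, which is the distinction between association and causation the abstract draws.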


Subjects
Air Pollutants/analysis; Environmental Monitoring/methods; Inhalation Exposure/adverse effects; Mortality/trends; Particulate Matter/analysis; Temperature; Aged; Air Pollutants/toxicity; Cause of Death; Cities; Data Interpretation, Statistical; Databases, Factual; Environmental Monitoring/statistics & numerical data; Humans; Inhalation Exposure/analysis; Particle Size; Particulate Matter/toxicity; Seasons; United States
9.
Environ Int ; 34(4): 459-75, 2008 May.
Article in English | MEDLINE | ID: mdl-18201762

ABSTRACT

Using precautionary principles when facing incomplete facts and causal conjectures raises the possibility of a Faustian bargain. This paper applies systems dynamics based on previously unavailable data to show how well-intentioned precautionary policies for promoting food safety may backfire unless they are informed by quantitative cause-and-effect models of how animal antibiotics affect animal and human health. We focus on European Union and United States formulations of regulatory precaution and then analyze zoonotic infections in terms of the consequences of relying on political will to justify precautionary bans. We do not attempt a political analysis of these issues; rather, we conduct a regulatory analysis of precautionary legal requirements and use Quantitative Risk Assessment (QRA) to assess a set of policy outcomes. Thirty-seven years ago, the Joint Committee on the Use of Antibiotics in Animal Husbandry and Veterinary Medicine (the Swann Report) warned that uncontrolled use of similar antibiotics in humans and food animals could promote the emergence of resistant strains of foodborne bacteria that could endanger human health. Since then, many countries have either banned or restricted antibiotics as feed additives for promoting animal growth. Others, including the United States, have relied on prudent use guidelines and programs that reduce total microbial loads, rather than focusing exclusively on antibiotic-resistant bacteria. In retrospect, the regulatory strategy of banning or restricting animal antibiotic uses has had limited success: it has been followed in many cases by deteriorating animal health and increases in human illnesses and resistance rates. 
Conversely, a combination of continued prudent use of antibiotics to prevent and control animal infections, together with HACCP and other improvements, has been followed by large improvements in the microbial safety of chickens and other food animals in the United States, leaving both animals and people better off now than they were decades ago. A quantitative risk assessment model of microbiological risks (Campylobacter because of data availability) suggests that these outcomes may be more than coincidental: prudent use of animal antibiotics may actually improve human health, while bans on animal antibiotics, intended to be precautionary, inadvertently may harm human health.


Subjects
Animals, Domestic/microbiology; Antibiotic Prophylaxis/trends; Bacterial Infections/epidemiology; Bacterial Infections/prevention & control; Zoonoses/epidemiology; Zoonoses/microbiology; Animals; Anti-Bacterial Agents/pharmacology; Bacterial Infections/transmission; Drug Resistance, Bacterial; Europe/epidemiology; European Union; Food Microbiology; Health Policy; Humans; Models, Theoretical; Risk Assessment; United States/epidemiology
10.
Prev Vet Med ; 79(2-4): 186-203, 2007 May 16.
Article in English | MEDLINE | ID: mdl-17270298

ABSTRACT

To achieve further reductions in foodborne illness levels in humans, effective pre-harvest interventions are needed. The health status of food animals that are destined to enter the human food supply chain may be an important, although often overlooked, factor in predicting the risk of human foodborne infections. The health status of food animals can potentially influence foodborne pathogen levels in three ways. First, diseased animals may shed higher levels of foodborne pathogens. Second, animals that require further handling in the processing plant to remove affected parts may lead to increased microbial contamination and cross-contamination. Finally, certain animal illnesses may lead to a higher probability of mistakes in the processing plant, such as gastrointestinal ruptures, which would lead to increased microbial contamination and cross-contamination. Consequently, interventions that reduce the incidence of food animal illnesses might also help reduce bacterial contamination on meat, thereby reducing human illness. Some of these interventions, however, might also present a risk to human health. For example, the use of antibiotics in food animals can reduce rates of animal illness but can also select for antibiotic-resistant bacteria which can threaten human treatment options. In this study, we present a mathematical model to evaluate human health risks from foodborne pathogens associated with changes in animal illness. The model is designed so that potential human health risks and benefits from interventions such as the continued use of antibiotics in animal agriculture can be evaluated simultaneously. We applied the model to a hypothetical example of Campylobacter from chicken. In general, the model suggests that very minor perturbations in microbial loads on meat products could have relatively large impacts on human health, and consequently, small improvements in food animal health might result in significant reductions in human illness.


Subjects
Animal Diseases/transmission; Consumer Product Safety; Food Contamination/prevention & control; Foodborne Diseases/prevention & control; Health Status; Zoonoses; Animal Welfare; Animals; Food Microbiology; Foodborne Diseases/epidemiology; Foodborne Diseases/etiology; Humans; Mathematics; Risk Assessment; Risk Factors
11.
Risk Anal ; 24(1): 271-88, 2004 Feb.
Article in English | MEDLINE | ID: mdl-15028017

ABSTRACT

The streptogramin antimicrobial combination Quinupristin-Dalfopristin (QD) has been used in the United States since late 1999 to treat patients with vancomycin-resistant Enterococcus faecium (VREF) infections. Another streptogramin, virginiamycin (VM), is used as a growth promoter and therapeutic agent in farm animals in the United States and other countries. Many chickens test positive for QD-resistant E. faecium, raising concern that VM use in chickens might compromise QD effectiveness against VREF infections by promoting development of QD-resistant strains that can be transferred to human patients. Despite the potential importance of this threat to human health, quantifying the risk via traditional farm-to-fork modeling has proved extremely difficult. Enough key data (mainly on microbial loads at each stage) are lacking so that such modeling amounts to little more than choosing a set of assumptions to determine the answer. Yet, regulators cannot keep waiting for more data. Patients prescribed QD are typically severely ill, immunocompromised people for whom other treatment options have not readily been available. Thus, there is a pressing need for sound risk assessment methods to inform risk management decisions for VM/QD using currently available data. This article takes a new approach to the QD-VM risk modeling challenge. Recognizing that the usual farm-to-fork ("forward chaining") approach commonly used in antimicrobial risk assessment for food animals is unlikely to produce reliable results soon enough to be useful, we instead draw on ideas from traditional fault tree analysis ("backward chaining") to reverse the farm-to-fork process and start with readily available human data on VREF case loads and QD resistance rates. 
Combining these data with recent genogroup frequency data for humans, chickens, and other sources (Willems et al., 2000, 2001) allows us to quantify potential human health risks from VM in chickens in both the United States and Australia, two countries where regulatory action for VM is being considered. We present a risk simulation model, thoroughly grounded in data, that incorporates recent nosocomial transmission and genetic typing data. The model is used to estimate human QD treatment failures over the next five years with and without continued VM use in chickens. The quantitative estimates and probability distributions were implemented in a Monte Carlo simulation model for a five-year horizon beginning in the first quarter of 2002. In Australia, a Q1-2002 ban of virginiamycin would likely reduce average attributable treatment failures by 0.35 x 10^-3 cases, expected mortalities by 5.8 x 10^-5 deaths, and life years lost by 1.3 x 10^-3 for the entire population over five years. In the United States, where the number of cases of VRE is much higher, a Q1-2002 ban on VM is predicted to reduce average attributable treatment failures by 1.8 cases in the entire population over five years; expected mortalities by 0.29 cases; and life years lost by 6.3 over a five-year period. The model shows that the theoretical statistical human health benefits of a VM ban range from zero to less than one statistical life saved in both Australia and the United States over the next five years and are rapidly decreasing. Sensitivity analyses indicate that this conclusion is robust to key data gaps and uncertainties, e.g., about the extent of resistance transfer from chickens to people.
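The structure of such a backward-chained risk calculation can be sketched in a few lines. The Monte Carlo below is only a schematic of the approach, not the paper's model: every distribution (treated VREF case load, chicken-attributable resistant fraction, failure probability given resistance) is a hypothetical placeholder.

```python
# Schematic Monte Carlo: start from human case loads and chain backward through
# attributable fractions, rather than modeling farm-to-fork forward.
# All distributions are hypothetical placeholders, not the paper's estimates.
import random

random.seed(42)

def draw_failures():
    cases = random.triangular(500, 2000, 1000)          # QD-treated VREF cases / 5 yr
    attributable = random.triangular(0.0, 0.01, 0.002)  # resistant fraction from chickens
    fail_given_res = random.triangular(0.1, 0.5, 0.25)  # P(failure | resistant strain)
    return cases * attributable * fail_given_res

samples = [draw_failures() for _ in range(50_000)]
mean = sum(samples) / len(samples)
print(f"mean attributable treatment failures over 5 yr: {mean:.2f}")
```

Because each uncertain factor enters multiplicatively, even generous upper tails on the inputs keep the expected attributable failures small, which mirrors the shape of the paper's conclusion.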


Subjects
Chickens/microbiology; Food Microbiology; Virginiamycin/adverse effects; Animal Husbandry; Animals; Australia/epidemiology; Drug Resistance, Bacterial; Enterococcus faecium/drug effects; Enterococcus faecium/isolation & purification; Food Contamination/analysis; Gram-Positive Bacterial Infections/epidemiology; Gram-Positive Bacterial Infections/etiology; Gram-Positive Bacterial Infections/prevention & control; Humans; Meat/analysis; Meat/microbiology; Models, Biological; Risk Assessment; Risk Management; United States/epidemiology
12.
Hum Exp Toxicol ; 23(12): 579-600, 2004 Dec.
Article in English | MEDLINE | ID: mdl-15688986

ABSTRACT

Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize its decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. 
Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the relevance of new information, relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most of the variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
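The expected value of information idea invoked above has a compact standard form. The sketch below computes the expected value of perfect information (EVPI) for a stylized ban-versus-continue decision; the payoffs and the prior probability of the conjectured hazard are entirely hypothetical.

```python
# EVPI: the best action chosen after uncertainty resolves, minus the best
# action chosen now under uncertainty. All numbers are hypothetical.

p_hazard = 0.2                     # prior probability the conjectured hazard is real
actions = {
    "ban":      {"hazard": -1.0, "safe": -4.0},   # net social payoff by true state
    "continue": {"hazard": -10.0, "safe": 0.0},
}

def expected(a):
    return p_hazard * actions[a]["hazard"] + (1 - p_hazard) * actions[a]["safe"]

best_now = max(expected(a) for a in actions)       # commit before learning the state
best_informed = (p_hazard * max(v["hazard"] for v in actions.values())
                 + (1 - p_hazard) * max(v["safe"] for v in actions.values()))
evpi = best_informed - best_now                    # most one should pay for information
print(f"EVPI = {evpi:.2f}")
```

A positive EVPI is the decision-analytic justification for contingent interim measures: gathering evidence before committing has quantifiable value relative to deciding on the initial, smaller data set.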


Subjects
Decision Making; Environmental Exposure/legislation & jurisprudence; Public Health; Environmental Health/legislation & jurisprudence; Europe; Humans; Liability, Legal; Public Health/legislation & jurisprudence; Public Opinion; Public Policy; Risk Management; United States
13.
Environ Int ; 29(1): 1-19, 2003 Apr.
Article in English | MEDLINE | ID: mdl-12605931

ABSTRACT

What measures of uncertainty and what causal analysis can improve the management of potentially severe, irreversible or dreaded environmental outcomes? Environmental choices show that policies intended to be precautionary (such as adding MTBE to petrol) can cause unanticipated harm (by mobilizing benzene, a known leukemogen, in the groundwater). Many environmental law principles set the boundaries of what should be done but do not provide an operational construct to answer this question. Those principles, ranging from the precautionary principle to protecting human health from a significant risk of material health impairment, do not explain how to make environmental management choices when incomplete, inconsistent and complex scientific evidence characterizes potentially adverse environmental outcomes. Rather, they pass the task to lower jurisdictions such as agencies or authorities. To achieve the goals of the principle, those who draft it must deal with scientific causal conjectures, partial knowledge and variable data. In this paper we specifically deal with the qualitative and quantitative aspects of the European Union's (EU) explanation of consistency and with the examination of scientific developments relevant to variability and uncertain data and causation. Managing hazards under the precautionary principle requires inductive, empirical methods of assessment. However, acting on a scientific conjecture can also be socially unfair, costly, and detrimental when applied to complex environmental choices. We describe a constructive framework to rationally meet the command of the precautionary principle using alternative measures of uncertainty and recent statistical methods of causal analysis. These measures and methods can bridge the gap between conjectured future irreversible or severe harm and scant scientific evidence, thus leading to more confident and resilient social choices. 
We review two sets of measures and computational systems to deal with uncertainty and link them to causation through inductive empirical methods such as Bayesian networks. We conclude that primary legislation concerned with large uncertainties and potentially severe or dreaded environmental outcomes can produce accurate and efficient choices. To do so, primary legislation should specifically indicate what measures can represent uncertainty and how to deal with uncertain causation, thus providing guidance to an agency's rulemaking or to an authority's writing of secondary legislation. A corollary conclusion with legal, scientific and probabilistic implications concerns how to update past information when the state of information increases, because a failure to update can result in regretting past choices. Elected legislators have the democratic mandate to formulate precautionary principles and are accountable. To preserve that mandate, embedding formal methods to represent uncertainty in the statutory language of the precautionary principle enhances subsequent judicial review of legislative actions. The framework that we propose also reduces the Balkanized views and interpretations of probabilities, possibilities, likelihood and uncertainty that exist in environmental decision-making.


Subjects
Environment; Models, Statistical; Policy Making; Bayes Theorem; Forecasting; Humans; Public Health; Risk Assessment