Results 1 - 10 of 10
1.
Front Toxicol ; 5: 1234498, 2023.
Article in English | MEDLINE | ID: mdl-38026843

ABSTRACT

In silico toxicology protocols are meant to support computationally based assessments using principles that ensure that results can be generated, recorded, communicated, archived, and then evaluated in a uniform, consistent, and reproducible manner. We investigated the availability of in silico models to predict the carcinogenic potential of pregabalin, using the ten key characteristics (KCs) of carcinogens as a framework for organizing mechanistic studies. Pregabalin is a single-species carcinogen, producing only one type of tumor, hemangiosarcomas, in mice via a nongenotoxic mechanism. The overall goal of this exercise is to test the ability of in silico models to predict nongenotoxic carcinogenicity, with pregabalin as a case study. The established mode of action (MOA) of pregabalin is triggered by tissue hypoxia, leading to oxidative stress (KC5), chronic inflammation (KC6), and increased cell proliferation (KC10) of endothelial cells. Of these KCs, in silico models are available only for selected endpoints in KC5, limiting the usefulness of computational tools in the prediction of pregabalin carcinogenicity. KC1 (electrophilicity), KC2 (genotoxicity), and KC8 (receptor-mediated effects), for which predictive in silico models exist, do not play a role in this mode of action. Confidence in the overall assessments is considered medium to high for KCs 1, 2, 5, 6, 7 (immune system effects), 8, and 10 (cell proliferation), largely owing to the high quality of the experimental data. To move away from dependence on animal data, the development of reliable in silico models for predicting oxidative stress, chronic inflammation, immunosuppression, and cell proliferation will be critical to predicting the carcinogenicity of nongenotoxic compounds.
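As a purely illustrative sketch (the data structure and names are assumptions, not from the paper), the abstract's KC-by-KC assessment can be organized as a small lookup table, which makes the identified model gap explicit:

```python
# Hypothetical sketch: the abstract's KC-based assessment of pregabalin
# organized as a lookup table. The facts (which KCs drive the MOA and
# which have in silico models) come from the abstract; the structure
# and identifiers are illustrative assumptions.
KC_ASSESSMENT = {
    "KC1":  {"name": "electrophilicity",          "model_available": True,                  "in_moa": False},
    "KC2":  {"name": "genotoxicity",              "model_available": True,                  "in_moa": False},
    "KC5":  {"name": "oxidative stress",          "model_available": "selected endpoints",  "in_moa": True},
    "KC6":  {"name": "chronic inflammation",      "model_available": False,                 "in_moa": True},
    "KC8":  {"name": "receptor-mediated effects", "model_available": True,                  "in_moa": False},
    "KC10": {"name": "cell proliferation",        "model_available": False,                 "in_moa": True},
}

# KCs that drive the MOA but lack full in silico coverage -- the gap the
# authors flag for predicting nongenotoxic carcinogenicity.
gaps = [kc for kc, entry in KC_ASSESSMENT.items()
        if entry["in_moa"] and entry["model_available"] is not True]
print(gaps)  # -> ['KC5', 'KC6', 'KC10']
```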

2.
Lab Chip ; 20(7): 1177-1190, 2020 04 07.
Article in English | MEDLINE | ID: mdl-32129356

ABSTRACT

Drug-induced gastrointestinal toxicities (DI-GITs) are among the most common adverse events in clinical trials. The high prevalence of DI-GITs has persisted among new drugs, due in part to the lack of robust experimental tools to allow early detection or to guide the optimization of safer molecules. Developing in vitro assays for the leading GI toxicities (nausea, vomiting, diarrhoea, constipation, and abdominal pain) will likely involve recapitulating complex physiological properties that require contributions from diverse cell/tissue types, including epithelial, immune, microbiome, nerve, and muscle. While this requirement may be beyond traditional 2D monocultures of intestinal cell lines, emerging 3D GI microtissues capture interactions between diverse cell and tissue types. These interactions give rise to microphysiologies fundamental to gut biology. For GI microtissues, organoid technology was the breakthrough that introduced intestinal stem cells capable of differentiating into each of the epithelial cell types and of self-organizing into a multi-cellular tissue proxy with villus- and crypt-like domains. Recently, GI microtissues generated using miniaturized devices with microfluidic flow and cyclic peristaltic strain were shown to induce Caco-2 cells to spontaneously differentiate into each of the principal intestinal epithelial cell types. Second-generation models comprising epithelial organoids or microtissues co-cultured with non-epithelial cell types can successfully reproduce cross-'tissue' functional interactions, broadening the potential of these models to accurately study drug-induced toxicities. A new paradigm in which in vitro assays become an early part of GI safety assessment could be realized if microphysiological systems (MPS) are developed in alignment with drug-discovery needs. Herein, approaches for assessing the GI toxicity of pharmaceuticals are reviewed and gaps are compared with the capabilities of emerging GI microtissues (e.g., organoids, organ-on-a-chip, and transwell systems) in order to provide perspective on the assay features needed for MPS models to be adopted for DI-GIT assessment.


Subject(s)
Microfluidics , Organoids , Caco-2 Cells , Humans , Intestinal Mucosa , Intestines
3.
Front Physiol ; 10: 1389, 2019.
Article in English | MEDLINE | ID: mdl-31780954

ABSTRACT

The frigid temperatures of the Southern Ocean are known to be an evolutionary driver in Antarctic fish. For example, many fish have reduced red blood cell (RBC) concentrations to minimize vascular resistance. Via the oxygen-carrying protein hemoglobin, RBCs contain the vast majority of the body's iron, which is known to be a limiting nutrient in marine ecosystems. Since lower RBC levels also lead to reduced iron requirements, we hypothesize that low iron availability was an additional evolutionary driver of Antarctic fish speciation. Antarctic icefish of the family Channichthyidae are known to have an extreme alteration of iron metabolism due to the loss of RBCs and of two iron-binding proteins, hemoglobin and myoglobin. Loss of hemoglobin is considered a maladaptive trait permitted by relaxed predator selection, since extreme adaptations are required to compensate for the loss of oxygen-carrying capacity. However, minimization of iron dependency, rather than a random evolutionary event, may have driven hemoglobin loss. Given the variety of functions that hemoglobin serves in the endothelium, we suspected that the 3' truncated Hbα fragment (Hbα-3'f), which was not genetically excluded in icefish, may still be expressed as a protein. Using whole-mount confocal microscopy, we show that Hbα-3'f is expressed in the vascular endothelium of the icefish retina, suggesting that this Hbα fragment may still serve an important role in the endothelium. These observations support a novel hypothesis in which iron minimization influenced icefish speciation through the loss of the iron-binding portion of Hbα in Hbα-3'f, as well as of hemoglobin β and myoglobin.

4.
Toxicol Appl Pharmacol ; 334: 100-109, 2017 11 01.
Article in English | MEDLINE | ID: mdl-28893587

ABSTRACT

The contribution of animal testing to drug development has been widely debated and challenged. An industry-wide nonclinical-to-clinical translational database was created to determine how safety assessments in animal models translate to first-in-human clinical risk. The blinded database comprised 182 molecules and contained animal toxicology data coupled with clinical observations from phase I human studies. Animal and clinical data were categorized by organ system and correlations were determined. The 2×2 contingency table (true positive, false positive, true negative, false negative) was used for statistical analysis. Sensitivity was 48% with a 43% positive predictive value (PPV). The nonhuman primate had the strongest performance in predicting adverse effects, especially for the gastrointestinal and nervous system categories. When the same target organ was identified in both the rodent and the nonrodent, the PPV increased. Specificity was 84% with an 86% negative predictive value (NPV). The beagle dog had the strongest performance in predicting an absence of clinical adverse effects. If no target organ toxicity was observed in either test species, the NPV increased. While nonclinical studies can demonstrate great value in the PPV for certain species and organ categories, the NPV was the stronger predictive performance measure across test species and target organs, indicating that an absence of toxicity in animal studies strongly predicts a similar outcome in the clinic. These results support the current regulatory paradigm of animal testing for enabling safe entry into clinical trials and provide context for emerging alternative models.
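For reference, the four performance measures quoted above follow directly from the 2×2 contingency table. A minimal sketch follows, with made-up counts chosen only to roughly reproduce the reported figures (they are not the study's actual data):

```python
# Standard 2x2 contingency-table metrics as used in the study.
# The formulas are textbook definitions; the example counts below are
# hypothetical and NOT taken from the translational database.
def predictive_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # fraction of clinical findings predicted by animals
        "specificity": tn / (tn + fp),  # fraction of clean clinical outcomes predicted clean
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts that approximate the reported values
# (sensitivity ~48%, PPV ~43%, specificity ~84%, NPV ~86%).
print(predictive_metrics(tp=43, fp=57, tn=300, fn=47))
```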


Subject(s)
Databases, Factual , Translational Research, Biomedical , Animals , Drug Evaluation, Preclinical , Drug Industry , Drug-Related Side Effects and Adverse Reactions , Humans , Models, Animal , Risk Assessment
5.
Exp Biol Med (Maywood) ; 242(16): 1579-1585, 2017 10.
Article in English | MEDLINE | ID: mdl-28622731

ABSTRACT

Tissue chips are poised to deliver a paradigm shift in drug discovery. By emulating human physiology, these chips have the potential to increase the predictive power of preclinical modeling, which in turn will move the pharmaceutical industry closer to its aspiration of clinically relevant and ultimately animal-free drug discovery. Despite the tremendous science and innovation invested in these tissue chips, significant challenges remain to be addressed before they can be routinely adopted in the industrial laboratory. This article describes the main steps that need to be taken and highlights key considerations for transforming tissue chip technology from the hands of the innovators into those of industrial scientists. Written by scientists from 13 pharmaceutical companies and partners at the National Institutes of Health, this article uniquely captures a consensus view on the progression strategy to facilitate and accelerate the adoption of this valuable technology. It concludes that success will be delivered by a partnership approach as well as a deep understanding of the context within which these chips will actually be used.

Impact statement: The rapid pace of scientific innovation in the tissue chip (TC) field requires a cohesive partnership between innovators and end users. Near-term uptake of these human-relevant platforms will fill gaps in current capabilities for assessing important properties of disposition, efficacy, and safety liabilities. Similarly, these platforms could support mechanistic studies that aim to resolve challenges later in development (e.g., assessing the human relevance of a liability identified in animal studies). Building confidence that novel capabilities of TCs can address real-world challenges while they themselves are being developed will accelerate their application in the discovery and development of innovative medicines. This article outlines a strategic roadmap to unite innovators and end users, making implementation smooth and rapid. With collective contributions from multiple international pharmaceutical companies and partners at the National Institutes of Health, this article should serve as an invaluable resource to the multi-disciplinary field of TC development.


Subject(s)
Drug Evaluation, Preclinical/methods , Microchip Analytical Procedures/methods , Microfluidics/methods , Drug Industry , Humans , Lab-On-A-Chip Devices
6.
Toxicol Pathol ; 45(3): 372-380, 2017 04.
Article in English | MEDLINE | ID: mdl-28351296

ABSTRACT

An Innovation and Quality (IQ) Consortium focus group conducted a cross-company survey to evaluate current practices and perceptions around the use of animal models of disease (AMDs) in the nonclinical safety assessment of molecules in clinical development. The IQ Consortium is an organization of pharmaceutical and biotechnology companies with the mission of advancing science and technology. The survey queried the utilization of AMDs during drug discovery, in which drug candidates are evaluated in efficacy models and limited short-duration non-Good Laboratory Practice (GLP) toxicology testing, and during drug development, in which drug candidates are evaluated in GLP toxicology studies. The survey determined that the majority of companies used AMDs during drug discovery primarily as a means of proactively assessing potential nonclinical safety issues prior to the conduct of toxicology studies, followed closely by the use of AMDs to better understand toxicities associated with exaggerated pharmacology in traditional toxicology models or to de-risk issues when the target is expressed only in the disease state. In contrast, the survey results indicated that the use of AMDs in development is infrequent, occurring primarily to investigate nonclinical safety issues associated with targets expressed only in disease states and/or in response to requests from global regulatory authorities.


Subject(s)
Disease Models, Animal , Drug Evaluation, Preclinical/methods , Drug Industry , Animals , Decision Making, Organizational , Drug Evaluation, Preclinical/statistics & numerical data , Drug Industry/legislation & jurisprudence , Drug Industry/organization & administration , Drug Industry/standards , Government Regulation , Surveys and Questionnaires
7.
Toxicol Sci ; 155(1): 22-31, 2017 01.
Article in English | MEDLINE | ID: mdl-27780885

ABSTRACT

FutureTox III, a Society of Toxicology Contemporary Concepts in Toxicology workshop, was held in November 2015. Building upon FutureTox I and II, FutureTox III focused on developing the high-throughput risk assessment paradigm and taking the science of in vitro data and in silico models forward to explore the question: what progress is being made to address the challenges of implementing the emerging big-data toolbox for risk assessment and regulatory decision-making? This article reports on the outcome of the workshop, including two examples of how advancements in predictive toxicology approaches are being applied within Federal agencies, where opportunities remain within the exposome and adverse outcome pathway (AOP) domains, and how the toxicology community across multiple sectors can collectively continue to bridge the translation from historical approaches to Tox21 implementation in risk assessment and regulatory decision-making.


Subject(s)
Toxicology , Animals , Humans , In Vitro Techniques , Toxicity Tests
9.
Toxicol Sci ; 143(2): 256-67, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25628403

ABSTRACT

FutureTox II, a Society of Toxicology Contemporary Concepts in Toxicology workshop, was held in January 2014. The meeting goals were to review and discuss the state of the science in toxicology in the context of implementing the NRC 21st-century vision of predicting in vivo responses from in vitro and in silico data, and to define goals for the future. Presentations and discussions were held on priority concerns such as the prediction and modeling of metabolism, cell growth and differentiation, effects on sensitive subpopulations, and the integration of data into risk assessment. Emerging trends in technologies such as stem cell-derived human cells, 3D organotypic culture models, mathematical modeling of cellular processes and morphogenesis, adverse outcome pathway development, and high-content imaging of in vivo systems were discussed. Although advances toward an in vitro/in silico-based risk assessment paradigm were apparent, knowledge gaps in these areas and limitations of the technologies were identified. Specific recommendations were made for future directions and research needs in the areas of hepatotoxicity, cancer prediction, developmental toxicity, and regulatory toxicology.


Subject(s)
Computer Simulation , In Vitro Techniques , Toxicology/methods , Toxicology/trends , Congresses as Topic , Predictive Value of Tests , Societies, Scientific , United States
10.
Toxicol Sci ; 126(2): 291-7, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22262567

ABSTRACT

The practice of toxicology is changing rapidly, as demonstrated by the response to the 2007 NRC report "Toxicity Testing in the 21st Century." New assays are being developed to replace animal testing, yet how data from these assays should be used in decision making is not clear. A Health and Environmental Sciences Institute committee held a May 2011 workshop to discuss approaches to identifying adverse effects in the context of the NRC report. Scientists from industry, government, academia, and NGOs discussed two case studies and explored how information from new, high-data-content assays developed for screening can be used to differentiate adverse effects from adaptive responses. The terms "adverse effect" and "adaptive response" were defined, along with two new terms: relevant pathways of toxicological concern (RPTCs) and relevant responses for regulation (RRRs). RPTCs are biochemical pathways associated with adverse events and need to be elucidated before they are used in regulatory decision making. RRRs are endpoints that serve as the basis for risk assessment and may or may not be at the level of pathways. Workshop participants discussed the criteria for determining whether, at the RPTC level, an effect is potentially adverse or potentially indicative of adaptability, and how the use of prototypical, data-rich compounds could lead to a greater understanding of RPTCs and their use as RRRs. Also discussed was the use of RPTCs in a weight-of-evidence approach to risk assessment. Inclusion of data at this level could decrease uncertainty in risk assessments but will require detailed dosimetry and consideration of exposure context and the time and dose continuum to yield scientifically based decisions. The results of this project point to the need for an extensive effort to characterize RPTCs and their use in risk assessment to make the vision of the 2007 NRC report a reality.


Subject(s)
Toxicology , History, 21st Century , Risk Assessment , Toxicity Tests