1.
Article in English | MEDLINE | ID: mdl-38546802

ABSTRACT

BACKGROUND: Robust solutions to global, national, and regional burdens of communicable and non-communicable diseases, particularly related to diet, demand interdisciplinary or transdisciplinary collaborations to effectively inform risk analysis and policy decisions. OBJECTIVE: U.S. outbreak data for 2005-2020 from all transmission sources were analyzed for trends in the burden of infectious disease and foodborne outbreaks. METHODS: Outbreak data from 58 Microsoft Access® data tables were structured using systematic queries and pivot tables for analysis by transmission source, pathogen, and date. Trends were examined using graphical representations, smoothing splines, Spearman's rho rank correlations, and non-parametric testing for trend. Hazard Identification was conducted based on the number and severity of illnesses. RESULTS: The evidence does not support increasing trends in the burden of infectious foodborne disease, though strongly increasing trends were observed for other transmission sources. Morbidity and mortality were dominated by person-to-person transmission; foodborne and other transmission sources accounted for small portions of the disease burden. Foods representing the greatest hazards associated with the four major foodborne bacterial diseases were identified. Fatal foodborne disease was dominated by fruits, vegetables, peanut butter, and pasteurized dairy. CONCLUSION: The available evidence conflicts with assumptions of zero risk for pasteurized milk and increasing trends in the burden of illness for raw milk. For future evidence-based risk management, transdisciplinary risk analysis methodologies are essential to balance both communicable and non-communicable diseases and both food safety and food security, considering scientific, sustainable, economic, cultural, social, and political factors to support health and wellness for humans and ecosystems.
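
The trend-testing step described in this abstract can be illustrated with a short sketch. This is not the authors' code; it is a minimal Python example of applying Spearman's rank correlation to annual outbreak counts, and the counts themselves are placeholders rather than study data.

    # Illustrative sketch: non-parametric trend test on annual outbreak counts.
    # The counts below are placeholders, not the 2005-2020 study data.
    from scipy.stats import spearmanr

    years = list(range(2005, 2021))
    outbreaks_per_year = [102, 98, 110, 95, 90, 88, 93, 85, 80, 78, 82, 75, 70, 72, 68, 40]

    rho, p_value = spearmanr(years, outbreaks_per_year)
    print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")
    # A significantly positive rho indicates an increasing trend over the period;
    # a significantly negative rho, a decreasing one.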

2.
Risk Anal ; 37(5): 943-957, 2017 05.
Article in English | MEDLINE | ID: mdl-28121020

ABSTRACT

Survival models are developed to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple-dose data set to predict the probability of death through specifying functions of dose response and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) is an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed use different underlying dose-response functions and use the assumption that, in a multiple-dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this article. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit data sets. More accurate survival models depend upon future development of dose-response data sets specifically designed to assess potential multiple-dose effects on response and time-to-response. The process used in this article to develop the best-fitting survival model for exposure of rabbits to multiple aerosol doses of B. anthracis spores should have broad applicability to other host-pathogen systems and dosing schedules because the empirical modeling approach is based upon pathogen-specific empirically-derived parameters.


Subject(s)
Aerosols/analysis , Air Pollutants/analysis , Bacillus anthracis , Risk Assessment/methods , Algorithms , Animals , Anthrax , Disease Models, Animal , Environmental Monitoring/methods , Inhalation Exposure , Models, Statistical , Rabbits , Spores, Bacterial
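
A minimal sketch, in Python, of the model family this abstract names: an exponential dose-response for probability of death combined with a Weibull time-to-death (TTD) distribution. The parameter values are illustrative stand-ins, not the fitted values reported in the article.

    # Exponential dose-response paired with a Weibull TTD distribution (illustrative only).
    import numpy as np

    def p_death_exponential(dose, k=1e-5):
        """Probability of death given an inhaled spore dose (k is a placeholder)."""
        return 1.0 - np.exp(-k * dose)

    def sample_ttd_weibull(n, shape=2.0, scale=5.0, rng=None):
        """Sample times-to-death (days) for responding animals (placeholder parameters)."""
        rng = rng or np.random.default_rng(0)
        return scale * rng.weibull(shape, size=n)

    dose = 1e5  # spores, illustrative
    p = p_death_exponential(dose)
    ttds = sample_ttd_weibull(1000)
    print(f"P(death | dose={dose:.0e}) = {p:.3f}; median TTD = {np.median(ttds):.1f} days")
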
3.
Biosecur Bioterror ; 6(2): 147-60, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18582166

ABSTRACT

The notion that inhalation of a single Bacillus anthracis spore is fatal has become entrenched nearly to the point of urban legend, in part because of incomplete articulation of the scientific basis for microbial risk assessment, particularly dose-response assessment. Risk analysis (i.e., risk assessment, risk communication, risk management) necessitates transparency: distinguishing scientific facts, hypotheses, judgments, biases in interpretations, and potential misinformation. The difficulty in achieving transparency for biothreat risk is magnified by misinformation and poor characterization of both dose-response relationships and the driving mechanisms that cause susceptibility or resistance to disease progression. Regrettably, this entrenchment unnecessarily restricts preparedness planning to a single response scenario: decontaminate until no spores are detectable in air, water, or on surfaces, essentially forcing a zero-tolerance policy inconsistent with the biology of anthrax. We present evidence about inhalation anthrax dose-response relationships, including reports from multiple studies documenting exposures insufficient to cause inhalation anthrax in laboratory animals and humans. The emphasis of the article is clarification about what is known from objective scientific evidence for doses of anthrax spores associated with survival and mortality. From this knowledge base, we discuss the need for future applications of more formal risk analysis processes to guide development of alternative non-zero criteria or standards based on science to inform preparedness planning and other risk management activities.


Subject(s)
Anthrax/microbiology , Anthrax/mortality , Bacillus anthracis/pathogenicity , Inhalation Exposure , Spores, Bacterial/pathogenicity , Animals , Anthrax/epidemiology , Anthrax/prevention & control , Global Health , Humans , Risk Assessment
4.
Vet Microbiol ; 131(3-4): 215-28, 2008 Oct 15.
Article in English | MEDLINE | ID: mdl-18479846

ABSTRACT

As laying hens age, egg production and quality decrease. Egg producers can impose an induced molt on older hens that results in increased egg productivity and decreased hen mortality compared with non-molted hens of the same age. This review discusses the effect of induced molting by feed removal on immune parameters, Salmonella enterica serovar Enteritidis (SE) invasion, and subsequent production of SE-contaminated eggs. Experimental oral infections with SE show that molted hens are more susceptible to SE infection and produce more SE-contaminated eggs in the first few weeks post-molt compared with pre-molt egg production. In addition, it appears that molted hens are more likely to disseminate SE into their environment. Molted hens are more susceptible to SE infection by contact exposure to experimentally infected hens; thus, transmission of SE among molted hens could be more rapid than among non-molted birds. Histological examination of the gastrointestinal tracts of molted SE-infected hens revealed more frequent and severe intestinal mucosal lesions compared with non-molted SE-infected hens. These data suggest that induced molting by feed deprivation alters the normal asymptomatic host-pathogen relationship. Published data suggest the highest proportion of SE-positive eggs is produced within 1-5 weeks post-molt, decreases sharply by 6-10 weeks, and dissipates to the background level for non-molted hens by 11-20 weeks. Appropriate treatment of eggs produced in the first 5 weeks post-molt may decrease the risk of foodborne infections in humans.


Subject(s)
Chickens/microbiology , Chickens/physiology , Food Deprivation , Molting , Ovum/microbiology , Salmonella enteritidis/isolation & purification , Animals , Female
5.
Foodborne Pathog Dis ; 5(1): 59-68, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18260816

ABSTRACT

As part of the process for developing risk-based performance standards for egg product processing, the United States Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) undertook a quantitative microbial risk assessment for Salmonella spp. in pasteurized egg products. The assessment was designed to assist risk managers in evaluating egg handling and pasteurization performance standards for reducing the likelihood of Salmonella in pasteurized egg products and the subsequent risk to human health. The following seven pasteurized liquid egg product formulations were included in the risk assessment model, with the value in parentheses indicating the estimated annual number of human illnesses from Salmonella from each: egg white (2636), whole egg (1763), egg yolk (708), whole egg with 10% salt (407), whole egg with 10% sugar (0), egg yolk with 10% salt (11), and egg yolk with 10% sugar (0). Increased levels of pasteurization were predicted to be highly effective mitigations for reducing the number of illnesses. For example, if all egg white products were pasteurized for a 6-log10 reduction of Salmonella, the estimated annual number of illnesses from these products would be reduced from 2636 to 270. The risk assessment identified several data gaps and research needs, including a quantitative study of cross-contamination during egg product processing and characterization of egg storage times and temperatures (i) on farms and in homes, (ii) for eggs produced off-line, and (iii) for egg products at retail. Pasteurized egg products are a relatively safe food; however, findings from this study suggest increased pasteurization can make them safer.


Subject(s)
Eggs/microbiology , Food Contamination/analysis , Food Handling/methods , Risk Assessment , Salmonella/growth & development , Animals , Chickens , Colony Count, Microbial , Consumer Product Safety , Egg White/microbiology , Egg Yolk/microbiology , Food Preservation/methods , Hot Temperature , Humans , Time Factors , United States , United States Department of Agriculture
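
A toy calculation, not the FSIS model, can show how an additional log10 of pasteurization lethality propagates to expected illnesses under a simple linear low-dose approximation of dose-response. All numbers below are placeholders chosen only to show the arithmetic.

    # Toy illustration of log-reduction arithmetic (placeholder values, not FSIS inputs).
    def surviving_cells(initial_cells, log10_reduction):
        return initial_cells * 10 ** (-log10_reduction)

    def expected_illnesses(servings, cells_per_serving, prob_illness_per_cell):
        # Linear low-dose approximation: risk per serving ~ dose x probability per cell
        return servings * cells_per_serving * prob_illness_per_cell

    five_log = expected_illnesses(1e9, surviving_cells(100, 5), 1e-3)
    six_log = expected_illnesses(1e9, surviving_cells(100, 6), 1e-3)
    print(f"Illustrative annual illnesses: 5-log {five_log:.0f} vs 6-log {six_log:.0f}")
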
6.
Foodborne Pathog Dis ; 3(4): 403-12, 2006.
Article in English | MEDLINE | ID: mdl-17199522

ABSTRACT

In 1998, the United States Department of Agriculture's Food Safety and Inspection Service (FSIS) and the Food and Drug Administration completed a risk assessment that indicated multiple interventions along the farm-to-table chain were needed to reduce the risk of human illness from Salmonella Enteritidis in shell eggs. Based on newly available data and improved modeling techniques, FSIS completed an updated risk assessment to examine the effect of pasteurization and refrigeration on reducing human illnesses from S. Enteritidis in shell eggs. The risk assessment model was written in Visual Basic for Applications (Microsoft, Redmond, WA) and run using Monte Carlo methods. The model estimated that if all shell eggs produced in the United States were pasteurized for a 3-log10 reduction of S. Enteritidis, the annual number of illnesses from S. Enteritidis in eggs would decrease from approximately 130,000 to 40,000. Pasteurization for a 5-log10 reduction of S. Enteritidis was estimated to reduce the annual number of illnesses to 19,000. The model also estimated that if all eggs produced in the United States were stored and held at 7.2 degrees C within 12 hours of lay, the annual number of illnesses from S. Enteritidis in eggs would decrease from 130,000 to 28,000. As a result, rapid cooling and pasteurization of shell eggs were predicted to be highly effective mitigations for reducing illnesses from consumption of S. Enteritidis in shell eggs.


Subject(s)
Consumer Product Safety , Eggs/microbiology , Food Contamination/analysis , Risk Assessment , Salmonella Food Poisoning/epidemiology , Salmonella enteritidis/isolation & purification , Animals , Chickens , Eggs/standards , Food Inspection , Humans , Monte Carlo Method , Salmonella Food Poisoning/etiology , United States/epidemiology
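
The abstract notes the model was written in Visual Basic for Applications and run with Monte Carlo methods; the schematic Python sketch below shows the general structure of such a simulation. Prevalence, dose, and dose-response parameters are placeholders, not the risk assessment's inputs.

    # Schematic Monte Carlo simulation of illnesses from contaminated shell eggs
    # (placeholder parameters; structure only, not the FSIS model).
    import numpy as np

    rng = np.random.default_rng(1)
    n_servings = 5_000_000
    prevalence = 1e-3          # fraction of contaminated servings (placeholder)
    log10_reduction = 3.0      # pasteurization lethality scenario
    r = 2e-3                   # exponential dose-response parameter (placeholder)

    contaminated = rng.random(n_servings) < prevalence
    doses = np.zeros(n_servings)
    doses[contaminated] = rng.lognormal(mean=6.0, sigma=2.0, size=contaminated.sum())
    doses *= 10 ** (-log10_reduction)      # apply the pasteurization reduction
    p_ill = 1.0 - np.exp(-r * doses)       # exponential dose-response
    illnesses = rng.random(n_servings) < p_ill
    print(f"Predicted illnesses per {n_servings:,} servings: {illnesses.sum()}")
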
7.
J Toxicol Environ Health A ; 67(8-10): 667-85, 2004.
Article in English | MEDLINE | ID: mdl-15192861

ABSTRACT

In order to estimate the risk or probability of adverse events in risk assessment, it is necessary to identify the important variables that contribute to the risk and provide descriptions of distributions of these variables for well-defined populations. One component of modeling dose response that can create uncertainty is the inherent genetic variability among pathogenic bacteria. For many microbial risk assessments, the "default" assumption used for dose response does not account for strain or serotype variability in pathogenicity and virulence, other than perhaps recognizing the existence of avirulent strains. However, an examination of data sets from human clinical trials in which Salmonella spp. and Campylobacter jejuni strains were administered reveals significant strain differences. This article discusses the evidence for strain variability and concludes that more biologically based alternatives are necessary to replace the default assumptions commonly used in microbial risk assessment, specifically regarding strain variability.


Subject(s)
Campylobacter Infections/microbiology , Campylobacter jejuni/classification , Food Microbiology , Risk Assessment , Salmonella Food Poisoning/microbiology , Salmonella/classification , Campylobacter jejuni/pathogenicity , Humans , Salmonella/pathogenicity
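
To make the abstract's point concrete, the sketch below fits a separate exponential dose-response parameter to each strain by maximum likelihood. The feeding-trial counts are invented placeholders, and the exponential model is one common choice rather than necessarily the authors' preferred alternative.

    # Strain-specific dose-response fitting (illustrative, placeholder data).
    import numpy as np
    from scipy.optimize import minimize_scalar

    def neg_log_lik(k, doses, n_exposed, n_ill):
        p = 1.0 - np.exp(-k * doses)               # exponential dose-response
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -np.sum(n_ill * np.log(p) + (n_exposed - n_ill) * np.log(1 - p))

    strain_data = {  # dose (CFU), subjects exposed, subjects ill -- placeholders
        "strain_A": (np.array([1e3, 1e5, 1e7]), np.array([10, 10, 10]), np.array([1, 4, 9])),
        "strain_B": (np.array([1e3, 1e5, 1e7]), np.array([10, 10, 10]), np.array([0, 1, 5])),
    }
    for strain, (d, n, y) in strain_data.items():
        fit = minimize_scalar(neg_log_lik, bounds=(1e-12, 1e-2), args=(d, n, y), method="bounded")
        print(f"{strain}: fitted k = {fit.x:.2e}")
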
9.
Risk Anal ; 23(1): 215-28, 2003 Feb.
Article in English | MEDLINE | ID: mdl-12635734

ABSTRACT

A novel extension of traditional growth models for exposure assessment of food-borne microbial pathogens was developed to address the complex interactions of competing microbial populations in foods. Scenarios were designed for baseline refrigeration and mild abuse of servings of chicken broiler and ground beef. Our approach employed high-quality data for microbiology of foods at production, refrigerated storage temperatures, and growth kinetics of microbial populations in culture media. Simple parallel models were developed for exponential growth of multiple pathogens and the abundant and ubiquitous nonpathogenic indigenous microbiota. Monte Carlo simulations were run for unconstrained growth and growth with the density-dependent constraint based on the "Jameson effect," inhibition of pathogen growth when the indigenous microbiota reached 10^9 counts per serving. The modes for unconstrained growth of the indigenous microbiota were 10^8, 10^10, and 10^11 counts per serving for chicken broilers, and 10^7, 10^9, and 10^11 counts per serving for ground beef at respective sites for backroom, meat case, and home refrigeration. In the baseline refrigeration scenario, contamination and temperatures supporting pathogen growth were rare events. The unconstrained exponential growth models appeared to overestimate Listeria monocytogenes growth maxima for the baseline refrigeration scenario by 1500-7233% (10^6-10^7 counts/serving) when the inhibitory effects of the indigenous microbiota are ignored. The extreme tails of the distributions for the constrained models appeared to overestimate growth maxima by 110% (10^4-10^5 counts/serving) for Salmonella spp. and 108% (6 x 10^3 counts/serving) for E. coli O157:H7 relative to the extremes of the unconstrained models. The approach of incorporating parallel models for pathogens and the indigenous microbiota into exposure assessment modeling motivates the design of validation studies to test the modeling assumptions, consistent with the analytical-deliberative process of risk analysis.


Subject(s)
Food Microbiology , Meat/microbiology , Poultry/microbiology , Animals , Colony Count, Microbial , Escherichia coli O157/growth & development , Humans , Listeria monocytogenes/growth & development , Models, Biological , Refrigeration , Risk Assessment , Safety , Salmonella/growth & development
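
A minimal sketch of the parallel-growth idea with a Jameson-effect constraint: the pathogen and the indigenous microbiota each grow exponentially, and pathogen growth is halted once the microbiota reaches about 10^9 counts per serving. Growth rates and initial counts are illustrative, not the study's inputs.

    # Parallel exponential growth with a density-dependent (Jameson effect) cap.
    import numpy as np

    def grow(n0, rate_per_h, hours):
        return n0 * np.exp(rate_per_h * hours)

    hours = np.arange(0.0, 120.0, 1.0)
    microbiota = grow(1e4, 0.15, hours)            # indigenous microbiota per serving
    pathogen_unconstrained = grow(1e1, 0.08, hours)

    cap_reached = microbiota >= 1e9                # Jameson effect threshold
    if cap_reached.any():
        t_stop = hours[cap_reached][0]
        pathogen_constrained = np.where(hours <= t_stop,
                                        pathogen_unconstrained,
                                        grow(1e1, 0.08, t_stop))
    else:
        pathogen_constrained = pathogen_unconstrained

    print(f"Unconstrained pathogen max: {pathogen_unconstrained.max():.2e}; "
          f"constrained max: {pathogen_constrained.max():.2e}")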