Results 1 - 8 of 8
1.
J Food Prot ; 78(2): 240-7, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25710137

ABSTRACT

A field trial in Salinas Valley, California, was conducted during July 2011 to quantify the microbial load that transfers from wildlife feces onto nearby lettuce during foliar irrigation. Romaine lettuce was grown using standard commercial practices and irrigated using an impact sprinkler design. Five grams of rabbit feces was spiked with 1.29 × 10^8 CFU of Escherichia coli O157:H7 and placed 3, 2, and 1 days before, and immediately before, a 2-h irrigation event. Immediately after irrigation, 168 heads of lettuce ranging from ca. 23 to 69 cm (9 to 27 in.) from the fecal deposits were collected, and the concentration of E. coli O157:H7 was determined. Thirty-eight percent of the collected lettuce heads had detectable E. coli O157:H7, ranging from 1 MPN to 2.30 × 10^5 MPN per head, with a mean concentration of 7.37 × 10^3 MPN per head. Based on this weighted arithmetic mean of 7.37 × 10^3 MPN of bacteria per positive head, only 0.00573% of the mean load of 1.29 × 10^8 CFU in the original 5 g of scat was transferred to the positive heads of lettuce. Bacterial contamination was limited to the outer leaves of the lettuce. In addition, factors associated with the transfer of E. coli O157:H7 from scat to lettuce were the distance between the scat and the lettuce, the age of the scat before irrigation, and the mean distance between the scat and the irrigation sprinkler heads. This study quantified the transfer coefficient between scat and adjacent heads of lettuce as a function of irrigation. The data can be used to populate a quantitative produce risk assessment model for E. coli O157:H7 in romaine lettuce to inform risk management and food safety policies.
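The transfer fraction reported in this abstract follows from simple arithmetic on the two published means. The sketch below (variable names are mine) reproduces it; the small difference from the reported 0.00573% reflects rounding of the published values:

```python
# Mean E. coli O157:H7 load spiked into each 5-g fecal deposit (CFU)
scat_load = 1.29e8

# Weighted arithmetic mean recovered per positive lettuce head (MPN)
mean_per_positive_head = 7.37e3

# Fraction of the original load transferred to a positive head, as a percentage
transfer_pct = mean_per_positive_head / scat_load * 100
print(round(transfer_pct, 5))  # → 0.00571
```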


Subject(s)
Escherichia coli O157/isolation & purification , Feces/microbiology , Food Contamination/analysis , Lactuca/microbiology , Animals , Animals, Wild , California , Colony Count, Microbial , Consumer Product Safety , Food Microbiology , Food Safety , Plant Leaves/microbiology
2.
Int J Food Microbiol ; 133(1-2): 38-47, 2009 Jul 31.
Article in English | MEDLINE | ID: mdl-19450890

ABSTRACT

It is widely recognized that the human noroviruses (HuNoV) are responsible for a large proportion of the world's foodborne disease burden. These viruses are transmitted by human fecal contamination and frequently make their way into foods because of poor personal hygiene of infected food handlers. This paper describes a probabilistic exposure assessment which models the dynamics of the transmission of HuNoV in the retail food preparation environment. Key inputs included degree of fecal shedding, hand hygiene behaviors, efficacy of virus removal and/or inactivation, and transferability of virus between surfaces. The model has a temporal dimension allowing contamination to be estimated as a function of time over the simulation period. Sensitivity and what-if scenario analyses were applied to identify the most important model inputs and evaluate potential mitigation strategies. The key inputs affecting estimates of the number of infectious viruses present in contaminated food servings, given the current model structure and assumptions, were as follows: mass of feces on hands (m_FH), concentration of virus in feces (nv_CF), number of bathroom visits, degree of gloving compliance (p_WG), hand-washing efficiency (HW_eff), and hand-washing compliance (p_HW). The model suggests that gloving and hand-washing compliance are most effective in controlling contamination of food products when practiced simultaneously. Moreover, the bathroom environment was identified as a major reservoir of HuNoV, even in the absence of an ill individual on site. This mathematical approach to modeling the transmission of gastrointestinal viruses should facilitate comparison of potential mitigations aimed at reducing the transmission of foodborne viruses.
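The finding that gloving and hand-washing compliance work best in combination can be illustrated with a toy Monte Carlo sketch. Every distribution and parameter value below is an illustrative assumption, not an input of the published exposure model:

```python
import random

def mean_virus_on_food(n_sim=20000, p_wash=0.5, p_glove=0.5,
                       hw_log_reduction=2.0, seed=1):
    """Toy sketch of HuNoV transfer from a food worker's hands to food.
    All distributions and parameters are illustrative assumptions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        viruses = 10 ** rng.uniform(2, 6)       # on hands after a bathroom visit (assumed)
        if rng.random() < p_wash:
            viruses /= 10 ** hw_log_reduction   # hand washing removes ~2 log10 (assumed)
        if rng.random() < p_glove:
            viruses *= 0.01                     # gloves block most transfer (assumed)
        total += viruses * 0.10                 # 10% hand-to-food transfer (assumed)
    return total / n_sim

noncompliant = mean_virus_on_food(p_wash=0.0, p_glove=0.0)
compliant = mean_virus_on_food(p_wash=1.0, p_glove=1.0)
print(compliant < noncompliant)  # → True: simultaneous compliance cuts mean contamination
```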


Subject(s)
Caliciviridae Infections/transmission , Food Contamination/prevention & control , Food Handling/standards , Foodborne Diseases/prevention & control , Gastroenteritis/prevention & control , Norovirus , Caliciviridae Infections/prevention & control , Disease Reservoirs/virology , Feces/virology , Food Contamination/analysis , Foodborne Diseases/virology , Gastroenteritis/virology , Hand Disinfection/standards , Humans , Hygiene/standards , Models, Theoretical
3.
J Food Prot ; 70(2): 363-72, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17340870

ABSTRACT

This study describes an analytical framework that permits quantitative consideration of variability and uncertainty in microbial hazard characterization. Second-order modeling that used two-dimensional Monte Carlo simulation and stratification into homogeneous population subgroups was applied to integrate uncertainty and variability. Specifically, the bootstrap method was used to simulate sampling error due to the limited sample size in microbial dose-response modeling. A data set from human feeding trials with Campylobacter jejuni was fitted to the log-logistic dose-response model, and results from the analysis of FoodNet surveillance data provided further information on variability and uncertainty in Campylobacter susceptibility due to the effect of age. Results of our analyses indicate that uncertainty associated with dose-response modeling has a dominating influence on the analytical outcome. In contrast, inclusion of the age factor has a limited impact. While advocating closer modeling of variability in hazard characterization is warranted, identifying the key sources of uncertainty and propagating them consistently throughout a microbial risk assessment appear to be of greater importance.
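The bootstrap treatment of sampling error in dose-response data can be sketched as follows, using made-up feeding-trial-style counts rather than the actual C. jejuni data: resampling individual subject outcomes within a dose group yields an uncertainty interval for the attack rate that reflects the limited sample size.

```python
import random

# Illustrative feeding-trial-style data: dose -> (n exposed, n ill).
# These counts are invented for the sketch, not the C. jejuni trial values.
groups = {1e4: (10, 1), 1e6: (10, 4), 1e8: (10, 8)}

def bootstrap_attack_rate(n, k, n_boot=5000, seed=0):
    """Resample individual outcomes to quantify sampling uncertainty
    in the observed attack rate (a 95% bootstrap percentile interval)."""
    rng = random.Random(seed)
    outcomes = [1] * k + [0] * (n - k)
    rates = sorted(sum(rng.choices(outcomes, k=n)) / n for _ in range(n_boot))
    return rates[int(0.025 * n_boot)], rates[int(0.975 * n_boot)]

lo, hi = bootstrap_attack_rate(*groups[1e6])
print(lo <= 0.4 <= hi)  # → True: the observed rate lies inside its bootstrap interval
```

In the paper's second-order framework, an interval like this would be carried through the outer (uncertainty) loop of the two-dimensional simulation rather than collapsed to a point estimate.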


Subject(s)
Campylobacter jejuni/pathogenicity , Food Contamination/analysis , Food Microbiology , Models, Biological , Risk Assessment , Adolescent , Adult , Animals , Child , Child, Preschool , Colony Count, Microbial , Consumer Product Safety , Female , Food Contamination/statistics & numerical data , Humans , Infant , Logistic Models , Male , Middle Aged , Monte Carlo Method , Predictive Value of Tests
4.
Risk Anal ; 26(3): 753-68, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16834632

ABSTRACT

We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors on salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories ranked from most to least risk of illness were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.
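Sobol's method decomposes the output variance into contributions from each input, which is what lets it handle the nonlinearities and interactions mentioned above. A minimal first-order estimator (Saltelli's pick-freeze scheme) on a toy model, not the egg model itself:

```python
import random

def model(x1, x2, x3):
    # Toy additive model in which x2 dominates the output variance (illustrative only)
    return x1 + 4 * x2 + 0.5 * x3

def sobol_first_order(f, dim=3, n=20000, seed=0):
    """Estimate first-order Sobol indices with a pick-freeze estimator:
    S_i = E[y_B * (y_ABi - y_A)] / Var(y), where ABi is A with column i from B."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(*r) for r in A]
    yB = [f(*r) for r in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        yABi = [f(*[B[j][k] if k == i else A[j][k] for k in range(dim)])
                for j in range(n)]
        indices.append(sum(yB[j] * (yABi[j] - yA[j]) for j in range(n)) / n / var)
    return indices

S = sobol_first_order(model)
print(S.index(max(S)))  # → 1  (x2 accounts for most of the output variance)
```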


Subject(s)
Food Microbiology , Risk Assessment/methods , Salmonella Food Poisoning/etiology , Salmonella Food Poisoning/prevention & control , Salmonella Infections/etiology , Salmonella enteritidis/pathogenicity , Bacteriophage Typing , Consumer Product Safety , Cooking , Eggs , Food , Food Handling , Models, Statistical , Risk , Risk Factors
5.
J Food Prot ; 69(3): 609-18, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16541693

ABSTRACT

Microbial food safety process risk models are simplifications of the real world that help risk managers in their efforts to mitigate food safety risks. An important tool in these risk assessment endeavors is sensitivity analysis, a systematic method used to quantify the effect of changes in input variables on model outputs. In this study, a novel sensitivity analysis method called classification and regression trees was applied to food safety risk assessment, using portions of the Slaughter Module and Preparation Module of the E. coli O157:H7 microbial food safety process risk model as an example. Specifically, the classification and regression trees sensitivity analysis method was evaluated on the basis of its ability to address typical characteristics of microbial food safety process risk models such as nonlinearities, interaction, thresholds, and categorical inputs. Moreover, this method was evaluated with respect to identification of high exposure scenarios and corresponding key inputs and critical limits. The results from the classification and regression trees analysis applied to the Slaughter Module confirmed that the process of chilling carcasses is a critical control point. The method identified a cutoff value of a 2.2-log increase in the number of organisms during chilling as a critical value above which high levels of contamination would be expected. When classification and regression trees analysis was applied to the cooking effects part of the Preparation Module, cooking temperature was found to be the most sensitive input, with precooking treatment (i.e., raw product storage conditions) ranked second in importance. This case study demonstrates the capabilities of classification and regression trees analysis as an alternative to other statistically based sensitivity analysis methods, and one that can readily address specific characteristics that are common in microbial food safety process risk models.
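The core of classification and regression trees is a variance-minimizing split search, which is how a cutoff such as the 2.2-log chilling increase can be recovered. A one-split sketch on synthetic data (the data, and hence the 2.2 breakpoint, are constructed here purely for illustration):

```python
def best_split(x, y):
    """CART-style split search: choose the cutoff on x that minimizes the
    summed within-group squared error of y on the two sides of the cut."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    pairs = sorted(zip(x, y))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint between neighbors
        left = [v for u, v in pairs if u <= cut]
        right = [v for u, v in pairs if u > cut]
        best = min(best, (sse(left) + sse(right), cut))
    return best[1]

# Synthetic data: contamination jumps once log growth during chilling exceeds ~2.2
growth = [0.5, 1.0, 1.5, 2.0, 2.4, 2.8, 3.2, 3.6]
contam = [1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 4.9, 5.1]
print(best_split(growth, contam))  # → 2.2
```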


Subject(s)
Consumer Product Safety , Escherichia coli O157/growth & development , Food Handling/methods , Risk Assessment/methods , Food Microbiology , Food-Processing Industry/standards , Humans , Models, Biological , Regression Analysis , Risk Assessment/standards , Sensitivity and Specificity , Temperature , Time Factors
6.
J Expo Sci Environ Epidemiol ; 16(6): 491-506, 2006 Nov.
Article in English | MEDLINE | ID: mdl-16519411

ABSTRACT

Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
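The distinction between sampling-based and variance-based techniques matters most when an input acts non-monotonically: correlation-based screening can miss it entirely even though it contributes real output variance. A small sketch on a toy model (not SHEDS):

```python
import random

rng = random.Random(0)
n = 50000
x1 = [rng.random() for _ in range(n)]
x2 = [rng.random() for _ in range(n)]
# y depends linearly on x1 but non-monotonically on x2,
# so corr(y, x2) is near zero despite x2's real influence
y = [a + 5 * (b - 0.5) ** 2 for a, b in zip(x1, x2)]

def pearson(u, v):
    """Plain Pearson correlation coefficient."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

print(abs(pearson(y, x2)) < 0.05)  # → True: correlation misses the quadratic input
print(abs(pearson(y, x1)) > 0.4)   # → True: the linear input is detected
```

A variance-based index (Sobol or FAST) would attribute a substantial share of Var(y) to x2, which is why those methods are the more robust choice for the important inputs.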


Subject(s)
Environmental Exposure/analysis , Environmental Pollutants/analysis , Models, Theoretical , Pesticides/analysis , Stochastic Processes , Analysis of Variance , Fourier Analysis , Humans , Risk Assessment , Sampling Studies , Sensitivity and Specificity , Statistics, Nonparametric , United States , United States Environmental Protection Agency/standards
7.
Risk Anal ; 26(1): 89-103, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16492183

ABSTRACT

The foodborne disease risk associated with the pathogen Listeria monocytogenes has been the subject of recent efforts in quantitative microbial risk assessment. Building upon one of these efforts undertaken jointly by the U.S. Food and Drug Administration and the U.S. Department of Agriculture (USDA), the purpose of this work was to expand on the consumer phase of the risk assessment to focus on handling practices in the home. One-dimensional Monte Carlo simulation was used to model variability in growth and cross-contamination of L. monocytogenes during food storage and preparation of deli meats. Simulations approximated that 0.3% of the servings were contaminated with >10^4 CFU/g of L. monocytogenes at the time of consumption. The estimated mean risk associated with the consumption of deli meats for the intermediate-age population was approximately 7 deaths per 10^11 servings. Food handling in homes increased the estimated mean mortality by 10^6-fold. Of all the home food-handling practices modeled, inadequate storage, particularly refrigeration temperatures, provided the greatest contribution to increased risk. The impact of cross-contamination in the home was considerably less. Adherence to USDA Food Safety and Inspection Service recommendations for consumer handling of ready-to-eat foods substantially reduces the risk of listeriosis.
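The dominant role of refrigeration temperature can be illustrated with a square-root-type (Ratkowsky) growth model: growth accelerates quadratically above the minimum growth temperature, so a warm-refrigerator tail drives a disproportionate share of servings past a given growth threshold. The parameter values and temperature distribution below are illustrative placeholders, not the assessment's fitted inputs:

```python
import random

def log10_growth(temp_c, days, t_min=-1.2, b=0.023):
    """Square-root-model sketch of L. monocytogenes growth during storage.
    t_min and b are illustrative placeholders, not the assessment's values."""
    if temp_c <= t_min:
        return 0.0
    rate = (b * (temp_c - t_min)) ** 2  # log10 CFU per day
    return rate * days

rng = random.Random(0)
# Illustrative home-refrigerator temperatures; a few units run warm
temps = [rng.gauss(5.0, 2.0) for _ in range(10000)]
growth = [log10_growth(t, days=10) for t in temps]
frac_half_log = sum(g > 0.5 for g in growth) / len(growth)
print(0.0 < frac_half_log < 0.5)  # only the warm-fridge tail exceeds half a log of growth
```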


Subject(s)
Food Microbiology , Listeria monocytogenes/isolation & purification , Listeria monocytogenes/pathogenicity , Meat/microbiology , Animals , Food Handling , Foodborne Diseases/etiology , Foodborne Diseases/mortality , Foodborne Diseases/prevention & control , Humans , Listeria monocytogenes/growth & development , Listeriosis/etiology , Listeriosis/mortality , Listeriosis/prevention & control , Models, Biological , Refrigeration , Risk Assessment , United States/epidemiology
8.
Risk Anal ; 25(6): 1511-29, 2005 Dec.
Article in English | MEDLINE | ID: mdl-16506979

ABSTRACT

This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
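The ANOVA approach scores each input by how much the output mean shifts across bins of that input; an unimportant input yields an F value near 1. A minimal F-statistic sketch on synthetic data (not the MFSPR model itself):

```python
import random

def anova_f(x, y, bins=5):
    """One-way ANOVA F statistic of y across equal-count bins of x.
    A large F flags x as an influential input."""
    pairs = sorted(zip(x, y))
    n = len(pairs)
    groups = [[v for _, v in pairs[i * n // bins:(i + 1) * n // bins]]
              for i in range(bins)]
    grand = sum(y) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    return (ss_between / (bins - 1)) / (ss_within / (n - bins))

rng = random.Random(0)
n = 5000
storage = [rng.random() for _ in range(n)]  # influential input (illustrative)
noise = [rng.random() for _ in range(n)]    # unimportant input (illustrative)
risk = [3 * s + rng.gauss(0, 0.3) for s in storage]

print(anova_f(storage, risk) > anova_f(noise, risk))  # → True
```

Because binning captures non-monotonic mean shifts, this ranking remains informative where correlation coefficients mislead, which matches the comparison reported in the abstract.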


Subject(s)
Food Microbiology , Models, Statistical , Risk Assessment/statistics & numerical data , Analysis of Variance , Animals , Cattle , Escherichia coli O157/isolation & purification , Escherichia coli O157/pathogenicity , Meat/microbiology , Risk Management , Safety , Sensitivity and Specificity