1.
Int J Food Microbiol ; 378: 109801, 2022 Oct 02.
Article in English | MEDLINE | ID: mdl-35749912

ABSTRACT

The United States Department of Agriculture's Food Safety and Inspection Service implemented Salmonella performance standards for establishments producing chicken parts in 2016. The standards were chosen based on the assumption that a 30% reduction in the occurrence of Salmonella-contaminated chicken parts samples (i.e., legs, breasts or wings) would result following implementation of the performance standard program. The performance standards were derived from data collected prior to their implementation, and sampling has continued in the intervening years, so overall changes in the Salmonella contamination of this product can be assessed. This study presents a historical review of changes in Salmonella contamination on chicken parts as these changes relate to the performance standard. The analysis demonstrates that the reduction in Salmonella-contaminated chicken parts samples was more than 75%, so the FSIS risk assessment significantly underestimated the actual reduction in Salmonella contamination. An analysis of chicken parts samples collected at retail demonstrates reductions of a similar magnitude. Changes in the characteristics of Salmonella contamination that are potentially relevant to the occurrence or severity of human illness, such as seasonal changes in contamination, the composition of serotypes and changes in antimicrobial resistance, are also assessed. Small but significant seasonal increases in contamination were observed, with peaks occurring in late winter rather than the more traditional late summer. Rapid changes in both the five most common serotypes and antimicrobial resistance patterns were also observed.


Subject(s)
Anti-Infective Agents , Chickens , Animals , Anti-Infective Agents/analysis , Food Contamination/analysis , Food Contamination/prevention & control , Food Microbiology , Humans , Meat/analysis , Salmonella , United States
2.
Epidemiol Infect ; 150: e126, 2022 06 15.
Article in English | MEDLINE | ID: mdl-35703081

ABSTRACT

Using data from 20 years of Salmonella foodborne outbreaks, this study investigates significant trends in the proportion of outbreaks associated with 12 broad commodity groups. Outbreak counts are demonstrated to have a stronger trend signal than outbreak illness counts. The number of outbreaks with an identified food vehicle increased significantly between 1998 and 2000. This was followed by a 10-year period when the number of outbreaks decreased. The number of outbreaks increased significantly between 2010 and 2014 and then remained unchanged for the remainder of the study period. From 1998 through 2017, the proportion of outbreaks for three commodity groups, consisting of eggs, pork and seeded vegetables, changed significantly. No significant changes were observed in the remaining nine commodity groups. Simple approximations are derived to highlight the effect of dependencies between outbreak proportions, and a consumption analysis for meat and poultry is used to enhance the limited interpretability of the changes in these proportions. Given commodity-specific approaches to verifying food safety and promoting pathogen reduction, regulatory agencies benefit from analyses that elucidate illness trends attributable to the products under their jurisdiction. Results from this trend analysis can be used to inform the development and assessment of new pathogen reduction programmes in the United States.
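
The abstract does not specify the trend method. One common way to test for a monotonic trend in the proportion of outbreaks attributed to a commodity is a binomial regression on year; the sketch below is illustrative only, and the yearly counts are simulated rather than the study's data.

```python
import numpy as np
import statsmodels.api as sm

# hypothetical yearly counts, 1998-2017: outbreaks attributed to eggs versus all
# outbreaks with an identified food vehicle (simulated, not the study's data)
rng = np.random.default_rng(1)
years = np.arange(1998, 2018)
total = rng.integers(60, 120, size=years.size)
eggs = rng.binomial(total, 0.28 - 0.006 * np.arange(years.size))

# binomial GLM: proportion of egg-attributed outbreaks as a function of (centered) year
X = sm.add_constant(years - years.mean())
fit = sm.GLM(np.column_stack([eggs, total - eggs]), X,
             family=sm.families.Binomial()).fit()
print(fit.params)    # slope < 0 indicates a declining proportion
print(fit.pvalues)
```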


Subject(s)
Salmonella Food Poisoning , Disease Outbreaks , Eggs , Food Safety , Humans , Salmonella , Salmonella Food Poisoning/epidemiology , United States/epidemiology
3.
Int J Food Microbiol ; 369: 109616, 2022 May 16.
Article in English | MEDLINE | ID: mdl-35306255

ABSTRACT

In order for the United States Department of Agriculture's (USDA) Food Safety and Inspection Service (FSIS) to make an equivalence determination for a foreign meat, poultry or egg products inspection procedure that differs from FSIS inspection procedures (an Individual Sanitary Measure or ISM), a country must demonstrate objectively that its food safety inspection system provides the same level of public health protection as the FSIS inspection system. To evaluate microbiological testing data that such countries may submit to this end, we present a possible risk metric to inform FSIS's assessment of whether products produced under an alternative inspection system in another country pose no greater consumer risk of foodborne illness than products produced under FSIS inspection. This metric requires evaluating prevalence estimates of pathogen occurrence in products from the foreign country and the U.S. and determining what constitutes an unacceptable deviation of the other country's prevalence from the U.S. prevalence, i.e., the margin of equivalence. We define the margin of equivalence as a multiple of the standard error of the U.S. prevalence estimate. Minimizing the margin of equivalence ensures the maximum public health protection for U.S. consumers, but an optimum choice must also avoid imposing an undue quantitative-data burden on the foreign country's alternative inspection system. Across a wide range of U.S. prevalence levels and sample sizes, we determine margin of equivalence values that provide high confidence in conclusions as to whether or not the country's product poses a greater risk of foodborne illness from microbiological pathogens. These margins of equivalence can be used to inform FSIS's equivalence determination for an ISM request from a foreign country. Illustrative examples are used to support this definition of the margin of equivalence. This approach is consistent with the World Trade Organization's concept of risk equivalence and is transparent and practical to apply when FSIS makes an equivalence determination for an ISM requested by a foreign country.
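
A rough illustration of the margin-of-equivalence idea described above (a multiple of the standard error of the U.S. prevalence estimate). The counts, the multiplier k, and the function name are hypothetical, not values from the paper.

```python
import numpy as np

def margin_of_equivalence_check(us_pos, us_n, foreign_pos, foreign_n, k=2.0):
    """Flag whether a foreign prevalence estimate exceeds the U.S. estimate by
    more than k standard errors of the U.S. estimate (the margin of equivalence)."""
    p_us = us_pos / us_n
    p_foreign = foreign_pos / foreign_n
    se_us = np.sqrt(p_us * (1 - p_us) / us_n)
    margin = k * se_us
    return p_us, p_foreign, margin, p_foreign <= p_us + margin

# hypothetical example: 8.0% U.S. prevalence, 9.5% foreign prevalence
print(margin_of_equivalence_check(us_pos=400, us_n=5000, foreign_pos=38, foreign_n=400))
```

Smaller values of k give U.S. consumers more protection; larger values demand less sampling from the foreign system, which is the trade-off the abstract describes.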


Subject(s)
Food Inspection , Foodborne Diseases , Commerce , Food Contamination/analysis , Food Inspection/methods , Food Microbiology , Foodborne Diseases/epidemiology , Humans , Internationality , Meat/microbiology , United States
4.
Int J Food Microbiol ; 342: 109075, 2021 Mar 16.
Article in English | MEDLINE | ID: mdl-33550153

ABSTRACT

In many countries, campylobacteriosis ranks as one of the most frequently reported foodborne illnesses, and poultry is the commodity most often associated with these illnesses. Nevertheless, efforts to reduce the occurrence of pathogen contamination on poultry are often more focused on Salmonella. While some control measures are pathogen specific, such as pre-harvest vaccination for Salmonella, improvements in sanitary dressing and interventions applied during the slaughter process can be effective against all forms of microbial contamination. To investigate the potential effectiveness of these non-specific pathogen reduction strategies in the United States, it is helpful to assess if, and by how much, Campylobacter contamination of chicken meat has changed over time. This study assesses that change by considering data collected in both slaughter and retail establishments and by comparing observed trends in contamination with trends in human surveillance data. The results support the assertion that substantial reductions in Campylobacter contamination of chicken meat in the late 1990s and early 2000s contributed to a reduction in the human case rate of campylobacteriosis. Further reductions in chicken meat contamination between 2013 and 2018 are more difficult to associate with trends in human illnesses, with one contributing factor being the inclusion of culture-independent diagnostic test results in the official case counts during that time. Other contributing factors are discussed.


Subject(s)
Campylobacter Infections/epidemiology , Campylobacter/isolation & purification , Food Contamination/statistics & numerical data , Foodborne Diseases/epidemiology , Poultry Products/microbiology , Animals , Campylobacter Infections/prevention & control , Chickens , Food Contamination/prevention & control , Food Microbiology , Foodborne Diseases/microbiology , Foodborne Diseases/prevention & control , Humans , United States/epidemiology
5.
J Food Prot ; 83(10): 1707-1717, 2020 Oct 01.
Article in English | MEDLINE | ID: mdl-32421826

ABSTRACT

In 1996, the Food Safety and Inspection Service (FSIS) published its pathogen reduction and hazard analysis and critical control point (PR-HACCP) rule. The intention of this program was to reduce microbial contamination on meat, poultry, and egg products. The program was implemented in stages between January 1998 and January 2000, with sampling for Escherichia coli O157:H7 and/or Salmonella in large production establishments beginning in 1998. As the PR-HACCP program begins its third decade, it is reasonable to question whether there have been reductions in the frequency of pathogen-contaminated meat and poultry products reaching consumers. This study summarizes the results for over 650,000 samples collected by FSIS between 2000 and 2018 in slaughter and processing establishments across the United States and compares these results to the roughly 100,000 retail samples collected by the U.S. Food and Drug Administration between 2002 and 2017. The data demonstrate that there has been an overall reduction in the occurrence of Salmonella on meat and poultry products, but the direction and magnitude of change have not been consistent over time or across commodities. Although the available data do not support the identification of causal factors for the observed changes, a historical review of the timing of various factors and policy decisions generates potential hypotheses for the observed changes.


Subject(s)
Meat Products , Poultry , Animals , Consumer Product Safety , Food Contamination/analysis , Food Inspection , Food Microbiology , Hazard Analysis and Critical Control Points , Meat , Salmonella , United States
6.
J Food Prot ; 81(11): 1851-1863, 2018 11.
Article in English | MEDLINE | ID: mdl-30325223

ABSTRACT

Buffered peptone water is the rinsate commonly used for chicken rinse sampling. A new formulation of buffered peptone water was developed to address concerns about the transfer of antimicrobials, used during poultry slaughter and processing, into the rinsate. This new formulation contains additives to neutralize the antimicrobials, and this neutralizing buffered peptone water replaced the original formulation for all chicken carcass and chicken part sampling programs run by the Food Safety and Inspection Service beginning in July 2016. Our goal was to determine whether the change in rinsate resulted in significant differences in the observed proportion of positive chicken rinse samples for both Salmonella and Campylobacter. This assessment compared sampling results for the 12-month periods before and after implementation. The proportion of carcass samples that tested positive for Salmonella increased from approximately 0.02 to almost 0.06. Concurrently, the proportion of chicken part samples that tested positive for Campylobacter decreased from 0.15 to 0.04. There were no significant differences associated with neutralizing buffered peptone water for the other two product-pathogen pairs. Further analysis of the effect of the new rinsate on corporations that operate multiple establishments demonstrated that changes in the percent positive rates differed across corporations, with some corporations being unaffected, while others saw all of their establishments move from passing to failing the performance standard, or vice versa. The results validated earlier concerns that antimicrobial contamination of rinse samples was causing false-negative Salmonella testing results for chicken carcasses. The results also indicate that additional development work may still be required before the rinsate is sufficiently robust for use in Campylobacter testing.
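
The abstract does not state the significance test used. A simple way to check whether a before/after change in percent positive is statistically significant is a 2x2 chi-square test; the counts below are invented, chosen only to match the approximate proportions quoted above (0.02 to 0.06).

```python
import numpy as np
from scipy.stats import chi2_contingency

# hypothetical carcass sample counts for the 12 months before and after the
# switch to neutralizing buffered peptone water
before = dict(pos=120, n=6000)    # ~0.02 Salmonella positive
after = dict(pos=360, n=6000)     # ~0.06 Salmonella positive

table = np.array([[before["pos"], before["n"] - before["pos"]],
                  [after["pos"], after["n"] - after["pos"]]])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p_value:.2g}")
```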


Subject(s)
Campylobacter , Chickens , Food Handling/methods , Food Microbiology , Salmonella/isolation & purification , Animals , Campylobacter/isolation & purification , Food Contamination , Meat , Peptones , Prevalence , Water , Water Microbiology
7.
Int J Food Microbiol ; 282: 24-27, 2018 Oct 03.
Article in English | MEDLINE | ID: mdl-29885974

ABSTRACT

Advances in microbiological testing methods have led to faster and less expensive assays. Given these advances, it is logical to employ these newer assays in the sampling plan of an existing microbiological criterion. However, a change in the performance characteristics of the assay can alter the intended effect of the microbiological criterion. This study describes a method for updating a 2-class attributes sampling plan to account for the different test sensitivity and specificity of a new assay and provides an example based on the replacement of a culture-based assay with a real-time polymerase chain reaction assay.
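
The abstract does not spell out the adjustment, but a standard way to fold test sensitivity and specificity into a 2-class attributes plan is to evaluate the acceptance probability at the apparent (test-positive) prevalence. A minimal sketch, with hypothetical n, c, and test characteristics:

```python
from scipy.stats import binom

def acceptance_probability(prevalence, n, c, sensitivity=1.0, specificity=1.0):
    """P(accept lot) for a 2-class attributes plan: n samples, accept if at most
    c test positive, with the assay's sensitivity and specificity folded in."""
    p_test_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return binom.cdf(c, n, p_test_pos)

# hypothetical comparison: original culture assay versus a more sensitive PCR assay
print(acceptance_probability(0.10, n=52, c=8, sensitivity=0.70, specificity=1.00))
print(acceptance_probability(0.10, n=52, c=8, sensitivity=0.95, specificity=0.98))
```

With the new assay, c (or n) can then be re-solved so the plan's operating characteristic curve stays close to the one originally intended.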


Subject(s)
Campylobacter/isolation & purification , Chickens/microbiology , Meat Products/microbiology , Microbiological Techniques/methods , Animals , Campylobacter/genetics , Laboratories , Microbiological Techniques/economics , Real-Time Polymerase Chain Reaction , Sensitivity and Specificity
8.
Int J Food Microbiol ; 245: 29-37, 2017 Mar 20.
Article in English | MEDLINE | ID: mdl-28119218

ABSTRACT

The presence or absence of contaminants in food samples changes as a commodity moves along the farm-to-table continuum. Interest lies in the degree to which the prevalence (i.e., infected animals or contaminated sample units) at one location in the continuum, as measured by the proportion of test-positive samples, is correlated with the prevalence at a location later in the continuum. If the two prevalences are strongly correlated, then the effect of changes in contamination on overall food safety can be better understood. Pearson's correlation coefficient is one of the simplest metrics of association between two measurements of prevalence, but it is biased when data consisting of presence/absence testing results are used directly to estimate the correlation. This study demonstrates the potential magnitude of this bias and explores the utility of three methods for unbiased estimation of the degree of correlation in prevalence. An example, based on testing broiler chicken carcasses for Salmonella at re-hang and post-chill, is used to demonstrate the methods.
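
A short simulation can illustrate the bias described above: the Pearson correlation computed directly on paired presence/absence results is attenuated toward zero relative to the correlation between the underlying prevalences. All numbers below are invented for illustration; this is not one of the paper's three estimators.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulated_correlations(n_days=200, carcasses_per_day=10, n_reps=200):
    """Compare the correlation of true daily prevalences at two locations with the
    Pearson correlation computed on paired presence/absence results per carcass."""
    true_r, binary_r = [], []
    for _ in range(n_reps):
        # correlated daily prevalences at re-hang (x) and post-chill (y)
        z = rng.multivariate_normal([0, -1], [[1, 0.8], [0.8, 1]], size=n_days)
        px, py = 1 / (1 + np.exp(-z[:, 0])), 1 / (1 + np.exp(-z[:, 1]))
        true_r.append(np.corrcoef(px, py)[0, 1])
        # individual presence/absence results for each carcass on each day
        x = rng.binomial(1, np.repeat(px, carcasses_per_day))
        y = rng.binomial(1, np.repeat(py, carcasses_per_day))
        binary_r.append(np.corrcoef(x, y)[0, 1])
    return np.mean(true_r), np.mean(binary_r)

print(simulated_correlations())   # the binary-data correlation is biased toward zero
```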


Subject(s)
Food Contamination/analysis , Food Microbiology , Salmonella Infections/epidemiology , Salmonella/isolation & purification , Animals , Chickens , Farms , Food Safety , Meat , Models, Statistical , Prevalence , Reproducibility of Results
9.
Emerg Infect Dis ; 22(7): 1193-200, 2016 07.
Article in English | MEDLINE | ID: mdl-27314510

ABSTRACT

Outbreak data have been used to estimate the proportion of illnesses attributable to different foods. Applying outbreak-based attribution estimates to nonoutbreak foodborne illnesses requires an assumption of similar exposure pathways for outbreak and sporadic illnesses. This assumption cannot be tested directly, but other comparisons can assess its veracity. Our study compares demographic, clinical, temporal, and geographic characteristics of outbreak and sporadic illnesses from Campylobacter, Escherichia coli O157, Listeria, and Salmonella bacteria ascertained by the Foodborne Diseases Active Surveillance Network (FoodNet). Differences among FoodNet sites in outbreak and sporadic illnesses might reflect differences in surveillance practices. For Campylobacter, Listeria, and Escherichia coli O157, outbreak and sporadic illnesses are similar for severity, sex, and age. For Salmonella, outbreak and sporadic illnesses are similar for severity and sex, although the percentage of illnesses in the youngest age category was lower for outbreak illnesses than for sporadic illnesses. Therefore, we do not reject the assumption that outbreak and sporadic illnesses are similar.


Subject(s)
Disease Outbreaks , Epidemiological Monitoring , Food Microbiology , Foodborne Diseases/epidemiology , Population Surveillance/methods , Campylobacter , Campylobacter Infections/epidemiology , Campylobacter Infections/microbiology , Escherichia coli Infections/epidemiology , Escherichia coli Infections/microbiology , Escherichia coli O157 , Humans , Retrospective Studies , Salmonella , Salmonella Infections/epidemiology , Salmonella Infections/microbiology , United States/epidemiology
10.
J Food Prot ; 78(8): 1451-60, 2015 Aug.
Article in English | MEDLINE | ID: mdl-26219357

ABSTRACT

Process models that include the myriad pathways that pathogen-contaminated food may traverse before consumption, together with a dose-response function relating exposure to likelihood of illness, may represent a "gold standard" for quantitative microbial risk assessment. Nevertheless, simplifications that rely on measuring the change in contamination occurrence of a raw food at the end of production may provide reasonable approximations of the effects measured by a process model. In this study, we parameterized three process models representing different product-pathogen pairs (i.e., chicken-Salmonella, chicken-Campylobacter, and beef-E. coli O157:H7) to compare with predictions based on qualitative testing of the raw product before consideration of mixing, partitioning, growth, attenuation, or dose-response processes. The results reveal that reductions in prevalence generated from qualitative testing of raw finished product usually underestimate the reduction in likelihood of illness for a population of consumers. Qualitative microbial testing results depend on the test's limit of detection. The negative bias is greater for limits of detection that are closer to the center of the contamination distribution and decreases as the limit of detection is moved further into the right tail of the distribution. Nevertheless, a positive bias can result when the limit of detection refers to very high contamination levels. Changes in these high levels translate to larger consumed doses, for which the slope of the dose-response function is smaller than the slope associated with smaller doses. Consequently, in these cases, a proportional reduction in prevalence of contamination results in a less than proportional reduction in probability of illness. The magnitudes of the biases are generally smaller for nonscalar (versus scalar) adjustments to the distribution.
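
The qualitative-testing bias described above can be illustrated with a short simulation: compare the change in prevalence (the fraction of servings above a limit of detection) with the change in mean risk from an exponential dose-response model after shifting the contamination distribution. The lognormal parameters, limit of detection, and dose-response parameter below are placeholders, not values from the study's process models.

```python
import numpy as np

rng = np.random.default_rng(3)

def prevalence_and_risk(log10_mean, log10_sd, lod_log10cfu, r=1e-6, n=200_000):
    """Prevalence (fraction of servings above a qualitative limit of detection)
    and mean probability of illness under an exponential dose-response model."""
    doses = 10 ** rng.normal(log10_mean, log10_sd, n)      # cfu per serving
    prevalence = np.mean(np.log10(doses) > lod_log10cfu)
    mean_risk = np.mean(1 - np.exp(-r * doses))
    return prevalence, mean_risk

base = prevalence_and_risk(log10_mean=1.0, log10_sd=1.5, lod_log10cfu=2.0)
new = prevalence_and_risk(log10_mean=0.5, log10_sd=1.5, lod_log10cfu=2.0)   # 0.5-log shift
print("prevalence reduction:", 1 - new[0] / base[0])
print("risk reduction:      ", 1 - new[1] / base[1])
```

Varying lod_log10cfu in this sketch shows how the gap between the two reductions depends on where the limit of detection falls in the contamination distribution.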


Subject(s)
Food Microbiology/methods , Food Safety , Foodborne Diseases/prevention & control , Animals , Campylobacter/growth & development , Campylobacter/isolation & purification , Cattle , Chickens/microbiology , Escherichia coli Infections/epidemiology , Escherichia coli O157/growth & development , Escherichia coli O157/isolation & purification , Food Contamination/analysis , Food Handling/methods , Humans , Meat/microbiology , Models, Statistical , Risk Assessment , Salmonella/isolation & purification
11.
Int J Food Microbiol ; 208: 114-21, 2015 Sep 02.
Article in English | MEDLINE | ID: mdl-26065728

ABSTRACT

The proportion of Campylobacter-contaminated food and water samples collected by different surveillance systems often exhibits seasonal patterns. The incidence of foodborne campylobacteriosis also tends to exhibit strong seasonal patterns. Of the various product classes, the occurrence of Campylobacter contamination can be high on raw poultry products, and chicken is often thought to be one of the leading food vehicles for campylobacteriosis. Two different federal agencies in the United States collected samples of raw chicken products and tested them for the presence of Campylobacter. During the same time period, a consortium of federal and state agencies operated a nationwide surveillance system to monitor cases of campylobacteriosis in the United States. This study uses a common modeling approach to estimate trends and seasonal patterns in both the proportion of raw chicken product samples that test positive for Campylobacter and cases of campylobacteriosis. The results generally support the hypothesis of a weak seasonal increase in the proportion of Campylobacter-positive chicken samples in the summer months, though the number of Campylobacter on test-positive samples is slightly lower during this time period. In contrast, campylobacteriosis cases exhibit a strong seasonal pattern that generally precedes increases in contaminated raw chicken. These results suggest that while contaminated chicken products may be responsible for a substantial number of campylobacteriosis cases, they are most likely not the primary driver of the seasonal pattern in human illness.
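
The "common modeling approach" is not specified in the abstract; one conventional choice is a linear trend plus annual sine/cosine (harmonic) terms fit by least squares. A minimal sketch on simulated monthly percent-positive data (all values invented):

```python
import numpy as np

rng = np.random.default_rng(11)

# hypothetical monthly percent-positive values for raw chicken samples (3 years)
months = np.arange(1, 37)
pct_pos = 10 + 1.5 * np.sin(2 * np.pi * (months - 4) / 12) + rng.normal(0, 0.8, months.size)

# design matrix: intercept, linear trend, and annual harmonic terms
X = np.column_stack([np.ones(months.size), months,
                     np.sin(2 * np.pi * months / 12), np.cos(2 * np.pi * months / 12)])
coef, *_ = np.linalg.lstsq(X, pct_pos, rcond=None)

t = np.arange(1, 13)
seasonal = coef[2] * np.sin(2 * np.pi * t / 12) + coef[3] * np.cos(2 * np.pi * t / 12)
print(f"trend per month: {coef[1]:+.3f}")
print(f"seasonal amplitude: {np.hypot(coef[2], coef[3]):.2f} percentage points, "
      f"peak in month {t[np.argmax(seasonal)]}")
```

Fitting the same form to both the sampling data and the human case counts is what allows the peaks to be compared, as the abstract describes.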


Subject(s)
Campylobacter Infections/epidemiology , Campylobacter/physiology , Food Microbiology , Meat/microbiology , Animals , Campylobacter Infections/microbiology , Chickens , Environmental Microbiology , Humans , Incidence , Poultry Products/microbiology , Seasons , Time Factors , United States/epidemiology
12.
Environ Sci Technol ; 48(22): 13316-22, 2014 Nov 18.
Article in English | MEDLINE | ID: mdl-25333423

ABSTRACT

The fitting of statistical distributions to chemical and microbial contamination data is a common application in risk assessment. These distributions are used to make inferences regarding even the most pedestrian of statistics, such as the population mean. The reason for the heavy reliance on a fitted distribution is the presence of left-, right-, and interval-censored observations in the data sets, with censored observations being the result of nondetects in an assay, the use of screening tests, and other practical limitations. Considerable effort has been expended to develop statistical distributions and fitting techniques for a wide variety of applications. Of the various fitting methods, Markov Chain Monte Carlo methods are common. An underlying assumption for many of the proposed Markov Chain Monte Carlo methods is that the data represent independent and identically distributed (iid) observations from an assumed distribution. This condition is satisfied when samples are collected using a simple random sampling design. Unfortunately, samples of food commodities are generally not collected in accordance with a strict probability design. Nevertheless, pseudosystematic sampling efforts (e.g., collection of a sample hourly or weekly) from a single location in the farm-to-table continuum are reasonable approximations of a simple random sample. The assumption that the data represent an iid sample from a single distribution is more difficult to defend if samples are collected at multiple locations in the farm-to-table continuum or risk-based sampling methods are employed to preferentially select samples that are more likely to be contaminated. This paper develops a weighted bootstrap estimation framework that is appropriate for fitting a distribution to microbiological samples that are collected with unequal probabilities of selection. An example based on microbial data, derived by the Most Probable Number technique, demonstrates the method and highlights the magnitude of biases in an estimator that ignores the effects of an unequal probability sample design.
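
A minimal sketch of the weighted-bootstrap idea, assuming resampling weights proportional to the inverse of each sample's selection probability. The data, selection probabilities, and summary statistics (here the mean and standard deviation of log10 concentrations, ignoring censoring) are illustrative only and simpler than the paper's full framework.

```python
import numpy as np

rng = np.random.default_rng(5)

def weighted_bootstrap_fit(log10_conc, selection_prob, n_boot=2000):
    """Weighted bootstrap: resample observations with probability proportional to
    the inverse of their selection probabilities, then refit a normal distribution."""
    w = 1.0 / np.asarray(selection_prob)
    w = w / w.sum()
    means, sds = [], []
    for _ in range(n_boot):
        resample = rng.choice(log10_conc, size=len(log10_conc), replace=True, p=w)
        means.append(resample.mean())
        sds.append(resample.std(ddof=1))
    return np.mean(means), np.mean(sds), np.percentile(means, [2.5, 97.5])

# hypothetical MPN-based log10 concentrations; risk-based samples (higher
# concentrations) were assumed twice as likely to be selected
conc = rng.normal(0.5, 1.0, 300)
sel = np.where(conc > 1.0, 0.50, 0.25)
print(weighted_bootstrap_fit(conc, sel))
```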


Subject(s)
Environmental Monitoring/methods , Environmental Pollutants/analysis , Environmental Pollution/analysis , Markov Chains , Monte Carlo Method , Probability , Computer Simulation
13.
Int J Food Microbiol ; 175: 1-5, 2014 Apr 03.
Article in English | MEDLINE | ID: mdl-24491921

ABSTRACT

Indicator organisms, such as generic Escherichia coli (GEC) and coliforms, can be used to measure changes in microbial contamination during the production of food products. Large and consistent reductions in the concentration of these organisms demonstrate an effective and well-controlled production process. Nevertheless, it is unclear to what degree concentrations of indicator organisms are related to concentrations of pathogenic organisms such as Campylobacter and Salmonella on a sample-by-sample basis. If a strong correlation exists between the concentrations of different organisms, then monitoring indicator organisms would be a cost-effective surrogate for measuring pathogenic organisms. Calculating the correlation between the concentrations of an indicator and a pathogenic organism is complicated because microbial testing datasets typically contain a large proportion of censored observations (i.e., samples where the true concentration is not observable, with nondetects and samples that are only screen-test positive being examples). This study proposes a maximum likelihood estimator that can be used to estimate the correlation between the concentrations of indicator and pathogenic organisms. An example based on broiler chicken rinse samples demonstrates modest but significant positive correlations between the concentration of the indicator organism GEC and the concentrations of both Campylobacter and Salmonella. A weak positive correlation was also observed between concentrations of Campylobacter and Salmonella, but it was not statistically significant.
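
The abstract does not give the likelihood, but a common formulation treats the paired log10 concentrations as bivariate normal with left-censoring at the limits of detection. The sketch below writes out that kind of likelihood; the variable names, starting values, and limits of detection are assumptions, and the paper's estimator may differ in detail.

```python
import numpy as np
from scipy import optimize, stats

def neg_loglik(params, x, y, cx, cy, lod_x, lod_y):
    """Censored bivariate-normal negative log-likelihood for paired log10
    concentrations; cx/cy flag values censored below lod_x/lod_y."""
    mx, my, log_sx, log_sy, z = params
    sx, sy, rho = np.exp(log_sx), np.exp(log_sy), np.tanh(z)
    cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
    ll = 0.0
    for xi, yi, a, b in zip(x, y, cx, cy):
        if not a and not b:            # both organisms enumerated
            ll += stats.multivariate_normal.logpdf([xi, yi], [mx, my], cov)
        elif not a:                    # pathogen (y) is a nondetect
            mu = my + rho * sy / sx * (xi - mx)
            ll += (stats.norm.logpdf(xi, mx, sx)
                   + stats.norm.logcdf(lod_y, mu, sy * np.sqrt(1 - rho**2)))
        elif not b:                    # indicator (x) is a nondetect
            mu = mx + rho * sx / sy * (yi - my)
            ll += (stats.norm.logpdf(yi, my, sy)
                   + stats.norm.logcdf(lod_x, mu, sx * np.sqrt(1 - rho**2)))
        else:                          # both nondetects
            ll += np.log(stats.multivariate_normal.cdf([lod_x, lod_y], [mx, my], cov))
    return -ll

# usage sketch (x, y: log10 GEC and log10 Campylobacter; censored values set to
# their limits of detection and flagged True in cx, cy):
# fit = optimize.minimize(neg_loglik, x0=[1.0, 0.0, 0.0, 0.0, 0.0],
#                         args=(x, y, cx, cy, 0.0, -1.0), method="Nelder-Mead")
# rho_hat = np.tanh(fit.x[-1])
```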


Subject(s)
Food Microbiology/methods , Models, Theoretical , Animals , Campylobacter/physiology , Chickens/microbiology , Colony Count, Microbial , Escherichia coli/physiology , Food Handling/standards , Food Microbiology/standards , Salmonella/physiology , Statistics as Topic
14.
Int J Food Microbiol ; 165(2): 89-96, 2013 Jul 15.
Article in English | MEDLINE | ID: mdl-23727652

ABSTRACT

Levels of pathogenic organisms in food and water have steadily declined in many parts of the world. A consequence of this reduction is that the proportion of samples that test positive has fallen below 0.1 even for the most contaminated product-pathogen pairings. While this is unequivocally beneficial to public health, datasets with very few enumerated samples present an analytical challenge because a large proportion of the observations are censored values. One application of particular interest to risk assessors is the fitting of a statistical distribution function to datasets collected at some point in the farm-to-table continuum. The fitted distribution forms an important component of an exposure assessment. A number of studies have compared different fitting methods and proposed lower limits on the proportion of samples where the organisms of interest are identified and enumerated, with the recommended lower limit of enumerated samples being 0.2. This recommendation may not be applicable to food safety risk assessments for a number of reasons, which include the development of new Bayesian fitting methods, the use of highly sensitive screening tests, and the generally larger sample sizes found in surveys of food commodities. This study evaluates the performance of a Markov chain Monte Carlo fitting method when used in conjunction with a screening test and enumeration of positive samples by the Most Probable Number technique. The results suggest that levels of contamination for common product-pathogen pairs, such as Salmonella on poultry carcasses, can be reliably estimated with the proposed fitting method and sample sizes in excess of 500 observations. The results do, however, demonstrate that simple guidelines for this application, such as a minimum proportion of positive samples, cannot be provided.


Subject(s)
Food Microbiology/methods , Food Safety/methods , Markov Chains , Monte Carlo Method , Risk Assessment/methods , Meat/microbiology , Probability , Risk Assessment/standards , Sample Size
15.
Int J Food Microbiol ; 162(3): 266-75, 2013 Apr 01.
Article in English | MEDLINE | ID: mdl-23454818

ABSTRACT

This report illustrates how uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model concerning Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second-order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home and consumption. The model accounted for growth inhibitor use and retail cross contamination, and applied an FAO/WHO dose-response model for evaluating the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all servings consumed per annum, and the model was used to solve for the corresponding performance objective (PO) risk metric, the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that, if the industry complies with a particular PO, the resulting risk per serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are re-distributed throughout the remaining concentration distribution and (3) no dose-response uncertainty, establishment POs of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence, respectively, that the target ALOP is met. The PO concentrations from this example scenario are more stringent than the current typical monitoring level of an absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criterion of absence in 125 g (i.e., -2.1 log10 cfu/g). This example, and others, demonstrates that a PO for L. monocytogenes would be far below any current monitoring capabilities. Furthermore, this work highlights the demands placed on risk managers and risk assessors when applying uncertain risk models to the current risk metric framework.


Subject(s)
Food Contamination/prevention & control , Food Microbiology/organization & administration , Listeria monocytogenes/growth & development , Meat/microbiology , Models, Statistical , Risk Management , Food Handling/standards , Humans , Listeria monocytogenes/isolation & purification , Maximum Allowable Concentration , Meat Products/microbiology , Monte Carlo Method , Risk Assessment , Uncertainty
16.
Int J Food Microbiol ; 157(2): 251-8, 2012 Jul 02.
Article in English | MEDLINE | ID: mdl-22658686

ABSTRACT

Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data are actual measurements of the average concentration of the microorganism per milliliter or, as is the case with MPN data, real-valued estimates of that average concentration. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two data sets that represent Salmonella and Campylobacter concentrations on chicken carcasses. The results demonstrate a bias in the maximum likelihood estimator that increases as the average concentration decreases. The Bayesian method provided unbiased estimates of the concentration distribution parameters for all data sets. We provide computer code for the Bayesian fitting method.
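
The building block for working directly with tube counts is the likelihood of the dilution series given a concentration, with P(tube positive) = 1 - exp(-C*v) under Poisson-distributed organisms. The single-sample sketch below is only that building block; the tube counts, volumes, and bounds are hypothetical, and the paper's full hierarchical Bayesian latent-variable model is not reproduced here.

```python
import numpy as np
from scipy import optimize

def mpn_negloglik(log10_conc, positives, tubes, volumes_ml):
    """Negative log-likelihood of MPN tube counts for one sample, assuming
    P(tube positive) = 1 - exp(-C * v) with C organisms per ml and inoculum v ml."""
    c = 10.0 ** log10_conc
    p_pos = 1.0 - np.exp(-c * np.asarray(volumes_ml))
    p_pos = np.clip(p_pos, 1e-12, 1 - 1e-12)
    positives, tubes = np.asarray(positives), np.asarray(tubes)
    return -np.sum(positives * np.log(p_pos) + (tubes - positives) * np.log(1 - p_pos))

# hypothetical 3-dilution, 3-tube series: 0.1, 0.01, and 0.001 ml of rinsate per tube
result = optimize.minimize_scalar(mpn_negloglik, bounds=(-2, 5), method="bounded",
                                  args=([3, 2, 0], [3, 3, 3], [0.1, 0.01, 0.001]))
print(f"single-sample MPN estimate: {10 ** result.x:.1f} organisms per ml")
```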


Subject(s)
Food Contamination , Food Safety , Meat/microbiology , Bayes Theorem , Campylobacter/isolation & purification , Data Interpretation, Statistical , Likelihood Functions , Probability , Risk Assessment/methods , Salmonella/isolation & purification
17.
J Food Prot ; 75(4): 775-8, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22488071

ABSTRACT

Estimates of the burden of bacterial foodborne illness are used in applications ranging from determining economic losses due to a particular pathogenic organism to improving our understanding of the effects of antimicrobial resistance or changes in pathogen serotype. Estimates of the total number of illnesses can be derived by multiplying the number of observed illnesses, as reported by a specific active surveillance system, by an underdiagnosis factor that describes the relationship between observed and unobserved cases. The underdiagnosis factor can be a fixed value, but recent research efforts have focused on characterizing the inherent uncertainty in the surveillance system with a computer simulation. Although the inclusion of uncertainty is beneficial, re-creating the simulation results for every application can be burdensome. An alternative approach is to describe the underdiagnosis factor and its uncertainty with a parametric distribution. The use of such a distribution simplifies analyses by providing a closed-form definition of the underdiagnosis factor and allows this factor to be easily incorporated into Bayesian models. In this article, we propose and estimate parametric distributions for the underdiagnosis multipliers developed for the FoodNet surveillance systems in the United States. Distributions are provided for the five foodborne pathogens deemed most relevant to meat and poultry.
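
A minimal sketch of how such a parametric underdiagnosis distribution might be used: multiply an observed case count by random draws from the fitted distribution to propagate the uncertainty. The lognormal parameters and the case count below are invented placeholders, not the FoodNet multipliers estimated in the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(19)

# hypothetical lognormal underdiagnosis multiplier: median ~29 with a 90%
# interval of roughly 20 to 42 (illustrative parameters only)
multiplier = stats.lognorm(s=0.22, scale=29.0)

observed_cases = 7500            # hypothetical count reported by active surveillance
draws = observed_cases * multiplier.rvs(size=50_000, random_state=rng)
print("estimated total illnesses (median and 90% interval):",
      np.percentile(draws, [50, 5, 95]).round(-3))
```

Because the multiplier has a closed-form distribution, the same object can be dropped directly into a larger Bayesian or Monte Carlo model, which is the convenience the abstract describes.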


Subject(s)
Cost of Illness , Foodborne Diseases/diagnosis , Foodborne Diseases/epidemiology , Computer Simulation , Diagnosis, Differential , Disease Outbreaks , Drug Resistance, Bacterial , Foodborne Diseases/economics , Foodborne Diseases/pathology , Humans , Public Health , Sentinel Surveillance
18.
Foodborne Pathog Dis ; 9(1): 59-67, 2012 Jan.
Article in English | MEDLINE | ID: mdl-22091640

ABSTRACT

A common approach to reducing microbial contamination has been the implementation of a Hazard Analysis and Critical Control Point (HACCP) program to prevent or reduce contamination during production. One example is the Pathogen Reduction HACCP program implemented by the U.S. Department of Agriculture's Food Safety and Inspection Service (FSIS). This program consisted of a staged implementation between 1996 and 2000 to reduce microbial contamination on meat and poultry products. Of the commodities regulated by FSIS, one of the largest observed reductions was for Salmonella contamination on broiler chicken carcasses. Nevertheless, how this reduction might have influenced the total number of salmonellosis cases in the United States has not been assessed. This study incorporates information from public health surveillance and surveys of the poultry slaughter industry into a model that estimates the number of broiler-related salmonellosis cases through time. The model estimates that, following the 56% reduction in the proportion of contaminated broiler carcasses observed between 1995 and 2000, approximately 190,000 fewer annual salmonellosis cases (attributed to broilers) occurred in 2000 compared with 1995. The uncertainty bounds for this estimate range from approximately 37,000 to 500,000 illnesses. Estimated illnesses prevented, due to the more modest reduction in contamination of 13% between 2000 and 2007, were not statistically significant. An analysis of the magnitude of change in contamination required for detection via human surveillance is also provided.


Subject(s)
Food Contamination/prevention & control , Food-Processing Industry/standards , Foodborne Diseases/prevention & control , Poultry Diseases/microbiology , Salmonella Food Poisoning/microbiology , Salmonella/isolation & purification , Animals , Chickens , Consumer Product Safety , Data Collection , Food Inspection , Food Microbiology , Humans , Models, Statistical , Poultry Products/microbiology , Public Health Surveillance , United States
19.
Risk Anal ; 31(3): 345-50, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21039706

ABSTRACT

Microbial food safety risk assessment models can often be simplified by eliminating the need to integrate a complex dose-response relationship across a distribution of exposure doses. This is possible if exposure pathways consistently lead to exposure doses that have a small probability of causing illness. In this situation, the probability of illness will be an approximately linear function of dose. Consequently, the predicted probability of illness per serving across all exposures is linear with respect to the expected value of dose. The majority of dose-response functions are approximately linear when the dose is low. Nevertheless, what constitutes "low" depends on the parameters of the dose-response function for a particular pathogen. In this study, a method is proposed to determine an upper bound of the exposure distribution for which the use of a linear dose-response function is acceptable. If this upper bound is substantially larger than the expected value of exposure doses, then a linear approximation for probability of illness is reasonable. If conditions are appropriate for using the linear dose-response approximation, for example, the expected value for exposure doses is two to three log10 smaller than the upper bound of the linear portion of the dose-response function, then predicting the risk-reducing effectiveness of a proposed policy is trivial. Simple examples illustrate how this approximation can be used to inform policy decisions and improve an analyst's understanding of risk.
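
A rough sketch of the check described above, assuming an exponential dose-response function 1 - exp(-r*d): find the largest dose at which the linear term r*d stays within a chosen relative error, then compare that bound with the expected exposure dose. The value of r and the error tolerance are placeholders, not values from the paper.

```python
import numpy as np

def linear_upper_bound(r, rel_error=0.10):
    """Largest dose at which the linear term r*d stays within rel_error of the
    exponential dose-response 1 - exp(-r*d); the linear term always overestimates."""
    doses = np.logspace(-3, 12, 100_000)
    exact = -np.expm1(-r * doses)
    ok = (r * doses - exact) / exact <= rel_error
    return doses[ok].max()

r = 2.5e-7                     # hypothetical exponential dose-response parameter
bound = linear_upper_bound(r)
print(f"linear approximation acceptable up to ~{bound:.2g} organisms per serving")
# if the expected exposure dose is two to three log10 below this bound, the risk
# per serving is essentially r times the expected dose
```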


Subject(s)
Colony Count, Microbial , Food Microbiology , Food Safety , Humans , Risk Assessment
20.
Risk Anal ; 31(4): 548-65, 2011 Apr.
Article in English | MEDLINE | ID: mdl-21105883

ABSTRACT

Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses that would result from a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm-to-table continuum. Any predicted change in contamination that results from a new policy regulating production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food-safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and that some of the modeling steps between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.


Subject(s)
Food Microbiology , Models, Theoretical , Risk Assessment , Bayes Theorem , Foodborne Diseases/epidemiology , Humans , United States/epidemiology