1.
J Food Prot ; 87(4): 100258, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38428462

ABSTRACT

The objective of this study was to compare preharvest monitoring strategies by evaluating three different sampling methods in the lairage area, determining pathogen recovery for each method and incoming pathogen prevalence from the cattle to inform in-plant decision making. Samples were gathered over a 5-month period, from February to June 2022, at a harvesting and processing facility in Eastern Nebraska. Sampling methods included (i) fecal pats, (ii) boot swabs, and (iii) MicroTally swabs. A total of 329 samples were collected over the study period (fecal pats: n = 105, boot swabs: n = 104, and MicroTally swabs: n = 120). Specific media combinations, an incubation temperature of 42°C, and incubation timepoints (18-24 h) were used for each matrix, and the prevalence of Salmonella, Escherichia coli O157:H7, and six non-O157 Shiga toxin-producing E. coli (STEC) was evaluated using the BAX System Real-Time PCR assay. Overall, the study concluded that boot swabs were an effective sampling method for pathogen detection in the cattle lairage area. Boot swabs (97.1%) were significantly more likely to detect Salmonella (p < 0.05) than fecal pats (67.6%) and MicroTally swabs (77.5%). For E. coli O157:H7 and STEC O26, O121, O45, and O103 prevalence, boot swabs were significantly better at detecting these pathogens (p < 0.05) than MicroTally swabs (OR = 3.16 - 11.95) and comparable to fecal pats (OR = 0.93 - 2.01, p > 0.05). Lastly, all three sampling methods detected a very low prevalence of E. coli O111 and O145; therefore, no further analysis was conducted. The boot swab sampling method was strongly favored because boot swabs require little training to implement, are inexpensive, and require little sampling labor; they would therefore be a simple and effective sampling method to implement within the industry to evaluate preharvest pathogen prevalence.
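The odds ratios reported above (e.g., OR = 3.16 - 11.95 versus MicroTally swabs) compare the odds of detection between sampling methods. A minimal sketch of how such an odds ratio and its Wald 95% confidence interval can be computed from a 2×2 table; the counts below are approximate reconstructions from the reported percentages and sample sizes, not the paper's exact data:

```python
import math

def odds_ratio(pos_a, neg_a, pos_b, neg_b):
    """Odds ratio of detection for method A vs. method B, with a Wald 95% CI."""
    or_ = (pos_a * neg_b) / (neg_a * pos_b)
    se = math.sqrt(1/pos_a + 1/neg_a + 1/pos_b + 1/neg_b)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Approximate counts reconstructed from the abstract (illustrative assumption):
# boot swabs ~97.1% of 104 positive; fecal pats ~67.6% of 105 positive.
boot_pos, boot_neg = 101, 3
fecal_pos, fecal_neg = 71, 34
or_, ci = odds_ratio(boot_pos, boot_neg, fecal_pos, fecal_neg)
print(f"OR = {or_:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

An OR well above 1 with a confidence interval excluding 1 corresponds to the significant detection advantage the study reports for boot swabs.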


Subject(s)
Escherichia coli Infections , Escherichia coli O157 , Escherichia coli Proteins , Shiga-Toxigenic Escherichia coli , Cattle , Animals , Escherichia coli Infections/veterinary , Feces , Salmonella , Food Microbiology
2.
J Food Prot ; 86(10): 100139, 2023 10.
Article in English | MEDLINE | ID: mdl-37567500

ABSTRACT

In recent years, there has been increased interest in beef cattle shedding of foodborne pathogens due to the potential to contaminate surrounding food crops; however, the number of studies published on this topic has declined as the majority of research has emphasized postharvest mitigation efforts. A field study was conducted to determine the prevalence of pathogens and indicator bacteria in beef cattle fed two different direct-fed microbials (DFMs). Fecal samples were taken from a total of 3,708 crossbred yearling cattle randomly assigned to 16 pens and two treatment groups at a commercial cattle feedlot. During the study period, diets were supplemented with two different DFMs: (i) Lactobacillus acidophilus (NP51) and Propionibacterium freudenreichii (NP24) (9 log10 CFU/head/day), and (ii) Lactobacillus salivarius (L28) (6 log10 CFU/head/day). Fecal samples from pen floors were collected on days 0, 21, 42, 63, and 103 and analyzed for the presence of Salmonella and E. coli O157:H7 and the concentrations of E. coli O157:H7, Enterobacteriaceae (EB), and C. perfringens. Through day 103, fecal samples collected from cattle fed L28 had significantly lower concentrations of C. perfringens (p < 0.05) and a similar E. coli O157:H7 prevalence, with no significant differences from those fed NP51/NP24. On day 103, prevalence in cattle fed L28 was 40% with a concentration of 0.95 log10 MPN/g, while in cattle fed NP51/NP24 it was 65% with a concentration of 1.2 log10 MPN/g. Cattle supplemented with NP51/NP24 achieved a significant reduction of EB of 2.4 log10 CFU/g over the course of the 103-day supplementation period compared to L28. Salmonella prevalence was also measured, but Salmonella was not detected in enough samples to draw conclusions. It is evident that E. coli O157:H7 and other foodborne pathogens are still prevalent in cattle operations and that preharvest mitigation strategies should be considered to reduce the risk to beef products.
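The day-103 prevalence difference reported above (40% vs. 65%) can be checked with a standard two-proportion z-test. A sketch under assumed sample sizes (the abstract does not give the per-group counts behind those percentages, so the n values below are hypothetical):

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test (normal approximation, pooled SE)."""
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF via erf
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Hypothetical day-103 sample sizes per treatment group (illustrative only);
# the reported prevalences were 40% (L28) and 65% (NP51/NP24).
z, pval = two_proportion_z(0.40, 20, 0.65, 20)
print(f"z = {z:.3f}, p = {pval:.3f}")
```

With small sample sizes the normal approximation is rough; an exact test would be preferable for actual pen-level data.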


Subject(s)
Cattle Diseases , Escherichia coli Infections , Escherichia coli O157 , Cattle , Animals , Prevalence , Colony Count, Microbial , Antibiosis , Random Allocation , Feces/microbiology , Escherichia coli Infections/epidemiology , Salmonella , Animal Feed/microbiology , Cattle Diseases/microbiology
3.
Compr Rev Food Sci Food Saf ; 22(5): 3506-3530, 2023 09.
Article in English | MEDLINE | ID: mdl-37421315

ABSTRACT

Recently, multiple reports from regulatory agencies have linked leafy green outbreaks to nearby or adjacent cattle operations. While these reports offer plausible explanations for this phenomenon, their data should be summarized to determine whether the association is based on empirical data, epidemiological association, or speculation. Therefore, this scoping review aims to gather data on the mechanisms of pathogen transmission from livestock to produce, identify whether direct evidence linking the two exists, and identify knowledge gaps in the scientific literature and public health reports. Eight databases were searched systematically, and 27 eligible primary research products were retained that focus on produce safety in relation to livestock proximity, provide empirical or epidemiological evidence of association, and describe mechanisms of transmission qualitatively or quantitatively. Fifteen public health reports were also covered. Results from the scientific articles provided evidence that proximity to livestock might be a risk factor; however, most lack quantitative data on the relative contribution of different contamination pathways. Public health reports mainly indicate livestock presence as a possible source and encourage further research. Although the collected information regarding cattle proximity is a concern, data gaps indicate that more studies should be conducted to determine the relative contribution of different contamination mechanisms and to generate quantitative data to inform food safety risk analyses for leafy greens produced near livestock areas.


Subject(s)
Escherichia coli O157 , Food Microbiology , Cattle , Animals , Public Health , Disease Outbreaks , Plant Leaves
4.
Foods ; 12(23)2023 Dec 01.
Article in English | MEDLINE | ID: mdl-38231835

ABSTRACT

Finalyse, a T4 bacteriophage preparation, is a pre-harvest intervention that uses a combination of bacteriophages to reduce incoming Escherichia coli O157:H7 prevalence by destroying the bacteria on the hides of harvest-ready cattle entering commercial abattoirs. The objective of this study was to evaluate the efficacy of Finalyse, as a pre-harvest intervention, in reducing pathogens, specifically E. coli O157:H7, on cattle hides and in the lairage environment, to reduce incoming pathogen loads overall. Over 5 sampling events, a total of 300 composite hide samples were taken using swabs pre-hydrated with 25 mL of Buffered Peptone Water (BPW), collected before and after the hide wash intervention at the beginning, middle, and end of the production day (n = 10 swabs/sampling point/timepoint). A total of 171 boot swab samples were also taken simultaneously at the end of the production day, by walking from the front to the back of the pen in a pre-determined 'Z' pattern, to monitor the pen floor environment at 3 different locations in the lairage area. The prevalence of pathogens was analyzed using the BAX® System Real-Time PCR Assay. No significant reductions were observed for Salmonella or any Shiga toxin-producing E. coli (STEC) on the hides after the bacteriophage application (p > 0.05). Escherichia coli O157:H7 and O111 hide prevalence was very low throughout the study; therefore, no further analysis was conducted. However, boot swab monitoring showed a significant reduction in E. coli O157:H7, O26, and O45 in the pen floor environment (p < 0.05). When Finalyse was used as a pre-harvest intervention in the lairage areas of a commercial beef processing facility, the bacteriophage did not reduce E. coli O157:H7 on the hides of beef cattle; however, some STECs were reduced in the lairage environment, where the bacteriophage was applied. Because hide prevalence was low, no absolute conclusion could be formed on the effectiveness of Finalyse in reducing E. coli O157:H7 on the hides of beef cattle.

5.
Foods ; 11(23)2022 Nov 28.
Article in English | MEDLINE | ID: mdl-36496642

ABSTRACT

The purpose of the study was to evaluate the prevalence and concentration of foodborne pathogens in the feces and peripheral lymph nodes (PLNs) of beef cattle supplemented with direct-fed microbials (DFMs) in feedlots. Fecal samples were collected from pen floors over a 5-month period at three different feedlots in a similar geographical location in Nebraska, where each feed yard represented a treatment group: (i) control: no supplement; (ii) Bovamine Defend: supplemented with NP51 and NP24 at a target dose of 9 log10 CFU/g/head/day; and (iii) Probicon: supplemented with L28 at a target dose of 6 log10 CFU/g/head/day. Each fecal sample was tested for the prevalence of E. coli O157:H7 and Salmonella and the concentrations of E. coli O157:H7, Enterobacteriaceae, and Clostridium perfringens. Cattle were harvested, and PLNs were collected on the harvest floor. Real-time Salmonella PCR assays were performed on each PLN sample to determine Salmonella presence. Cattle supplemented with either DFM had reduced foodborne pathogen levels in fecal samples, but feces collected from pens housing cattle supplemented with Probicon consistently had significantly lower E. coli O157:H7 and Salmonella prevalence, as well as a lower C. perfringens concentration. While DFMs do not eliminate foodborne pathogens in fecal shedding and PLNs, the use of DFMs as a pre-harvest intervention offers an effective way to target multiple pathogens, reducing public health risks and environmental dissemination from cattle.

6.
Foods ; 11(8)2022 Apr 14.
Article in English | MEDLINE | ID: mdl-35454719

ABSTRACT

The objective was to conduct a bio-mapping of microbial indicators at a beef processing plant to establish microbiological baselines and statistical process control (SPC) parameters that support food safety management decisions. EZ-Reach™ swabs were used to collect 100 cm² area samples at seven different locations throughout the beef processing line, at four different regions on the carcass. On each of the eight sampling days evaluated, three samples were collected per sampling location/carcass region, for a total of 84 samples per day. Total aerobic bacteria, Enterobacteriaceae, and Escherichia coli were enumerated for each sample, and microbial SPC parameters were estimated for each sampling point. Counts differed significantly between sampling points for all carcass locations (p < 0.001) and followed an overall trend: values were highest at pre- and post-evisceration, decreased continuously until the final interventions, increased slightly during the chilling process, and increased again after fabrication. Variability at sampling points reflects the nature of the process and highlights opportunities for improvement of the food safety system. Microbial baselines and SPC parameters will help support decision making for continuous process improvement, validation of intervention schemes, and corrective action implementation for food safety management.
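Control limits like the SPC parameters described above are typically computed on log-transformed counts. A minimal sketch using hypothetical aerobic counts at one sampling location, with a simplified mean ± 3 SD individuals chart (a production chart would usually estimate sigma from the moving range instead):

```python
import statistics

def spc_limits(log_counts, k=3.0):
    """Center line and mean ± k·sigma control limits for log10 counts,
    sigma estimated from the sample standard deviation (simplified)."""
    center = statistics.fmean(log_counts)
    sd = statistics.stdev(log_counts)
    return center - k * sd, center, center + k * sd

# Hypothetical aerobic plate counts (log10 CFU/100 cm²) at one sampling point
counts = [3.1, 2.8, 3.4, 3.0, 2.9, 3.3, 3.2, 2.7]
lcl, center, ucl = spc_limits(counts)
print(f"LCL = {lcl:.2f}, CL = {center:.2f}, UCL = {ucl:.2f}")
```

Points falling above the UCL at a given sampling location would flag a loss of process control and trigger the corrective actions the study describes.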

7.
Int J Food Microbiol ; 370: 109635, 2022 Jun 02.
Article in English | MEDLINE | ID: mdl-35339915

ABSTRACT

Primary and secondary models were developed to quantitatively characterize the survival of Listeria monocytogenes in soy sauce-based acidified Asian-style products that do not undergo a thermal treatment. The objective of this study was to quantify the effect of food matrix properties on L. monocytogenes survival in soy sauce-based products. This quantification enables a product-specific estimation of the 5-log reduction time, supporting safe processing and management operations and ultimately facilitating a science-based, safety-oriented product development process. A central composite design with four independent variables (pH, soy sauce, added NaCl, and soluble solids), each at five levels, was used to plan the challenge studies on different formulations. To model microbial survival over time, different non-linear primary models were fit to the data obtained from the challenge studies, and the best-fit model was selected based on a series of statistical goodness-of-fit measures. Kinetic parameters estimated from the best-fit primary models were fit to response surface equations using second-order polynomial regression. The best-fit primary model representative of the product formulations was a modified Weibull model. The natural logarithm of the scale parameter (δ, in h) was used as the response variable for the secondary model, resulting in an acceptable fit to the observed values, with an R2 of 0.95 and an RMSE of 0.7 h. External validation of model predictions was conducted by comparing them to 5-log reduction times observed in independent challenge tests using different product formulations. Results indicated an acceptable validation, with R2 = 0.81 and RMSE = 35 h. The present study provides quantitative tools specific to cold-fill-hold soy sauce-based products to enhance microbial safety management plans and product development.
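A Weibull-type survival model of the kind described above has a closed-form 5-log reduction time. A sketch assuming the common parameterization log10(N/N0) = -(t/δ)^p, with hypothetical parameter values (the paper's fitted δ and shape values are not reproduced here):

```python
def weibull_log_reduction(t, delta, p):
    """Weibull-type survival curve: log10(N/N0) = -(t/delta)**p,
    with scale delta (h) and shape p."""
    return -(t / delta) ** p

def time_to_5log(delta, p):
    """Solve -(t/delta)**p = -5 for t, the 5-log reduction time."""
    return delta * 5 ** (1 / p)

# Hypothetical parameters for one product formulation (illustrative only)
delta, p = 40.0, 1.5
t5 = time_to_5log(delta, p)
print(f"5-log reduction time: {t5:.1f} h")
```

In the study's secondary model, ln(δ) is the quantity predicted from formulation variables; a predicted δ (with p) plugs directly into this expression to give the formulation-specific 5-log time.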


Subject(s)
Listeria monocytogenes , Soy Foods , Colony Count, Microbial , Food Microbiology , Kinetics , Models, Biological
8.
Compr Rev Food Sci Food Saf ; 21(1): 227-271, 2022 01.
Article in English | MEDLINE | ID: mdl-34730272

ABSTRACT

A systematic review and meta-analysis were conducted to quantify the effects of processing stages and interventions on the prevalence and concentration of Campylobacter on broiler carcasses. To comprehensively capture relevant evidence, six databases were searched using the keywords "Campylobacter" and "broiler chicken." The literature search yielded 10,450 unique citations; after applying predetermined inclusion and exclusion criteria, 72 and 53 relevant citations were included in the meta-analyses for processing stages and interventions, respectively. As the two primary outcomes, log reduction and prevalence change were estimated for each stage or intervention using a random-effects meta-analysis approach whenever possible. Outcome-level quality assessment was conducted following the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) approach. The analysis revealed that scalding and chilling substantially reduce the prevalence and concentration of Campylobacter. Immersion chilling reduces the concentration regardless of chemical additives, but its effect on prevalence is not conclusive. The effects of carcass washing applications remain uncertain due to the inconsistency and imprecision of both outcomes. Defeathering and evisceration were identified as stages that can increase both prevalence and concentration. Both chemical and physical processing interventions provide limited efficacy in reducing concentration and prevalence. Major limitations of the review were inconsistency and imprecision at the outcome level, and reporting issues and data gaps at the study level. The results are expected to inform quantitative microbial risk assessment model development and support evidence-based decision-making.
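Random-effects pooling of per-study log reductions, as used in meta-analyses like this one, is commonly done with the DerSimonian-Laird estimator. A sketch with hypothetical study-level effects and variances (not values from the review):

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.
    effects: per-study log10 reductions; variances: their sampling variances."""
    w = [1 / v for v in variances]                       # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_star = [1 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, tau2

# Hypothetical per-study log10 reductions for one processing stage
effects = [1.2, 0.8, 1.5, 1.0]
variances = [0.04, 0.09, 0.05, 0.06]
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled = {pooled:.2f} ± {1.96 * se:.2f} log10, tau² = {tau2:.3f}")
```

The between-study variance tau² is what distinguishes the random-effects model from a fixed-effect analysis; the inconsistency the review flags would surface as a large tau² (and Q relative to df).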


Subject(s)
Campylobacter , Animals , Chickens , Food Handling/methods , Food Microbiology
9.
Water Res ; 171: 115466, 2020 Mar 15.
Article in English | MEDLINE | ID: mdl-31927094

ABSTRACT

Intense pressure on water resources has led to efforts to reuse reclaimed processing wastewater for cleaning purposes in food processing plants. The milk industry produces considerable amounts of wastewater, which can be used for cleaning of equipment after appropriate treatment. However, due to naturally occurring microbiological contamination in raw milk, the wastewater is often contaminated, and the reuse of reclaimed wastewater is therefore perceived as risky. This study aims to quantify the risk of Listeria monocytogenes infection and the associated disease burden when wastewater reclaimed from milk processing operations is used in cleaning-in-place (CIP) systems for pasteurized fluid milk production, following a quantitative microbial risk assessment (QMRA) approach. Furthermore, this study aims to inform risk-based tolerable limits for levels of contamination in CIP water based on a public health target of 10⁻⁶ DALY per person annually. The model investigates the passage of L. monocytogenes throughout the fluid milk chain, from receipt of raw milk at the plant to the point of consumption, covering storage in receiving and storage tanks, pasteurization, and storage at retail and at home. Risk and disease burden estimates are simulated for general (younger than 65 years), elderly (65 years and older), and pregnant population subgroups. Additional scenarios covering the effect of using clean water, using water with different levels of contamination, and using reclaimed wastewater modeled as recovered from cheese whey after membrane filtration (reclaimed water scenario) are considered to estimate a risk-based limit of contamination and to simulate a real-life example. The tolerable limit of contamination in CIP water was estimated as -2 log10 CFU/mL to ensure the protection of the most vulnerable subgroup, pregnant women, while higher limits were estimated for the elderly and general subgroups. Under the reclaimed water scenario, the annual numbers of listeriosis cases were estimated as 3.36, 5.67, and 0.15 for the general, elderly, and pregnant population subgroups, respectively, while under the clean water scenario the estimates were 3.33, 5.56, and 0.15. In both scenarios, the DALY estimates were lower than the tolerable limit. The results indicate that reclaimed water can be an alternative to potable water for CIP applications.
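QMRA calculations of this kind commonly combine an exponential dose-response model with a per-serving to annual-risk conversion. A sketch with illustrative parameter values (the r parameter, dose, and serving frequency below are assumptions for demonstration, not the paper's fitted inputs):

```python
import math

def risk_per_serving(dose_cfu, r):
    """Exponential dose-response model: P(illness) = 1 - exp(-r * dose)."""
    return 1 - math.exp(-r * dose_cfu)

def annual_risk(p_serving, servings_per_year):
    """Probability of at least one illness over a year of independent servings."""
    return 1 - (1 - p_serving) ** servings_per_year

# Illustrative inputs: dose-response parameter r for a susceptible subgroup,
# ingested dose per serving (CFU), and servings per person per year.
r = 5e-12
dose = 1e4
servings = 250
p1 = risk_per_serving(dose, r)
pa = annual_risk(p1, servings)
print(f"per-serving risk = {p1:.2e}, annual risk = {pa:.2e}")
```

The annual risk, multiplied by a DALY-per-case burden for each subgroup, is what gets compared against the 10⁻⁶ DALY per person per year target to back-calculate a tolerable CIP-water contamination limit.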


Subject(s)
Listeria monocytogenes , Animals , Female , Food Contamination , Food Handling , Humans , Milk , Pregnancy , Risk Assessment , Wastewater