Results 1 - 6 of 6
1.
J Food Prot ; 71(9): 1761-7, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18810859

ABSTRACT

To quantify associations at slaughter between Escherichia coli O157 carcass contamination, fecal-positive animals, and high-shedding animals within truckloads of finished cattle, we sampled up to 32 cattle from each of 50 truckloads arriving at a commercial abattoir in the Midwest United States during a 5-week summer period. Carcass swab samples collected pre-evisceration and fecal samples collected post-evisceration were matched within animals and analyzed for the presence of E. coli O157, using enrichment, immunomagnetic separation, and plating on selective media (IMS). In addition, a direct plating procedure was performed on feces to identify high-shedding animals. E. coli O157 was isolated from 39 (2.6%) of 1,503 carcass samples in 15 (30%) truckloads, and 127 (8.5%) of 1,495 fecal samples in 37 (74%) truckloads. Fifty-five (3.7%) high-shedding animals were detected from 26 (52%) truckloads. Truckload high-shedder (Spearman rank-order correlation coefficient [r(s)] = 0.68), IMS-positive (r(s) = 0.48), and combined fecal (r(s) = 0.61) prevalence were significantly correlated with carcass prevalence. The probability of isolating E. coli O157 from a carcass was not significantly associated with the high-shedder or fecal IMS status of the animal from which the carcass was derived. However, the probability of carcass contamination was significantly associated with all truckload-level measures of fecal E. coli O157, particularly whether or not a high shedder was present within the truckload (odds ratio = 16.2; 95% confidence interval, 6.3-43.6). Our results suggest that high shedders within a truckload at slaughter could be a target for mitigation strategies to reduce the probability of pre-evisceration carcass contamination.
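The odds ratio and 95% confidence interval reported above (16.2; 6.3-43.6) are the standard way to express such an association from a 2x2 contingency table. As a minimal illustration of how that kind of estimate is computed (the Wald method on the log-odds scale; the counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = contaminated carcasses from truckloads with a high shedder
    b = non-contaminated carcasses from those truckloads
    c = contaminated carcasses from truckloads without a high shedder
    d = non-contaminated carcasses from those truckloads
    """
    or_est = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_est) - z * se)
    upper = math.exp(math.log(or_est) + z * se)
    return or_est, lower, upper

# Hypothetical counts for illustration only:
print(odds_ratio_ci(35, 600, 4, 860))
```

A confidence interval excluding 1.0, as in the study, indicates a statistically significant association between truckload high-shedder status and carcass contamination.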


Subject(s)
Abattoirs , Cattle/microbiology , Escherichia coli O157/isolation & purification , Feces/microbiology , Transportation , Animals , Colony Count, Microbial/methods , Food Contamination/analysis , Food Contamination/prevention & control , Food Microbiology , Humans , Meat/microbiology , Midwestern United States , Prevalence , Risk Assessment , Statistics, Nonparametric
2.
J Food Prot ; 69(4): 794-800, 2006 Apr.
Article in English | MEDLINE | ID: mdl-16629021

ABSTRACT

Italian-style salami batter (formulated with pork shoulder) was inoculated with ca. 7.0 log CFU/g of either Salmonella or Listeria monocytogenes. Salami links (55-mm cellulose casings) were fermented at 30 degrees C for 24, 40, or 72 h and then dried to target moisture/protein ratios (MPRs) of 1.9:1 or 1.4:1. Links were sampled after fermentation (24, 40, and 72 h) and after combined fermentation-drying treatments (MPRs of 1.9:1 and 1.4:1 for all fermentation periods), and microbiological and proximate analyses were performed at each sampling. Pathogen populations were enumerated by direct plating on selective agar and by an injured-cell recovery method. When enumerated by the injured-cell recovery method, Salmonella populations were reduced by 1.2 to 2.1 log CFU/g after fermentation alone (24 to 72 h) and by 2.4 to 3.4 log CFU/g when fermentation was followed by drying. Drying to an MPR of 1.4:1 was no more effective than drying to an MPR of 1.9:1 (P > 0.05). When enumerated directly on selective media, Salmonella populations were reduced from 1.6 to 2.4 log CFU/g and from 3.6 to 4.5 log CFU/g for fermentation alone and fermentation followed by drying, respectively. L. monocytogenes populations were reduced by <1.0 log CFU/g following all fermentation and combined fermentation-drying treatments, regardless of the enumeration method. These results suggest that the Italian-style salami manufacturing process evaluated does not adequately reduce high pathogen loads. Processors may thus need to consider supplemental measures, such as raw material specifications and a final heating step, to enhance the lethality of the overall manufacturing process.
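The reductions above are expressed as differences in log10 CFU/g between the inoculated population and survivors. A minimal sketch of that calculation (the example counts are illustrative, not the study's data):

```python
import math

def log_reduction(initial_cfu_per_g, final_cfu_per_g):
    """Log10 reduction between initial and surviving populations."""
    return math.log10(initial_cfu_per_g) - math.log10(final_cfu_per_g)

# Illustrative numbers: a ~7.0 log CFU/g inoculum with survivors at
# ~4.6 log CFU/g corresponds to a 2.4-log reduction.
print(round(log_reduction(1e7, 10 ** 4.6), 1))  # 2.4
```

Because the scale is logarithmic, even the study's largest observed reduction (4.5 log) leaves survivors when the initial load is high (7.0 log CFU/g), which is why the authors conclude the process alone is inadequate against high pathogen loads.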


Subject(s)
Food Handling , Food-Processing Industry/standards , Listeria monocytogenes/growth & development , Meat Products/microbiology , Salmonella/growth & development , Animals , Colony Count, Microbial , Consumer Product Safety , Fermentation , Food Contamination/prevention & control , Food Handling/methods , Food Handling/standards , Food Microbiology , Food-Processing Industry/methods , Humans , Temperature , Time Factors
3.
J Food Prot ; 68(11): 2349-55, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16300072

ABSTRACT

Sliced (cut) and exterior (intact) surfaces of restructured cooked roast beef were inoculated with Listeria monocytogenes, treated with cetylpyridinium chloride (CPC; immersion in 500 ml of 1% solution for 1 min), individually vacuum packaged, and stored for 42 days at 0 or 4 degrees C. Noninoculated samples were similarly treated, packaged, and stored to determine effects on quality (color and firmness) and on naturally occurring bacterial populations, including aerobic plate counts and lactic acid bacteria. Immediately after CPC treatment, regardless of inoculation level, L. monocytogenes populations were reduced (P = 0.05) by about 2 log CFU/cm2 on sliced surfaces and by about 4 log CFU/cm2 on exterior surfaces. Throughout 42 days of refrigerated storage (at both 0 and 4 degrees C), L. monocytogenes populations on CPC-treated samples remained lower (P = 0.05) than those of nontreated samples for both surface types. After 42 days of storage at both 0 and 4 degrees C, aerobic plate count and lactic acid bacteria populations of treated samples were 1 to 1.5 log CFU/cm2 lower (P = 0.05) than those of nontreated samples for both surface types. CPC treatment resulted in negligible effects (P > 0.05) on the color (L*, a*, and b* values) of exterior and sliced roast beef surfaces during storage. For both sliced and exterior surfaces, CPC-treated samples were generally less firm than nontreated samples. CPC treatment effectively reduced L. monocytogenes populations on roast beef surfaces and resulted in relatively minor impacts on color and texture attributes. CPC treatment, especially when applied to products prior to slicing, may serve as an effective antimicrobial intervention for ready-to-eat meat products.
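The study compared L*, a*, and b* color values statistically. One common way to summarize such instrumental readings as a single perceptual difference, not used in the abstract but useful for interpretation, is the CIE76 color-difference metric, where values near 1 are at the threshold of visual perceptibility (the readings below are hypothetical):

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) readings:
    the Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Illustrative readings for treated vs. nontreated surfaces:
print(round(delta_e_cie76((45.0, 18.0, 9.0), (44.2, 17.5, 9.4)), 2))
```

A difference of that magnitude would be consistent with the abstract's finding of negligible color effects from CPC treatment.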


Subject(s)
Anti-Infective Agents, Local/pharmacology , Cetylpyridinium/pharmacology , Food Preservation/methods , Listeria monocytogenes/drug effects , Meat Products/analysis , Animals , Cattle , Colony Count, Microbial , Food Handling/methods , Food Packaging/methods , Listeria monocytogenes/growth & development , Meat Products/standards , Pigmentation , Temperature , Time Factors , Vacuum
4.
Foodborne Pathog Dis ; 2(3): 233-41, 2005.
Article in English | MEDLINE | ID: mdl-16156704

ABSTRACT

Ready-to-eat Polish sausages were inoculated with Listeria monocytogenes at either low (3 log(10) CFU/g) or high (7 log(10) CFU/g) levels, treated with a 1% cetylpyridinium chloride (CPC) spray (20 psi, 25 degrees C, 30-sec exposure), vacuum packaged, and stored for 42 days at 0 degrees C or 4 degrees C. Non-inoculated samples were similarly treated, packaged, and stored to determine effects on color, firmness, and naturally occurring bacterial populations such as aerobic plate counts (APC). At the low inoculation level, L. monocytogenes populations were reduced by 1 log(10) CFU/g immediately after CPC treatment, and populations on treated samples remained approximately 2 log(10) CFU/g lower than non-treated samples throughout the 42-day storage period. At the high inoculation level, L. monocytogenes populations were reduced by 3 log(10) CFU/g immediately after treatment and, after 42 days of storage, populations on treated samples were 4 log(10) CFU/g lower than non-treated samples. Regardless of storage temperature, APC populations of CPC-treated samples were 1-2 log(10) CFU/g lower than non-treated samples throughout storage. An APC of 6 log(10) CFU/g was observed by day 7 of storage for non-treated samples, although not until day 21 of storage for CPC-treated samples. For samples stored at 4 degrees C, no significant differences (p > 0.05) were observed for L*, a*, or b* color values of treated versus non-treated samples. At 0 degrees C, the effects of CPC treatment on a* values were statistically significant (p < or = 0.05), although minor. Non-treated samples were somewhat firmer than CPC-treated samples, primarily at the 0 degrees C storage temperature, although the observed differences were of a magnitude unlikely to impact perceived product quality. CPC treatment appears to be a viable post-processing decontamination technology for eliminating and/or inhibiting L. monocytogenes on RTE meats during refrigerated storage without detrimentally impacting color and texture.


Subject(s)
Anti-Infective Agents, Local/pharmacology , Cetylpyridinium/pharmacology , Consumer Product Safety , Food Handling/methods , Listeria monocytogenes/drug effects , Meat Products/microbiology , Animals , Colony Count, Microbial , Disinfection/methods , Disinfection/standards , Food Packaging/methods , Food Preservation/methods , Humans , Listeria monocytogenes/growth & development , Meat Products/standards , Pigmentation/drug effects , Swine , Temperature , Time Factors , Vacuum
5.
J Food Prot ; 68(9): 1823-30, 2005 Sep.
Article in English | MEDLINE | ID: mdl-16161680

ABSTRACT

Frankfurters inoculated with Listeria monocytogenes were treated with 1% cetylpyridinium chloride (CPC) or with 1% CPC followed by a water rinse at various combinations of spray temperatures (25, 40, and 55 degrees C), spray pressures (20, 25, and 35 psi), and times of exposure (30, 40, and 60 s). No significant differences (P > 0.05) were observed in the reductions achieved by 1% CPC + water wash and those achieved with 1% CPC treatment alone. L. monocytogenes populations were reduced by ca. 1.7 log CFU/g immediately following treatment, with no differences (P > 0.05) observed for different spray temperatures, pressures, or exposure times. The effectiveness of 1% CPC spray treatment (at 25 degrees C, 20 psi, and 30 s of exposure) against L. monocytogenes on vacuum-packaged frankfurters stored at 0 and 4 degrees C for 42 days was then evaluated. Application of a 1% CPC surface spray to frankfurters immediately prior to packaging reduced L. monocytogenes concentrations by 1.4 to 1.7 log CFU/g and further restricted growth of the pathogen during 42 days of refrigerated storage, thereby meeting U.S. Department of Agriculture alternatives 1 and 2 criteria for Listeria control. CPC treatment reduced aerobic plate counts, lactic acid bacteria, yeasts and molds, total coliforms, and Escherichia coli populations on noninoculated frankfurters to below detectable limits. The 1% CPC treatment did not affect the color (L*, a*, and b* values) of frankfurters stored for 42 days at 0 or 4 degrees C (P > 0.05). The effect of 1% CPC treatment on the firmness of frankfurters was also negligible.


Subject(s)
Anti-Infective Agents, Local/pharmacology , Cetylpyridinium/pharmacology , Food Preservation/methods , Listeria monocytogenes/drug effects , Meat Products/microbiology , Meat Products/standards , Colony Count, Microbial , Food Microbiology , Food Packaging , Pressure , Quality Control , Temperature , Time Factors
6.
J Food Prot ; 61(5): 571-7, 1998 May.
Article in English | MEDLINE | ID: mdl-9709229

ABSTRACT

A steam pasteurization process (patent pending) has been shown to effectively reduce pathogenic bacterial populations on beef tissue and to significantly reduce naturally occurring bacterial populations on commercially slaughtered beef carcasses. The objective of this study was to determine the effectiveness of the steam pasteurization treatment for reducing bacterial populations at several anatomical locations on commercially slaughtered carcasses. Before and after pasteurization treatment (82.2 degrees C, 6.5-s exposure time), a sterile sponge was used to sample 300 cm2 at one of five locations (inside round, loin, midline, brisket, or neck). Eighty carcasses (40 before treatment and 40 after treatment) were sampled per anatomical location over 2 processing days. Before treatment, aerobic plate counts (APCs) were found to be highest (P < or = 0.01) at the midline (4.5 log10 CFU/100 cm2), intermediate at the inside round, brisket, and neck (ca. 3.8 log10 CFU/100 cm2), and lowest at the loin (3.4 log10 CFU/100 cm2). After treatment, APCs at all locations were reduced significantly (P < or = 0.01). The inside round, loin, and brisket had the lowest (P < or = 0.01) APCs (ca. 2.6 log10 CFU/100 cm2), whereas the midline and neck had APCs of 3.1 and 3.3 log10 CFU/100 cm2, respectively. The lower reduction in APCs at the neck area indicated that the treatment may not be as effective there, possibly because of the design of the pasteurization equipment. Generic Escherichia coli populations were low at all locations before treatment, with populations on 32% of all carcasses sampled being less than the detection limit of the study (5.0 CFU/100 cm2). After treatment, E. coli populations were significantly lower (P < or = 0.01) than populations before treatment and 85% of all carcasses sampled had E. coli populations below the detection limit. The maximum E. coli population detected after treatment was 25 CFU/100 cm2. For enteric bacterial populations, no differences were observed in the effectiveness of the treatment among the five carcass locations.


Subject(s)
Abattoirs/standards , Meat Products/microbiology , Steam , Sterilization , Animals , Cattle , Colony Count, Microbial , Enterobacteriaceae/isolation & purification , Escherichia coli/isolation & purification , Food Handling , Salmonella/isolation & purification , Temperature , United States