Results 1 - 20 of 38
1.
J Food Prot; 87(7): 100304, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38777091

ABSTRACT

Salmonella prevalence in U.S. raw poultry products has declined since the adoption of prevalence-based Salmonella performance standards, but human illnesses have not declined proportionally. We used Quantitative Microbial Risk Assessment (QMRA) to evaluate the public health risk of raw chicken parts contaminated with different levels of all Salmonella and of specific high- and low-virulence serotypes. Lognormal Salmonella level distributions were fitted to the 2012 USDA-FSIS Baseline parts survey and to 2023 USDA-FSIS HACCP verification sampling data. Three Dose-Response (DR) approaches were compared: (i) a single DR model for all serotypes, (ii) a DR model with reduced virulence for Salmonella Kentucky ST152, and (iii) multiple serotype-specific DR models. All scenarios found risk concentrated in the few products with high Salmonella levels. Using a single DR model with Baseline data (µ = -3.19, σ = 1.29 log CFU/g), 68% and 37% of illnesses were attributed to the 0.7% and 0.06% of products with >1 and >10 CFU/g Salmonella, respectively. Using distributions from 2023 HACCP data (µ = -5.53, σ = 2.45), 99.8% and 99.0% of illnesses were attributed to the 1.3% and 0.4% of products with >1 and >10 CFU/g Salmonella, respectively. Scenarios with serotype-specific DR models concentrated risk even further at high levels: Baseline data attributed 92% and 67%, and HACCP data >99.99% and 99.96%, of illnesses to products with >1 and >10 CFU/g Salmonella, respectively. By serotype, using Baseline or HACCP input data, 0.002% and 0.1% of illnesses were attributed to the 0.2% and 0.4% of products with >1 CFU/g of Kentucky ST152, respectively, while 69% and 83% of illnesses were attributed to the 0.3% and 0.6% of products with >1 CFU/g of Enteritidis, Infantis, or Typhimurium, respectively. Public health risk in chicken parts is therefore concentrated in finished products with high Salmonella levels, and specifically high levels of high-virulence serotypes; low-virulence serotypes like Kentucky contribute few human cases.
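To make the attribution calculation concrete, the sketch below (Python, not the authors' QMRA code) draws product-level Salmonella concentrations from the Baseline lognormal distribution quoted above and applies a generic exponential dose-response model; the serving mass and dose-response parameter r are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
mu, sigma = -3.19, 1.29      # log10(CFU/g), Baseline distribution from the abstract
serving_g = 100.0            # assumed serving mass (hypothetical)
r = 2.5e-3                   # assumed exponential dose-response parameter (hypothetical)

conc = 10 ** rng.normal(mu, sigma, n)   # CFU/g for each contaminated product
dose = conc * serving_g                 # ingested dose per serving (CFU)
p_ill = 1 - np.exp(-r * dose)           # exponential dose-response model

for threshold in (1.0, 10.0):
    high = conc > threshold
    print(f">{threshold:g} CFU/g: {high.mean():.2%} of products, "
          f"{p_ill[high].sum() / p_ill.sum():.1%} of predicted illnesses")
```

Even with these placeholder parameters, most of the predicted illness burden falls on the small fraction of products above the concentration thresholds, which is the qualitative pattern the study reports.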


Subjects
Chickens, Food Microbiology, Salmonella, Serogroup, Animals, Risk Assessment, Humans, Virulence, Food Contamination/analysis, Salmonella Food Poisoning/epidemiology, Salmonella Infections/epidemiology
2.
J Dairy Sci; 107(5): 2733-2747, 2024 May.
Article in English | MEDLINE | ID: mdl-37949407

ABSTRACT

Share tables (ST) are tables or stations in school cafeterias where students can return unopened foods and beverages, providing an opportunity for others to access these items at no cost. Current research suggests that milk is among the most wasted items in breakfast and lunch programs in the United States. Share tables present a simple solution for reducing milk waste, but research is needed to understand the microbial spoilage potential of milk in ST. To this end, uninoculated milk cartons and milk cartons inoculated with 2 to 3 log10(cfu/mL) Pseudomonas poae, a fast-growing psychrotroph, were exposed to ambient temperature during winter (mean temperature = 20.3°C) and summer (23.1°C) for 125 min, repeated over 5 d (the length of a school week). Microbial counts in the inoculated milk cartons increased linearly, exceeding the spoilage threshold of 6.0 log10(cfu/mL) after d 3 and after d 4 in the winter and summer trials, respectively. In the winter trial, microbial counts for uninoculated milk cartons never exceeded the lower limit of detection, 2.31 log10(cfu/mL), and in the summer trials, microbial counts never reached the spoilage threshold, indicating that initial contamination is a driving factor of microbial milk spoilage. Regardless of sharing status or seasonality, the greatest changes in counts for inoculated milk cartons occurred during overnight refrigeration, ranging from 0.56 to 1.4 log10(cfu/mL), while changes during share table exposure ranged from no observable change up to 0.29 log10(cfu/mL), emphasizing that school nutrition personnel should focus efforts on tightly controlling refrigeration temperatures and returning milk to refrigeration as soon as possible. A previously developed model for school cafeteria share tables was adapted to estimate the typical residence time of milk in a simulated cafeteria with an ambient-temperature share table for the summer and winter seasons over 1,000 wk. Milk was predicted to have a very short mean residence time (85 min) regardless of sharing status or season, with 99.8% of milk consumed, discarded, or donated within the first 2 d. As a result, only 3 out of 451,410 and 6 out of 451,410 simulated milks spoiled in the winter and summer seasons, respectively. The data generated here can be used to inform science-based decisions about including milk in share tables, or applied to any system where short-term unrefrigerated storage of milk must be accepted to meet a waste reduction or food security goal.
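As a rough illustration of the growth logic described above (not the published model), the sketch below advances an inoculated carton by a daily share-table gain plus an overnight-refrigeration gain until the 6.0 log10(cfu/mL) spoilage threshold is crossed; the per-step gains are loose assumptions taken from the ranges in the abstract.

```python
# Assumed inputs: inoculation level and per-step log gains loosely based on the abstract.
SPOILAGE = 6.0            # spoilage threshold, log10(cfu/mL)
level = 2.5               # assumed inoculation level, log10(cfu/mL)
share_table_gain = 0.29   # max observed gain during 125 min ambient exposure
overnight_gain = 1.0      # typical gain during overnight refrigeration

day = 0
while level < SPOILAGE and day < 5:      # one 5-day school week
    level += share_table_gain + overnight_gain
    day += 1

status = "exceeded" if level >= SPOILAGE else "not exceeded"
print(f"Spoilage threshold {status} after day {day} at {level:.2f} log10(cfu/mL)")
```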

3.
Foods; 12(21), 2023 Nov 05.
Article in English | MEDLINE | ID: mdl-37959153

ABSTRACT

The many possible treatments and continuously changing consumer trends present a challenge when selecting antimicrobial interventions during pork processing. Thirty-five potential antimicrobials were screened at commercial working concentrations by individually adding them to miniaturized (69 cm3) disks of pork loin ends, followed by inoculation with Salmonella Typhimurium ATCC 19585. Two organic acids and nine essential oils significantly inhibited Salmonella counts on pork (p < 0.05). However, six compounds that represent different levels of significance (p < 0.05-p < 0.0001) were selected as independent variables to build a Response Surface Methodology model based on a Doehlert matrix (Doehlert Matrix-RSM): lactic acid 1.25%, formic acid 0.25%, cumin 0.25%, clove 0.25%, peppermint 0.5%, and spearmint 0.5%. The goal of the Doehlert Matrix-RSM was to study single and paired effects of these antimicrobials on the change in Salmonella over 24 h. The Doehlert Matrix-RSM model predicted that lactic acid, formic acid, cumin, peppermint, and spearmint significantly reduced Salmonella when added alone, while no significant interactions between these antimicrobials were found. A laboratory-scale validation was carried out on pork loin end slices, which confirmed the results predicted by the model. While this screening did not identify novel synergistic combinations, our approach to screening a variety of chemical compounds by implementing a miniaturized pork loin disk model allowed us to identify the most promising antimicrobial candidates to then formally design experiments to study potential interactions with other antimicrobials.
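For readers unfamiliar with response surface screening, the sketch below fits a second-order response surface (main effects, an interaction, and quadratic terms) to made-up data for two hypothetical antimicrobials; the actual study used a six-factor Doehlert design, so this only illustrates the model form.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical coded concentrations (-1 to 1) for two antimicrobials and a fake
# response (change in Salmonella, log CFU) with a negligible interaction.
x1 = rng.uniform(-1, 1, 30)
x2 = rng.uniform(-1, 1, 30)
y = -1.5 * x1 - 0.8 * x2 + 0.1 * x1 * x2 + rng.normal(0, 0.1, 30)

# Second-order response surface: intercept, main effects, interaction, quadratics
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "x1", "x2", "x1:x2", "x1^2", "x2^2"], beta.round(3))))
```

A near-zero interaction coefficient corresponds to the additive (non-synergistic) behavior reported for the screened antimicrobials.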

4.
J Food Prot; 86(11): 100177, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37805043

ABSTRACT

Aggregative boot cover sampling may be a more representative, practical, and powerful method for preharvest produce soil testing than grab sampling because boot covers aggregate soil from larger areas. Our study tests whether boot cover sampling recovers the same quality and safety indicator organisms and community diversity as grab sampling. We collected soil samples from commercial romaine lettuce fields spanning 5060 m2 using boot covers (n = 28, m = 1.1 ± 0.4 g; wearing boot covers and walking along the path), composite grabs (n = 28, m = 231 ± 24 g; consisting of 60 grabs of 3-5 g each), and high-resolution grabs (n = 72, m = 56 ± 4 g; taking one sample per stratum). Means and standard deviations of log-transformed aerobic plate counts (APCs) were 7.0 ± 0.3, 7.1 ± 0.2, and 7.3 ± 0.2 log(CFU/g) for boot covers, composite grabs, and high-resolution grabs, respectively. APCs did not show biologically meaningful differences between sample types. Boot covers recovered on average 0.6 log(CFU/g) more total coliforms than both grab methods (p < 0.001), with means and standard deviations of log-transformed counts of 3.2 ± 1.0, 2.6 ± 0.6, and 2.6 ± 1.0 log(CFU/g) for boot covers, composite grabs, and high-resolution grabs, respectively. No generic E. coli were detected in any sample by enumeration methods, with LODs of 1.3-2.1 log(CFU/g) for boot covers and 0.5 log(CFU/g) for both grab methods. By 16S rRNA sequencing, community species diversity (alpha diversity) was not significantly different among collection methods. While communities differed (p < 0.001) between soil sampling methods (beta diversity), variance in microbial communities was not significantly different. Of the 28 phyla and 297 genera detected, 25 phyla (89%) and 258 genera (87%) were found by all methods. Overall, aggregative boot cover sampling is similar to both grab methods for recovering quality and safety indicator organisms and representative microbiomes, justifying future work testing aggregative soil sampling for foodborne pathogen detection.


Subjects
Escherichia coli, Food Microbiology, Microbial Colony Count, Soil, 16S Ribosomal RNA
5.
J Food Prot; 86(11): 100161, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37742835

ABSTRACT

As of August 2023, the two U.S. Food and Drug Administration (FDA) official detection methods for C. cayetanensis are outlined in the FDA Bacteriological Analytical Manual (BAM) Chapters 19b (produce testing) and 19c (agricultural water testing). These newly developed detection methods have been shown to not always detect contamination when present at low levels. Yet, industry and regulators may choose to use these methods as part of their monitoring and verification activities while detection methods continue to be improved. This study uses simulation to better understand the performance of these methods for various produce and water sampling plans. To do so, we used published FDA test validation data to fit a logistic regression model that predicts the methods' detection rate given the number of oocysts present in a 10-L agricultural water or 25 g produce sample. By doing so, we were able to determine contamination thresholds at which different numbers of samples (n = 1, 2, 4, 8, 16, and 32) would be adequate for detecting contamination. Furthermore, to evaluate sampling plans in use cases, a simulation was developed to represent C. cayetanensis contamination in agricultural water and on cilantro throughout a 45-day growth cycle. The model included uncertainty around the contamination sources, including scenarios of unintentionally contaminated irrigation water or in-field contamination. The results demonstrate that in cases where irrigation water was the contamination source, frequent water testing proved to be more powerful than produce testing. In scenarios where contamination occurred in-field, conducting frequent produce testing or testing produce toward the end of the season more reliably detected contamination. This study models the power of C. cayetanensis detection methods to understand the sampling plan performance and how these methods can be better used to monitor this emerging food safety hazard.
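A minimal sketch of the two-step logic described above, assuming made-up validation points rather than the FDA data: fit a logistic curve for the per-sample detection rate as a function of oocyst count, then compute the probability that at least one of n samples tests positive.

```python
import numpy as np
from scipy.optimize import curve_fit

def detection_rate(oocysts, b0, b1):
    """Logistic model: per-sample probability of a positive test."""
    return 1 / (1 + np.exp(-(b0 + b1 * oocysts)))

# Hypothetical validation points (oocysts per sample vs. observed detection rate)
oocysts = np.array([5, 10, 25, 50, 100, 200], dtype=float)
rate = np.array([0.10, 0.25, 0.55, 0.80, 0.95, 0.99])
(b0, b1), _ = curve_fit(detection_rate, oocysts, rate, p0=[-2, 0.05])

p_single = detection_rate(25, b0, b1)      # one sample at an assumed 25 oocysts
for n in (1, 2, 4, 8, 16, 32):
    print(f"n = {n:2d} samples: P(at least one positive) = {1 - (1 - p_single) ** n:.2f}")
```

The 1 - (1 - p)^n step is what lets the simulation translate a per-sample detection rate into the contamination thresholds at which a given number of samples becomes adequate.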


Subjects
Cyclospora, Animals, Water, Agriculture, Food Safety, Oocysts
6.
J Food Prot; 86(10): 100142, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37562513

ABSTRACT

Fusarium species infect maize crops, and their toxigenic members can produce fumonisin. Elimination of these microbes is critical in mitigating further postharvest spoilage and toxin accumulation. The current study investigates the efficacy of a previously described multispectral sorting technique in reducing fumonisin and toxigenic Fusarium species contaminating maize kernels in Kenya. Maize samples (n = 99) were collected from six mycotoxin hotspot counties in Kenya (Embu, Meru, Tharaka Nithi, Machakos, Makueni, and Kitui County) and analyzed for aflatoxin and fumonisin using commercial ELISA kits. Aflatoxin levels in the majority (91%) of samples were below the 10 ng/g threshold set by the Kenya Bureau of Standards and were therefore not studied further. The 23 of 99 samples with >2,000 ng/g of fumonisin were selected for sorting. The sorter was calibrated using kernels sourced from Ghana to reject visibly high-risk kernels for fumonisin contamination using reflectance at nine distinct wavelengths (470-1,550 nm). Accepted and rejected streams were tested for fumonisin using ELISA and for the presence of toxigenic Fusarium using qPCR. After sorting, there was a significant (p < 0.001) reduction in fumonisin, by an average of 1.8 log ng/g (98%), ranging between 0.14 and 2.7 log ng/g (28-99.8%), with a median mass rejection rate of 1.9% (range, 0% to 48%). The fumonisin rejection rate ranged between 0 and 99.8% with a median of 77%. There was also a significant reduction (p = 0.005) in the proportion of DNA represented by toxigenic Fusarium, from a mean of 30% to 1.4%. This study demonstrates the use of multispectral sorting as a potential postharvest intervention tool for the reduction of Fusarium species and preformed fumonisin. The spectral sorting approach of this study suggests that classification algorithms based on high-risk visual features associated with mycotoxin contamination can be applied across different sources of maize to reduce fumonisin.


Subjects
Aflatoxins, Fumonisins, Fusarium, Mycotoxins, Fumonisins/analysis, Zea mays, Food Contamination/analysis, Kenya, Mycotoxins/analysis, Aflatoxins/analysis
7.
Microorganisms; 11(7), 2023 Jun 22.
Article in English | MEDLINE | ID: mdl-37512805

ABSTRACT

Tomato is the main vegetable cultivated under soilless culture systems (SCSs); production of organic tomato under SCSs has increased due to consumer demands for healthier and environmentally friendly vegetables. However, organic tomato production under SCSs has been associated with low crop performance and fruit quality defects. These agricultural deficiencies could be linked to alterations in tomato plant microbiota; nonetheless, this issue has not been sufficiently addressed. Thus, the main goal of the present study was to characterize the rhizosphere and phyllosphere of tomato plants cultivated under conventional and organic SCSs. To accomplish this goal, tomato plants grown in commercial greenhouses under conventional or organic SCSs were tested at 8, 26, and 44 weeks after seedling transplantation. Substrate (n = 24), root (n = 24), and fruit (n = 24) composite samples were subjected to DNA extraction and high-throughput 16S rRNA gene sequencing. The present study revealed that the tomato core microbiota was predominantly constituted by Proteobacteria, Actinobacteria, and Firmicutes. Remarkably, six bacterial families, Bacillaceae, Microbacteriaceae, Nocardioidaceae, Pseudomonadaceae, Rhodobacteraceae, and Sphingomonadaceae, were shared among all substrate, rhizosphere, and fruit samples. Importantly, it was shown that plants under organic SCSs undergo a dysbiosis characterized by significant changes in the relative abundance of Bradyrhizobiaceae, Caulobacteraceae, Chitinophagaceae, Enterobacteriaceae, Erythrobacteraceae, Flavobacteriaceae, Nocardioidaceae, Rhodobacteraceae, and Streptomycetaceae. These results suggest that microbial alterations in substrates, roots, and fruits could be potential factors in contributing to the crop performance and fruit quality deficiencies observed in organic SCSs.

8.
J Food Prot; 86(8): 100115, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37295498

ABSTRACT

Cronobacter is a hazard in Powdered Infant Formula (PIF) products that is hard to detect because contamination is localized and present at low levels. We adapted a previously published sampling simulation to PIF sampling and benchmarked industry-relevant sampling plans across different numbers of grabs, total sample masses, and sampling patterns. We evaluated performance in detecting published Cronobacter contamination profiles for a recalled PIF batch [42% prevalence, -1.8 ± 0.7 log(CFU/g)] and a reference, nonrecalled PIF batch [1% prevalence, -2.4 ± 0.8 log(CFU/g)]. Simulating a range of numbers of grabs [n = 1-22,000 (representing testing every finished package)] with 300 g total composite mass showed that taking 30 or more grabs detected contamination reliably (<1% median probability of accepting the recalled batch). Benchmarking representative sampling plans ([n = 30, grab mass m = 10 g], [n = 30, m = 25 g], [n = 60, m = 25 g], [n = 180, m = 25 g]) showed that all plans would reject the recalled batch (<1% median probability of acceptance) but would rarely reject the reference batch (>50% median probability of acceptance for all plans). Overall, (i) systematic or stratified random sampling patterns are equal to or more powerful than random sampling of the same sample size and total sampled mass, and (ii) taking more samples, even if smaller, can increase the power to detect contamination.
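The sketch below is a simplified re-implementation of the benchmarking idea, not the published simulation: each grab is contaminated with the quoted prevalence, the CFU it contains are drawn from the quoted lognormal level distribution, and a batch is accepted only if no grab contains any CFU; detection-method sensitivity and sampling-pattern effects are ignored.

```python
import numpy as np

rng = np.random.default_rng(42)

def prob_accept(prevalence, mu, sigma, n_grabs, grab_mass_g, n_iter=2000):
    """Fraction of simulated batches with zero positive grabs (batch accepted)."""
    accepted = 0
    for _ in range(n_iter):
        contaminated = rng.random(n_grabs) < prevalence
        conc = 10 ** rng.normal(mu, sigma, n_grabs)            # CFU/g where present
        expected_cfu = np.where(contaminated, conc * grab_mass_g, 0.0)
        if rng.poisson(expected_cfu).sum() == 0:               # no grab contains a cell
            accepted += 1
    return accepted / n_iter

# Contamination profiles quoted in the abstract; grab number/mass follow one benchmarked plan
print("recalled batch,  n = 30 x 10 g:", prob_accept(0.42, -1.8, 0.7, 30, 10))
print("reference batch, n = 30 x 10 g:", prob_accept(0.01, -2.4, 0.8, 30, 10))
```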


Subjects
Cronobacter sakazakii, Cronobacter, Humans, Infant, Food Contamination/analysis, Infant Formula, Powders, Drug Contamination, Food Microbiology
9.
Appl Environ Microbiol; 89(5): e0034723, 2023 May 31.
Article in English | MEDLINE | ID: mdl-37098895

ABSTRACT

Commercial leafy green supply chains often are required to have test and reject (sampling) plans for specific microbial adulterants at primary production or finished product packing for market access. To better understand the impact of this type of sampling, this study simulated the effect of sampling (from preharvest to consumer) and processing interventions (such as produce wash with antimicrobial chemistry) on the microbial adulterant load reaching the system endpoint (customer). This study simulated seven leafy green systems, an optimal system (all interventions), a suboptimal system (no interventions), and five systems where single interventions were removed to represent single process failures, resulting in 147 total scenarios. The all-interventions scenario resulted in a 3.4 log reduction (95% confidence interval [CI], 3.3 to 3.6) of the total adulterant cells that reached the system endpoint (endpoint TACs). The most effective single interventions were washing, prewashing, and preharvest holding, 1.3 (95% CI, 1.2 to 1.5), 1.3 (95% CI, 1.2 to 1.4), and 0.80 (95% CI, 0.73 to 0.90) log reduction to endpoint TACs, respectively. The factor sensitivity analysis suggests that sampling plans that happen before effective processing interventions (preharvest, harvest, and receiving) were most effective at reducing endpoint TACs, ranging between 0.05 and 0.66 log additional reduction compared to systems with no sampling. In contrast, sampling postprocessing (finished product) did not provide meaningful additional reductions to the endpoint TACs (0 to 0.04 log reduction). The model suggests that sampling used to detect contamination was most effective earlier in the system before effective interventions. Effective interventions reduce undetected contamination levels and prevalence, reducing a sampling plan's ability to detect contamination. IMPORTANCE This study addresses the industry and academic need to understand the effect of test-and-reject sampling within a farm-to-customer food safety system. The model developed looks at product sampling beyond the preharvest stage by assessing sampling at multiple stages. This study shows that individual interventions and combined interventions substantially reduce the total adulterant cells reaching the system endpoint. When effective interventions occur during processing, sampling at earlier stages (preharvest, harvest, receiving) has more power to detect incoming contamination than postprocessing sampling, as prevalence and contamination levels are lower. This study reiterates that effective food safety interventions are crucial for food safety. When product sampling is used to test and reject a lot as a preventive control, it may detect critically high incoming contamination. However, if contamination levels and prevalence are low, typical sampling plans will fail to detect contamination.
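As a toy illustration of the chained-intervention arithmetic (not the published farm-to-customer model), the sketch below applies the single-intervention log reductions quoted above to a hypothetical starting load and shows how many more cells reach the endpoint when one intervention is removed.

```python
import numpy as np

# Assumed illustrative log reductions per intervention, taken from the single-
# intervention estimates quoted above; the starting load is hypothetical.
interventions = {"preharvest holding": 0.80, "prewash": 1.3, "wash": 1.3}
start_cells = 1e6

def endpoint_cells(steps):
    """Cells reaching the endpoint after applying each step's log reduction."""
    return start_cells / 10 ** sum(steps.values())

baseline = endpoint_cells(interventions)
print(f"all interventions: {baseline:.0f} cells reach the endpoint")
for removed in interventions:
    remaining = {k: v for k, v in interventions.items() if k != removed}
    extra = np.log10(endpoint_cells(remaining) / baseline)
    print(f"without {removed}: {extra:.2f} log more cells reach the endpoint")
```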


Subjects
Food Contamination, Food Microbiology, Food Contamination/analysis, Farms, Food Safety/methods, Food Handling/methods, Microbial Colony Count
10.
Appl Environ Microbiol; 88(23): e0101522, 2022 Dec 13.
Article in English | MEDLINE | ID: mdl-36377948

ABSTRACT

Commercial leafy greens customers often require a negative preharvest pathogen test, typically by compositing 60 produce sample grabs of 150 to 375 g total mass from lots of various acreages. This study developed a preharvest sampling Monte Carlo simulation, validated it against literature and experimental trials, and used it to suggest improvements to sampling plans. The simulation was validated by outputting six simulated ranges of positive samples that contained the experimental number of positive samples (range, 2 to 139 positives) recovered from six field trials with point source, systematic, and sporadic contamination. We then evaluated the relative performance between simple random, stratified random, or systematic sampling in a 1-acre field to detect point sources of contamination present at 0.3% to 1.7% prevalence. Randomized sampling was optimal because of lower variability in probability of acceptance. Optimized sampling was applied to detect an industry-relevant point source [3 log(CFU/g) over 0.3% of the field] and widespread contamination [-1 to -4 log(CFU/g) over the whole field] by taking 60 to 1,200 sample grabs of 3 g. More samples increased the power of detecting point source contamination, as the median probability of acceptance decreased from 85% with 60 samples to 5% with 1,200 samples. Sampling plans with larger total composite sample mass increased power to detect low-level, widespread contamination, as the median probability of acceptance with -3 log(CFU/g) contamination decreased from 85% with a 150-g total mass to 30% with a 1,200-g total mass. Therefore, preharvest sampling power increases by taking more, smaller samples with randomization, up to the constraints of total grabs and mass feasible or required for a food safety objective. IMPORTANCE This study addresses a need for improved preharvest sampling plans for pathogen detection in leafy green fields by developing and validating a preharvest sampling simulation model, avoiding the expensive task of physical sampling in many fields. Validated preharvest sampling simulations were used to develop guidance for preharvest sampling protocols. Sampling simulations predicted that sampling plans with randomization are less variable in their power to detect low-prevalence point source contamination in a 1-acre field. Collecting larger total sample masses improved the power of sampling plans in detecting widespread contamination in 1-acre fields. Hence, the power of typical sampling plans that collect 150 to 375 g per composite sample can be improved by taking more, randomized smaller samples for larger total sample mass. The improved sampling plans are subject to feasibility constraints or to meet a particular food safety objective.
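A bare-bones sketch of the point-source scenario described above (not the validated simulation): a hotspot covers 0.3% of a 1-acre field, each grab is assumed to test positive if it lands in the hotspot, and the lot is accepted when no grab is positive; layout, grab mass, and detection details are simplified away.

```python
import numpy as np

rng = np.random.default_rng(7)
FIELD_M2 = 4046.86                    # one acre
HOTSPOT_M2 = 0.003 * FIELD_M2         # point source covering 0.3% of the field

def prob_accept(n_grabs, n_iter=5000):
    """Simple random sampling: lot accepted if no grab lands in the hotspot."""
    hits = rng.random((n_iter, n_grabs)) < HOTSPOT_M2 / FIELD_M2
    return np.mean(hits.sum(axis=1) == 0)

for n in (60, 300, 1200):
    print(f"{n:4d} grabs: P(accepting the contaminated lot) = {prob_accept(n):.2f}")
```

Even this stripped-down version reproduces the trend reported above: acceptance of the contaminated lot drops from roughly 85% with 60 grabs toward a few percent with 1,200 grabs.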


Subjects
Food Contamination, Food Safety, Food Contamination/analysis, Plant Leaves, Computer Simulation, Food Microbiology, Microbial Colony Count
11.
Toxicon X; 16: 100141, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36406140

ABSTRACT

Maize is a staple food in Kenya. However, maize is prone to fungal infestation, which may result in production of harmful aflatoxins and fumonisins. Electron beam (eBeam) food processing is a proven post-harvest technology, but published literature is rare on the ability of eBeam to reduce mycotoxins in naturally contaminated maize samples. This study evaluated the efficacy of eBeam doses in reducing viable fungal populations and the destruction of aflatoxins and fumonisins in naturally highly contaminated maize samples from eastern Kenya. Ninety-seven maize samples were analyzed for total aflatoxins and fumonisins using commercial ELISA kits. Then, 24 samples with >100 ng/g of total aflatoxins and >1000 ng/g of total fumonisins were chosen for eBeam toxin degradation studies. Prior to eBeam exposure studies, the samples were made into a slurry using sterile de-ionized water. These slurry samples were exposed to target doses of 5 kGy, 10 kGy, and 20 kGy, with 0 kGy (untreated) samples as controls. Samples were analyzed for total fungal load using culture methods, the quantity of total aflatoxins and fumonisins using ELISA, and the presence of Aspergillus and Fusarium spp. nucleic acids using qPCR for just control samples. There was a significant positive correlation in the control samples between total Aspergillus and aflatoxin levels (r = 0.54; p = 0.007) and total Fusarium and fumonisin levels (r = 0.68; p < 0.001). Exposure to eBeam doses 5 kGy and greater reduced fungal loads to below limits of detection by plating (<1.9 log(CFU/g)). There was also a significant (p = 0.03) average reduction of 0.3 log (ng/g) in aflatoxin at 20 kGy (range from -0.9 to 1.4 log (ng/g)). There was no significant reduction in fumonisin even at 20 kGy. eBeam doses below 20 kGy did not reduce mycotoxins. These results confirm the sensitivity of fungi to eBeam doses in a naturally contaminated maize slurry and that 20 kGy is effective at degrading some pre-formed aflatoxin in such maize preparations.

13.
Microorganisms; 9(7), 2021 Jun 22.
Article in English | MEDLINE | ID: mdl-34206706

ABSTRACT

A prophage is a phage-related sequence that is integrated into a bacterial chromosome. Prophages play an important role in bacterial evolution, survival, and persistence. To understand the impact of Listeria prophages on their host genome organizations, this work sequenced two L. monocytogenes strains (134LM and 036LM) previously identified as lysogens by mitomycin C induction. Draft genomes were generated with assembly sizes of 2,953,877 bp and 3,000,399 bp. One intact prophage (39,532 bp) was inserted into the comK gene of the 134LM genome. Two intact prophages (48,684 bp and 39,488 bp) were inserted in tRNA-Lys and elongation-factor genes of the 036LM genome. The findings confirmed the presence of the three corresponding induced phages previously obtained by mitomycin C induction. Comparative genomic analysis of the three prophages from the newly sequenced lysogens, together with 61 prophages found in publicly available L. monocytogenes genomes, identified six major clusters using whole genome-based phylogenetic analysis. The results of the comparative genomic analysis of the prophage sequences provide knowledge about the diversity of Listeria prophages and their distribution among Listeria genomes from diverse environments, including different sources and geographical regions. In addition, the prophage sequences and their insertion sites contribute to the genomic diversity of L. monocytogenes genomes. These data on prophage sequences, prophage insertion sites, and prophage sequence comparisons, together with ANIb confirmation, could be useful for classifying L. monocytogenes by prophages. One potential development could be the refinement of prophage typing tools for monitoring or surveillance of L. monocytogenes contamination and transmission.

14.
J Food Prot; 84(10): 1664-1672, 2021 Oct 01.
Article in English | MEDLINE | ID: mdl-34047784

ABSTRACT

ABSTRACT: Environmental and health advocates are increasingly promoting food donations to reduce landfilled food waste and feed hungry people. Share tables are locations where students can put unwanted school food or beverage items, allowing their uneaten food items to be "shared" with other students and providing food donation opportunities for the 4.9 billion lunches served annually in the U.S. National School Lunch Program. The purpose of this qualitative study was to identify differences in health inspector interpretations of the Food Code as it relates to share table operations and risk mitigation techniques preferred by inspectors for preventing foodborne illness from recovered food. A snowball sampling technique was used to identify Illinois health inspectors (n = 13) engaged in share table inspections. Telephone interviews were audio recorded and transcribed verbatim. The transcripts were coded using a hybrid process of deductive and inductive content analysis. Participants considered contamination, rather than temperature abuse, to be the primary risk factor for foodborne illness. Those participants with permissive Food Code interpretations considered contamination risk in the context of the overall school environment. Participants had the lowest degree of consensus on whether to allow whole apple recovery via a share table. Participants also lacked consensus on reuse of unclaimed share table items in future meal programs (reservice). This lack of consensus indicates that further research is needed to develop data-driven strategies to assess and manage the microbial risks associated with share tables and ultimately to facilitate increased food recovery.


Subjects
Food Services, Waste Disposal, Humans, Lunch, Schools, Students
15.
Risk Anal; 41(11): 2065-2080, 2021 Nov.
Article in English | MEDLINE | ID: mdl-33733507

ABSTRACT

Probe sampling plans for aflatoxin in corn attempt to reliably estimate concentrations in bulk corn given complications like skewed contamination distribution and hotspots. To evaluate and improve sampling plans, three sampling strategies (simple random sampling, stratified random sampling, systematic sampling with U.S. GIPSA sampling schemes), three numbers of probes (5, 10, 100, the last a proxy for autosampling), four clustering levels (1, 10, 100, 1,000 kernels/cluster source), and six aflatoxin concentrations (5, 10, 20, 40, 80, 100 ppb) were assessed by Monte-Carlo simulation. Aflatoxin distribution was approximated by PERT and Gamma distributions of experimental aflatoxin data for uncontaminated and naturally contaminated single kernels. The model was validated against published data repeatedly sampling 18 grain lots contaminated with 5.8-680 ppb aflatoxin. All empirical acceptance probabilities fell within the range of simulated acceptance probabilities. Sensitivity analysis with partial rank correlation coefficients found acceptance probability more sensitive to aflatoxin concentration (-0.87) and clustering level (0.28) than number of probes (-0.09) and sampling strategy (0.04). Comparison of operating characteristic curves indicate all sampling strategies have similar average performance at the 20 ppb threshold (0.8-3.5% absolute marginal change), but systematic sampling has larger variability at clustering levels above 100. Taking extra probes improves detection (1.8% increase in absolute marginal change) when aflatoxin is spatially clustered at 1,000 kernels/cluster, but not when contaminated grains are homogenously distributed. Therefore, taking many small samples, for example, autosampling, may increase sampling plan reliability. The simulation is provided as an R Shiny web app for stakeholder use evaluating grain sampling plans.
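The sketch below illustrates the general Monte Carlo idea at a much-reduced scale, not the published model: kernel-level aflatoxin for contaminated kernels is drawn from a gamma distribution, probes collect a fixed number of kernels, and the lot is accepted if the sample mean is below 20 ppb; the contaminated-kernel fraction, gamma shape, and kernels per probe are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def prob_accept(lot_mean_ppb, n_probes, kernels_per_probe=300, n_iter=2000):
    """Fraction of lots whose probe-sample mean falls below the 20 ppb threshold."""
    hot_fraction = 0.01                           # assumed share of contaminated kernels
    hot_mean = lot_mean_ppb / hot_fraction        # so the whole lot averages lot_mean_ppb
    n_kernels = n_probes * kernels_per_probe
    accepted = 0
    for _ in range(n_iter):
        hot = rng.random(n_kernels) < hot_fraction
        ppb = np.where(hot, rng.gamma(0.5, hot_mean / 0.5, n_kernels), 0.0)
        accepted += ppb.mean() < 20
    return accepted / n_iter

for conc in (5, 10, 20, 40, 80):
    print(f"lot at {conc:3d} ppb, 5 probes: P(accept) = {prob_accept(conc, 5):.2f}")
```

Sweeping the lot concentration like this traces out the kind of operating characteristic curve the study compares across sampling strategies and clustering levels.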


Subjects
Aflatoxins/analysis, Food Contamination/analysis, Zea mays/chemistry, Cluster Analysis, Reproducibility of Results
16.
J Food Prot; 84(5): 802-810, 2021 May 01.
Article in English | MEDLINE | ID: mdl-33302287

ABSTRACT

ABSTRACT: Ready-to-eat meat products, such as deli ham, can support the growth of Listeria monocytogenes (LM), which can cause severe illness in immunocompromised individuals. The objectives of this study were to validate a miniature ham model (MHM) against the ham slice method and to screen antimicrobial combinations for controlling LM on ham by using response surface methodology (RSM) as a time- and cost-effective high-throughput screening tool. The effects of nisin (Ni), potassium lactate and sodium diacetate, lauric arginate (LAG), lytic bacteriophage (P100), and ε-polylysine (EPL), added alone or in combination, were determined on the MHM over 12 days of storage. Results showed the MHM accurately mimics the ham slice method, because no statistical differences were found (P = 0.526) in the change of LM cell counts between the MHM and ham slices after 12 days of storage at 4°C for treated and untreated hams. The MHM was then used to screen antimicrobial combinations by using an on-face design and three center points in a central composite design. The RSM was tested by using a cocktail of five LM strains isolated from foodborne disease outbreaks. Three levels of the previously mentioned antimicrobials were used in combination for a total of 28 runs performed in triplicate. The change in LM cell counts was determined after 12 days of storage at 4°C. All tested antimicrobials were effective at reducing LM cell counts on ham when added alone. A significant antagonistic interaction (P = 0.002) was identified by the RSM between LAG and P100, where this antimicrobial combination caused a 2.2-log CFU/g change in LM cell counts after 12 days of storage. Two interactions, between Ni and EPL (P = 0.058) and between Ni and P100 (P = 0.068), showed possible synergistic effects against LM on the MHM. Other interactions were clearly nonsignificant, suggesting additive effects. In future work, the developed MHM in combination with RSM can be used as a high-throughput method to analyze novel antimicrobial treatments against LM.


Subjects
Anti-Infective Agents, Listeria monocytogenes, Meat Products, Microbial Colony Count, Cost-Benefit Analysis, Food Microbiology, Food Preservation, Humans
17.
Foods; 9(11), 2020 Nov 20.
Article in English | MEDLINE | ID: mdl-33233500

ABSTRACT

Listeria monocytogenes is a food-borne pathogen often associated with ready-to-eat (RTE) food products. Many antimicrobial compounds have been evaluated in RTE meats. However, the search for optimum antimicrobial treatments is ongoing. The present study developed a rapid, non-destructive preliminary screening tool for large-scale evaluation of antimicrobials utilizing a bioluminescent L. monocytogenes with a model meat system. Miniature hams were produced, surface treated with antimicrobials nisin (at 0-100 ppm) and potassium lactate sodium diacetate (at 0-3.5%) and inoculated with bioluminescent L. monocytogenes. A strong correlation (r = 0.91) was found between log scale relative light units (log RLU, ranging from 0.00 to 3.35) read directly from the ham surface and endpoint enumeration on selective agar (log colony forming units (CFU)/g, ranging from 4.7 to 8.3) when the hams were inoculated with 6 log CFU/g, treated with antimicrobials, and L. monocytogenes were allowed to grow over a 12 d refrigerated shelf life at 4 °C. Then, a threshold of 1 log RLU emitted from a ham surface was determined to separate antimicrobial treatments that allowed more than 2 log CFU/g growth of L. monocytogenes (from 6 log CFU/g inoculation to 8 log CFU/g after 12 d). The proposed threshold was utilized in a luminescent screening of antimicrobials with days-to-detect growth monitoring of luminescent L. monocytogenes. Significantly different (p < 0.05) plate counts were found in antimicrobial treated hams that had reached a 1 log RLU increase (8.1-8.5 log(CFU/g)) and the hams that did not reach the proposed light threshold (5.3-7.5 log(CFU/g)). This confirms the potential use of the proposed light threshold as a qualitative tool to screen antimicrobials with less than or greater than a 2 log CFU/g increase. This screening tool can be used to prioritize novel antimicrobials targeting L. monocytogenes, alone or in combination, for future validation.
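To show the screening logic in miniature, the sketch below correlates hypothetical paired readings of surface light emission (log RLU) with plate counts (log CFU/g) and applies the 1 log RLU threshold as a qualitative growth call; the numbers are made up, not the study's data.

```python
import numpy as np

# Hypothetical paired readings for several antimicrobial-treated hams
log_rlu = np.array([0.1, 0.4, 0.8, 1.2, 1.9, 2.6, 3.1])   # surface light emission
log_cfu = np.array([5.0, 5.6, 6.4, 7.1, 7.6, 8.0, 8.3])   # endpoint plate counts

r = np.corrcoef(log_rlu, log_cfu)[0, 1]
print(f"Pearson r between log RLU and log CFU/g: {r:.2f}")

# Qualitative call: >1 log RLU increase taken to indicate >2 log CFU/g growth
for rlu, cfu in zip(log_rlu, log_cfu):
    call = "growth not controlled" if rlu > 1.0 else "growth controlled"
    print(f"log RLU {rlu:.1f} -> {call} (plate count {cfu:.1f} log CFU/g)")
```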

18.
PLoS One; 15(8): e0236668, 2020.
Article in English | MEDLINE | ID: mdl-32756571

ABSTRACT

While complex sample pooling strategies have been developed for large-scale experiments with robotic liquid handling, many medium-scale experiments like mycotoxin screening by Enzyme-Linked Immunosorbent Assay (ELISA) are still conducted manually in 48- and 96-well plates. At this scale, the opportunity to save on reagent costs is offset by the increased costs of labor, materials, and risk-of-error caused by increasingly complex pooling strategies. This paper compares one-dimensional (1D), two-dimensional (2D), and Shifted Transversal Design (STD) pooling to study whether pooling affects assay accuracy and experimental cost and to provide guidance for when a human experimentalist might benefit from pooling. We approximated mycotoxin contamination in single corn kernels by fitting statistical distributions to experimental data (432 kernels for aflatoxin and 528 kernels for fumonisin) and used experimentally-validated Monte-Carlo simulation (10,000 iterations) to evaluate assay sensitivity, specificity, reagent cost, and pipetting cost. Based on the validated simulation results, assay sensitivity remains 100% for all four pooling strategies while specificity decreases as prevalence level rises. Reagent cost could be reduced by 70% and 80% in 48- and 96-well plates, with 1D and STD pooling being most reagent-saving respectively. Such a reagent-saving effect is only valid when prevalence level is < 21% for 48-well plates and < 13%-21% for 96-well plates. Pipetting cost will rise by 1.3-3.3 fold for 48-well plates and 1.2-4.3 fold for 96-well plates, with 1D pooling by row requiring the least pipetting. Thus, it is advisable to employ pooling when the expected prevalence level is below 21% and when the likely savings of up to 80% on reagent cost outweighs the increased materials and labor costs of up to 4 fold increases in pipetting.
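A simplified sketch of evaluating 1D (by-row) pooling on a 96-well layout, not the published simulation: wells are contaminated at a given prevalence, a row pool tests positive if any member is positive, positive pools are retested well by well, and reagent cost is counted as the number of assays run; the assay itself is assumed perfect.

```python
import numpy as np

rng = np.random.default_rng(11)

def one_d_pooling_savings(prevalence, rows=8, cols=12, n_iter=2000):
    """1D pooling by row on a 96-well plate: pool tests, then retest positive rows."""
    assays = 0
    for _ in range(n_iter):
        plate = rng.random((rows, cols)) < prevalence     # contaminated wells
        positive_rows = plate.any(axis=1)                 # row-pool results
        assays += rows + positive_rows.sum() * cols       # pools + individual retests
    return 1 - assays / (n_iter * rows * cols)            # savings vs. testing every well

for prev in (0.05, 0.13, 0.21, 0.40):
    print(f"prevalence {prev:.0%}: reagent savings {one_d_pooling_savings(prev):+.0%}")
```

Running this over a prevalence sweep shows the same qualitative break point described above: the reagent savings shrink and eventually turn negative once too many pools test positive and must be retested.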


Subjects
Enzyme-Linked Immunosorbent Assay/methods, Mass Screening/methods, Mycotoxins/isolation & purification, Humans, Monte Carlo Method, Mycotoxins/chemistry, Mycotoxins/genetics, Zea mays/genetics, Zea mays/microbiology
19.
Foods; 9(3), 2020 Mar 05.
Article in English | MEDLINE | ID: mdl-32150943

ABSTRACT

Current detection methods for aflatoxin and fumonisin contamination used in the corn industry operate at the bulk level. However, the literature demonstrates that contamination with these mycotoxins is highly skewed, and bulk samples do not always accurately represent the overall contamination in a batch of corn. Single kernel analysis can provide a more insightful level of analysis of aflatoxin and fumonisin contamination and suggest a possible remedy for the skewness present in bulk detection. Current literature describes analytical methods capable of detecting aflatoxin and fumonisin at the single kernel level, such as liquid chromatography, fluorescence imaging, and reflectance imaging. These methods could provide tools to classify mycotoxin-contaminated kernels and to study the potential co-occurrence of aflatoxin and fumonisin. Analysis at the single kernel level could address the skewness present in mycotoxin detection and offer improved remediation through sorting, with potential benefits for food security and the management of food waste.

20.
BMC Microbiol; 19(1): 257, 2019 Nov 19.
Article in English | MEDLINE | ID: mdl-31744459

ABSTRACT

BACKGROUND: The foodborne pathogen Listeria monocytogenes causes the potentially lethal disease listeriosis. Within food-associated environments, L. monocytogenes can persist for long periods and increase the risk of contamination by its continued presence in processing facilities or other food-associated environments. Most research on phenotyping of persistent L. monocytogenes has explored biofilm formation and sanitizer resistance, with less data examining persistent L. monocytogenes' phenotypic responses to extrinsic factors, such as variations in osmotic pressure, pH, and energy source availability. It was hypothesized that isolates of persistent strains are able to grow, and grow faster, under a broader range of intrinsic and extrinsic factors compared to closely related isolates of sporadic strains. RESULTS: To test this hypothesis, 95 isolates (representing 74 isolates of 20 persistent strains and 21 isolates of sporadic strains) from a series of previous studies in retail delis were grown at 37 °C in (i) stress conditions: salt (0, 5, and 10% NaCl), pH (5.2, 7.2, and 9.2), and sanitizer (benzalkonium chloride, 0, 2, and 5 µg/mL), and (ii) energy sources: 25 mM glucose, cellobiose, glycogen, fructose, lactose, and sucrose; the original goal was to follow up with low-temperature experiments for treatments where significant differences were observed. The growth rate and ability to grow of the 95 isolates were determined using high-throughput OD600 growth curves. All stress conditions reduced growth rates in isolates compared to the control (p < 0.05). In addition, growth varied by the tested energy source. In chemically defined, minimal media there was a trend toward more isolates showing growth in all replicates using cellobiose (p = 0.052) compared to the control (glucose), and fewer isolates were able to grow in glycogen (p = 0.02), lactose (p = 2.2 × 10⁻¹⁶), and sucrose (p = 2.2 × 10⁻¹⁶). Still, at least one isolate was able to consistently grow in every replicate for each energy source. CONCLUSIONS: The central hypothesis was rejected, as there was no significant difference in growth rate or ability to grow between retail deli isolates of persistent strains and those of sporadic strains for any treatment at 37 °C. Therefore, these data suggest that persistence is likely not determined by a phenotype unique to persistent strains grown at 37 °C and exposed to extrinsic stresses or variation in energy sources.
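As a small illustration of how a maximum growth rate can be pulled from an OD600 curve (the study's exact fitting procedure is not specified in the abstract), the sketch below takes the steepest sliding-window slope of ln(OD) on a synthetic logistic curve.

```python
import numpy as np

t = np.arange(0, 24, 0.5)                           # hours
od = 0.05 + 0.9 / (1 + np.exp(-(t - 8) / 1.5))      # synthetic logistic OD600 curve

ln_od = np.log(od)
window = 4                                          # consecutive points per fitted window
slopes = [np.polyfit(t[i:i + window], ln_od[i:i + window], 1)[0]
          for i in range(len(t) - window + 1)]
print(f"estimated maximum specific growth rate: {max(slopes):.3f} per hour")
```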


Subjects
Biofilms/growth & development, Carbohydrate Metabolism, Energy Metabolism, Listeria monocytogenes/growth & development, Cellobiose/metabolism, Food Contamination, Food Microbiology, Glucose/metabolism, Glycogen/metabolism, Heat-Shock Response, Hydrogen-Ion Concentration, Listeria monocytogenes/metabolism, Osmosis, Plankton