Results 1 - 20 of 54
1.
EFSA J ; 22(4): e8719, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38650612

ABSTRACT

Surveillance data published since 2010, although limited, showed that there is no evidence of zoonotic parasite infection in market quality Atlantic salmon, marine rainbow trout, gilthead seabream, turbot, meagre, Atlantic halibut, common carp and European catfish. No studies were found for greater amberjack, brown trout, African catfish, European eel and pikeperch. Anisakis pegreffii, A. simplex (s. s.) and Cryptocotyle lingua were found in European seabass, Atlantic bluefin tuna and/or cod, and Pseudamphistomum truncatum and Paracoenogonimus ovatus in tench, produced in open offshore cages or flow-through ponds or tanks. It is almost certain that fish produced in closed recirculating aquaculture systems (RAS) or flow-through facilities with filtered water intake and exclusively fed heat-treated feed are free of zoonotic parasites. Since the last EFSA opinion, the UV-press and artificial digestion methods have been developed into ISO standards to detect parasites in fish, while new UV-scanning, optical, molecular and OMICs technologies and methodologies have been developed for the detection, visualisation, isolation and/or identification of zoonotic parasites in fish. Freezing and heating continue to be the most efficient methods to kill parasites in fishery products. High-pressure processing may be suitable for some specific products. Pulsed electric field is a promising technology although further development is needed. Ultrasound treatments were not effective. Traditional dry salting of anchovies successfully inactivated Anisakis. Studies on other traditional processes - air-drying and double salting (brine salting plus dry salting) - suggest that anisakids are successfully inactivated, but more data covering these and other parasites in more fish species and products is required to determine if these processes are always effective. Marinade combinations with anchovies have not effectively inactivated anisakids. 
Natural products, essential oils and plant extracts, may kill parasites but safety and organoleptic data are lacking. Advanced processing techniques for intelligent gutting and trimming are being developed to remove parasites from fish.

2.
EFSA J ; 22(1): e8521, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38250499

ABSTRACT

Listeria monocytogenes (in the meat, fish and seafood, dairy and fruit and vegetable sectors), Salmonella enterica (in the feed, meat, egg and low moisture food sectors) and Cronobacter sakazakii (in the low moisture food sector) were identified as the bacterial food safety hazards most relevant to public health that are associated with persistence in the food and feed processing environment (FFPE). There is a wide range of subtypes of these hazards involved in persistence in the FFPE. While some specific subtypes are more commonly reported as persistent, it is currently not possible to identify universal markers (i.e. genetic determinants) for this trait. Common risk factors for persistence in the FFPE are inadequate zoning and hygiene barriers; lack of hygienic design of equipment and machines; and inadequate cleaning and disinfection. A well-designed environmental sampling and testing programme is the most effective strategy to identify contamination sources and detect potentially persistent hazards. The establishment of hygienic barriers and measures within the food safety management system, during implementation of hazard analysis and critical control points, is key to prevent and/or control bacterial persistence in the FFPE. Once persistence is suspected in a plant, a 'seek-and-destroy' approach is frequently recommended, including intensified monitoring, the introduction of control measures and the continuation of the intensified monitoring. Successful actions triggered by persistence of L. monocytogenes are described, as well as interventions with direct bactericidal activity. These interventions could be efficient if properly validated, correctly applied and verified under industrial conditions. Perspectives are provided for performing a risk assessment for relevant combinations of hazard and food sector to assess the relative public health risk that can be associated with persistence, based on bottom-up and top-down approaches. 
Knowledge gaps related to bacterial food safety hazards associated with persistence in the FFPE and priorities for future research are provided.

3.
EFSA J ; 21(11): e08332, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37928944

ABSTRACT

The contamination of water used in post-harvest handling and processing operations of fresh and frozen fruit, vegetables and herbs (ffFVHs) is a global concern. The most relevant microbial hazards associated with this water are: Listeria monocytogenes, Salmonella spp., human pathogenic Escherichia coli and enteric viruses, which have been linked to multiple outbreaks associated with ffFVHs in the European Union (EU). Contamination (i.e. the accumulation of microbiological hazards) of the process water during post-harvest handling and processing operations is affected by several factors including: the type and contamination of the FVHs being processed, duration of the operation and transfer of microorganisms from the product to the water and vice versa, etc. For food business operators (FBOp), it is important to maintain the microbiological quality of the process water to assure the safety of ffFVHs. Good manufacturing practices (GMP) and good hygienic practices (GHP) related to a water management plan and the implementation of a water management system are critical to maintain the microbiological quality of the process water. Identified hygienic practices include technical maintenance of infrastructure, training of staff and cooling of post-harvest process water. Intervention strategies (e.g. use of water disinfection treatments and water replenishment) have been suggested to maintain the microbiological quality of process water. Chlorine-based disinfectants and peroxyacetic acid have been reported as common water disinfection treatments. However, given current practices in the EU, evidence of their efficacy under industrial conditions is only available for chlorine-based disinfectants. The use of water disinfection treatments must be undertaken following an appropriate water management strategy including validation, operational monitoring and verification. 
During operational monitoring, real-time information on process parameters related to the process and product, as well as the water and water disinfection treatment(s) are necessary. More specific guidance for FBOp on the validation, operational monitoring and verification is needed.

4.
EFSA J ; 21(10): e08312, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37908452

ABSTRACT

EFSA Strategy 2027 outlines the need for fit-for-purpose protocols for EFSA generic scientific assessments to aid in delivering trustworthy scientific advice. This EFSA Scientific Committee guidance document helps address this need by providing a harmonised and flexible framework for developing protocols for EFSA generic assessments. The guidance replaces the 'Draft framework for protocol development for EFSA's scientific assessments' published in 2020. The two main steps in protocol development are described. The first is problem formulation, which illustrates the objectives of the assessment. Here a new approach to translating the mandated Terms of Reference into scientifically answerable assessment questions and sub-questions is proposed: the 'APRIO' paradigm (Agent, Pathway, Receptor, Intervention and Output). Owing to its cross-cutting nature, this paradigm is considered adaptable and broadly applicable within and across the various EFSA domains and, if applied using the definitions given in this guidance, is expected to help harmonise the problem formulation process and outputs and foster consistency in protocol development. APRIO may also overcome the difficulty of implementing some existing frameworks across the multiple EFSA disciplines, e.g. the PICO/PECO approach (Population, Intervention/Exposure, Comparator, Outcome). Therefore, although not mandatory, APRIO is recommended. The second step in protocol development is the specification of the evidence needs and the methods that will be applied for answering the assessment questions and sub-questions, including uncertainty analysis. 
Five possible approaches to answering individual (sub-)questions are outlined: using evidence from scientific literature and study reports; using data from databases other than bibliographic; using expert judgement informally collected or elicited via semi-formal or formal expert knowledge elicitation processes; using mathematical/statistical models; and - not covered in this guidance - generating empirical evidence ex novo. The guidance is complemented by a standalone 'template' for EFSA protocols that guides the users step by step through the process of planning an EFSA scientific assessment.
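As an illustration of how the APRIO elements might be captured in practice, the sketch below structures a hypothetical assessment question. All field contents and the class name are invented for illustration and are not part of the guidance.

```python
from dataclasses import dataclass

# Hypothetical sketch of structuring a mandate into APRIO elements
# (Agent, Pathway, Receptor, Intervention, Output). All example
# contents below are invented.
@dataclass(frozen=True)
class AprioQuestion:
    agent: str         # hazard or entity under assessment
    pathway: str       # route by which the agent reaches the receptor
    receptor: str      # population or entity potentially affected
    intervention: str  # measure whose effect is being assessed
    output: str        # quantity the assessment must deliver

    def as_question(self) -> str:
        """Render the structured elements as an answerable question."""
        return (f"What is the {self.output} for {self.receptor} exposed to "
                f"{self.agent} via {self.pathway}, given {self.intervention}?")

q = AprioQuestion(
    agent="a food-borne pathogen",
    pathway="consumption of a ready-to-eat food",
    receptor="the general EU population",
    intervention="a heat-treatment step",
    output="probability of illness per serving",
)
print(q.as_question())
```

The value of a structure like this is that the same five slots can be filled for very different mandates, which is the cross-cutting property the guidance attributes to APRIO.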

5.
Int J Food Microbiol ; 403: 110302, 2023 Oct 16.
Article in English | MEDLINE | ID: mdl-37392608

ABSTRACT

EFSA's Panel on Biological Hazards (BIOHAZ Panel) deals with questions on biological hazards relating to food safety and food-borne diseases. This covers food-borne zoonoses, transmissible spongiform encephalopathies, antimicrobial resistance, food microbiology, food hygiene, animal by-products, and associated waste management issues. The scientific assessments are diverse and the development of new methodological approaches is frequently required to deal with a mandate. Among the many risk factors, product characteristics (pH, water activity etc.), time and temperature of processing and storage along the food supply chain are highly relevant for assessing the biological risks. Therefore, predictive microbiology becomes an essential element of the assessments. Uncertainty analysis is incorporated in all BIOHAZ scientific assessments, to meet the general requirement for transparency. Assessments should clearly and unambiguously state what sources of uncertainty have been identified and their impact on the conclusions of the assessment. Four recent BIOHAZ Scientific Opinions are presented to illustrate the use of predictive modelling and quantitative microbial risk assessment principles in regulatory science. The Scientific Opinion on the guidance on date marking and related food information gives a general overview of the use of predictive microbiology for shelf-life assessment. The Scientific Opinion on the efficacy and safety of high-pressure processing of food provides an example of inactivation modelling and compliance with performance criteria. The Scientific Opinion on the use of the so-called 'superchilling' technique for the transport of fresh fishery products illustrates the combination of heat transfer and microbial growth modelling.
Finally, the Scientific Opinion on the delayed post-mortem inspection in ungulates shows how variability and uncertainty were quantitatively embedded in assessing the probability of Salmonella detection on carcasses, via stochastic modelling and expert knowledge elicitation.


Subjects
Food Microbiology, Foodborne Diseases, Animals, Zoonoses, Food Safety, Risk Assessment/methods
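The stochastic embedding of variability and uncertainty mentioned above can be illustrated with a minimal Monte Carlo sketch. This is not the Panel's actual model: the beta distributions and their parameters are invented placeholders standing in for elicited inputs.

```python
import random

# Illustrative Monte Carlo sketch (not the Panel's actual model) of how
# uncertain inputs propagate to the probability that Salmonella is
# detected on a carcass. The beta distributions below are invented.
random.seed(1)

def detection_probability_samples(n: int = 10_000) -> list:
    samples = []
    for _ in range(n):
        prevalence = random.betavariate(2, 38)   # uncertain carcass prevalence
        sensitivity = random.betavariate(20, 5)  # uncertain test sensitivity
        samples.append(prevalence * sensitivity)
    return samples

s = sorted(detection_probability_samples())
median = s[len(s) // 2]
p5, p95 = s[int(0.05 * len(s))], s[int(0.95 * len(s))]
print(f"median detection probability ~{median:.3f} "
      f"(90% interval {p5:.3f}-{p95:.3f})")
```

Reporting an interval rather than a point estimate is what allows conclusions of the form "x-y% certain" used throughout these opinions.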
6.
EFSA J ; 21(7): e08093, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37416785

ABSTRACT

An assessment was conducted on the level of inactivation of relevant pathogens that could be present in processed animal protein of porcine origin intended to feed poultry and aquaculture animals when methods 2 to 5 and method 7, as detailed in Regulation (EU) No 142/2011, are applied. Five approved scenarios were selected for method 7. Salmonella Senftenberg, Enterococcus faecalis, spores of Clostridium perfringens and parvoviruses were shortlisted as target indicators. Inactivation parameters for these indicators were extracted from an extensive literature search and a recent EFSA scientific opinion. An adapted Bigelow model was fitted to the retrieved data to estimate the probability that methods 2 to 5, in coincidental and consecutive modes, and the five scenarios of method 7 are able to achieve a 5 log10 and a 3 log10 reduction of bacterial indicators and parvoviruses, respectively. Spores of C. perfringens were the indicator with the lowest probability of achieving the target reduction by methods 2 to 5, in coincidental and consecutive mode, and by the five considered scenarios of method 7. An expert knowledge elicitation was conducted to estimate the certainty of achieving a 5 log10 reduction of spores of C. perfringens considering the results of the model and additional evidence. A 5 log10 reduction of C. perfringens spores was judged: 99-100% certain for methods 2 and 3 in coincidental mode; 98-100% certain for method 7 scenario 3; 80-99% certain for method 5 in coincidental mode; 66-100% certain for method 4 in coincidental mode and for method 7 scenarios 4 and 5; 25-75% certain for method 7 scenario 2; and 0-5% certain for method 7 scenario 1. Higher certainty is expected for methods 2 to 5 in consecutive mode compared to coincidental mode.
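A classical Bigelow (log-linear) calculation of the kind adapted in this assessment can be sketched as follows. The D and z values below are hypothetical, not the parameters fitted for the shortlisted indicators.

```python
# Sketch of a classical Bigelow (log-linear) inactivation calculation.
# D_ref and z below are hypothetical illustration values, not the
# parameters used for C. perfringens spores or the other indicators.
def log10_reduction(time_min: float, temp_c: float,
                    d_ref_min: float, t_ref_c: float, z_c: float) -> float:
    """Decimal reductions in `time_min` at constant `temp_c`, where
    D(T) = D_ref * 10**((T_ref - T) / z) and reduction = t / D(T)."""
    d_t = d_ref_min * 10 ** ((t_ref_c - temp_c) / z_c)
    return time_min / d_t

# 20 min at 110°C against a hypothetical spore with D = 20 min at 100°C
# and z = 10°C: D(110) = 2 min, i.e. a 10 log10 reduction.
r = log10_reduction(time_min=20, temp_c=110, d_ref_min=20, t_ref_c=100, z_c=10)
print(f"{r:.1f} log10 reduction")
```

In the actual assessment, uncertainty in the fitted D and z parameters is what turns a deterministic calculation like this into a probability of achieving the 5 log10 target.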

7.
EFSA J ; 21(1): e07745, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36698487

ABSTRACT

The impact of dry-ageing of beef and wet-ageing of beef, pork and lamb on microbiological hazards and spoilage bacteria was examined and current practices are described. As 'standard fresh' and wet-aged meat use similar processes, these were differentiated based on duration. In addition to a description of the different stages, data were collated on key parameters (time, temperature, pH and aw) using a literature survey and questionnaires. The microbiological hazards that may be present in all aged meats included Shiga toxin-producing Escherichia coli (STEC), Salmonella spp., Staphylococcus aureus, Listeria monocytogenes, enterotoxigenic Yersinia spp., Campylobacter spp. and Clostridium spp. Moulds, such as Aspergillus spp. and Penicillium spp., may produce mycotoxins when conditions are favourable but may be prevented by ensuring a meat surface temperature of -0.5 to 3.0°C, with a relative humidity (RH) of 75-85% and an airflow of 0.2-0.5 m/s for up to 35 days. The main meat spoilage bacteria include Pseudomonas spp., Lactobacillus spp., Enterococcus spp., Weissella spp., Brochothrix spp., Leuconostoc spp., Shewanella spp. and Clostridium spp. Under current practices, the ageing of meat may have an impact on the load of microbiological hazards and spoilage bacteria as compared to standard fresh meat preparation. Ageing under defined and controlled conditions can achieve the same or lower loads of microbiological hazards and spoilage bacteria than the variable log10 increases predicted during standard fresh meat preparation. An approach was used to establish the conditions of time and temperature that would achieve similar or lower levels of L. monocytogenes and Yersinia enterocolitica (pork only) and lactic acid bacteria (representing spoilage bacteria) as compared to standard fresh meat.
Finally, additional control activities were identified that would further assure the microbial safety of dry-aged beef, based on recommended best practice and the outputs of the equivalence assessment.
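The mould-prevention conditions cited above can be expressed as a simple range check. The function name is illustrative; the numeric ranges come directly from the abstract.

```python
# Range check over the dry-ageing conditions cited above for limiting
# surface mould growth: meat surface temperature -0.5 to 3.0°C, relative
# humidity 75-85%, airflow 0.2-0.5 m/s, for up to 35 days.
def within_cited_dry_ageing_range(temp_c: float, rh_pct: float,
                                  airflow_m_s: float, days: int) -> bool:
    return (-0.5 <= temp_c <= 3.0
            and 75 <= rh_pct <= 85
            and 0.2 <= airflow_m_s <= 0.5
            and days <= 35)

print(within_cited_dry_ageing_range(1.5, 80, 0.3, 28))   # True
print(within_cited_dry_ageing_range(5.0, 80, 0.3, 28))   # False: too warm
```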

8.
EFSA J ; 20(5): e07265, 2022 May.
Article in English | MEDLINE | ID: mdl-35592024

ABSTRACT

Studies evaluating the safety and efficacy of lactic acid to reduce microbiological surface contamination from carcases of wild game (i.e. kangaroos and wild pigs) and small stock (i.e. goats and sheep) before chilling at the slaughterhouse were assessed. Wild pig and kangaroo hide-on carcases may have been chilled before they arrive at the slaughterhouse and are treated after removal of the hides. Lactic acid solutions (2-5%) are applied to the carcases at temperatures of up to 55°C by spraying or misting. The treatment lasts 6-7 s per carcass side. The Panel concluded that: (1) the treatment is of no safety concern, provided that the lactic acid complies with the European Union specifications for food additives; (2) based on the available evidence, it was not possible to conclude on the efficacy of spraying or misting lactic acid on kangaroo, wild pig, goat and sheep carcases; (3) treatment of the above-mentioned carcases with lactic acid may induce reduced susceptibility to the same substance, but this can be minimised; there is currently no evidence that prior exposure of food-borne pathogens to lactic acid leads to the occurrence of resistance levels that compromise antimicrobial therapy; and (4) the release of lactic acid is not of concern for the environment, assuming that wastewaters released by the slaughterhouses are treated on-site, if necessary, to counter the potentially low pH caused by lactic acid, in compliance with local rules.

9.
EFSA J ; 20(3): e07128, 2022 Mar.
Article in English | MEDLINE | ID: mdl-35281651

ABSTRACT

High-pressure processing (HPP) is a non-thermal treatment in which, for microbial inactivation, foods are subjected to isostatic pressures (P) of 400-600 MPa with common holding times (t) from 1.5 to 6 min. The main factors that influence the efficacy (log10 reduction of vegetative microorganisms) of HPP when applied to foodstuffs are intrinsic (e.g. water activity and pH), extrinsic (P and t) and microorganism-related (type, taxonomic unit, strain and physiological state). It was concluded that HPP of food will not present any additional microbial or chemical food safety concerns when compared to other routinely applied treatments (e.g. pasteurisation). Pathogen reductions in milk/colostrum caused by the current HPP conditions applied by the industry are lower than those achieved by the legal requirements for thermal pasteurisation. However, HPP minimum requirements (P/t combinations) could be identified to achieve specific log10 reductions of relevant hazards based on performance criteria (PC) proposed by international standard agencies (5-8 log10 reductions). The most stringent HPP conditions used industrially (600 MPa, 6 min) would achieve the above-mentioned PC, except for Staphylococcus aureus. Alkaline phosphatase (ALP), the endogenous milk enzyme that is widely used to verify adequate thermal pasteurisation of cows' milk, is relatively pressure resistant and its use would be limited to that of an overprocessing indicator. Current data are not robust enough to support the proposal of an appropriate indicator to verify the efficacy of HPP under the current HPP conditions applied by the industry. Minimum HPP requirements to reduce Listeria monocytogenes levels by specific log10 reductions could be identified when HPP is applied to ready-to-eat (RTE) cooked meat products, but not for other types of RTE foods. 
These identified minimum requirements would result in the inactivation of other relevant pathogens (Salmonella and Escherichia coli) in these RTE foods to a similar or higher extent.
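The idea of checking a minimum P/t combination against a performance criterion can be sketched with a log-linear pressure-inactivation model, by analogy with thermal D/z models. The D-value and z_P below are invented for illustration and are not EFSA-derived parameters.

```python
# Sketch: checking an HPP P/t combination against a performance
# criterion (PC) with a log-linear pressure-inactivation model.
# The D-value and z_P below are invented, not EFSA-derived.
def hpp_log10_reduction(time_min: float, pressure_mpa: float,
                        d_ref_min: float, p_ref_mpa: float,
                        z_p_mpa: float) -> float:
    """D(P) = D_ref * 10**((P_ref - P) / z_P); reduction = t / D(P)."""
    d_p = d_ref_min * 10 ** ((p_ref_mpa - pressure_mpa) / z_p_mpa)
    return time_min / d_p

# The most stringent industrial conditions (600 MPa, 6 min) against a
# hypothetical pathogen with D = 1.2 min at 600 MPa (z_P = 100 MPa):
r = hpp_log10_reduction(6, 600, d_ref_min=1.2, p_ref_mpa=600, z_p_mpa=100)
print(f"{r:.1f} log10 -> meets a 5 log10 PC: {r >= 5}")
```

Identifying the "minimum HPP requirements" mentioned above amounts to inverting a calculation like this: finding the P/t combinations for which the predicted reduction meets the chosen PC.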

10.
EFSA J ; 19(4): e06576, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33968255

ABSTRACT

Pasteurisation of raw milk, colostrum, dairy or colostrum-based products must be achieved using at least 72°C for 15 s, at least 63°C for 30 min or any equivalent combination, such that the alkaline phosphatase (ALP) test immediately after such treatment gives a negative result. For cows' milk, a negative result is when the measured activity is ≤ 350 milliunits of enzyme activity per litre (mU/L) using the ISO standard 11816-1. The use and limitations of an ALP test and possible alternative methods for verifying pasteurisation of those products from other animal species (in particular sheep and goats) were evaluated. The current limitations of ALP testing of bovine products also apply. ALP activity in raw ovine milk appears to be about three times higher and in caprine milk about five times lower than in bovine milk and is highly variable between breeds. It is influenced by season, lactation stage and fat content. Assuming a similar pathogen inactivation rate to cows' milk and based on the available data, there is 95-99% probability (extremely likely) that pasteurised goat milk and pasteurised sheep milk would have an ALP activity below a limit of 300 and 500 mU/L, respectively. The main alternative methods currently used are temperature monitoring using data loggers (which cannot detect other process failures such as cracked or leaking plates) and the enumeration of Enterobacteriaceae (which is not suitable for pasteurisation verification but is relevant for hygiene monitoring). The inactivation of certain enzymes other than ALP may be more suitable for the verification of pasteurisation but requires further study. Secondary products of heat treatment are not suitable as pasteurisation markers due to the high temperatures needed for their production. More research is needed to facilitate a definitive conclusion on the applicability of changes in native whey proteins as pasteurisation markers.
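A minimal sketch of the verification logic, using the ALP activity limits discussed above: 350 mU/L for cows' milk (ISO 11816-1), and the 300 and 500 mU/L limits judged extremely likely to be met by pasteurised goat and sheep milk, respectively.

```python
# Pasteurisation-verification check based on the ALP activity limits
# discussed above (mU/L): 350 for cows' milk per ISO 11816-1, and the
# 300/500 limits for goat and sheep milk from the assessment.
ALP_LIMITS_MU_PER_L = {"cow": 350, "goat": 300, "sheep": 500}

def alp_test_negative(species: str, measured_mu_per_l: float) -> bool:
    """True if measured ALP activity is at or below the species limit,
    i.e. the result is consistent with adequate pasteurisation."""
    return measured_mu_per_l <= ALP_LIMITS_MU_PER_L[species]

print(alp_test_negative("cow", 120))    # True: consistent with pasteurisation
print(alp_test_negative("sheep", 800))  # False: verification fails
```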

11.
EFSA J ; 19(4): e06510, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33897858

ABSTRACT

A risk-based approach was used to develop guidance to be followed by food business operators (FBOs) when deciding on food information relating to storage conditions and/or time limits for consumption after opening a food package and thawing of frozen foods. After opening the package, contamination may occur, introducing new pathogens into the food, and the intrinsic (e.g. pH and aw), extrinsic (e.g. temperature and gas atmosphere) and implicit (e.g. interactions with competing background microbiota) factors may change, affecting microbiological food safety. Setting a time limit for consumption after opening the package (secondary shelf-life) is complex in view of the many influencing factors and information gaps. A decision tree (DT) was developed to assist FBOs in deciding whether the time limit for consumption after opening, due to safety reasons, is potentially shorter than the initial 'best before' or 'use by' date of the product in its unopened package. For products where opening the package leads to a change in the type of pathogenic microorganisms present in the food and/or factors increasing their growth compared to the unopened product, a shorter time limit for consumption after opening would be appropriate. Freezing prevents the growth of pathogens; however, most pathogenic microorganisms may survive frozen storage, recover during thawing and then grow and/or produce toxins in the food, if conditions are favourable. Moreover, additional contamination may occur from hands, contact surfaces, other foods and utensils. Good practices for thawing should, from a food safety point of view, minimise pathogen growth and cross-contamination between the food being thawed and other foods and/or contact surfaces, especially when removing the food from the package during thawing. Best practices for thawing foods are presented to support FBOs.
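A hypothetical, highly simplified rendering of the decision logic described; the real decision tree in the guidance asks more questions than the two captured here.

```python
# Hypothetical simplification of the decision logic described above:
# does opening the package potentially shorten the time limit for
# consumption relative to the unopened product's date marking?
def may_need_shorter_secondary_shelf_life(
    new_pathogens_introduced_on_opening: bool,
    growth_factors_increased_on_opening: bool,
) -> bool:
    """True if the time limit for consumption after opening may need to
    be shorter than the 'best before'/'use by' date when unopened."""
    return (new_pathogens_introduced_on_opening
            or growth_factors_increased_on_opening)

# e.g. a modified-atmosphere product loses its protective gas phase
# when opened, so growth factors change:
print(may_need_shorter_secondary_shelf_life(False, True))  # True
```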

12.
EFSA J ; 19(1): e06378, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33552296

ABSTRACT

Superchilling entails lowering the fish temperature to between its initial freezing point and about 1-2°C below it. The temperature of superchilled fresh fishery products (SFFP) in boxes without ice was compared to that of products subject to the currently authorised practice in boxes with ice (CFFP) under the same conditions of on-land storage and/or transport. A heat transfer model was developed and made available as a tool to identify under which initial configurations of SFFP the fish temperature, at any time of storage/transport, is lower than or equal to that of CFFP. A minimum degree of superchilling, corresponding to an ice fraction in the fish matrix of SFFP equal to or higher than the proportion of ice added per mass of fish in CFFP, will ensure with 99-100% certainty (almost certain) that the fish temperature of SFFP, and the consequent increase of relevant hazards, will be lower than or equal to that of CFFP. In practice, the degree of superchilling can be estimated using the fish temperature after superchilling and its initial freezing point, which are subject to uncertainties. The tool can be used as part of a 'safety-by-design' approach, with the reliability of its outcome being dependent on the accuracy of the input data. An evaluation of methods capable of detecting whether a fish commercially presented as 'superchilled' was previously frozen was carried out based on, amongst others, their applicability for different fish species, ability to differentiate fresh fish from fish frozen at different temperatures, use as a stand-alone method, ease of use and classification performance. The methods that were considered 'fit for purpose' are the hydroxyacyl-coenzyme A dehydrogenase (HADH) test, the α-glucosidase test, histology, ultraviolet-visible-near-infrared (UV-VIS/NIR) spectroscopy and hyperspectral imaging. These methods would benefit from standardisation, including the establishment of threshold values or classification algorithms to provide a practical routine test.
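The degree of superchilling depends on the ice fraction in the fish matrix, which in practice is estimated from the product temperature and the initial freezing point. A textbook first approximation of that relation is sketched below; this is a simplification for illustration, not the EFSA heat-transfer model, and the water content is an assumed value.

```python
# Textbook first approximation (not the EFSA model) of the frozen water
# fraction from the initial freezing point T_f and product temperature T
# (both in °C):  ice_fraction ≈ water_content * (1 - T_f / T), T < T_f < 0.
def ice_fraction(temp_c: float, freezing_point_c: float,
                 water_content: float = 0.75) -> float:
    if temp_c >= freezing_point_c:
        return 0.0  # no ice at or above the initial freezing point
    return water_content * (1.0 - freezing_point_c / temp_c)

# Fish superchilled to -2.0°C with an initial freezing point of -1.0°C:
print(f"{ice_fraction(-2.0, -1.0):.3f}")  # 0.75 * (1 - 0.5) = 0.375
```

Both inputs (measured temperature and initial freezing point) carry uncertainty, which is why the abstract flags the estimated degree of superchilling as uncertain.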

13.
EFSA J ; 18(12): e06306, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33304412

ABSTRACT

A risk-based approach was developed to be followed by food business operators (FBO) when deciding on the type of date marking (i.e. 'best before' date or 'use by' date), setting of shelf-life (i.e. time) and the related information on the label to ensure food safety. The decision on the type of date marking needs to be taken on a product-by-product basis, considering the relevant hazards, product characteristics, processing and storage conditions. The hazard identification is food product-specific and should consider pathogenic microorganisms capable of growing in prepacked temperature-controlled foods under reasonably foreseeable conditions. The intrinsic (e.g. pH and aw), extrinsic (e.g. temperature and gas atmosphere) and implicit (e.g. interactions with competing background microbiota) factors of the food determine which pathogenic and spoilage microorganisms can grow in the food during storage until consumption. A decision tree was developed to assist FBOs in deciding the type of date marking for a certain food product. When setting the shelf-life, the FBO needs to consider reasonably foreseeable conditions of distribution, storage and use of the food. Key steps of a case-by-case procedure to determine and validate the shelf-life period are: (i) identification of the relevant pathogenic/spoilage microorganism and its initial level, (ii) characterisation of the factors of the food affecting the growth behaviour and (iii) assessment of the growth behaviour of the pathogenic/spoilage microorganism in the food product during storage until consumption. Due to the variability between food products and consumer habits, it was not appropriate to present indicative time limits for food donated or marketed past the 'best before' date. 
Recommendations were provided relating to training activities and support, using 'reasonably foreseeable conditions', collecting time-temperature data during distribution, retail and domestic storage of foods and developing Appropriate Levels of Protection and/or Food Safety Objectives for food-pathogen combinations.
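Step (iii) of the procedure above, assessing growth behaviour during storage, is typically done with a secondary predictive model. The sketch below uses the Ratkowsky square-root model; the b and T_min values are invented placeholders, not parameters for any particular microorganism.

```python
# Sketch of step (iii): a Ratkowsky square-root secondary model relating
# the maximum specific growth rate to storage temperature,
#     sqrt(mu) = b * (T - T_min).
# b and T_min below are invented placeholders.
def growth_rate_per_h(temp_c: float, b: float = 0.03,
                      t_min_c: float = -1.5) -> float:
    """Maximum specific growth rate (h^-1); zero at or below T_min."""
    if temp_c <= t_min_c:
        return 0.0
    return (b * (temp_c - t_min_c)) ** 2

print(f"{growth_rate_per_h(5.0):.4f} per hour at 5°C")
```

Evaluating such a model over the time-temperature profile of distribution, retail and domestic storage gives the growth behaviour needed to set and validate the shelf-life.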

14.
EFSA J ; 18(4): e06090, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32874298

ABSTRACT

The 2011 EFSA opinion on Campylobacter was updated using more recent scientific data. The relative risk reduction in EU human campylobacteriosis attributable to broiler meat was estimated for on-farm control options using Population Attributable Fractions (PAF) for interventions that reduce Campylobacter flock prevalence, updating the modelling approach for interventions that reduce caecal concentrations and reviewing scientific literature. According to the PAF analyses calculated for six control options, the mean relative risk reductions that could be achieved by adoption of each of these six control options individually are estimated to be substantial, but the width of the confidence intervals of all control options indicates a high degree of uncertainty in the specific risk reduction potentials. The updated model resulted in lower estimates of impact than the model used in the previous opinion. A 3-log10 reduction in broiler caecal concentrations was estimated to reduce the relative EU risk of human campylobacteriosis attributable to broiler meat by 58%, compared to an estimate larger than 90% in the previous opinion. Expert Knowledge Elicitation was used to rank control options, weight and integrate different evidence streams and assess uncertainties. Medians of the relative risk reductions of selected control options had largely overlapping probability intervals, so the rank order was uncertain: vaccination 27% (90% probability interval (PI) 4-74%); feed and water additives 24% (90% PI 4-60%); discontinued thinning 18% (90% PI 5-65%); employing few and well-trained staff 16% (90% PI 5-45%); avoiding drinkers that allow standing water 15% (90% PI 4-53%); addition of disinfectants to drinking water 14% (90% PI 3-36%); hygienic anterooms 12% (90% PI 3-50%); designated tools per broiler house 7% (90% PI 1-18%).
It is not possible to quantify the effects of combined control activities because the evidence-derived estimates are inter-dependent and there is a high level of uncertainty associated with each.
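The Population Attributable Fraction underlying the estimates above can be sketched with Levin's classical formula. The prevalence and relative-risk values in the example are invented for illustration, not figures from the opinion.

```python
# Sketch of the Population Attributable Fraction (PAF) used to estimate
# achievable risk reductions. Example inputs below are invented.
def paf(exposed_fraction: float, relative_risk: float) -> float:
    """Levin's formula: PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = exposed_fraction * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# If 30% of flocks are Campylobacter-positive and meat from colonised
# flocks carries a relative risk of 4, the attributable fraction is:
print(f"{paf(0.30, 4.0):.2f}")  # ≈ 0.47
```

An intervention that reduces flock prevalence lowers the exposed fraction p, and the resulting drop in PAF is the estimated relative risk reduction.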

15.
EFSA J ; 18(4): e06092, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32874300

ABSTRACT

A multi-country outbreak of Listeria monocytogenes ST6 linked to blanched frozen vegetables (bfV) took place in the EU (2015-2018). Evidence from food-borne outbreaks shows that L. monocytogenes is the most relevant pathogen associated with bfV. The probability of illness per serving of uncooked bfV, for the elderly (65-74 years old) population, is up to 3,600 times greater than that of cooked bfV and very likely lower than that of any of the evaluated ready-to-eat food categories. The main factors affecting contamination and growth of L. monocytogenes in bfV during processing are the hygiene of the raw materials and process water; the hygienic conditions of the food processing environment (FPE); and the time/temperature (t/T) combinations used for storage and processing (e.g. blanching, cooling). Relevant factors after processing are the intrinsic characteristics of the bfV, the t/T combinations used for thawing and storage and subsequent cooking conditions, unless eaten uncooked. Analysis of the possible control options suggests that application of a complete HACCP plan is either not possible or would not further enhance food safety. Instead, specific prerequisite programmes (PRP) and operational PRP activities should be applied, such as cleaning and disinfection of the FPE, water control, t/T control and product information and consumer awareness. The occurrence of low levels of L. monocytogenes at the end of the production process (e.g. < 10 CFU/g) would be compatible with the limit of 100 CFU/g at the moment of consumption if the labelling recommendations are strictly followed (i.e. 24 h at 5°C). Under reasonably foreseeable conditions of use (i.e. 48 h at 12°C), L. monocytogenes levels need to be considerably lower (not detected in 25 g). Routine monitoring programmes for L. monocytogenes should be designed following a risk-based approach and regularly revised based on trend analysis, with FPE monitoring being a key activity in the frozen vegetable industry.
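The compliance logic described, whether an end-of-production level stays at or below 100 CFU/g at the moment of consumption, can be sketched with a back-of-envelope exponential growth projection. The growth rate used here is an invented placeholder, not an EFSA-derived value.

```python
import math

# Back-of-envelope compliance sketch: does an end-of-production level
# stay within the 100 CFU/g limit at consumption? The growth rate
# mu_per_h is an invented placeholder, not an EFSA-derived value.
def level_at_consumption(initial_cfu_g: float, mu_per_h: float,
                         hours: float) -> float:
    """Exponential-growth projection: N(t) = N0 * exp(mu * t)."""
    return initial_cfu_g * math.exp(mu_per_h * hours)

# 10 CFU/g stored for 24 h at 5°C with an assumed mu of 0.02 h^-1:
n = level_at_consumption(10, 0.02, 24)
print(f"{n:.1f} CFU/g -> within 100 CFU/g limit: {n <= 100}")
```

Re-running the projection under the reasonably foreseeable 48 h at 12°C scenario (a faster rate over a longer time) shows why the abstract requires a much lower starting level in that case.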

16.
EFSA J ; 18(4): e06091, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32874299

ABSTRACT

On-land transport/storage of fresh fishery products (FFP) for up to 3 days in 'tubs' of three-layered polyethylene filled with freshwater and ice was compared to the currently authorised practice (fish boxes of high-density polyethylene filled with ice). The impact on the survival and growth of biological hazards in fish and on histamine production in fish species associated with a high amount of histidine was assessed. In the different modelling scenarios, the FFP are stored on-board in freshwater or seawater/ice (in tubs) and once on-land they are 'handled' (i.e. sorted or gutted and/or filleted) and transferred to either tubs or boxes. The temperature of the FFP was assumed to be the most influential factor affecting the relevant hazards. Under reasonably foreseeable 'abusive' scenarios and using a conservative modelling approach, the growth of the relevant hazards (i.e. Listeria monocytogenes, Aeromonas spp. and non-proteolytic Clostridium botulinum) is expected to be < 0.2 log10 units higher in tubs than in boxes after 3 days when the initial temperature of the fish is 0°C ('keeping' process). Starting at 7°C ('cooling-keeping' process), the expected difference in the growth potential is higher (< 1 log10 for A. hydrophila and < 0.5 log10 for the other two hazards) due to the poorer cooling capacity of water and ice (tub) compared with ice (box). The survival of the relevant hazards is not or is only negligibly impacted. Histamine formation due to growth of Morganella psychrotolerans under the 'keeping' or 'cooling-keeping' process can be up to 0.4 ppm and 1.5 ppm higher, respectively, in tubs as compared to boxes after 3 days, without reaching the legal limit of 100 ppm. The water uptake associated with the storage of the FFP in tubs (which may be up to 6%) does not make a relevant contribution to the differences in microbial growth potential compared to boxes.

17.
EFSA J ; 17(2): e05596, 2019 Feb.
Article in English | MEDLINE | ID: mdl-32626222

ABSTRACT

An increase in confirmed human salmonellosis cases in the EU after 2014 triggered investigation of contributory factors and control options in poultry production. Reconsideration of the five current target serovars for breeding hens showed that there is justification for retaining Salmonella Enteritidis, Salmonella Typhimurium (including monophasic variants) and Salmonella Infantis, while Salmonella Virchow and Salmonella Hadar could be replaced by Salmonella Kentucky and either Salmonella Heidelberg, Salmonella Thompson or a variable serovar in national prevalence targets. However, a target that incorporates all serovars is expected to be more effective, as the most relevant serovars in breeding flocks vary between Member States (MSs) and over time. Achievement of a 1% target for the current target serovars in laying hen flocks is estimated to reduce the number of human salmonellosis true cases by 254,400 CrI95[98,540; 602,700] compared to the situation in 2016. This translates to a reduction of 53.4% CrI95[39.1; 65.7] of the layer-associated human salmonellosis true cases and of 6.2% of the overall human salmonellosis true cases in the 23 MSs included in the attribution modelling. A review of risk factors for Salmonella in laying hens revealed that the overall evidence points to a lower occurrence in non-cage compared to cage systems. A conclusion on the effect of outdoor access, or on the impact of the shift from conventional to enriched cages, could not be reached. A similar review for broiler chickens concluded that the evidence that outdoor access affects the occurrence of Salmonella is inconclusive. There is conclusive evidence that increased stocking density, larger farms and stress result in increased occurrence, persistence and spread of Salmonella in laying hen flocks.
Based on scientific evidence, an impact of Salmonella control programmes, apart from general hygiene procedures, on the prevalence of Campylobacter in broiler flocks at the holding and on broiler meat at the end of the slaughter process is not expected.
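The point estimates in the abstract imply totals that can be backed out with simple arithmetic. The sketch below only manipulates the quoted central values (the CrI95 intervals are ignored) and is a reading aid, not part of the attribution model.

```python
# Back out the implied case totals from the abstract's point estimates.
reduction = 254_400       # estimated fewer true cases per year
layer_share = 0.534       # 53.4% of layer-associated true cases
overall_share = 0.062     # 6.2% of all true cases (23 MSs)

# If `reduction` equals 53.4% of layer-associated cases and 6.2% of
# all cases, the implied totals are:
layer_total = reduction / layer_share      # ~0.48 million
overall_total = reduction / overall_share  # ~4.1 million
print(round(layer_total), round(overall_total))
```

This makes the scale explicit: the quoted percentages imply roughly half a million layer-associated true cases and about four million true cases overall in the 23 MSs modelled.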

18.
EFSA J ; 17(Suppl 1): e170714, 2019 Jul.
Article in English | MEDLINE | ID: mdl-32626451

ABSTRACT

This paper focusses on biological hazards at the global level and considers the challenges to risk assessment (RA) from a One Health perspective. Two topics - vector-borne diseases (VBD) and antimicrobial resistance (AMR) - are used to illustrate the challenges ahead and to explore the opportunities that new methodologies such as next-generation sequencing can offer. Globalisation brings complexity and introduces drivers for infectious diseases. Cooperation and the application of an integrated RA approach - one that takes into consideration food, farming and production systems, including social and environmental factors - are recommended. Also needed are methodologies to identify emerging risks at a global level and to propose prevention strategies. AMR is one of the biggest threats to human health in the infectious disease environment. Whereas new genomic typing techniques such as whole genome sequencing (WGS) provide further insights into the mechanisms of spread of resistance, the role of the environment is not fully elucidated, nor is the role of plants as potential vehicles for the spread of resistance. Historical trends and recent experience indicate that (re-)emergence and/or further spread of VBD within the EU is a matter of when rather than if. Standardised and validated vector monitoring programmes need to be implemented at an international level for continuous surveillance and assessment of potential threats. There are benefits to using WGS, such as a quicker and better response to outbreaks and additional evidence for source attribution. However, significant challenges need to be addressed to fully realise these benefits, including method standardisation and validation, barriers to data sharing, and establishing the epidemiological capacity for cluster triage and response.

19.
EFSA J ; 16(12): e05482, 2018 Dec.
Article in English | MEDLINE | ID: mdl-32625776

ABSTRACT

Studies evaluating the safety and efficacy of lactic and acetic acids to reduce microbiological surface contamination on pork carcasses pre-chill and pork meat cuts post-chill were assessed. Lactic acid treatments consisted of 2-5% solutions at temperatures of up to 80°C applied to carcasses by spraying, or up to 55°C applied to cuts by spraying or dipping. Acetic acid treatments consisted of 2-4% solutions at temperatures of up to 40°C applied to carcasses by spraying or to cuts by spraying or dipping. The maximum treatment duration was 30 s. The Panel concluded that: [1] the treatments are of no safety concern, provided that the substances comply with the European Union specifications for food additives; [2] spraying of pork carcasses pre-chill with lactic acid was efficacious compared to an untreated control, but based on the available data, the Panel could not conclude whether lactic acid was more efficacious than water treatment when spraying pork carcasses pre-chill or pork meat cuts post-chill. The Panel concluded that dipping of pork meat cuts post-chill in lactic acid was more efficacious than water treatment; however, it could not conclude on the efficacy of acetic acid treatment of pork carcasses pre-chill and/or pork meat cuts post-chill; [3] the potential selection and emergence of bacteria with reduced susceptibility to biocides and/or resistance to therapeutic antimicrobials linked to the use of the substances is unlikely as long as Good Hygienic Practices are implemented; and [4] the release of both organic acids is not of concern for the environment, assuming that wastewaters released by the slaughterhouses are treated, if necessary, to counter the potentially low pH caused by lactic or acetic acid, in compliance with local rules.

20.
EFSA J ; 16(1): e05134, 2018 Jan.
Article in English | MEDLINE | ID: mdl-32760461

ABSTRACT

Food safety criteria for Listeria monocytogenes in ready-to-eat (RTE) foods have been applied from 2006 onwards (Commission Regulation (EC) 2073/2005). Still, human invasive listeriosis was reported to increase over the period 2009-2013 in the European Union and European Economic Area (EU/EEA). Time series analysis for the 2008-2015 period in the EU/EEA indicated an increasing trend in the monthly notified incidence rate of confirmed human invasive listeriosis in the over-75 age groups and in the female age group between 25 and 44 years old (probably related to pregnancies). A conceptual model was used to identify factors in the food chain as potential drivers for L. monocytogenes contamination of RTE foods and listeriosis. Factors were related to the host (i. population size of the elderly and/or susceptible people; ii. underlying condition rate), the food (iii. L. monocytogenes prevalence in RTE food at retail; iv. L. monocytogenes concentration in RTE food at retail; v. storage conditions after retail; vi. consumption), the national surveillance systems (vii. improved surveillance), and/or the bacterium (viii. virulence). The factor considered likely to be responsible for the increasing trend in cases is the increased size of the elderly and susceptible population, except for the 25-44 female age group. For the increased incidence rates and cases, the likely factor is the increased proportion of susceptible persons in the age groups over 45 years old, for both genders. Quantitative modelling suggests that more than 90% of invasive listeriosis is caused by ingestion of RTE food containing > 2,000 colony forming units (CFU)/g, and that one-third of cases are due to growth in the consumer phase. Awareness should be increased among stakeholders, especially in relation to susceptible risk groups. Innovative methodologies, including whole genome sequencing (WGS) for strain identification and monitoring of trends, are recommended.
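The finding that high-concentration servings dominate the risk can be illustrated with a standard exponential dose-response model. This is a sketch under assumed values: the parameter `r` and the serving size below are illustrative, not the values used in the opinion's quantitative model.

```python
import math

def p_illness(r, cfu_per_g, serving_g=50):
    """Exponential dose-response: P(ill) = 1 - exp(-r * dose).
    `r` (per-CFU probability) and `serving_g` are assumed values."""
    dose = cfu_per_g * serving_g
    return 1.0 - math.exp(-r * dose)

r = 1e-9  # illustrative r for a susceptible subgroup (assumption)
low = p_illness(r, 100)     # serving at the 100 CFU/g legal limit
high = p_illness(r, 2000)   # serving at the 2,000 CFU/g threshold
print(high / low)           # risk ratio, ~ the dose ratio at small r
```

Because the per-serving probability is nearly linear in dose at these low values of `r`, a serving at 2,000 CFU/g carries roughly 20 times the risk of one at the 100 CFU/g limit, which is why the modelling attributes most invasive cases to the highest-concentration servings.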
