Results 1 - 20 of 23
1.
Animals (Basel) ; 13(20)2023 Oct 11.
Article in English | MEDLINE | ID: mdl-37893894

ABSTRACT

Post-harvest Salmonella mitigation techniques are insufficient at addressing Salmonella harbored in cattle lymph nodes, necessitating the exploration of pre-harvest alternatives that reduce Salmonella prior to dissemination to the lymph nodes. A 2 × 2 unbalanced experiment was conducted to determine the effectiveness of pre-harvest treatments applied to the pen surface for Salmonella mitigation in cattle. Treatments included manure slurry intended to mimic pen run-off water (n = 4 pens), a bacteriophage cocktail (n = 4), a combination of both treatments (n = 5), and a control group (n = 5) that received no treatment. Environmental samples from 18 feedlot pens and fecal grabs, hide swabs, and subiliac lymph nodes from 178 cattle were collected and selectively enriched for Salmonella, and Salmonella isolates were sequenced. The combination treatment was most effective at reducing Salmonella, and prevalence was significantly lower than in the control group for rump swabs on Days 14 and 21. The treatment impact on Salmonella in the lymph nodes could not be determined due to low prevalence. The reduction of Salmonella on cattle hides suggests that bacteriophage or water treatments applied to the feedlot pen surface may reduce Salmonella populations in cattle during the pre-harvest period, resulting in reduced contamination during slaughter and processing.
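Treatment-versus-control prevalence contrasts like the Day-14/21 hide-swab comparison are often tested with Fisher's exact test on a 2 × 2 table. A minimal stdlib sketch of the one-sided test follows; the counts are entirely hypothetical, since the abstract does not report the underlying numbers.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    probability of observing a or fewer positives in the first group
    under the hypergeometric null of no treatment effect."""
    n = a + b + c + d
    row1 = a + b          # animals sampled in the treated pens
    col1 = a + c          # total positives across both groups
    denom = comb(n, col1)
    return sum(comb(row1, k) * comb(n - row1, col1 - k)
               for k in range(a + 1)) / denom

# Hypothetical counts (NOT the study's data): 2/14 positive swabs in
# combination-treated pens vs. 9/14 in control pens.
p = fisher_one_sided(2, 12, 9, 5)
print(f"one-sided Fisher P = {p:.4f}")  # ≈ 0.0092
```

The two-sided version and continuity-corrected alternatives exist, but the one-sided cumulative hypergeometric sum above is the simplest form to verify by hand.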

2.
J Food Prot ; 86(4): 100062, 2023 04.
Article in English | MEDLINE | ID: mdl-37005037

ABSTRACT

Salmonella prevalence in bovine lymph nodes (LNs) varies due to seasonality, geographic location, and feedyard environment. The objectives of this study were to (1) establish prevalence rates of Salmonella in environmental components (trough water, pen soil, individual feed ingredients, prepared rations, and fecal samples) and LNs from weaning to finish in three feeding locations, and (2) characterize recovered salmonellae. Calves (n = 120) were raised at the Texas A&M University McGregor Research Center; in lieu of beginning the backgrounding/stocker phase, thirty weanling calves were harvested. Of the remaining ninety calves, thirty were retained at McGregor and sixty were transported to commercial feeding operations (Location A or B; thirty calves each). Locations A and B have historically produced cattle with relatively "low" and "high" rates of Salmonella-positive LNs, respectively. Ten calves per location were harvested at the conclusion of (1) the backgrounding/stocker phase, (2) 60 d on feed, and (3) 165 d on feed. On each harvest day, peripheral LNs were excised. Environmental samples were obtained from each location before and after each phase, and every 30 d during the feeding period. In line with previous work, no Salmonella-positive LNs were recovered from cattle managed at Location A. Salmonella-positive LNs (30%) and environmental components (41%) were most commonly recovered from Location B. Of 7 and 36 total serovars recovered from Salmonella-positive LN and environmental samples, respectively, Anatum was identified most frequently. Data from this study provide insight into Salmonella prevalence differences among feeding locations and the possible influence of environmental and/or management practices at each. Such information can be used to shape industry best practices to reduce Salmonella prevalence in cattle feeding operations, resulting in a decreased prevalence of Salmonella in LNs, and thus, minimizing risks to human health.
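Identifying the most frequent serovar (Anatum, in this study) from a set of typed isolates is a straightforward tally. A sketch with an invented isolate list, purely for illustration:

```python
from collections import Counter

# Hypothetical serovar calls from sequenced isolates (illustrative
# only; not the study's isolate-level data).
isolates = ["Anatum", "Anatum", "Montevideo", "Anatum", "Kentucky",
            "Montevideo", "Anatum", "Muenchen"]

counts = Counter(isolates)
for serovar, n in counts.most_common():
    print(f"{serovar}: {n} ({100 * n / len(isolates):.0f}%)")
```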


Subject(s)
Cattle Diseases , Salmonella , Humans , Animals , Cattle , Weaning , Texas , Lymph Nodes , Prevalence , Cattle Diseases/epidemiology , Animal Feed
3.
Appl Environ Microbiol ; 89(4): e0003323, 2023 04 26.
Article in English | MEDLINE | ID: mdl-37022263

ABSTRACT

Salmonella can persist in the feedlot pen environment, acting as a source of transmission among beef cattle. Concurrently, cattle that are colonized with Salmonella can perpetuate contamination of the pen environment through fecal shedding. To study these cyclical dynamics, pen environment and bovine samples were collected for a 7-month longitudinal comparison of Salmonella prevalence, serovar, and antimicrobial resistance profiles. These samples included composite environment, water, and feed from the feedlot pens (n = 30) and cattle (n = 282) feces and subiliac lymph nodes. Salmonella prevalence across all sample types was 57.7%, with the highest prevalence in the pen environment (76.0%) and feces (70.9%). Salmonella was identified in 42.3% of the subiliac lymph nodes. Based on a multilevel mixed-effects logistic regression model, Salmonella prevalence varied significantly (P < 0.05) by collection month for most sample types. Eight Salmonella serovars were identified, and most isolates were pansusceptible, except for a point mutation in the parC gene, associated with fluoroquinolone resistance. There was a proportional difference in serovars Montevideo, Anatum, and Lubbock comparing the environment (37.2, 15.9, and 11.0%, respectively), fecal (27.5, 22.2, and 14.6%, respectively), and lymph node (15.6, 30.2, and 17.7%, respectively) samples. This suggests that the ability of Salmonella to migrate from the pen environment to the cattle host-or vice versa-is serovar specific. The presence of certain serovars also varied by season. Our results provide evidence that Salmonella serovar dynamics differ when comparing environment and host; therefore, developing serovar-specific preharvest environmental Salmonella mitigation strategies should be considered. IMPORTANCE Salmonella contamination of beef products, specifically from the incorporation of bovine lymph nodes into ground beef, remains a food safety concern. 
Current postharvest Salmonella mitigation techniques do not address Salmonella bacteria that are harbored in the lymph nodes, nor is it well understood how Salmonella invades the lymph nodes. Alternatively, preharvest mitigation techniques that can be applied to the feedlot environment, such as moisture applications, probiotics, or bacteriophage, may reduce Salmonella before dissemination into cattle lymph nodes. However, previous research conducted in cattle feedlots includes study designs that are cross-sectional, are limited to point-in-time sampling, or are limited to sampling of the cattle host, making it difficult to assess the Salmonella interactions between environment and hosts. This longitudinal analysis of the cattle feedlot explores the Salmonella dynamics between the feedlot environment and beef cattle over time to determine the applicability of preharvest environmental treatments.


Subject(s)
Cattle Diseases , Salmonella enterica , Animals , Cattle , Serogroup , Longitudinal Studies , Prevalence , Cross-Sectional Studies , Cattle Diseases/epidemiology , Cattle Diseases/microbiology , Salmonella , Feces/microbiology , Lymph Nodes/microbiology
4.
J Contam Hydrol ; 247: 103988, 2022 05.
Article in English | MEDLINE | ID: mdl-35303484

ABSTRACT

With the growing global use of methanol as a fuel additive and its extensive use in other industrial processes, there is the potential for unintended releases and spills into soils and aquifers. In these subsurface systems methanol is likely to be readily biodegraded; however, degradation may produce by-products, most importantly methane, which can create explosion hazards, and volatile fatty acids (VFAs), which cause aesthetic issues for groundwater. In this study, the formation of these potentially harmful by-products of methanol biodegradation was investigated in natural sand and silt sediments using microcosms amended with neat (100%) methanol to concentrations ranging from 100 to 100,000 ppm. To assess the rate of degradation and by-product formation, water and headspace samples were collected and analyzed for methanol, VFAs (including acetic, butyric, and propionic acids), cation (metal) concentrations (Al, Ca, Fe, K, Mg, Mn, and Na), microbial community structure and activity, headspace pressure, gas composition (CH4, CO2, O2, and N2), and compound-specific isotopes. Methanol was completely biodegraded in sand and silt at concentrations up to 1,000 ppm and 10,000 ppm, respectively. Degradation was initially aerobic, consuming oxygen (O2) and producing carbon dioxide (CO2). When O2 was depleted, the microcosms became anaerobic and a lag in methanol degradation occurred (ranging from 41 to 87 days). Following this lag, methanol was preferentially degraded to acetate, coupled with CO2 reduction. Microcosms with high methanol concentrations (10,000 ppm) were driven further down the redox ladder and exhibited fermentation, leading to concurrent acetate and methane (CH4) generation. In all cases acetate was an intermediate product, further degraded to the final products CH4 and CO2. Carbonates present in the microcosm sediments helped buffer VFA acidification and replenished CO2.
Methane generation in the anaerobic microcosms was short-lived but temporarily reached rates as high as 13 mg kg^-1 day^-1. Under the conditions of these experiments, methanol degradation occurred rapidly after initial lag periods, which were a function of methanol concentration and sediment type. Our experiment also showed that methanol degradation and associated methane production can occur in a stepwise fashion.
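The lag-then-degradation pattern described above is often summarized as first-order decay that begins only after a lag period. A minimal sketch of that model; the rate constant and lag below are illustrative values, not parameters fitted to this study's data:

```python
from math import exp

def methanol_conc(t_days, c0_ppm, lag_days, k_per_day):
    """Methanol concentration under a simple lag + first-order decay
    model: constant at c0 until the lag ends, then exponential decay."""
    if t_days <= lag_days:
        return c0_ppm
    return c0_ppm * exp(-k_per_day * (t_days - lag_days))

# Illustrative parameters: 1,000 ppm initial methanol, a 41-day lag
# (the lower bound reported above), and an assumed k of 0.05/day.
for t in (0, 41, 60, 120):
    print(t, round(methanol_conc(t, 1000.0, 41.0, 0.05), 1))
```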


Subject(s)
Groundwater , Methanol , Acetates , Carbon Dioxide/analysis , Fatty Acids, Volatile , Groundwater/chemistry , Methane/metabolism , Sand
5.
Nutr Metab (Lond) ; 18(1): 65, 2021 Jun 24.
Article in English | MEDLINE | ID: mdl-34167568

ABSTRACT

BACKGROUND: Impaired hepatic fatty acid metabolism and persistent mitochondrial dysfunction are phenomena commonly associated with liver failure. Decreased serum levels of L-carnitine, an amino acid derivative involved in fatty-acid and energy metabolism, have been reported in severe burn patients. The current study aimed to evaluate the effects of L-carnitine supplementation on mitochondrial damage and other hepatocyte injuries following severe burns and the related mechanisms. METHODS: Serum carnitine and other indicators of hepatocytic injury, including AST, ALT, LDH, TG, and OCT, were analyzed in severe burn patients and healthy controls. A burn model was established on the back skin of rats; thereafter, carnitine was administered, and serum levels of the above indicators were evaluated along with Oil Red O and TUNEL staining, transmission electron microscopy, and assessment of mitochondrial membrane potential and carnitine palmitoyltransferase 1 (CPT1) activity and expression levels in the liver. HepG2 cells pretreated with the CPT1 inhibitor etomoxir were treated with or without carnitine for 24 h. Next, the above indicators were examined, and apoptotic cells were analyzed via flow cytometry. High-throughput sequencing of rat liver tissues identified several differentially expressed genes (Fabp4, Acacb, Acsm5, and Pnpla3), which were confirmed using RT-qPCR. RESULTS: Substantially decreased serum levels of carnitine and increased levels of AST, ALT, LDH, and OCT were detected in severe burn patients and the burn model rats. Accumulation of TG, evident mitochondrial shrinkage, altered mitochondrial membrane potential, decreased ketogenesis, and reduced CPT1 activity were detected in the liver tissue of the burned rats. Carnitine administration recovered CPT1 activity and improved all indicators related to cellular and fatty acid metabolism and mitochondrial injury.
Inhibition of CPT1 activity with etomoxir induced hepatocyte injuries similar to those in burn patients and burned rats; carnitine supplementation restored CPT1 activity and ameliorated these injuries. The expression levels of the differentially expressed genes Fabp4, Acacb, Acsm5, and Pnpla3 in the liver tissue from burned rats and in etomoxir-treated hepatocytes were also restored by treatment with exogenous carnitine. CONCLUSION: Exogenous carnitine protects hepatocytes against severe burn-induced cellular injury, impaired fatty-acid metabolism, and mitochondrial dysfunction by restoring CPT1 activity.
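RT-qPCR confirmation of differentially expressed genes is conventionally quantified with the 2^-ΔΔCt method (Livak and Schmittgen). A minimal sketch with invented Ct values, since the study's actual Ct data are not given here:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method: normalize the target
    gene's Ct to a reference gene in each condition, then compare."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Invented Ct values: the target gene crosses threshold 2 cycles later
# (relative to the reference gene) in burned tissue than in control
# tissue, i.e. 4-fold down-regulation.
print(fold_change(26, 20, 24, 20))  # 0.25
```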

6.
J Food Prot ; 84(1): 80-86, 2021 Jan 01.
Article in English | MEDLINE | ID: mdl-32853371

ABSTRACT

ABSTRACT: Managing the presence of Salmonella in ground beef has been an ongoing challenge for the beef industry. Salmonella prevalence can vary regionally, seasonally, and within the animal, making the development of interventions difficult. The objective of this study was to assess the efficacy of an autogenous Salmonella vaccine in mitigating Salmonella in lymph nodes (LNs) of feedlot cattle. An autogenous vaccine was developed using the most common Salmonella enterica serovars (Salmonella Kentucky, Salmonella Anatum, Salmonella Muenchen, Salmonella Montevideo, and Salmonella Mbandaka) identified from cattle managed at a South Texas feedlot with historically high Salmonella prevalence. Fifty-five heifers were selected for even distribution across five groups: (i) BASE, which received no autogenous vaccinations and were harvested after the stocker stage, (ii) CNTRL, which received no autogenous vaccinations, (iii) FARM, which received autogenous vaccinations at the ranch only, (iv) SPLIT, which received autogenous vaccinations at both the ranch and feedlot, and (v) YARD, which received vaccinations at the feedlot only. One heifer each from the BASE and CNTRL groups did not complete the study. All treatment groups except BASE were harvested after reaching market weight. Left and right superficial cervical and subiliac LNs from each carcass were collected and analyzed for Salmonella presence, and positive samples were serotyped. No salmonellae were recovered from LNs derived from BASE, FARM, SPLIT, or YARD groups. Cattle in the BASE group were expected to have a low occurrence of Salmonella based on previous research. However, the percentage of Salmonella-positive animals in the CNTRL group was 20.0% (2 of 10), which is lower than expected based on historical data from the same feeding location. 
There could be several causes of decreased Salmonella presence in the LNs of control cattle, creating an opportunity for future investigation into the development of preharvest interventions to combat Salmonella in feedlots.


Subject(s)
Autovaccines , Cattle Diseases , Salmonella Infections, Animal , Animals , Cattle , Cattle Diseases/prevention & control , Feces , Female , Lymph Nodes , Salmonella , Salmonella Infections, Animal/prevention & control , Texas , Vaccination
7.
Meat Sci ; 172: 108319, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33022542

ABSTRACT

Beef carcasses (n = 90; U.S. Choice) were selected to fill a 3 ribeye area (REA: Small, Medium, Large) × 3 carcass weight (CW: Light, Intermediate, Heavy) scheme to assess the palatability of steaks cut by portion thickness (PT; 3.18 cm) and portion weight (PW; 340 g). Significant interactions revealed trends for steaks from the Small REA, regardless of CW, to have among the lowest shear-force values. For PT steaks, a significant interaction for overall liking revealed no differences between Small and Medium REA across all CW categories, but steaks from the Large REA and Light CW differed (P < 0.05) from the other two CW categories. For PW steaks, overall liking and tenderness liking scores were higher (P < 0.05) for the Small REA compared with the other categories, whereas CW did not influence any palatability trait. REA and CW do impact beef steak palatability, though steaks from all combinations were "very tender" and highly acceptable from a palatability standpoint.


Subject(s)
Consumer Behavior , Portion Size , Red Meat/analysis , Adult , Aged , Animals , Cattle , Cooking , Female , Humans , Male , Middle Aged , Prohibitins , Shear Strength
8.
Cureus ; 12(8): e10096, 2020 Aug 28.
Article in English | MEDLINE | ID: mdl-33005517

ABSTRACT

Delirium is a multifactorial syndrome described as an acute brain dysfunction commonly seen in post-cardiac surgery patients. The prevalence of postoperative delirium (POD) ranges from 11.4% to 55%, depending on the diagnostic tool and type of study. The Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) and the Intensive Care Delirium Screening Checklist (ICDSC) are the two tools most widely used and recommended by the Society of Intensive Care Medicine. Annual delirium-related healthcare costs in the United States (US) range from 6.6 to 20.4 billion USD in ICU patients. However, delirium in the cardiac ICU (CICU) is underdiagnosed and warrants a vigorous workup. The risk factors for delirium in the CICU can be classified as modifiable, non-modifiable, and cardiac surgical causes. After cardiac procedures, delirium is associated with increased mortality, longer hospital stays, loss of functional independence, and higher hospital costs, and it is an independent predictor of death 10 years postoperatively. Non-pharmacological measures such as avoiding delirium-risk medications, early physical rehabilitation, occupational therapy, and sleep improvement strategies have shown significant benefits in decreasing delirium. Pharmacological options for use in the CICU are limited, and further studies on this topic are needed.

9.
Cureus ; 12(9): e10600, 2020 Sep 22.
Article in English | MEDLINE | ID: mdl-33123420

ABSTRACT

Two well-known types of muscular dystrophy are Duchenne muscular dystrophy (DMD) and Becker muscular dystrophy. This article focuses on DMD, an X-linked recessive disorder that primarily affects children from around age four, with a life span shortened to at most about 40 years. Mutations in the dystrophin gene produce a defective dystrophin protein, the primary cause of the disease's pathophysiology: down-regulation of dystrophin in cardiac and skeletal muscle leads to weak, fibrotic muscles. The disease is currently untreatable, and most patients die of cardiac failure in their late 30s. This review presents current treatment options based on studies conducted over the last five years; we used the PubMed database to analyze and review the most important investigations. We also included an analysis of induced pluripotent stem cell therapy versus gene therapy using the mdx mouse model. Results in mdx mouse models to date are promising, and we are excited about the potential of further clinical human trials.

10.
Cureus ; 12(9): e10279, 2020 Sep 06.
Article in English | MEDLINE | ID: mdl-33042714

ABSTRACT

Eosinophilic granulomatosis with polyangiitis (EGPA) is a rare autoimmune systemic necrotizing vasculitis of blood vessels that often presents with hypereosinophilia. Cardiac involvement in EGPA is a central part of the disease process and directly correlates with patient mortality, making the evaluation and treatment of cardiac anomalies vital. The frequency with which cardiac involvement occurs makes early diagnosis crucial in all patients with EGPA, and early treatment can reverse cardiac involvement or induce remission. Several studies have shown that cardiac magnetic resonance (CMR) imaging is the most sensitive and earliest indicator of cardiovascular involvement in EGPA. CMR routinely outperforms other diagnostic techniques, such as echocardiography and computed tomography angiography (CTA), in detecting cardiac anomalies and should be part of the standardized assessment of all patients with EGPA. CMR is also a non-invasive diagnostic tool that can outperform biopsy in detecting cardiac involvement in EGPA, and it is valuable for monitoring disease progression during treatment. Although long-term studies have yet to confirm these benefits, the studies available today provide ample evidence that CMR imaging, used from initial diagnosis and throughout the entire course of disease management, could ultimately help reduce the mortality rates currently seen in EGPA patients.

11.
Cureus ; 12(9): e10280, 2020 Sep 06.
Article in English | MEDLINE | ID: mdl-33042715

ABSTRACT

The gut microbiota in humans communicates with the central nervous system through the gut-brain axis, and this communication is bidirectional; the vagus nerve forms the backbone of the axis. Research on the functionality of the gut-brain axis exists, but analysis of the diversity and stratification of the gut microbiota is in its infancy. Studies examining the role of the gut microbiota in the efficacy of selective serotonin reuptake inhibitors (SSRIs) for depression management have yielded several promising findings. A set of quantifiable microbial markers has been consistently identified in the stools of depressed subjects and can be used to gauge disease severity, with certain bacterial species emerging as a common thread among the bacteria therapeutic for depression management. Evidence that SSRIs depend on the vagus nerve to exert their therapeutic effects further strengthens the case for the vagus nerve's central role in the gut-brain axis, which is vital to any constructive alteration of the gut microbiota. This review focuses on the diversity of the gut microbiota and its link with depression.

12.
Cureus ; 12(8): e9742, 2020 Aug 14.
Article in English | MEDLINE | ID: mdl-32944457

ABSTRACT

Rhabdomyolysis is characterized by rapid muscle breakdown and the release of intracellular muscle components into the circulation. Acute renal injury is its most common and most lethal complication. The current literature emphasizes the importance of preventing this complication and of evaluating the benefits of sodium bicarbonate and mannitol in its prevention. A PubMed database search for the keywords "Rhabdomyolysis," "Sodium bicarbonate use in rhabdomyolysis," and "Mannitol use in rhabdomyolysis," together with a Medical Subject Headings (MeSH) search using the keyword "Rhabdomyolysis; Acute Kidney Injury (Subheading-Prevention and control)," generated 10,005 articles overall. After a thorough application of inclusion/exclusion criteria, 37 relevant studies were selected for this literature review. The analysis demonstrates that aggressive early volume resuscitation with normal saline should remain the principal focus of therapy and that the use of sodium bicarbonate and mannitol in practice is not entirely justified. This article also emphasizes the need for future research on this topic and provides recommendations for such research.

13.
Transl Anim Sci ; 4(3): txaa107, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32856015

ABSTRACT

The objectives of the study were to evaluate if sorting beef carcasses at the packer level by loin muscle (LM) area, using instrument grading technology, would increase the consistency of three boxed beef products for the foodservice and retail sectors of the industry. U.S. Department of Agriculture (USDA) Choice beef sides (n = 100) and USDA Select sides (n = 100) were selected and stratified into five LM area categories (±2.9 cm2): 1) 77.4, 2) 83.9, 3) 90.3, 4) 96.8, and 5) 103.2 cm2. Beef lip-on ribeyes and boneless strip loins were obtained from USDA Choice sides and full, partially defatted tenderloins were obtained from USDA Select sides. Subprimals were scanned with a portioner that captured visual images and dimensional analyses of each subprimal, and data were analyzed by the software to determine multiple portioning outcomes for each subprimal. Portioning data were generated for each subprimal based on a variety of targeted portion weights (ribeye and strip loin steaks = 340.2 g; tenderloin steak = 170.1 g), as well as various portion thicknesses (ribeye and strip loin steaks = 31.8 mm; tenderloin steak = 44.5 and 50.8 mm). Subprimal utility varied across targeted portion weights and thicknesses within each LM area category. For the ribeyes and strip loins, optimal portion weight and thickness combinations were observed more frequently in LM area categories 1 and 2 than for the three larger LM area categories. Analysis of data for tenderloins revealed that LM area categories played a lesser role in identifying optimization of steak portion weight and thickness combinations. Findings demonstrate that creating categories of beef subprimals based on LM area as opposed to subprimal weight might provide a unique sorting method that would improve boxed beef product consistency and uniformity for foodservice and retail sectors.
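The portion-count side of such subprimal optimization is simple arithmetic: how many whole target-weight steaks a subprimal yields, and how much remains as trim. A sketch using the study's 340.2 g strip loin target with a hypothetical subprimal weight (the portioner software's actual optimization also accounts for thickness and shape, which this omits):

```python
def portion_yield(subprimal_g, target_portion_g):
    """Number of whole target-weight portions a subprimal yields,
    plus the leftover grams of trim."""
    n = int(subprimal_g // target_portion_g)
    leftover = subprimal_g - n * target_portion_g
    return n, round(leftover, 1)

# Hypothetical 4,535 g strip loin cut to the 340.2 g steak target.
print(portion_yield(4535.0, 340.2))  # (13, 112.4)
```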

14.
Cureus ; 12(12): e11916, 2020 Dec 05.
Article in English | MEDLINE | ID: mdl-33425502

ABSTRACT

Bowel restoration following Hartmann's procedure (HP) remains a topic of discussion and innovation. This article seeks to compare the outcomes of conventional reversal approaches, open surgery (OS) and conventional laparoscopy (CL), with the single-port laparoscopic reversal (SPLR) approach, to evaluate whether SPLR is a feasible alternative to OS or CL. A PubMed search using keywords yielded 5,750 articles. After applying the inclusion/exclusion criteria, 40 articles of relevance were reviewed and their endpoints considered. These included 13 systematic reviews and 27 observational studies, three of which identified themselves as retrospective or comparative studies. The analysis showed overwhelming support for CL over OS as the choice for HP reversal. Studies comparing SPLR to CL showed SPLR to be a safe and feasible alternative, given its significantly shorter operating and hospitalization times and lower complication rates.

15.
Wounds ; 32(11): E50-E54, 2020 Nov.
Article in English | MEDLINE | ID: mdl-33465040

ABSTRACT

INTRODUCTION: Tibial osteomyelitis is a common complication of bone tissue trauma. Obtaining good soft tissue coverage and effective infection management are key to treating chronic osteomyelitis of the tibia accompanied by bone defects and bone exposure. The pedicled posterior tibial artery perforator layered fasciocutaneous flap can be used to repair soft tissue defects and can serve as a long-term, localized anti-infective measure. CASE REPORT: A 54-year-old male presented with an ulcer and purulent discharge at the left anterior tibia, along with fever, 28 years after complete healing of the scar site. The patient received debridement and negative pressure wound therapy (NPWT) in a hospital setting. After he presented to the authors' department, there was difficulty in closing the exposed bone marrow cavity. In addition to systemic intravenous antibiotics, multiple debridements and NPWT were used to effectively remove necrotic tissue and control infection. Afterward, the pedicled posterior tibial artery perforator layered fasciocutaneous flap was designed to fill the bone marrow cavity and simultaneously cover and seal the wound of bone exposure and soft tissue defect. The layered fasciocutaneous flap survived well after the operation, and no recurrence of osteomyelitis was found. CONCLUSION: Debridement with NPWT can be an effective treatment for wound bed preparation in advance of surgery, and the pedicled posterior tibial artery perforator layered fasciocutaneous flap can be used for the treatment of such combined soft tissue defects.


Subject(s)
Fascia/transplantation , Osteomyelitis/surgery , Surgical Flaps/blood supply , Tibia/surgery , Tibial Arteries/transplantation , Chronic Disease , Debridement , Humans , Male , Middle Aged , Negative-Pressure Wound Therapy , Tibia/microbiology
16.
J Food Prot ; 82(2): 310-315, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30682264

ABSTRACT

Foodborne salmonellosis has been traced to undercooked ground beef and other beef products in the past, and peripheral lymph node (LN) presence in the fatty tissues of beef carcasses is one possible source of Salmonella contamination. Researchers have previously reported higher rates of Salmonella prevalence in LNs from cattle raised and harvested in Mexico compared with rates typically observed from cattle harvested in the United States. With cattle of Mexican origin comprising the majority of U.S. live cattle imports, this study was designed to determine whether Salmonella prevalence in LNs differed (i) between cattle of Mexican and U.S. origins when exposed to the same South Texas feeding operation and (ii) between warm and cool seasons. To meet these objectives, paired (left and right sides) subiliac LNs (n = 800 LNs; n = 400 pooled samples) were collected from 100 carcasses per origin (Mexico and United States) per season (cool, December to January; warm, July to September). Overall, Salmonella prevalence in LN samples was 52.0% (208 of 400). No difference (P = 0.4836) was seen in Salmonella prevalence as a function of origin, with 54.0% (108 of 200) and 50.0% (100 of 200) of LN samples returning Salmonella-positive results from cattle of Mexican and U.S. origin, respectively. Salmonella prevalence differed (P = 0.0354) between seasons, with 46.5% (93 of 200) of cool and 57.5% (115 of 200) of warm season samples returning Salmonella-positive results. Serotyping of PCR-confirmed positive samples identified 14 different serovars, with Cerro (21.6%), Anatum (19.7%), Muenchen (17.8%), Montevideo (14.4%), and Kentucky (12.0%) comprising the majority.
These results suggest that factors other than cattle origin may influence Salmonella prevalence rates in bovine LNs and that additional research is needed to better understand the role of environmental and management-related factors in Salmonella prevalence in bovine LNs.
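The reported overall and seasonal prevalences follow directly from the counts above, and the seasonal contrast can be approximated with a two-proportion z-test. This is a simpler procedure than whatever model produced the paper's P = 0.0354, so the P value below differs slightly; it is a sketch, not a reproduction of the study's analysis.

```python
from math import sqrt, erf

def prevalence(pos, total):
    """Prevalence as a percentage."""
    return 100 * pos / total

def two_prop_z(pos1, n1, pos2, n2):
    """Two-sided two-proportion z-test using the pooled estimate.
    Returns (z statistic, two-sided p-value)."""
    p1, p2 = pos1 / n1, pos2 / n2
    pooled = (pos1 + pos2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # standard normal CDF via erf
    p_val = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_val

print(prevalence(208, 400))             # 52.0% overall
z, p = two_prop_z(93, 200, 115, 200)    # cool vs. warm season
print(round(z, 2), round(p, 3))
```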


Subject(s)
Abattoirs , Cattle Diseases , Lymph Nodes/microbiology , Salmonella Infections, Animal , Salmonella/isolation & purification , Animals , Cattle , Cattle Diseases/diagnosis , Cattle Diseases/epidemiology , Kentucky , Mexico , Prevalence , Salmonella Infections, Animal/diagnosis , Salmonella Infections, Animal/epidemiology , Texas , Zoonoses
17.
Meat Sci ; 146: 1-8, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30064052

ABSTRACT

Palatability, color, and aroma of steaks derived from subprimals aged for 14 d at conventional temperatures (0.0 to 1.1 °C) versus those aged for 7 d at conventional temperatures followed by 7 d at elevated temperatures (3.3 to 4.4 °C) were evaluated before and after 5-d retail display. Subprimals from the elevated-temperature aging treatment had stronger (P < 0.05) sweet and sour aromas, and the top sirloin had stronger (P < 0.05) bloody/serumy scores. After the 5-d retail display, aroma (sour, bloody/serumy) and discoloration of T-bone/Porterhouse steaks were most impacted compared with other steaks. Elevated temperature during the last 7 d of aging did not significantly improve consumer panelists' palatability scores, and no differences (P = 0.66) were seen in Warner-Bratzler shear (WBS) force between aging treatments. Using higher storage temperatures to age beef does not warrant the risk of impacting color and odor characteristics that could negatively influence consumer acceptance of retail beef.


Subject(s)
Consumer Behavior , Red Meat/standards , Temperature , Animals , Cattle , Color , Food Handling/methods , Humans , Odorants , Red Meat/classification
18.
Transl Anim Sci ; 2(1): 37-49, 2018 Feb.
Article in English | MEDLINE | ID: mdl-32704688

ABSTRACT

To continue the series that began in 1994, the National Beef Quality Audit (NBQA) - 2016 was conducted to quantify the quality status of the market cow and bull beef sector, as well as to determine improvements made in the beef and dairy industries since 2007. The NBQA-2016 was conducted from March through December of 2016, and assessed hide-on carcasses (n = 5,278), chilled carcasses (n = 4,285), heads (n = 5,720), and offal items (n = 4,800) in 18 commercial processing facilities throughout the United States. Beef cattle were predominantly black-hided; 68.0% of beef cows and 67.2% of beef bulls possessed a black hide. Holstein was the predominant type of dairy animal observed. Just over half (56.0%) of the cattle surveyed had no mud contamination on the hide, and when mud was present, 34.1% of cattle had only small amounts. Harvest floor assessments found 44.6% of livers, 23.1% of lungs, 22.3% of hearts, 20.0% of viscera, 8.2% of heads, and 5.9% of tongues were condemned. Liver condemnations were most frequently due to abscess presence; in contrast, contamination was the primary reason for condemnation of all other offal items. Of the cow carcasses surveyed, 17.4% carried a fetus at the time of harvest. As expected, mean carcass weight and loin muscle area values for bulls were heavier and larger than those of cows. The marbling scores of cull animal carcasses most frequently fell in the Slight and Traces ranges; cow carcasses manifested a greater amount of marbling on average than bull carcasses. Fat color scores showed that all carcasses surveyed had some level of yellow fat. Only 1.3% of carcasses exhibited signs of arthritic joints. Results of the NBQA-2016 indicate there are areas in which the beef and dairy industries have improved and areas that still need attention to prevent value loss in market cows and bulls.

19.
Transl Anim Sci ; 2(4): 365-371, 2018 Oct.
Article in English | MEDLINE | ID: mdl-32704719

ABSTRACT

Livestock are known to harbor Salmonella in their gastrointestinal (GI) tract and lymphatic tissues. Pathogens may be transferred from the GI tract to external carcass surfaces during normal harvest procedures but can be mitigated by antimicrobial carcass interventions. Lymph nodes (LNs) are typically encased in fat and are protected from antimicrobial carcass surface treatments, thus serving as a possible root cause of foodborne illnesses attributed to Salmonella in meat products. Members of the pork industry are committed to food safety and want to better understand Salmonella as a potential contaminant in pork products. To establish a baseline of Salmonella prevalence in porcine LNs across the United States, 21 commercial pork harvest facilities, representing northern (n = 12) or southern (n = 9) geographical regions, participated in this study. As processing volumes allowed, 25 carcasses were selected from each establishment. From each carcass, left and right superficial inguinal LNs (n = 1,014 LNs) were removed and pooled to yield one sample per animal, or n = 507 total LN samples. Salmonella prevalence rates differed (P < 0.05) between hog types in both regions. Specifically, 6.4% of market hog and 37.0% of sow samples were Salmonella positive in the northern region. This was reversed in the southern region, as 13.0% of market hog and 4.8% of sow samples were Salmonella positive. There also was a difference (P < 0.05) in prevalence rates between northern and southern regions for sows, but not market hogs (P > 0.05). Type of chilling method (conventional, blast, or other) used at each market hog facility (n = 12) was documented. In the northern region, prevalence rates of Salmonella across chilling types were as follows: 20.0%, 2.7%, and 1.3% positive samples for conventional, other, and blast chill methods, respectively. In the southern region, 20.0% of samples were positive for conventional, 0.0% for blast, and 12.0% for other chilling methods. In both regions, samples from conventionally chilled carcasses returned more (P < 0.05) positive results than any other chill method. Overall, the higher rate of Salmonella prevalence in northern sows warrants further investigation, and members of the pork industry would benefit from the identification of possible methods to address the presence of Salmonella in porcine LNs.
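The regional comparisons in the abstract above rest on simple prevalence arithmetic and a two-proportion significance test. A minimal sketch of that calculation follows; note that the per-group sample sizes used here are hypothetical, chosen only to reproduce the reported percentages, since the abstract does not report group-level counts:

```python
import math

def prevalence(positives, total):
    """Share of Salmonella-positive samples, as a percentage."""
    return 100.0 * positives / total

def two_proportion_z(pos_a, n_a, pos_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = pos_a / n_a, pos_b / n_b
    pooled = (pos_a + pos_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal tail.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical group sizes matching the reported sow prevalence rates
# (37.0% northern vs. ~4.8% southern); actual counts are not published
# in the abstract.
north_sows = (37, 100)
south_sows = (4, 83)

print(round(prevalence(*north_sows), 1))  # 37.0
z, p = two_proportion_z(*north_sows, *south_sows)
print(p < 0.05)  # True: the regional sow difference is significant at these n
```

At these (assumed) sample sizes the sow difference between regions is highly significant, consistent with the abstract's P < 0.05 finding; with smaller groups the same percentages could fail to reach significance, which is why the underlying counts matter.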

20.
J Food Prot ; 79(8): 1332-40, 2016 08.
Article in English | MEDLINE | ID: mdl-27497120

ABSTRACT

Asymptomatic Salmonella carriage in beef cattle is a food safety concern, and the beef feedlot environment may function as a reservoir of this pathogen. The goal of this study was to identify and isolate Salmonella and Salmonella bacteriophages from beef cattle feedlot environments in order to better understand the microbial ecology of Salmonella and identify phages that might be useful as anti-Salmonella beef safety interventions. Three feedlots in south Texas were visited, and 27 distinct samples were collected from each of four sources: dropped feces, feed from feed bunks, drinking water from troughs, and soil in cattle pens (n = 108 samples). Preenrichment, selective enrichment, and selective/differential isolation of Salmonella were performed on each sample. A representative subset of presumptive Salmonella isolates was prepared for biochemical identification and serotyping. Samples were pooled by feedlot and sample type to create 36 samples, which were enriched to recover phages. Recovered phages were tested for host range against two panels of Salmonella hosts. Salmonella bacteria were identified in 20 (18.5%) of 108 samples by biochemical and/or serological testing. The serovars recovered included Salmonella enterica serovars Anatum, Muenchen, Altona, Kralingen, Kentucky, and Montevideo; Salmonella Anatum was the most frequently recovered serotype. Phage-positive samples were distributed evenly over the three feedlots, suggesting that phage prevalence is not strongly correlated with the presence of culturable Salmonella. Phages were found more frequently in soil and feces than in feed and water samples. The recovery of bacteriophages in the Salmonella-free feedlot suggests that phages might play a role in suppressing the Salmonella population in a feedlot environment.


Subject(s)
Salmonella Phages, Salmonella enterica/isolation & purification, Animals, Cattle, Cattle Diseases/microbiology, Feces/microbiology, Kentucky, Prevalence, Red Meat, Salmonella/isolation & purification, Texas