Results 1 - 20 of 21
1.
Meat Sci ; 97(4): 558-67, 2014 Aug.
Article in English | MEDLINE | ID: mdl-24769877

ABSTRACT

This study was designed to provide updated information on the separable components, cooking yields, and proximate composition of retail cuts from the beef chuck. Additionally, the impact that United States Department of Agriculture (USDA) Quality and Yield Grades may have on such factors was investigated. Ultimately, these data will be used in the USDA Nutrient Data Laboratory's (NDL) National Nutrient Database for Standard Reference (SR). To represent the current United States beef supply, seventy-two carcasses were selected from six regions of the country based on USDA Yield Grade, USDA Quality Grade, gender, and genetic type. Whole beef chuck primals from selected carcasses were shipped to three university laboratories for subsequent retail cut fabrication, raw and cooked cut dissection, and proximate analyses. The incorporation of these data into the SR will improve dietary education, product labeling, and other applications both domestically and abroad, thus emphasizing the importance of accurate and relevant beef nutrient data.


Subject(s)
Commerce , Cooking , Databases, Factual , Diet , Meat/analysis , Nutritive Value , United States Department of Agriculture , Animals , Cattle , Female , Humans , Male , Meat/classification , Reference Standards , United States
2.
Meat Sci ; 97(1): 21-6, 2014 May.
Article in English | MEDLINE | ID: mdl-24473460

ABSTRACT

Paired ribeyes (n=24) and top sirloin butts (n=24) were dry-aged or wet-aged for 35 days before being merchandised as individual muscles: M. spinalis thoracis, M. longissimus thoracis, M. gluteobiceps, and M. gluteus medius. Wet-aged subprimals had greater saleable yields than dry-aged. Dry-aged M. spinalis thoracis and M. gluteobiceps received lower consumer overall like and flavor ratings than did wet-aged; interior muscles (M. longissimus thoracis and M. gluteus medius) did not differ. Trained panelists found higher musty and putrid flavors for dry-aged muscles closer to the exterior surface. These flavors may have contributed to lower consumer overall like and flavor ratings for dry-aged M. spinalis thoracis and M. gluteobiceps. Using innovative styles to cut beef allows for greater merchandising options. However, development of undesirable flavor characteristics may be more pronounced when exterior muscles (M. spinalis thoracis and M. gluteobiceps) are exposed during dry-aging to extreme conditions and are consumed individually.


Subject(s)
Meat/analysis , Muscle, Skeletal/chemistry , Taste , Adult , Animals , Cattle , Consumer Behavior , Cooking , Food Packaging , Food Storage , Humans , Temperature , Vacuum , Young Adult
3.
Meat Sci ; 95(3): 486-94, 2013 Nov.
Article in English | MEDLINE | ID: mdl-23793084

ABSTRACT

Beef nutrition is important to the worldwide beef industry. The objective of this study was to analyze proximate composition of eight beef rib and plate cuts to update the USDA National Nutrient Database for Standard Reference (SR). Furthermore, this study aimed to determine the influence of USDA Quality Grade on the separable components and proximate composition of the examined retail cuts. Carcasses (n=72) representing a composite of Yield Grade, Quality Grade, gender and genetic type were identified from six regions across the U.S. Beef plates and ribs (IMPS #109 and 121C and D) were collected from the selected carcasses and shipped to three university meat laboratories for storage, retail fabrication, cooking, and dissection and analysis of proximate composition. These data provide updated information regarding the nutrient content of beef and emphasize the influence of common classification systems (Yield Grade and Quality Grade) on the separable components, cooking yield, and proximate composition of retail beef cuts.


Subject(s)
Body Composition , Cooking , Databases, Factual , Diet , Meat/analysis , Nutritive Value , Animals , Body Composition/genetics , Cattle , Female , Humans , Male , Meat/classification , Meat/standards , Quality Improvement , Reference Standards , Ribs , United States , United States Department of Agriculture
4.
J Anim Sci ; 91(2): 1005-14, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23230117

ABSTRACT

The tenderness and palatability of retail and food service beef steaks from across the United States (12 cities for retail, 5 cities for food service) were evaluated using Warner-Bratzler shear (WBS) and consumer sensory panels. Subprimal postfabrication storage or aging times at retail establishments averaged 20.5 d with a range of 1 to 358 d, whereas postfabrication times at the food service level revealed an average time of 28.1 d with a range of 9 to 67 d. Approximately 64% of retail steaks were labeled with a packer/processor or store brand. For retail, top blade had among the lowest (P < 0.05) WBS values, whereas steaks from the round had the greatest (P < 0.05) values. There were no differences (P > 0.05) in WBS values between moist-heat and dry-heat cookery methods for the top round and bottom round steaks or between enhanced (contained salt or phosphate solution) or nonenhanced steaks. Food service top loin and rib eye steaks had the lowest (P < 0.05) WBS values compared with top sirloin steaks. Retail top blade steaks and food service top loin steaks received among the greatest (P < 0.05) consumer sensory panel ratings compared with the other steaks evaluated. Prime food service rib eye steaks received the greatest ratings (P < 0.05) for overall like, like tenderness, tenderness level, like juiciness, and juiciness level, whereas ungraded rib eye steaks received the lowest ratings (P < 0.05) for like tenderness and tenderness level. The WBS values for food service steaks were greater (P < 0.05) for the Select and ungraded groups compared with the Prime, Top Choice, and Low Choice groups. The WBS values and sensory ratings were comparable to the last survey, signifying that no recent or substantive changes in tenderness have occurred.


Subject(s)
Cooking/methods , Food Services , Meat/standards , Animals , Cattle , Consumer Behavior , Food Services/economics , Time Factors , United States
5.
J Food Prot ; 75(4): 682-9, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22488055

ABSTRACT

The effect of heating rate on the heat resistance, germination, and outgrowth of Clostridium perfringens spores during cooking of cured ground pork was investigated. Inoculated cured ground pork portions were heated from 20 to 75°C at a rate of 4, 8, or 12°C/h and then held at 75°C for 48 h. No significant differences (P > 0.05) in the heat resistance of C. perfringens spores were observed in cured ground pork heated at 4, 8, or 12°C/h. At heating rates of 8 and 12°C/h, no significant differences in the germination and outgrowth of spores were observed (P > 0.05). However, when pork was heated at 4°C/h, growth of C. perfringens occurred when the temperature of the product was between 44 and 56°C. In another set of experiments, the behavior of C. perfringens spores under temperature abuse conditions was studied in cured and noncured ground pork heated at 4°C/h and then cooled from 54.4 to 7.2°C within 20 h. Temperature abuse during cooling of noncured ground pork resulted in a 2.8-log CFU/g increase in C. perfringens. In cured ground pork, C. perfringens decreased by 1.1 log CFU/g during cooling from 54.4 to 36.3°C and then increased by 0.9 log CFU/g until the product reached 7.2°C. Even when the initial level of C. perfringens spores in cured ground pork was 5 log CFU/g, the final counts after abusive cooling did not exceed 3.4 log CFU/g. These results suggest that there is no risk associated with C. perfringens in cured pork products under the tested conditions.
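The log CFU/g figures above (e.g., the 2.8-log increase under temperature abuse, or the 1.1-log decrease followed by a 0.9-log increase in cured pork) are differences of base-10 logarithms of plate counts. As a quick illustration of that arithmetic (not part of the study's methods), a minimal Python sketch:

```python
import math

def log10_change(initial_cfu_per_g, final_cfu_per_g):
    """Population change expressed in log10 CFU/g (positive = growth)."""
    return math.log10(final_cfu_per_g) - math.log10(initial_cfu_per_g)

# The reported 2.8-log increase during abusive cooling of noncured pork
# corresponds to roughly a 630-fold rise in counts:
fold_increase = 10 ** 2.8
```

The function name and example counts are illustrative; the point is simply that each whole log unit is a tenfold change in counts.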


Subject(s)
Clostridium perfringens/physiology , Food Handling/methods , Meat Products/microbiology , Animals , Cold Temperature , Colony Count, Microbial , Consumer Product Safety , Food Contamination/analysis , Food Contamination/prevention & control , Hot Temperature , Humans , Microbial Viability , Spores, Bacterial , Swine
6.
Meat Sci ; 90(2): 420-5, 2012 Feb.
Article in English | MEDLINE | ID: mdl-21955981

ABSTRACT

Effectiveness of trimming external carcass surfaces from subprimals during fabrication to reduce Escherichia coli O157:H7 surrogates was evaluated. Carcass sides (n = 10 sides) were inoculated along the hide pattern opening before entering the blast chill cooler with a gelatin slurry containing a bacterial cocktail of three rifampicin-resistant, nonpathogenic E. coli biotype I strains. Following a 48 h chill, sides were fabricated to produce eight subprimals. Microbiological samples were taken from the original carcass fat surface area, initial lean surface area, trimmed fat surface area (where applicable), and trimmed lean surface area (where applicable). Newly exposed lean surfaces had lower (P < 0.05) counts of rifampicin-resistant E. coli than did the external fat surfaces. However, fat and lean surfaces that were not inoculated became contaminated during the fabrication process. Trimming external surfaces reduced levels of pathogens, but under normal fabrication processes, pathogens were still spread to newly exposed surfaces.


Subject(s)
Escherichia coli O157/isolation & purification , Food Handling/methods , Meat/microbiology , Animals , Cattle , Colony Count, Microbial , Escherichia coli O157/pathogenicity , Food Contamination/prevention & control , Food Microbiology
7.
J Food Prot ; 74(10): 1741-5, 2011 Oct.
Article in English | MEDLINE | ID: mdl-22004824

ABSTRACT

The U.S. Department of Agriculture Food Safety and Inspection Service (USDA-FSIS) has a specific lethality performance standard for ready-to-eat products. To assist meat processing establishments in meeting the performance standard, USDA-FSIS developed Appendix A, which provides guidelines for cooking temperatures, times, and relative humidity. This project determined whether the USDA-FSIS performance standards for lethality were met when using parameters other than those identified in Appendix A to cook large hams and beef inside rounds. The effects of alternative lethality parameters on the reduction of Salmonella Typhimurium and coliforms and on the toxin production of Staphylococcus aureus were evaluated. Large (9- to 12-kg) cured bone-in hams (n = 80) and large (8- to 13-kg) uncured beef inside rounds (n = 80) were used in this study. The products were subjected to 1 of 10 treatments defined by combinations of final internal product temperatures (48.9, 54.4, 60.0, 65.6, or 71.1°C) and batch oven relative humidities (50 or 90%). For all treatments, at least a 6.5-log reduction in Salmonella Typhimurium was achieved. The coliform counts were also substantially reduced for both hams and rounds. Across all treatments for both products, S. aureus toxin production was not detected. The relative humidity did not alter the lethality effectiveness for any of the treatments. The final internal temperatures and relative humidity combinations used in this project achieved the lethality performance standard established by USDA-FSIS for fully cooked, ready-to-eat products.


Subject(s)
Consumer Product Safety , Cooking/methods , Food-Processing Industry/standards , Meat Products/microbiology , Animals , Cattle , Colony Count, Microbial , Cooking/standards , Enterobacteriaceae/growth & development , Food Inspection , Food Microbiology , Humans , Humidity , Meat Products/standards , Risk Assessment , Salmonella typhimurium/growth & development , Species Specificity , Staphylococcus aureus/growth & development , Swine , Temperature , Time Factors , United States , United States Department of Agriculture
8.
Meat Sci ; 89(2): 228-32, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21663807

ABSTRACT

This study evaluated the influence of various degrees of doneness on proximate composition and energy content of beef. Ten steaks were obtained from each of five USDA Prime, five USDA Choice, and five USDA Select strip loins and assigned to one of five degree of doneness treatments (two sets of treatments per strip loin): raw, medium rare (63 °C), medium (71 °C), well done (77 °C), and very well done (82 °C). After cooking, steaks were dissected into separable tissue components consisting of lean, fat, and refuse. Lean tissue was used to obtain proximate analyses of protein, moisture, fat, and ash. Degree of doneness influenced (P<0.05) the nutrient composition of beef steaks. As the degree of doneness increased, percent fat and protein increased, while percent moisture decreased. Cooking steaks to a higher degree of doneness resulted in a higher caloric value when reported on a 100 g basis.
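The link between moisture loss and rising caloric value per 100 g follows directly from energy calculation with the general Atwater factors (4 kcal/g for protein, 9 kcal/g for fat). A sketch with hypothetical composition values (not the study's data):

```python
def kcal_per_100g(protein_g, fat_g, carb_g=0.0):
    """Energy per 100 g using general Atwater factors (4/9/4 kcal per g)."""
    return 4.0 * protein_g + 9.0 * fat_g + 4.0 * carb_g

# As moisture cooks off, protein and fat per 100 g both rise, so energy
# density rises even though the steak's total nutrients are unchanged.
raw = kcal_per_100g(protein_g=22.0, fat_g=6.0)
well_done = kcal_per_100g(protein_g=29.0, fat_g=8.0)
```

Here `raw` is 142 kcal and `well_done` is 188 kcal per 100 g, mirroring the direction of the effect reported above.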


Subject(s)
Meat/standards , Animals , Cattle , Consumer Behavior , Cooking/methods , Food Handling/methods , Lipids/analysis , Meat/analysis , Proteins/analysis , United States , United States Department of Agriculture
9.
Meat Sci ; 81(2): 335-43, 2009 Feb.
Article in English | MEDLINE | ID: mdl-22064172

ABSTRACT

A market basket survey for beef retail cut composition at the retail level (four stores each from two chains in each city) was conducted in 11 US cities from January to March 2006. Beef cuts (n=17,495) were measured for external fat thickness with cuts from the chuck (0.05 cm), round (0.05 cm), and miscellaneous (0.04 cm) having less (P<0.05) fat than cuts from the loin (0.11 cm) and rib (0.11 cm). Beef cuts (n=1327) were separated physically into separable components with round cuts having more (P<0.05) separable lean (96.63%) than chuck cuts (86.81%) and miscellaneous cuts (86.18%), which had more (P<0.05) separable lean than loin cuts (84.53%) with rib cuts (69.34%) having the lowest (P<0.05) separable lean. Chemical fat from the separable lean differed (P<0.05) between each cut category: round cuts (3.71%), miscellaneous cuts (4.99%), loin cuts (5.60%), chuck cuts (6.90%), and rib cuts (8.61%). Ground beef samples (n=235), with declared lean/fat percentages ranging from 73/27 to 96/4, had overall chemical fat values of 13.41% and moisture values of 67.42%. This survey documents the current beef retail cut and ground beef composition, which is helpful to those who need this information for various dietary and marketing purposes.

10.
Meat Sci ; 79(4): 631-9, 2008 Aug.
Article in English | MEDLINE | ID: mdl-22063024

ABSTRACT

Paired beef short loins from US Choice (n=48) and US Select (n=48) carcasses were assigned to be dry or wet aged for 14, 21, 28 or 35d. After aging, short loins were processed to determine retail yields and processing times. Upon completion of cutting tests, steaks were served to consumers to assess palatability characteristics. Retail cutting tests showed that dry-aged short loins had reduced yields and increased cutting times when compared to wet-aged short loins. Consumers were unable to determine differences between dry- and wet-aged steaks and for aging periods; however, USDA quality grade had a significant impact on consumer perception of palatability attributes.

11.
Meat Sci ; 80(3): 795-804, 2008 Nov.
Article in English | MEDLINE | ID: mdl-22063599

ABSTRACT

Top Choice (n=48) and Select (n=48) paired bone-in ribeye rolls, bone-in strip loins, and boneless top sirloin butts were assigned randomly to one of two aging treatments, dry or wet, and were aged for 14, 21, 28 or 35d. Cutting tests, performed to determine retail yields and processing times, showed dry-aged subprimals had lower total saleable yield percentages and increased processing times compared to wet-aged subprimals. Sensory and Warner-Bratzler shear evaluation was conducted to determine palatability characteristics. For the most part, aging treatment and aging period did not affect consumer sensory attributes. However, ribeye and top loin steaks from the Top Choice quality grade group received higher sensory ratings than their Select counterparts. For top sirloin steaks, no consumer sensory attributes were affected by aging treatment, aging period, or quality grade group.

12.
Meat Sci ; 73(2): 245-8, 2006 Jun.
Article in English | MEDLINE | ID: mdl-22062295

ABSTRACT

This project was designed to evaluate interventions capable of reducing bacterial counts on the hide prior to opening. In Trial I, fresh beef hides (n=12) were cut into sections and assigned to serve as either clipped (hair trimmed) or non-clipped sections. Sections were inoculated with a bovine fecal slurry and sampled following a water wash. Treatments (distilled water, isopropyl alcohol, 3% hydrogen peroxide, 2% L-lactic acid, 10% povidone-iodine, and 1% cetylpyridinium chloride (CPC)) were then applied to each section and the sections were sampled for enumeration of aerobic plate counts (APCs), coliforms, and Escherichia coli. Within clipped samples, 1% CPC and 3% hydrogen peroxide caused the greatest reductions in APCs (4.6 and 4.4 log10 CFU/100 cm², respectively), and 1% CPC, 2% L-lactic acid, and 3% hydrogen peroxide caused the greatest reductions in coliform counts (4.5, 4.1, and 3.9 log10 CFU/100 cm², respectively). In Trial II, beef carcasses with hides on were sampled initially and clipped, and then 2% L-lactic acid, 3% hydrogen peroxide, or 1% CPC were applied before sampling. For APCs, 1% CPC produced the greatest reduction on the hide surface (3.8 log10 CFU/100 cm²). Selective application of these antimicrobials to clipped hide opening sites reduced bacterial counts on hide surfaces, and therefore could reduce final carcass counts in these areas by decreasing the bacterial load before opening.

13.
Meat Sci ; 69(3): 401-7, 2005 Mar.
Article in English | MEDLINE | ID: mdl-22062977

ABSTRACT

Four experiments were conducted to test the efficacy of peroxyacetic acid as a microbial intervention on beef carcass surfaces. In these experiments, beef carcass surfaces were inoculated with fecal material (no pathogens) or fecal material containing rifampicin-resistant Escherichia coli O157:H7 and Salmonella Typhimurium. Inoculated surfaces were subjected to a simulated carcass wash with and without 2% L-lactic acid treatment before chilling. In Experiments 1 and 2, the chilled carcass surfaces were sprayed with peroxyacetic acid (200 ppm; 43 °C) for 15 s. Peroxyacetic acid had no effect on microbial counts of any organism measured on these carcass surfaces. However, lactic acid reduced counts of E. coli Type I (1.9 log10 CFU/cm²), coliforms (3.0 log10 CFU/cm²), E. coli O157:H7 (2.7 log10 CFU/cm²), and S. Typhimurium (2.8 log10 CFU/cm²) entering the chilling cooler and prevented growth during the chilling period. In Experiment 3, peroxyacetic acid at different concentrations (200, 600, and 1000 ppm) and application temperatures (45 and 55 °C) were used to investigate its effectiveness in killing E. coli O157:H7 and S. Typhimurium compared to 4% L-lactic acid (55 °C). Application temperature did not affect the counts of either microorganism. Peroxyacetic acid concentrations up to 600 ppm had no effect on these microorganisms. Concentrations of 1000 ppm reduced E. coli O157:H7 and S. Typhimurium by up to 1.7 and 1.3 log10 CFU/cm², respectively. However, 4% lactic acid reduced these organisms by 2.7 and 3.4 log10 CFU/cm², respectively. In Experiment 4, peroxyacetic acid (200 ppm; 43 °C) was applied to hot carcass surfaces. This treatment caused a 0.7 log10 CFU/cm² reduction in both E. coli O157:H7 and S. Typhimurium. The collective results from these experiments indicate that peroxyacetic acid was not an effective intervention when applied to chilled inoculated carcass piece surfaces.

14.
Meat Sci ; 70(1): 197-203, 2005 May.
Article in English | MEDLINE | ID: mdl-22063297

ABSTRACT

Peroxyacetic acid was evaluated in four separate trials for ability to reduce populations of Escherichia coli O157:H7 and Salmonella serotype Typhimurium on fresh beef trim. Trial 1 examined the effectiveness of peroxyacetic acid on individual pieces of fresh beef trim. Trial 2 evaluated the efficacy of peroxyacetic acid at low levels of contamination on batches of fresh beef trim. Trial 3 studied a washing effect of water. Lastly, Trial 4 compared the effectiveness of peroxyacetic acid to lactic acid. At various inoculation levels, peroxyacetic acid reduced populations of both pathogens by approximately 1.0 log10 CFU/cm² on fresh beef trim. Trial 3 showed that approximately half of the reductions found in Trials 1 and 2 were due to a washing effect of the water dip. In addition, as shown in Trial 1, increases in concentrations (>200 ppm) did not significantly increase log10 reductions of both pathogens. Following a water dip in Trial 4, peroxyacetic acid caused a reduction of 0.7 log10 CFU/cm² in E. coli O157:H7 and 1.0 log10 CFU/cm² in Salmonella Typhimurium, whereas lactic acid caused a reduction of 1.3 log10 CFU/cm² in E. coli O157:H7 and 2.1 log10 CFU/cm² in S. Typhimurium following the water dip. These results show that peroxyacetic acid was not more effective than 2% L-lactic acid in reducing pathogens on fresh beef trim.

15.
J Food Prot ; 67(3): 579-82, 2004 Mar.
Article in English | MEDLINE | ID: mdl-15035377

ABSTRACT

Two trials were conducted to determine the efficacy of cattle wash treatments in reducing pathogens on hides of cattle before slaughter. In trial I, live cattle (n = 120) were washed in an automated, commercial cattle wash system with one of four treatments (single water wash, double water wash, water wash with 0.5% L-lactic acid, or water wash with 50 ppm chlorine). Samples were collected at three locations (brisket, belly, and inside round) pre- and posttreatment to evaluate the effectiveness of treatments on the reduction of aerobic plate counts, coliforms, Escherichia coli, and the incidence of Salmonella. For all three locations, bacterial numbers increased from 0.1 to 0.8 log CFU/cm2 posttreatment. In trial II, hide samples were inoculated in the laboratory with 6.0 log CFU/cm2 of rifampicin-resistant Salmonella serotype Typhimurium. Hide wash treatments included higher concentrations of chlorine (100, 200, and 400 ppm) and L-lactic acid (2, 4, and 6%), as well as other antimicrobial agents such as ethanol (70, 80, and 90%), acetic acid (2, 4, and 6%), and Oxy-Sept 333 (0.5, 2, and 4%). Spray wash treatments with ethanol and 4 to 6% concentrations of lactic acid had greater (P < 0.05) mean log reductions than 2% solutions of acetic or lactic acid, as well as 100, 200, and 400 ppm chlorine and the control water wash treatment. Spray wash treatments with Oxy-Sept 333 and 100, 200, or 400 ppm chlorine were not effective (P > 0.05) in reducing Salmonella Typhimurium compared to the control (distilled water) spray wash treatment. Several effective cattle hide interventions were identified in a controlled laboratory setting, but the high concentrations required for effectiveness would likely present problems from an animal welfare standpoint.


Subject(s)
Anti-Infective Agents, Local/pharmacology , Bacteria/drug effects , Cattle/microbiology , Sanitation/methods , Abattoirs , Animals , Bacteria/isolation & purification , Chlorine/pharmacology , Colony Count, Microbial , Dose-Response Relationship, Drug , Food Contamination/prevention & control , Lactic Acid/pharmacology , Salmonella typhimurium/drug effects , Salmonella typhimurium/isolation & purification , Skin/microbiology , Water/pharmacology
16.
Meat Sci ; 66(1): 55-61, 2004 Jan.
Article in English | MEDLINE | ID: mdl-22063931

ABSTRACT

At approximately 8 weeks of age, four-way cross (Chester White×Landrace×Large White×Yorkshire) pigs (n=24) were selected based on genetically high (H) or low (L) serum cholesterol levels (12 from each genetic group) to determine the relationship between genetics, fat source, and sex class on plasma cholesterol, growth, carcass characteristics, and cholesterol and lipid content of muscle and adipose tissues. Boars and gilts, six each from the two genetic groups, were assigned randomly to one of three dietary treatments for 46 days. A standard grower diet was modified to include beef tallow (T), corn oil (CR) or coconut oil (CC), and the pigs were given ad libitum access to feed. Cholesterol was added to each diet to ensure the diets contained the same amount of cholesterol. Except for the plasma lipids, there were no differences between boars and gilts at the initial evaluation or at the end of the treatment; therefore, sex means were pooled for statistical analyses. Body weight was unaffected by diet on days 18, 29 or 46. Blood samples were taken on days 1, 29, and 46 via the anterior vena cava. Plasma total cholesterol (TC) and low density lipoprotein cholesterol (LDL) concentrations were greater in the H than L groups (overall TC in H and L pigs=150 and 124 mg/dl, respectively, and LDL in H and L pigs=105 and 76 mg/dl, respectively). Pigs fed diets containing saturated fats had greater TC and LDL than pigs fed unsaturated fats (TC=165, 149, and 126 mg/dl for T, CC, and CR diets, respectively, and LDL=108, 88, and 77 mg/dl for T, CC, and CR diets, respectively). There were significant time×gene×sex interactions for both TC and LDL yielding subtle differences in the response of the sexes from the two genetic groups over time. Pigs were slaughtered on day 46, and carcass data were collected. There were no differences in fat at the first rib, 10th rib, last rib, or last lumbar vertebra, but differences (P <0.05) were found between genetic groups for M. longissimus thoracis et lumborum (LTL) muscle area (H=21.0±0.8 cm², L=18.1±1.0 cm²) and USDA muscle score (H=2.1±0.1, L=1.7±0.1). There were no genetic or diet effects for cholesterol content of pre-rigor or post-rigor LTL muscle. Neither genetics nor dietary treatment affected the cholesterol content of the adipose tissue. There were no differences in fat percentage between genetic groups for muscle or adipose tissue. There were differences (P <0.05) in total lipid content among the dietary treatments for the pre-rigor (T=6.0±0.6%, CC=4.3±0.3%, CR=3.9±0.5%) and post-rigor (T=6.4±0.9%, CC=4.1±0.3%, CR=5.0±0.4%) LTL. Cholesterol accretion in muscle and adipose tissues of growing pigs was not influenced by source of fat in the diet or by their genetic propensity for high or low plasma cholesterol.

17.
Breast Cancer Res Treat ; 37(1): 1-9, 1996.
Article in English | MEDLINE | ID: mdl-8750522

ABSTRACT

BACKGROUND: Accurate measurement of the size of breast cancers becomes more important as breast cancer therapy advances. This study reports the accuracy of magnetic resonance imaging (MRI), ultrasonography and mammography for measuring the largest breast cancer diameter in comparison to the pathology measurement. MATERIALS AND METHODS: Fourteen breast cancers were examined in 13 women with MRI, ultrasonography and mammography. The age range was 31-73 (mean 56). Six of the cancers were in premenopausal women. The MRI was performed with the intravenous injection of a gadolinium-based contrast agent and a three-dimensional fast spoiled gradient echo sequence with fat suppression. The largest cancer diameter was measured with each imaging technique and compared to the largest cancer diameter measured at pathology. RESULTS: At pathological examination cancers ranged from 0.6 to 6 cm (mean 2.2) in largest diameter. MRI measurements had the highest correlation coefficient (r = 0.98) and the smallest standard error (0.34). Ultrasonography measurements had a correlation coefficient of r = 0.45 and a standard error of 0.78. Mammography measurements had a correlation coefficient of r = 0.46 and a standard error of 1.04. CONCLUSIONS: MRI was more accurate than ultrasonography and mammography in measuring the largest cancer diameters in this group of women. This was particularly evident for several larger cancers and a postchemotherapy cancer.
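The accuracy comparison above rests on the Pearson correlation between each modality's diameter measurements and the pathology measurements. As a quick sketch of that statistic (the diameter values below are hypothetical, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two paired measurement series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical largest diameters (cm): pathology vs. an imaging modality
path = [0.6, 1.2, 2.0, 2.5, 3.1, 6.0]
mri = [0.7, 1.1, 2.1, 2.4, 3.3, 5.8]
r = pearson_r(mri, path)  # near 1.0 when the modality tracks pathology closely
```

An r near 0.98 with a small standard error, as reported for MRI, indicates measurements that track pathology tightly across the full size range, whereas r values near 0.45 indicate only weak agreement.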


Subject(s)
Adenocarcinoma/pathology , Breast Neoplasms/pathology , Carcinoma, Ductal, Breast/pathology , Carcinoma, Lobular/pathology , Adenocarcinoma/diagnostic imaging , Adult , Aged , Breast Neoplasms/diagnostic imaging , Carcinoma, Ductal, Breast/diagnostic imaging , Carcinoma, Lobular/diagnostic imaging , Female , Humans , Magnetic Resonance Imaging , Middle Aged , Radiography , Reproducibility of Results , Ultrasonography
18.
Arch Intern Med ; 154(11): 1261-7, 1994 Jun 13.
Article in English | MEDLINE | ID: mdl-8203993

ABSTRACT

BACKGROUND: The recommendation to lower saturated fat intake is often interpreted as requiring the elimination of beef to control or lower serum cholesterol levels. The study hypothesis was that the Step I Diet (8% to 10% of energy intake from saturated fatty acids) containing beef would have the same effect on plasma lipid levels of hypercholesterolemic men as a like diet containing chicken. METHODS: Thirty-eight free-living hypercholesterolemic (otherwise healthy) men completed a 13-week dietary intervention study. Subjects consumed their usual diets for 3 weeks, followed by a 5-week stabilization diet (18% of energy intake from saturated fatty acids), before randomization to one of two test diets for 5 weeks. The test diets contained either 85 g of cooked beef (8% fat) or 85 g of cooked chicken (7% fat) per 4184 kJ and had 7% to 8% of energy from saturated fatty acids. All food was supplied during the stabilization and test diets. RESULTS: The beef and chicken test diets both produced significant decreases in average plasma total cholesterol level (0.54 mmol/L [7.6%] for beef and 0.70 mmol/L [10.2%] for chicken) and low-density lipoprotein cholesterol level (0.46 mmol/L [9%] for beef and 0.55 mmol/L [11%] for chicken). Changes in average levels of plasma total cholesterol, high-density lipoprotein cholesterol, triglyceride, and low-density lipoprotein cholesterol were not statistically different (smallest P = .26) between the beef and chicken test diets. The average triglyceride level did not change for either test diet group. CONCLUSIONS: In this short-term study, comparably lean beef and chicken had similar effects on plasma levels of total, low-density lipoprotein, and high-density lipoprotein cholesterol and triglyceride. We concluded that lean beef and chicken are interchangeable in the Step I Diet.


Subject(s)
Hypercholesterolemia/diet therapy , Lipids/blood , Meat , Adult , Animals , Cattle , Chickens , Cholesterol/blood , Cholesterol, HDL/blood , Cholesterol, LDL/blood , Humans , Hypercholesterolemia/blood , Male , Middle Aged , Triglycerides/blood
20.
J Anim Sci ; 71(4): 807-10, 1993 Apr.
Article in English | MEDLINE | ID: mdl-8478281

ABSTRACT

Thirty-six female pigs, selected over three generations for high or low serum cholesterol, were used to evaluate the effects of a high-fat, high-cholesterol diet and a low-fat, low-cholesterol diet (fed on an ad libitum basis for 92 d beginning at 12 wk of age) on the cholesterol content and percentage of fat in muscle and organ tissues. The pigs were four-way crosses (Chester White x Landrace x Large White x Yorkshire). Samples of cerebrum, heart, ileum, kidney, liver, longissimus muscle, semitendinosus muscle, and subcutaneous fat were collected from each animal for determination of cholesterol concentration. The liver was the only tissue that had a significant difference in cholesterol content and in fat percentage between the genetic groups (high serum cholesterol and low serum cholesterol) and between the two diets (high-fat, high-cholesterol diet and low-fat, low-cholesterol diet). There were no interactions between diet and genetic background on cholesterol accretion or on the percentage of fat in the tissues.


Subject(s)
Animal Feed , Cholesterol, Dietary/administration & dosage , Cholesterol/analysis , Dietary Fats/administration & dosage , Swine/metabolism , Adipose Tissue/chemistry , Animal Feed/analysis , Animals , Brain Chemistry , Breeding , Cholesterol/blood , Cholesterol, Dietary/pharmacokinetics , Female , Ileum/chemistry , Kidney/chemistry , Lipids/analysis , Liver/chemistry , Muscles/chemistry , Myocardium/chemistry , Swine/genetics