1.
J Int Soc Sports Nutr ; 21(1): 2368167, 2024 Dec.
Article in English | MEDLINE | ID: mdl-38934469

ABSTRACT

POSITION STATEMENT: The International Society of Sports Nutrition (ISSN) provides an objective and critical review of the use of a ketogenic diet in healthy exercising adults, with a focus on exercise performance and body composition. This review does not address the use of exogenous ketone supplements. The following points summarize the position of the ISSN.
1. A ketogenic diet induces a state of nutritional ketosis, generally defined as serum ketone levels above 0.5 mM. Although many factors influence the daily carbohydrate intake that will produce these levels, a broad guideline is a dietary carbohydrate intake of less than 50 grams per day.
2. Nutritional ketosis achieved through carbohydrate restriction and high dietary fat intake is not intrinsically harmful and should not be confused with ketoacidosis, a life-threatening condition most commonly seen in clinical populations with metabolic dysregulation.
3. A ketogenic diet has largely neutral or detrimental effects on athletic performance compared with a diet higher in carbohydrates and lower in fat, despite achieving significantly elevated rates of fat oxidation during exercise (~1.5 g/min).
4. The endurance effects of a ketogenic diet may be influenced by both training status and the duration of the dietary intervention, but further research is necessary to elucidate these possibilities. All studies involving elite athletes, each lasting six weeks or less, showed a performance decrement from a ketogenic diet. Of the two studies lasting more than six weeks, only one reported a statistically significant benefit of a ketogenic diet.
5. A ketogenic diet tends to have effects on maximal strength, or on strength gains from a resistance training program, similar to those of a diet higher in carbohydrates. However, a minority of studies show superior effects of non-ketogenic comparators.
6. When compared with a diet higher in carbohydrates and lower in fat, a ketogenic diet may cause greater losses in body weight and fat mass, but may also heighten losses of lean tissue. However, this is likely due to differences in calorie and protein intake, as well as shifts in fluid balance.
7. There is insufficient evidence to determine whether a ketogenic diet affects males and females differently. However, there is a strong mechanistic basis for sex differences in response to a ketogenic diet.


Subject(s)
Athletic Performance , Diet, Ketogenic , Sports Nutritional Physiological Phenomena , Humans , Athletic Performance/physiology , Body Composition , Ketosis , Sports Nutritional Sciences , Dietary Carbohydrates/administration & dosage , Exercise/physiology , Physical Endurance/physiology
2.
PM R ; 2024 May 31.
Article in English | MEDLINE | ID: mdl-38818973

ABSTRACT

BACKGROUND: Injury characteristics of high school track and field throwing athletes in the United States are not well studied. Understanding the epidemiology of injuries is important to identify treatment and prevention strategies. OBJECTIVE: To describe injury rates and patterns in U.S. high school track and field throwing events from a longitudinal national sports injury surveillance system. DESIGN: Descriptive epidemiology study. SETTING: Data were provided by the National High School Sports Related Injury Surveillance System, High School RIO (Reporting Information Online). METHODS: Athletic trainers reported injury and exposure data through the High School RIO website on a weekly basis. An athlete exposure (AE) was defined as one athlete participating in one school-sanctioned practice or competition. Throwing events of discus, shot put, and javelin were analyzed in this study. MAIN OUTCOME MEASURES: Injury rate, rate ratios (RR), injury proportion ratios (IPR). PARTICIPANTS: U.S. high school athletes. RESULTS: A total of 267 track and field throwing injuries occurred during 5,486,279 AEs. Overall, the rate of injuries in competition was higher than in practice (RR 1.35, 95% confidence interval [CI] 1.01-1.80). In practice, the rate of injuries was higher for girls than boys (RR 1.53, 95% CI 1.12-2.08). The most frequently injured body part was the shoulder (21.7%), followed by the ankle (16.5%) and knee (12.0%). The most common types of injury were muscle strains (26.1%) and ligament sprains (25.0%). Recurrent injuries accounted for a higher proportion of chronic injuries compared to new injuries (IPR 1.85, 95% CI 1.16-2.97). CONCLUSION: This study described injury characteristics of high school track and field throwing athletes from 2008 to 2019. Based on our results, injury prevention may be particularly important for female throwers with prior injury.
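The athlete-exposure arithmetic used in the abstract above can be sketched as follows. The abstract gives only the totals (267 injuries over 5,486,279 AEs), not the competition/practice split, so the counts passed to the rate-ratio example are hypothetical and for illustration only.

```python
# Injury rate per 1000 athlete-exposures (AEs), the standard High School RIO metric.
def injury_rate_per_1000(injuries: int, exposures: int) -> float:
    return injuries / exposures * 1000

# Totals from the abstract: 267 injuries over 5,486,279 AEs.
overall = injury_rate_per_1000(267, 5_486_279)  # ~0.049 injuries per 1000 AEs

# Rate ratio (RR) comparing two settings (e.g. competition vs. practice):
# the ratio of the two incidence rates.
def rate_ratio(inj_a: int, exp_a: int, inj_b: int, exp_b: int) -> float:
    return (inj_a / exp_a) / (inj_b / exp_b)

# Hypothetical split, NOT from the study: 100 injuries in 2M competition AEs
# vs. 167 injuries in 3.49M practice AEs.
rr = rate_ratio(100, 2_000_000, 167, 3_486_279)
```

An RR above 1.0 indicates a higher rate in the first setting; the abstract's 95% CI of 1.01-1.80 excludes 1.0, which is what makes the competition-vs-practice difference statistically significant.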

4.
BMJ Open Sport Exerc Med ; 9(2): e001545, 2023.
Article in English | MEDLINE | ID: mdl-37180969

ABSTRACT

Objectives: We evaluated the effect of a nutrition education intervention on bone stress injury (BSI) incidence among female distance runners at two NCAA Division I institutions. Methods: Historical BSI rates were measured retrospectively (2010-2013); runners were then followed prospectively in pilot (2013-2016) and intervention (2016-2020) phases. The primary aim was to compare BSI rates in the historical and intervention phases. Pilot phase data are included only for descriptive purposes. The intervention comprised team nutrition presentations focused on optimising energy availability plus individualised nutrition sessions for runners with elevated Female Athlete Triad risk. Annual BSI rates were calculated using a generalised estimating equation Poisson regression model adjusted for age and institution. Post hoc analyses were stratified by institution and BSI type (trabecular-rich or cortical-rich). Results: The historical phase included 56 runners and 90.2 person-years; the intervention phase included 78 runners and 137.3 person-years. Overall BSI rates were not reduced from the historical (0.52 events per person-year) to the intervention (0.43 events per person-year) phase. Post hoc analyses demonstrated trabecular-rich BSI rates dropped significantly from 0.18 to 0.10 events per person-year from the historical to intervention phase (p=0.047). There was a significant interaction between phase and institution (p=0.009). At Institution 1, the overall BSI rate dropped from 0.63 to 0.27 events per person-year from the historical to intervention phase (p=0.041), whereas no decline was observed at Institution 2. Conclusion: Our findings suggest that a nutrition intervention emphasising energy availability may preferentially impact trabecular-rich BSI and depend on team environment, culture and resources.
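The per-phase rates in the abstract above are incidence rates in events per person-year. A minimal sketch of that calculation; the event count below is back-calculated from the reported historical-phase rate and person-years, so treat it as illustrative rather than a figure from the study.

```python
# Incidence rate in events per person-year, as used in the phase comparison.
def rate_per_person_year(events: int, person_years: float) -> float:
    return events / person_years

# Back-calculated example: ~47 events over the reported 90.2 person-years
# reproduces the historical-phase rate of ~0.52 events per person-year.
historical = rate_per_person_year(47, 90.2)
```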

5.
J Sci Med Sport ; 26(6): 285-290, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37248163

ABSTRACT

OBJECTIVES: This study evaluated pathways to low energy availability in a sample of female adolescent athletes (n = 464). DESIGN: Cross-sectional. METHODS: Participants (age 13-18 y) underwent assessments for height, weight, eating attitudes and behaviors, and menstrual function. Bone mineral density and body composition were evaluated by dual-energy x-ray absorptiometry in a subset of participants (n = 209). Athletes were classified with clinical indicators of low energy availability if they met criteria for 1) primary or secondary amenorrhea or 2) clinical underweight status (body mass index-for-age < 5th percentile). Disordered eating was assessed using the Eating Disorder Examination Questionnaire. RESULTS: Thirty (6.5%) athletes exhibited clinical indicators of low energy availability, with higher estimates in leanness than non-leanness sports (10.9% vs. 2.1%, p < 0.005). Among athletes with clinical indicators of low energy availability, 80% (n = 24) did not meet criteria for disordered eating, eating disorder, or report the desire to lose weight. Athletes with (vs. without) clinical indicators of low energy availability exhibited lower lumbar spine (-1.30 ± 1.38 vs. -0.07 ± 1.21, p < 0.001) and total body (-0.30 ± 0.98 vs. 0.53 ± 0.97, p < 0.006) bone mineral density Z-scores. CONCLUSIONS: A majority of female adolescent athletes with clinical indicators of low energy availability did not exhibit characteristics consistent with intentional dietary restriction, supporting the significance of the inadvertent pathway to low energy availability and the need for increased nutrition education in this population.


Subject(s)
Feeding and Eating Disorders , Sports , Female , Adolescent , Humans , Cross-Sectional Studies , Amenorrhea/epidemiology , Bone Density , Athletes , Absorptiometry, Photon
6.
J Am Coll Health ; 71(9): 2697-2704, 2023 Dec.
Article in English | MEDLINE | ID: mdl-34788580

ABSTRACT

OBJECTIVE: To evaluate the agreement between a 61-item Nutrition Screening Survey (NSS) and 127-item validated Food Frequency Questionnaire (FFQ). PARTICIPANTS: Forty-seven college students (male, n = 29; female, n = 18), age 21.7 ± 0.4 years, BMI of 23.5 ± 0.4 kg/m2. METHODS: Participants completed the NSS, Block FFQ, and anthropometric measurements. Pearson's correlation, paired sample t test, and Bland-Altman plot evaluated agreement between the assessments. RESULTS: Moderate to strong associations between assessments (0.61-0.89, p < 0.001) were identified for meals/day, snacks/day, calories, carbohydrate, fiber, grains, non-starchy vegetables, potatoes, legumes, fruit, yogurt, cheese, and eggs. Mean daily meals/day, calories, fat, fiber, grains, fruit, milk, and eggs did not significantly differ between surveys. The Bland-Altman plot analyses indicated no proportional bias for calories, fat, fiber, grains, fruit, milk, and eggs. CONCLUSIONS: The NSS and Block FFQ display reasonable agreement, supporting use of the NSS for evaluating a range of dietary components among physically active college students.


Subject(s)
Diet , Students , Male , Humans , Female , Young Adult , Adult , Universities , Nutritional Status , Surveys and Questionnaires , Vegetables , Reproducibility of Results
7.
J Am Nutr Assoc ; 42(2): 178-186, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35512779

ABSTRACT

BACKGROUND: Endurance runners exhibit an elevated prevalence of low bone mass and characteristics consistent with undernourishment. OBJECTIVE: This quasi-experimental, pretest-posttest design study evaluated the efficacy of a 4-week nutrition education curriculum to optimize nutrition knowledge, self-efficacy, and the intake of nutrient-rich carbohydrate foods. METHODS: Forty-eight adolescent endurance runners, age 15.7 ± 1.2 y, from two high schools in Southern California were recruited to complete four weekly lessons addressing the quantity, quality, and timing of carbohydrate intake. Differences in pre- compared to post-intervention nutrition knowledge and self-efficacy to consume nutrient-rich carbohydrate foods were evaluated using paired-samples t-tests. Qualitative coding of open-response questions explored changes in food intake behaviors reported by runners during the intervention. RESULTS: The percent of nutrition knowledge questions answered correctly increased after Lessons 1 and 2 (59.0% ± 20.0% pre- vs. 81.9% ± 22.8% post-Lesson 1; 44.7% ± 13.7% pre- vs. 74.5% ± 17.4% post-Lesson 2, P < .001), as did the number of identified nutrient-rich carbohydrate foods (8.7 ± 2.7 vs. 12.4 ± 2.3, P < .001). Self-efficacy scores improved after all lessons (P < .001). After Lesson 2, 84% (n = 27/32) of runners increased the carbohydrate included in a snack or meal; after Lesson 4, 85% (n = 29/34) added a post-exercise snack. Frequent themes identified from questions addressing dietary changes included increasing the quantity and quality of carbohydrates in snacks and meals and being more aware of food choices. CONCLUSIONS: Findings suggest that the curriculum enhanced nutrition knowledge, self-efficacy, and dietary behaviors related to intake of nutrient-dense carbohydrate foods in adolescent runners.


Subject(s)
Diet , Energy Intake , Humans , Adolescent , Self Efficacy , Nutrients , Curriculum , Carbohydrates
8.
J Acad Nutr Diet ; 122(3): 573-582, 2022 03.
Article in English | MEDLINE | ID: mdl-35195521

ABSTRACT

BACKGROUND: The prevalence of dietary supplement intake among preadolescent endurance runners is currently unknown. OBJECTIVE: Our aim was to describe use of dietary supplements, higher-risk supplements, and sport foods among preadolescent endurance athletes and identify associated characteristics of dietary supplement users in this population. DESIGN: This was a retrospective, cross-sectional study. PARTICIPANTS/SETTING: Participants were 2,113 preadolescent endurance runners (male: n = 1,255, female: n = 858; mean age ± standard deviation = 13.2 ± 0.9 years). MAIN OUTCOME MEASURES: Use of dietary supplements, higher-risk dietary supplements, and sport foods on 2 or more days per week during the past year. STATISTICAL ANALYSES PERFORMED: Mann-Whitney U tests, χ2 tests, univariate and multivariate analyses. RESULTS: Twenty-six percent (n = 551) of preadolescent runners used dietary supplements on 2 or more days per week during the past year; 1.3% (n = 27) reported taking higher-risk supplements. Compared with male runners, female runners reported higher use of 1 or more supplements (32.5% vs 21.7%; P < .001) and 4 or more supplements (4.0% vs 1.9%; P = .005), multivitamin/minerals (24.2% vs 14.4%; P < .001), vitamin D (12.4% vs 5.6%; P < .001), calcium (8.9% vs 4.8%; P < .001), iron (3.1% vs 1.1%; P < .001), probiotic supplements (8.2% vs 1.3%; P < .001), and diet pills (0.5% vs 0.0%; P = .02). Male runners reported higher use of creatine (1.3% vs 0.0%; P < .001) and sport foods, including protein bars and drinks (19.5% vs 8.4%; P < .001), energy bars (23.5% vs 9.7%; P < .001), and carbohydrate-electrolyte drinks (27.9% vs 13.3%; P < .001) than female runners. Factors independently associated with a higher likelihood for dietary supplement use included weight loss in the past year, female (vs male) gender, following a vegetarian diet, skipping meals, attempting to gain weight, and history of a running-related bone stress injury.
CONCLUSIONS: More than one-quarter of preadolescent runners regularly consumed dietary supplements. Behaviors consistent with dietary restriction and history of bone stress injury were associated with higher likelihood for supplement use. Further work to understand supplement use patterns and potential value for nutrition education is advised to optimize health of preadolescent runners.


Subject(s)
Athletes , Dietary Supplements , Running , Adolescent , Child , Cross-Sectional Studies , Female , Humans , Male , New England , Prevalence , Retrospective Studies , Schools , Self Report
9.
J Am Nutr Assoc ; 41(6): 551-558, 2022 08.
Article in English | MEDLINE | ID: mdl-34032561

ABSTRACT

Background: Despite the evidence of an elevated prevalence of low bone mass in adolescent endurance runners, reports on dietary intake in this population are limited. Objectives: This study aimed to evaluate energy availability (EA) and dietary intake among 72 (n = 60 female, n = 12 male) high school cross-country runners. Methods: The sample consisted of a combined dataset of two cohorts. In both cohorts, the Block Food Frequency Questionnaire (FFQ; 2005 & 2014 versions) assessed dietary intake. Fat-free mass was assessed using dual-energy x-ray absorptiometry or bioelectrical impedance analysis. Results: Mean EA was less than recommended (45 kcal/kgFFM/day) among male (35.8 ± 14.4 kcal/kgFFM/day) and female endurance runners (29.6 ± 17.4 kcal/kgFFM/day), with 30.0% of males and 60.0% of females meeting criteria for low EA (<30 kcal/kgFFM/day). Calorie intake for male (2,614.2 ± 861.8 kcal/day) and female (1,879.5 ± 723.6 kcal/day) endurance runners fell below the estimated energy requirement for "active" boys (>3,100 kcal/day) and girls (>2,300 kcal/day). Female endurance runners' relative carbohydrate intake (4.9 ± 2.1 g/kg/day) also fell below recommended levels (6-10 g/kg/day). Male and female endurance runners exhibited below-recommended intakes of calcium, vitamin D, potassium, fruit, vegetables, grains, and dairy. Compared to male endurance runners, female endurance runners demonstrated lower relative intakes of energy (kcal/kg/day), protein (g/kg/day), fat (g/kg/day), fiber, vegetables, total protein, and oils. Conclusion: This study provides evidence of the nutritional risk of adolescent endurance runners and underscores the importance of nutritional support efforts in this population.
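Energy availability as reported above is energy intake minus exercise energy expenditure, normalized to fat-free mass (FFM). A minimal sketch using the thresholds from the abstract (~45 kcal/kgFFM/day recommended, <30 low); the intake, expenditure, and FFM values in the example are hypothetical, chosen only to illustrate the arithmetic.

```python
# Energy availability (EA) in kcal per kg of fat-free mass (FFM) per day.
LOW_EA_THRESHOLD = 30.0   # kcal/kgFFM/day, "low EA" criterion from the abstract
RECOMMENDED_EA = 45.0     # kcal/kgFFM/day, recommended level from the abstract

def energy_availability(intake_kcal: float, exercise_kcal: float, ffm_kg: float) -> float:
    """EA = (dietary energy intake - exercise energy expenditure) / FFM."""
    return (intake_kcal - exercise_kcal) / ffm_kg

def is_low_ea(ea: float) -> bool:
    return ea < LOW_EA_THRESHOLD

# Hypothetical runner: 2000 kcal intake, 600 kcal exercise expenditure, 45 kg FFM.
ea = energy_availability(2000, 600, 45)  # ~31.1 kcal/kgFFM/day: above "low", below recommended
```

By this arithmetic, the female mean of 29.6 kcal/kgFFM/day reported in the abstract falls just under the low-EA threshold.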


Subject(s)
Energy Intake , Nutritional Status , Adolescent , Eating , Female , Humans , Male , Nutritional Requirements , Vegetables , Vitamins
10.
PM R ; 14(7): 793-801, 2022 07.
Article in English | MEDLINE | ID: mdl-34053194

ABSTRACT

BACKGROUND: Understanding the prevalence and factors associated with running-related injuries in middle school runners may guide injury prevention. OBJECTIVE: To determine the prevalence of running-related injuries and describe factors related to a history of injury. DESIGN: Retrospective cross-sectional study. SETTING: Survey distributed online to middle school runners. METHODS: Participants completed a web-based survey regarding prior running-related injuries, training, sleep, diet, and sport participation. MAIN OUTCOME MEASUREMENTS: Prevalence and characteristics differentiating girls and boys with and without running-related injury history, adjusted for age. PARTICIPANTS: Youth runners (total: 2113; average age: 13.2 years; boys: n = 1255, girls: n = 858). RESULTS: Running-related injuries were more prevalent in girls (56% vs. 50%, p = .01). Ankle sprain was the most common injury (girls: 22.5%, boys: 21.6%), followed by patellofemoral pain (20.4% vs. 7.8%) and shin splints (13.6% vs. 5.9%); both were more prevalent in girls (p < .001). Boys more frequently reported plantar fasciitis (5.6% vs. 3.3%, p = .01), iliotibial band syndrome (4.1% vs. 1.4%, p = .001), and Osgood-Schlatter disease (3.8% vs. 1.2%, p = .001). Runners with a history of running-related injuries were older, ran greater average weekly mileage, ran faster, had fewer average hours of sleep on weekends, skipped more meals, missed breakfast, and consumed less milk (all p < .05). Girls with a history of running-related injuries reported higher dietary restraint scores, later age of menarche, more menstrual cycle disturbances, and a higher likelihood of following a vegetarian diet and of an eating disorder diagnosis (all p < .05). Runners with no history of running-related injuries were more likely to have participated in ≥2 years of soccer or basketball (p < .001). CONCLUSIONS: Most middle school runners reported a history of running-related injuries, with certain injuries differing by gender.
Modifiable factors with the greatest association with running-related injuries included training volume, dietary restraint, skipping meals, and less sleep. Sport sampling, including participation in ball sports, may reduce running-related injury risk in this population.


Subject(s)
Athletic Injuries , Iliotibial Band Syndrome , Adolescent , Athletic Injuries/epidemiology , Cross-Sectional Studies , Female , Humans , Male , Prevalence , Retrospective Studies , Schools
11.
PM R ; 14(9): 1056-1067, 2022 09.
Article in English | MEDLINE | ID: mdl-34251763

ABSTRACT

BACKGROUND: Bone stress injury (BSI) in youth runners is clinically important during times of skeletal growth and is not well studied. OBJECTIVE: To evaluate the prevalence, anatomical distribution, and factors associated with running-related BSI in boy and girl middle school runners. DESIGN: Retrospective cross-sectional study. SETTING: Online survey distributed to middle school runners. METHODS: Survey evaluated BSI history, age, grade, height, weight, eating behaviors, menstrual function, exercise training, and other health characteristics. MAIN OUTCOME MEASUREMENTS: Prevalence and characteristics associated with history of BSI, stratified by cortical-rich (eg, tibia) and trabecular-rich (eg, pelvis and femoral neck) locations. PARTICIPANTS: 2107 runners (n = 1250 boys, n = 857 girls), age 13.2 ± 0.9 years. RESULTS: One hundred five (4.7%) runners reported a history of 132 BSIs, with a higher prevalence in girls than boys (6.7% vs 3.8%, p = .004). The most common location was the tibia (n = 51). Most trabecular-rich BSIs (n = 16; 94% of the total) were sustained by girls (pelvis: n = 6; femoral neck: n = 6; sacrum: n = 4). In girls, consuming <3 daily meals (odds ratio [OR] = 18.5, 95% confidence interval [CI] = 7.3, 47.4), eating disorder (OR = 9.8, 95% CI = 2.0, 47.0), family history of osteoporosis (OR = 6.9, 95% CI = 2.6, 18.0), and age (OR = 1.6, 95% CI = 1.0, 2.6) were associated with BSI. In boys, family history of osteoporosis (OR = 3.2, 95% CI = 1.2, 8.4), prior non-BSI fracture (OR = 3.2, 95% CI = 1.6, 6.7), and running mileage (OR = 1.1, 95% CI = 1.0, 1.1) were associated with BSI. Participating in soccer or basketball for ≥2 years was associated with lower odds of BSI for both sexes. CONCLUSION: Whereas family history of osteoporosis and prior fracture (non-BSI) were most strongly related to BSI in the youth runners, behaviors contributing to an energy deficit, such as eating disorder and consuming <3 meals daily, also emerged as independent factors associated with BSI.
Although cross-sectional design limits determining causality, our findings suggest promoting optimal skeletal health through nutrition and participation in other sports including soccer and basketball may address factors associated with BSI in this population.
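Odds ratios with 95% confidence intervals, as reported throughout the abstract above, can be computed from a 2×2 exposure-by-outcome table via the standard log-odds (Wald) formula. The cell counts in the example are hypothetical, not taken from the study.

```python
import math

# Odds ratio and Wald 95% CI from a 2x2 table:
#              exposed  unexposed
# injured         a         b
# uninjured       c         d
def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    or_ = (a * d) / (b * c)                     # cross-product odds ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # standard error of ln(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for illustration only.
or_, lower, upper = odds_ratio_ci(10, 20, 15, 90)  # OR = 3.0
```

A CI whose lower bound exceeds 1.0 indicates a statistically significant positive association, which is why the abstract's OR of 1.6 with CI 1.0-2.6 sits at the margin of significance.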


Subject(s)
Osteoporosis , Running , Adolescent , Bone Density , Child , Cross-Sectional Studies , Female , Humans , Male , Prevalence , Retrospective Studies , Running/injuries , Schools
12.
Int J Sport Nutr Exerc Metab ; 31(4): 337-344, 2021 07 01.
Article in English | MEDLINE | ID: mdl-34098530

ABSTRACT

This prospective study evaluated the 3-year change in menstrual function and bone mass among 40 female adolescent endurance runners (age 15.9 ± 1.0 years) according to baseline disordered eating status. Three years after initial data collection, runners underwent follow-up measures including the Eating Disorder Examination Questionnaire and a survey evaluating menstrual function, running training, injury history, and prior sports participation. Dual-energy X-ray absorptiometry was used to measure bone mineral density and body composition. Runners with a weight concern, shape concern, or global score ≥4.0 or reporting >1 pathologic behavior in the past 28 days were classified with disordered eating. Compared with runners with normal Eating Disorder Examination Questionnaire scores at baseline, runners with disordered eating at baseline reported fewer menstrual cycles/year (6.4 ± 4.5 vs. 10.5 ± 2.8, p = .005), more years of amenorrhea (1.6 ± 1.4 vs. 0.3 ± 0.5, p = .03), and a higher proportion of menstrual irregularity (75.0% vs. 31.3%, p = .02) and failed to increase lumbar spine or total hip bone mineral density at the 3-year follow-up. In a multivariate model including body mass index and menstrual cycles in the past year at baseline, baseline shape concern score (B = -0.57, p value = .001) was inversely related to the annual number of menstrual cycles between assessments. Weight concern score (B = -0.40, p value = .005) was inversely associated with lumbar spine bone mineral density Z-score change between assessments according to a multivariate model adjusting for age and body mass index. These findings support associations between disordered eating at baseline and future menstrual irregularities or reduced accrual of lumbar spine bone mass in female adolescent endurance runners.


Subject(s)
Feeding and Eating Disorders/complications , Female Athlete Triad Syndrome/etiology , Physical Endurance/physiology , Running/physiology , Absorptiometry, Photon , Adolescent , Body Composition , Body Weight , Bone Density , Feeding and Eating Disorders/diagnosis , Female , Female Athlete Triad Syndrome/diagnosis , Female Athlete Triad Syndrome/psychology , Follow-Up Studies , Hip/physiology , Humans , Lumbar Vertebrae/physiology , Prospective Studies , Running/psychology , Sports Nutritional Physiological Phenomena , Time Factors
13.
Clin J Sport Med ; 31(4): 335-348, 2021 07 01.
Article in English | MEDLINE | ID: mdl-34091537

ABSTRACT

ABSTRACT: The Male Athlete Triad is a syndrome of 3 interrelated conditions most common in adolescent and young adult male endurance and weight-class athletes and includes the clinically relevant outcomes of (1) energy deficiency/low energy availability (EA) with or without disordered eating/eating disorders, (2) functional hypothalamic hypogonadism, and (3) osteoporosis or low bone mineral density with or without bone stress injury (BSI). The causal role of low EA in the modulation of reproductive function and skeletal health in the male athlete reinforces the notion that skeletal health and reproductive outcomes are the primary clinical concerns. At present, the specific intermediate subclinical outcomes are less clearly defined in male athletes than those in female athletes and are represented as subtle alterations in the hypothalamic-pituitary-gonadal axis and increased risk for BSI. The degree of energy deficiency/low EA associated with such alterations remains unclear. However, available data suggest a more severe energy deficiency/low EA state is needed to affect reproductive and skeletal health in the Male Athlete Triad than in the Female Athlete Triad. Additional research is needed to further clarify and quantify this association. The Female and Male Athlete Triad Coalition Consensus Statements include evidence statements developed after a roundtable of experts held in conjunction with the American College of Sports Medicine 64th Annual Meeting in Denver, Colorado, in 2017 and are in 2 parts: Part 1: Definition and Scientific Basis; and Part 2: The Male Athlete Triad: Diagnosis, Treatment, and Return-to-Play. In this first article, we discuss the scientific evidence to support the Male Athlete Triad model.


Subject(s)
Relative Energy Deficiency in Sport/diagnosis , Sports Medicine , Adolescent , Athletes , Bone Density , Consensus , Humans , Male , Sports , Young Adult
14.
Clin J Sport Med ; 31(4): 349-366, 2021 Jul 01.
Article in English | MEDLINE | ID: mdl-34091538

ABSTRACT

ABSTRACT: The Male Athlete Triad is a medical syndrome most common in adolescent and young adult male athletes in sports that emphasize a lean physique, especially endurance and weight-class athletes. The 3 interrelated conditions of the Male Athlete Triad occur on spectrums of energy deficiency/low energy availability (EA), suppression of the hypothalamic-pituitary-gonadal axis, and impaired bone health, ranging from optimal health to clinically relevant outcomes of energy deficiency/low EA with or without disordered eating or eating disorder, functional hypogonadotropic hypogonadism, and osteoporosis or low bone mineral density with or without bone stress injury (BSI). Because of the importance of bone mass acquisition and health concerns in adolescence, screening is recommended during this time period in the at-risk male athlete. Diagnosis of the Male Athlete Triad is best accomplished by a multidisciplinary medical team. Clearance and return-to-play guidelines are recommended to optimize prevention and treatment. Evidence-based risk assessment protocols for the male athlete at risk for the Male Athlete Triad have been shown to be predictive for BSI and impaired bone health and should be encouraged. Improving energetic status through optimal fueling is the mainstay of treatment. A Roundtable on the Male Athlete Triad was convened by the Female and Male Athlete Triad Coalition in conjunction with the 64th Annual Meeting of the American College of Sports Medicine in Denver, Colorado, in May of 2017. In this second article, the latest clinical research to support current models of screening, diagnosis, and management for the at-risk male athlete is reviewed with evidence-based recommendations.


Subject(s)
Relative Energy Deficiency in Sport/diagnosis , Return to Sport , Adolescent , Athletes , Bone Density , Consensus , Humans , Male , Relative Energy Deficiency in Sport/therapy , Young Adult
15.
Eat Behav ; 40: 101460, 2021 01.
Article in English | MEDLINE | ID: mdl-33307469

ABSTRACT

This cross-sectional study investigated associations between cognitive dietary restraint (CDR), energy, macronutrient and food group intake, menstrual function, and bone density in female adolescent endurance runners. Participants were forty female adolescent endurance runners. The independent variable was CDR, as assessed by the Three Factor Eating Questionnaire (TFEQ). Runners with CDR subscale scores ≥11 were classified with elevated CDR. The main outcomes measured were dietary intake measured by 24-hour recall for 7 days, menstrual history, and bone mineral density (BMD). Twelve of 40 participants (30.0%) met criteria for elevated CDR. Compared to runners with normal CDR, runners with elevated CDR scores reported consuming lower energy (kcal/kg/day) (37.5 ± 8.6 vs. 44.0 ± 9.6, p = 0.052), lower carbohydrate (g/kg/day) (5.3 ± 1.3 vs. 6.3 ± 1.3, p = 0.042), more fiber (g/day) (24.9 ± 6.7 vs. 20.0 ± 5.3, p = 0.018), more servings of fruit (3.3 ± 1.4 vs. 1.9 ± 1.2, p = 0.003), more servings of vegetables (2.7 ± 1.4 vs. 1.7 ± 0.7, p = 0.004), and fewer servings of grain (7.6 ± 2.4 vs. 9.8 ± 2.4, p = 0.009) per day. Runners with elevated CDR exhibited significantly lower lumbar spine BMD Z-scores (adjusting for BMI) (-0.78 ± 0.19 vs. -0.22 ± 0.12, p = 0.016) than runners with normal CDR. Menstrual history did not significantly differ based on CDR status. Elevated CDR may increase the risk of dietary patterns that provide inadequate energy and key nutrients, and of developing low BMD, in endurance runners. Trial Registration: ClinicalTrials.gov Identifier: NCT01059968.


Subject(s)
Running , Adolescent , Bone Density , Carbohydrates , Cognition , Cross-Sectional Studies , Energy Intake , Female , Humans
16.
J Athl Train ; 55(12): 1239-1246, 2020 Dec 01.
Article in English | MEDLINE | ID: mdl-33176358

ABSTRACT

CONTEXT: Sport specialization may contribute to sport injury and menstrual dysfunction in female high school distance runners. Despite the recent growth in sport specialization, including among high school-aged runners, the association of sport specialization with bone mineral density (BMD) remains poorly described. OBJECTIVE: To evaluate whether sport specialization was associated with BMD in female high school distance runners. DESIGN: Cross-sectional study. SETTING: Six high schools. PATIENTS OR OTHER PARTICIPANTS: Sixty-four female runners (age = 15.6 ± 1.4 years) who competed in cross-country or track distance events and were not currently on birth control medication. MAIN OUTCOME MEASURE(S): Each runner completed a survey on menstrual history and sport participation. Height and weight were measured, and dual-energy x-ray absorptiometry was used to measure whole-body, spine, and hip BMD. Each runner was assigned a sport specialization status: low (participation in ≥1 nonrunning sport and distance-running sport(s) for ≤8 mo/y); moderate (participation in both distance-running sport(s) ≥9 mo/y and ≥1 nonrunning sport(s), or limited to distance-running sport(s) for ≤8 mo/y); or high (participation only in distance-running sport(s) for ≥9 mo/y). Multivariable logistic regression was performed to determine the adjusted odds ratio and 95% confidence interval relating sport specialization to BMD, adjusting for body mass index and gynecological age. RESULTS: Overall, 21.9%, 37.5%, and 40.6% of participants were high, moderate, or low sport specializers, respectively. Low BMD (spine or whole-body BMD z score < -1.0 [standardized by age and sex normative values]) was present in 23 (35.9%) runners. Compared with low sport specializers, high sport specializers were 5 times more likely (adjusted odds ratio = 5.42, 95% confidence interval = 1.3, 23.3; P = .02) to have low BMD.
CONCLUSIONS: A high level of sport specialization in high school female distance runners may be associated with a heightened risk for low BMD. Further investigation of this association is warranted due to the health concerns about low BMD in adolescent female runners.


Subject(s)
Bone Diseases, Metabolic/epidemiology , Running , Absorptiometry, Photon , Adolescent , Body Mass Index , Body Weight , Bone Density , Cross-Sectional Studies , Female , Humans , Menstruation , Schools
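The adjusted odds ratio and its wide confidence interval (5.42; 1.3, 23.3) reported above come from a multivariable logistic regression, but the unadjusted analogue can be computed directly from a 2x2 table. A minimal sketch follows; the cell counts are hypothetical (the abstract does not report them), chosen only to show how an OR near 5 with a wide Woolf-type interval arises from small cells:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-scale) 95% confidence interval
    for a 2x2 table: a/b = exposed with/without the outcome,
    c/d = unexposed with/without the outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: high specializers with/without low BMD (7, 7),
# low specializers with/without low BMD (4, 22).
or_, lo, hi = odds_ratio_ci(7, 7, 4, 22)
print(f"OR = {or_:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

With small cell counts like these, the log-scale standard error is large, which is why such studies report intervals spanning an order of magnitude.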
17.
J Nutr Educ Behav ; 52(9): 867-873, 2020 09.
Article in English | MEDLINE | ID: mdl-32059834

ABSTRACT

OBJECTIVE: To evaluate dietary supplement information needs among collegiate athletes. METHODS: Three hundred seven (n = 154 male; n = 153 female) student athletes on National Collegiate Athletic Association Division I teams completed a dietary supplement survey. Qualitative coding addressed open-ended responses, and the chi-square test of independence explored differences among athlete subgroups. RESULTS: Five themes representing athletes' information needs included quality/composition (53.5%; n = 77), general information (31.9%; n = 46), nutrition information (30.6%; n = 44), performance (18.8%; n = 27), and body composition (13.2%; n = 19). Athletes with "no" or "minimal" (n = 63), vs "moderate" or "strong" (n = 195), perceived knowledge of supplement safety were more likely to list a question about supplement quality or composition (34.9% [n = 22/63] vs 21.5% [n = 42/195]; P = .03; chi-square = 4.6). CONCLUSIONS AND IMPLICATIONS: Further research is needed to corroborate these findings to inform educational efforts and promote safe and effective use of dietary supplements by student athletes.


Subject(s)
Athletes/statistics & numerical data , Dietary Supplements , Sports Nutritional Sciences , Adolescent , Adult , Diet Surveys , Dietary Supplements/standards , Dietary Supplements/statistics & numerical data , Female , Humans , Male , Young Adult
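The chi-square statistic reported above (4.6, P = .03) can be reproduced from the counts given in the abstract (22/63 vs 42/195). A minimal sketch, computing the test of independence by hand for the implied 2x2 table:

```python
import math

# Observed 2x2 table reconstructed from the reported proportions:
# rows = perceived knowledge (none/minimal n=63, moderate/strong n=195),
# cols = listed a quality/composition question (yes, no).
observed = [[22, 63 - 22], [42, 195 - 42]]

row = [sum(r) for r in observed]
col = [sum(c) for c in zip(*observed)]
n = sum(row)

# Pearson chi-square: sum of (O - E)^2 / E over all four cells,
# with E = row total * column total / grand total.
chi2 = sum(
    (observed[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
    for i in range(2) for j in range(2)
)
# For 1 degree of freedom, the upper-tail p-value reduces to erfc.
p = math.erfc(math.sqrt(chi2 / 2))
print(f"chi-square = {chi2:.1f}, p = {p:.3f}")
```

Rounded to one decimal, the statistic matches the abstract's chi-square = 4.6 with a p-value near .03.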
18.
J Am Coll Nutr ; 39(7): 619-627, 2020.
Article in English | MEDLINE | ID: mdl-31935156

ABSTRACT

Background: Supplements may expose athletes to dangerous ingredients, banned substances, toxins, or contaminants; however, few investigations assess use among collegiate athletes in the U.S. Objective: This cross-sectional study evaluated habitual dietary supplement intake, defined as use ≥2 days/week over the past year, in NCAA Division I athletes. Methods: Male and female members of NCAA Division I teams at two universities in southern California completed a 13-item survey. Among 705 eligible participants, 596 submitted surveys (84.5% response rate), of which 557 included complete data. Chi-square (χ2) analyses evaluated differences among athletes based on sex, weight status, year in college, and sport-type. Independent t-tests or ANOVA evaluated mean differences for continuous variables. Results: A total of 45.2% of athletes (n = 252) reported taking supplements (≥2 days/week over the past year). Vitamin/minerals (25.5%, n = 142) and protein/amino acids (24.6%, n = 137) were used most frequently. Male, vs female, athletes took more supplements overall (1.2 ± 0.1 vs 0.8 ± 0.1, p = 0.004) and indicated higher use of protein/amino acid products (34.2% vs 13.5%, p < 0.005), whereas females reported higher use of vitamin/minerals (30.5% vs 21.1%, p < 0.05). Higher supplement use was also reported by athletes with BMI ≥ 30.0 kg/m2 (vs <30 kg/m2, 1.9 ± 0.3 vs 1.0 ± 0.1, p = 0.02) and by athletes in their third college year or beyond (vs first or second year, 1.2 ± 0.1 vs 0.9 ± 0.1, p = 0.03). Conclusions: Nearly half of NCAA athletes reported habitual supplement use, with significant variation in patterns based on sex, sport-type, year in college, and weight status.


Subject(s)
Athletes , Dietary Supplements , Cross-Sectional Studies , Female , Humans , Male , Universities , Vitamins
19.
J Am Coll Nutr ; 38(2): 141-148, 2019 02.
Article in English | MEDLINE | ID: mdl-30247991

ABSTRACT

OBJECTIVE: A cross-sectional study of first-year college students was conducted to identify the prevalence and predictors of disordered eating (DE). METHODS: College freshmen, aged ≥18 years (n = 106), completed the Eating Disorder Examination Questionnaire (EDE-Q) and a supplemental survey. A subset of the sample (n = 77) underwent measurements of height, weight, and body composition. DE was defined as an elevated (≥3) weight, shape, eating concern, or dietary restraint EDE-Q subscale score. RESULTS: The sample, consisting of 56.6%, 15.1%, 11.3%, and 11.3% Latino/a, Asian, African American, and Caucasian students, respectively (37% male), reported a 31.1% prevalence of DE. A current desire to lose weight was the strongest predictor of DE (odds ratio = 15.3; 95% confidence interval = 2.8, 82.5). Other variables linked to DE or elevated EDE-Q subscale scores included body mass index (BMI) ≥25.0 kg/m2, vegetarianism, weight loss in the past year, female gender, and eating breakfast <5 d/wk. Participants with BMI ≥25.0 kg/m2 and a current desire to lose weight (n = 23) or following a vegetarian diet (n = 5) exhibited the highest prevalence of DE (78.3% and 80.0%, respectively). A higher proportion of Latinas reported binge episodes compared with female Caucasian, Asian, and African American students (36.4% vs 0.0%, 6.7%, and 28.6%, respectively; p = 0.056, χ2 = 7.6). Males, versus females, were more likely to report excessive exercise (56.4% vs 37.3%; p = 0.056, χ2 = 3.6). CONCLUSIONS: This study adds to the current body of literature on DE by providing a diverse sample and potentially novel predictors and risk factors for DE.


Subject(s)
Feeding and Eating Disorders/epidemiology , Students/statistics & numerical data , Adolescent , Cross-Sectional Studies , Diet Surveys , Feeding Behavior/psychology , Feeding and Eating Disorders/etiology , Feeding and Eating Disorders/psychology , Female , Humans , Male , Prevalence , Risk Factors , Students/psychology , Surveys and Questionnaires , Universities , Young Adult
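The abstract above reports a 31.1% prevalence of disordered eating in n = 106 students without an interval estimate. A minimal sketch, assuming the implied case count of 33 (33/106 ≈ 31.1%), computes a Wilson score 95% confidence interval for that prevalence:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion
    of k successes in n trials."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
    return p, center - half, center + half

# 31.1% prevalence in n = 106 implies about k = 33 cases (assumption).
p, lo, hi = wilson_ci(33, 106)
print(f"prevalence = {p:.1%}, 95% CI = ({lo:.1%}, {hi:.1%})")
```

The Wilson interval is preferred over the simple normal approximation at sample sizes like this because it never extends below 0% or above 100% and has better coverage near the tails.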
20.
J Strength Cond Res ; 33(2): 443-450, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30531412

ABSTRACT

Sassone, J, Muster, M, and Barrack, MT. Prevalence and predictors of higher-risk supplement use among National Collegiate Athletic Association Division I athletes. J Strength Cond Res 33(2): 443-450, 2019-This study aimed to identify the prevalence and predictors associated with the use of higher-risk dietary supplements, defined as supplements containing herbal ingredients or caffeine, or those classified for weight loss, muscle-building, or as a preworkout supplement, among 557 National Collegiate Athletic Association Division I male and female collegiate athletes. Although 252 (45.2%) athletes reported the use of a dietary supplement on ≥2 days per week over the past year, 46 (8.3%) athletes met criteria for higher-risk supplement use. Twenty (3.6%) athletes reported the use of herbal, 1 (0.2%) caffeinated, 5 (0.9%) weight loss, 28 (5.0%) preworkout, and 1 (0.2%) muscle-building supplements. Body mass index status (BMI ≥30 kg·m-2), sport-type (sports using the phosphocreatine energy system), and college year (≥4th year) were associated with the use of preworkout, muscle-building, or herbal supplements. A multiple regression analysis identified predictors of higher-risk supplement use, including the number of dietary supplements used in the past year (odds ratio [OR] = 2.1, 95% confidence interval [CI] = 1.7-2.7, p < 0.001), the reported motivation of taking dietary supplements to gain muscle and lose body fat (OR = 3.5, 95% CI = 1.1-11.7, p = 0.04), and the motivation to increase athletic endurance (OR = 4.0, 95% CI = 1.6-9.9, p < 0.005). These factors may be considered as part of a screening process to identify athletes at increased risk of higher-risk supplement use and its potential consequences for health or eligibility status.


Subject(s)
Athletes/statistics & numerical data , Dietary Supplements/statistics & numerical data , Anti-Obesity Agents/administration & dosage , Athletes/psychology , Body Mass Index , Caffeine/administration & dosage , Cross-Sectional Studies , Female , Humans , Male , Motivation , Plant Preparations/administration & dosage , Prevalence , Universities , Young Adult