Results 1 - 20 of 26
1.
Clin J Sport Med ; 34(1): 44-51, 2024 Jan 01.
Article in English | MEDLINE | ID: mdl-36853903

ABSTRACT

OBJECTIVE: To describe the presentation and management of lumbar bone stress injuries (LBSI), recurrent LBSI, and lumbar nonunited defects in elite Australian male and female cricket players. DESIGN: Retrospective case series. SETTING: Professional domestic and international cricket teams over 13 seasons. PARTICIPANTS: Elite Australian cricket players. INDEPENDENT VARIABLES: Symptomatic LBSI requiring time off cricket and lumbar nonunited defects, both confirmed by imaging. MAIN OUTCOME MEASURES: Incidence, presentation, history, healing, and management. RESULTS: A total of 211 LBSI were identified, at an average incidence of 5.4 per 100 players per season. LBSI were most common in male pace bowlers younger than 20 years of age (58.1 per 100 players per season) but were also observed in older players, females, and non-pace bowlers. Recurrent LBSI accounted for 33% (27%-40%) of all LBSI. Median time to return to match availability was 182 (128-251) days for all LBSI, with shorter time frames observed for new and less severe injuries and for male spin bowlers. Healing was demonstrated in 87% (81%-91%) of all LBSI cases. Twenty-nine nonunited defects were identified and predisposed players to subsequent pain, LBSI, and spondylolisthesis. CONCLUSIONS: LBSI are experienced by approximately 5.4 in every 100 elite Australian cricket players per season, with a high time cost of approximately 4 to 8 months. Nonunited defects also carry a high time cost, with associated subsequent lumbar spine issues. The findings of this study reinforce the importance of early detection and conservative management of LBSI, particularly for younger male pace bowlers and players with recurrent LBSI, which may be supported by MRI.
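For readers unfamiliar with the rate reported above, the following minimal Python sketch shows how an incidence expressed as injuries per 100 players per season is typically derived from aggregate counts. The player-season denominator used here is hypothetical, not taken from the study.

```python
# Minimal sketch (not the study's data): how an incidence of
# "injuries per 100 players per season" is typically derived.

def incidence_per_100_player_seasons(n_injuries: int, player_seasons: int) -> float:
    """Injuries per 100 player-seasons (players at risk x seasons observed)."""
    return 100 * n_injuries / player_seasons

# Hypothetical example: 211 injuries over roughly 3900 player-seasons
# gives an incidence close to the reported 5.4.
print(round(incidence_per_100_player_seasons(211, 3900), 1))  # 5.4
```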


Subject(s)
Athletic Injuries , Back Injuries , Cricket Sport , Humans , Male , Female , Aged , Athletic Injuries/diagnostic imaging , Athletic Injuries/epidemiology , Athletic Injuries/therapy , Retrospective Studies , Australia/epidemiology
2.
Indian J Orthop ; 57(10): 1584-1591, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37766950

ABSTRACT

Objectives: To describe traumatic head and neck injuries in elite Australian cricket players, for the purposes of understanding risk and the role of protective equipment and regulations. Design: Retrospective cohort study. Methods: This study reviewed twelve seasons of clinical data for elite male and female cricket players who sustained a traumatic head or neck injury (excluding isolated concussion) whilst participating in a cricket match or training. Results: A total of 199 head and neck injury events were recorded over the 12 seasons, equating to an average incidence of 5.6 per 100 players per season. Since the introduction of helmet regulations in 2016, the average incidence was 7.3 per 100 players per season. Including concurrent injuries, 232 injuries were recorded; contusions were the most common type of injury (41%, 35-48%) and the face was the most common location (63%, 57-69%). Injuries resulted in the player being unavailable for cricket for one or more days in 15% (11-22%) of events. Since the introduction of cricket helmet regulations, the proportion of injuries sustained while batting decreased from 54% (43-65%) to 38% (30-47%) (p = 0.026), and the proportion of injuries sustained while wicket keeping decreased from 19% (11-29%) to 6% (3-11%) (p = 0.004). Conclusion: Traumatic head and neck injuries occur at an incidence of approximately 7.3 per 100 players per season in elite Australian male and female cricket players. Whilst most injuries cause a low burden with respect to days unavailable, the risk of potentially serious or catastrophic consequences warrants further risk reduction strategies, including tightening of the existing industry standard for helmets and governing body regulations.
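The before/after proportions above (54% vs 38% of injuries sustained while batting, p = 0.026) can be compared with a two-proportion test. The sketch below is illustrative only: the counts are assumed, and the abstract does not state which statistical test the authors actually used.

```python
# Illustrative sketch only: compare the proportion of batting injuries
# before vs after the 2016 helmet regulations with a two-proportion z-test.
# The counts below are hypothetical, not the study's data.
from statsmodels.stats.proportion import proportions_ztest

count = [43, 58]   # batting injuries before / after the regulations (assumed)
nobs = [80, 152]   # all injury events in each period (assumed)

stat, p_value = proportions_ztest(count, nobs)
print(f"proportions: {count[0]/nobs[0]:.2f} vs {count[1]/nobs[1]:.2f}, p = {p_value:.3f}")
```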

3.
PLoS One ; 18(6): e0282040, 2023.
Article in English | MEDLINE | ID: mdl-37390108

ABSTRACT

Australia's headspace initiative is world-leading in nation-wide youth mental healthcare reform for young people aged 12 to 25 years, now with 16 years of implementation. This paper examines changes in the key outcomes of psychological distress, psychosocial functioning, and quality of life for young people accessing headspace centres across Australia for mental health problems. Routinely collected data from headspace clients commencing an episode of care within the data collection period, 1 April 2019 to 30 March 2020, and at 90-day follow-up were analysed. Participants came from the 108 fully-established headspace centres across Australia and comprised 58,233 young people aged 12-25 years first accessing headspace centres for mental health problems during the data collection period. Main outcome measures were self-reported psychological distress and quality of life, and clinician-reported social and occupational functioning. Most headspace mental health clients presented with depression and anxiety issues (75.21%). Overall, 35.27% had a diagnosis: 21.74% were diagnosed with anxiety, 18.51% with depression, and 8.60% were sub-syndromal. Younger males were more likely to present for anger issues. Cognitive behavioural therapy was the most common treatment. There were significant improvements in all outcome scores over time (P < 0.001). From presentation to last service rating, over one-third had significant improvements in psychological distress and a similar proportion in psychosocial functioning; just under half improved in self-reported quality of life. Significant improvement on at least one of the three outcomes was shown for 70.96% of headspace mental health clients. After 16 years of headspace implementation, positive outcomes are being achieved, particularly when multi-dimensional outcomes are considered. A suite of outcomes that captures meaningful change in young people's quality of life, distress, and functioning is critical for early-intervention primary care settings with diverse client presentations, such as the headspace youth mental healthcare initiative.


Subject(s)
Mental Health Services , Quality of Life , Male , Humans , Adolescent , Mental Health , Health Care Reform , Australia/epidemiology
4.
J Sci Med Sport ; 26(1): 19-24, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36522249

ABSTRACT

OBJECTIVES: This study presents seven seasons of injury surveillance data for elite Australian male and female cricket players, revealing injury statistics and allowing for comparison between sexes. DESIGN: Retrospective cohort. METHODS: Participants were elite Australian male and female cricket players who were contracted to play for a national and/or state/territory team and/or T20 franchise between 2015-16 and 2021-22 (7 seasons). Injury data were recorded in Cricket Australia's Athlete Management System database and combined with match data. The STROBE-SIIS statement was used as the relevant guideline for this study. RESULTS: Data for 1345 male player seasons and 959 female player seasons revealed sex-related differences in injury incidence rates and prevalence. Males had a higher incidence (average 136 vs 101 injuries per 1000 match days) and prevalence of match time-loss injuries (average 10.4% vs 6.5% of players unavailable). However, the overall incidence of all medical attention injuries was similar between sexes (incidence rate ratio (IRR) 0.9, 95% CI 0.8-1.0). The most frequent match time-loss injuries for males were hamstring strains (7.4 new injuries per 100 players per season), side and abdominal strains (5.5), concussion (5.0), lumbar stress fractures (4.3), and wrist and hand fractures (3.9). The most frequent match time-loss injuries for females over the 7 seasons were hamstring strains (3.1), quadriceps strains (2.4), concussion (2.3), and shin/foot/ankle stress fractures (2.0). The IRR of medical attention injuries for males compared with females was higher for lumbosacral stress fractures (IRR 2.3), elbow and forearm injuries (1.5), and concussion (1.4), and lower for lower leg, foot, and ankle stress fractures (0.6), shoulder and upper arm injuries (0.7), and quadriceps strains (0.6). CONCLUSIONS: Robust long-term injury surveillance enabled the injury profiles of elite Australian male and female cricket players to be understood and compared. Males had a higher incidence and prevalence of match time-loss injuries, likely reflecting a higher match exposure.
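The incidence rate ratios (IRRs) quoted above compare male and female injury rates. As a minimal sketch, assuming simple event counts and exposure denominators (both hypothetical here), an IRR and a Wald 95% CI on the log scale can be computed as follows.

```python
# Minimal sketch (hypothetical counts): incidence rate ratio (IRR) with a
# log-scale Wald 95% CI, as used for male-vs-female comparisons above.
# Exposure is taken here as match days at risk; the study's exact
# denominators are not shown in the abstract.
import math

def irr_with_ci(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """IRR of group A relative to group B with a log-scale Wald CI."""
    irr = (events_a / exposure_a) / (events_b / exposure_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical example: 120 male injuries over 900 match days vs
# 70 female injuries over 480 match days.
print(irr_with_ci(120, 900, 70, 480))
```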


Subject(s)
Athletic Injuries , Brain Concussion , Fractures, Stress , Humans , Male , Female , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Retrospective Studies , Australia/epidemiology , Brain Concussion/epidemiology , Brain Concussion/complications , Incidence
5.
Clin J Sport Med ; 32(2): e121-e125, 2022 03 01.
Article in English | MEDLINE | ID: mdl-33239511

ABSTRACT

OBJECTIVE: To describe the proportion of upper lumbar bone stress injuries (LBSI; T12-L3) relative to all LBSI, and the clinical presentation and diagnosis of upper LBSI in elite cricketers. DESIGN: Case series. SETTING: Professional domestic and international cricket teams over a 9-year period. PARTICIPANTS: Elite Australian cricketers. INDEPENDENT VARIABLES: Symptomatic upper LBSI diagnosed based on clinical findings and medical imaging. MAIN OUTCOME MEASURES: Prevalence, injury history, and clinical management. RESULTS: Twenty-four pace bowlers (22 male and 2 female) sustained 39 cases of upper LBSI (T12: 2, L1: 3, L2: 20, L3: 14). Upper lumbar vertebrae were involved in 41% (95% CI 31-51) of all LBSI in this cohort. Twenty-seven (69%, 54-81) cases involved an injury that occurred only on the side contralateral to the bowling arm. Ipsilateral injuries tended to occur secondary to a contralateral nonunited defect. In all 7 cases with known radiology follow-up that had a contralateral then ipsilateral LBSI, the contralateral injury did not achieve bony union before the onset of the ipsilateral LBSI. For stress fractures with imaging follow-up, those that achieved bony union took longer to return to bowling training [median 152 days (IQR 117-188)] than those that achieved partial or no union [median 68 days (IQR 46-115)]. CONCLUSIONS: Upper LBSI in elite cricketers account for approximately 2 out of 5 cases of LBSI. Clinicians should allow sufficient time for upper LBSI to resolve and unite (if a fracture), because cases that returned to bowling training earlier were less likely to achieve bony union, and those that failed to unite commonly went on to have a recurrent LBSI. LEVEL OF EVIDENCE: Therapy/prognosis/diagnosis level 2b.


Subject(s)
Athletic Injuries , Back Injuries , Fractures, Stress , Sports , Athletic Injuries/diagnostic imaging , Athletic Injuries/epidemiology , Athletic Injuries/etiology , Australia , Female , Fractures, Stress/diagnostic imaging , Fractures, Stress/epidemiology , Humans , Lumbar Vertebrae/diagnostic imaging , Lumbar Vertebrae/injuries , Male
6.
Front Psychol ; 12: 581914, 2021.
Article in English | MEDLINE | ID: mdl-33995169

ABSTRACT

Guilt and shame are self-conscious emotions with implications for mental health, social and occupational functioning, and the effectiveness of sports practice. To date, the assessment and role of athlete-specific guilt and shame have been under-researched. Reporting data from 174 junior elite cricketers (M = 17.34 years; females n = 85), the present study used exploratory factor analysis to validate the Athletic Perceptions of Performance Scale (APPS), assessing three distinct and statistically reliable factors: athletic shame-proneness, guilt-proneness, and no-concern. Conditional process analysis indicated that APPS shame-proneness mediated the relationship between general and athlete-specific distress (p < 0.01), with this pathway non-contingent on sex or past 12-month help-seeking for mental health concerns (ps > 0.05). While the APPS domains of guilt-proneness and no-concern were not significant mediators, they exhibited correlations in the expected direction with indices of psychological distress and well-being. The APPS may help coaches and support staff identify players who may benefit from targeted interventions to reduce the likelihood of experiencing shame-prone states.

7.
BMJ Open Sport Exerc Med ; 7(2): e001061, 2021.
Article in English | MEDLINE | ID: mdl-33981449

ABSTRACT

OBJECTIVE: The diagnosis of sport-related concussion is a challenge for practitioners given the variable presentation and lack of a universal clinical indicator. The aim of this study was to describe the CogSport findings associated with concussion in elite Australian cricket players, and to evaluate the diagnostic ability of CogSport in this cohort. METHODS: A retrospective study design was used to evaluate the CogSport performance of 45 concussed (male n=27, mean age 24.5±4.5 years; female n=18, 23.5±3.5 years) compared with 45 matched non-concussed (male n=27, mean age 27.3±4.5 years; female n=18, 24.1±4.5 years) elite Australian cricket players who sustained a head impact during cricket-specific activity between July 2015 and December 2019. RESULTS: The median number of reported symptoms on the day of injury for concussed players was 7 out of 24, with a median symptom severity of 10 out of 120. CogSport performance deteriorated significantly in concussed cricket players on the Detection speed (p<0.001), Identification speed (p<0.001), One Back speed (p=0.001), and One Back accuracy (p=0.022) components. These components, whether considered independently or together, had good diagnostic utility. CONCLUSION: This study demonstrated good clinical utility of CogSport for identifying concussed cricket players, particularly the symptom measures and the Detection, Identification, and One Back components. CogSport may therefore be considered a useful tool to assist concussion diagnosis in this cohort, and clinicians may place greater weight on the components associated with concussion diagnosis.

8.
Bone ; 143: 115626, 2021 02.
Article in English | MEDLINE | ID: mdl-32891868

ABSTRACT

OBJECTIVES: 1) To quantify the intensity of bone marrow oedema (BMO) present in the lumbar vertebrae of asymptomatic elite adult fast bowlers; 2) to relate the intensity of BMO to bowling workload and lumbar bone stress injury (LBSI); and 3) to evaluate the utility of MRI screening to reduce the risk of LBSI. METHODS: Thirty-eight elite Australian fast bowlers (21.6 ± 3.7 years) completed 48 screening MRIs over 3 years. BMO intensity was quantified retrospectively on MRI. Standard practices for bowling workload monitoring and injury diagnosis were followed. RESULTS: Clinically significant BMO (signal intensity ratio ≥ 2.0) was observed in 22 (46%, 95% CI 31-61) screening MRIs. These bowlers had a total of 77 (IQR 45-115) days off between seasons, compared with 66 (IQR 41-94) days off for bowlers with a BMO intensity less than 2.0 (p = 0.510). Fifteen bowlers received a follow-up MRI as part of individualised management based on their screening MRI, of whom fewer than five went on to develop LBSI in the subsequent season. There was no difference in days or balls bowled in the 12 months following the screening MRI between those who sustained LBSI and those who did not. CONCLUSIONS: BMO is common in asymptomatic bowlers. Identification of high-risk bowlers using screening MRI informs individualised management and may prevent progression to LBSI.
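A signal intensity ratio (SIR) such as the ≥ 2.0 threshold used above is a ratio of region-of-interest (ROI) means. The sketch below is a generic illustration with hypothetical ROI values, not the study's image-analysis pipeline.

```python
# Illustrative sketch: a signal intensity ratio (SIR) of the kind used to
# define "clinically significant" BMO (SIR >= 2.0) computed from ROI means
# on fluid-sensitive MRI sequences. ROI values below are hypothetical.
import numpy as np

def signal_intensity_ratio(lesion_roi: np.ndarray, reference_roi: np.ndarray) -> float:
    """Mean lesion ROI signal divided by mean normal-marrow reference signal."""
    return float(lesion_roi.mean() / reference_roi.mean())

lesion = np.array([412.0, 398.0, 430.0, 405.0])      # assumed pars/pedicle ROI values
reference = np.array([180.0, 195.0, 188.0, 176.0])   # assumed normal vertebral body ROI

sir = signal_intensity_ratio(lesion, reference)
print(f"SIR = {sir:.2f}, clinically significant: {sir >= 2.0}")
```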


Subject(s)
Athletic Injuries , Adult , Australia/epidemiology , Bone Marrow/diagnostic imaging , Edema/diagnostic imaging , Humans , Retrospective Studies
9.
J Sci Med Sport ; 24(5): 420-424, 2021 May.
Article in English | MEDLINE | ID: mdl-33160856

ABSTRACT

OBJECTIVES: To determine intra-individual changes in CogSport performance in elite cricket players diagnosed with concussion, and to differentiate these from changes that may be attributed to playing a match with no head impact. DESIGN: Retrospective observational study of elite Australian male and female cricket players with diagnosed concussion, and prospective cohort study of cricket players with no head impact post-match. METHODS: CogSport performance relative to an individual's baseline was compared between 46 cricket players diagnosed with concussion following a head impact sustained during a match and 84 cricket players who played a match during which they had no head impact. RESULTS: Post-match CogSport performance for players diagnosed with concussion was slower for detection speed (p < 0.001), identification speed (p = 0.007), and one back speed (p = 0.011). No changes in one card learning speed or any accuracy measures were observed. Post-match CogSport performance for players with no head impact was faster but less accurate for one card learning (both p < 0.001). No changes in the other three test components were observed. CONCLUSIONS: Slower performance in three of the four CogSport tasks (detection, identification, one back) may be indicative of concussion, as these intra-individual changes were not observed in players post-match with no head impact. The fourth task, one card learning, may not be a useful indicator of concussion, as it was not observed to change with concussion yet was susceptible to change post-match with no head impact. CogSport may have clinical utility in assisting the clinical diagnosis of concussion in elite male and female cricket players.


Subject(s)
Athletic Injuries/physiopathology , Brain Concussion/physiopathology , Cognition/physiology , Cricket Sport/injuries , Adult , Cohort Studies , Female , Humans , Male , Neuropsychological Tests , Prospective Studies , Retrospective Studies , Young Adult
10.
J Sci Med Sport ; 24(2): 112-115, 2021 Feb.
Article in English | MEDLINE | ID: mdl-32680702

ABSTRACT

OBJECTIVES: To review magnetic resonance imaging (MRI) of elite adult fast bowlers with a history of lumbar spine stress fracture for evidence of bone healing. The findings will determine whether bone healing can occur in this population, and whether MRI may be used as a tool to assess bone healing and inform clinical decision making. DESIGN: Retrospective cohort. METHODS: Participants were elite Australian fast bowlers who sustained a lumbar spine stress fracture confirmed on MRI and had at least one subsequent MRI. Two radiologists independently reviewed all images. RESULTS: Thirty-one fractures from 20 male fast bowlers were reviewed. Median maximum fracture size was 6 mm (range 2-25 mm). Twenty-five fractures achieved bone healing, with a median of 203 (IQR 141-301) days between the initial MRI (to confirm diagnosis) and the MRI on which bone healing was observed. Fracture size and signal intensity of bone marrow oedema were both positively associated with the number of days to the MRI on which bone healing was observed (r2 = 0.245, p < 0.001 and r2 = 0.292, p < 0.001, respectively). Fractures that occurred at the same site as a previously united fracture took longer to heal than the first fracture (median 276 days to the MRI on which bone healing was observed, compared with 114 days for the first fracture; p = 0.036). CONCLUSIONS: Lumbar spine stress fractures in elite adult fast bowlers are capable of achieving complete bone healing, as demonstrated in the majority of bowlers in this study. Larger fractures, greater bone marrow oedema, and a history of previous injury at the same site may require longer healing time, which may be monitored with MRI.
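The association between fracture size and days to imaging-confirmed healing is reported above as an r² with a p-value. As a minimal sketch, assuming a simple linear regression (the values below are fabricated for illustration only, not study data), such a summary can be produced with SciPy.

```python
# Minimal sketch with illustrative (not study) values: summarise the
# association between fracture size and days to imaging-confirmed healing
# with a slope, r-squared, and p-value from simple linear regression.
from scipy.stats import linregress

fracture_size_mm = [3, 4, 5, 6, 6, 8, 10, 12, 15, 20]                 # hypothetical
days_to_healing = [95, 120, 130, 160, 150, 200, 210, 260, 300, 340]   # hypothetical

fit = linregress(fracture_size_mm, days_to_healing)
print(f"slope = {fit.slope:.1f} days/mm, r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4g}")
```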


Subject(s)
Cricket Sport/injuries , Fractures, Stress/diagnostic imaging , Lumbar Vertebrae/injuries , Spinal Fractures/diagnostic imaging , Adolescent , Adult , Bone Marrow/diagnostic imaging , Clinical Decision-Making , Edema/diagnostic imaging , Fracture Healing , Fractures, Stress/pathology , Humans , Lumbar Vertebrae/diagnostic imaging , Lumbar Vertebrae/pathology , Magnetic Resonance Imaging , Male , Retrospective Studies , Return to Sport , Spinal Fractures/pathology , Time Factors , Young Adult
11.
J Orthop ; 22: 100-103, 2020.
Article in English | MEDLINE | ID: mdl-32308261

ABSTRACT

INTRODUCTION: Hand fractures are among the most common injuries sustained whilst playing cricket. Further research is required to inform future clinical management and risk-reduction strategies. METHODS: This retrospective cohort study reviewed all cases of hand fracture in elite Australian cricket players over a three-year period. Data included the specific activity when the injury occurred, location of injury, management (non-surgical or surgical), and days to return to play. RESULTS: Seventy players (17% of players, 95% CI 14-21; 43 male, 27 female) sustained 90 hand fractures. Seventy-three (81%, 95% CI 72-89) fractures occurred whilst fielding the ball. Eighty-four (93%, 95% CI 86-97) fractures occurred to the 'exterior' bones of the hand: the distal phalanx, middle phalanx, and first and fifth rays. Thirteen (14%, 95% CI 9-23) fractures were managed with surgical internal fixation, of which 11 were to the phalanges, most commonly at the proximal phalanx (n = 5, 36% of all proximal phalanx fractures) or the fifth ray middle and proximal phalanges (n = 5, 42% of all fifth ray phalangeal fractures). Fractures requiring surgical management typically had a longer time injured (median 33 days, IQR 27-41) than fractures managed non-surgically (median 6 days, IQR 0-21) (p = 0.001). Total time to return to full unrestricted play was similar between surgical (49 days, IQR 45-52) and non-surgical (32 days, IQR 15-45) management (p = 0.197). CONCLUSIONS: Hand fractures sustained by elite male and female Australian cricket players predominantly involved the 'exterior' bones of the hand. The results of this study may inform clinical decision making with respect to non-surgical or surgical management and anticipated return-to-play times. Further effort is needed to address risk-reduction strategies, including gloves and skill proficiency.

12.
Spine (Phila Pa 1976) ; 45(18): E1166-E1171, 2020 Sep 15.
Article in English | MEDLINE | ID: mdl-31593063

ABSTRACT

STUDY DESIGN: Comparative reliability and prospective validity study. OBJECTIVE: First, to evaluate the reliability of four methods of assessing magnetic resonance imaging (MRI) bone marrow edema (BMO) of the posterior vertebral arch of the lumbar vertebrae of elite junior fast bowlers. Second, to evaluate the validity of the most reliable method for the early detection of lumbar bone stress injury (LBSI). SUMMARY OF BACKGROUND DATA: MRI has demonstrated utility in identifying BMO in lumbar vertebrae. Methods to grade the severity of BMO may provide valuable insight to inform clinical management, particularly in elite athletes, where detection of early-stage bone stress may prevent progression to more severe and costly bone stress injury. METHODS: Sixty-five male elite junior fast bowlers had repeat MRI scans during a cricket season. A subset of 19 bowlers' images was reassessed by experienced musculoskeletal radiologists to determine intra- and inter-rater reliability. All images were aligned with independent medical records of lower back symptoms and diagnosed bone stress injuries to establish the relationship between BMO and LBSI. RESULTS: Clinical detection of abnormal BMO, whether the pars region of the vertebra was considered in its entirety or subdivided into regions, had fair-to-moderate inter-rater reliability and fair-to-almost-perfect intra-rater reliability. Measurement of BMO signal intensity using an imaging software tool had excellent intra-rater and inter-rater reliability (ICC = 0.848 and 0.837). BMO signal intensity was positively associated with subsequent LBSI (P < 0.001) and differentiated between asymptomatic and symptomatic bowlers (P < 0.001). CONCLUSION: Measurement of BMO signal intensity using an imaging software tool proved a reliable and valid measure of the severity of lumbar bone stress injury in elite junior fast bowlers. LEVEL OF EVIDENCE: 2.


Subject(s)
Bone Marrow Diseases/diagnostic imaging , Cricket Sport , Edema/diagnostic imaging , Fractures, Stress/diagnostic imaging , Lumbar Vertebrae/diagnostic imaging , Magnetic Resonance Imaging/standards , Adolescent , Back Injuries/diagnostic imaging , Back Injuries/epidemiology , Bone Marrow/diagnostic imaging , Bone Marrow Diseases/epidemiology , Cricket Sport/injuries , Early Diagnosis , Edema/epidemiology , Fractures, Stress/epidemiology , Humans , Lumbar Vertebrae/injuries , Magnetic Resonance Imaging/methods , Male , Prospective Studies , Reproducibility of Results , Young Adult
13.
J Clin Neurosci ; 68: 28-32, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31399319

ABSTRACT

Experiential knowledge was collated to improve understanding of the mechanism of vertebral artery dissection (VAD) and to inform recommendations for risk-reduction strategies in sport. Fourteen experts from the fields of neurology, forensic pathology, biomedical engineering, radiology, physiotherapy, and sport and exercise medicine participated in semi-structured interviews. Experts were asked to provide their hypothesised mechanism of VAD and to suggest strategies to reduce the risk of VAD in non-motorised sports. Experts agreed that there is no single mechanism of VAD. Factors relating to predisposition, susceptibility, and an inciting event exist on a spectrum, as does the severity of the resulting VAD. Inciting events of particular concern during sports participation include blunt-force impact to the specific area behind and below the ear, and extreme movement of the neck, which may be facilitated by impact to the head or neck. Risk-reduction strategies must be feasible within the particular sporting context. Strategies include rules, personal protective equipment, and education to reduce the risk of impact to the head or neck. Education may also serve to improve early recognition of VAD. VAD is a low-frequency but severe-consequence risk in sports in which athletes are exposed to head or neck impact from an object or opponent. Best-practice risk management suggests that sports governing bodies should assess VAD risk and consider risk controls.


Subject(s)
Athletic Injuries/etiology , Athletic Injuries/prevention & control , Risk Reduction Behavior , Vertebral Artery Dissection/etiology , Vertebral Artery Dissection/prevention & control , Athletes , Humans , Personal Protective Equipment , Vertebral Artery/pathology
14.
Sports Med ; 49(4): 553-564, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30758815

ABSTRACT

BACKGROUND: Vertebral artery dissection (VAD) is a potentially catastrophic injury that may occur during sports participation. A comprehensive review is needed to collate documented cases to improve understanding and inform future preventative approaches. OBJECTIVE: This review aimed to understand the extent of VAD in sport and characterise trends suggestive of mechanisms of injury. METHODS: Electronic databases were searched using terms related to VAD and sport. Records were included if they described one or more cases of VAD attributed to sport. RESULTS: A total of 79 records described 128 individual cases of VAD in sport, of which 118 were confirmed by imaging or autopsy and included in analyses. Cases were attributed to 43 contact and non-contact sports. The median age of cases was 33 years (IQR 22-44), and 75% were male. There were 22 cases of fatal injury, of which ten involved an impact to the mastoid region and seven involved an impact to the head or neck. Non-fatal cases of VAD were attributed to impact to the head or neck (not mastoid region), movement or held position without impact, and in some cases no reported incident. CONCLUSIONS: VAD attributed to sports participation is uncommonly reported and the mechanisms are varied. Impact to the mastoid region is consistently implicated in fatal cases and should be the focus of injury prevention strategies in sport. Efforts may also be directed at improving the prognosis of cases with delayed presentation through clinical recognition and imaging. The review was registered on the international prospective register for systematic reviews (http://www.crd.york.ac.uk/PROSPERO) (CRD42018090543).


Subject(s)
Sports , Vertebral Artery Dissection/epidemiology , Athletic Injuries/prevention & control , Humans , Mastoid , Neck , Risk Factors , Vertebral Artery Dissection/mortality , Wounds, Nonpenetrating/prevention & control
15.
Br J Sports Med ; 53(19): 1236-1239, 2019 Oct.
Article in English | MEDLINE | ID: mdl-30425044

ABSTRACT

OBJECTIVES: Lumbar bone stress injury ('bone stress injury') is common in junior fast bowlers. The repetitive loading of cricket fast bowling may cause bone marrow oedema (BMO), detectable on MRI, before the bowler suffers from symptomatic bone stress injury. We investigated the temporal relationship between BMO and bone stress injury, along with bowling workload correlates, in elite junior fast bowlers throughout a cricket season. METHODS: 65 junior fast bowlers were prospectively monitored for one 8-month cricket season. For research purposes, participants had up to six MRI scans at set times in the season; findings were withheld from them and their clinicians. Standard practices for bowling workload monitoring and injury diagnosis were followed. RESULTS: 15 (23%) participants developed bone stress injury during the study. All 15 of these participants had BMO detected on at least one of the preceding MRI scans, including the scan immediately prior to diagnosis. The risk of BMO progressing to bone stress injury during the season was greatest for participants with BMO present 2 weeks prior to the national championship tournament (a period of high load) (RR=18.9, OR=44.8). Both bone stress injury and BMO were associated with bowling on a higher percentage of days in training and having a shorter bowling break during the season. The number of balls bowled and acute-to-chronic workload were not associated with imaging abnormalities or injury. CONCLUSION: The presence of BMO on MRI in asymptomatic junior cricket fast bowlers confers a very high risk of bone stress injury. The risk may be managed by MRI screening and monitoring bowling frequency.
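The RR and OR quoted above summarise the risk of progression from BMO to bone stress injury. The sketch below shows the standard 2×2-table arithmetic with hypothetical counts; it is not intended to reproduce the study's figures.

```python
# Minimal sketch (counts are hypothetical, not the study's): relative risk
# (RR) and odds ratio (OR) from a 2x2 table of BMO exposure vs subsequent
# bone stress injury, of the kind reported above (RR = 18.9, OR = 44.8).

def rr_and_or(a, b, c, d):
    """2x2 table: a,b = injured/uninjured with BMO; c,d = injured/uninjured without BMO."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    odds_ratio = (a * d) / (b * c)
    return rr, odds_ratio

# Hypothetical counts for illustration only.
print(rr_and_or(a=12, b=10, c=3, d=40))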


Subject(s)
Athletic Injuries/diagnosis , Back Injuries/diagnosis , Bone Marrow Diseases/diagnostic imaging , Edema/diagnostic imaging , Adolescent , Bone Marrow/pathology , Cohort Studies , Humans , Magnetic Resonance Imaging , Risk Factors , Sports , Workload
16.
Sports (Basel) ; 6(4)2018 Dec 05.
Article in English | MEDLINE | ID: mdl-30563035

ABSTRACT

This study aimed to observe core temperature responses in elite cricket players under match conditions during the summer in Australia. Thirty-eight Australian male cricketers ingested capsule temperature sensors during six four-day first-class matches between February 2016 and March 2017. Core temperature (Tc) was recorded during breaks in play. Batters showed an increase in Tc related to time spent batting of approximately 1 °C per two hours of play (p < 0.001). Increases in rating of perceived exertion (RPE) in batters correlated with smaller elevations in Tc (0.2 °C per one unit of elevation in RPE) (p < 0.001). Significant, but clinically trivial, increases in Tc of batters were found related to the day of play, wet bulb globe temperature (WBGT), air temperature, and humidity. A trivial increase in Tc (p < 0.001) was associated with time in the field and RPE when fielding. There was no association between Tc and WBGT, air temperature, humidity, or day of play in fielders. This study demonstrates that batters have greater rises in Tc than other cricket participants and may have an increased risk of exertional heat illness, despite exposure to similar environmental conditions.

17.
Sports (Basel) ; 6(3)2018 Jul 20.
Article in English | MEDLINE | ID: mdl-30036955

ABSTRACT

Monitoring is an essential yet unstandardised component of managing athletic preparation. The purpose of this paper is to provide insight into the typical measurements and responses observed from monitoring elite road cyclists and swimmers during training camps, and to translate these observations into practical strategies for other practitioners to employ. Twenty-nine male professional cyclists and 12 male and 19 female international swimmers participated in up to three of the eight 4-19 day training camps, held early in the season or leading into major competitions, at sea level or moderate altitude. Monitoring included body mass and composition, subjective sleep, urinary specific gravity (USG), resting heart rate (HR), and peripheral oxygen saturation (SpO2) at altitude. The sum of seven skinfolds most likely decreased in the order of 3.1 ± 3.6 mm week-to-week, accompanied by a most likely trivial decrease in body mass of 0.4 ± 0.4 kg week-to-week. At altitude, sleep quality very likely trivially improved week-to-week (0.3 ± 0.3 AU), SpO2 possibly increased week-to-week (0.6 ± 1.7%), whilst changes in resting HR were unclear (0 ± 4 bpm). Sleep duration and USG were stable. Comparing individual to group day-to-day change in monitored variables may prove effective for flagging athletes potentially at risk of training maladaptation. Practitioners may replicate these methods to establish thresholds specific to their cohort and setting. This study provides further support for a multi-faceted approach to monitoring elite athletes in training camp environments.

18.
Article in English | MEDLINE | ID: mdl-29750119

ABSTRACT

BACKGROUND: Eating disorders are serious psychiatric illnesses that are often associated with poor quality of life and low long-term recovery rates. Peer mentor programs have been found to improve psychiatric symptoms and quality of life in other mental illnesses, and a small number of studies have suggested that eating disorder patients may benefit from such programs. The aim of this study is to assess the efficacy of a peer mentor program for individuals with eating disorders in terms of improving symptomatology and quality of life. METHODS: Up to 30 individuals with a past history of an eating disorder will be recruited to mentor 30 individuals with a current eating disorder. Mentoring will involve 13 sessions (held approximately every 2 weeks) of up to 3 h each, over 6 months. DISCUSSION: This pilot proof-of-concept feasibility study will assess the efficacy of a peer mentoring program in improving eating disorder symptomatology and quality of life, and will inform future randomised controlled trials. TRIAL REGISTRATION: Australian and New Zealand Clinical Trials Registration Number: ACTRN12617001412325. Date of registration (retrospective): 05/10/2017.

19.
Eur J Sport Sci ; 18(4): 458-472, 2018 May.
Article in English | MEDLINE | ID: mdl-29431589

ABSTRACT

Athletes often record details of their training and competitions, supported by information such as environmental conditions, travel, and how they felt. However, it is not known how prevalent these practices are among golfers, or how valuable the process is perceived to be. The purpose of this study was to develop a golf-specific load monitoring tool (GLMT), and to establish the content validity and feasibility of this tool amongst high-level golfers. In the first phase of development, 21 experts were surveyed to determine the suitability of items for inclusion in the GLMT. Of the 36 items, 21 received >78% agreement, the threshold required to establish content validity and warrant inclusion in the GLMT. Total duration was the preferred metric for golf-specific activities, whilst rating of perceived exertion (RPE) was preferred for measuring physical training. In the second phase, feasibility of the tool was assessed by surveying 13 high-level male golfers following 28 days of daily GLMT use. All items included in the GLMT were deemed feasible to record, with all players participating in the feasibility study providing high to very high ratings. Golfers responded that they would consider using a load monitoring tool of this nature long term, provided it can be completed in less than five minutes per day.


Subject(s)
Athletic Performance/standards , Data Collection/instrumentation , Golf , Physical Exertion , Athletes , Feasibility Studies , Humans , Male
20.
Sports (Basel) ; 5(3)2017 Jul 26.
Article in English | MEDLINE | ID: mdl-29910414

ABSTRACT

The experience of athletes and practitioners has led to the suggestion that use of an athlete self-report measure (ASRM) may increase an athlete's self-awareness, satisfaction, motivation, and confidence. This study sought to provide empirical evidence for this assertion by evaluating psychological changes associated with ASRM use across a diverse athlete population. Athletes (n = 335) had access to an ASRM for 16 weeks and completed an online survey at baseline and at weeks 4, 8, and 16. Generalized estimating equations were used to evaluate the associations between ASRM compliance and outcome measures. Compared to baseline, confidence and extrinsic motivation were most likely increased at weeks 4, 8, and 16. Satisfaction and intrinsic motivation were most likely decreased at week 4, but no different from baseline values at weeks 8 and 16. Novice athletes and those who were instructed to use an ASRM (rather than using one autonomously) were less responsive to ASRM use. This study provides preliminary evidence that ASRM use may prompt initial dissatisfaction and decreased intrinsic motivation which, along with increased confidence and extrinsic motivation, may provide the necessary stimulus to improve performance-related behaviors. Novice and less autonomous athletes may benefit from support to develop the motivation, knowledge, and skills to use the information gleaned from an ASRM effectively.
