Results 1 - 20 of 21
1.
Spine J ; 19(3): 552-563, 2019 03.
Article in English | MEDLINE | ID: mdl-30149083

ABSTRACT

BACKGROUND CONTEXT: Lower extremity amputation (LEA) is associated with an elevated risk for development and progression of secondary health conditions. Low back pain (LBP) is one such condition, adversely affecting function, independence, and quality of life. PURPOSE: The purpose of this study was to systematically review the literature to determine the strength of evidence relating to the presence and severity of LBP secondary to LEA, thereby supporting the formulation of empirical evidence statements (EESs) to guide practice and future research. STUDY DESIGN/SETTING: Systematic review of the literature. METHODS: A systematic review of five databases was conducted, followed by evaluation of the evidence and synthesis of EESs. RESULTS: Seventeen manuscripts were included. From these, eight EESs were synthesized within the following categories: epidemiology, amputation level, function, disability, leg length, posture, spinal kinematics, and osseointegrated prostheses. Only the EES on epidemiology was supported by evidence at the moderate confidence level, given support from eight moderate-quality studies. The four EESs for amputation level, leg length, posture, and spinal kinematics were supported by evidence at the low confidence level, given that each of these statements had some evidence against it but more (and higher quality) evidence currently supporting it. The remaining three EESs, addressing function, disability, and osseointegrated prosthetic use, were supported only by single studies or by comparable evidence with conflicting findings, leaving insufficient evidence to support the respective EES. CONCLUSIONS: Based on the current state of the evidence, appropriate preventative and, particularly, treatment strategies to manage LBP in persons with LEA remain a knowledge gap and an area for future study.


Subject(s)
Amputees/statistics & numerical data , Low Back Pain/epidemiology , Lower Extremity/surgery , Amputation, Surgical/adverse effects , Artificial Limbs/adverse effects , Biomechanical Phenomena , Humans , Low Back Pain/etiology
2.
Mil Med ; 184(5-6): e323-e329, 2019 05 01.
Article in English | MEDLINE | ID: mdl-30371883

ABSTRACT

INTRODUCTION: Despite medical interventions to preserve viability and functionality of injured limb(s) among combat-injured service members, delayed amputations may occur. The goal of this study was to determine whether specific lower extremity (LE) injuries were associated with delayed amputations. METHODS: The Expeditionary Medical Encounter Database was queried for combat-related LE injuries between 2003 and 2015. The Abbreviated Injury Scale (AIS) was used to categorize LE injuries by severity. Injury episodes with a maximum LE AIS of 1 or amputation on the day of injury were excluded. The final sample included 2,996 service members with at least one LE injury with an AIS ≥2. The frequencies of specific LE fractures and nerve and vessel injuries were determined. Logistic regression with paired independent variables was performed to examine the impact of multiple LE injuries on the odds of delayed amputation. RESULTS: Delayed LE amputation was identified in 308 (10.3%) service members in the sample. The delayed and no amputation groups did not differ in age or service branch. The majority of injury episodes were blast-related and had an Injury Severity Score ≥9. The most frequent fractures were tibia (34.4%) and fibula (29.3%), but the highest rates of delayed amputation were in those with navicular (36.2%), talus (30.0%), or calcaneus (28.1%) fractures. Odds of amputation were highest among service members with the combinations of a calcaneus fracture and LE nerve injury (odds ratio [OR]: 41.74; 95% confidence interval [CI]: 14.70, 118.55; p < 0.001), a calcaneus fracture and LE vessel injury (OR: 17.99; 95% CI: 10.53, 30.74; p < 0.001), and calcaneus and tibia fractures (OR: 15.12; 95% CI: 9.54, 23.96; p < 0.001). CONCLUSIONS: Odds of delayed amputation increased substantially with specific injury combinations. These findings may guide clinical decision-making in the acute care period.


Subject(s)
Amputation, Surgical/methods , Lower Extremity/injuries , Time-to-Treatment , Adult , Female , Humans , Injury Severity Score , Leg Injuries/surgery , Limb Salvage/classification , Limb Salvage/methods , Logistic Models , Lower Extremity/surgery , Male , Middle Aged
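The odds ratios above come from logistic regression on paired injury variables. As an illustrative aside, the basic arithmetic of an unadjusted odds ratio and its log-scale (Woolf) confidence interval can be sketched as follows; the counts are made up for the example and are not the study's data or its adjusted model.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio from a 2x2 table with a Woolf (log-scale) 95% CI.
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 18 delayed amputations among 43 members with a
# given fracture combination vs. 290 among 2,953 without it.
or_est, lo, hi = odds_ratio_ci(18, 25, 290, 2663)
```

The study's reported ORs are adjusted estimates from the regression model; this sketch only shows the crude 2x2 computation that underlies them.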
3.
J Neuroeng Rehabil ; 15(Suppl 1): 64, 2018 09 05.
Article in English | MEDLINE | ID: mdl-30255804

ABSTRACT

BACKGROUND: Growing discontent with the K-level system for functional classification of patients with limb loss, and the movement of healthcare toward evidence-based practice, have created the need for alternative forms of functional classification and for clinical practice guidelines to improve access to quality prosthetic interventions. The purpose of this project was to develop and present a clinical practice recommendation for exercise testing in prosthetic patient care based on the results and synthesis of a systematic literature review. METHODS: Database searches of PubMed, Google Scholar, Web of Science, and Cochrane were conducted and articles reviewed. Of 1,386 potentially relevant articles, 10 met the criteria for inclusion. These articles were assessed using the critical appraisal tool of the United Kingdom National Service Framework for Long-Term Conditions. Of the 10 included articles, eight were of high quality, one of medium quality, and one of low quality. Data from these articles were synthesized into six empirical evidence statements, all qualifying for research grade A. These statements were used to develop the proposed clinical practice guideline. RESULTS: While the results of this systematic review could not support a direct connection between cardiorespiratory performance and K-levels, the literature did support the ability of exercise testing results to predict successful prosthetic ambulation in some demographics. Both continuous maximum-intensity single lower extremity ergometry propelled by the sound limb and intermittent submaximal upper extremity ergometry protocols were found to be viable tools for evaluating cardiorespiratory fitness and function in the target population.
CONCLUSION: The ability to sustain an exercise intensity of ≥50% of a predicted VO2max value in single-leg cycle ergometry testing and achievement of a sustained workload of 30 W in upper extremity ergometry testing were found to be the strongest correlates of successful ambulation with a prosthesis. VO2 values were found to increase in amputee subjects following a 6-week exercise program. These synthesized results of the systematic literature review on exercise testing in patients with loss of a lower extremity were used to develop and present a clinical treatment pathway.


Subject(s)
Amputees/classification , Artificial Limbs , Exercise Test , Humans , Lower Extremity
4.
Mil Med ; 183(suppl_1): 55-65, 2018 03 01.
Article in English | MEDLINE | ID: mdl-29635559

ABSTRACT

Background: Injuries during basic combat training (BCT) impact military health and readiness in the U.S. Army. Identifying risk factors is crucial for injury prevention, but few Army-wide studies to identify risk factors for injury during BCT have been completed to date. This study examines associations between individual and training-related characteristics and injuries during Army BCT. Methods: Using administrative data from the Total Army Injury and Health Outcomes Database (TAIHOD), we identified individuals who apparently entered BCT for the first time between 1 January 2002 and 30 September 2007, based on review of administrative records. Injuries were identified and categorized based on coded medical encounter data. Combined with dates of medical services, these data allowed us to count injuries per person, identify unique injuries, and identify the quantity and type of medical care delivered. Regression models produced odds ratios (ORs) and 95% confidence intervals (CIs) to identify risk factors for injury during BCT (yes/no), adjusted for potential confounders. Results: Of the 278,045 (83.4%) men and 55,302 (16.6%) women who were apparently first-time trainees, 39.5% (n = 109,760) of men and 60.9% (n = 33,699) of women were injured during training, based on over 2 million recorded medical encounter entries. The large cohort yielded statistically significant, small-magnitude associations between injury and all individual and training-related covariates for men, and all but medical accession waivers and weight for women. After adjustment, the largest-magnitude effects among men were for age > 25 yr vs. 17-18 yr (OR = 1.83, 95% CI: 1.75, 1.91); having been married in the past vs. being single (OR = 1.36, 95% CI: 1.24, 1.49); rank E4-E7 vs. E1 (OR = 0.56, 95% CI: 0.53, 0.59); and training at Ft. Jackson (OR = 0.66, 95% CI: 0.64, 0.69), Ft. Leonard Wood (OR = 0.67, 95% CI: 0.65, 0.70), or Ft. Knox (OR = 0.69, 95% CI: 0.66, 0.72) vs. Ft. Benning.
Odds of injury were highest during 2005, 2006, and 2007. After adjustment for weight and body mass index, taller men had higher odds of BCT injury than average-height men (OR = 1.08, 95% CI: 1.05, 1.11). Among women, short stature (OR = 1.11; 95% CI: 1.04, 1.19), training at Ft. Leonard Wood (OR = 1.10; 95% CI: 1.04, 1.16), and evidence of injury prior to training based on an accession waiver (OR = 1.12; 95% CI: 1.00, 1.26) increased injury risk. Conclusions: This Army-wide analysis reveals higher BCT-related injury rates for both men and women than prior studies and identifies risk factors for injuries during BCT. The large data set allows adjustment for many covariates, but because statistical analysis may yield significant findings for small differences, results must be interpreted based on minimally important differences determined by military and medical professionals. Results provide information that may be used to adapt training or medical screening and examination procedures for basic trainees.


Subject(s)
Military Personnel/statistics & numerical data , Risk Factors , Teaching/standards , Adolescent , Adult , Body Mass Index , Cohort Studies , Female , Humans , Logistic Models , Male , Military Personnel/education , Odds Ratio , Retrospective Studies , Sex Factors , Teaching/statistics & numerical data , United States
5.
Arch Phys Med Rehabil ; 99(2): 348-354.e1, 2018 02.
Article in English | MEDLINE | ID: mdl-29100967

ABSTRACT

OBJECTIVE: To describe the incidence of overuse musculoskeletal injuries in service members with combat-related lower limb amputation. DESIGN: Retrospective cohort study. SETTING: Military treatment facilities. PARTICIPANTS: Service members with deployment-related lower limb injury (N=791): 496 with a major lower limb amputation and 295 with a mild lower limb injury. INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: The outcomes of interest were clinical diagnosis codes (International Classification of Diseases-9th Revision) associated with musculoskeletal overuse injuries of the lumbar spine, upper limb, and lower limb regions 1 year before and 1 year after injury. RESULTS: The overall incidence of developing at least 1 musculoskeletal overuse injury within the first year after lower limb amputation was between 59% and 68%. Service members with unilateral lower limb amputation were almost twice as likely to develop an overuse lower or upper limb injury as those with mild combat-related injury. Additionally, service members with bilateral lower limb amputation were more than twice as likely to develop a lumbar spine injury, and 4 times as likely to develop an upper limb overuse injury, within the first year after amputation as those with mild combat-related injury. CONCLUSIONS: The incidence of secondary overuse musculoskeletal injury is elevated in service members with lower limb amputation and warrants focused research efforts toward developing preventive interventions.


Subject(s)
Amputation, Traumatic , Cumulative Trauma Disorders/epidemiology , Leg Injuries/surgery , Military Personnel , Musculoskeletal System/injuries , Adult , Female , Humans , Incidence , Injury Severity Score , Male , Retrospective Studies
6.
J Neurotrauma ; 34(23): 3249-3255, 2017 12 01.
Article in English | MEDLINE | ID: mdl-28895451

ABSTRACT

The purpose of this study was to determine the association of mild traumatic brain injury (mTBI) with subsequent post-traumatic stress disorder (PTSD) and mental health disorders (MHD), and the intervening role of acute stress disorder (ASD). This matched case-control study utilized the Total Army Injury and Health Outcomes Database (TAIHOD) to analyze soldiers' (n = 1,261,297) medical encounter data between 2002 and 2011. International Classification of Diseases, Ninth Revision (ICD-9) codes were used to identify: mTBI (following the Centers for Disease Control [CDC] surveillance definition for mTBI), MHD (ICD-9 codes for depression and anxiety, excluding PTSD), PTSD (ICD-9 309.81), and ASD (ICD-9 308.3). Incident cases of mTBI (n = 79,505), PTSD (n = 71,454), and MHD (n = 285,731) were identified. Overall incidence rates per 1000 soldier-years were: mTBI = 17.23, PTSD = 15.37, and MHD = 67.99. mTBI was associated with increased risk for PTSD (risk ratio [RR] 5.09, 95% confidence interval [CI] 4.82-5.37) and MHD (RR 2.94, 95% CI 2.84-3.04). A sub-analysis of the mTBI-only soldiers found that a diagnosis of ASD, compared with no ASD diagnosis, was associated with greater risk for subsequent PTSD (RR 2.13, 95% CI 1.96-2.32) and MHD (RR 1.90, 95% CI 1.72-2.09) following mTBI. Results indicate that soldiers with previous mTBI have a higher risk for PTSD and MHD, and that ASD may also mediate PTSD and MHD risk subsequent to mTBI. These data may help guide important surveillance and clinical rehabilitation considerations for high-risk populations.


Subject(s)
Brain Concussion/complications , Brain Concussion/epidemiology , Military Personnel/psychology , Stress Disorders, Post-Traumatic/epidemiology , Adult , Case-Control Studies , Female , Humans , Incidence , Male , Mental Disorders/epidemiology , Middle Aged , Risk Factors , United States , Young Adult
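The incidence rates per 1,000 soldier-years and risk ratios quoted above follow standard person-time arithmetic. A minimal sketch, using hypothetical case counts and person-time (the abstract reports rates and ratios, not the underlying denominators):

```python
import math

def rate_per_1000(cases, person_years):
    """Incidence rate per 1,000 person-years of follow-up."""
    return 1000 * cases / person_years

def rate_ratio_ci(cases1, py1, cases0, py0, z=1.96):
    """Incidence-rate ratio with a log-scale 95% CI;
    SE(log IRR) is approximated by sqrt(1/cases1 + 1/cases0)."""
    irr = (cases1 / py1) / (cases0 / py0)
    se = math.sqrt(1 / cases1 + 1 / cases0)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical: 120 PTSD cases over 8,000 soldier-years with prior mTBI
# vs. 300 cases over 100,000 soldier-years without.
irr, lo, hi = rate_ratio_ci(120, 8_000, 300, 100_000)
```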
7.
Mil Med ; 182(7): e1836-e1841, 2017 07.
Article in English | MEDLINE | ID: mdl-28810980

ABSTRACT

The standardized mortality rate of rhabdomyolysis (RM) in Active Duty U.S. Army Soldiers is considerably higher than in the civilian population. RM occurs when large amounts of intracellular contents from damaged skeletal muscle escape into circulation, leading to serious sequelae (e.g., acute renal failure, hyperkalemia, compartment syndrome). Extended physical exertion, especially in hot environments, and trauma can precipitate RM. The aim of this study was to identify RM risk factors among U.S. Active Duty Army (ADA) Soldiers. METHODS: This nested case-control study used data from the Total Army Injury and Health Outcomes Database (years 2004-2006) to examine RM among male ADA Soldiers. Demographic and occupational variables were identified as potential risk factors. Each RM case was age- and date-matched to 4 controls. Adjusted odds ratios (ORs) were computed using conditional logistic regression analyses. RESULTS: From 2004 to 2006, 1,086 Soldiers (0.19%) met the study criteria for clinically diagnosed RM. Three variables were found to increase the odds of acquiring RM: (1) prior heat stroke, OR 4.95 (95% confidence interval [CI] 1.1-21.7); (2) self-reported Black race, OR 2.56 (95% CI 2.2-3.0); and (3) length of service of 0-90 days, OR 2.05 (95% CI 1.6-2.7). CONCLUSION: Male U.S. Army Soldiers are substantially more likely to develop RM if they (1) have had a prior heat injury, (2) self-report in the Black racial category, or (3) are within their initial 90 days of service. Greater awareness of the risk factors associated with RM may improve force health protection and readiness through targeted mitigation strategies.


Subject(s)
Military Personnel/statistics & numerical data , Rhabdomyolysis/epidemiology , Adult , Case-Control Studies , Female , Heat Stroke/complications , Humans , Male , Middle Aged , Odds Ratio , Racial Groups/statistics & numerical data , Retrospective Studies , Rhabdomyolysis/mortality , Risk Factors , United States/epidemiology
8.
PLoS One ; 12(1): e0170144, 2017.
Article in English | MEDLINE | ID: mdl-28095509

ABSTRACT

Individuals entering US Army service are generally young and healthy, but many are overweight, which may impact cardiometabolic risk despite physical activity and fitness requirements. This analysis examines the association between Soldiers' BMI at accession and incident cardiometabolic risk factors (CRF) using longitudinal data from 731,014 Soldiers (17.0% female; age: 21.6 [3.9] years; BMI: 24.7 [3.8] kg/m2) who were assessed at Army accession, 2001-2011. CRF were defined as incident diagnoses through 2011, by ICD-9 code, of metabolic syndrome, glucose/insulin disorder, hypertension, dyslipidemia, or overweight/obesity (in those not initially overweight/obese). Multivariable-adjusted proportional hazards models were used to estimate hazard ratios (HR) and 95% confidence intervals (CI) between BMI categories at accession and CRF. At accession, 2.4% of Soldiers were underweight (BMI <18.5 kg/m2), 53.5% were normal weight (18.5-<25), 34.2% were overweight (25-<30), and 10.0% were obese (≥30). Mean age at CRF diagnosis ranged from 24 to 29 years, with generally low CRF incidence: 228 Soldiers were diagnosed with metabolic syndrome, 3,880 with a glucose/insulin disorder, 26,373 with hypertension, and 13,404 with dyslipidemia. Of the Soldiers who were not overweight or obese at accession, 5,361 were eventually diagnosed as overweight or obese. Relative to Soldiers who were normal weight at accession, those who were overweight or obese, respectively, had significantly higher risk of developing each CRF after multivariable adjustment (HR [95% CI]: metabolic syndrome: 4.13 [2.87-5.94], 13.36 [9.00-19.83]; glucose/insulin disorder: 1.39 [1.30-1.50], 2.76 [2.52-3.04]; hypertension: 1.85 [1.80-1.90], 3.31 [3.20-3.42]; dyslipidemia: 1.81 [1.75-1.89], 3.19 [3.04-3.35]). Risk of hypertension, dyslipidemia, and overweight/obesity in initially underweight Soldiers was 40%, 31%, and 79% lower, respectively, versus normal-weight Soldiers.
BMI in early adulthood has important implications for cardiometabolic health, even within young, physically active populations.


Subject(s)
Body Mass Index , Cardiovascular Diseases/etiology , Metabolic Diseases/etiology , Obesity/complications , Overweight/complications , Thinness/complications , Adult , Cardiovascular Diseases/epidemiology , Female , Humans , Incidence , Longitudinal Studies , Male , Metabolic Diseases/epidemiology , Military Personnel , Risk Factors , Time Factors , United States/epidemiology , Young Adult
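The BMI cutoffs in the abstract above are the standard categories; a small sketch of the categorization (values in kg/m2):

```python
def bmi_category(bmi):
    """BMI categories as used above: underweight <18.5, normal 18.5-<25,
    overweight 25-<30, obese >=30 kg/m^2."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal weight"
    if bmi < 30:
        return "overweight"
    return "obese"

# The cohort's mean BMI of 24.7 kg/m^2 sits at the top of the
# normal-weight band.
category = bmi_category(24.7)  # "normal weight"
```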
9.
Mil Med ; 181(S4): 3-12, 2016 11.
Article in English | MEDLINE | ID: mdl-27849455

ABSTRACT

Congress authorized creation of the Extremity Trauma and Amputation Center of Excellence (EACE) as part of the 2009 National Defense Authorization Act. The legislation mandated the Department of Defense (DoD) and Department of Veterans Affairs (VA) to implement a comprehensive plan and strategy for the mitigation, treatment, and rehabilitation of traumatic extremity injuries and amputation. The EACE also was tasked with conducting clinically relevant research, fostering collaborations, and building partnerships across multidisciplinary international, federal, and academic networks to optimize the quality of life of service members and veterans who have sustained extremity trauma or amputations. To fulfill the mandate to conduct research, the EACE developed a Research and Surveillance Division that complements and collaborates with outstanding DoD, VA, and academic research programs across the globe. The EACE researchers have efforts in four key research focus areas relevant to extremity trauma and amputation: (1) Novel Rehabilitation Interventions, (2) Advanced Prosthetic and Orthotic Technologies, (3) Epidemiology and Surveillance, and (4) Medical and Surgical Innovations. This overview describes the EACE efforts to innovate, discover, and translate knowledge gleaned from collaborative research partnerships into clinical practice and policy.


Subject(s)
Amputation, Surgical/rehabilitation , Extremities/injuries , Rehabilitation Centers/organization & administration , Humans , Population Surveillance/methods , Prostheses and Implants/trends , Quality of Life/legislation & jurisprudence , Rehabilitation Centers/legislation & jurisprudence , Research/organization & administration , United States , United States Department of Defense/organization & administration , United States Department of Veterans Affairs/organization & administration
10.
Mil Med ; 181(S4): 69-76, 2016 11.
Article in English | MEDLINE | ID: mdl-27849465

ABSTRACT

High-energy lower extremity trauma is a consequence of modern war, and it is unclear whether limb amputation or limb salvage enables greater recovery. To improve function in the injured extremity, a passive dynamic ankle-foot orthosis, the Intrepid Dynamic Exoskeletal Orthosis (IDEO), was introduced along with a specialized return-to-run (RTR) therapy program. Recent research suggests these interventions may improve function and return-to-duty rates. This systematic literature review sought to rate the available evidence and formulate empirical evidence statements (EESs) regarding outcomes associated with IDEO utilization. PubMed, CINAHL, and Google Scholar were systematically searched for pertinent articles. Articles were screened and rated. EESs were formulated based upon data and conclusions from included studies. Twelve studies were identified and rated. Subjects (n = 487, 6 females, mean age 29.4 years) were studied following limb trauma and salvage. All included studies had high external validity, whereas internal validity was mixed because of reporting issues. Moderate evidence supported development of four EESs regarding IDEO use with specialized therapy. Following high-energy lower extremity trauma and limb salvage, use of the IDEO with RTR therapy can enable return to duty, return to recreation and physical activity, and decreased pain in some high-functioning patients. In higher functioning patients following limb salvage or trauma, IDEO use improved agility, power, and speed compared with no-brace or conventional bracing alternatives.


Subject(s)
Amputees/rehabilitation , Equipment Design/standards , Exoskeleton Device/standards , Lower Extremity/injuries , Outcome Assessment, Health Care , Adult , Female , Humans , Male , Orthotic Devices/standards , Return to Work
11.
N Engl J Med ; 375(5): 435-42, 2016 Aug 04.
Article in English | MEDLINE | ID: mdl-27518662

ABSTRACT

BACKGROUND: Studies have suggested that sickle cell trait elevates the risks of exertional rhabdomyolysis and death. We conducted a study of sickle cell trait in relation to these outcomes, controlling for known risk factors for exertional rhabdomyolysis, in a large population of active persons who had undergone laboratory tests for hemoglobin AS (HbAS) and who were subject to exertional-injury precautions. METHODS: We used Cox proportional-hazards models to test whether the risks of exertional rhabdomyolysis and death varied according to sickle cell trait status among 47,944 black soldiers who had undergone testing for HbAS and who were on active duty in the U.S. Army between January 2011 and December 2014. We used the Stanford Military Data Repository, which contains comprehensive medical and administrative data on all active-duty soldiers. RESULTS: There was no significant difference in the risk of death among soldiers with sickle cell trait, as compared with those without the trait (hazard ratio, 0.99; 95% confidence interval [CI], 0.46 to 2.13; P=0.97), but the trait was associated with a significantly higher adjusted risk of exertional rhabdomyolysis (hazard ratio, 1.54; 95% CI, 1.12 to 2.12; P=0.008). This effect was similar in magnitude to that associated with tobacco use, as compared with no use (hazard ratio, 1.54; 95% CI, 1.23 to 1.94; P<0.001), and to that associated with having a body-mass index (BMI; the weight in kilograms divided by the square of the height in meters) of 30.0 or more, as compared with a BMI of less than 25.0 (hazard ratio, 1.39; 95% CI, 1.04 to 1.86; P=0.03). The effect was less than that associated with recent use of a statin, as compared with no use (hazard ratio, 2.89; 95% CI, 1.51 to 5.55; P=0.001), or an antipsychotic agent (hazard ratio, 3.02; 95% CI, 1.34 to 6.82; P=0.008). 
CONCLUSIONS: Sickle cell trait was not associated with a higher risk of death than absence of the trait, but it was associated with a significantly higher risk of exertional rhabdomyolysis. (Funded by the National Heart, Lung, and Blood Institute and the Uniformed Services University of the Health Sciences.).


Subject(s)
Black or African American , Military Personnel , Physical Exertion , Rhabdomyolysis/etiology , Sickle Cell Trait/complications , Sickle Cell Trait/mortality , Adolescent , Adult , Hemoglobin, Sickle/analysis , Humans , Male , Proportional Hazards Models , Retrospective Studies , Risk Assessment , Tobacco Use , United States/epidemiology , Young Adult
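The hazard ratios above are reported with 95% CIs and p-values; on the log scale these are mutually consistent, so an approximate Wald p-value can be recovered from a reported ratio and its CI alone. A sketch under the usual normal approximation (not the study's actual Cox model output):

```python
import math

def wald_p_from_ci(ratio, lo, hi, z_crit=1.959964):
    """Approximate two-sided Wald p-value recovered from a reported
    hazard (or odds) ratio and its 95% CI, assuming normality of the
    log-ratio."""
    se = (math.log(hi) - math.log(lo)) / (2 * z_crit)  # SE of log(ratio)
    z = abs(math.log(ratio)) / se
    # two-sided tail probability of the standard normal, via erf
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Reported above for sickle cell trait and exertional rhabdomyolysis:
# HR 1.54 (95% CI 1.12-2.12); the recovered p rounds to the reported 0.008.
p = wald_p_from_ci(1.54, 1.12, 2.12)
```

This back-calculation only works when the published CI was itself computed on the log scale, as is standard for Cox model output.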
12.
Technol Innov ; 18(2-3): 85-98, 2016 Sep.
Article in English | MEDLINE | ID: mdl-28066519

ABSTRACT

Transtibial amputation (TTA) is life-altering emotionally, functionally, and economically. The economic impact on all stakeholders is largely unknown, as is the cost-effectiveness of prosthetic intervention. The purpose of this scoping report was to determine whether there is sufficient evidence to conduct a formal systematic review or meta-analysis in any particular prosthetic intervention area, and whether any evidence statements could be synthesized relative to economic evaluation of interventions provided to patients with TTA. The scoping review revealed six articles representing three topical areas of transtibial care: Care Models, Prosthetic Treatment, and Prosthetic Sockets. All six articles were of cost-identification or cost-consequence design and included a total of 704 subjects. Presently, it can be concluded with moderate confidence that specific weight-bearing and total-contact sockets for transtibial amputees are functionally and economically equivalent in the short term when costs, delivery time, and all stakeholder perspectives are considered. Long-term socket outcomes remain relatively unexplored. Further primary research is needed to determine cost-effectiveness in other areas of transtibial prosthetic care, although clinical outcomes are somewhat established through systematic review and meta-analysis in other areas of care. Conversely, evaluation of narrative economic reports relative to transtibial care may be sufficient to warrant further analysis. Guidance from the profession may also be useful in devising a strategy for ensuring that economic analyses are a routine element of future prosthetic science.

13.
Am J Prev Med ; 50(6): e163-e171, 2016 06.
Article in English | MEDLINE | ID: mdl-26699247

ABSTRACT

INTRODUCTION: Little data exist regarding the long-term impact of excess weight on lower extremity musculoskeletal injury/disorder (MID) in U.S. Army Soldiers. This prospective analysis examines the association between BMI of Soldiers at accession and risk of MID. METHODS: A total of 736,608 Soldiers were followed from accession into the Army, 2001-2011. Data were analyzed January through March 2015. MID was categorized as any first incident lower extremity musculoskeletal injury/disorder, and secondarily, as first incident injury/disorder at a specific site (i.e., hips, upper legs/thighs, knees, lower legs/ankles, feet/toes). Multivariable-adjusted proportional hazards models estimated associations between BMI category at accession and MID risk. RESULTS: During 15,678,743 person-months of follow-up, 411,413 cases of any first MID were documented (70,578 hip, 77,050 upper leg, 162,041 knee, 338,080 lower leg, and 100,935 foot injuries in secondary analyses). The overall MID rate was 2.62 per 100 person-months. Relative to Soldiers with normal BMI (18.5 to <25 kg/m(2)) at accession, those who were underweight (<18.5); overweight (25 to <30); or obese (≥30) had 7%, 11%, and 33% higher risk of MID, respectively, after adjustment. Risks were highest in Soldiers who were obese at accession, and lowest in those with a BMI of 21-23 kg/m(2). CONCLUSIONS: Soldier BMI at accession has important implications for MID. A BMI of 21-23 kg/m(2) in newly accessing Soldiers was associated with the lowest risk of incident MID, suggesting that accession be limited to people within this range to reduce overall incidence of MID among service personnel.


Subject(s)
Body Mass Index , Lower Extremity/injuries , Military Personnel , Obesity/epidemiology , Adult , Female , Humans , Incidence , Male , Prospective Studies , Risk Factors , United States/epidemiology
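The overall MID rate above follows directly from the reported counts; a one-line check, with cases and person-months taken from the abstract:

```python
def rate_per_100_person_months(cases, person_months):
    """Incidence rate per 100 person-months of follow-up."""
    return 100 * cases / person_months

# 411,413 first incident MIDs over 15,678,743 person-months,
# as reported above -> about 2.62 per 100 person-months.
overall = rate_per_100_person_months(411_413, 15_678_743)
```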
14.
J Orthop Sports Phys Ther ; 45(6): 477-84, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25899214

ABSTRACT

STUDY DESIGN: Retrospective cohort study. OBJECTIVES: To report the incidence rate of ankle sprains in active-duty soldiers and to examine whether soldiers who sustain ankle sprain injuries are more likely to leave the Army than those who do not. BACKGROUND: Ankle sprains are one of the most common musculoskeletal injuries in physically active people and have been identified as the most common foot or ankle injury in active-duty Army personnel, with a rate of 103 sprains per 1000 soldiers per year. METHODS: Data were analyzed on the entire active-duty US Army population from 2000 to 2006 (n = 1 014 042). A semi-parametric Cox proportional hazards model was built. RESULTS: The overall incidence rate for ankle sprains was 45.14 per 1000 person-years. After controlling for length of service prior to the study period, soldiers who sustained a single ankle sprain were 27% less likely (relative risk ratio = 0.73; 95% confidence interval: 0.73, 0.75) to leave the service than soldiers who had no documented history of an ankle sprain. However, this trend toward increased service time no longer held true for those who sustained a recurrent sprain (risk ratio = 1.07; 95% confidence interval: 0.99, 1.15). CONCLUSION: It appears that individuals who sustain an incident ankle sprain have a longer time in service in the Army than those who do not sustain this injury. However, this trend toward longer service time no longer held true for soldiers who sustained a recurrent sprain. LEVEL OF EVIDENCE: Prognosis, level 2b.


Subject(s)
Ankle Injuries/epidemiology , Military Personnel , Sprains and Strains/epidemiology , Adult , Female , Humans , Incidence , Male , Proportional Hazards Models , Recurrence , Retrospective Studies , Risk Factors , Time Factors , United States/epidemiology , Young Adult
15.
Obesity (Silver Spring) ; 23(3): 662-70, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25611465

ABSTRACT

OBJECTIVE: The US Army recruits new soldiers from an increasingly obese civilian population. The change in weight status at entry into the Army between 1989 and 2012 and the demographic characteristics associated with overweight/obesity at entry were examined. METHODS: 1,741,070 unique individuals with complete sex, age, and anthropometric information contributed data to linear and logistic regressions examining time trends and associations between demographic characteristics and overweight/obesity. RESULTS: The prevalence of overweight (body mass index 25-<30 kg/m(2)) generally increased, from 25.8% (1989) to 37.2% (2012), peaking at 37.9% (2011). The prevalence of obesity (body mass index ≥30 kg/m(2)) also increased from 5.6% (1989) to 8.0% (2012), peaking at 12.3% (2009); 2005-2009 annual prevalence exceeded 10%. The most consistent demographic characteristics predicting overweight/obesity were male sex, older age, Hispanic or Asian/Pacific Island race/ethnicity, and being married. There were no distinct geographic trends. CONCLUSIONS: The US Army is not immune to the US obesity epidemic. Demographic characteristics associated with being overweight or obese should be considered when developing military-sponsored weight management programs for new soldiers.


Subject(s)
Military Personnel/statistics & numerical data , Obesity/epidemiology , Overweight/epidemiology , Adolescent , Asian People/statistics & numerical data , Body Mass Index , Epidemics , Female , Hispanic or Latino/statistics & numerical data , Humans , Logistic Models , Male , Middle Aged , Obesity/ethnology , Overweight/ethnology , Prevalence , United States/epidemiology , White People/statistics & numerical data
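The obesity study above classifies entrants by the standard BMI cut points (overweight: 25 to <30 kg/m²; obese: ≥30 kg/m²). A minimal sketch of that categorization, with hypothetical function names:

```python
# Hypothetical sketch of the BMI categories used in the abstract above
# (overweight: 25-<30 kg/m^2; obese: >=30 kg/m^2). Function names are
# illustrative, not from the study.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def weight_status(bmi_value: float) -> str:
    """Map a BMI value onto the categories analyzed in the study."""
    if bmi_value >= 30:
        return "obese"
    if bmi_value >= 25:
        return "overweight"
    return "not overweight"

print(weight_status(bmi(85.0, 1.75)))  # BMI ~27.8 -> "overweight"
```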
16.
Mil Med ; 179(12): 1487-96, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25469973

ABSTRACT

OBJECTIVES: Training-related injury is a threat to military health and readiness. Prevalence of potential risk factors for training-related injury can change with U.S. Army recruitment goals and may influence basic combat training (BCT) injury rates. This article describes challenges of using administrative data to identify a trainee cohort and describes demographic and training characteristics across the five BCT locations. METHODS: Data from the Total Army Injury and Health Outcomes Database were used to identify a U.S. Army-wide cohort of first-time trainees from January 1, 2002 to September 30, 2007 and describe its characteristics. RESULTS: The cohort includes 368,102 first-time trainees. The annual number starting BCT increased from 52,187 in 2002 to 68,808 in 2004. The proportion of males increased from 81.57% in 2003 to 83.84% in 2007. Mean (SD) age increased from 20.67 (3.55) years in 2002 to 20.94 (3.65) years in 2007. Mean (SD) body mass index increased from 24.53 (3.56) kg/m(2) in 2002 to 24.94 (3.84) kg/m(2) in 2006. Other characteristics fluctuated by year, including proportions of race/ethnicity, accession waivers, and confirmed graduates. CONCLUSIONS: Fluctuations in trainee characteristics warrant further analysis of potential influence on BCT injury rates. For research uses, careful acquisition of administrative data is needed.


Subject(s)
Databases, Factual/standards , Inservice Training/statistics & numerical data , Military Personnel/statistics & numerical data , Physical Conditioning, Human/statistics & numerical data , Wounds and Injuries/epidemiology , Adult , Body Mass Index , Cohort Studies , Female , Humans , Male , Physical Conditioning, Human/adverse effects , Risk Factors , United States/epidemiology , Warfare , Young Adult
17.
Mil Med ; 178(6): 676-82, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23756076

ABSTRACT

BACKGROUND: There is a scarcity of analytic research on active duty Army (ADA) knee injuries (KI), such as soft tissue knee injuries (STKI), which are the predominant ADA KI pattern. PURPOSE: To quantify the independent adjusted association of significant ADA STKI risk factors, 2000-2005. MATERIALS/METHODS: Using the Total Army Injury and Health Outcomes Database, we (1) captured absolute STKI numbers and rates (N = 83,323) and (2) developed regression models to determine significant STKI risk factors. Models included STKI overall and subcategories: meniscus, patella, anterior/posterior cruciate ligament, and medial/lateral collateral ligament. RESULTS: Eight risk factors significantly increased STKI. They are: (1) prior KI (within 2 years) (odds ratio [OR] 9.83, 95% confidence interval [CI] 9.67-10.00); (2) increasing length of service (OR 1.83, 95% CI 1.76-1.90); (3) increasing age (OR 1.57, 95% CI 1.50-1.65); (4) prior deployment (OR 1.39, 95% CI 1.36-1.41); (5) prior ankle injury (OR 1.16, 95% CI 1.14-1.19); (6) Infantry occupation (OR 1.12, 95% CI 1.04-1.21); (7) marital status (OR 1.10, 95% CI 1.08-1.12); and (8) prior hip injury (OR 1.08, 95% CI 1.03-1.12). MAJOR CONCLUSION: Soldiers with a prior KI have nearly a 10-fold increased relative risk of developing a subsequent STKI.


Subject(s)
Knee Injuries/etiology , Military Personnel/statistics & numerical data , Soft Tissue Injuries/etiology , Adult , Female , Humans , Knee Injuries/epidemiology , Male , Risk Assessment , Risk Factors , Soft Tissue Injuries/epidemiology , United States , Young Adult
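The knee-injury study above reports adjusted odds ratios with 95% confidence intervals. A minimal sketch of the unadjusted version of that estimate, an odds ratio with a Wald CI from a 2×2 exposure-by-outcome table (counts and function name are illustrative, not the study's data):

```python
import math

# Hypothetical sketch: odds ratio and Wald 95% CI from a 2x2 table, the kind
# of unadjusted estimate underlying the adjusted ORs reported above.

def odds_ratio_ci(a: int, b: int, c: int, d: int) -> tuple[float, float, float]:
    """a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Returns (OR, lower 95% bound, upper 95% bound)."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - 1.96 * se_log_or)
    upper = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lower, upper

# Illustrative counts: 40/60 injured vs. uninjured among exposed,
# 25/75 among unexposed.
or_, lower, upper = odds_ratio_ci(40, 60, 25, 75)
print(round(or_, 2))  # 2.0
```

The adjusted ORs in the abstract come from a regression model rather than a single 2×2 table, but the interpretation of the estimate and its interval is the same.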
18.
BMC Urol ; 13: 6, 2013 Jan 29.
Article in English | MEDLINE | ID: mdl-23356551

ABSTRACT

BACKGROUND: We sought to improve prostate cancer (PC) detection through developing a prostate biopsy clinical decision rule (PBCDR), based on an elevated PSA and laboratory biomarkers. This decision rule could be used after initial PC screening, providing the patient and clinician information to consider prior to biopsy. METHODS: This case-control study evaluated men from the Tampa, Florida, James A. Haley (JH) Veterans Administration (VA) (N = 1,378), from January 1, 1998, through April 15, 2005. To assess the PBCDR we did all of the following: 1) Identified biomarkers that are related to PC and have the capability of improving the efficiency of PC screening; 2) Developed statistical models to determine which can best predict the probability of PC; 3) Compared each potential model to PSA alone using Receiver Operator Characteristic (ROC) curves, to evaluate for improved overall effectiveness in PC detection and reduction in (negative) biopsies; and 4) Evaluated dose-response relationships between specified lab biomarkers (surrogates for extra-prostatic disease development) and PC progression. RESULTS: The following biomarkers were related to PC: hemoglobin (HGB) (OR = 1.42 95% CI 1.27, 1.59); red blood cell (RBC) count (OR = 2.52 95% CI 1.67, 3.78); PSA (OR = 1.04 95% CI 1.03, 1.05); and creatinine (OR = 1.55 95% CI 1.12, 2.15). Comparing all PC stages versus non-cancerous conditions, the ROC curve area under the curve (AUC) enlarged (increasing the probability of correctly classifying PC): PSA (alone) 0.59 (95% CI 0.55, 0.61); PBCDR model 0.68 (95% CI 0.65, 0.71), and the positive predictive value (PPV) increased: PSA 44.7%; PBCDR model 61.8%. Comparing PC (stages II, III, IV) vs. other, the ROC AUC increased: PSA (alone) 0.63 (95% CI 0.58, 0.66); PBCDR model 0.72 (95% CI 0.68, 0.75), and the PPV increased: 20.6% (PSA); PBCDR model 55.3%.
CONCLUSIONS: These results suggest evaluating certain common biomarkers in conjunction with PSA may improve PC prediction prior to biopsy. Moreover, these biomarkers may be more helpful in detecting clinically relevant PC. Follow-up studies should begin with replicating the study on different U.S. VA patients involving multiple practices.


Subject(s)
Decision Support Techniques , Early Detection of Cancer/methods , Kallikreins/blood , Prostate-Specific Antigen/blood , Prostate/pathology , Prostatic Neoplasms/pathology , Veterans Health , Adult , Aged , Aged, 80 and over , Biopsy , Case-Control Studies , Creatinine/blood , Erythrocyte Count , Hemoglobins , Humans , Male , Middle Aged , Models, Statistical , Predictive Value of Tests , Prostatic Neoplasms/blood , ROC Curve , Retrospective Studies , United States , United States Department of Veterans Affairs
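The biopsy decision-rule study above compares models by ROC AUC. A minimal sketch of that metric, computed via its Mann-Whitney interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control (scores below are illustrative, not study data):

```python
# Hypothetical sketch: ROC AUC as the Mann-Whitney probability that a case
# outranks a control, the quantity compared between PSA alone and the
# PBCDR model in the abstract above. Ties count as half a win.

def roc_auc(case_scores: list[float], control_scores: list[float]) -> float:
    """Probability a random case score exceeds a random control score."""
    wins = 0.0
    for case in case_scores:
        for control in control_scores:
            if case > control:
                wins += 1.0
            elif case == control:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Illustrative scores: 7.5 of 9 case/control pairs are correctly ordered.
print(roc_auc([4.1, 6.3, 8.0], [2.0, 4.1, 5.5]))
```

An AUC of 0.5 corresponds to chance-level discrimination, which is why the reported improvement from 0.59 (PSA alone) to 0.68 (PBCDR model) is meaningful even though both values look modest.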
19.
Mil Med ; 177(7): 840-4, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22808892

ABSTRACT

We sought to summarize knee injuries (KI) in the U.S. Active Duty Army (ADA) in terms of absolute numbers, examine current rate trends, and identify ADA who were at increased risk for experiencing a KI. We used the Total Army Injury and Health Outcomes Database (TAIHOD) to compute unadjusted and adjusted rates of KI, categorized by the Barell Matrix, within the ADA for the years 2000-2005. During this period, 21 to 25 per 1,000 ADA suffered from KI. The highest yearly rates were observed for knee dislocation and sprains/strains (31 per 1,000 ADA). In ADA with a history of a KI (within 2 years), rates increased nearly tenfold. Elevated KI rates were also seen in ADA with prior upper or lower leg injuries, those > 30 years of age, and those with a category IV Armed Forces Qualification Test score (lowest admissible in Army). ADA KI rates remained fairly stable throughout the study period. Relative to other ADA Soldiers, those with prior knee, upper leg, or lower leg injuries are at increased risk for subsequent KI.


Subject(s)
Knee Injuries/epidemiology , Military Personnel/statistics & numerical data , Occupational Injuries/epidemiology , Sprains and Strains/epidemiology , Adolescent , Adult , Age Factors , Female , Humans , Incidence , Knee Dislocation/epidemiology , Male , Risk Factors , United States/epidemiology , Young Adult
20.
Med Sci Sports Exerc ; 44(3): 442-9, 2012 Mar.
Article in English | MEDLINE | ID: mdl-21857374

ABSTRACT

PURPOSE: Rhabdomyolysis (RM) is a skeletal muscle disorder resulting in severe cellular injury caused by vigorous physical activity and other systemic etiologies. RM is associated with significant morbidity, such as acute renal failure, and can be fatal. RM that occurs in the US Active Duty Army (ADA) results in time lost from training, deployment, and combat. We sought to systematically describe the epidemiology of ADA clinical RM by quantifying RM in terms of absolute numbers, examining rate trends, and identifying soldiers at elevated risk. METHODS: We used data from the Total Army Injury and Health Outcomes Database to calculate yearly RM rates in the overall ADA, as well as adjusted RM rates within soldier subpopulations for 2003-2006. RESULTS: During this period, the absolute numbers of clinically diagnosed ADA RM ranged between 382 and 419 cases per year. Annual rates were 7-8 per 10,000, which is 300%-400% higher than the estimated US civilian population (2 per 10,000). In soldiers with a history of a prior heat injury, RM rates climbed to 52-86 per 10,000, a 7- to 11-fold increase. Increased RM rates were seen in soldiers who were male, African American, younger, less educated, and of shorter length of service. Approximately 8% of yearly ADA RM cases resulted in acute renal failure, an estimate lower than that for the US civilian population. CONCLUSIONS: Our findings suggest that rates of RM are higher in the ADA than in the US civilian population. Rates remained fairly stable; however, relative to other ADA soldiers, those with prior heat injury, who are African American, or who have a length of service of less than 90 d are at the highest risk for RM development.


Subject(s)
Military Personnel , Rhabdomyolysis/epidemiology , Adult , Educational Status , Female , Humans , Incidence , Male , Rhabdomyolysis/ethnology , Risk Factors , United States/epidemiology