Results 1 - 20 of 21
1.
Clin Pharmacol Ther; 107(3): 514-520, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31608984

ABSTRACT

A significant regulatory gap impedes global development of therapeutics for nononcology severely debilitating or life-threatening diseases or conditions (SDLTs). In a 2017 publication, a streamlined approach to the development of treatments for SDLTs was proposed to facilitate earlier and continued patient access to new, potentially beneficial therapeutics [1]. However, a major hindrance to broad adoption of this streamlined approach has been the lack of universally accepted, objective criteria for defining SDLTs. This article extends the 2017 publication by further addressing the challenge of defining SDLT scope, in order to stimulate broader discussion and facilitate the development of regional and, ultimately, international guidelines on the development of therapeutics for SDLTs. Using case examples, we describe key attributes of SDLTs and propose criteria to consider in defining SDLT scope.


Subject(s)
Drug Development/legislation & jurisprudence , Guidelines as Topic , Internationality , Humans , Severity of Illness Index , Terminology as Topic
2.
J Gen Intern Med; 30(7): 1004-12, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25735938

ABSTRACT

OBJECTIVES: We set out to review the efficacy of Community Health Worker (CHW) interventions to improve glycemia in people with diabetes. METHODS: Data sources included the Cochrane Central Register of Controlled Trials, Medline, clinicaltrials.gov, Google Scholar, and the reference lists of previous publications. We reviewed randomized controlled trials (RCTs) that assessed the efficacy of CHW interventions, compared with usual care, in lowering hemoglobin A1c (A1c). Two investigators independently reviewed the RCTs and assessed their quality. Only RCTs with at least 12 months of follow-up were meta-analyzed. A random-effects model was used to estimate, from unadjusted within-group mean reductions, the standardized mean difference (SMD) in A1c achieved by the CHW intervention beyond usual care. RESULTS: Thirteen RCTs were included in the narrative review; nine of them, with at least 12 months of follow-up, were included in the meta-analysis. Publication bias could not be ruled out because of the small number of trials. Outcome heterogeneity was moderate (I² = 37%). The SMD in A1c (95% confidence interval) was 0.21 (0.11-0.32). Meta-regression showed an association between higher baseline A1c and a larger effect size. CONCLUSIONS: CHW interventions produced a modest reduction in A1c compared with usual care. The A1c reduction was larger in studies with higher mean baseline A1c. Caution is warranted, given the small number of studies.
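For readers who want to reproduce the pooling step, the sketch below implements a DerSimonian-Laird random-effects model in Python. This is a generic illustration, not the authors' code: the abstract does not name the exact estimator, and the per-study SMDs and variances here would first have to be computed from the published trial data.

    import numpy as np

    def random_effects_smd(smds, variances):
        # Pool standardized mean differences with a DerSimonian-Laird
        # random-effects model; returns pooled SMD, 95% CI, and I^2 (%).
        smds = np.asarray(smds, float)
        v = np.asarray(variances, float)
        w = 1.0 / v                                   # fixed-effect weights
        smd_fe = np.sum(w * smds) / np.sum(w)
        q = np.sum(w * (smds - smd_fe) ** 2)          # Cochran's Q
        df = len(smds) - 1
        i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                 # between-study variance
        w_re = 1.0 / (v + tau2)                       # random-effects weights
        pooled = np.sum(w_re * smds) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2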


Subject(s)
Community Health Workers , Delivery of Health Care/organization & administration , Diabetes Mellitus/therapy , Hyperglycemia/prevention & control , Blood Glucose/metabolism , Community Health Services/organization & administration , Diabetes Mellitus/blood , Glycated Hemoglobin/metabolism , Humans , Hyperglycemia/blood , Randomized Controlled Trials as Topic/methods
3.
Inj Epidemiol; 1(1): 31, 2014 Dec.
Article in English | MEDLINE | ID: mdl-27747664

ABSTRACT

BACKGROUND: Unintentional drug overdose has increased markedly in the past two decades and has surpassed motor vehicle crashes as the leading cause of injury mortality in many states. The purpose of this study was to understand the trajectory of the drug overdose epidemic in the United States by applying Farr's law. Farr's "law of epidemics" and the Bregman-Langmuir back-calculation method were applied to United States drug overdose mortality data for 1980 through 2011 to project annual death rates from drug overdose for 2012 through 2035. FINDINGS: From 1980 to 2011, annual drug overdose mortality increased from 2.7 to 13.2 deaths per 100,000 population. Projected drug overdose mortality peaks in 2016-2017 at 16.1 deaths per 100,000 population and then declines progressively, reaching 1.9 deaths per 100,000 population in 2035. CONCLUSION: The projections based on Farr's law suggest that drug overdose mortality in the United States will decline in the coming years and return approximately to the 1980 baseline level by 2034.
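Farr's law is often formalized, as in Bregman and Langmuir's AIDS projection, by treating the epidemic curve as a normal (Gaussian) curve in time, which makes log mortality quadratic in the year. The sketch below fits that quadratic and extrapolates; it illustrates the idea under that assumption and is not necessarily the authors' exact back-calculation.

    import numpy as np

    def farr_projection(years, rates, last_year=2035):
        # Farr's law as a Gaussian epidemic curve: log(rate) is quadratic
        # in time. Fit on observed years, then extrapolate forward.
        t = np.asarray(years, float)
        t0 = t.mean()                      # center time for numerical stability
        coeffs = np.polyfit(t - t0, np.log(np.asarray(rates, float)), 2)
        future = np.arange(t[0], last_year + 1)
        return future, np.exp(np.polyval(coeffs, future - t0))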

4.
Cancer Epidemiol Biomarkers Prev; 22(10): 1756-61, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23897585

ABSTRACT

BACKGROUND: Our prior studies of lung cancer suggested that a novel biomarker, pro-surfactant protein B (pro-SFTPB), might serve as a predictive marker for this disease. We aimed to determine the potential of pro-SFTPB as a risk marker for distinguishing lung cancer cases from matched controls. METHODS: Study subjects were drawn from the longitudinal Physicians' Health Study (PHS). Cases (n = 188) were individuals who were cancer-free at study enrollment but developed lung cancer during follow-up. Controls (n = 337) were subjects who did not develop lung cancer. Cases and controls were matched on date of study enrollment, age at enrollment, and smoking status and amount. Baseline plasma samples drawn at enrollment were analyzed for pro-SFTPB by ELISA to detect differences in protein expression levels between cases and controls. RESULTS: Nondetectable pro-SFTPB status was significantly associated with lung cancer risk [OR = 5.88; 95% confidence interval (CI) 1.24-27.48]. Among subjects with detectable levels of the protein, increasing plasma concentration of pro-SFTPB was associated with higher lung cancer risk (OR = 1.41 per unit increase in log pro-SFTPB; 95% CI 1.08-1.84). CONCLUSION: These results suggest a nonlinear, J-shaped association between plasma pro-SFTPB levels and lung cancer risk, with both nondetectable and higher levels of the marker associated with lung cancer. IMPACT: These results show promise for a risk marker that could help predict lung cancer risk and narrow the high-risk population for low-dose computed tomography screening.
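The "OR per unit increase in log pro-SFTPB" is the exponentiated coefficient of a logistic model fitted on the log-transformed marker. Below is a minimal sketch with made-up plasma values; the study's matched design would properly call for conditional logistic regression, so treat this only as the shape of the computation.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical plasma pro-SFTPB values (detectable subjects only)
    marker = np.array([12., 35., 8., 20., 30., 55., 70., 40.])
    case = np.array([0, 0, 0, 0, 1, 1, 1, 1])   # 1 = developed lung cancer
    X = sm.add_constant(np.log(marker))
    fit = sm.Logit(case, X).fit(disp=0)
    print(np.exp(fit.params[1]))  # OR per unit increase in log marker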


Subject(s)
Biomarkers, Tumor/blood , Lung Neoplasms/blood , Receptors, Fc/blood , Case-Control Studies , Female , Humans , Lung Neoplasms/prevention & control , Male , Middle Aged , Risk Factors , Vitamins/administration & dosage , beta Carotene/administration & dosage
5.
Mil Med; 175(6): 417-23, 2010 Jun.
Article in English | MEDLINE | ID: mdl-20572474

ABSTRACT

This investigation evaluated the effects of a 13-month deployment to Iraq on body composition and selected fitness measures. Seventy-three combat arms soldiers were measured pre- and postdeployment. Body composition was assessed by dual-energy X-ray absorptiometry (DXA). Strength was measured by single repetition maximum (1-RM) lifts on the bench press and squat. Power was assessed by a bench throw and squat jump. Aerobic endurance was evaluated with a timed 2-mile run. Exercise and injury history were assessed by questionnaire. Upper and lower body strength improved by 7% and 8%, respectively (p < 0.001). Upper body power increased 9% (p < 0.001) and lean mass increased 3% (p < 0.05). In contrast, aerobic performance declined 13% (p < 0.001) and fat mass increased 9% (p < 0.05). Fewer soldiers participated in aerobic exercise or sports during deployment (p < 0.001). Unit commanders should be aware of potential fitness and body composition changes during deployment and develop physical training programs to enhance fitness following deployment.


Subject(s)
Body Composition/physiology , Military Personnel , Physical Fitness/physiology , Absorptiometry, Photon , Adolescent , Adult , Follow-Up Studies , Humans , Iraq War, 2003-2011 , Male , Surveys and Questionnaires , Time Factors , Young Adult
6.
Am J Prev Med; 38(1 Suppl): S182-8, 2010 Jan.
Article in English | MEDLINE | ID: mdl-20117591

ABSTRACT

INTRODUCTION: Military parachuting has been shown to result in injuries. This investigation systematically reviewed studies examining the influence of the parachute ankle brace (PAB) on injuries during military parachuting and performed a cost-effectiveness analysis. EVIDENCE ACQUISITION: PAB studies were obtained from seven databases, personal contacts, and other sources. Investigations were reviewed if they contained original, quantitative information on PAB use and injuries during parachuting. Meta-analysis was performed using a general variance-based method that calculated summary risk ratios (SRRs) and 95% CIs. EVIDENCE SYNTHESIS: Five studies met the review criteria. Compared with PAB users, PAB non-users had a higher risk of ankle injuries (SRR = 2.1, 95% CI = 1.8-2.5); ankle sprains (SRR = 2.1, 95% CI = 1.4-3.1); ankle fractures (SRR = 1.8, 95% CI = 1.1-2.9); and all parachuting injuries combined (SRR = 1.2, 95% CI = 1.1-1.4). The PAB had little effect on lower body injuries exclusive of the ankle (SRR [no PAB/PAB] = 0.9, 95% CI = 0.7-1.2). Cost-effectiveness analysis estimated that, for every dollar spent on the PAB, about $7 to $9 could be saved in medical and personnel costs. CONCLUSIONS: The PAB reduces ankle injuries by about half and is a cost-effective device that should be worn during military airborne operations to reduce injury risk.
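A "general variance-based" summary risk ratio is typically computed by pooling log risk ratios with inverse-variance weights, recovering each study's standard error from its reported confidence interval. A minimal sketch, assuming that is the method used here:

    import numpy as np

    def summary_risk_ratio(rrs, ci_lows, ci_highs):
        # Inverse-variance pooling of risk ratios on the log scale
        log_rr = np.log(np.asarray(rrs, float))
        # Back out each study's SE from the width of its 95% CI
        se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)
        w = 1.0 / se ** 2
        pooled = np.sum(w * log_rr) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        return np.exp(pooled), (np.exp(lo), np.exp(hi))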


Subject(s)
Accidents, Occupational/prevention & control , Ankle Injuries/prevention & control , Aviation/statistics & numerical data , Braces , Military Personnel/statistics & numerical data , Accidents, Occupational/economics , Accidents, Occupational/statistics & numerical data , Ankle Injuries/economics , Ankle Injuries/epidemiology , Ankle Injuries/etiology , Aviation/economics , Cost-Benefit Analysis , Humans , Protective Devices , United States/epidemiology
7.
J Strength Cond Res; 23(4): 1353-62, 2009 Jul.
Article in English | MEDLINE | ID: mdl-19528858

ABSTRACT

This paper reviews the rationale for and evaluations of Physical Readiness Training (PRT), the new U.S. Army physical training doctrine designed to improve soldiers' physical capability for military operations. The purposes of PRT are to improve physical fitness, prevent injuries, progressively train soldiers, and develop soldiers' self-confidence and discipline. PRT follows the principles of progressive overload, regularity, specificity, precision, variety, and balance. Specificity was achieved by examining the standard list of military (warrior) tasks and determining 1) the physical requirements, 2) the fitness components involved, and 3) the training activities most likely to improve performance of those tasks. Injury-prevention features include reduced running mileage, exercise variety (cross-training), and gradual, progressive training. In three military field studies, the overall adjusted risk of injury was 1.5-1.8 times higher in groups of soldiers performing traditional military physical training programs than in groups using a PRT program. Scores on the Army Physical Fitness Test were similar or higher in groups using PRT programs. In an 8-week laboratory study comparing PRT with a weightlifting/running program, both programs resulted in major improvements in militarily relevant tasks (e.g., 3.2-km walk/run with a 32-kg load, 400-m run with an 18-kg load, 5- to 30-second rushes to and from the prone position, 80-kg casualty drag, obstacle course). Compared with traditional military physical training programs, PRT consistently resulted in fewer injuries and equal or greater improvements in fitness and military task performance.


Subject(s)
Military Medicine , Military Personnel , Physical Education and Training/methods , Physical Fitness , Humans , Self Concept , United States , Wounds and Injuries/prevention & control
8.
J Strength Cond Res; 23(3): 685-97, 2009 May.
Article in English | MEDLINE | ID: mdl-19387413

ABSTRACT

Popular running magazines and running shoe companies suggest that imprints of the bottom of the feet (plantar shape) can be used as an indicator of the height of the medial longitudinal foot arch and thus to select individually appropriate types of running shoes. This study examined whether this selection technique influenced injury risk during United States Army Basic Combat Training (BCT). After foot examinations, BCT recruits in an experimental group (E: n = 1,079 men and 451 women) selected motion control, stability, or cushioned shoes for plantar shapes judged to represent low, medium, or high foot arches, respectively. A control group (C: n = 1,068 men and 464 women) received a stability shoe regardless of plantar shape. Injuries during BCT were determined from outpatient medical records. Other previously known injury risk factors (e.g., age, fitness, and smoking) were obtained from a questionnaire and existing databases. Multivariate Cox regression controlling for other injury risk factors showed little difference in injury risk between the E and C groups among men (risk ratio (E/C) = 1.01; 95% confidence interval = 0.88-1.16; p = 0.87) or women (risk ratio (E/C) = 1.07; 95% confidence interval = 0.91-1.25; p = 0.44). In practical application, this prospective study demonstrated that selecting shoes based on plantar shape had little influence on injury risk in BCT. Thus, if the goal is injury prevention, this selection technique is not necessary in BCT.
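For reference, the multivariate Cox model described here follows a standard pattern. The sketch below uses the lifelines library with toy data and hypothetical column names; the study's actual covariates and coding are not given in the abstract.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Toy data: weeks until first injury (censored at 9 weeks of BCT),
    # injury indicator, shoe-assignment group, and one example covariate.
    df = pd.DataFrame({
        "weeks":     [9, 4, 9, 7, 9, 3, 6, 9],
        "injured":   [0, 1, 0, 1, 0, 1, 1, 0],
        "exp_group": [1, 1, 0, 0, 1, 0, 1, 0],  # 1 = shoes by plantar shape
        "age":       [19, 22, 20, 25, 18, 21, 24, 23],
    })
    cph = CoxPHFitter()
    cph.fit(df, duration_col="weeks", event_col="injured")
    cph.print_summary()  # exp(coef) of exp_group ~ the adjusted risk ratio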


Subject(s)
Foot Injuries/prevention & control , Military Personnel , Orthotic Devices , Running/injuries , Shoes , Adult , Biomechanical Phenomena , Case-Control Studies , Female , Humans , Male , Proportional Hazards Models , Prospective Studies , Risk Factors , United States
9.
Med Sci Sports Exerc; 40(9): 1687-92, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18685520

ABSTRACT

PURPOSE: To examine changes in physical fitness and body composition after a military deployment to Afghanistan. METHODS: One hundred and ten infantry soldiers were measured before and after a 9-month deployment to Afghanistan for Operation Enduring Freedom. Measurements included treadmill peak oxygen uptake (peak VO2), lifting strength, medicine ball put, vertical jump, and body composition estimated via dual-energy x-ray absorptiometry (percent body fat, absolute body fat, fat-free mass, bone mineral content, and bone mineral density). RESULTS: There were significant decreases (P < 0.01) in peak VO2 (-4.5%), medicine ball put (-4.9%), body mass (-1.9%), and fat-free mass (-3.5%), whereas percent body fat increased from 17.7% to 19.6%. Lifting strength and vertical jump performance did not change from predeployment to postdeployment. CONCLUSIONS: A 9-month deployment to Afghanistan negatively affected aerobic capacity, upper body power, and body composition. The predeployment-to-postdeployment changes were not large and are unlikely to present a major health or fitness concern. If deployments continue to lengthen and the time between deployments decreases, however, the effects may be magnified, warranting further study.
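The abstract does not name the statistical test; the conventional choice for pre/post comparisons within the same soldiers is a paired t-test, sketched here with hypothetical peak VO2 values.

    import numpy as np
    from scipy import stats

    # Hypothetical pre/post peak VO2 (mL/kg/min) for five soldiers
    pre  = np.array([52.1, 48.3, 55.0, 50.2, 47.8])
    post = np.array([49.5, 46.9, 52.4, 48.1, 46.0])
    t, p = stats.ttest_rel(pre, post)
    change = 100 * (post.mean() - pre.mean()) / pre.mean()
    print(f"{change:.1f}% change, p = {p:.4f}")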


Subject(s)
Body Composition/physiology , Military Personnel , Physical Fitness/physiology , Adolescent , Adult , Afghanistan , Exercise Test , Humans , Male , Muscle Strength , Oxygen Consumption , United States , Young Adult
10.
Aviat Space Environ Med; 79(7): 689-94, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18619129

ABSTRACT

INTRODUCTION: This investigation examined risk factors for injuries during military parachute training and solicited attitudes and opinions regarding a parachute ankle brace (PAB) that has been shown to protect against ankle injuries. METHODS: Male Army airborne students (N = 1677) completed a questionnaire after they had successfully executed 4 of the 5 jumps necessary for qualification as a military paratrooper. The questionnaire asked about injuries during parachute descents, demographics, lifestyle characteristics, physical characteristics, physical fitness, airborne recycling (i.e., repeating airborne training because of failure to qualify on a previous attempt), PAB wear, problems with aircraft exits, and injuries in the year before airborne school. A final section of the questionnaire solicited open-ended comments about the PAB. RESULTS: Increased risk of a parachute-related injury occurred among students who had longer time in service, were older, taller, heavier, performed fewer push-ups, ran slower, were airborne recycles, did not wear the PAB, had an aircraft exit problem, and/or reported an injury in the year prior to jump school. Among students who wore the brace, most negative comments about the PAB had to do with design, comfort, and difficulties during parachute landing falls. CONCLUSIONS: This study supported some previously identified injury risk factors (older age, greater body weight, and not using a PAB) and identified a number of new risk factors. To address PAB design and comfort issues, a strap is being added over the dorsum of the foot to better hold the PAB in place.


Subject(s)
Ankle Injuries/prevention & control , Ankle , Aviation , Braces , Military Personnel , Adolescent , Adult , Aerospace Medicine , Ankle Injuries/etiology , Humans , Male , Risk Factors , United States
11.
Mil Med; 173(5): 465-73, 2008 May.
Article in English | MEDLINE | ID: mdl-18543568

ABSTRACT

This retrospective study was conducted to assess the nature and causes of serious oral-facial illnesses and injuries among U.S. Army personnel deployed to Iraq and Afghanistan in 2003 and 2004. Information for this study came from the U.S. Air Force Transportation Regulating and Command & Control Evacuation System database for medical evacuations (MEDEVACS) for 2003 to 2004. The study found 327 oral-facial MEDEVACS out of Iraq (cumulative incidence: 11/10,000 soldiers per year) and 47 out of Afghanistan (cumulative incidence: 21/10,000 soldiers per year), for a total of 374 MEDEVACS. Forty-two percent (n = 158) of all oral-facial MEDEVACS were due to diseases of the oral cavity, salivary glands, and jaw. Another 36% (n = 136) of oral-facial MEDEVACS were for battle injuries, primarily fractures of the mandible, caused by acts of war. Twenty-one percent (n = 80) of oral-facial MEDEVACS were due to nonbattle injuries, primarily fractures of the mandible, mainly caused by motor vehicle accidents and fighting.


Subject(s)
Air Ambulances , Facial Injuries/epidemiology , Mouth Diseases/epidemiology , Mouth/injuries , Patient Transfer , Triage , Warfare , Adolescent , Adult , Case-Control Studies , Female , Humans , Iraq , Male , Middle Aged , Military Medicine , Retrospective Studies , United States/epidemiology
12.
Aviat Space Environ Med; 79(4): 408-15, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18457298

ABSTRACT

INTRODUCTION: This study examined the injury-prevention effectiveness of the parachute ankle brace (PAB) while controlling for known extrinsic risk factors. METHODS: Injuries among airborne students who wore the PAB during parachute descents were compared with injuries among those who did not. Injury risk factors from administrative records included wind speed, combat loads, and time of day (day/night). Injury data were collected in the drop zone. RESULTS: A total of 596 injuries occurred in 102,784 parachute descents. In univariate analysis, students not wearing the PAB (Controls) were 2.00 [95% confidence interval (95% CI) = 1.32-3.02] times more likely to experience an ankle sprain, 1.83 (95% CI = 1.04-3.24) times more likely to experience an ankle fracture, and 1.92 (95% CI = 1.38-2.67) times more likely to experience an ankle injury of any type. PAB wearers and Controls had a similar incidence of lower body injuries exclusive of the ankle [risk ratio (Control/PAB) = 0.92, 95% CI = 0.65-1.30]. After accounting for known extrinsic injury risk factors, Controls were 1.90 (95% CI = 1.24-2.90) times more likely than PAB wearers to experience an ankle sprain, 1.47 (95% CI = 0.82-2.63) times more likely to experience an ankle fracture, and 1.75 (95% CI = 1.25-2.48) times more likely to experience an ankle injury of any type. The incidence of parachute entanglements that persisted until the jumpers reached the ground was similar among PAB wearers and Controls [incidence rate ratio (Control/PAB) = 1.17, 95% CI = 0.61-2.29]. CONCLUSION: After controlling for known injury risk factors, the PAB protected against ankle injuries, especially ankle sprains, while not influencing parachute entanglements or lower body injuries exclusive of the ankle.


Subject(s)
Ankle Injuries/prevention & control , Aviation , Braces , Military Personnel , Sprains and Strains/prevention & control , Accidents, Occupational , Adult , Ankle Injuries/etiology , Female , Humans , Male , Risk Factors , Sprains and Strains/etiology
13.
Am J Ind Med; 50(12): 951-61, 2007 Dec.
Article in English | MEDLINE | ID: mdl-17979136

ABSTRACT

BACKGROUND: This project documented injuries among the professional musical performers of the US Army Band and used a multivariate approach to determine injury risk factors. METHODS: Injuries were obtained from a medical surveillance database. Administrative records from the Band provided fitness test scores, physical characteristics, performing unit (Blues, Ceremonial, Chorale, Chorus, Concert, Strings), and functional group (strings, winds, keyboard, vocal, percussion, brass). A questionnaire completed by 95% of the Band (n = 205) included queries on practice time, physical activity, tobacco use, and medical care. RESULTS: One or more injuries were diagnosed in 44% and 53% of Band members in 2004 and 2005, respectively. In univariate analysis, higher injury risk was associated with higher body mass index (BMI), less physical activity, prior injury, unit, functional group, and practice duration. In multivariate analysis, less self-rated physical activity, a prior injury, and functional group were independent risk factors. CONCLUSION: In the US Army Band, about half the performers had a medical visit for an injury in a 1-year period, and injury risk was associated with identifiable factors.


Subject(s)
Military Medicine , Military Personnel , Motor Activity , Music , Occupational Diseases/etiology , Occupational Health , Physical Fitness , Wounds and Injuries/etiology , Adult , Body Mass Index , Databases as Topic , Female , Health Surveys , Humans , Incidence , Male , Middle Aged , Occupational Diseases/epidemiology , Surveys and Questionnaires , United States/epidemiology , Wounds and Injuries/epidemiology
14.
Mil Med; 172(9): 988-96, 2007 Sep.
Article in English | MEDLINE | ID: mdl-17937365

ABSTRACT

This study describes injury rates, injury diagnoses, anatomical locations of injuries, limited duty days, and activities associated with injuries in a sample of Army mechanics. Medical records of 518 male and 43 female Army mechanics at a large U.S. Army installation were screened for injuries over 1 year. Weight, height, age, and ethnicity were also extracted from the medical records. Body mass index was calculated as weight/height². Overall injury rates for men and women were 124 and 156 injuries/100 person-years, respectively, with a rate of 127 injuries/100 person-years for all soldiers combined. Women had higher overuse injury rates, while men had higher traumatic injury rates. Limited duty days for men and women were 2,076 and 1,966 days/100 person-years, respectively. The lower back, knee, ankle, foot, and shoulder accounted for 61% of the injuries. Activities associated with injury included (in order of incidence) physical training, mechanical work, sports, airborne-related activities, road marching, garrison/home activities, and chronic conditions. Among the men, elevated injury risk was associated with higher body weight and higher body mass index. It may be possible to prevent many injuries by implementing evidence-based interventions currently available in the literature.
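The two quantities this abstract leans on are easy to state precisely; a minimal sketch (the example injury count is back-calculated from the published rate, not reported directly):

    def rate_per_100_person_years(n_injuries, n_people, years=1.0):
        # Incidence rate expressed per 100 person-years of follow-up
        return 100.0 * n_injuries / (n_people * years)

    def bmi(weight_kg, height_m):
        # Body mass index = weight / height^2, in kg/m^2
        return weight_kg / height_m ** 2

    # e.g. ~643 injuries among 518 men over 1 year -> ~124/100 person-years
    print(rate_per_100_person_years(643, 518))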


Subject(s)
Military Personnel/statistics & numerical data , Motor Vehicles , Wounds and Injuries/epidemiology , Adolescent , Adult , Body Mass Index , Body Weight , Female , Humans , Male , Retrospective Studies , Risk Factors , United States/epidemiology
15.
Mil Med; 172(2): 115-20, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17357760

ABSTRACT

Epidemiological studies often have to rely on participants' self-reporting of information. The validity of the self-report instrument is an important consideration in any study. The purpose of this investigation was to determine the validity of self-reported Army Physical Fitness Test (APFT) scores. The APFT is administered to all soldiers in the U.S. Army twice a year and consists of the maximum number of push-ups completed in 2 minutes, the maximum number of sit-ups completed in 2 minutes, and a 2-mile run for time. Army mechanics responded to a questionnaire in March and June 2004 asking them to report the exact scores of each event on their most recent APFT. Actual APFT scores were obtained from the soldiers' military units. The mean ± standard deviation (SD) of actual and self-reported numbers of push-ups were 61 ± 14 and 65 ± 13, respectively. The mean ± SD of actual and self-reported numbers of sit-ups were 66 ± 10 and 68 ± 10, respectively. The mean ± SD of actual and self-reported run times (minutes) were 14.8 ± 1.4 and 14.6 ± 1.4, respectively. Correlations between actual and self-reported push-ups, sit-ups, and run times were 0.83, 0.71, and 0.85, respectively. On average, soldiers tended to slightly over-report performance on all APFT events, and Bland-Altman plots showed that individual self-reported scores could vary widely from actual scores. Despite this, the high correlations between actual and self-reported scores suggest that self-reported values are adequate for most epidemiological military studies involving larger sample sizes.
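Bland-Altman agreement reduces to the bias (mean difference) and its 95% limits of agreement; a minimal sketch with hypothetical push-up counts:

    import numpy as np

    def bland_altman(actual, reported):
        # Bias and 95% limits of agreement between two measurements
        diff = np.asarray(reported, float) - np.asarray(actual, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    print(bland_altman([58, 70, 45, 62], [60, 75, 50, 61]))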


Subject(s)
Exercise Test/methods , Military Personnel , Physical Fitness , Adult , Female , Humans , Male , Prospective Studies , Reproducibility of Results , Surveys and Questionnaires , United States
16.
Sports Med; 37(2): 117-44, 2007.
Article in English | MEDLINE | ID: mdl-17241103

ABSTRACT

Three systematic reviews were conducted on: (i) the history of mouthguard use in sports; (ii) mouthguard material and construction; and (iii) the effectiveness of mouthguards in preventing orofacial injuries and concussions. Retrieval databases and bibliographies were explored to find studies using specific key words for each topic. The first recorded use of mouthguards was by boxers, and in the 1920s professional boxing became the first sport to require them. Advocacy by the American Dental Association led to the mandating of mouthguards for US high school football in the 1962 season. Currently, the US National Collegiate Athletic Association requires mouthguards for four sports (ice hockey, lacrosse, field hockey and football), while the American Dental Association recommends their use in 29 sports/exercise activities. Mouthguard properties measured in various studies included shock-absorbing capability, hardness, stiffness (indicative of protective capability), tensile strength, tear strength (indicative of durability) and water absorption. Materials used for mouthguards included: (i) polyvinylacetate-polyethylene or ethylene vinyl acetate (EVA) copolymer; (ii) polyvinylchloride; (iii) latex rubber; (iv) acrylic resin; and (v) polyurethane. Latex rubber was a popular material in early mouthguards, but it has lower shock absorbency, lower hardness and less tear and tensile strength than EVA or polyurethane. Among the more modern materials, none stands out as clearly superior, since the characteristics of all of them can be manipulated to provide a range of favourable properties. Impact studies have shown that, compared with no mouthguard, mouthguards composed of many types of materials reduce the number of fractured teeth and head acceleration. In mouthguard design, consideration must be given to the nature of the collision (hard or soft objects) and characteristics of the mouth (e.g. brittle incisors, more rugged occlusal surfaces of molars, soft gingiva). Laminates with different shock-absorbing and stress-distributing (stiffness) capabilities may be one way to accommodate these factors. Studies comparing mouthguard users with nonusers have examined different sports, employed a variety of study designs and used widely varying injury case definitions. Prior to the 1980s, most studies exhibited relatively low methodological quality. Despite these issues, meta-analyses indicated that the risk of an orofacial sports injury was 1.6-1.9 times higher when a mouthguard was not worn. However, the evidence that mouthguards protect against concussion was inconsistent, and no conclusion regarding their effectiveness in preventing concussion can be drawn at present. Mouthguards should continue to be used in sport activities where there is significant risk of orofacial injury.


Subject(s)
Athletic Injuries/prevention & control , Mouth Protectors , Safety , Tooth Injuries/prevention & control , Boxing/injuries , Equipment Design , Football/injuries , Hockey/injuries , Humans , Risk Factors , United States
17.
Mil Med; 171(11): 1051-6, 2006 Nov.
Article in English | MEDLINE | ID: mdl-17153540

ABSTRACT

This study describes injury and illness rates and some risk factors among soldiers from an armor division during a rotation at the National Training Center (Fort Irwin, California). Soldiers from a brigade of the 1st Cavalry Division were involved in a 5-week training exercise at the National Training Center. Health care visits were systematically recorded by the unit medics. Of 4,101 men and 413 women who participated in the exercise, 504 soldiers (409 men and 95 women) sought medical care at the main support medical clinic or Weed Army Community Hospital. The rates of injury and illness visits were 1.2% and 0.6% per week for men and 2.3% and 2.2% per week for women, respectively. Women had twice the risk of an injury and 3.5 times the risk of an illness, compared with men. Compared with other branches, combat service support soldiers had higher rates of injuries and illnesses. Enlisted soldiers of lower rank (E1-E4) experienced higher injury and illness rates than did noncommissioned officers and commissioned officers. Musculoskeletal injuries, environmental conditions, and dermatological conditions accounted for most visits.


Subject(s)
Accidents, Occupational/statistics & numerical data , Hospitals, Military/statistics & numerical data , Military Medicine/statistics & numerical data , Military Personnel/statistics & numerical data , Occupational Diseases/epidemiology , Physical Education and Training/statistics & numerical data , Risk Assessment , Wounds and Injuries/epidemiology , Adolescent , Adult , California , Female , Humans , Incidence , Male , Military Personnel/education , Occupational Diseases/classification , Risk Factors , Sex Distribution , United States/epidemiology , Wounds and Injuries/classification
18.
Mil Med; 171(7): 669-77, 2006 Jul.
Article in English | MEDLINE | ID: mdl-16895139

ABSTRACT

During the first few days of Army Basic Combat Training (BCT), recruits take a running test and, after completing it, are ranked from fast to slow. Four roughly equal-sized "ability groups" are established from these rankings, and recruits run together in these groups for their physical training during BCT. In the past, there has been no formal guidance on how fast or how far these ability groups should run. To fill this void, this study provides guidance on running speeds and distances during BCT. The major considerations are: (1) minimizing injuries, (2) the initial aerobic fitness level of recruits, (3) historical improvements in run times during BCT, (4) historical running speeds of the slower individuals in each ability group, (5) running speeds that must be achieved to "pass" the 2-mile run in BCT, (6) the gender composition of the ability groups, and (7) recommendations from the trainers and field testing. Three databases were analyzed, containing a total of 16,716 men and 11,600 women. Four steps were used in the analyses: (1) establishment of run-time cut points for representative ability groups, (2) determination of initial (starting) run speeds, (3) estimation of changes in run speeds with training, and (4) establishment of run speeds and distances for each week of BCT. Efforts were made to (1) keep the running speeds between 70% and 83% of the estimated maximal oxygen uptake (VO2max) for all ability groups, (2) consider the 2-mile running pace of the slower individuals in each ability group, and (3) keep the total running distance for the two slower ability groups below 25 miles, the apparent threshold for increasing injury incidence. A chart provides speeds and distances for each ability group for each week of BCT. Using these recommended speeds and distances should allow trainees to improve their aerobic fitness, pass the Army Physical Fitness Test, and minimize injuries that result in lost training time and, ultimately, lower fitness levels.
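The abstract does not state how a %VO2max target was converted into a pace; one standard way is ACSM's level-ground running equation, VO2 = 0.2 x speed + 3.5 (speed in m/min). A sketch under that assumption, not necessarily the authors' method:

    def pace_at_fraction_vo2max(vo2max, fraction):
        # Invert ACSM's level-ground running equation:
        # VO2 (mL/kg/min) = 0.2 * v (m/min) + 3.5  ->  v = (VO2 - 3.5) / 0.2
        v = (fraction * vo2max - 3.5) / 0.2       # speed, m/min
        return v, 1609.34 / v                     # and minutes per mile

    # e.g. a recruit with a VO2max of 45 mL/kg/min training at 70%:
    speed, pace = pace_at_fraction_vo2max(45.0, 0.70)
    print(f"{speed:.0f} m/min, {pace:.1f} min/mile")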


Subject(s)
Military Personnel/classification , Physical Fitness/physiology , Running/physiology , Adolescent , Adult , Databases as Topic , Female , Humans , Male , Military Personnel/education , Oxygen Consumption , Physical Education and Training , Professional Competence , Risk Assessment , Task Performance and Analysis , Time Factors
19.
Sports Med; 36(7): 613-34, 2006.
Article in English | MEDLINE | ID: mdl-16796397

ABSTRACT

This article defines physical fitness and then reviews the literature on temporal trends in the physical fitness of new US Army recruits. Nineteen papers met the review criteria and published recruit fitness data from 1975 to 2003. The limited data on recruit muscle strength suggested an increase from 1978 to 1998 (a 20-year period). Data on push-ups and sit-ups suggested no change in muscular endurance between 1984 and 2003 (a 19-year period). Limited data suggested that the maximal oxygen uptake (VO2max) [mL/kg/min] of male recruits did not change from 1975 to 1998 (a 23-year period), while there was some indication of a small increase in female recruit VO2max over the same period. On the other hand, slower times on 1-mile (1.6 km) and 2-mile (3.2 km) runs indicate declines in aerobic performance from 1987 to 2003 (a 16-year period). The apparent discrepancy between the VO2max and endurance running data may indicate that recruits are not as proficient at applying their aerobic capability to performance tasks such as timed runs, possibly because of factors such as increased body weight, reduced experience with running, lower motivation and/or environmental factors. Recruit height, weight and body mass index progressively increased between 1978 and 2003 (a 25-year period). Both the body fat and fat-free mass of male recruits increased from 1978 to 1998 (a 20-year period); however, body composition data on female recruits did not show a consistent trend. Over this same period, the literature contained little data on youth physical activity, but there was some suggestion that caloric consumption increased. Overall, temporal trends in recruit fitness differ depending on the fitness component measured. The very limited comparable data on civilian populations showed trends similar to the recruit data.


Subject(s)
Military Medicine , Military Personnel , Physical Fitness , Body Composition , Body Mass Index , Humans , Muscle, Skeletal , Physical Endurance , Time Factors , United States
20.
Mil Med; 171(1): 45-54, 2006 Jan.
Article in English | MEDLINE | ID: mdl-16532873

ABSTRACT

Recruits arriving for basic combat training (BCT) between October 1999 and May 2004 were administered an entry-level physical fitness test at the reception station. Those who failed the test entered the Fitness Assessment Program (FAP), where they physically trained until they passed the test and subsequently entered BCT. The effectiveness of the FAP was evaluated by examining fitness, injury, and training outcomes. Recruits who failed the test, trained in the FAP, and entered BCT after passing the test were designated the preconditioning (PC) group (64 men and 94 women). Recruits who failed the test but were allowed to enter BCT without going into the FAP were designated the no-preconditioning (NPC) group (32 men and 73 women). Recruits who passed the test and directly entered BCT were designated the no-need-of-preconditioning (NNPC) group (1,078 men and 731 women). Army Physical Fitness Test (APFT) scores and training outcomes were obtained from a company-level database, and injured recruits were identified from cases documented in medical records. The proportions of NPC, PC, and NNPC recruits who completed the 9-week BCT cycle were 59%, 83%, and 87% for men (p < 0.01) and 52%, 69%, and 78% for women (p < 0.01), respectively. Because of attrition, only 63% of the NPC group took the week 7 APFT, compared with 84% and 86% of the PC and NNPC groups, respectively. The proportions of NPC, PC, and NNPC recruits who passed the final APFT after all retakes were 88%, 92%, and 98% for men (p < 0.01) and 89%, 92%, and 97% for women (p < 0.01), respectively. Compared with NNPC men, injury risk was 1.5 (95% confidence interval, 1.0-2.2) and 1.7 (95% confidence interval, 1.0-3.1) times higher for PC and NPC men, respectively. Compared with NNPC women, injury risk was 1.2 (95% confidence interval, 0.9-1.6) and 1.5 (95% confidence interval, 1.1-2.1) times higher for PC and NPC women, respectively. This program evaluation showed that low-fit recruits who preconditioned before BCT had lower attrition and tended to have lower injury risk than recruits of similar low fitness who did not precondition.
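The unadjusted risk ratios reported here follow the standard two-group computation with a log-scale (Katz) confidence interval; an illustrative sketch with hypothetical counts:

    import numpy as np

    def risk_ratio(events_a, n_a, events_b, n_b):
        # Risk ratio of group A vs. group B with Katz log-scale 95% CI
        rr = (events_a / n_a) / (events_b / n_b)
        se = np.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
        lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se)
        return rr, (lo, hi)

    # e.g. hypothetical: 30 injured of 64 PC men vs. 350 of 1,078 NNPC men
    print(risk_ratio(30, 64, 350, 1078))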


Subject(s)
Military Personnel , Outcome Assessment, Health Care , Physical Fitness/physiology , Wounds and Injuries , Adolescent , Adult , Female , Humans , Male , South Carolina