1.
Am J Trop Med Hyg ; 102(6): 1455-1462, 2020 06.
Article in English | MEDLINE | ID: mdl-32228790

ABSTRACT

Environmental factors, including high temperature and humidity, can influence dermal absorption of chemicals. Soldiers can be dermally exposed to permethrin while wearing permethrin-treated uniforms. This study examined the effects of a high-temperature environment and a combined high-temperature and humid environment on permethrin absorption, compared with ambient conditions, when wearing a permethrin-treated uniform. Twenty-seven male enlisted soldiers wore study-issued permethrin-treated army uniforms for 33 consecutive hours in three different environments: 1) simulated high temperature (35°C, 40% relative humidity [rh]) (n = 10), 2) simulated high temperature and humidity (30°C, 70% rh) (n = 10), and 3) ambient conditions (13°C, 60% rh) (n = 7). Spot urine samples, collected at 21 scheduled time points before, during, and after wearing the study uniforms, were analyzed for permethrin exposure biomarkers (3-phenoxybenzoic acid, cis- and trans-3-(2,2-dichlorovinyl)-2,2-dimethylcyclopropane-1-carboxylic acid) and creatinine. Biomarker concentrations were 60-90% higher in the heat and combined heat/humidity groups than in the ambient group (P < 0.001-0.022). The average daily permethrin dose, calculated 12 hours after removal of the treated uniforms, was also significantly higher in the heat (P = 0.01) and heat/humidity (P = 0.03) groups than in the ambient group. There were no significant differences in biomarker concentrations or computed average daily dose between the heat and heat/humidity groups. Both hot and combined hot and humid environmental conditions significantly increased permethrin absorption in soldiers wearing permethrin-treated uniforms.
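
Spot-urine biomarker concentrations are commonly normalized to creatinine to account for urine dilution before group comparisons; the abstract reports that both biomarkers and creatinine were measured but does not give its dose-reconstruction model. A minimal sketch of that normalization step, with an illustrative function name and values that are not from the study:

    def creatinine_adjusted(biomarker_ug_per_L: float, creatinine_g_per_L: float) -> float:
        """Return a spot-urine biomarker concentration in micrograms per gram creatinine."""
        return biomarker_ug_per_L / creatinine_g_per_L

    # Illustrative example: 12 ug/L of 3-PBA in a sample containing 1.4 g/L creatinine
    print(round(creatinine_adjusted(12.0, 1.4), 1))  # ~8.6 ug/g creatinine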


Subject(s)
Humidity , Insecticides/urine , Military Personnel , Permethrin/urine , Protective Clothing , Temperature , Adolescent , Biomarkers/urine , Humans , Insecticides/chemistry , Insecticides/pharmacokinetics , Male , Occupational Exposure , Permethrin/chemistry , Permethrin/pharmacokinetics , Time Factors , United States , Young Adult
2.
J Expo Sci Environ Epidemiol ; 30(3): 525-536, 2020 05.
Article in English | MEDLINE | ID: mdl-30728486

ABSTRACT

This study examined the effect of high-temperature conditions and uniform wear-time durations (expeditionary, 33 h of continuous wear; garrison, 3 days of 8 h/day wear) on permethrin exposure, assessed by urinary permethrin biomarkers, from wearing post-tailored, factory-treated military uniforms. Four group study sessions took place over separate 11-day periods, involving 33 male Soldiers. Group 1 (n = 10) and Group 2 (n = 8) participants wore a study-issued permethrin-treated Army uniform in a high-heat environment (35 °C, 40% relative humidity (rh)) under expeditionary and garrison wear-time conditions, respectively. For comparison, Group 3 (n = 7) and Group 4 (n = 8) participants wore study-issued permethrin-treated uniforms in cooler ambient conditions under expeditionary and garrison wear-time conditions, respectively. Urinary biomarkers of permethrin (3-phenoxybenzoic acid and the sum of cis- and trans-3-(2,2-dichlorovinyl)-2,2-dimethylcyclopropane-1-carboxylic acid) were significantly higher under high-temperature than under ambient conditions, regardless of wear-time condition (Group 1 vs. Group 3; Group 2 vs. Group 4; p < 0.001 for both). Under high-temperature conditions, expeditionary (continuous) wear resulted in significantly (p < 0.001) higher urinary biomarker concentrations than garrison wear (Group 1 vs. Group 2). Differences related to wear time under ambient conditions (Group 3 vs. Group 4) were not statistically significant. The findings suggest that wearing permethrin-treated clothing in hot conditions results in an internal dose of permethrin above that observed under ambient conditions.


Subject(s)
Clothing , Hot Temperature , Insecticides , Military Personnel , Permethrin , Adult , Benzoates , Biomarkers , Humans , Male , Time Factors
3.
Comput Biol Med ; 107: 131-136, 2019 04.
Article in English | MEDLINE | ID: mdl-30802695

ABSTRACT

PURPOSE: We examined the accuracy of the Heat Strain Decision Aid (HSDA) as a predictor of core body temperature in healthy individuals wearing chemical protective clothing during laboratory and field exercises in hot and humid conditions. METHODS: The laboratory experiment examined three chemical protective clothing ensembles in eight male volunteers (age 24 ± 6 years; height 178 ± 5 cm; body mass 76.6 ± 8.4 kg) during intermittent treadmill marching in an environmental chamber (air temperature 29.3 ± 0.1 °C; relative humidity 56 ± 1%; wind speed 0.4 ± 0.1 m·s⁻¹). The field experiment examined four different chemical protective clothing ensembles in twenty active military volunteers (26 ± 5 years; 175 ± 8 cm; 80.2 ± 12.1 kg) during a prolonged road march (26.0 ± 0.5 °C; 55 ± 3%; 4.3 ± 0.7 m·s⁻¹). Predictive accuracy and precision were evaluated by the bias, mean absolute error (MAE), and root mean square error (RMSE). Additionally, accuracy was evaluated using a prediction bias of ±0.27 °C as an acceptable limit and by comparing predictions to observations within the standard deviation (SD) of the observed data. RESULTS: Core body temperature predictions were accurate for each chemical protective clothing ensemble in the laboratory (bias -0.10 ± 0.36 °C; MAE 0.28 ± 0.24 °C; RMSE 0.37 ± 0.24 °C) and field experiments (bias 0.23 ± 0.32 °C; MAE 0.30 ± 0.25 °C; RMSE 0.40 ± 0.25 °C). Across all modeled data, 72% of predictions were within one standard deviation of the observed data, including 92% of predictions for the laboratory experiment (SD ± 0.64 °C) and 67% for the field experiment (SD ± 0.38 °C). Individual-based predictions showed modest errors outside the SD range, with 98% of predictions falling within 1 °C and 81% within 0.5 °C of the observed data. CONCLUSION: The HSDA acceptably predicts core body temperature when wearing chemical protective clothing during laboratory and field exercises in hot and humid conditions.
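
The accuracy statistics named in this abstract (bias, MAE, RMSE, and the fraction of predictions within a ±0.27 °C limit) follow directly from paired observed and predicted core temperatures. A minimal sketch; the function name and sample data are illustrative, not from the study:

    import math

    def accuracy_stats(observed, predicted, limit=0.27):
        """Bias, MAE, RMSE (deg C), and fraction of predictions within +/- limit."""
        errors = [p - o for o, p in zip(observed, predicted)]
        n = len(errors)
        bias = sum(errors) / n
        mae = sum(abs(e) for e in errors) / n
        rmse = math.sqrt(sum(e * e for e in errors) / n)
        within = sum(abs(e) <= limit for e in errors) / n
        return bias, mae, rmse, within

    # Illustrative paired core temperatures (deg C)
    obs = [37.2, 37.6, 38.1, 38.4]
    pred = [37.1, 37.8, 38.0, 38.9]
    print(accuracy_stats(obs, pred))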


Subject(s)
Body Temperature Regulation/physiology , Body Temperature/physiology , Models, Statistical , Protective Clothing , Thermometry/methods , Adolescent , Adult , Exercise/physiology , Humans , Male , Military Medicine , Weather , Young Adult
4.
Med Sci Sports Exerc ; 51(4): 744-750, 2019 04.
Article in English | MEDLINE | ID: mdl-30439786

ABSTRACT

PURPOSE: To determine the efficacy of residing for 2 d at various altitudes, while sedentary (S) or active (A; ~90 min hiking·d⁻¹), on exercise performance at 4300 m. METHODS: Sea-level (SL) resident men (n = 45) and women (n = 21) (mean ± SD; 23 ± 5 yr; 173 ± 9 cm; 73 ± 12 kg; V̇O2peak = 49 ± 7 mL·kg⁻¹·min⁻¹) were randomly assigned to a residence group and to S or A within each group: 2500 m (n = 11S, 8A), 3000 m (n = 6S, 12A), 3500 m (n = 6S, 8A), or 4300 m (n = 7S, 8A). Exercise assessments occurred at SL and at 4300 m after the 2-d residence and consisted of 20 min of steady-state (SS) treadmill walking (45% ± 3% of SL V̇O2peak) and a 5-mile, self-paced running time trial (TT). Arterial oxygen saturation (SpO2) and HR were recorded throughout exercise. Resting SpO2 was recorded at SL, at 4 and 46 h of residence, and at 4300 m before the exercise assessment. To determine whether 2-d altitude residence improved 4300-m TT performance, results were compared with performances estimated using a validated prediction model. RESULTS: For all groups, resting SpO2 was reduced (P < 0.01) after 4 h of residence relative to SL, inversely to the elevation, and did not improve after 46 h. Resting SpO2 (~83%) did not differ among groups at 4300 m. Although SL and 4300-m SS exercise SpO2 (97% ± 2% to 74% ± 4%), HR (123 ± 10 to 140 ± 12 bpm), and TT duration (51 ± 9 to 73 ± 16 min) differed (P < 0.01), responses at 4300 m were similar among all groups, as were actual and predicted 4300-m TT performances (74 ± 12 min). CONCLUSIONS: Residing for 2 d at 2500 to 4300 m, with or without daily activity, did not improve resting SpO2, SS exercise responses, or TT performance at 4300 m.


Subject(s)
Acclimatization/physiology , Altitude , Physical Endurance/physiology , Adult , Altitude Sickness/physiopathology , Exercise/physiology , Heart Rate/physiology , Humans , Male , Oxygen/blood , Sedentary Behavior , Young Adult
5.
High Alt Med Biol ; 19(4): 329-338, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30517038

ABSTRACT

OBJECTIVE: To determine whether 2 days of staging at 2500-3500 m, combined with either high or low physical activity, reduces acute mountain sickness (AMS) during subsequent ascent to 4300 m. METHODS: Three independent groups of unacclimatized men and women were staged for 2 days at either 2500 m (n = 18), 3000 m (n = 16), or 3500 m (n = 15) before ascending to and living for 2 days at 4300 m, and were compared with a control group that ascended directly to 4300 m (n = 12). All individuals departed for the staging altitudes or 4300 m after spending one night at 2000 m, during which they breathed supplemental oxygen to simulate sea-level conditions. Half of each group participated in ~3 hours of daily physical activity while half were sedentary. Women accounted for ~25% of each group. AMS incidence was assessed using the Environmental Symptoms Questionnaire. AMS was classified as mild (≥0.7 and <1.5), moderate (≥1.5 and <2.6), or severe (≥2.6). RESULTS: While staging, the incidence of AMS was lower (p < 0.001) in the 2500 m (0%), 3000 m (13%), and 3500 m (40%) staged groups than in the direct-ascent control group (83%). After ascent to 4300 m, the incidence of AMS was lower in the 3000 m (43%) and 3500 m (40%) groups than in the 2500 m group (67%) and the direct-ascent control (83%). Neither activity level nor sex influenced the incidence of AMS during further ascent to 4300 m. CONCLUSIONS: Two days of staging at either 3000 or 3500 m, with or without physical activity, reduced AMS during subsequent ascent to 4300 m; staging at 3000 m may be preferred because of its lower incidence of AMS during the staging period itself.
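
The AMS-C severity cut points stated in the abstract translate directly into a classification rule. A minimal sketch; the function name is illustrative:

    def classify_ams(ams_c_score: float) -> str:
        """Classify AMS severity from the ESQ cerebral factor score (AMS-C),
        using the cut points given in the abstract."""
        if ams_c_score >= 2.6:
            return "severe"
        if ams_c_score >= 1.5:
            return "moderate"
        if ams_c_score >= 0.7:
            return "mild"
        return "no AMS"

    print(classify_ams(1.8))  # "moderate"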


Subject(s)
Acclimatization/physiology , Altitude Sickness/prevention & control , Altitude , Oxygen Inhalation Therapy/methods , Acute Disease , Altitude Sickness/epidemiology , Altitude Sickness/etiology , Exercise/physiology , Female , Healthy Volunteers , Humans , Incidence , Male , Time Factors , Treatment Outcome , Young Adult
6.
J Appl Physiol (1985) ; 123(5): 1214-1227, 2017 Nov 01.
Article in English | MEDLINE | ID: mdl-28705998

ABSTRACT

This study examined whether normobaric hypoxia (NH) treatment is more efficacious for sustaining high-altitude (HA) acclimatization-induced improvements in ventilatory and hematologic responses, acute mountain sickness (AMS), and cognitive function during reintroduction to altitude (RA) than no treatment at all. Seventeen sea-level (SL) residents (age = 23 ± 6 yr; means ± SE) completed, in the following order: 1) 4 days of SL testing; 2) 12 days of HA acclimatization at 4,300 m; 3) 12 days at SL post-HA acclimatization (Post), during which each received either NH (n = 9; FIO2 = 0.122) or Sham (n = 8; FIO2 = 0.207) treatment; and 4) 24-h reintroduction to 4,300-m altitude (RA) in a hypobaric chamber (460 Torr). End-tidal carbon dioxide pressure (PETCO2), hematocrit (Hct), and AMS cerebral factor score were assessed at SL, on HA2 and HA11, and after 20 h of RA. Cognitive function was assessed using the SynWin multitask performance test at SL, on HA1 and HA11, and after 4 h of RA. There was no difference between NH and Sham treatment, so data were combined. PETCO2 (mmHg) decreased from SL (37.2 ± 0.5) to HA2 (32.2 ± 0.6), decreased further by HA11 (27.1 ± 0.4), and then increased from HA11 during RA (29.3 ± 0.6). Hct (%) increased from SL (42.3 ± 1.1) to HA2 (45.9 ± 1.0), increased again from HA2 to HA11 (48.5 ± 0.8), and then decreased from HA11 during RA (46.4 ± 1.2). AMS prevalence (%) increased from SL (0 ± 0) to HA2 (76 ± 11), then decreased at HA11 (0 ± 0) and remained depressed during RA (17 ± 10). SynWin scores decreased from SL (1,615 ± 62) to HA1 (1,306 ± 94), improved from HA1 to HA11 (1,770 ± 82), and remained increased during RA (1,707 ± 75). These results demonstrate that HA acclimatization-induced improvements in ventilatory and hematologic responses, AMS, and cognitive function are partially retained during RA after 12 days at SL, whether or not NH treatment is utilized. NEW & NOTEWORTHY: This study demonstrates that normobaric hypoxia treatment over a 12-day period at sea level was not more effective for sustaining high-altitude (HA) acclimatization during reintroduction to HA than no treatment at all. The noteworthy aspect is that athletes, mountaineers, and military personnel do not have to go to extraordinary means to retain HA acclimatization to an easily accessible and relevant altitude if reexposure occurs within a 2-wk time period.
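
A rough check of why an FIO2 of about 0.12 at sea level approximates the 4,300-m chamber exposure (460 Torr): the standard inspired-oxygen relation PIO2 = FIO2 × (PB − 47 Torr), where 47 Torr is water vapor pressure at body temperature. This back-of-envelope sketch is not a calculation from the paper:

    def inspired_po2(fio2: float, barometric_torr: float) -> float:
        """Inspired O2 partial pressure (Torr), after saturating inspired air
        with water vapor (47 Torr at body temperature)."""
        return fio2 * (barometric_torr - 47.0)

    print(round(inspired_po2(0.122, 760.0)))   # NH at sea level: ~87 Torr
    print(round(inspired_po2(0.2093, 460.0)))  # hypobaric chamber at 460 Torr: ~86 Torr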


Subject(s)
Acclimatization/physiology , Altitude Sickness/physiopathology , Altitude , Exercise/physiology , Hypoxia/physiopathology , Pulmonary Ventilation/physiology , Adolescent , Adult , Altitude Sickness/blood , Altitude Sickness/diagnosis , Female , Heart Rate/physiology , Humans , Hypoxia/blood , Hypoxia/diagnosis , Male , Middle Aged , Treatment Outcome , Young Adult
7.
J Therm Biol ; 64: 78-85, 2017 Feb.
Article in English | MEDLINE | ID: mdl-28166950

ABSTRACT

Physiological models provide useful summaries of complex interrelated regulatory functions. These can often be reduced to simple input requirements and simple predictions for pragmatic applications. This paper demonstrates this modeling efficiency by tracing the development of one such simple model, the Heat Strain Decision Aid (HSDA), originally developed to address Army needs. The HSDA, which derives from the Givoni-Goldman equilibrium body core temperature prediction model, uses 16 inputs from four elements: individual characteristics, physical activity, clothing biophysics, and environmental conditions. These inputs are used to mathematically predict core temperature (Tc) rise over time and can estimate water turnover from sweat loss. Based on a history of military applications such as derivation of training and mission planning tools, we conclude that the HSDA model is a robust integration of physiological rules that can guide a variety of useful predictions. The HSDA model is limited to generalized predictions of thermal strain and does not provide individualized predictions that could be obtained from physiological sensor data-driven predictive models. This fully transparent physiological model should be improved and extended with new findings and new challenging scenarios.
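
The HSDA's core temperature prediction descends from the Givoni-Goldman approach, in which core temperature rises exponentially from its starting value toward a predicted equilibrium value. A minimal sketch of that general form; the equilibrium temperature and time constant below are placeholders, not the HSDA's actual parameterization of its 16 inputs:

    import math

    def core_temp(t_min: float, tc0: float, tc_eq: float, tau_min: float) -> float:
        """Exponential rise of core temperature (deg C) toward equilibrium, the general
        Givoni-Goldman form; in HSDA, tc_eq and tau would be derived from the individual,
        activity, clothing, and environmental inputs."""
        return tc0 + (tc_eq - tc0) * (1.0 - math.exp(-t_min / tau_min))

    # Illustrative: start at 37.0 C, equilibrium 38.6 C, time constant 40 min
    for t in (0, 20, 40, 60, 90):
        print(t, round(core_temp(t, 37.0, 38.6, 40.0), 2))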


Subject(s)
Exercise , Heat-Shock Response , Hot Temperature , Models, Theoretical , Sweating/physiology , Humans , Military Personnel , Protective Clothing
8.
Eur J Appl Physiol ; 113(3): 735-41, 2013 Mar.
Article in English | MEDLINE | ID: mdl-22941031

ABSTRACT

The head's capacity for evaporative heat loss is important for the design of protective helmets for use in hot environments. This study quantified head sweating rate (ṁsw) in eight males during rest and exercise at three metabolic rates (338 ± 36, 481 ± 24, 622 ± 28 W) in hot-dry (HD: 45 °C, 21% RH) and hot-wet (HW: 35 °C, 69% RH) conditions (matched at 31.6 °C WBGT), which were counterbalanced. Heads were shaved, and head surface area (458 ± 61 cm²) was measured by 3D scanner. For measurement of head ṁsw, dry air was passed through a sealed helmet, whereas for forearm ṁsw a capsule (15.9 cm²) was ventilated with ambient air. Evaporation rate was determined from the increase in vapor pressure in the exiting air. Whole-body sweat loss was calculated from the change in nude weight plus fluid intake, corrected for respiratory fluid losses. Head ṁsw increased (p = 0.001) with metabolic rate and was lower (p = 0.018) in HD (0.4 ± 0.2 mg·cm⁻²·min⁻¹ at rest to 1.1 ± 0.6 mg·cm⁻²·min⁻¹ at 622 W) than in HW (0.5 ± 0.3 to 1.4 ± 0.8 mg·cm⁻²·min⁻¹). Forearm ṁsw increased (p < 0.001) with metabolic rate but was higher (p = 0.002) in HD (0.4 ± 0.3 to 1.4 ± 0.7 mg·cm⁻²·min⁻¹) than in HW (0.1 ± 0.1 to 1.1 ± 0.3 mg·cm⁻²·min⁻¹). Whole-body sweat loss was not significantly different (p = 0.06) between HD (647 ± 139 g·m⁻²·h⁻¹) and HW (528 ± 189 g·m⁻²·h⁻¹). This study demonstrates the importance of the head for evaporative heat loss, particularly for populations who wear protective clothing that can impair vapor transfer from the skin.
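
Local sweating rate from a ventilated helmet or capsule follows from the rise in water vapor content of the air stream, as described in the abstract. A minimal sketch using ideal-gas water vapor density; the airflow, pressures, and temperature below are illustrative, not the study's values:

    def vapor_density(vapor_pressure_kpa: float, temp_c: float) -> float:
        """Water vapor density (g/m^3) from vapor pressure via the ideal gas law."""
        R_V = 461.5  # J/(kg*K), specific gas constant for water vapor
        return vapor_pressure_kpa * 1000.0 / (R_V * (temp_c + 273.15)) * 1000.0

    def local_sweat_rate(p_in_kpa, p_out_kpa, temp_c, flow_L_per_min, area_cm2):
        """Evaporated sweat (mg/(cm^2*min)) from the vapor picked up by the air stream."""
        d_density = vapor_density(p_out_kpa, temp_c) - vapor_density(p_in_kpa, temp_c)  # g/m^3
        flow_m3_per_min = flow_L_per_min / 1000.0
        return d_density * flow_m3_per_min * 1000.0 / area_cm2  # mg per cm^2 per min

    # Illustrative: 0.2 kPa vapor in, 2.0 kPa out, 35 C air, 20 L/min over 458 cm^2
    print(round(local_sweat_rate(0.2, 2.0, 35.0, 20.0, 458.0), 2))  # ~0.55 mg/(cm^2*min)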


Subject(s)
Exercise/physiology , Head/physiology , Hot Temperature , Rest/physiology , Sweat/metabolism , Sweating/physiology , Acclimatization/physiology , Adolescent , Adult , Basal Metabolism/physiology , Body Temperature Regulation/physiology , Forearm/physiology , Hot Temperature/adverse effects , Humans , Male , Young Adult
9.
J Occup Environ Hyg ; 8(10): 588-99, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21936698

ABSTRACT

Personal protective equipment (PPE) refers to clothing and equipment designed to protect individuals from chemical, biological, radiological, nuclear, and explosive hazards. The materials used to provide this protection may exacerbate thermal strain by limiting heat and water vapor transfer. Any new PPE must therefore be evaluated to ensure that it poses no greater thermal strain than the current standard for the same level of hazard protection. This review describes how such evaluations are typically conducted. Comprehensive evaluation of PPE begins with a biophysical assessment of materials using a guarded hot plate to determine the thermal characteristics (thermal resistance and water vapor permeability). These characteristics are then evaluated on a thermal manikin wearing the PPE, since thermal properties may change once the materials have been constructed into a garment. These data may be used in biomedical models to predict thermal strain under a variety of environmental and work conditions. When the biophysical data indicate that the evaporative resistance (ratio of permeability to insulation) is significantly better than the current standard, the PPE is evaluated through human testing in controlled laboratory conditions appropriate for the conditions under which the PPE would be used if fielded. Data from each phase of PPE evaluation are used in predictive models to determine user guidelines, such as maximal work time, work/rest cycles, and fluid intake requirements. By considering thermal stress early in the development process, health hazards related to temperature extremes can be mitigated while maintaining or improving the effectiveness of the PPE for protection from external hazards.
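
The screening criterion described above, evaporative resistance expressed as the ratio of vapor permeability to insulation, reduces to a simple ratio of the two hot-plate measurements. A minimal sketch; the variable names and example values are illustrative:

    def evaporative_potential(im: float, clo: float) -> float:
        """Ratio of the vapor permeability index (im, dimensionless) to thermal
        insulation (clo); higher values favor evaporative heat loss."""
        return im / clo

    # Illustrative comparison of a candidate PPE ensemble against a current standard
    candidate = evaporative_potential(im=0.33, clo=2.0)
    standard = evaporative_potential(im=0.30, clo=2.2)
    print(round(candidate, 3), round(standard, 3), candidate > standard)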


Subject(s)
Cold Temperature , Hot Temperature , Protective Clothing , Stress, Physiological , Acclimatization , Body Temperature , Humans , Occupational Exposure/prevention & control
10.
Med Sci Sports Exerc ; 41(3): 597-602, 2009 Mar.
Article in English | MEDLINE | ID: mdl-19204591

ABSTRACT

PURPOSE: The validity and reliability of using intestinal temperature (Tint) via ingestible temperature sensors (ITS) to measure core body temperature have been demonstrated. However, the effect of elapsed time between ITS ingestion and Tint measurement has not been thoroughly studied. METHODS: Eight volunteers (six men and two women) swallowed ITS 5 h (ITS-5) and 29 h (ITS-29) before 4 h of varying-intensity activity. Tint was measured simultaneously from both ITS, and Tint differences between ITS-5 and ITS-29 over the 4 h of activity were plotted and compared against a meaningful threshold of acceptance (±0.25°C). The percentage of time during which the differences between paired ITS (ITS-5 vs. ITS-29) fell outside the threshold of acceptance was calculated. RESULTS: Tint values showed no systematic bias, were normally distributed, and ranged from 36.94°C to 39.24°C. The maximum Tint difference between paired ITS was 0.83°C and the minimum difference was 0.00°C. The typical magnitude of the differences (SE of the estimate) was 0.24°C, and these differences were uniform across the entire range of observed temperatures. Paired Tint measures fell outside the threshold of acceptance 43.8% of the time during the 4 h of activity. CONCLUSIONS: The differences between ITS-5 and ITS-29 were larger than the threshold of acceptance during a substantial portion of the observed 4-h activity period. Ingesting an ITS more than 5 h before activity will not completely eliminate confounding factors but may improve the accuracy and consistency of core body temperature measurement.
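
The agreement statistic used here, the percentage of time that the paired ITS-5/ITS-29 difference falls outside a ±0.25°C threshold, is straightforward to compute from the two simultaneous temperature series. A minimal sketch with illustrative readings, not the study's data:

    def pct_outside_threshold(series_a, series_b, threshold=0.25):
        """Percentage of paired samples whose absolute difference exceeds the threshold (deg C)."""
        diffs = [abs(a - b) for a, b in zip(series_a, series_b)]
        return 100.0 * sum(d > threshold for d in diffs) / len(diffs)

    # Illustrative simultaneous readings from the two sensors
    its5 = [37.4, 37.9, 38.3, 38.6, 38.2]
    its29 = [37.2, 37.8, 38.7, 38.9, 38.3]
    print(pct_outside_threshold(its5, its29))  # 40.0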


Subject(s)
Body Temperature/physiology , Intestines/physiology , Telemetry/instrumentation , Adolescent , Adult , Female , Humans , Male , Time Factors
11.
Eur J Appl Physiol ; 103(3): 307-14, 2008 Jun.
Article in English | MEDLINE | ID: mdl-18327605

ABSTRACT

This study determined whether a torso-vest forced-ambient-air body ventilation system (BVS) reduced physiological strain during exercise-heat stress. Seven heat-acclimated volunteers attempted nine 2-h treadmill walks at 200 W·m⁻² in three environments: 40°C, 20% rh (HD); 35°C, 75% rh (HW); and 30°C, 50% rh (WW), wearing the Army Combat Uniform, interceptor body armor (IBA), and Kevlar helmet. The three trials in each environment were BVS turned on (BVSOn), BVS turned off (BVSOff), and no BVS (IBA). In HD, BVSOn significantly lowered core temperature (Tre), heart rate (HR), mean skin temperature (Tsk), mean torso skin temperature (Ttorso), thermal sensation (TS), heat storage (S), and physiological strain index (PSI) versus BVSOff and IBA (P < 0.05). For HW (n = 6), analyses were possible only through 60 min. Exercise tolerance time during HW was significantly longer for BVSOn (116 ± 10 min) versus BVSOff (95 ± 22 min) and IBA (96 ± 18 min) (P < 0.05). During HW, BVSOn lowered HR at 60 min versus IBA, Tsk from 30 to 60 min versus BVSOff and IBA, and PSI from 45 to 60 min versus BVSOff and at 60 min versus IBA (P < 0.05). BVSOn changes in Tre and HR were lower in HD and HW. During WW, BVSOn significantly lowered HR, Tsk, and Ttorso versus BVSOff and IBA (P < 0.05) during late exercise. Sweating rates were significantly lower for BVSOn versus BVSOff and IBA in both HD and WW (P < 0.05), but not HW. These results indicate that BVSOn reduces physiological strain in all three environments by a similar amount; however, in hot-dry conditions BVSOff increases physiological strain.
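
The physiological strain index (PSI) reported here is conventionally computed from concurrent rectal temperature and heart rate relative to their resting values on a 0-10 scale (the Moran formulation); the abstract does not define it. A minimal sketch of that standard formula, with illustrative values:

    def psi(tre_t, tre_0, hr_t, hr_0):
        """Physiological strain index (0-10) from rectal temperature (deg C) and
        heart rate (beats/min), per the widely used Moran formulation."""
        thermal = 5.0 * (tre_t - tre_0) / (39.5 - tre_0)
        cardiac = 5.0 * (hr_t - hr_0) / (180.0 - hr_0)
        return thermal + cardiac

    # Illustrative values, not from the study
    print(round(psi(tre_t=38.4, tre_0=37.0, hr_t=150, hr_0=70), 1))  # ~6.4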


Subject(s)
Acclimatization , Climate , Exercise Tolerance , Heat Stress Disorders/prevention & control , Hot Temperature , Humidity , Protective Clothing , Adult , Body Temperature , Energy Metabolism , Equipment Design , Female , Heart Rate , Heat Stress Disorders/physiopathology , Humans , Male , Military Medicine , Sweating , Time Factors
12.
Aviat Space Environ Med ; 78(8): 809-13, 2007 Aug.
Article in English | MEDLINE | ID: mdl-17760290

ABSTRACT

BACKGROUND: This study evaluated adding reflective thermal inserts (RTI) to reduce physiological strain during exercise-heat stress with a radiant load. RTI were used with a U.S. Army desert battle dress uniform, body armor, and helmet. METHODS: Four male volunteers attempted four trials (10 min rest followed by 100 min walking at 1.56 m·s⁻¹). All trials were at 40.0°C dry bulb (Tdb), 12.4°C dew point (Tdp), 20% RH, and 1.0 m·s⁻¹ wind speed. On 2 d there was supplementary irradiance (+I) with globe temperature (Tbg) = 56.5°C, and on 2 d there was no supplementary irradiance (-I) with Tbg ≈ Tdb. Trial conditions were: 1) RTI and armor with supplementary irradiance (RA+I); 2) plain armor with supplementary irradiance (PA+I); 3) RTI and armor with no supplementary irradiance (RA-I); and 4) plain armor with no supplementary irradiance (PA-I). RESULTS: Endurance times were not significantly different among trials. With one exception, armor and helmet interior and exterior surface temperatures were not significantly different between either RA+I and PA+I or RA-I and PA-I. Temperature on the inside of the helmet in RA+I (47.1 ± 1.4°C) was significantly lower than in PA+I (49.5 ± 2.6°C). There were no differences in any physiological measure (core temperature, heart rate, mean weighted skin temperature, forehead skin temperature, sweating rate, evaporative cooling, rate of heat storage) between either RA+I and PA+I or RA-I and PA-I. CONCLUSIONS: Results showed no evidence that wearing RTI with body armor and helmet reduces physiological strain during exercise-heat stress with either high or low irradiance.


Subject(s)
Desert Climate/adverse effects , Heat Stress Disorders/prevention & control , Materials Testing , Military Personnel , Protective Clothing , Analysis of Variance , Exercise , Heat Stress Disorders/etiology , Humans , Iraq , United States
13.
Ergonomics ; 49(2): 209-19, 2006 Feb 10.
Article in English | MEDLINE | ID: mdl-16484146

ABSTRACT

The effectiveness of intermittent microclimate cooling for men working in US Army chemical protective clothing (modified mission-oriented protective posture level 3; MOPP 3) was examined. The hypothesis was that intermittent cooling on a 2-min on-off schedule, using a liquid cooling garment (LCG) covering 72% of the body surface area, would reduce heat strain comparably to constant cooling. Four male subjects completed three experiments at 30°C, 30% relative humidity, wearing the LCG under the MOPP 3 during 80 min of treadmill walking at 224 ± 5 W·m⁻². Water temperature to the LCG was held constant at 21°C. The experiments were: 1) constant cooling (CC); 2) intermittent cooling at 2-min intervals (IC); and 3) no cooling (NC). Core temperature increased more in NC (1.6 ± 0.2°C) than in IC (0.5 ± 0.2°C) and CC (0.5 ± 0.3°C) (p < 0.05). Mean skin temperature was higher during NC (36.1 ± 0.4°C) than IC (33.7 ± 0.6°C) and CC (32.6 ± 0.6°C), and was higher during IC than CC (p < 0.05). Mean heart rate during NC (139 ± 9 b·min⁻¹) was greater than IC (110 ± 10 b·min⁻¹) and CC (107 ± 9 b·min⁻¹) (p < 0.05). Cooling by conduction (K) during NC (94 ± 4 W·m⁻²) was lower than IC (142 ± 7 W·m⁻²) and CC (146 ± 4 W·m⁻²) (p < 0.05). These findings suggest that IC provided a favourable skin-to-LCG gradient for heat dissipation by conduction and reduced heat strain comparably to CC during exercise-heat stress in chemical protective clothing.
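
Heat removed by a liquid cooling garment can be estimated from the water flow rate and the inlet-outlet temperature difference, normalized to body surface area. A minimal sketch under that assumption; the flow, return temperature, and surface area below are illustrative, and the abstract does not state its exact calculation:

    def lcg_heat_extraction(flow_L_per_min, t_in_c, t_out_c, body_surface_m2):
        """Heat removed by circulating water (W/m^2): mass flow x specific heat x dT."""
        CP_WATER = 4186.0  # J/(kg*K)
        mass_flow_kg_s = flow_L_per_min / 60.0  # 1 L of water ~ 1 kg
        watts = mass_flow_kg_s * CP_WATER * (t_out_c - t_in_c)
        return watts / body_surface_m2

    # Illustrative: 1.0 L/min, water enters at 21 C and returns at 25 C, 1.9 m^2 body
    print(round(lcg_heat_extraction(1.0, 21.0, 25.0, 1.9)))  # ~147 W/m^2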


Subject(s)
Cold Temperature , Exercise/physiology , Heat Stress Disorders/prevention & control , Microclimate , Military Medicine/instrumentation , Military Personnel , Protective Clothing , Skin Temperature/physiology , Adult , Body Temperature Regulation , Hazardous Substances , Hot Temperature/adverse effects , Humans , Male , Physical Exertion/physiology , Prospective Studies , Time Factors , United States
14.
AIHA J (Fairfax, Va) ; 64(4): 510-5, 2003.
Article in English | MEDLINE | ID: mdl-12908867

ABSTRACT

This study compared endurance in a U.S. Army developmental Occupational Safety and Health Administration Level B personal protective equipment (PPE) system against the toxicological agent protective (TAP) suit, the Army's former standard PPE for Level A and Level B toxic environments. The developmental system consisted of two variations: the improved toxicological agent protective (ITAP) suit with self-contained breathing apparatus (ITAP-SCBA), weighing 32 kg, and the ITAP with blower (ITAP-B), weighing 21 kg. Both ITAP suits included the personal ice cooling system (PICS). The TAP suit (9.5 kg) had no cooling. It was hypothesized that PICS would effectively cool both ITAP configurations and that endurance in TAP would be limited by heat strain. Eight subjects (six men, two women) attempted three 2-hour treadmill walks (0.89 m·s⁻¹, 0% grade, rest/exercise cycles of 10/20 min) at 38°C, 30% relative humidity. Metabolic rate for TAP (222 ± 35 W) was significantly less than for either ITAP-SCBA (278 ± 27 W) or ITAP-B (262 ± 24 W) (p < 0.05). Endurance time was longer in ITAP-SCBA (85 ± 20 min) and ITAP-B (87 ± 25 min) than in TAP (46 ± 10 min) (p < 0.05). Heat storage was greater in TAP (77 ± 15 W·m⁻²) than in ITAP-SCBA (51 ± 16 W·m⁻²) (p < 0.05), which was not different from ITAP-B (59 ± 14 W·m⁻²). Sweating rate was greater in TAP (23.5 ± 11.7 g·min⁻¹) than in either ITAP-SCBA (11.1 ± 2.9 g·min⁻¹) or ITAP-B (12.8 ± 3.5 g·min⁻¹) (p < 0.05). Endurance in ITAP was nearly twice as long as in PPE with no cooling, even though the PICS, SCBA tanks, and new uniform itself all increased metabolic cost over that of TAP. PICS could also be used with civilian Level A and Level B PPE, increasing work time and worker safety.
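
Heat storage values in W·m⁻², as reported here, are conventionally derived from the change in mean body temperature, body mass, the specific heat of body tissue, body surface area, and elapsed time. A minimal sketch of that standard partitional-calorimetry calculation; the core/skin weighting and example values are illustrative, and the abstract does not state its exact method:

    def heat_storage_rate(d_tcore, d_tskin, mass_kg, area_m2, minutes, core_weight=0.8):
        """Rate of body heat storage (W/m^2) from changes in core and skin temperature.
        Uses 3474 J/(kg*K) for body tissue specific heat and a core/skin weighting for
        mean body temperature (0.8/0.2 shown; the weighting varies with climate)."""
        C_BODY = 3474.0  # J/(kg*K)
        d_tbody = core_weight * d_tcore + (1.0 - core_weight) * d_tskin
        joules = C_BODY * mass_kg * d_tbody
        return joules / (area_m2 * minutes * 60.0)

    # Illustrative: +0.9 C core, +1.5 C skin, 75 kg, 1.9 m^2, over 46 min
    print(round(heat_storage_rate(0.9, 1.5, 75.0, 1.9, 46.0)))  # ~51 W/m^2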


Subject(s)
Body Temperature Regulation , Heat Stress Disorders , Physical Endurance , Protective Clothing , Adult , Equipment Design , Exercise/physiology , Exercise Test , Female , Hazardous Substances , Humans , Male , Materials Testing , Respiratory Protective Devices
15.
J Appl Physiol (1985) ; 94(5): 1841-8, 2003 May.
Article in English | MEDLINE | ID: mdl-12679347

ABSTRACT

The vasomotor response to cold may compromise the capacity for microclimate cooling (MCC) to reduce thermoregulatory strain. This study examined the hypothesis that intermittent, regional MCC (IRC) would abate this response and improve heat loss compared with constant MCC (CC) during exercise-heat stress. In addition, the relative effectiveness of four different IRC regimens was compared. Five heat-acclimated men attempted six experimental trials of treadmill walking (~225 W/m²) in a warm climate (dry bulb temperature = 30°C, dewpoint temperature = 11°C) while wearing chemical protective clothing (insulation = 2.1; moisture permeability = 0.32) with a water-perfused (21°C) cooling undergarment. The six trials were CC (continuous perfusion) of 72% body surface area (BSA); two IRC regimens cooling 36% BSA using 2:2 (IRC1) or 4:4 (IRC2) min on-off perfusion ratios; two IRC regimens cooling 18% BSA using 1:3 (IRC3) or 2:6 (IRC4) min on-off perfusion ratios; and a no-cooling (NC) control. Compared with NC, CC significantly reduced changes in rectal temperature (~1.2°C) and heart rate (~60 beats/min) (P < 0.05). The four IRC regimens all provided a similar reduction in exercise heat strain and were 164-215% more efficient than CC because of greater heat flux over a smaller BSA. These findings indicate that the IRC approach to MCC is a more efficient means of cooling than CC paradigms and can improve MCC capacity by reducing power requirements.


Subject(s)
Body Temperature Regulation/physiology , Cold Temperature , Microclimate , Adult , Body Surface Area , Body Temperature/physiology , Clothing , Heart Rate/physiology , Hot Temperature , Humans , Male , Perfusion , Skin Physiological Phenomena , Sweating
16.
Med Sci Sports Exerc ; 35(1): 175-81, 2003 Jan.
Article in English | MEDLINE | ID: mdl-12544652

ABSTRACT

PURPOSE: This study examined the effects of short-term (3.5 d) sustained military operations (SUSOPS) on thermoregulatory responses to cold stress. METHODS: Ten men (22.8 ± 1.4 yr) were assessed during a cold-air test (CAT) after a control week (control) and again after an 84-h SUSOPS (sleep = 2 h·d⁻¹, energy intake ≈ 1650 kcal·d⁻¹, energy expenditure ≈ 4500 kcal·d⁻¹). The CAT consisted of a seminude resting subject being exposed to an ambient temperature ramp from 25°C to 10°C during the initial 30 min, with the ambient temperature then remaining at 10°C for an additional 150 min. RESULTS: SUSOPS decreased (P < 0.05) body weight, % body fat, and fat-free mass by 3.9 kg, 1.6%, and 1.8 kg, respectively. During the CAT, rectal temperature decreased to a greater extent (P < 0.05) after SUSOPS (0.52 ± 0.09°C) than after control (0.45 ± 0.12°C). Metabolic heat production was lower (P < 0.05) after SUSOPS at min 30 (55.4 ± 3.3 W·m⁻²) versus control (66.9 ± 4.4 W·m⁻²). Examination of the mean body temperature-metabolic heat production relationship indicated that the threshold for shivering was lower (P < 0.05) after SUSOPS (34.8 ± 0.2°C) than after control (35.8 ± 0.2°C). Mean weighted skin temperatures (°C) were lower during the initial 1.5 h of the CAT in SUSOPS versus control. Heat debt was similar between trials. CONCLUSION: These results indicate that a sustained (84-h) military operation leads to greater declines in core temperature, due to either a lag in the initial shivering response or heat redistribution secondary to an insulative acclimation.


Subject(s)
Body Temperature Regulation , Military Personnel , Adult , Body Composition , Body Temperature Regulation/physiology , Cold Temperature , Energy Intake , Humans , Male , Shivering/physiology , Skin Temperature , Sleep Deprivation/physiopathology , Time Factors , United States
17.
Aviat Space Environ Med ; 73(7): 665-72, 2002 Jul.
Article in English | MEDLINE | ID: mdl-12137102

ABSTRACT

BACKGROUND: The purpose of this study was to compare a vapor compression microclimate cooling system (MCC) and a personal ice cooling system (PIC) for their effectiveness in reducing physiological strain when used with cooling garments worn under the impermeable self-contained toxic environment protective outfit (STEPO). A second comparison was made between total-body (TOTAL) and hooded shirt-only (SHIRT) cooling garments with both the MCC and PIC systems. It was hypothesized that the cooling systems would be equally effective and that total-body cooling would allow 4 h of physical work in the heat while wearing STEPO. METHODS: Eight subjects (six men, two women) attempted four experiments at 38°C (100°F), 30% rh, and 0.9 m·s⁻¹ wind while wearing the STEPO. Subjects attempted 4 h of treadmill walking (rest/exercise cycles of 10/20 min) at a time-weighted metabolic rate of 303 ± 50 W. RESULTS: Exposure time was not different between MCC and PIC, but exposure time was greater with TOTAL (131 ± 66 min) than with SHIRT (83 ± 27 min) for both cooling systems (p < 0.05). Cooling rate was not different between MCC and PIC, but the cooling rate with TOTAL (362 ± 52 W) was greater than with SHIRT (281 ± 48 W) (p < 0.05). Average heat storage was lower with MCC (39 ± 20 W·m⁻²) than with PIC (50 ± 17 W·m⁻²) in both TOTAL and SHIRT (p < 0.05). Average heat storage with TOTAL (34 ± 19 W·m⁻²) was also less than with SHIRT (55 ± 13 W·m⁻²) for both cooling systems (p < 0.05). The Physiological Strain Index (PSI) was lower in MCC-TOTAL (2.4) than in MCC-SHIRT (3.7), PIC-SHIRT (3.8), and PIC-TOTAL (3.3) after 45 min of heat exposure (p < 0.05). CONCLUSIONS: Total-body circulating liquid cooling was more effective than shirt-only cooling under the impermeable STEPO uniform, providing a greater cooling rate, allowing longer exposure time, and reducing the rate of heat storage. The MCC and PIC systems were equally effective during heat exposure, but neither system could extend exposure to the targeted 4 h.


Subject(s)
Chemical Warfare , Heat Stress Disorders/etiology , Heat Stress Disorders/prevention & control , Military Personnel , Occupational Diseases/etiology , Occupational Diseases/prevention & control , Protective Clothing/adverse effects , Refrigeration/methods , Body Temperature , Equipment Design , Exercise Test , Female , Gases , Heart Rate , Heat Stress Disorders/classification , Heat Stress Disorders/diagnosis , Humans , Ice , Male , Occupational Diseases/classification , Occupational Diseases/diagnosis , Refrigeration/adverse effects , Refrigeration/instrumentation , Severity of Illness Index , Skin Temperature , Time Factors , United States