Results 1 - 20 of 29
1.
Mil Med ; 189(1-2): e448-e453, 2024 Jan 23.
Article in English | MEDLINE | ID: mdl-37647618

ABSTRACT

Behind armor blunt trauma (BABT) is a non-penetrating injury caused by energy transfer and rapid deformation of protective body armor. Although modern military body armor is designed to prevent penetrating trunk injuries, high-energy projectiles can transfer significant energy to the tissues behind the armor and inflict injuries such as fractures or organ contusions. However, knowledge of BABT is limited to biomechanical and cadaver modeling studies and rare case reports. We report two cases of BABT resulting from close-range fire and discuss the potential implications for triaging patients with BABT in battlefield scenarios. In the first case, a 19-year-old male soldier sustained a single close-range 5.56-mm assault rifle gunshot to his chest body armor. The soldier initially reported mild pain in the parasternal region, and assessment revealed a 4 cm × 3 cm skin abrasion. Following emergency department evaluation, the soldier was diagnosed with a non-displaced transverse fracture of the sternal body. In the second case, a 20-year-old male sustained five 7.62-mm machine gun bullet impacts to his body armor. Computed tomography of the chest revealed pulmonary contusions in the right lower and middle lobes. Both soldiers achieved full recovery and returned to combat duty within several weeks. These cases highlight the potential risks of energy transfer from high-velocity projectiles impacting body armor and the need for frontline providers to be aware of the risk of underlying blunt injuries. Further reporting of clinical cases and modeling studies using high-velocity projectiles could inform recommendations for triaging, evacuating, and assessing individuals with BABT.


Subject(s)
Contusions , Thoracic Injuries , Wounds, Gunshot , Wounds, Nonpenetrating , Male , Humans , Young Adult , Adult , Protective Clothing , Wounds, Gunshot/complications , Wounds, Nonpenetrating/complications , Wounds, Nonpenetrating/diagnosis , Contusions/complications
2.
Transfus Med ; 33(6): 440-452, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37668175

ABSTRACT

BACKGROUND: Cold-stored low-titer group O whole blood (LTOWB) has become increasingly utilised in both prehospital and in-hospital settings for resuscitation of traumatic haemorrhage. However, extending LTOWB to ground medical teams has been limited by logistic challenges. METHODS: In 2022, the Israel Defense Forces (IDF) started using LTOWB in ambulances for the first time in Israel. This report details the initial experience of this rollout and presents a case series of the first patients treated with LTOWB. RESULTS: Between January and December 2022, seven trauma patients received LTOWB administered by ground IDF intensive care ambulances after presenting with profound shock. Median time from injury to administration of LTOWB was 35 min. All patients had evidence of severe bleeding upon hospital arrival; six underwent damage control laparotomy and all but one survived to discharge. CONCLUSIONS: The implementation of LTOWB in ground medical units is in its early stages, but continued experience may demonstrate its feasibility, safety, and effectiveness in the prehospital setting. Further research is necessary to fully understand the indications, methodology, and benefits of LTOWB in resuscitating severely injured trauma patients in this setting.


Subject(s)
Military Personnel , Wounds and Injuries , Humans , Blood Transfusion/methods , Ambulances , Israel , Hemorrhage/therapy , ABO Blood-Group System , Wounds and Injuries/therapy
3.
EClinicalMedicine ; 42: 101211, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34849479

ABSTRACT

BACKGROUND: Pregestational excessive body mass index (BMI) is linked to an increased risk for gestational diabetes mellitus (GDM), but less is known about the effect of adolescent BMI on GDM occurrence. The study aimed to investigate possible associations of adolescent BMI, and of changes in BMI before the first pregnancy, with gestational diabetes risk. METHODS: This retrospective study was based on linkage of a military screening database of adolescent health status (Israel Defence Forces), including measured height and weight, with medical records of a state-mandated health provider (Maccabi Healthcare Services, MHS). The latter covers about 25% of the Israeli population; about 90% of pregnant women undergo screening by the two-step Carpenter-Coustan method. Adolescent BMI was categorized according to Centers for Disease Control and Prevention percentiles. Only first documented pregnancies were analyzed, and GDM was the outcome. FINDINGS: Of 190,905 nulliparous women, 10,265 (5.4%) developed GDM. Incidence proportions of GDM were 5.1%, 6.1%, 7.3%, and 8.9% among women with adolescent normal BMI, underweight, overweight, and obesity, respectively (p<0.001). In models that accounted for age at pregnancy, birth year, and sociodemographic variables, the adjusted odds ratios (aORs) for developing GDM were 1.2 (95%CI, 1.1-1.3), 1.5 (1.4-1.6), and 1.9 (1.7-2.1) for adolescent underweight, overweight, and obesity (reference group, normal BMI). Adolescent BMI tracked with BMI recorded in the pre-pregnancy period (r=63%). Returning to a normal pre-pregnancy BMI from overweight or obesity in adolescence diminished GDM risk, but no such reduction was observed among those who reached a normal pre-pregnancy BMI from being underweight in adolescence. Sustained overweight or obesity conferred an aOR for developing GDM of 2.5 (2.2-2.7); weight gain from adolescent underweight or normal BMI to pre-pregnancy excessive BMI conferred aORs of 3.1 (1.6-6.2) and 2.6 (2.2-2.7), respectively. INTERPRETATION: Change in BMI status from adolescence to pre-pregnancy may contribute to GDM risk. Identifying at-risk populations is important for early preventive interventions. FUNDING: None.
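
The adjusted odds ratios above can be sanity-checked against the reported incidence proportions. The sketch below (Python is an assumption; the abstract does not state the analysis software) computes crude odds ratios from those proportions and shows they land close to the adjusted estimates:

```python
def crude_odds_ratio(p_exposed: float, p_reference: float) -> float:
    """Crude odds ratio from two incidence proportions (no covariate adjustment)."""
    return (p_exposed / (1 - p_exposed)) / (p_reference / (1 - p_reference))


# GDM incidence proportions reported in the abstract
p_normal, p_underweight, p_overweight, p_obesity = 0.051, 0.061, 0.073, 0.089

for label, p in [("underweight", p_underweight),
                 ("overweight", p_overweight),
                 ("obesity", p_obesity)]:
    print(label, round(crude_odds_ratio(p, p_normal), 2))
# Prints roughly 1.21, 1.47, and 1.82 -- close to the adjusted ORs of 1.2, 1.5, and 1.9
```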

4.
Clin Microbiol Infect ; 27(10): 1502-1506, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34111591

ABSTRACT

OBJECTIVE: To analyse the correlation between COVID-19 vaccination percentage and socioeconomic status (SES). METHODS: This was a nationwide ecologic study based on open-source, anonymized, aggregated data provided by the Israel Ministry of Health. The correlations between municipal SES, vaccination percentage, and active COVID-19 cases during the vaccination campaign were analysed using weighted Pearson correlations. To assess the adequacy of first-dose vaccination rollout relative to each municipality's COVID-19 disease burden, a metric termed the vaccination need ratio was devised by dividing the total number of active cases (per 10 000 people) by the vaccination percentage of the population over 60 in each municipality, and its correlation with SES was examined. RESULTS: 23 days after initiation of the vaccination campaign, 760 916 (56.8%) individuals over the age of 60 in Israel had been vaccinated with the first dose of the BNT162b2 COVID-19 vaccine. A negative correlation was found between the COVID-19 active case burden and the vaccination percentage of the study population in each municipality (r = -0.47, 95% CI -0.59 to -0.30). The vaccination percentage correlated significantly with municipal SES (r = 0.83, 95% CI 0.79 to 0.87). This finding persisted, although attenuated, over a 5-week period. A negative correlation was found between the vaccination need ratio and municipal SES (r = -0.80, 95% CI -0.88 to -0.66). DISCUSSION: Lower COVID-19 vaccination percentage was associated with lower SES and higher active disease burden. Vaccination efforts should focus on areas with lower SES and high disease burden to ensure equitable vaccine allocation and potentially support more effective disease mitigation.
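
The two quantities at the core of this analysis, the vaccination need ratio and the weighted Pearson correlation, are simple to reproduce. A minimal sketch follows, assuming Python with NumPy and purely hypothetical per-municipality numbers; the abstract does not state what the weights were, so municipality population is assumed here:

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Weighted Pearson correlation coefficient between x and y with weights w."""
    mx, my = np.average(x, weights=w), np.average(y, weights=w)
    cov = np.average((x - mx) * (y - my), weights=w)
    sx = np.sqrt(np.average((x - mx) ** 2, weights=w))
    sy = np.sqrt(np.average((y - my) ** 2, weights=w))
    return cov / (sx * sy)

# Hypothetical per-municipality data
active_per_10k = np.array([120.0, 85.0, 60.0, 30.0])      # active COVID-19 cases per 10,000
vacc_pct_over_60 = np.array([45.0, 55.0, 70.0, 80.0])     # % of residents over 60 vaccinated
ses_rank = np.array([2.0, 4.0, 7.0, 9.0])                 # municipal SES rank
population = np.array([50_000, 80_000, 120_000, 60_000])  # assumed weights

# Vaccination need ratio: active cases per 10,000 divided by the vaccination percentage
need_ratio = active_per_10k / vacc_pct_over_60

print(weighted_pearson(need_ratio, ses_rank, population))  # negative for these illustrative values
```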


Subject(s)
COVID-19/epidemiology , Patient Acceptance of Health Care/statistics & numerical data , Vaccination/statistics & numerical data , Aged , Aged, 80 and over , COVID-19/prevention & control , COVID-19 Vaccines/administration & dosage , Humans , Immunization Programs , Israel/epidemiology , Middle Aged , Risk Factors , SARS-CoV-2/immunology , Social Class , Socioeconomic Factors
5.
Obes Res Clin Pract ; 14(6): 542-547, 2020.
Article in English | MEDLINE | ID: mdl-33189604

ABSTRACT

OBJECTIVES: To assess the association between sleep disorder prevalence and obesity in Israeli adolescents. METHODS: A nationwide, population-based, cross-sectional study of 1,348,817 Israeli adolescents (57% males) who were medically examined prior to military service between 1997 and 2015; height and weight were measured, along with assessment of medical status, at age 17.3 ± 0.4 years. The diagnosis of a sleep disorder was made based on objective diagnostic criteria. The prevalence and odds ratio (OR) for a sleep disorder were computed across BMI subgroups and adjusted for socio-demographic confounders. RESULTS: Overall sleep disorder prevalence was 1.8:1000 (males) and 0.45:1000 (females), with a total of 1601 cases. There was a gradual increase in the odds ratio for sleep disorders with increasing BMI. Multivariable-adjusted ORs for sleep disorders were 1.29 (95% CI 1.10-1.52), 1.44 (1.18-1.75), 3.03 (2.32-3.96), and 3.38 (1.98-5.75) for overweight and obesity classes I, II, and III, respectively (5th-49th BMI percentile was the reference). Results persisted in extensive sensitivity analyses, including limiting the study sample to participants with unimpaired health. CONCLUSIONS: We found a higher prevalence of sleep disorders in males and a dose-dependent association between sleep disorders and adolescent BMI in both sexes. Our findings warrant clinical awareness among healthcare providers, given the rising prevalence of obesity among teenagers. Sleep-related complaints should be actively screened for in adolescents with obesity.
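
The adjusted ORs quoted above come from a multivariable logistic regression of sleep-disorder status on BMI group and socio-demographic covariates. A minimal, self-contained sketch of that kind of model is shown below, using Python with statsmodels on simulated data; the variable names, group prevalences, and effect sizes are hypothetical, and only the general modelling approach reflects the abstract:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated adolescent-level data (hypothetical; the study analysed ~1.35 million records)
rng = np.random.default_rng(1)
n = 50_000
df = pd.DataFrame({
    "bmi_group": rng.choice(["normal", "overweight", "obese"], size=n, p=[0.7, 0.2, 0.1]),
    "male": rng.integers(0, 2, size=n),
})
baseline = 0.001 + 0.001 * df["male"]          # higher baseline prevalence in males, as reported
relative_risk = df["bmi_group"].map({"normal": 1.0, "overweight": 1.4, "obese": 3.0})
# At these low prevalences, risk ratios and odds ratios nearly coincide
df["sleep_disorder"] = rng.binomial(1, baseline * relative_risk)

# Multivariable logistic regression; exponentiated coefficients are adjusted odds ratios
model = smf.logit(
    "sleep_disorder ~ C(bmi_group, Treatment(reference='normal')) + male",
    data=df,
).fit(disp=False)
print(np.exp(model.params).round(2))
```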


Subject(s)
Obesity , Sleep Wake Disorders , Adolescent , Body Mass Index , Cross-Sectional Studies , Female , Humans , Male , Overweight , Prevalence
7.
Contraception ; 102(5): 332-338, 2020 11.
Article in English | MEDLINE | ID: mdl-32652092

ABSTRACT

OBJECTIVE: To compare pain during laminaria insertion after lidocaine spray versus placebo spray anesthesia in women about to undergo a surgical abortion procedure. STUDY DESIGN: A double-blind, randomized, placebo-controlled trial of women at 12-24 weeks gestation, one day prior to a surgical uterine evacuation procedure. Participants received lidocaine 10% or placebo (saline 0.9%) spray to the endocervix and ectocervix two minutes before laminaria insertion. The primary outcome was participants' pain score immediately after initial laminaria insertion, measured using a 10 cm visual analog scale (VAS). Secondary outcomes included scores at speculum removal and 15 min after speculum insertion. RESULTS: From 7/2016 through 8/2018, we enrolled 68 and 66 women to the lidocaine and placebo groups, respectively. Baseline characteristics were similar in both groups. The primary outcome did not differ between the lidocaine and placebo groups (median VAS 2.0 vs. 2.0, respectively; p = 0.69). Reported VAS after speculum removal and at 15 min from speculum insertion were similar in the lidocaine and placebo groups (median 2.0, p = 0.99; median 1.0 vs. 1.5, respectively, p = 0.32). In multivariate analyses, lidocaine use was associated with a decreased VAS score at 15 min from speculum insertion (difference -0.96, 95% CI -1.74 to -0.18; p = 0.016). The proportion reporting VAS ≥7 at first laminaria insertion did not differ between the lidocaine and placebo groups (5.88% vs. 10.61%, respectively; p = 0.362). CONCLUSION: In women scheduled for laminaria insertion prior to surgical uterine evacuation at 12-24 weeks gestation, topical application of lidocaine spray to the cervix before insertion did not result in lower reported pain compared with placebo. IMPLICATIONS: Our results imply that physicians should not use topical lidocaine spray on the cervix before laminaria insertion to reduce women's pain. Continued efforts should be made to relieve pain through simple, effective analgesia or technique adjustments, such as avoiding use of a tenaculum whenever possible.


Subject(s)
Anesthesia , Laminaria , Anesthetics, Local , Cervix Uteri , Double-Blind Method , Female , Humans , Lidocaine , Pregnancy
8.
Int J Gynecol Cancer ; 30(7): 959-968, 2020 07.
Article in English | MEDLINE | ID: mdl-32169875

ABSTRACT

INTRODUCTION: Pre-malignant cervical disease and invasive cervical cancer present a significant global health burden with respect to morbidity and mortality, mostly in low- and middle-income countries. Human papillomavirus (HPV) infection typically manifests for the first time in adolescence. We aimed to identify adolescent sociodemographic and anthropometric characteristics associated with subsequent risk for pre-malignant cervical disease and cervical cancer, in a country that offers free screening and HPV vaccines. METHODS: This historical cohort study included 969 123 Israeli women examined and anthropometrically measured at age 17 years between January 1967 and December 2011. Data on pre-malignant disease and invasive cervical tumors were obtained from the national cancer registry by linkage. We excluded non-Jewish minorities (a total of 25 472 women) and orthodox/ultraorthodox Jewish women, since these populations are not required by law to serve in the military, as well as women with a pre-examination diagnosis of cancer. Cox proportional hazards regression models were applied for each lesion type, adjusted for origin, measured body mass index, height, education, dwelling type, birth year, and age at examination. RESULTS: In total, 5094 incident pre-malignant cervical disease and 859 cervical cancer cases were diagnosed during a median follow-up of 17.6 years. Risk for both lesions was origin-dependent, with higher incidence in women of North-African origin (HR (pre-malignant cervical disease) 1.22, 95% CI 1.04 to 1.42; HR (cervical cancer) 1.87, 95% CI 1.30 to 2.69) compared with European origin. Taller stature, lower education, and later birth year were also associated with higher pre-malignant cervical disease and cervical cancer risk. Adolescent overweight (HR 0.81, 95% CI 0.74 to 0.90) and obesity (HR 0.56, 95% CI 0.43 to 0.71) were associated with reduced pre-malignant cervical disease, but not cervical cancer, incidence, as was urban (vs rural) residence. DISCUSSION: Ethnic background, tall stature, and education were associated with pre-malignant cervical disease and cervical cancer incidence, while adolescent overweight and obesity were inversely associated with pre-malignant cervical disease only. Despite free screening and HPV vaccines, these findings suggest that there is still a need for appropriate safe sex and screening education in adolescence.


Subject(s)
Precancerous Conditions/epidemiology , Uterine Cervical Neoplasms/epidemiology , Adolescent , Adult , Africa, Northern/ethnology , Cohort Studies , Female , Humans , Incidence , Israel/epidemiology , Jews/statistics & numerical data , Multivariate Analysis , Neoplasm Invasiveness , Proportional Hazards Models , Socioeconomic Factors , Young Adult
9.
Lancet Diabetes Endocrinol ; 8(3): 216-225, 2020 03.
Article in English | MEDLINE | ID: mdl-32027851

ABSTRACT

BACKGROUND: Obesity has been established as a causal factor for several types of cancer, and adolescent obesity is increasing worldwide. We examined associations between measured body-mass index (BMI) at age 17 years and cancer incidence, and with mortality among those who developed cancer. METHODS: In a nationwide, population-based cohort of adolescents, height and weight were measured at the pre-recruitment mandatory medical examination during 1967-2010. BMI was classified according to US Centers for Disease Control and Prevention percentiles. We applied Cox proportional hazards models to estimate the hazard ratios (HRs) and 95% CIs for incident cases of cancer using the 5th-49th BMI percentile group as a reference. The primary outcome was any cancer diagnosis between Jan 1, 1967, and Dec 31, 2012, as recorded in the Israeli National Cancer Registry. Participants with a diagnosis of cancer at baseline (before military recruitment assessment) were excluded from this analysis. The secondary outcome was all-cause mortality among cohort members who had cancer, between Jan 1, 1967, and Dec 31, 2017. FINDINGS: Of the 2 458 170 participants examined between Jan 1, 1967, and Dec 31, 2010, 160 040 were excluded, leaving 2 298 130 participants, of whom 928 110 were women and 1 370 020 were men. During 29 542 735 person-years of follow-up in men, 26 353 incident cases of cancer were recorded, and during 18 044 863 person-years of follow-up in women, 29 488 incident cases were recorded. Cancer incidence increased gradually across BMI percentiles. The adjusted HR was 1·26 (95% CI 1·18-1·35) among men with adolescent obesity. Among women, we found no association between obesity and overall cancer, driven by inverse associations of obesity with cervical and breast cancers. When these cancers were excluded, the adjusted HR for cancer was 1·27 (1·13-1·44) among women with adolescent obesity. In both sexes, high BMI (≥85th percentile) was associated with an increased cancer risk after 10 years. This association was accentuated in the late period of the cohort versus the early period. BMI was positively associated with a higher risk of mortality. The projected population attributable risk for high BMI was 5·1% (4·2-6·1) for men and 5·7% (4·2-7·3) for women. INTERPRETATION: The increasing prevalence of adolescent obesity and the possible association between adolescent BMI and cancer incidence might increase the future burden of obesity-related cancers. BMI among adolescents could constitute an important intervention target for cancer prevention. FUNDING: None.
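
For readers unfamiliar with the method, the hazard ratios above come from Cox proportional hazards models with the 5th-49th BMI percentile group as the reference. A minimal sketch of such a model follows, using Python with the lifelines library (an assumption; the paper does not state its software) and simulated data in which the true hazard ratio for a high-BMI indicator is set to about 1.26:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5_000
high_bmi = rng.integers(0, 2, size=n)       # 1 if adolescent BMI >= 85th percentile (hypothetical)

# Exponential event times with a built-in hazard ratio of ~1.26 for the high-BMI group
baseline_hazard = 0.01                       # events per person-year (illustrative)
hazard = baseline_hazard * np.where(high_bmi == 1, 1.26, 1.0)
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(5, 45, size=n)     # administrative censoring, years of follow-up
observed = (event_time <= censor_time).astype(int)

df = pd.DataFrame({
    "years": np.minimum(event_time, censor_time),
    "cancer": observed,
    "high_bmi": high_bmi,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="cancer")
cph.print_summary()                          # exp(coef) for high_bmi should be near 1.26
```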


Subject(s)
Body Mass Index , Neoplasms/etiology , Overweight/complications , Pediatric Obesity/complications , Registries/statistics & numerical data , Adolescent , Adult , Cohort Studies , Female , Follow-Up Studies , Humans , Incidence , Israel/epidemiology , Male , Neoplasms/epidemiology , Prognosis , Risk Factors , Survival Rate , Young Adult
10.
Arch Gynecol Obstet ; 300(5): 1245-1252, 2019 11.
Article in English | MEDLINE | ID: mdl-31576451

ABSTRACT

PURPOSE: Information regarding the use of barbed suture in gynecologic surgery is limited. Our aim was to compare maternal morbidity following caesarean deliveries performed with barbed versus non-barbed suture for uterine closure. METHODS: A historical cohort study from a single tertiary institution. The study group comprised all women who underwent term, uncomplicated singleton caesarean deliveries in which uterine closure was performed with ETHICON's Stratafix®, a polydioxanone barbed suture, compared with caesarean deliveries in which uterine closure was performed with ETHICON's VICRYL®, a Polyglactin 910 non-barbed suture. The primary outcomes were maternal morbidity, including the rate of packed red cell transfusion, and a composite of infectious morbidity. Operation duration was also evaluated. An analysis restricted to elective caesarean deliveries was performed comparing the suture types. RESULTS: Three thousand and sixty patients were included in the study: 1337 in the study group and 1723 in the control group. There was no significant difference in the rate of the primary outcomes (packed red cell transfusion: 2.5% in the barbed suture vs. 2.1% in the non-barbed suture group, p = 0.47; composite maternal morbidity: 3.8% vs. 4.8%, respectively, p = 0.18). Barbed suture was associated with a reduced risk of postoperative ileus compared with non-barbed suture (0.3% vs. 1.0%, respectively; p = 0.02) and a longer operation time (31 vs. 29 min, respectively; p < 0.001). In the analysis restricted to elective caesarean deliveries, only the duration of operation remained significantly different between the groups. CONCLUSIONS: The rate of short-term maternal morbidity among patients undergoing uterine closure with barbed suture during caesarean delivery is similar to that with non-barbed suture.


Subject(s)
Cesarean Section/mortality , Postoperative Complications/mortality , Suture Techniques/adverse effects , Uterus/surgery , Adult , Cohort Studies , Female , Humans , Pregnancy , Retrospective Studies
11.
Am J Perinatol ; 36(2): 205-211, 2019 01.
Article in English | MEDLINE | ID: mdl-30031370

ABSTRACT

OBJECTIVE: The aim of this study was to evaluate obstetric outcomes in relation to the extent of donor sperm exposure, with and without egg donation. MATERIALS AND METHODS: This is a retrospective cohort study in a single tertiary care center. All women with a singleton pregnancy who conceived following sperm donation (SD) were included. Obstetric and neonatal outcomes for pregnancies following a single SD were compared with pregnancies following repeat SD from the same donor. In a secondary analysis, we compared pregnancy outcomes among three modes of assisted reproductive technology (intrauterine insemination [IUI-SD], in vitro fertilization [IVF-SD], and IVF with sperm and egg donation [IVF-SD + ED]). RESULTS: A total of 706 pregnant women met the inclusion criteria, 243 (34.4%) following the first SD and 463 (65.6%) following repeat donations. Compared with repeat SDs, single donation was not associated with higher rates of preterm delivery (12.8 vs. 12.7%, respectively, p = 0.99), preeclampsia (7.0 vs. 6.9%, p = 0.999), or intrauterine growth restriction (4.1 vs. 3.9%, p = 0.88). Pregnancies following IVF-SD + ED had an increased risk for preeclampsia (adjusted odds ratio [AOR], 3.1; 95% confidence interval [CI], 1.5-6.6), preterm labor (AOR, 2.4; 95% CI, 1.1-5.4), and cesarean section (AOR, 2.1; 95% CI, 1.0-4.3) compared with IUI-SD and IVF-SD. CONCLUSION: The extent of donor sperm exposure did not correlate with obstetric complications, but double gamete donation was associated with increased risk for preeclampsia, preterm labor, and cesarean section.


Subject(s)
Fertilization in Vitro/adverse effects , Insemination, Artificial/adverse effects , Oocyte Donation , Pregnancy Complications/etiology , Spermatozoa , Tissue Donors , Adult , Cesarean Section/statistics & numerical data , Female , Humans , Logistic Models , Male , Middle Aged , Pre-Eclampsia/etiology , Pregnancy , Pregnancy Complications/epidemiology , Pregnancy Outcome , Premature Birth/epidemiology , Premature Birth/etiology , Retrospective Studies
12.
J Matern Fetal Neonatal Med ; 32(2): 243-247, 2019 Jan.
Article in English | MEDLINE | ID: mdl-28889762

ABSTRACT

INTRODUCTION: The aim of this study was to assess the correlation between fetal lateral ventricle width and biometric measurements. MATERIAL AND METHODS: A prospective study of 335 fetuses: 101 with isolated mild ventriculomegaly and a control group of 234 with a normal ultrasound examination. All fetuses underwent a detailed brain ultrasound scan and a full biometric evaluation. To further compare biometric parameters, we matched 91 fetuses from the study group to 91 fetuses from the control group according to gestational week and gender. RESULTS: The mean gestational week at examination differed significantly between the groups (29.6 weeks in the study group versus 28.3 in the control group, p = .001). Mean maternal age, obstetric history, mode of conception, and fetal gender did not differ between the groups. After matching according to gestational age and fetal gender, the mean gestational age did not differ and was 29+5 weeks in both groups. The study group had significantly larger head circumference (p = .009), biparietal diameter (p < .001), femur length (p = .023), and estimated fetal weight (p = .024) compared with the control group. CONCLUSIONS: Isolated mild ventriculomegaly may be related to larger fetal biometric measurements overall and does not necessarily indicate a pathological condition.


Subject(s)
Fetal Weight/physiology , Lateral Ventricles/diagnostic imaging , Nervous System Malformations/diagnosis , Adult , Case-Control Studies , Female , Gestational Age , Humans , Lateral Ventricles/embryology , Male , Nervous System Malformations/epidemiology , Nervous System Malformations/pathology , Pregnancy , Reference Values , Severity of Illness Index , Ultrasonography, Prenatal
13.
J Matern Fetal Neonatal Med ; 31(14): 1885-1888, 2018 Jul.
Article in English | MEDLINE | ID: mdl-28511577

ABSTRACT

OBJECTIVES: No study thus far has evaluated lower uterine segment (LUS) thickness in active labor. In this study, we endeavored to assess the LUS during active labor. METHODS: Using transabdominal sonography in the mid-sagittal position with a full urinary bladder, the thickness of the LUS was measured during the active phase of labor in women with or without a history of a previous cesarean section. RESULTS: A total of 28 women with a previous cesarean delivery were compared with 29 women without a history of uterine surgery. The median LUS was significantly thinner in women with a uterine scar both during (4 versus 5 mm, p = .001) and between contractions (5 versus 7 mm, p = .011). Paired comparison of LUS thickness between and during contractions within each group showed that thinning of the LUS during contraction was significant in both the previous cesarean section (CS) group (p < .001) and the control group (p < .001). We found no correlation between LUS thickness and the chance of a successful trial of labor after cesarean (TOLAC). CONCLUSIONS: In this study, we characterized for the first time the LUS during active labor. We found that the LUS was significantly thinner in women after a previous CS and that the LUS was significantly thinner during contraction.


Subject(s)
Cicatrix/diagnostic imaging , Labor, Obstetric , Uterine Contraction , Uterus/diagnostic imaging , Adult , Cesarean Section/adverse effects , Cicatrix/etiology , Cicatrix/physiopathology , Female , Humans , Pregnancy , Prospective Studies , Ultrasonography, Prenatal , Uterus/physiology
14.
J Trauma Acute Care Surg ; 83(4): 675-682, 2017 10.
Article in English | MEDLINE | ID: mdl-28930960

ABSTRACT

BACKGROUND: Hemorrhage is the leading cause of potentially preventable death on the battlefield. There is increasing evidence for the effectiveness of blood component therapy in general, and plasma infusion in particular, but their use is less applicable in the prehospital setting due to logistic difficulties. The Israel Defense Forces have implemented the use of freeze-dried plasma (FDP) at the point of injury (POI); this adoption raised questions regarding the feasibility and effectiveness of the practice. In this article, we present our experience with the use of FDP at the POI and in the prehospital setting with regard to feasibility, safety, adverse reactions, and adherence to clinical practice guidelines. METHODS: This is a descriptive retrospective cohort study based on all casualties receiving FDP between January 2013 and June 2016. The study describes the injury, treatment, and outcome characteristics from the POI until hospital discharge. RESULTS: During the study period, 109 casualties received FDP. The majority were men aged 18 years to 35 years. Multiple severe injuries were found in almost half of the casualties, 78% had penetrating injury, and more than half were involved in a multicasualty event. Eighty-three percent were treated with one unit of FDP, 13% with two units, and 4% with three units; nine patients (8.2%) were also treated in the prehospital setting with packed red blood cells. Fifty-seven percent fulfilled at least one criterion for the administration of FDP. Lifesaving interventions were required in 64%. In five (4.6%) cases, there were difficulties with FDP administration. Side effects were reported in one female patient. CONCLUSION: This study supports the feasibility of FDP use at the POI and in the prehospital setting. Further adjustment of the clinical practice guidelines is required, basing them not only on pathophysiologic parameters but also on clinical judgment. Further investigation of the available data is required to learn about the effectiveness of FDP at the POI. LEVEL OF EVIDENCE: Retrospective case series study, level IV.


Subject(s)
Blood Component Transfusion , Emergency Medical Services , Wounds and Injuries/therapy , Adolescent , Adult , Feasibility Studies , Female , Freeze Drying , Guideline Adherence , Humans , Male , Plasma , Retrospective Studies , Treatment Outcome , Wounds and Injuries/mortality , Young Adult
15.
J Adolesc Health ; 61(2): 233-239, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28457687

ABSTRACT

PURPOSE: The secular trend of increasing weight may lead to a decline in height gain relative to the genetic height potential. The impact of weight on height in healthy male and female adolescents, compared with their genetic height, was assessed. METHODS: Height and weight were measured in Israeli adolescent military recruits aged 16-19 years between 1967 and 2013. The study population comprised 355,229 recruits for whom parental height measurements were documented. Subjects were classified into four body mass index percentile groups according to the U.S. Centers for Disease Control and Prevention body mass index percentiles for age and sex: <5th (underweight), 5th-49th (low-normal), 50th-84th (high-normal), and ≥85th (overweight-obese). Short stature was defined as height ≤ third percentile and tall stature as height ≥ 90th percentile for age and sex. RESULTS: Overweight-obese females had a 73% increased risk for short stature (odds ratio [OR]: 1.73, 95% confidence interval [CI] = 1.51-1.97, p < .001). Conversely, underweight females had a 56% lower risk of short stature (OR: .44, 95% CI = .28-.70, p = .001) and a twofold increased risk of being tall (OR: 2.08, 95% CI = 1.86-2.32, p < .001). Overweight-obese males had a 23% increased risk of being short (OR: 1.23, 95% CI = 1.10-1.37, p < .001). Underweight females were on average 4.1 cm taller than their mid-parental height. CONCLUSIONS: Overweight-obese males and females had an increased risk of being short, and underweight females were significantly taller compared with their genetic height. The significantly increased height among underweight healthy females may reflect a potential loss of height gain in overweight-obese females.


Subject(s)
Body Height/genetics , Obesity/epidemiology , Parents , Thinness/epidemiology , Adolescent , Body Height/physiology , Body Mass Index , Female , Humans , Israel/epidemiology , Male , Risk Factors , Sex Factors
16.
J Trauma Acute Care Surg ; 83(2): 278-283, 2017 08.
Article in English | MEDLINE | ID: mdl-28230629

ABSTRACT

BACKGROUND: Hemorrhage is a leading cause of preventable death on the battlefield. Timely tourniquet application to massively bleeding extremity wounds is critical for casualty survival, albeit with reported adverse effects on extremity integrity. The aim of this study was to describe the immediate and short-term outcomes of point of injury (POI) tourniquet applications during "Operation Protective Edge" (OPE). METHODS: A case series study regarding tourniquet application at the POI during OPE was collected. The data gathered included reports by medical providers at the POI, aerial and land evacuation vehicles, and receiving hospitals. Variables collected included the number of tourniquet applications, caregiver level, tourniquet type, limb characteristics, tourniquet effectiveness, in-hospital procedures, complications, and short-term limb outcome. RESULTS: During OPE, the Israel Defense Forces Medical Corps treated 704 casualties. Of these, 90 casualties were treated with 119 tourniquets, of whom 79 survived. Penetrating trauma was the mechanism of injury in 97.8% (88 of 90) of the casualties. Injuries sustained from improvised explosive devices and shrapnel were related to the use of more than one tourniquet per casualty and per limb (p = 0.034). The success rate of the first tourniquet was 70% (84 of 119), regardless of caregiver level (p = 0.56), tourniquet type (p = 0.16), or limb characteristics (p = 0.48). Twenty-seven of 105 tourniquets (25.7%) were converted to direct pressure dressings en route to receiving hospitals; two of these conversions failed, and a new tourniquet was applied. Fasciotomy was performed on eight casualties (a single limb in each). Vascular injury was presumed to be the indication for fasciotomy in three of these cases; in the other five limbs (6%, 5 of 85), no vascular involvement was discovered during surgery, and the fasciotomy is suspected to be tourniquet related. Six of 85 limbs (7%) suffered neurological sequelae that could not be explained by the primary injury. The total complication rate was 11.7% (10 of 85) (one patient had both fasciotomy and a neural complication without vascular injury). CONCLUSION: Tourniquet use on the battlefield is a simple method of eliminating preventable death. We believe that clinical practice guidelines should promote liberal use of tourniquets by trained combatants and medical personnel who are able to convert to direct pressure hemorrhage control when possible, since an unjustified tourniquet application carries a low rate of minor morbidity, whereas a justified tourniquet not applied may be lethal. LEVEL OF EVIDENCE: Epidemiologic study, level III; Therapeutic study, level IV.


Subject(s)
Extremities/injuries , Hemorrhage/therapy , Hemostatic Techniques , Military Personnel , Multiple Trauma/therapy , Tourniquets , War-Related Injuries/therapy , Bandages , Emergency Responders/education , Guideline Adherence , Hemorrhage/mortality , Humans , Injury Severity Score , Israel , Male , Military Personnel/education , Multiple Trauma/mortality , Pressure , Registries , Resuscitation/education , Resuscitation/methods , Survival Analysis , War-Related Injuries/mortality , Young Adult
17.
Prehosp Emerg Care ; 21(3): 315-321, 2017.
Article in English | MEDLINE | ID: mdl-27870553

ABSTRACT

BACKGROUND: During resuscitation in the field, intraosseous (IO) access may be achieved using a variety of available devices, often attempted by inexperienced users. AIM: We sought to examine the success rate and ease-of-use ratings of an IO device, the NIO® (New Intraosseous; Persys Medical, Houston, TX, USA), in comparison to the Arrow® EZ-IO® (Teleflex Medical, Research Triangle Park, NC, USA) among novice users. METHODS: We performed a randomized crossover trial. The study model was a porcine hind leg cut distally to expose the marrow. The study population was composed of pre-graduate medical students without prior experience in IO use, all designated future field physicians. The students underwent instruction and practiced the use of both devices. After practice completion, each student attempted a single IO insertion with both devices sequentially, in randomized order. Success was defined as flow of fluid through the bone marrow after a single IO attempt. Investigators who determined success were blinded to the device used. RESULTS: Fifty users (33 males, 17 females) participated in the trial, with a mean age of 21.7 (±1) years. NIO users were successful in 92% (46/50) of attempts, while the EZ-IO success rate was 88% (44/50). NIO success rates were comparable to those of the EZ-IO (p = NS). Results were similar when examining only the initial device used. The median ease-of-use score was 4 (on a 5-point Likert scale) for both devices (p = NS). 54% (27/50) of the participants preferred the EZ-IO over the NIO (p = NS). CONCLUSION: Novice users were equally successful in establishing IO access with the NIO® and the EZ-IO® in a porcine model.
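
The abstract reports the success comparison only as "p = NS". One way to verify that the reported counts are indeed non-significant is a Fisher's exact test on the 2 × 2 table (the specific test the authors used is not stated, so this is an assumption); a short Python/SciPy sketch:

```python
from scipy.stats import fisher_exact

# Success/failure counts reported in the abstract
table = [[46, 4],   # NIO: successes, failures (46/50)
         [44, 6]]   # EZ-IO: successes, failures (44/50)

odds_ratio, p_value = fisher_exact(table)
print(round(p_value, 2))   # well above 0.05, consistent with the reported non-significance
```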


Subject(s)
Fluid Therapy/instrumentation , Infusions, Intraosseous/instrumentation , Animals , Cross-Over Studies , Emergency Medical Services , Female , Hindlimb , Humans , Male , Resuscitation/education , Resuscitation/methods , Single-Blind Method , Swine , Young Adult
18.
Prehosp Emerg Care ; 21(1): 39-45, 2017.
Article in English | MEDLINE | ID: mdl-27494564

ABSTRACT

OBJECTIVE: Tourniquet application is a lifesaving skill taught worldwide in first aid bleeding control courses. We assessed non-medical tourniquet users' confidence, competence, and reasons for failure. METHODS: 179 Israeli military recruits without prior medical training underwent their standard first aid course, where they learned Combat Application Tourniquet (CAT; Composite Resources, Rock Hill, SC, USA) use. After course completion, they self-reported confidence in tourniquet use. User performance was assessed 7-14 days later using a HapMed™ mannequin that measured time, pressure, and blood loss. Competent performance required, in aggregate: 1) applied pressure of 200 mmHg or more, 2) hemorrhage volume of less than 638 mL, and 3) correct placement of the tourniquet. For failed performances, a reason for failure was reported independently by both the user and an expert observer. RESULTS: 45 of 179 user performances (25%) were competent. Users who reported high confidence had only a slightly higher chance of achieving competence in tourniquet application (r = 0.17, p = 0.022). The most common reason for failure was excess slack in the CAT's strap (experts 55%, users 39%); too few turns of the windlass (23% and 31%, respectively) was the second most common reason. Expert and user evaluations had poor agreement (κ = 0.44, 95% CI 0.32-0.56). CONCLUSION: The most common reason for failed tourniquet use among non-medical users was excess slack in the tourniquet strap. Users self-evaluated their performance inaccurately and demonstrated a confidence-competence mismatch. These pitfalls in performance may help tourniquet instructors improve training of caregivers.
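
The three competence criteria are explicit enough to express directly in code. A minimal sketch, assuming Python and hypothetical mannequin readings (the HapMed device's actual data interface is not described in the abstract):

```python
def competent_application(pressure_mmhg: float,
                          blood_loss_ml: float,
                          correct_placement: bool) -> bool:
    """Competence as defined in the abstract: all three criteria must hold in aggregate."""
    return (pressure_mmhg >= 200          # applied pressure of 200 mmHg or more
            and blood_loss_ml < 638       # hemorrhage volume under 638 mL
            and correct_placement)        # tourniquet placed correctly


# Hypothetical readings from two assessments
print(competent_application(pressure_mmhg=215, blood_loss_ml=420, correct_placement=True))   # True
print(competent_application(pressure_mmhg=180, blood_loss_ml=420, correct_placement=True))   # False
```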


Subject(s)
Hemorrhage/prevention & control , Tourniquets , Adolescent , Clinical Competence , Emergency Medical Services , Equipment Design , Extremities/blood supply , Humans , Male , Manikins , Treatment Failure , Young Adult
19.
Am J Emerg Med ; 34(12): 2356-2361, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27614373

ABSTRACT

INTRODUCTION: Although a lifesaving skill, currently, there is no consensus for the required amount of practice in tourniquet use. We compared the effect of 2 amounts of practice on performance of tourniquet use by nonmedical personnel. METHODS: Israeli military recruits without previous medical training underwent their standard tactical first aid course, and their initial performance in use of the Combat Application Tourniquet (CAT; Composite Resources, Rock Hill, SC) was assessed. The educational intervention was to allocate the participants into a monthly tourniquet practice program: either a single-application practice (SAP) group or a triple-application practice (TAP) group. Each group practiced according to its program. After 3 months, the participants' tourniquet use performance was reassessed. Assessments were conducted using the HapMed Leg Tourniquet Trainer (CHI Systems, Fort Washington, PA), a mannequin which measures time and pressure. RESULTS: A total of 151 participants dropped out, leaving 87 in the TAP group and 69 in the SAP group. On initial assessment, the TAP group and the SAP group performed similarly. Both groups improved their performance from the initial to the final assessment. The TAP group improved more than the SAP group in mean application time (faster by 18 vs 8 seconds, respectively; P = .023) and in reducing the proportion of participants who were unable to apply any pressure to the mannequin (less by 18% vs 8%, respectively; P = .009). CONCLUSION: Three applications per monthly practice session were superior to one. This is the first prospective validation of a tourniquet practice program based on objective measurements.


Subject(s)
Education, Nonprofessional/methods , Military Personnel/education , Practice, Psychological , Tourniquets , Adolescent , First Aid , Humans , Israel , Male , Manikins , Pressure , Prospective Studies , Simulation Training , Task Performance and Analysis , Time Factors , Young Adult
20.
Am J Prev Med ; 50(6): 737-745, 2016 06.
Article in English | MEDLINE | ID: mdl-26810356

ABSTRACT

INTRODUCTION: There are mixed data regarding the effect of emotional distress on diabetes risk, especially among young adults. This study assessed the effect of self-perceived emotional distress on diabetes incidence among young men. METHODS: Incident diabetes during a mean follow-up of 6.3 (4.3) years was assessed among 32,586 men (mean age, 31.0 [5.6] years) of the Metabolic, Lifestyle, and Nutrition Assessment in Young Adults cohort with no history of diabetes between 1995 and 2011. Emotional distress was assessed by asking participants, as part of a computerized questionnaire: "Are you preoccupied by worries or concerns that affect your overall wellbeing?" Time-dependent Cox models were applied. Data analysis took place between 2014 and 2015. RESULTS: There were 723 cases of diabetes during 206,382 person-years. The presence of distress was associated with a 53% higher incidence of diabetes (95% CI=1.08, 2.18, p=0.017) after adjustment for age, BMI, fasting plasma glucose, family history of diabetes, triglyceride and high-density lipoprotein cholesterol levels, education, cognitive performance, white blood cell count, physical activity, and sleep quality. These results persisted when distress, BMI, physical activity, and smoking status were treated as time-dependent variables (hazard ratio=1.66, 95% CI=1.21, 2.17, p=0.002). An adjusted hazard ratio of 2.14 (95% CI=1.04, 4.47, p=0.041) for incident diabetes was observed among participants persistently reporting emotional distress compared with those persistently denying it. CONCLUSIONS: Sustained emotional distress contributes to the development of diabetes among young and apparently healthy men in a time-dependent manner. These findings warrant awareness by primary caregivers when stratifying diabetes risk.


Subject(s)
Diabetes Mellitus, Type 2/epidemiology , Self Concept , Stress, Psychological/psychology , Adult , Blood Glucose/metabolism , Body Mass Index , Follow-Up Studies , Humans , Incidence , Life Style , Male , Prospective Studies , Risk Factors