2.
Ann Biomed Eng ; 2024 Jun 29.
Article in English | MEDLINE | ID: mdl-38951421

ABSTRACT

Low back pain (LBP) is a common medical condition worldwide, though the etiology of the injuries causing most LBP is unknown. Flexion and repeated compression increase lumbar injury risk, yet the complex viscoelastic behavior of the lumbar spine has not been characterized under this loading scheme. Characterizing the non-injurious primary creep behavior of the lumbar spine is necessary for understanding the biomechanical response preceding injury. Fifteen porcine lumbar spinal units were loaded in repeated flexion-compression with peak compressive stresses ranging from 1.41 to 4.68 MPa. The applied loading simulated real loading exposures experienced by high-speed watercraft occupants. The strain response in the primary creep region was modeled for all tests using a generalized Kelvin-Voigt model. A quasilinear viscoelastic (QLV) approach was used to separate the time-dependent (creep) and stress-dependent (elastic) responses. Optimizations between the models and experimental data determined the creep time constants, creep coefficients, and elastic constants associated with this tissue under repeated flexion-compression loading. The average R² across all fifteen models was 0.997. The creep time constants optimized across all fifteen models were 24 s and 580 s and contributed 20 ± 3% and 30 ± 3% of the overall strain response, respectively. The non-transient behavior contributed 50 ± 0% of the overall response. Elastic behavior for this porcine population had an average standard deviation of 24.5% strain across the applied stress range. The presented primary creep characterization provides the response precursor to injurious behavior in the lumbar spine. Results from this study can further inform lumbar injury prediction and kinematic models.
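The primary creep characterization above can be illustrated with a simple curve fit. The following is a minimal sketch, assuming a QLV-style separation in which the total strain is an elastic (stress-dependent) strain scaled by a normalized two-term creep function; the functional form, variable names, and synthetic data are illustrative assumptions, not the authors' implementation.

# Hedged sketch: fit a two-term generalized Kelvin-Voigt creep function to
# primary-creep strain data.  The assumed form,
#   eps(t) = eps_elastic * [c_inf + c1*(1 - exp(-t/tau1)) + c2*(1 - exp(-t/tau2))],
# is one plausible reading of the abstract, not the authors' exact model.
import numpy as np
from scipy.optimize import curve_fit

def creep_strain(t, eps_elastic, c1, tau1, c2, tau2):
    c_inf = 1.0 - c1 - c2                      # non-transient (elastic) share
    return eps_elastic * (c_inf
                          + c1 * (1.0 - np.exp(-t / tau1))
                          + c2 * (1.0 - np.exp(-t / tau2)))

# Synthetic stand-in for measured peak strain per cycle (hypothetical data).
t = np.linspace(0.0, 3600.0, 600)              # time, s
eps = creep_strain(t, 0.12, 0.20, 24.0, 0.30, 580.0)
eps += np.random.normal(0.0, 0.001, t.size)    # measurement noise

p0 = [0.1, 0.2, 30.0, 0.3, 500.0]              # initial parameter guesses
popt, _ = curve_fit(creep_strain, t, eps, p0=p0)
print("fitted creep time constants (s):", popt[2], popt[4])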

3.
Ann Biomed Eng ; 2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38922366

ABSTRACT

Evaluating Behind Armor Blunt Trauma (BABT) is a critical step in preventing non-penetrating injuries in military personnel, which can result from the transfer of kinetic energy from projectiles impacting body armor. While the current NIJ Standard-0101.06 focuses on preventing excessive armor backface deformation, it does not account for variability in impact location, thoracic organ and tissue material properties, and injury thresholds when assessing potential injury. To address this gap, Finite Element (FE) human body models (HBMs) have been employed to investigate variability in BABT impact conditions by recreating specific cases from survivor databases and generating injury risk curves. However, these deterministic analyses predominantly use models representing the 50th percentile male and do not investigate the uncertainty and variability inherent within the system, limiting the generalizability of injury risk assessments across a diverse military population. The DoD-funded I-PREDICT Future Naval Capability (FNC) introduces a probabilistic HBM, which considers uncertainty and variability in tissue material and failure properties, anthropometry, and external loading conditions. This study utilizes the I-PREDICT HBM for BABT simulations at three thoracic impact locations: liver, heart, and lower abdomen. A probabilistic analysis of tissue-level strains resulting from a BABT event is used to determine the probability of reaching a given Military Combat Incapacitation Scale (MCIS) level for organ-level injuries, and the New Injury Severity Score (NISS) is employed for whole-body injury risk evaluation. Organ-level MCIS metrics show that impact at the heart can cause severe injuries to the heart and spleen, whereas impact at the liver can cause rib fractures and major lacerations in the liver. Impact at the lower abdomen can cause lacerations in the spleen. Simulation results indicate that, under current protection standards, the whole-body risk of injury varies between 6% and 98% depending on impact location, with impact at the heart being the most severe, followed by the liver and the lower abdomen. These results suggest that current body armor protection standards may still permit severe injuries at specific impact locations while preventing injury at others.
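The New Injury Severity Score used for the whole-body evaluation follows a published scoring rule that can be stated compactly: it is the sum of the squares of the three most severe AIS injury severities, regardless of body region. A minimal sketch is below; the example AIS values are hypothetical placeholders, and the study's mapping from tissue-level strain to AIS severity is not reproduced here.

# Hedged sketch: New Injury Severity Score (NISS) from per-injury AIS values.
# NISS = sum of squares of the three highest AIS severities, any body region.
def niss(ais_scores):
    top_three = sorted(ais_scores, reverse=True)[:3]
    return sum(a * a for a in top_three)

# Example: hypothetical organ-level AIS severities from one simulated impact.
print(niss([3, 2, 2, 1]))   # -> 17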

4.
Ann Biomed Eng ; 2024 May 15.
Article in English | MEDLINE | ID: mdl-38748343

ABSTRACT

Low back pain (LBP) affects 50-80% of adults at some point in their lifetime, yet the etiology of injury is not well understood. Those exposed to repeated flexion-compression, such as helicopter pilots and motor vehicle operators, are at higher risk for LBP. Animal injury models offer insight into in vivo injury mechanisms, but interspecies scaling is needed to relate animal results to humans. Human (n = 16) and porcine (n = 20) lumbar functional spinal units (FSUs) were loaded in repeated flexion-compression (1 Hz) to determine endplate fracture risk over long loading exposures. Flexion oscillated from 0 to 6° and peak applied compressive stress ranged from 0.65 to 2.38 MPa for human and 0.64 to 4.68 MPa for porcine specimens. Five human and twelve porcine injuries were observed. The confidence intervals for the human and porcine 50% injury risk curves in terms of stress and cycles overlapped, indicating similar failure behavior for this loading configuration. However, porcine specimens were more tolerant of the applied loading than human specimens, as demonstrated by a longer time-to-failure at the same applied stress. Optimization revealed that time-to-failure in human specimens was approximately 25% that of porcine specimens at a given applied stress within 0.65-2.38 MPa. This study determined human and porcine lumbar endplate fracture risks in long-duration repeated flexion-compression that can be directly used for future equipment and vehicle design, injury prediction models, and safety standards. The interspecies scale factor produced in this study can be applied to previous and future porcine lumbar injury studies to scale results to the corresponding human injury risk.
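As one illustration of how the reported scale factor might be applied, the sketch below converts porcine cycles-to-failure into a human-equivalent estimate and evaluates a Weibull-form risk curve. The 0.25 factor comes from the abstract; the Weibull shape and scale values and the helper names are hypothetical, not the fitted risk curves from the study.

# Hedged sketch: apply the interspecies scale factor and evaluate an
# assumed Weibull-form injury risk curve (placeholder parameters).
from scipy.stats import weibull_min

HUMAN_TO_PORCINE_TIME_RATIO = 0.25      # human time-to-failure ~25% of porcine

def human_equivalent_cycles(porcine_cycles):
    # Same applied stress and loading rate assumed for both species.
    return porcine_cycles * HUMAN_TO_PORCINE_TIME_RATIO

def injury_risk(cycles, shape=1.5, scale=5000.0):
    # Hypothetical Weibull risk curve in cycles-to-failure.
    return weibull_min.cdf(cycles, shape, scale=scale)

porcine_cycles_to_failure = 20000
print(injury_risk(human_equivalent_cycles(porcine_cycles_to_failure)))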

6.
J Biomech Eng ; 2021 Jul 06.
Article in English | MEDLINE | ID: mdl-34227649

ABSTRACT

Cavitation has been shown to have implications for head injury, but there is currently no method for detecting the formation of cavitation through the skull during blunt impact. The goal of this communication is to confirm the wideband acoustic wavelet signature of cavitation collapse and to determine whether this signature can be differentiated from the noise of a blunt impact. A controlled, laser-induced cavitation study was conducted in an isolated water tank to confirm the wideband acoustic signature of cavitation collapse in the absence of a blunt impact. A clear acrylic surrogate head was impacted to induce blunt-impact cavitation. Bubble formation was imaged using a high-speed camera, and the collapse was synchronized with the wavelet transform of the acoustic emission. A wideband acoustic response is seen in the wavelet transform of positive laser-induced cavitation tests but is absent in laser-induced negative controls. Clear acrylic surrogate tests showed that the wideband acoustic wavelet signature of collapse can be differentiated from the acoustic noise generated by a blunt impact. The broadband acoustic signal can be used as a biomarker to detect the incidence of cavitation through the skull, as it contains frequencies low enough to potentially pass through the skull yet high enough to differentiate from blunt-impact noise. This lays the foundation for a vital tool for conducting CSF cavitation research in vivo.
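A hedged sketch of the kind of wavelet analysis described, a continuous wavelet transform used to look for a broadband burst in an acoustic record, is shown below. The wavelet choice, scales, threshold, and synthetic signal are illustrative assumptions rather than the authors' processing pipeline.

# Hedged sketch: continuous wavelet transform of an acoustic record and a
# simple check for a broadband burst (a cavitation-collapse candidate).
import numpy as np
import pywt

fs = 1.0e6                                   # sample rate (Hz), hypothetical
t = np.arange(0.0, 0.01, 1.0 / fs)
sig = np.random.normal(0.0, 0.01, t.size)    # background noise
sig[5000:5005] += 1.0                        # stand-in for a sharp collapse burst

scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1.0 / fs)

energy = np.abs(coefs)
threshold = 5.0 * np.median(energy)          # crude noise-based threshold
# Fraction of scales simultaneously above threshold at each time sample.
wideband_fraction = (energy > threshold).mean(axis=0)
print("peak wideband fraction:", wideband_fraction.max())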

7.
Ann Biomed Eng ; 49(11): 3018-3030, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34297262

ABSTRACT

Modern changes in warfare have shown an increased incidence of lumbar spine injuries caused by underbody blast events. The susceptibility of the lumbar spine during these scenarios could be exacerbated by coupled moments that act with the rapid compressive force, depending on the occupant's seated posture. In this study, a combined-loading lumbar spine vertebral body fracture injury criterion (Lic) across a range of postures was established from 75 tests performed on instrumented cadaveric lumbar spine specimens. The spines were predominantly exposed to axial compressive forces from an upward vertical thrust, with 64 of the tests resulting in at least one vertebral body fracture and 11 in no vertebral body injury. The proposed Lic utilizes a recommended metric (κ), based on prismatic beam failure theory, resulting from the combination of the T12-L1 resultant sagittal force and the decorrelated bending moment with optimized critical values of Fr,crit = 5824 N and My,crit = 1155 N·m. The 50% risk of lumbar spine injury corresponded to a combined metric value of 1, with risk decreasing at lower metric values. At 50% injury risk, the normalized confidence interval size improved from 0.24 for a force-based injury reference curve to 0.17 for the combined-loading metric.
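The abstract does not give the functional form of the combined metric, but one plausible reading, consistent with a prismatic-beam interaction of axial force and bending moment and with 50% injury risk at a metric value of 1, is the normalized linear combination below; this is an assumed illustration, not the authors' published equation.

\kappa = \frac{F_r}{F_{r,\mathrm{crit}}} + \frac{M_y}{M_{y,\mathrm{crit}}}, \qquad F_{r,\mathrm{crit}} = 5824\ \mathrm{N}, \quad M_{y,\mathrm{crit}} = 1155\ \mathrm{N\cdot m}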


Subject(s)
Blast Injuries; Fractures, Bone; Lumbar Vertebrae/injuries; Spinal Injuries; Aged; Explosions; Humans; Male; Middle Aged; Stress, Mechanical
8.
J Biomech ; 117: 110227, 2021 03 05.
Article in English | MEDLINE | ID: mdl-33517244

ABSTRACT

Understanding the initiation of bony failure is critical in assessing the progression of bone fracture and in developing injury criteria. Detection of acoustic emissions in bone can be used to identify fractures more sensitively and at an earlier inception time compared to traditional methods. However, high-rate loading conditions, complex specimen-device interactions, or specimen geometry may generate other acoustic signals. Therefore, characterization of the isolated local acoustic emission response from cortical bone fracture is essential to distinguish its characteristics from other potential acoustic sources. This work develops a technique that uses acoustic emission signals to determine when cortical bone failure occurs, characterizing the signals with both a Welch power spectral density estimate and a continuous wavelet transform. Isolated cortical shell specimens from thoracic vertebral bodies with attached acoustic sensors were subjected to quasistatic loading until failure. The resulting acoustic emissions had a wideband frequency response with peaks from 20 to 900 kHz, with the spectral peaks clustered in three frequency bands (166 ± 52.6 kHz, 379 ± 37.2 kHz, and 668 ± 63.4 kHz). Using these frequency bands, acoustic emissions can be used as a monitoring tool in biomechanical spine testing, distinguishing bone failure from structural response. This work presents a necessary set of techniques for effectively utilizing acoustic emissions to determine the onset of cortical bone fracture in biological material testing. Acoustic signatures can be developed for other cortical bone regions of interest using the presented methods.
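A minimal sketch of the two signal-processing steps named above, a Welch power spectral density estimate followed by spectral peak identification in the 20-900 kHz band, is given below; the sample rate, window length, and synthetic signal are assumptions for illustration only.

# Hedged sketch: Welch PSD of an acoustic-emission record and peak picking
# in the 20-900 kHz band.  All parameters are illustrative placeholders.
import numpy as np
from scipy.signal import welch, find_peaks

fs = 5.0e6                                    # sample rate (Hz), hypothetical
t = np.arange(0.0, 0.002, 1.0 / fs)
ae = (np.sin(2 * np.pi * 170e3 * t)           # stand-in emission components
      + 0.5 * np.sin(2 * np.pi * 380e3 * t)
      + np.random.normal(0.0, 0.1, t.size))

f, pxx = welch(ae, fs=fs, nperseg=4096)
band = (f >= 20e3) & (f <= 900e3)
peaks, _ = find_peaks(pxx[band], prominence=0.1 * pxx[band].max())
print("spectral peaks (kHz):", f[band][peaks] / 1e3)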


Subject(s)
Acoustics; Fractures, Bone; Cortical Bone; Humans; Materials Testing; Thoracic Vertebrae
9.
PLoS One ; 15(2): e0228802, 2020.
Article in English | MEDLINE | ID: mdl-32053658

ABSTRACT

Since World War I, helmets have been used to protect the head in warfare, designed primarily for protection against artillery shrapnel. More recently, helmet requirements have included ballistic and blunt trauma protection, but neurotrauma from primary blast has never been a key concern in helmet design. Only in recent years has the threat of direct blast wave impingement on the head, separate from penetrating trauma, been appreciated. This study compares the blast-protective effect of historical (World War I) and current combat helmets against each other and against the bare ('no helmet') head for realistic shock wave impingement on the helmet crown. Helmets included World War I variants from the United Kingdom/United States (Brodie), France (Adrian), and Germany (Stahlhelm), and a current United States combat variant (Advanced Combat Helmet). Helmets were mounted on a dummy head and neck and aligned along the crown of the head with a cylindrical shock tube to simulate an overhead blast. Primary blast waves of different magnitudes were generated based on estimated blast conditions from historical shells. Peak reflected overpressure at the open end of the blast tube was compared to peak overpressure measured at several head locations. All helmets provided significant pressure attenuation compared to the no-helmet case. The modern variant did not provide more pressure attenuation than the historical helmets, and some historical helmets performed better at certain measurement locations. The study demonstrates that both historical and current helmets have some primary blast protective capability, and that simple design features may improve these capabilities for future helmet systems.


Subject(s)
Head Protective Devices; Biomechanical Phenomena; Blast Injuries/prevention & control; Equipment Design; Head Injuries, Penetrating/prevention & control; Head Protective Devices/history; History, 20th Century; Humans; World War I