Results 1 - 7 of 7
1.
Spine J ; 24(4): 721-729, 2024 Apr.
Article in English | MEDLINE | ID: mdl-37875243

ABSTRACT

BACKGROUND CONTEXT: Improving osseointegration of orthopedic spinal implants remains a clinical challenge. Poly-ether-ether-ketone (PEEK) and titanium are commonly used in orthopedic applications because of their inherent biocompatibility. Titanium has a clinical reputation for durability and osseous affinity, while PEEK offers the advantages of radiolucency and an elastic modulus that approximates that of osseous structures. The hypothesis for the current investigation was that a titanium plasma spray (TPS) coating would increase the rate and magnitude of circumferential and appositional trabecular osseointegration of PEEK and titanium implants versus uncoated controls. PURPOSE: Using an in vivo ovine model, the current investigation compared titanium plasma-sprayed PEEK and titanium dowels versus non-plasma-sprayed dowels. Using a time-course study of 6 and 12 weeks postoperatively, the assays used to quantify osseointegration included micro-computed tomography (microCT), biomechanical testing, and histomorphometry. STUDY DESIGN/SETTING: In vivo ovine model. METHODS: Twelve skeletally mature crossbred sheep were equally randomized into postoperative periods of 6 and 12 weeks. Four types of dowel implants (PEEK, titanium plasma-sprayed PEEK [TPS PEEK], titanium, and titanium plasma-sprayed titanium [TPS titanium]) were implanted into cylindrical metaphyseal defects in the distal femurs and proximal humeri (one defect per limb, n=48 sites). Sixteen nonoperative specimens (eight femurs and eight humeri) served as zero time-point controls. Half of the specimens underwent destructive biomechanical pullout testing; the remaining half underwent quantitative microCT, to quantify circumferential bone volume within 1 mm and 2 mm of the implant surface, and histomorphometry, to compute direct trabecular apposition. RESULTS: There were no intra- or perioperative complications. The TPS-coated implants demonstrated significantly higher peak loads at dowel pullout at 6 and 12 weeks compared with uncoated controls (p<.05). No differences were observed across dowel treatments at the zero time-point (p>.05). MicroCT showed no significant differences in circumferential osseointegration between implants within 1 mm or 2 mm of the dowel surface (p>.05). Direct appositional osseointegration of trabecular bone based on histomorphometry was higher for the TPS-coated groups, regardless of base material, than for the uncoated treatments at both time intervals (p<.05). CONCLUSIONS: The current in vivo study demonstrated the biological and mechanical advantages of plasma spray coatings: TPS improved histological incorporation and the peak force required for implant extraction. CLINICAL SIGNIFICANCE: Plasma spray coatings may offer clinical benefit by improving biological fixation and osseointegration within the first 6 to 12 weeks postoperatively, the critical healing period for implant-based arthrodesis procedures.


Subjects
Benzophenones, Ketones, Osseointegration, Polymers, Animals, Sheep, Ketones/chemistry, Titanium/chemistry, Ether, X-Ray Microtomography, Ethyl Ethers, Ethers, Coated Biocompatible Materials/chemistry
2.
Instr Course Lect ; 71: 413-425, 2022.
Article in English | MEDLINE | ID: mdl-35254798

ABSTRACT

Vertebral body tethering is a nonfusion technique for the surgical correction of adolescent idiopathic scoliosis. For skeletally immature patients in whom it is indicated, vertebral body tethering offers an alternative to the gold-standard posterior spinal fusion (PSF) and may at least partially preserve motion in the instrumented segments of the spine. Benefits of the procedure include the possibility of avoiding the long-term sequelae of PSF, such as adjacent segment disease and proximal junctional kyphosis. Recent retrospective case series of vertebral body tethering have shown promising results, with correction rates of up to 70% but greater variability in outcomes than with PSF. The complication profile also appears to differ from that of PSF, with tether breakage and overcorrection as the primary concerns in addition to approach-related complications. Although early outcomes have been promising, additional studies are required to optimize surgical timing, establish long-term outcomes, and define the possible role of tethering in more skeletally mature patients.


Subjects
Scoliosis, Vertebral Body, Adolescent, Humans, Retrospective Studies, Scoliosis/surgery, Spinal Fusion/methods, Thoracic Vertebrae/surgery, Treatment Outcome, Vertebral Body/surgery
3.
Arthrosc Sports Med Rehabil ; 2(5): e469-e473, 2020 Oct.
Article in English | MEDLINE | ID: mdl-33134982

ABSTRACT

PURPOSE: To compare gap displacement at various intervals of cyclic testing and biomechanical load to failure of a Krackow patellar tendon repair augmented with high-strength suture tape versus the standard Krackow transosseous technique for inferior pole patellar tendon rupture. METHODS: Twelve matched pairs of cadaveric knees were used (8 males and 4 females; mean age 79.6 years, range 57 to 96). An inferior pole patellar tendon rupture was simulated after random assignment of specimens in each pair to the standard or augmented Krackow technique. Each specimen was then repetitively cycled from 90° to 5° for 1,000 cycles. A differential variable reluctance transducer was used to measure gap displacement. After cyclic loading, load to failure was determined by pulling the tendon at a rate of 15 mm/s until a sudden decrease in load occurred. RESULTS: Compared with the control repair, specimens with augmented repair demonstrated significantly less displacement at all testing intervals up to 1,000 cycles (P < .05). Two patellar tendons failed before the end of cyclic loading, and 4 specimens had inadequate tendon length for loading. Among the 18 remaining specimens, no significant difference in load to failure was observed between the experimental group (n = 11) and the control group (n = 7) (1,006.5 ± 332.1 versus 932.8 ± 229.1 N, respectively; P = .567). CONCLUSIONS: Significantly greater gap displacement was observed in the standard Krackow repair group compared with the augmented Krackow group at all cyclic loading intervals. This suggests that the Krackow transosseous procedure augmented with high-strength suture tape is biomechanically viable for inferior pole patellar tendon repair. CLINICAL RELEVANCE: This biomechanical study supports the use of high-strength suture tape augmentation of Krackow transosseous repair for inferior pole patellar tendon rupture.
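The load-to-failure comparison above is reported only as group means, standard deviations, and sample sizes. As a rough illustration of how such a comparison can be checked from summary statistics alone, the sketch below runs a two-sample t-test on the reported values; the abstract does not state which test the authors used, so the resulting p-value only approximates the reported P = .567.

```python
# Sketch: two-sample t-test from the summary statistics reported in the abstract.
# The study's exact statistical test is not specified, so this only approximates
# the reported comparison (P = .567).
from scipy import stats

# Reported load-to-failure values (newtons): mean, standard deviation, sample size
augmented = (1006.5, 332.1, 11)   # suture tape-augmented Krackow repair
control = (932.8, 229.1, 7)       # standard Krackow transosseous repair

t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=augmented[0], std1=augmented[1], nobs1=augmented[2],
    mean2=control[0], std2=control[1], nobs2=control[2],
    equal_var=False,  # Welch's t-test; the pooled-variance version gives a similar result
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p is well above .05, consistent with the abstract
```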

4.
J Foot Ankle Surg ; 59(2): 286-290, 2020.
Article in English | MEDLINE | ID: mdl-32130992

ABSTRACT

The Ottawa ankle rules (OAR) indicate that any patient who is unable to take four steps or who has tenderness at either malleolus should receive diagnostic imaging for an acute ankle injury. Current trends indicate that health care providers tend to order more imaging in practice than the OAR deem necessary. The purpose of this study was to analyze the performance of the OAR in geriatric versus nongeriatric patients and, secondarily, to refine the guidelines for ankle imaging so that health care providers may feel comfortable adhering to them more strictly. A retrospective chart review was conducted of 491 adult patients with an average (± standard deviation) age of 54.4 ± 21.6 years (range 18 to 96). Applying the current OAR resulted in a sensitivity of 98.2% and a specificity of 58.6% in the entire cohort. The calculated sensitivities were comparable between the nongeriatric and geriatric cohorts, at 98.60% and 97.99%, respectively, whereas the specificities differed, at 60.13% and 33.33%, respectively. We propose new guidelines that would mandate imaging studies for any patient ≥65 years of age presenting to the emergency department with ankle pain. When applying these proposed guidelines, the sensitivity for the entire study population improved to 99.0%, whereas the specificity dropped to 56.7%. The slight decrease in specificity was deemed acceptable because these guidelines are meant to be used as a screening tool and because the risk of the OAR failing to identify an ankle fracture (2% of geriatric fractures) was completely mitigated in the geriatric population.
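The sensitivity and specificity figures above follow the standard 2x2 definitions: sensitivity = TP / (TP + FN) and specificity = TN / (TN + FP). The abstract does not report the underlying cell counts, so the sketch below uses hypothetical numbers purely to show the arithmetic behind values such as the 98.2% sensitivity and 58.6% specificity.

```python
# Sketch: sensitivity/specificity from a 2x2 table.
# The counts below are hypothetical; the abstract reports only the resulting
# percentages (e.g., 98.2% sensitivity, 58.6% specificity for the full cohort).

def sensitivity(tp: int, fn: int) -> float:
    """Proportion of true fractures flagged for imaging by the rule."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Proportion of fracture-free patients correctly spared imaging."""
    return tn / (tn + fp)

# Hypothetical split of the 491 patients: 110 fractures (108 flagged, 2 missed),
# 381 non-fractures (223 spared imaging, 158 flagged anyway).
tp, fn, tn, fp = 108, 2, 223, 158
print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # ~98.2%
print(f"specificity = {specificity(tn, fp):.1%}")  # ~58.5%
```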


Subjects
Aging, Ankle Fractures/diagnosis, Ankle Injuries/diagnosis, Ankle Joint/diagnostic imaging, Hospital Emergency Service/statistics & numerical data, Adolescent, Adult, Aged, Aged 80 and over, Female, Humans, Male, Middle Aged, ROC Curve, Retrospective Studies, Young Adult
5.
Am J Sports Med ; 47(6): 1294-1301, 2019 05.
Article in English | MEDLINE | ID: mdl-30995074

ABSTRACT

BACKGROUND: The use of artificial turf in American football continues to grow in popularity, yet the effect of these playing surfaces on athletic injuries remains controversial. Knee injuries account for a significant portion of injuries in National Collegiate Athletic Association (NCAA) football; however, the effect of artificial surfaces on knee injuries remains ill-defined. HYPOTHESIS: There is no difference in the rate or mechanism of knee ligament and meniscal injuries during NCAA football events on natural grass versus artificial turf playing surfaces. STUDY DESIGN: Descriptive epidemiology study. METHODS: The NCAA Injury Surveillance System Men's Football Injury and Exposure Data Sets for the 2004-2005 through 2013-2014 seasons were analyzed to determine the incidence of anterior cruciate ligament (ACL), posterior cruciate ligament (PCL), medial collateral ligament (MCL), medial meniscal, and lateral meniscal tears. Injury rates were calculated per 10,000 athlete-exposures, and rate ratios (RRs) were used to compare injury rates during practices and competitions on natural grass and artificial turf in NCAA football as a whole and by competition level (Division I versus Divisions II and III). Mechanisms of injury were tabulated for each injury type on natural grass and artificial turf. RESULTS: A total of 3,009,205 athlete-exposures and 2,460 knee injuries were reported from 2004 to 2014: 1,389 MCL, 522 ACL, 269 lateral meniscal, 164 medial meniscal, and 116 PCL. Athletes experienced all knee injuries at a significantly higher rate during competitions than during practices. Athletes competing on artificial turf experienced PCL injuries at 2.94 times the rate of those playing on grass (RR = 2.94; 95% CI, 1.61-5.68). When stratified by competition level, Division I athletes competing on artificial turf experienced PCL injuries at 2.99 times the rate of those playing on grass (RR = 2.99; 95% CI, 1.39-6.99), and athletes in the lower NCAA divisions (II and III) experienced ACL injuries at 1.63 times the rate (RR = 1.63; 95% CI, 1.10-2.45) and PCL injuries at 3.13 times the rate (RR = 3.13; 95% CI, 1.14-10.69) on artificial turf compared with grass. There was no statistically significant difference in the rate of MCL, medial meniscal, or lateral meniscal injuries on artificial turf versus grass when stratified by event type or level of NCAA competition. No difference was found in the mechanisms of knee injuries on natural grass versus artificial turf. CONCLUSION: Artificial turf is an important risk factor for specific knee ligament injuries in NCAA football. Injury rates for PCL tears were significantly increased during competitions played on artificial turf compared with natural grass, and the lower NCAA divisions (II and III) also showed higher rates of ACL injuries during competitions on artificial turf versus natural grass.
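The injury rates and rate ratios above follow a standard construction: rate = injuries / athlete-exposures × 10,000, RR = rate on turf / rate on grass, with a 95% CI obtained on the log scale. The sketch below shows that arithmetic with clearly hypothetical counts, since the abstract reports only the derived rate ratios (e.g., PCL in competition: RR = 2.94; 95% CI, 1.61-5.68).

```python
# Sketch: injury rate per 10,000 athlete-exposures (AEs) and rate ratio with a
# log-normal 95% CI. The counts here are hypothetical; the abstract reports only
# the derived rate ratios, not the surface-specific injury and exposure counts.
import math

def rate_per_10k(injuries: int, exposures: int) -> float:
    return injuries / exposures * 10_000

def rate_ratio_ci(a: int, n_a: int, b: int, n_b: int, z: float = 1.96):
    """Rate ratio of group A vs group B with a log-scale (Poisson) 95% CI."""
    rr = rate_per_10k(a, n_a) / rate_per_10k(b, n_b)
    se_log = math.sqrt(1 / a + 1 / b)          # SE of ln(RR) for Poisson counts
    lo, hi = (rr * math.exp(s * z * se_log) for s in (-1, 1))
    return rr, lo, hi

# Hypothetical PCL injury counts during competitions
turf_injuries, turf_aes = 30, 500_000
grass_injuries, grass_aes = 11, 540_000
rr, lo, hi = rate_ratio_ci(turf_injuries, turf_aes, grass_injuries, grass_aes)
print(f"RR = {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```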


Subjects
American Football/injuries, Knee Injuries/epidemiology, Poaceae, Anterior Cruciate Ligament, Anterior Cruciate Ligament Injuries/epidemiology, Athletes, Athletic Injuries/epidemiology, Humans, Incidence, Male, Tibial Menisci, Posterior Cruciate Ligament/injuries, Risk Factors, Seasons, Tibial Meniscus Injuries/epidemiology, United States/epidemiology, Universities
6.
Eur J Orthop Surg Traumatol ; 29(6): 1319-1323, 2019 Aug.
Article in English | MEDLINE | ID: mdl-30963325

ABSTRACT

INTRODUCTION: Opioids are commonly used for postoperative pain control. Diabetic patients with ankle fractures are known to experience prolonged healing and higher risks of hardware failure and infection. However, the opioid requirements of this patient cohort have not been previously evaluated. Thus, the purpose of this study was to retrospectively compare opioid utilization among ankle fracture patients with and without diabetes mellitus (DM). METHODS: IRB approval was obtained for a retrospective review of patients who presented with an ankle fracture and underwent surgery between November 2013 and January 2017. A total of 180 patients (144 without DM, 36 with DM) with a mean age of 50 years (± 18 years) were included. Opioid consumption was quantified using a morphine-milliequivalent conversion algorithm. A repeated-measures ANOVA was conducted to compare opioid consumption, with a two-tailed p value of 0.05 set as the threshold for statistical significance. RESULTS: Repeated-measures ANOVA revealed a statistically significant decrease in total opioid consumption over the 4-month study period (p < 0.001). The model demonstrated a mean difference in opioid consumption of -214.3 morphine meq between the patients without and with DM (p = 0.022). Post hoc pairwise comparisons revealed lower opioid consumption among nondiabetic patients at 2 months (-418.5 meq; p = 0.009), 3 months (-355.6 meq; p = 0.021), and 4 months (-152.6 meq; p = 0.006) after surgery. CONCLUSION: Our study revealed increased opioid consumption among diabetic patients treated surgically for ankle fractures. With increasing efforts aimed at reducing opioid administration, orthopaedic surgeons should be aware of the higher opioid consumption in this patient cohort. Further studies are needed to verify these results.
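The abstract quantifies opioid use with a morphine-equivalent conversion but does not give the algorithm itself. The sketch below illustrates the usual approach, multiplying each opioid's total milligram dose by a published conversion factor and summing; the factors shown are commonly cited values and are an assumption, not the study's own conversion table.

```python
# Sketch: converting mixed opioid prescriptions to morphine equivalents.
# The conversion factors below are commonly cited values (not taken from the
# study, whose specific algorithm is not reported in the abstract).
MORPHINE_EQUIV_FACTOR = {
    "morphine": 1.0,
    "hydrocodone": 1.0,
    "oxycodone": 1.5,
    "hydromorphone": 4.0,
    "codeine": 0.15,
    "tramadol": 0.1,
}

def total_morphine_equivalents(prescriptions: list[tuple[str, float]]) -> float:
    """Sum morphine equivalents over (drug, total mg dispensed) pairs."""
    return sum(mg * MORPHINE_EQUIV_FACTOR[drug] for drug, mg in prescriptions)

# Hypothetical postoperative course: two oxycodone fills and one hydrocodone fill
example = [("oxycodone", 5 * 60), ("oxycodone", 5 * 40), ("hydrocodone", 10 * 30)]
print(total_morphine_equivalents(example))  # 1050.0 morphine equivalents
```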


Subjects
Opioid Analgesics, Ankle Fractures/surgery, Diabetes Mellitus/epidemiology, Fracture Fixation/adverse effects, Postoperative Pain/drug therapy, Postoperative Complications, Opioid Analgesics/administration & dosage, Opioid Analgesics/adverse effects, Ankle Fractures/epidemiology, Comorbidity, Drug Utilization Review, Female, Fracture Fixation/methods, Humans, Male, Middle Aged, Pain Management/methods, Pain Management/statistics & numerical data, Postoperative Complications/diagnosis, Postoperative Complications/etiology, Prescription Drug Overuse/prevention & control, Retrospective Studies
7.
J Am Coll Surg ; 214(4): 717-23; discussion 723-5, 2012 Apr.
Article in English | MEDLINE | ID: mdl-22463915

ABSTRACT

BACKGROUND: Laparoscopic-assisted hepatic resection (LAHR) has been described as a safe and reliable means of liver resection for tumors or live-donor hepatectomy. Here we compare outcomes in paired cohorts of patients undergoing open hepatic resection (OHR) and LAHR. STUDY DESIGN: Two hundred twelve patients who underwent either OHR or LAHR from March 2004 to July 2011 were analyzed; during this period, 124 patients underwent OHR and 88 underwent LAHR. Demographic and outcomes data were assessed. RESULTS: Mean age was similar in the two surgical arms, as was mean BMI, and the distribution of minor versus major hepatic resections did not differ between the arms (p = 0.52). The mean operative duration was 234 minutes in the OHR arm, comparable with 238 minutes in the LAHR arm (p = 0.75). There was also no difference in mean lesion size between the OHR (5.72 cm) and LAHR (5.37 cm) groups (p = 0.55), and complication rates did not differ, at 10.5% (OHR) and 6.8% (LAHR) (p = 0.59). However, length of stay differed significantly between the two arms: patients in the OHR arm had longer stays than those in the LAHR arm (7.59 days vs 6.30 days; mean difference 1.29 days; 95% CI, 0.08-2.5; p = 0.036). CONCLUSIONS: Reduced surgical pain, improved cosmesis, and shortened hospital stays have been shown to correlate with laparoscopic abdominal procedures, and our study indicates that these advantages also extend to patients undergoing LAHR. These findings support the use of LAHR and highlight the value of adding this technique to the liver surgeon's skill set.
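The length-of-stay result is the one statistically significant difference, reported as a mean difference of 1.29 days with a 95% CI of 0.08-2.5. The abstract gives group means and sizes but not standard deviations, so the sketch below uses assumed SDs solely to illustrate how such a confidence interval for a difference in means is formed.

```python
# Sketch: 95% CI for a difference in means (OHR minus LAHR length of stay).
# Group means and sizes come from the abstract; the standard deviations are
# assumed for illustration, so the interval will not exactly match 0.08-2.5.
import math
from scipy import stats

mean_ohr, n_ohr, sd_ohr = 7.59, 124, 4.8    # SD assumed
mean_lahr, n_lahr, sd_lahr = 6.30, 88, 4.0  # SD assumed

diff = mean_ohr - mean_lahr                              # 1.29 days, as reported
se = math.sqrt(sd_ohr**2 / n_ohr + sd_lahr**2 / n_lahr)  # Welch standard error
# Welch-Satterthwaite degrees of freedom
df = se**4 / ((sd_ohr**2 / n_ohr) ** 2 / (n_ohr - 1) + (sd_lahr**2 / n_lahr) ** 2 / (n_lahr - 1))
t_crit = stats.t.ppf(0.975, df)
print(f"difference = {diff:.2f} days "
      f"(95% CI, {diff - t_crit * se:.2f} to {diff + t_crit * se:.2f})")
```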


Subjects
Hepatectomy/methods, Laparoscopy, Cohort Studies, Female, Humans, Length of Stay/statistics & numerical data, Liver Neoplasms/surgery, Living Donors, Male, Middle Aged, Outcome Assessment (Health Care), Postoperative Complications/epidemiology, Retrospective Studies