Results 1 - 20 of 31
1.
Cureus ; 16(4): e58218, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38745797

ABSTRACT

STUDY DESIGN: This study is an ambispective evaluation and analysis of a single-center cohort. OBJECTIVE: This study aimed to evaluate the performance of a novel biphasic calcium phosphate (BCP) bone graft with submicron-sized needle-shaped surface topography (BCP<µm) in interbody arthrodesis of the lumbar spine. METHODS: This study was a single-center ambispective assessment of adult patients receiving BCP<µm as part of their lumbar interbody fusion surgery. The primary outcome was fusion status on computed tomography (CT) 12 months postoperatively. The secondary outcomes included postoperative changes in the visual analog scale (VAS), Oswestry Disability Index (ODI), Short Form 12 (SF-12), and length of stay (LOS). RESULTS: Sixty-three patients with one- to three-level anterior (48, 76%) and lateral (15, 24%) interbody fusions with posterior instrumentation were analyzed. Thirty-one participants (49%) had three or more comorbidities; comorbidities included heart disease (43 participants, 68%), obesity (31 participants, 49%), and previous lumbar surgery (23 participants, 37%). The mean ODI decreased by 24 points. The mean SF-12 physical health and mental health scores improved by 11.5 and 6.3, respectively. The mean VAS scores for the left leg, right leg, and back improved by 25.75, 22.07, and 37.87, respectively. Of 101 levels, 91 (90%) demonstrated complete bridging trabecular bone fusion with no evidence of supplemental fixation failure. CONCLUSION: These data on BCP<µm in interbody fusion for degenerative disease of the lumbar spine provide evidence of fusion in a complicated cohort of patients.

2.
Int J Spine Surg ; 17(2): 230-240, 2023 Apr.
Article in English | MEDLINE | ID: mdl-37028803

ABSTRACT

BACKGROUND: Over the past 20 years, multiple randomized controlled trials have shown cervical disc arthroplasty (CDA) to be safe and effective for treating 1- and 2-level degenerative disc disease (DDD). The purpose of this postmarket study is to compare 10-year outcomes between CDA and anterior cervical discectomy and fusion (ACDF) from a randomized study at 3 centers. METHODS: This study was a continuation of a randomized, prospective, multicenter clinical trial comparing CDA with the Mobi-C cervical disc (Zimmer Biomet) vs ACDF. Following completion of the 7-year US Food and Drug Administration study, 10-year follow-up was obtained from consenting patients at 3 high-enrolling centers. The clinical and radiographic endpoints collected at 10 years included composite success, Neck Disability Index, neck and arm pain, Short Form-12, patient satisfaction, adjacent-segment pathology, major complications, and subsequent surgery. RESULTS: A total of 155 patients were enrolled (105 CDA; 50 ACDF). Follow-up was obtained from 78.1% of patients eligible after 7 years. At 10 years, CDA demonstrated superiority to ACDF. Composite success was 62.4% in CDA and 22.2% in ACDF (P < .0001). The cumulative risk of subsequent surgery at 10 years was 7.2% vs 25.5% (P = .001), and the risk of adjacent-level surgery was 3.1% vs 20.5% (P = .0005) in CDA vs ACDF, respectively. The progression to radiographically significant adjacent-segment pathology at 10 years was lower in CDA vs ACDF (12.9% vs 39.3%; P = .006). At 10 years, patient-reported outcomes and change from baseline were generally better in CDA patients. A higher percentage of CDA patients reported they were "very satisfied" at 10 years (98.7% vs 88.9%; P = .05). CONCLUSIONS: In this postmarket study, CDA was superior to ACDF for treating symptomatic cervical DDD. CDA was statistically superior to ACDF for clinical success, subsequent surgery, and neurologic success. Results through 10 years demonstrate that CDA continues to be a safe and effective surgical alternative to fusion. CLINICAL RELEVANCE: The results of this study support the long-term safety and effectiveness of cervical disc arthroplasty with the Mobi-C.

3.
Int J Spine Surg ; 15(6): 1103-1114, 2021 Dec.
Article in English | MEDLINE | ID: mdl-35086867

ABSTRACT

BACKGROUND: Interbody fusion is a widely utilized and accepted procedure to treat advanced debilitating lumbar degenerative disc disease (DDD). Increasingly, surgeons are seeking interbody devices that are large for stability and grafting purposes but can be inserted with less invasive techniques. To achieve these competing objectives, a novel, conformable mesh interbody fusion device was designed to be placed in the disc space through a small portal and filled with bone graft in situ to a large size. This design can reduce the risk of trauma to surrounding structures while creating a large graft footprint that intimately contours to the patient's own anatomy. The purpose of this Investigational Device Exemption (IDE) trial was to evaluate the perioperative and long-term results of this novel conformable mesh interbody fusion device. METHODS: This investigation is a prospective, multicenter, single-arm, Food and Drug Administration and Institutional Review Board-approved IDE, performance goal trial. A total of 102 adults presenting with DDD at a single level between L2 and S1 and unresponsive to 6 months of conservative care had instrumented lumbar interbody fusion. Validated assessment tools included the 100 mm visual analog scale (VAS) for pain, the Oswestry Disability Index (ODI) for function, a single-question survey for patient satisfaction, and computed tomography (CT) scan for fusion. Patients were enrolled across 10 geographically distributed sites. Pain/ODI surveys, physical evaluations, and imaging were performed serially through 24 months. Specifically, CT was performed at 12 and, if not fused, 24 months. Independent radiologists assessed CTs for fusion. An independent committee adjudicated adverse events. Patients with complete data at 24 months were included in the analysis. RESULTS: Ninety-six patients (94% follow-up rate; age 57.0 ± 12.0 years, 50.0% female, body mass index 30.6 ± 4.9) reported an average decrease in low back pain from baseline of 45.0 ± 26.6 at 6 weeks and 51.4 ± 26.2 at 24 months. Right/left leg pain was reduced by 28.9 ± 36.7/37.8 ± 32.4 at 6 weeks and 30.5 ± 33.0/40.3 ± 34.6 at 24 months. Mean ODI improved 17.1 ± 18.7 from baseline to 6 weeks and 32.0 ± 18.5 by 24 months. At 24 months, 91.7% of patients rated their procedure as excellent/good. Fusion rates were 97.9% (94/96) at 12 months and 99% (95/96) at 24 months. Mean operative time, estimated blood loss, and length of stay were 2.6 ± 0.9 hours, 137 ± 217 mL, and 2.3 ± 1.2 days, respectively. No device-related serious adverse events occurred. CONCLUSIONS: Clinically significant outcomes for pain, function, fusion, and device safety were demonstrated in this population. Substantial clinical improvements occurred by 6 weeks postoperatively and continued to improve through 24 months. The successful outcomes observed in this trial support use of this novel device in instrumented lumbar interbody fusion. LEVEL OF EVIDENCE: 3. CLINICAL RELEVANCE: This report substantiates the preliminary 1-year findings published earlier for this investigation and confirms that the fusion rates and patient improvements reported are sustained through 2 years.

4.
Int J Spine Surg ; 14(s3): S108-S114, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33122177

ABSTRACT

INTRODUCTION: The objectives of this paper were to identify and explain specific design factors for lumbar interbody fusion (IBF) devices that can influence bone exchange and stability at the vertebral endplate interface and to provide supporting evidence of these factors through both laboratory and clinical data. The laboratory study (Part 1) compared the pressure profiles and contact areas for a minimally invasive, expandable, and conformable porous mesh (CPM) IBF device and a rigid monolithic lateral PEEK cage (LPC). Furthermore, to demonstrate how these laboratory results translate clinically, a quantitative and qualitative assessment of subject x-rays and computed tomography (CT) scans from a US Food and Drug Administration (FDA) investigational device exemption (IDE) trial of the CPM was performed (Part 2). METHODS: Part 1: Load profile testing. Either CPM or LPC was sandwiched between 2 flat or shaped Grade 15 foam blocks. Each implant type was compressed at a rate of 0.1 mm/s for 3 loads (1100, 2000, or 3000 N). Device and bone graft contact area were analyzed for each test condition, and corresponding load profiles were quantified and mapped using pressure film. Part 2: Radiographic fusion assessment. Two independent radiologists analyzed 12- and 24-month motion studies and CTs for fusion, defined as bridging bone across the intervertebral space. The same CTs were assessed for qualitative biomechanical signs of bone healing. RESULTS: CPM demonstrated significant direct loading on the bone graft across all tested loading conditions, while the LPC graft registered a negligible amount of pressure at only the extreme load of 3000 N. Contact area was in turn statistically greater (P < .05) for CPM. CPM fusion rates were 97.9% and 99% at 12 and 24 months, respectively. Radiographic signs of bone healing are described in terms of radiating bone struts and regions of greater intensity. CONCLUSIONS: CPM allows for an optimized contact area for bone exchange and graft incorporation. The load profiles demonstrate widespread load sharing across the device. The expandable, compliant, porous mesh provides a unique area for bone exchange, contributing to qualitative biomechanical radiographic evidence of bone healing that ultimately leads to clinically acceptable fusion rates as observed in the FDA IDE trial.
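The pressure-film workflow described above reduces to thresholding a calibrated pressure map and integrating over the loaded pixels. The sketch below illustrates that computation; the array contents, scan resolution, and sensitivity threshold are hypothetical placeholders, not values from the study.

```python
import numpy as np

# Illustrative pressure-map analysis: given a calibrated pressure-film scan
# sampled onto a grid, compute the loaded contact area, mean contact
# pressure, and total load. All numbers here are hypothetical.
rng = np.random.default_rng(0)
pressure_map_mpa = rng.uniform(0.0, 4.0, size=(200, 300))  # pressure per pixel
pixel_area_mm2 = 0.05 * 0.05      # assumed 50 um scan resolution per pixel
threshold_mpa = 0.5               # assumed film sensitivity floor

loaded = pressure_map_mpa > threshold_mpa
contact_area_mm2 = loaded.sum() * pixel_area_mm2
mean_pressure_mpa = pressure_map_mpa[loaded].mean()
total_load_n = (pressure_map_mpa[loaded] * pixel_area_mm2).sum()  # MPa*mm^2 = N

print(f"Contact area: {contact_area_mm2:.1f} mm^2, "
      f"mean pressure: {mean_pressure_mpa:.2f} MPa, "
      f"total load: {total_load_n:.0f} N")
```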

5.
Int J Spine Surg ; 14(s3): S121-S132, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33122180

ABSTRACT

BACKGROUND: Extended polyethylene terephthalate (PET, Dacron) mesh can provide containment of compressed particulate allograft and autograft. This study assessed whether PET mesh would interfere with osteoprogenitor cell migration from the vertebral endplates through particulate graft, as well as its effect on osteoblast differentiation and the quality of bone forming within fusing vertebrae during vertebral interbody fusion. METHODS: The impact of PET mesh on the biological response of normal human osteoblasts (NHOst cells) and bone marrow stromal cells (MSCs) to particulate bone graft was examined in vitro. Cells were cultured on rat bone particles with or without mesh; proliferation and osteoblast differentiation were assessed. The interface between the vertebral endplate, PET mesh, and newly formed bone within consolidated allograft contained by the mesh was examined in a sheep model via microradiographs, histology, and mechanical testing. RESULTS: Growth on bone particles stimulated proliferation and early differentiation of NHOst cells and MSCs but delayed terminal differentiation. This was not negatively impacted by the mesh. New bone formation in vivo was not prevented by use of a PET mesh graft containment device. Fusion was improved in sites containing allograft/demineralized bone matrix (DBM) versus autograft and was further enhanced when stabilized using pedicle screws. Only sites treated with allograft/DBM plus screws exhibited greater percent bone ingrowth versus discectomy or autograft. These results were mirrored biomechanically. CONCLUSIONS: PET mesh does not negatively impact cell attachment to particulate bone graft, proliferation, or initial osteoblast differentiation. The results demonstrated that bone growth occurs from the vertebral endplates into graft material within the PET mesh. This was enhanced by stabilization with pedicle screws, leading to greater bone ingrowth and biomechanical stability across the fusion site. CLINICAL RELEVANCE: The use of extended PET mesh allows containment of bone graft material during vertebral interbody fusion without inhibiting the migration of osteoprogenitor cells from the vertebral endplates needed to achieve fusion. LEVEL OF EVIDENCE: 5.

6.
Int J Spine Surg ; 14(s3): S115-S120, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33122181

ABSTRACT

BACKGROUND: A successful intervertebral fusion requires biomechanical stability created by the structural support of the interbody device and loading of the bone graft material to accelerate mechanotransduction and bone remodeling. The objective of this study was to generate a quantitative map of the contact area and stress profile for 2 implant designs: a rigid monolithic polyetheretherketone (PEEK) lateral cage (MPLC) and a unique hybrid interbody design that includes PEEK terminal supports surrounding an expandable porous mesh (P+EPM) serving to contain bone graft. METHODS: The construct for each test consisted of a device sandwiched between 2 flat or shaped Grade 15 foam blocks. Pressure-sensitive film and thin film sensors were placed between the device and each of the foam blocks. A series of each implant type was compressed at a rate of 0.1 mm/s for 2 loads (1100 N and 2000 N), with and without bone graft. Device and bone graft contact area were analyzed for each test condition, and the corresponding load profiles were quantified and mapped. RESULTS: P+EPM demonstrated 34% greater graft volume than MPLC, resulting in a 28% larger area for bone exchange when filled. The load profiles for all applied loading paradigms demonstrated significant direct loading on the bone graft contained within the P+EPM mesh, resulting in at least 170% greater loaded area than MPLC. Furthermore, the P+EPM demonstrated load sharing with the terminal PEEK supports. MPLC demonstrated negligible bone graft loading under all loading conditions. CONCLUSIONS: P+EPM allows for an optimized contact area for bone exchange and graft incorporation. The load profiles confirmed that the filled mesh does not stress-shield the terminal PEEK supports and will load share. The expandable, compliant, porous mesh provides a greater multiplanar area for bone exchange and allows for direct contact with the viscoelastic vertebral endplates, improving the endplate-graft interface mechanics.

7.
Int J Spine Surg ; 14(3): 269-277, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32699747

ABSTRACT

BACKGROUND: Adjacent segment pathology (ASP) remains a concern following treatment with cervical disc arthroplasty (CDA) and anterior cervical discectomy and fusion (ACDF). Radiographic ASP (RASP) is ASP identified on imaging, which may or may not include clinical symptoms. The risk factors for the development of RASP and its clinical effects remain controversial. In part 1 of a 2-part publication, we evaluate the incidence and predictors of RASP and determine whether any association exists between RASP and patient-reported outcomes (PROs). METHODS: Data were prospectively collected during a US Food and Drug Administration randomized, multicenter, investigational device exemption trial comparing CDA (Mobi-C; Zimmer Biomet, Westminster, CO) with ACDF. Multiple post hoc analyses were conducted on RASP as it related to demographics and patient outcomes. Kaplan-Meier estimates of time to Kellgren-Lawrence (K-L) grade 3/4 were calculated separately for all groups. Multivariate Cox proportional hazards models were used to analyze whether RASP was associated with preoperative patient demographic characteristics and preoperative and postoperative radiographic characteristics. The association of RASP with PROs was analyzed using generalized estimating equations and a matched, retrospective cohort analysis. RESULTS: The incidence of grade 3/4 RASP was lower for patients treated with CDA when the initial treatment was at 1 level (27% vs 47%, P < .0001) and at 2 levels (14% vs 49%, P < .0001). Kaplan-Meier estimates indicated a significantly lower probability of grade 3/4 RASP over time for patients receiving CDA (P < .001). Treatment with ACDF, treatment of 1 level, older age, higher body mass index, higher preoperative physical component score, and a lower Cobb angle were associated with elevated risk of grade 3/4 RASP. CDA was shown to be more effective than ACDF (64.4%; 95% CI = 50.9, 74.2; P < .0001) at preventing RASP. CONCLUSIONS: The incidence and risk of RASP are decreased when patients are treated with CDA compared with ACDF. Although the mechanism by which CDA generates this protective effect is not understood, PROs remain unaffected through 7 years despite changes in RASP.
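For readers unfamiliar with the survival methods named above, the sketch below shows how a Kaplan-Meier estimate of time to grade 3/4 RASP and a multivariate Cox proportional hazards model are typically set up with the Python lifelines package. The toy DataFrame and its column names are invented for illustration; the trial's data are not reproduced here.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Toy data: time (months) to K-L grade 3/4 RASP, whether it was observed,
# and baseline covariates. All values below are fabricated.
df = pd.DataFrame({
    "months_to_grade34": [24, 60, 84, 84, 36, 84, 48, 72],
    "grade34_observed":  [1, 1, 0, 0, 1, 0, 1, 1],   # 1 = reached grade 3/4
    "treatment_acdf":    [1, 0, 1, 0, 1, 0, 1, 0],   # 1 = ACDF, 0 = CDA
    "age":               [45, 52, 61, 38, 57, 49, 44, 55],
    "bmi":               [27.1, 31.4, 24.8, 29.0, 33.2, 26.5, 30.8, 28.2],
})

# Kaplan-Meier estimate of freedom from grade 3/4 RASP for the CDA arm.
km = KaplanMeierFitter()
cda = df[df["treatment_acdf"] == 0]
km.fit(cda["months_to_grade34"], event_observed=cda["grade34_observed"])
print(km.survival_function_)

# Multivariate Cox proportional hazards model relating baseline
# characteristics to the hazard of developing grade 3/4 RASP.
cox = CoxPHFitter()
cox.fit(df, duration_col="months_to_grade34", event_col="grade34_observed")
cox.print_summary()
```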

8.
Int J Spine Surg ; 14(3): 278-285, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32699748

ABSTRACT

BACKGROUND: Adjacent segment pathology (ASP) following cervical disc arthroplasty (CDA) or anterior cervical discectomy and fusion (ACDF) is identified by imaging (RASP) or clinical symptoms (CASP). Clinical symptoms of CASP have been broadly defined, but subsequent adjacent-level surgeries are clear indicators of CASP. The current literature remains inconsistent on the incidence and potential predictors of CASP. Here, we evaluate a robust data set for the incidence of CASP resulting in subsequent surgery, attempt to identify factors that might affect CASP, and analyze the association of CASP with patient-reported outcomes (PROs) and RASP. METHODS: Data were prospectively collected during a US Food and Drug Administration randomized, multicenter, investigational device exemption trial comparing CDA (Mobi-C; Zimmer Biomet, Westminster, CO) with ACDF. CASP was defined as any adjacent-level subsequent surgical intervention. Post hoc analyses were conducted on the incidence of CASP, time to CASP diagnosis, and the relationship of CASP with patient demographics. Longitudinal retrospective case-control analysis was used to assess the correlation of CASP with PROs and radiographic adjacent segment pathology (RASP). RESULTS: Kaplan-Meier estimates indicated a significantly lower probability of CASP over time for 1-level (P = .002) and 2-level (P = .008) CDA patients. Treatment with ACDF and younger age were associated with higher CASP risk. CDA was more effective than ACDF (70.5%; 95% CI = 45.1, 84.2; P < .0001) at preventing CASP. Case-control analysis indicated an increased probability of CASP for patients with grade 3/4 RASP, but the difference was not statistically significant. When CASP patients were pooled, the median grade of RASP at the visit prior to surgery was 1, with only 6 patients presenting with grade 3/4 RASP. CONCLUSIONS: Patients treated with CDA have a lower incidence of CASP than patients treated with ACDF, although the mechanism remains unclear. CASP and RASP remain uncorrelated in this large data set, but other predictive variables such as treatment, age, and number of levels should be further investigated.

9.
Int J Spine Surg ; 12(3): 352-361, 2018 Jun.
Article in English | MEDLINE | ID: mdl-30276092

ABSTRACT

BACKGROUND: Heterotopic ossification (HO) is a known risk following cervical total disc replacement (CTDR) surgery, but the cause and effect of HO are not well understood. Reported HO rates vary, and few studies are specifically designed to report HO. The effects of HO on outcomes and the risk factors for its development have been hypothesized and reported in small-population, retrospective analyses using univariate statistics. METHODS: A post hoc, multiple-phase analysis of radiographic, clinical, and demographic data for CTDR as it relates to HO was performed. HO was radiographically graded for 164 one-level and 225 two-level CTDR patients using the McAfee and Mehren system. Analysis was performed to correlate HO grades with clinical outcomes and to evaluate potential risk factors for the development of HO using demographics and baseline clinical measures. RESULTS: At 7 years, 1-level clinically relevant HO grades were 17.6% grade 3 and 11.1% grade 4. Two-level clinically relevant HO grades, evaluated using the highest patient grade, were 26.6% grade 3 and 10.8% grade 4. The interaction between HO and time was significant for the Neck Disability Index (NDI; P = .04) and visual analog scale (VAS) neck pain (P = .02). When analyzed at each time point, NDI was significant at 48 to 84 months and VAS neck pain at 60 months. For predictors, 2 analyses were run: odds ratios indicated that follow-up visit, male sex, and preoperative VAS neck pain were related to HO development, whereas hazard ratios implicated male sex, obesity, endplate coverage, levels treated, and preoperative VAS neck pain. CONCLUSIONS: This is the largest study to report HO rates and related outcomes and risk factors. To develop an accurate predictive model, further large-scale analyses need to be performed. Based on the results reported here, clinically relevant HO should be more accurately described as motion-restricting HO until a definitive link to outcomes has been established.

10.
Neurosurgery ; 83(6): 1087-1106, 2018 12 01.
Article in English | MEDLINE | ID: mdl-29325074

ABSTRACT

Cervical total disc replacement (cTDR) is still considered a developing technology, with widespread clinical use beginning in the early 2000s. Despite being relatively new to the marketplace, the literature surrounding cTDR is abundant. We conducted a thorough review of literature published in the United States (US) and outside the US to report the current global state of cTDR research and clinical use. Search criteria were restricted to publications with a clinical patient population, excluding finite element analyses, biomechanical studies, cadaver studies, surgical technique-specific papers, and case studies. US publications mostly encompass the results of the highly controlled Food and Drug Administration Investigational Device Exemption trials. The predominantly level I evidence in the US literature supports the use of cTDR at 1 and 2 surgical levels when compared with anterior cervical discectomy and fusion. In general, studies from outside the US have smaller patient populations, are rarely controlled, and include broader surgical indications. Though these studies provide lower levels of evidence, they serve to advance patient indications for the use of cTDR. Complications such as secondary surgery, heterotopic ossification, and adjacent segment degeneration also remain a focus of studies. Other external challenges facing cTDR technology include regulatory restrictions and health economics, both of which are beginning to be addressed. Combined, the evidence for cTDR is robust, supporting a variety of clinical indications.


Subject(s)
Total Disc Replacement/methods , Cervical Vertebrae/surgery , Humans , Intervertebral Disc Degeneration/surgery , United States
11.
Neurosurg Focus ; 43(6): E11, 2017 Dec.
Article in English | MEDLINE | ID: mdl-29191102

ABSTRACT

OBJECTIVE The aim of this study was to educate medical professionals about the potential financial impacts of improper diagnosis-related group (DRG) coding in adult spinal deformity (ASD) surgery. METHODS Medicare's Inpatient Prospective Payment System PC Pricer database was used to collect 2015 reimbursement data for ASD procedures from 12 hospitals. Case type, hospital type/location, number of operative levels, proper coding, length of stay, and complications/comorbidities (CCs) were analyzed for effects on reimbursement. DRGs were used to categorize cases into 3 types: 1) anterior or posterior only fusion, 2) anterior fusion with posterior percutaneous fixation with no dorsal fusion, and 3) combined anterior and posterior fixation and fusion. RESULTS When institutions were pooled, single-level and multilevel ASD surgeries were reimbursed at the same rate. A longer stay, from 3 to 8 days, resulted in an additional $1400 per stay. Posterior fusion added $6588, while CCs increased reimbursement by approximately $13,000. Academic institutions received higher reimbursement than private institutions, by approximately $14,000 (Case Types 1 and 2) and approximately $16,000 (Case Type 3). Urban institutions received higher reimbursement than suburban institutions, by approximately $3000 (Case Types 1 and 2) and approximately $3500 (Case Type 3). A longer stay, from 3 to 8 days, increased reimbursement by between $208 and $494 for private institutions and between $1397 and $1879 for academic institutions per stay. CONCLUSIONS Reimbursement is based on many factors not controlled by surgeons or hospitals, but proper DRG coding can significantly impact the financial health of hospitals and the availability of quality patient care.
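To make the reimbursement adders above concrete, here is a toy model that stacks them onto a hypothetical base payment. It is illustrative only: actual DRG payment is computed by Medicare's PC Pricer from hospital-specific factors, and the base figure below is an assumption, not a number from the study.

```python
# Toy reimbursement model built from the adders quoted in the abstract above.
BASE = 30000.0  # hypothetical base payment for a Case Type 1/2 procedure at a
                # private, suburban hospital with a 3-day stay and no CCs

def estimated_reimbursement(posterior_fusion=False, has_cc=False,
                            academic=False, urban=False, extra_days=0,
                            case_type_3=False):
    total = BASE
    if posterior_fusion:
        total += 6588                    # posterior fusion adder (abstract)
    if has_cc:
        total += 13000                   # CCs added ~$13,000 (abstract)
    if academic:
        total += 16000 if case_type_3 else 14000   # academic vs private
    if urban:
        total += 3500 if case_type_3 else 3000     # urban vs suburban
    total += extra_days * (1400 / 5)     # ~$1400 more for a 3- vs 8-day stay
    return total

# Example: academic urban hospital, posterior fusion, CCs, 8-day stay.
print(f"${estimated_reimbursement(posterior_fusion=True, has_cc=True,"
      f"" if False else f"${estimated_reimbursement(posterior_fusion=True, has_cc=True, academic=True, urban=True, extra_days=5):,.0f}")
```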


Subject(s)
Congenital Abnormalities/surgery , Costs and Cost Analysis/statistics & numerical data , Diagnosis-Related Groups/statistics & numerical data , Length of Stay/economics , Medicare/economics , Adult , Humans , Inpatients/statistics & numerical data , Length of Stay/statistics & numerical data , United States
12.
Int J Spine Surg ; 10: 12, 2016.
Article in English | MEDLINE | ID: mdl-27162714

ABSTRACT

BACKGROUND: Bone graft material for lumbar fusion was historically autologous bone graft (ABG). In recent years, alternatives such as allograft, demineralized bone matrix (DBM), ceramics, and bone morphogenetic protein (BMP) have gained favor, although the complications of these are not fully understood. Bioactive amniotic suspension (BAS) with allograft is a new class of material derived from human amniotic tissue. METHODS: Eligible patients receiving a one- or two-level lumbar interbody fusion with NuCel, a BAS with allograft, were contacted and scheduled for a minimum 12-month follow-up visit. Patients were evaluated for fusion using CT and plain radiographs. Clinical outcomes, including ODI and VAS back and leg pain, were collected, as were comorbidities including BMI, smoking status, diabetes, and previous lumbar surgery. RESULTS: One-level patients (N = 38) were 71.1% female with a mean age of 58.4 ± 12.7 and mean BMI of 30.6 ± 6.08. Two-level patients (N = 34) were 58.8% female with a mean age of 49.3 ± 10.9 and mean BMI of 30.1 ± 5.82. Kinematic fusion was achieved in 97.4% of one-level patients and 100% of two-level patients. Baseline comorbidities were present in 89.5% of one-level patients and 88.2% of two-level patients. No adverse events related to BAS were reported in this study. CONCLUSION: Fusion status is evaluated with many different biologics and varying methods in the literature. BAS with allograft in this study demonstrated high fusion rates with no complications in a largely comorbid population. Although the population was small, the results with BAS with allograft were encouraging for one- and two-level lumbar interbody fusion. Further prospective studies should be conducted to investigate safety and efficacy in a larger population.

13.
Neurosurgery ; 79(1): 135-45, 2016 Jul.
Article in English | MEDLINE | ID: mdl-26855020

ABSTRACT

BACKGROUND: Cervical total disc replacement (cTDR) was developed to treat cervical degenerative disc disease while preserving motion. OBJECTIVE: The cost-effectiveness of this intervention was established at 2-year follow-up, and this update reevaluates our analysis over 5 years. METHODS: Data were derived from a randomized trial of 330 patients. Data from the 12-Item Short Form Health Survey were transformed into utilities by using the SF-6D algorithm. Costs were calculated by extracting diagnosis-related group codes and then applying 2014 Medicare reimbursement rates. A Markov model evaluated quality-adjusted life years (QALYs) for both treatment groups. Univariate and multivariate sensitivity analyses were conducted to test the stability of the model. The model adopted both societal and health system perspectives and applied a 3% annual discount rate. RESULTS: cTDR cost $1687 more than anterior cervical discectomy and fusion (ACDF) over 5 years. In contrast, cTDR had $34,377 less productivity loss compared with ACDF. There was a significant difference in the return-to-work rate (81.6% vs 65.4% for cTDR and ACDF, respectively; P = .029). From a societal perspective, the incremental cost-effectiveness ratio (ICER) for cTDR was -$165,103 per QALY. From a health system perspective, the ICER for cTDR was $8518 per QALY. In the sensitivity analysis, the ICER for cTDR remained below the US willingness-to-pay threshold of $50,000 per QALY in all scenarios (-$225,816 per QALY to $22,071 per QALY). CONCLUSION: This study is the first to report the comparative cost-effectiveness of cTDR vs ACDF for 2-level degenerative disc disease at 5 years. The authors conclude that, because of the negative ICER, cTDR is the dominant modality. ABBREVIATIONS: ACDF, anterior cervical discectomy and fusion; AWP, average wholesale price; CE, cost-effectiveness; CEA, cost-effectiveness analysis; CPT, Current Procedural Terminology; cTDR, cervical total disc replacement; CUA, cost-utility analysis; DDD, degenerative disc disease; DRG, diagnosis-related group; FDA, US Food and Drug Administration; ICER, incremental cost-effectiveness ratio; IDE, Investigational Device Exemption; NDI, Neck Disability Index; QALY, quality-adjusted life years; RCT, randomized controlled trial; RTW, return to work; SF-12, 12-Item Short Form Health Survey; VAS, visual analog scale; WTP, willingness to pay.
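The two ICERs quoted above follow from the same incremental QALY gain applied to different cost perspectives. A minimal back-calculation, assuming the QALY difference is the one implied by the reported health-system ICER (roughly 0.198 QALYs, which is inferred, not stated in the abstract):

```python
# Back-of-the-envelope ICER check using figures quoted in the abstract above.
delta_cost_health_system = 1687.0   # cTDR cost $1687 more than ACDF (5 years)
productivity_savings = 34377.0      # cTDR had $34,377 less productivity loss
delta_qaly = delta_cost_health_system / 8518.0  # ~0.198 QALYs, inferred

# Health-system perspective: direct costs only.
icer_health_system = delta_cost_health_system / delta_qaly
print(f"Health-system ICER: ${icer_health_system:,.0f}/QALY")  # ~$8,518/QALY

# Societal perspective adds productivity effects; the net incremental cost
# turns negative, which is why the reported societal ICER is negative
# (cTDR dominates: cheaper overall and more effective).
delta_cost_societal = delta_cost_health_system - productivity_savings
icer_societal = delta_cost_societal / delta_qaly
print(f"Societal ICER: ${icer_societal:,.0f}/QALY")  # ~ -$165,000/QALY
```

The societal figure computed this way lands within rounding of the reported -$165,103 per QALY, which suggests the two reported perspectives differ only in whether productivity loss is counted.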


Subject(s)
Cervical Vertebrae/surgery , Intervertebral Disc Degeneration/surgery , Spinal Fusion/economics , Total Disc Replacement/economics , Cost-Benefit Analysis , Female , Follow-Up Studies , Humans , Male , Middle Aged , Return to Work , Spinal Fusion/methods , Total Disc Replacement/methods , Treatment Outcome
15.
J Neurosurg Spine ; 22(1): 15-25, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25380538

ABSTRACT

OBJECT: The purpose of this study was to evaluate the safety and effectiveness of 2-level total disc replacement (TDR) using a Mobi-C cervical artificial disc at 48 months' follow-up. METHODS: A prospective, randomized, US FDA investigational device exemption pivotal trial of the Mobi-C cervical artificial disc was conducted at 24 centers in the US. Three hundred thirty patients with degenerative disc disease were randomized and treated with cervical total disc replacement (225 patients) or the control treatment, anterior cervical discectomy and fusion (ACDF) (105 patients). Patients were followed up at regular intervals for 4 years after surgery. RESULTS: At 48 months, both groups demonstrated improvement in clinical outcome measures and a comparable safety profile. Data were available for 202 TDR patients and 89 ACDF patients for the calculation of the primary endpoint. TDR patients had statistically significantly greater improvement than ACDF patients for the following outcome measures compared with baseline: Neck Disability Index scores, 12-Item Short Form Health Survey Physical Component Summary scores, patient satisfaction, and overall success. ACDF patients experienced higher subsequent surgery rates and displayed a higher rate of adjacent-segment degeneration as seen on radiographs. Overall, TDR patients maintained segmental range of motion through 48 months with no device failure. CONCLUSIONS: Four-year results from this study continue to support TDR as a safe, effective, and statistically superior alternative to ACDF for the treatment of degenerative disc disease at 2 contiguous cervical levels. Clinical trial registration no.: NCT00389597 (clinicaltrials.gov).


Subject(s)
Cervical Vertebrae/surgery , Diskectomy/methods , Intervertebral Disc Degeneration/surgery , Spinal Fusion/methods , Total Disc Replacement/methods , Adult , Cervical Vertebrae/diagnostic imaging , Disability Evaluation , Female , Follow-Up Studies , Humans , Intervertebral Disc Degeneration/diagnostic imaging , Male , Middle Aged , Patient Satisfaction , Prospective Studies , Radiography , Range of Motion, Articular , Total Disc Replacement/instrumentation , Treatment Outcome
16.
JAMA Surg ; 149(12): 1231-9, 2014 Dec.
Article in English | MEDLINE | ID: mdl-25321869

ABSTRACT

IMPORTANCE: Cervical total disc replacement (CTDR) was developed to treat cervical spondylosis while preserving motion. While anterior cervical discectomy and fusion (ACDF) has been the standard of care for 2-level disease, a randomized clinical trial (RCT) suggested similar outcomes. The cost-effectiveness of this intervention has never been elucidated. OBJECTIVE: To determine the cost-effectiveness of CTDR compared with ACDF. DESIGN, SETTING, AND PARTICIPANTS: Data were derived from an RCT that followed up 330 patients over 24 months. The original RCT consisted of multi-institutional data including private and academic institutions. For the current study, health states were constructed using linear regression based on stratification of the Neck Disability Index and a visual analog scale. Data from the 12-Item Short-Form Health Survey questionnaires were transformed into utility values using the SF-6D mapping algorithm. Costs were calculated by extracting Diagnosis-Related Group codes from institutional billing data and then applying 2012 Medicare reimbursement rates. The costs of complications and return-to-work data were also calculated. A Markov model was built to evaluate quality-adjusted life-years (QALYs) for both treatment groups. The model adopted a third-party payer perspective and applied a 3% annual discount rate. Patients included in the original RCT had to be diagnosed as having radiculopathy or myeloradiculopathy at 2 contiguous levels from C3-C7 that was unresponsive to conservative treatment for at least 6 weeks or demonstrated progressive symptoms. MAIN OUTCOMES AND MEASURES: Incremental cost-effectiveness ratio of CTDR compared with ACDF. RESULTS: A strong correlation (R2 = 0.6864; P < .001) was found by projecting a visual analog scale onto the Neck Disability Index. Cervical total disc replacement had an average of 1.58 QALYs after 24 months compared with 1.50 QALYs for ACDF recipients. Cervical total disc replacement was associated with a $2139 greater average cost. The incremental cost-effectiveness ratio of CTDR compared with ACDF was $24,594 per QALY at 2 years. Despite varying input parameters in the sensitivity analysis, the incremental cost-effectiveness ratio stayed below the threshold of $50,000 per QALY in most scenarios (range, -$58,194 to $147,862 per QALY). CONCLUSIONS AND RELEVANCE: The incremental cost-effectiveness ratio of CTDR compared with traditional ACDF is lower than the commonly accepted threshold of $50,000 per QALY. This remains true with varying input parameters in a robust sensitivity analysis, reaffirming the stability of the model and the sustainability of this intervention.
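As a sketch of the modeling approach described above (health states, per-cycle utilities and costs, and 3% annual discounting), the following minimal Markov cohort loop shows the mechanics. Every number in it (states, transition probabilities, utilities, costs) is a hypothetical placeholder rather than a value from the study.

```python
import numpy as np

# Minimal Markov cohort sketch: three hypothetical health states (e.g.,
# NDI/VAS-derived strata), annual transitions, and discounted accumulation
# of QALYs and costs over a 2-year horizon, as in the study design.
states = ["mild", "moderate", "severe"]
P = np.array([[0.85, 0.10, 0.05],      # hypothetical annual transition matrix
              [0.20, 0.70, 0.10],
              [0.05, 0.15, 0.80]])
utility = np.array([0.80, 0.65, 0.45])  # hypothetical SF-6D utilities per state
cost = np.array([500.0, 2000.0, 6000.0])  # hypothetical annual costs ($)

occupancy = np.array([1.0, 0.0, 0.0])   # whole cohort starts in "mild"
discount = 0.03                          # 3% annual rate, as in the study
total_qalys = total_cost = 0.0

for year in range(1, 3):                 # two 1-year cycles
    occupancy = occupancy @ P            # advance the cohort one cycle
    d = (1 + discount) ** -year          # discount factor for this cycle
    total_qalys += d * (occupancy @ utility)
    total_cost += d * (occupancy @ cost)

print(f"Discounted QALYs: {total_qalys:.3f}, discounted cost: ${total_cost:,.0f}")
```

Running the model once per treatment arm (with arm-specific transitions and costs) and dividing the cost difference by the QALY difference yields the ICER reported in analyses of this kind.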


Subject(s)
Cervical Vertebrae , Intervertebral Disc Degeneration/surgery , Spinal Fusion/economics , Total Disc Replacement/economics , Cost-Benefit Analysis , Follow-Up Studies , Humans , Intervertebral Disc Degeneration/economics , Quality of Life , Surveys and Questionnaires , Treatment Outcome
17.
Foot Ankle Int ; 33(6): 492-7, 2012 Jun.
Article in English | MEDLINE | ID: mdl-22735322

ABSTRACT

BACKGROUND: Operative treatment of calcaneus fractures is associated with a risk of early wound complications. Though accepted practice dictates that surgery should be delayed until the soft tissues recover from the initial traumatic insult, the optimal timing of surgery has not been delineated. METHODS: A retrospective chart and radiographic review at a level I trauma center was performed to determine whether an aggressive inpatient soft tissue management protocol designed to decrease the time from injury to surgery is effective at reducing complications. Ninety-seven patients (17 female, 80 male; mean age, 39.7 ± 14.0 years) with 102 calcaneus fractures treated between October 1995 and January 2005 were identified. Differences in complication rates and quality of reduction between the inpatient and outpatient treatment groups were analyzed. Quality of reduction was determined by measuring postoperative Bohler's angle and posterior facet articular step-off. RESULTS: Mean time from injury to surgery was 6.2 days for the inpatient group and 10.8 days for the outpatient group (p < 0.0001). The overall complication rate was more than twice as high in the outpatient group (27% versus 12%, p = 0.04), and the serious complication rate was 6.5 times higher when patients were managed as outpatients (9% versus 1%, p = 0.09). With the numbers available, there were no significant differences in the quality of reduction obtained at surgery. CONCLUSION: This study suggests that this inpatient soft tissue management protocol for calcaneal fractures is a feasible treatment option that reduces postoperative wound complications while enabling surgery an average of 4 days earlier.


Subject(s)
Ambulatory Care , Calcaneus/injuries , Calcaneus/surgery , Fractures, Bone/therapy , Hospitalization , Adult , Clinical Protocols , Compression Bandages , Cryotherapy , External Fixators/statistics & numerical data , Female , Fracture Fixation, Internal , Humans , Male , Postoperative Complications , Retrospective Studies , Splints , Time Factors
18.
J Sport Rehabil ; 21(2): 182-5, 2012 May.
Article in English | MEDLINE | ID: mdl-22104040

ABSTRACT

CONTEXT: Electrically induced muscle cramps (EIMC) do not last long enough to study many cramp treatments. Increasing stimulation frequency lengthens cramp duration; it is unknown which frequency elicits the longest EIMC. OBJECTIVE: To determine which stimulation frequency elicits the longest EIMC and whether cramp duration and stimulation frequency are correlated. DESIGN: Randomized, crossover. SETTING: Laboratory. PARTICIPANTS: 20 participants (12 male, 8 female; age 20.7 ± 0.6 y; height 174.9 ± 1.9 cm; mass 76.6 ± 2.2 kg) with a self-reported history of muscle cramps in their lower extremities within the 6 months before the study. INTERVENTIONS: The dominant leg's tibial nerve was percutaneously stimulated with 2-s trains of electrical stimuli starting at a frequency of 4 Hz. After 1 minute of rest, the stimulation frequency was increased in 2-Hz increments until a cramp occurred in the flexor hallucis brevis. The stimulation frequency at which a cramp occurred was termed the cramp threshold frequency (TF). Cramp duration was determined using strict clinical criteria (loss of hallux rigidity and return of the hallux to neutral). On the next 4 consecutive days, participants were stimulated at 5, 10, 15, or 20 Hz above TF, and cramp duration was reassessed. MAIN OUTCOME MEASURES: Cramp TF and duration. RESULTS: Cramp TF was 16.9 ± 5.1 Hz. Cramp duration was longer at 15 and 20 Hz above TF (77.9 ± 37.6 s and 69.5 ± 36.9 s, respectively) than at TF (40.8 ± 34.0 s; P < .05). Cramp duration and TF were highly correlated (r = .90). CONCLUSIONS: Stimulating at 15 and 20 Hz above cramp TF produces the longest-lasting EIMC.
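The threshold-finding procedure above is a simple incremental search. A sketch of that loop follows, with `deliver_train` and `cramp_detected` as hypothetical stand-ins for the stimulator interface and the clinical cramp criteria; neither is a real device API.

```python
import time

def find_cramp_threshold(deliver_train, cramp_detected,
                         start_hz=4, step_hz=2, train_s=2.0, rest_s=60.0,
                         max_hz=60):
    """Return the lowest stimulation frequency (Hz) that elicits a cramp.

    Mirrors the protocol described above: 2-s stimulus trains starting at
    4 Hz, raised in 2-Hz increments with 1 min of rest between trains.
    `max_hz` is a safety cap assumed here, not part of the published protocol.
    """
    freq = start_hz
    while freq <= max_hz:
        deliver_train(freq, train_s)   # 2-s train at the current frequency
        if cramp_detected():           # cramp observed in flexor hallucis brevis
            return freq                # this is the threshold frequency (TF)
        time.sleep(rest_s)             # 1 min rest before the next train
        freq += step_hz
    return None                        # no cramp within the tested range
```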


Subject(s)
Electric Stimulation/adverse effects , Muscle Cramp/etiology , Muscle, Skeletal/physiology , Cross-Over Studies , Electric Stimulation/methods , Electromyography , Female , Foot , Humans , Male , Muscle Contraction/physiology , Muscle Cramp/physiopathology , Muscle, Skeletal/physiopathology , Tibial Nerve , Time Factors , Young Adult
19.
J Sports Sci ; 28(4): 399-405, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20131142

ABSTRACT

Though clinical observations and laboratory data provide some support for the neuromuscular imbalance theory of the genesis of exercise-associated muscle cramps, no direct evidence has been published. The purpose of this study was to determine the effect of local muscle fatigue on the threshold frequency of an electrically induced muscle cramp. To determine baseline threshold frequency, a cramp was electrically induced in the flexor hallucis brevis of 16 apparently healthy participants (7 males, 9 females; age 25.1 ± 4.8 years). The testing order of the control and fatigue conditions was counterbalanced. In the control condition, participants rested in a supine position for 30 min, followed by another cramp induction to determine post-threshold frequency. In the fatigue condition, participants performed five bouts of great-toe curls at 60% of one-repetition maximum to failure, with 1 min of rest between bouts, followed immediately by a post-threshold frequency measurement. Repeated-measures analysis of variance and simple main effects testing showed that post-fatigue threshold frequency (32.9 ± 11.7 Hz) was greater (P < 0.001) than pre-fatigue threshold frequency (20.0 ± 7.7 Hz). An increase in threshold frequency seems to demonstrate a decrease in one's propensity to cramp following the fatigue exercise regimen used. These results contradict the proposed theory, which suggests that cramp propensity should increase following fatigue. However, differences between laboratory and clinical fatiguing exercise, contributions from other sources, and the notion of a graded response to fatiguing exercise should be considered when interpreting effects on exercise-associated and electrically induced muscle cramps.
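A repeated-measures ANOVA of the kind reported above can be set up with statsmodels. The sketch below uses a fabricated data set with a built-in fatigue-by-time effect purely to show the mechanics; it is not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Fabricated threshold-frequency data: 16 subjects x 2 conditions x 2 times,
# with an artificial ~13 Hz rise after fatigue mimicking the reported effect.
rng = np.random.default_rng(1)
rows = []
for subj in range(1, 17):
    for cond in ("control", "fatigue"):
        for time_pt in ("pre", "post"):
            tf = 20.0 + (13.0 if (cond == "fatigue" and time_pt == "post") else 0.0)
            rows.append({"subject": subj, "condition": cond, "time": time_pt,
                         "threshold_hz": tf + rng.normal(0, 5)})
df = pd.DataFrame(rows)

# Two-way repeated-measures ANOVA: condition x time, within subjects.
print(AnovaRM(df, depvar="threshold_hz", subject="subject",
              within=["condition", "time"]).fit())
```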


Subject(s)
Exercise/physiology , Muscle Contraction/physiology , Muscle Fatigue/physiology , Muscle, Skeletal/physiology , Adult , Analysis of Variance , Cross-Over Studies , Electric Stimulation , Electromyography , Exercise Test , Female , Humans , Male , Toes , Young Adult
20.
J Electromyogr Kinesiol ; 20(2): 348-53, 2010 Apr.
Article in English | MEDLINE | ID: mdl-19427798

ABSTRACT

Cryotherapy and ankle bracing are often used in conjunction as a treatment for ankle injury. No studies have evaluated the combined effect of these treatments on reflex responses during inversion perturbation. This study examined the combined influence of ankle bracing and joint cooling on peroneus longus (PL) muscle response during ankle inversion. A 2x2 repeated-measures factorial design guided this study. The independent variables were ankle brace condition (lace-up brace, control) and treatment (ice, control); the dependent variables were PL stretch reflex latency (ms) and PL stretch reflex amplitude (% of maximum). Twenty-four healthy participants completed 5 trials of a sudden inversion perturbation to the ankle/foot complex under each ankle brace and cryotherapy treatment condition. No two-way interaction was observed between ankle brace and treatment conditions for PL latency (P = 0.283) or amplitude (P = 0.884). The ankle brace condition did not differ from control in PL latency or amplitude. Cooling the ankle joint did not alter PL latency or amplitude compared with the no-ice treatment. Ankle bracing combined with joint cooling does not have a deleterious effect on dynamic ankle joint stabilization during an inversion perturbation in normal subjects.


Subject(s)
Ankle Joint/physiology , Braces , Cryotherapy/methods , Immobilization/methods , Muscle Contraction/physiology , Muscle, Skeletal/physiology , Reflex, Stretch/physiology , Cold Temperature , Female , Humans , Immobilization/instrumentation , Male , Muscle, Skeletal/innervation , Young Adult