Results 1-20 of 27
1.
J Wound Ostomy Continence Nurs ; 48(6): 553-559, 2021.
Article in English | MEDLINE | ID: mdl-34781312

ABSTRACT

Disorders of bowel function are prevalent, particularly among patients with spinal cord injuries and other neurological disorders. An individual's bowel control significantly impacts quality of life, as predictable bowel function is necessary to actively and independently participate in everyday activities. For many patients with bowel dysfunction, initial lifestyle adjustments and other conservative therapeutic interventions (eg, digital stimulation, oral laxatives, suppositories) are insufficient to reestablish regular bowel function. In addition to these options, rectal irrigation (RI) is a safe and effective method of standard bowel care that has been used for several decades in adults and children suffering from bowel dysfunction associated with neurogenic or functional bowel etiologies. Rectal irrigation is an appropriate option when conservative bowel treatments are inadequate. Unlike surgical options, RI can be initiated or discontinued at any time. This report summarizes the clinical, humanistic, and economic evidence supporting the use of RI in clinical practice, noting features (eg, practical considerations, patient education) that can improve patients' success with RI treatment.


Subject(s)
Fecal Incontinence , Neurogenic Bowel , Spinal Cord Injuries , Adult , Child , Fecal Incontinence/therapy , Humans , Neurogenic Bowel/etiology , Neurogenic Bowel/therapy , Quality of Life , Spinal Cord Injuries/complications , Therapeutic Irrigation
2.
Article in English | MEDLINE | ID: mdl-26633166

ABSTRACT

PURPOSE: The aim of this study was to evaluate the economic and humanistic implications of using ostomy components to prevent subsequent peristomal skin complications (PSCs) in individuals who experience an initial, leakage-related PSC event. DESIGN: Cost-utility analysis. METHODS: We developed a simple decision model to consider, from a payer's perspective, PSCs managed with and without the use of ostomy components over 1 year. The model evaluated the extent to which outcomes associated with the use of ostomy components (PSC events avoided; quality-adjusted life days gained) offset the costs associated with their use. RESULTS: Our base case analysis of 1000 hypothetical individuals over 1 year assumes that using ostomy components following a first PSC reduces recurrent events versus PSC management without components. In this analysis, component acquisition costs were largely offset by lower resource use for ostomy supplies (barriers; pouches) and lower clinical utilization to manage PSCs. The overall annual average resource use for individuals using components was about 6.3% ($139) higher versus individuals not using components. Each PSC event avoided yielded, on average, 8 additional quality-adjusted life days over 1 year. CONCLUSIONS: In our analysis, (1) acquisition costs for ostomy components were offset in whole or in part by the use of fewer ostomy supplies to manage PSCs and (2) use of ostomy components to prevent PSCs produced better outcomes (fewer repeat PSC events; more health-related quality-adjusted life days) over 1 year compared to not using components.
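As a quick arithmetic check on the figures above, the reported relative (6.3%) and absolute ($139) cost differences together imply the approximate annual per-person cost in the no-components arm. The sketch below is illustrative only and uses no inputs beyond those quoted in the abstract.

```python
# Back-of-envelope check: the components arm is reported as about
# 6.3% ($139) higher per person per year, which implies a baseline
# (no-components) annual cost of roughly $139 / 0.063.
extra_cost_usd = 139   # absolute difference, from the abstract
pct_higher = 0.063     # relative difference, from the abstract

implied_baseline = extra_cost_usd / pct_higher
print(f"implied annual cost without components: ${implied_baseline:,.0f}")
```

Both inputs come directly from the abstract; the implied baseline of roughly $2,200 per person per year is a derived consistency check, not a figure reported by the study.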


Subject(s)
Ostomy/adverse effects , Skin Diseases/economics , Skin Diseases/etiology , Surgical Stomas/adverse effects , Cohort Studies , Cost-Benefit Analysis , Humans , Ostomy/economics , Ostomy/nursing , Self Care , Skin Care , Skin Diseases/nursing
3.
Popul Health Manag ; 17(3): 159-65, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24476557

ABSTRACT

Some 3 million people in the United States have atrial fibrillation (AF). Without thromboprophylaxis, AF increases overall stroke risk 5-fold. Prevention is paramount as AF-related strokes tend to be severe. Thromboprophylaxis reduces the annual incidence of stroke in AF patients by 22%-62%. However, antithrombotics are prescribed for only about half of appropriate AF patients. The study team estimates the economic implications for Medicare of fewer stroke events resulting from increased thromboprophylaxis among moderate- to high-risk AF patients. The decision model used considers both reduced stroke and increased bleeding risk from thromboprophylaxis for a hypothetical cohort on no thromboprophylaxis (45%), antiplatelets (10%), and anticoagulation (45%). AF prevalence, stroke risk, and stroke risk reduction are adjusted for age, comorbidities, and anticoagulation/antiplatelet status. Health care costs are literature based. At baseline, an estimated 24,677 ischemic strokes, 9127 hemorrhagic strokes, and 9550 bleeding events generate approximately $2.63 billion in annual event-related health care costs to Medicare for every million AF patients eligible for thromboprophylaxis. A 10% increase in anticoagulant use in the untreated population would reduce stroke events by 9%, reduce stroke fatalities by 9%, increase bleed events by 5%, and reduce annual stroke/bleed-related costs to Medicare by about $187 million (7.1%) for every million eligible AF patients. A modest 10% increase in the use of thromboprophylaxis would reduce event-related costs to Medicare by 7.1%, suggesting a compelling economic motivation to improve rates of appropriate thromboprophylaxis. New oral anticoagulants offering better balance between the risks of stroke and major bleeding events may improve these clinical and economic outcomes.
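The headline percentage can be verified directly from the reported dollar figures; the snippet below is a simple consistency check using only numbers quoted in the abstract.

```python
# Per million eligible AF patients (figures from the abstract):
baseline_costs_usd = 2.63e9   # annual event-related costs to Medicare
annual_savings_usd = 187e6    # savings from a 10% increase in anticoagulant use

pct_saved = annual_savings_usd / baseline_costs_usd * 100
print(f"savings as a share of baseline costs: {pct_saved:.1f}%")  # matches the reported 7.1%
```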


Subject(s)
Anticoagulants/therapeutic use , Atrial Fibrillation/complications , Cost of Illness , Medicare/economics , Stroke/economics , Thrombosis/prevention & control , Atrial Fibrillation/drug therapy , Humans , Stroke/etiology , United States
4.
Perspect Vasc Surg Endovasc Ther ; 25(1-2): 20-7, 2013 Jun.
Article in English | MEDLINE | ID: mdl-24225504

ABSTRACT

OBJECTIVES: To summarize available evidence regarding stent fracture in the femoropopliteal region. METHODS: We searched PubMed, 2000-2011, using the MeSH search terms "stents," "popliteal artery," and "femoral artery." RESULTS: We identified 29 original studies reporting a 0% to 65% incidence of stent fracture. Fracture-related repeat revascularization could be avoided in the absence of device failure. Recently published data suggest that even a 5% rate of fracture-related reintervention would generate $118.4 million in health care costs in the United States. These excess procedures would also result in major complications and deaths that might have been avoided in the absence of stent fracture. CONCLUSIONS: The reported incidence and clinical relevance of femoropopliteal stent fractures vary across studies. Stent fracture may lead to repeat revascularization. These reinterventions create a considerable, and potentially avoidable, economic burden for patients and payers. Further, these costs are effectively invisible wherever stent fractures are not systematically documented as the reason for reintervention.


Subject(s)
Endovascular Procedures/economics , Endovascular Procedures/instrumentation , Femoral Artery , Health Care Costs , Lower Extremity/blood supply , Peripheral Arterial Disease/economics , Peripheral Arterial Disease/therapy , Popliteal Artery , Prosthesis Failure , Stents/economics , Costs and Cost Analysis , Endovascular Procedures/adverse effects , Humans , Peripheral Arterial Disease/diagnosis , Retreatment , Treatment Outcome
5.
Mil Med ; 178(2): 142-5, 2013 Feb.
Article in English | MEDLINE | ID: mdl-23495458

ABSTRACT

The Patient Protection and Affordable Care Act, recently passed into law, is poised to profoundly affect the provision of medical care in the United States. In today's environment, the foundation for most ongoing comparative effectiveness research is financial claims data. However, there is an alternative that possesses much richer data. That alternative, uniquely positioned to serve as a test system for national health reform efforts, is the Department of Defense Military Health System. This article describes how to leverage the Military Health System to provide effective solutions to current health care reform challenges in the United States.


Subject(s)
Health Care Reform/organization & administration , Military Medicine/organization & administration , Adolescent , Adult , Aged , Budgets , Child , Comparative Effectiveness Research , Delivery of Health Care/organization & administration , Female , Humans , Male , Middle Aged , Military Medicine/economics , Models, Organizational , Private Sector/organization & administration , United States , Young Adult
6.
J Med Econ ; 16(2): 307-17, 2013.
Article in English | MEDLINE | ID: mdl-23216013

ABSTRACT

OBJECTIVE: To evaluate costs and outcomes associated with initial tapentadol ER vs oxycodone CR for the treatment of chronic non-cancer pain (CNCP) in the US. METHODS: This study developed a Monte Carlo simulation based on the scientific foundation established by published models of long-acting opioids (LAO) in patients having moderate-to-severe CNCP. It estimates costs and outcomes associated with the use of tapentadol ER vs oxycodone CR over a 1-year period from the perspective of a US payer. LAO effectiveness and treatment-emergent adverse event (TEAE) rates are derived from clinical trials of tapentadol ER vs oxycodone CR; other inputs are based on published literature supplemented sparingly with clinical opinion. Sensitivity analyses consider the impact of real-world dosing patterns for LAO on treatment costs. RESULTS: Initial tapentadol ER consistently demonstrates better outcomes than initial oxycodone CR (proportion of patients achieving adequate pain relief and no GI TEAE; acute TEAE-free days; days free of chronic constipation; quality-adjusted life days; productive working hours). While total costs with initial tapentadol ER are slightly (2.2%) higher than with initial oxycodone CR, nearly twice as many modeled patients in the initial tapentadol ER arm (29% vs 15%) achieve adequate pain relief and no GI TEAE compared to initial oxycodone CR. In sensitivity analyses, tapentadol ER becomes a dominant strategy when real-world dosing patterns are considered. CONCLUSION: The additional costs to produce better outcomes (pain relief and no GI TEAE) associated with tapentadol ER are small in the context of double the likelihood of a patient response with tapentadol ER. When daily average consumption (DACON) for oxycodone CR is factored into the analysis, initial tapentadol ER becomes a dominant strategy. Our findings are both strengthened and limited by the use of randomized trial-centric input parameters; these results should be validated as inputs from clinical practice settings become available.
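The published model is a patient-level Monte Carlo simulation; a minimal sketch of that style of simulation is shown below. Only the response probabilities (29% vs 15%) come from the abstract; the function name, cohort size, and one-draw-per-patient structure are illustrative assumptions, not the study's actual model.

```python
import random

def simulate_responders(p_response: float, n_patients: int, seed: int = 0) -> float:
    """Fraction of simulated patients achieving 'adequate pain relief and
    no GI TEAE', modeled as one Bernoulli draw per patient."""
    rng = random.Random(seed)
    responders = sum(rng.random() < p_response for _ in range(n_patients))
    return responders / n_patients

# Response probabilities from the abstract; cohort size is arbitrary.
p_tapentadol_er, p_oxycodone_cr = 0.29, 0.15
n = 100_000
print(simulate_responders(p_tapentadol_er, n))  # ~0.29
print(simulate_responders(p_oxycodone_cr, n))   # ~0.15
```

A full cost-effectiveness model would attach costs and quality-adjusted life days to each simulated outcome; this sketch shows only the sampling core.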


Subject(s)
Analgesics, Opioid/economics , Chronic Pain/drug therapy , Computer Simulation , Monte Carlo Method , Oxycodone/economics , Phenols/economics , Analgesics, Opioid/therapeutic use , Costs and Cost Analysis , Delayed-Action Preparations/economics , Delayed-Action Preparations/therapeutic use , Dose-Response Relationship, Drug , Health Expenditures , Humans , Outcome Assessment, Health Care , Oxycodone/therapeutic use , Phenols/therapeutic use , Tapentadol , United States
7.
Infect Control Hosp Epidemiol ; 33(10): 1031-8, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22961023

ABSTRACT

OBJECTIVE: To describe the epidemiology and healthcare costs of Clostridium difficile infection (CDI) identified in the outpatient setting. DESIGN: Population-based, retrospective cohort study. PATIENTS: Kaiser Permanente Colorado and Kaiser Permanente Northwest members between June 1, 2005, and September 30, 2008. METHODS: We identified persons with incident CDI and classified CDI by whether it was identified in the outpatient or inpatient healthcare setting. We collected information about baseline variables and follow-up healthcare utilization, costs, and outcomes among patients with CDI. We compared characteristics of patients with CDI identified in the outpatient versus inpatient setting. RESULTS: We identified 3,067 incident CDIs; 56% were identified in the outpatient setting. Few strong, independent predictors of diagnostic setting were identified, although a previous stay in a nonacute healthcare institution (odds ratio [OR], 1.45 [95% confidence interval (CI), 1.13-1.86]) was statistically associated with outpatient-identified CDI, as was age from 50 to 59 years (OR, 1.64 [95% CI, 1.18-2.29]), 60 to 69 years (OR, 1.37 [95% CI, 1.03-1.82]), and 70 to 79 years (OR, 1.36 [95% CI, 1.06-1.74]), when compared with persons aged 80-89 years. CONCLUSIONS: We found that more than one-half of incident CDIs in this population were identified in the outpatient setting. Patients with outpatient-identified CDI were younger with fewer comorbidities, although they frequently had previous exposure to healthcare. These data suggest that practitioners should be aware of CDI and obtain appropriate diagnostic testing on outpatients with CDI symptoms.


Subject(s)
Ambulatory Care Facilities , Clostridioides difficile/isolation & purification , Enterocolitis, Pseudomembranous/economics , Enterocolitis, Pseudomembranous/epidemiology , Health Expenditures/trends , Adolescent , Adult , Aged , Aged, 80 and over , Child , Child, Preschool , Colorado/epidemiology , Confidence Intervals , Female , Health Services/economics , Health Services/statistics & numerical data , Humans , Infant , Male , Middle Aged , Northwestern United States/epidemiology , Odds Ratio , Outcome Assessment, Health Care , Population Surveillance , Retrospective Studies , Young Adult
8.
Med Decis Making ; 32(5): 660-2, 2012.
Article in English | MEDLINE | ID: mdl-22990080
9.
Emerg Infect Dis ; 18(6): 960-2, 2012 Jun.
Article in English | MEDLINE | ID: mdl-22608207

ABSTRACT

To determine the incidence of Clostridium difficile infection during 2007, we examined infection in adult inpatient and outpatient members of a managed-care organization. Incidence was 14.9 C. difficile infections per 10,000 patient-years. Extrapolating this rate to US adults, we estimate that 284,875 C. difficile infections occurred during 2007.
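Rearranging the extrapolation gives the adult population size implied by the two reported numbers; the snippet is a consistency check only, using no inputs beyond the abstract.

```python
# infections ≈ rate_per_10k_patient_years / 10_000 * adult_population,
# so the implied population is infections / rate * 10_000.
rate_per_10k_py = 14.9        # CDI per 10,000 patient-years, from the abstract
total_infections = 284_875    # estimated US adult infections in 2007

implied_adults = total_infections / rate_per_10k_py * 10_000
print(f"implied US adult population: {implied_adults / 1e6:.0f} million")  # ≈ 191 million
```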


Subject(s)
Clostridioides difficile , Clostridium Infections/epidemiology , Adult , Aged , Aged, 80 and over , Anti-Bacterial Agents/therapeutic use , Clostridium Infections/drug therapy , Cohort Studies , Colorado/epidemiology , Female , Humans , Incidence , Male , Metronidazole/therapeutic use , Middle Aged , Northwestern United States/epidemiology , Vancomycin/therapeutic use , Young Adult
10.
Clin Lung Cancer ; 11(6): 396-404, 2010 Nov 01.
Article in English | MEDLINE | ID: mdl-21062730

ABSTRACT

BACKGROUND: Human immunodeficiency virus (HIV)-infected individuals are at increased risk for primary lung cancer (LC). We wished to compare the clinicopathologic features and treatment outcomes of HIV-LC patients with those of HIV-indeterminate LC patients. We also sought to compare behavioral characteristics and immunologic features of HIV-LC patients with HIV-positive patients without LC. PATIENTS AND METHODS: A database of 75 HIV-positive patients with primary LC in the HAART era was established from an international collaboration. These cases were drawn from the archives of contributing physicians who subspecialize in HIV malignancies. Patient characteristics were compared with registry data from the Surveillance, Epidemiology, and End Results program (SEER; n = 169,091 participants) and with HIV-positive individuals without LC from the Adult and Adolescent Spectrum of HIV-related Diseases project (ASD; n = 36,569 participants). RESULTS: The median age at HIV-related LC diagnosis was 50 years compared with 68 years for SEER participants (P < .001). HIV-LC patients, like their SEER counterparts, most frequently presented with stage IIIB/IV cancers (77% vs. 70%), usually with adenocarcinoma (46% vs. 47%) or squamous carcinoma (35% vs. 25%) histologies. HIV-LC patients and ASD participants had comparable median nadir CD4+ cell counts (138 cells/µL vs. 160 cells/µL). At LC diagnosis, their median CD4+ count was 340 cells/µL and 86% were receiving HAART. Sixty-three HIV-LC patients (84%) received cancer-specific treatments, but chemotherapy-associated toxicity was substantial. The median survival for both HIV-LC patients and SEER participants with stage IIIB/IV disease was 9 months. CONCLUSION: Most HIV-positive patients were receiving HAART and had substantial improvement in CD4+ cell count at the time of LC diagnosis. They were able to receive LC treatments; their tumor types and overall survival were similar to those of SEER LC participants. However, HIV-LC patients were diagnosed with LC at a younger age than their HIV-indeterminate counterparts. Future research should explore how screening, diagnostic, and treatment strategies directed toward the general population may apply to HIV-positive patients at risk for LC.


Subject(s)
Antiretroviral Therapy, Highly Active/methods , HIV Infections/complications , Lung Neoplasms/epidemiology , Adenocarcinoma/epidemiology , Adenocarcinoma/etiology , Adenocarcinoma/therapy , Adolescent , Adult , Age Factors , Age of Onset , Aged , Aged, 80 and over , Antineoplastic Agents/adverse effects , Antineoplastic Agents/therapeutic use , CD4 Lymphocyte Count , Carcinoma, Squamous Cell/epidemiology , Carcinoma, Squamous Cell/etiology , Carcinoma, Squamous Cell/therapy , Child , Databases, Factual , Female , HIV Infections/drug therapy , Humans , International Cooperation , Lung Neoplasms/etiology , Lung Neoplasms/therapy , Male , Middle Aged , Neoplasm Staging , SEER Program/statistics & numerical data , Survival , Treatment Outcome , Young Adult
11.
Can J Urol ; 17(5): 5377-82, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20974030

ABSTRACT

INTRODUCTION: Robotic assisted laparoscopic radical prostatectomy (RALP) is a common treatment for localized prostate cancer. Despite a primary advantage of improved postoperative pain, patients undergoing RALP still experience discomfort. Belladonna, containing the muscarinic receptor antagonists atropine and scopolamine, in combination with opium as a rectal suppository (B & O), may improve post-RALP pain. This study evaluates whether a single preoperative B & O results in decreased postoperative patient-reported pain and analgesic requirements. MATERIALS AND METHODS: Patients undergoing RALP at Virginia Mason Medical Center between November 2008 and July 2009 were offered the opportunity to enter a randomized, double-blind, placebo-controlled trial. Exclusion criteria included: glaucoma, bronchial asthma, convulsive disorders, chronic pain, chronic use of analgesics, or a history of alcohol or opioid dependency. Surgeons were blinded to suppository placement, which was administered after induction of anesthesia. All patients underwent a standardized anesthesia regimen. Postoperative pain was assessed by a visual analog scale (VAS) and postoperative narcotic use was calculated in intravenous morphine equivalents. RESULTS: Ninety-nine patients were included in the analysis. The B & O and control groups were not significantly different in terms of age, body mass index, operative time, nerve sparing status or prostatic volume. Postoperative pain was significantly improved during the first two postoperative hours in the B & O group. Similarly, 24-hour morphine consumption was significantly lower in patients who received a B & O. No adverse effects secondary to suppository placement were identified. CONCLUSION: Preoperative administration of a B & O suppository results in significantly decreased postoperative pain and 24-hour morphine consumption in patients undergoing RALP.


Subject(s)
Analgesics, Opioid/administration & dosage , Atropa belladonna , Morphine/administration & dosage , Muscarinic Antagonists/administration & dosage , Pain, Postoperative/drug therapy , Phytotherapy , Plant Preparations/administration & dosage , Preoperative Care/methods , Prostatectomy/adverse effects , Aged , Analgesia, Patient-Controlled , Analgesics, Opioid/therapeutic use , Atropine/administration & dosage , Atropine/therapeutic use , Double-Blind Method , Humans , Laparoscopy , Male , Middle Aged , Muscarinic Antagonists/therapeutic use , Pain, Postoperative/economics , Phytotherapy/economics , Plant Preparations/therapeutic use , Preoperative Care/economics , Prostatic Neoplasms/surgery , Robotics , Scopolamine/administration & dosage , Scopolamine/therapeutic use , Suppositories
12.
Can J Urol ; 17(1): 4995-5001, 2010 Feb.
Article in English | MEDLINE | ID: mdl-20156379

ABSTRACT

AIMS: The assessment of incontinence therapies is complicated by the variety of patient-reported outcome (PRO) measures used in research protocols. Patient satisfaction may be one of the most relevant, albeit complex, PRO measures and is a function of many related variables. We sought to assess the relationship between patient satisfaction and other PROs. METHODS: A retrospective review of patients undergoing SPARC (n = 314) and autologous rectus pubovaginal sling (PVS) (n = 127) was performed, with 204 (SPARC) and 67 (PVS) patients completing questionnaire surveillance and a minimum 12-month follow up. Outcomes were assessed using validated incontinence questionnaires (UDI-6, IIQ-7) supplemented with additional items addressing subjective improvement. Comparisons were made between patients reporting a willingness to recommend and repeat surgical intervention (combined variable, satisfaction surrogate) and achievement of defined endpoints in the remaining outcome measures. RESULTS: A large difference in outcomes was seen depending on the PRO measure analyzed. Dry was the strictest measure used (33%, SPARC; 39%, PVS; p = NS), while ≥50% improvement was reported with the greatest frequency (75%, SPARC; 73%, PVS; p = NS). With the exception of pad use, a statistically significant association between all PRO measures and the willingness to recommend/repeat surgery was identified. CONCLUSIONS: Our data demonstrate an association between a variety of PRO measures and patient-reported satisfaction. Based on this finding, the development of a simplified and standardized PRO instrument, one that maintains an accurate reflection of patient satisfaction and is less cumbersome for the patient, may be possible.


Subject(s)
Patient Satisfaction , Suburethral Slings , Urinary Incontinence, Stress/surgery , Adult , Aged , Aged, 80 and over , Female , Humans , Middle Aged , Outcome Assessment, Health Care , Urologic Surgical Procedures , Young Adult
13.
Med Decis Making ; 29(6): NP1-2, 2009.
Article in English | MEDLINE | ID: mdl-19959802
14.
Article in English | MEDLINE | ID: mdl-19721098

ABSTRACT

OBJECTIVE: To compare the clinical course of patients with AIDS-related Kaposi's sarcoma (KS) with CD4 counts >300 cells/mm³ and undetectable HIV viral loads (VLs) to patients with AIDS-KS with lower CD4 counts and detectable HIV VLs. METHODS: We retrospectively analyzed a cohort of 91 patients with AIDS-KS in a multispecialty clinic. We used chi-square and Student t tests to analyze intragroup differences; survival was determined by Kaplan-Meier analysis. RESULTS: Twenty (22%) of the 91 patients had newly diagnosed, persistent, or progressive KS despite CD4 counts >300 cells/mm³ and undetectable HIV VLs. Age, gender, ethnicity, mode and duration of HIV acquisition, type of antiretroviral therapy (ART), and KS therapy did not differ significantly (P ≤ .005) between this group and the remaining 71 patients. Although tumor stage and response to KS therapy were similar, there was a significantly greater risk of death among the patients with CD4 counts <300 cells/mm³ and detectable HIV VLs (P = .048). CONCLUSIONS: In the highly active antiretroviral therapy (HAART) era, a substantial proportion of patients with KS had undetectable HIV VLs and CD4 counts greater than the level typically associated with opportunistic diseases. They required systemic therapy to control their KS but were significantly less likely to die and demonstrated a trend toward better 15-year survival than patients having KS with lower CD4 counts and detectable HIV VLs.


Subject(s)
CD4 Lymphocyte Count , HIV Infections/blood , Sarcoma, Kaposi/mortality , Viral Load , Adult , Aged , Antiretroviral Therapy, Highly Active , Cohort Studies , Female , HIV Infections/drug therapy , HIV Infections/epidemiology , Humans , Kaplan-Meier Estimate , Lymph Nodes/pathology , Male , Middle Aged , Mouth Neoplasms/mortality , Mouth Neoplasms/pathology , Mouth Neoplasms/therapy , Retrospective Studies , Sarcoma, Kaposi/pathology , Sarcoma, Kaposi/therapy , Skin Neoplasms/mortality , Skin Neoplasms/pathology , Skin Neoplasms/therapy
15.
J Urol ; 182(3): 1050-4, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19616792

ABSTRACT

PURPOSE: We performed a prospective multicomponent study to determine whether subjective and objective bladder sensation instruments may provide data on sensory dysfunction in patients with overactive bladder. MATERIALS AND METHODS: We evaluated 70 prospectively enrolled patients with urodynamics and validated questionnaires on urgency (Urgency Perception Score), general overactive bladder symptoms (Urogenital Distress Inventory) and quality of life (Incontinence Impact Questionnaire). We first sought a correlation between sensory specific (Urgency Perception Score) and quality of life questionnaire scores. We then assessed a correlation between sensory questionnaire scores and urodynamic variables, exploring the hypothesis that certain urodynamic parameters may be measures of bladder sensation. We evaluated 2 urodynamic derivatives (first sensation ratio and bladder urgency velocity) to increase sensory finding discrimination. RESULTS: We noted moderate correlations of the Urgency Perception Score (0.56) and the Urogenital Distress Inventory (0.74) with the Incontinence Impact Questionnaire (each p <0.01). A weak negative correlation was seen between the Urgency Perception Score and bladder capacity (-0.25, p <0.05). No correlation was noted for the other urodynamic parameters. First sensation ratio and bladder urgency velocity statistically significantly correlated with the Urgency Perception Score despite the lesser or absent correlation associated with the individual components of these derivatives. CONCLUSIONS: Bladder sensation questionnaires may be valuable to identify patients with sensory dysfunction and provide additional data not obtained in generalized symptom questionnaires. Urodynamic variables correlated with bladder sensation questionnaire scores and may provide an objective method to assess sensory dysfunction.


Subject(s)
Urinary Bladder, Overactive/complications , Urinary Bladder, Overactive/physiopathology , Aged , Female , Humans , Male , Middle Aged , Quality of Life , Sensation , Surveys and Questionnaires , Urinary Bladder , Urination Disorders/etiology , Urination Disorders/physiopathology , Urodynamics
16.
Clin Ther ; 31(4): 880-8, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19446160

ABSTRACT

BACKGROUND: Although annual per-person health care costs for patients with end-stage renal disease (ESRD) on in-center hemodialysis greatly exceed those for patients on peritoneal dialysis (PD), which is a home dialysis therapy, current use of PD remains low. In April 2008, the Centers for Medicare & Medicaid Services issued a new Dialysis Conditions of Coverage final rule underscoring its intent to promote use of home dialysis whenever appropriate. OBJECTIVES: The objectives of this paper were to provide context for the use of in-home versus in-center dialysis, to describe factors that influence patterns of dialysis utilization in the United States, and to explore the magnitude of the potential savings that might result from broader use of home dialysis therapies. METHODS: A 5-year budget-impact analysis was performed using data from the 2007 Annual Data Report of the United States Renal Data System. Scenarios were developed in which the PD share of total dialysis was varied to estimate the impact on total Medicare dialysis costs. This study took the perspective of Medicare, the main payer for dialysis in the United States. RESULTS: If the PD share of total dialysis were to decrease from the current 8% to 5%, Medicare spending for dialysis would increase by an additional $401 million over a 5-year period. Alternatively, if the PD share of total dialysis were to increase to 15%, Medicare could realize potential savings of >$1.1 billion over 5 years. CONCLUSIONS: Similar to the conclusion articulated in the Dialysis Conditions of Coverage final rule, increasing clinically appropriate use of PD would be associated with considerable savings to Medicare and to the taxpayers who fund Medicare. These savings could be used to offset part of the financial burden of ESRD care on Medicare and to help legislators meet ever-tightening budgetary constraints.
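The scenario analysis described above amounts to varying the PD share in a weighted-average cost formula. The sketch below shows the general shape of such a budget-impact calculation; the patient count and per-patient-year costs are hypothetical placeholders, not the USRDS inputs used in the study.

```python
def annual_dialysis_spend(n_patients: int, pd_share: float,
                          cost_pd: float, cost_hd: float) -> float:
    """Total annual spend for a dialysis population split between
    peritoneal dialysis (PD) and in-center hemodialysis (HD)."""
    return n_patients * (pd_share * cost_pd + (1 - pd_share) * cost_hd)

# Hypothetical inputs for illustration only:
n, cost_pd, cost_hd = 350_000, 53_000, 72_000
base = annual_dialysis_spend(n, 0.08, cost_pd, cost_hd)    # current 8% PD share
more_pd = annual_dialysis_spend(n, 0.15, cost_pd, cost_hd)
print(f"annual savings at a 15% PD share: ${base - more_pd:,.0f}")
```

With any per-patient PD cost below the HD cost, savings scale linearly with the share shifted: n × Δshare × (cost_hd − cost_pd).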


Subject(s)
Hemodialysis, Home/economics , Kidney Failure, Chronic/therapy , Medicare/economics , Peritoneal Dialysis/economics , Cost of Illness , Health Care Costs , Humans , Kidney Failure, Chronic/economics , Models, Economic , Renal Dialysis/economics , United States
17.
J Bone Miner Metab ; 27(3): 287-94, 2009.
Article in English | MEDLINE | ID: mdl-19333685

ABSTRACT

Hyperparathyroidism may play a role in the excess morbidity and mortality in chronic kidney disease. This study examined utilization and outcomes of patients with hyperparathyroidism and chronic kidney disease. In a US health maintenance organization (HMO), patients with chronic kidney disease were identified from the electronic medical record. Patients included in the study had at least one intact parathyroid hormone (iPTH) measurement ordered by a nephrologist and were at least 20 years of age with no history of renal replacement therapy (RRT, n = 455). Cohorts were determined by index iPTH level and were followed for 1 year. Rates of health care utilization were compared between cohorts using Poisson regression; cost comparisons were made using linear regression; mortality and RRT were evaluated using Cox regression. Increasing levels of iPTH were associated with a significantly elevated risk of mortality and RRT, even after adjustment for potential confounders such as stage of chronic kidney disease. Compared to iPTH of <110 pg/ml, we found a 66% increase in combined mortality-RRT risk (HR 1.66, 95% CI 1.41-1.97) for those with iPTH 110-199 pg/ml, and a HR of 4.57 (95% CI 3.86-5.43) for iPTH ≥300 pg/ml. We did not find a convincing association between iPTH level and utilization. While this study provides no evidence that treating patients with higher levels of iPTH will ameliorate poor outcomes, it suggests that iPTH levels beyond the targets suggested by clinical guidelines are associated with increased harm in patients with chronic kidney disease.


Subject(s)
Hyperparathyroidism/economics , Hyperparathyroidism/therapy , Kidney Failure, Chronic/economics , Kidney Failure, Chronic/therapy , Aged , Cohort Studies , Costs and Cost Analysis , Female , Humans , Hyperparathyroidism/complications , Kidney Failure, Chronic/complications , Male , Retrospective Studies , Survival Analysis , Treatment Outcome
18.
Value Health ; 12(1): 73-9, 2009.
Article in English | MEDLINE | ID: mdl-18680485

ABSTRACT

OBJECTIVES: End-stage renal disease (ESRD) is a debilitating condition resulting in death unless treated. Treatment options are transplantation and dialysis. Alternative dialysis modalities are peritoneal dialysis (PD) and hemodialysis (HD), each of which has been shown to produce similar outcomes and survival. Nevertheless, the financial implications of each modality are different, and these differences vary by country, especially in the developing world. Changes in clinically appropriate dialysis delivery leading to more efficient use of resources would increase the resources available to treat ESRD or other disabling conditions. This article outlines the relative advantages of HD and PD and uses budget impact analysis to estimate the country-specific, 5-year financial implications for total dialysis costs assuming utilization shifts from HD to PD in two high-income (UK, Singapore), three upper-middle-income (Mexico, Chile, Romania), and three lower-middle-income (Thailand, China, Colombia) countries. RESULTS: Peritoneal dialysis is a clinically effective dialysis option that can be significantly cost-saving compared to HD, even in developing countries. CONCLUSIONS: The magnitude of costs associated with treating ESRD patients globally is large and growing. PD is a clinically effective dialysis option that can be used by a majority of ESRD patients and can also be significantly cost-saving compared to HD therapy. Increasing clinically appropriate PD use would substantially reduce health-care costs and help health-care systems meet ever-tightening budget constraints.


Subject(s)
Health Resources , Health Services Needs and Demand/economics , Peritoneal Dialysis/economics , Cost-Benefit Analysis , Developed Countries/economics , Developing Countries/economics , Global Health , Humans , Kidney Failure, Chronic/economics , Kidney Failure, Chronic/therapy , Renal Dialysis/economics
19.
Int Urol Nephrol ; 40(2): 351-4, 2008.
Article in English | MEDLINE | ID: mdl-17619160

ABSTRACT

BACKGROUND: Bladder neck contracture (BNC) following prostatectomy has been reported in 0.5-32% of cases. While the etiology of a BNC is unclear, several factors have been associated with this complication, including blood loss, devascularization of bladder neck tissue, poor mucosal apposition and urinary extravasation. To study the impact of urinary extravasation on BNC formation, we used postoperative drain output as a surrogate measure for anastomotic leakage. METHODS: All patients undergoing a radical retropubic prostatectomy (RRP) or a robotic-assisted radical prostatectomy (RARP) from January 2000 to April 2006 were entered into a prospective review board-approved database. All RRP patients had their anastomosis performed in an interrupted fashion using six monofilament 2-0 sutures. All RARP anastomoses were performed in a running fashion using 2-0 monofilament sutures. A single, closed suction Jackson Pratt drain was placed over the surgical bed at the conclusion of the case. Post-operative drain outputs were recorded. All patients were evaluated at 3, 6, 9, 12 and 24 months post-operatively. All patients who reported a diminished urinary stream or incontinence were evaluated by office cystoscopy. The inability to navigate an 18 French cystoscope through the bladder neck was defined as a bladder neck contracture. RESULTS: A total of 576 patients underwent a radical prostatectomy over this time span. Complete records were available for 535 (93%) of these patients. There were 21 bladder neck contractures (3.9%) overall. The post-operative drain output ranged from 5-5,465 ml (median 119 ml). Eight BNCs developed in patients with drain outputs less than 119 ml, while 13 BNCs developed in patients with Jackson Pratt drain output > 119 ml (P = 0.343). In patients who underwent an open RRP, 19/424 (4.5%) developed contractures, while 2/108 (1.9%) RARP patients developed a BNC (P = 0.105).
CONCLUSION: The amount of post-operative drain output is not statistically associated with the development of a bladder neck contracture.
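The rate comparison reported above (19/424 BNCs after open RRP vs 2/108 after RARP) can be reproduced with a standard two-proportion z-test. The abstract does not state which test the authors used, so the P-value from this sketch need not match the published one; it is shown only to illustrate the comparison.

```python
# Hedged sketch: two-sided two-proportion z-test on the BNC rates
# reported in the abstract. The study's actual test is unspecified.
import math

def two_prop_z(successes1, n1, successes2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Reported counts: 19 BNCs among 424 open RRP patients vs 2 among 108 RARP.
z, p = two_prop_z(19, 424, 2, 108)
```

Consistent with the abstract's conclusion, the resulting P-value exceeds 0.05, i.e. the difference in contracture rates between the two surgical approaches does not reach statistical significance at this sample size.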


Subject(s)
Prostatectomy/adverse effects , Urinary Bladder Neck Obstruction/etiology , Urination Disorders/etiology , Aged , Anastomosis, Surgical , Contracture/etiology , Cystoscopes , Drainage , Humans , Male , Middle Aged , Robotics , Suture Techniques , Urinary Bladder Neck Obstruction/diagnosis
20.
Oncologist ; 11(6): 666-73, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16794245

ABSTRACT

PURPOSE: To evaluate in a pilot study the safety and efficacy of liposomal doxorubicin, cyclophosphamide, and etoposide (LACE) when combined with antiretroviral therapy (ART) in patients with AIDS-related lymphoma (ARL). The impact of HIV viral control on therapy and survival was also assessed. PATIENTS AND METHODS: Between 1994 and 2005, 40 patients at Virginia Mason Medical Center were diagnosed with ARL. Twelve received LACE every 28 days. All patients received intrathecal chemoprophylaxis, ART, and G-CSF. RESULTS: The median patient CD4+ count was 190 cells/microl (range, 20-510 cells/microl), and the median HIV viral load (VL) was 61,613 copies/ml (range, <50-500,000 copies/ml). Seven patients (58%) had an International Prognostic Index score of 3 or 4. Six patients (50%) were ART-naïve, five were viremic despite ART, and one had an undetectable HIV-1 VL. Nine patients (75%) achieved a complete response (CR), and median overall survival was 107 months. At a median follow-up of 46 months, the recurrence-free survival rate was 50%. Two patients died from relapsed/refractory ARL and one patient achieved a CR with salvage therapy. One CR patient died from complications of pneumonia, and another CR patient died from uncertain causes 5 years after treatment. Grade 3 or 4 neutropenia occurred in 23 of 61 (38%) chemotherapy cycles. Hospitalization was required after 5% of treatment cycles due to neutropenic fever. CONCLUSION: LACE is an effective and tolerable treatment for ARL. HIV viral control can be maintained in the majority of patients during and after completion of LACE.


Subject(s)
Anti-Retroviral Agents/therapeutic use , Antineoplastic Combined Chemotherapy Protocols/administration & dosage , Lymphoma, AIDS-Related/drug therapy , Adult , CD4 Lymphocyte Count , Cyclophosphamide/administration & dosage , Doxorubicin/administration & dosage , Drug Carriers , Etoposide/administration & dosage , Female , Humans , Liposomes , Lymphoma, AIDS-Related/immunology , Lymphoma, AIDS-Related/mortality , Lymphoma, AIDS-Related/virology , Male , Middle Aged , Pilot Projects , Survival Rate , Viral Load