Results 1 - 14 of 14
1.
Transplant Direct ; 9(6): e1491, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37250491

ABSTRACT

Background: A large number of procured kidneys continue to go untransplanted while the waiting list remains long. Methods: We analyzed donor characteristics for unutilized kidneys in our large organ procurement organization (OPO) service area in a single year to determine the reasonableness of their nonuse and to identify how we might increase the transplant rate of these kidneys. Five experienced local transplant physicians independently reviewed unutilized kidneys to identify which they would consider transplanting in the future. Biopsy results, donor age, kidney donor profile index, positive serologies, diabetes, and hypertension were risk factors for nonuse. Results: Two-thirds of unutilized kidneys had biopsies showing a high degree of glomerulosclerosis and interstitial fibrosis. Reviewers identified 33 kidneys (12%) as potentially transplantable. Conclusions: Reducing the rate of unutilized kidneys in this OPO service area will require setting acceptable expanded donor characteristics, identifying suitable, well-informed recipients, defining acceptable outcomes, and systematically evaluating the results of these transplants. Because the improvement opportunity will vary by region, it would be useful for all OPOs, in collaboration with their transplant centers, to conduct a similar analysis in order to achieve a significant impact on the national nonuse rate.

2.
3.
J Urol ; 202(3): 539-545, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31009291

ABSTRACT

PURPOSE: The United States health care system is rapidly moving away from fee for service reimbursement in an effort to improve quality and contain costs. Episode based reimbursement is an increasingly relevant value based payment model of surgical care. We sought to quantify the impact of modifiable cost inputs on institutional financial margins in an episode based payment model for prostate cancer surgery. MATERIALS AND METHODS: A total of 157 consecutive patients underwent robotic radical prostatectomy in 2016 at a tertiary academic medical center. We compiled comprehensive episode costs and reimbursements from the most recent urology consultation for prostate cancer through 90 days postoperatively and benchmarked the episode price as a fixed reimbursement to the median reimbursement of the cohort. We identified 2 sources of modifiable costs with undefined empirical value, including preoperative prostate magnetic resonance imaging and perioperative functional recovery counseling visits, and then calculated the impact on financial margins (reimbursement minus cost) under an episode based payment. RESULTS: Although they comprised a small proportion of the total episode costs, varying the use of preoperative magnetic resonance imaging (33% vs 100% of cases) and functional recovery counseling visits (1 visit in 66% and 2 in 100%) reduced average expected episode financial margins up to 22.6% relative to the margin maximizing scenario in which no patient received these services. CONCLUSIONS: Modifiable cost inputs have a substantial impact on potential operating margins for prostate cancer surgery under an episode based payment model. High cost health systems must develop the capability to analyze individual cost inputs and quantify the contribution to quality to inform value improvement efforts for multiple service lines.
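The margin arithmetic this abstract describes (a fixed episode price minus all incurred costs) can be sketched in a few lines of Python. All dollar figures below are invented for illustration and are not taken from the study:

```python
def episode_margin(episode_price, base_cost, optional_services):
    """Margin under an episode-based (bundled) payment:
    fixed reimbursement minus total episode cost."""
    total_cost = base_cost + sum(optional_services.values())
    return episode_price - total_cost

# Under fee-for-service, extra services add reimbursement; under a fixed
# episode price they only add cost, so each optional input erodes the margin.
margin_lean = episode_margin(20_000, 15_000, {})
margin_full = episode_margin(20_000, 15_000,
                             {"preop_mri": 600, "recovery_counseling": 250})
print(margin_lean, margin_full)  # 5000 4150
```

This is the mechanism behind the study's finding: services that are a small share of total episode cost can still consume a disproportionate share of the (much smaller) margin.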


Subject(s)
Fee-for-Service Plans , Preoperative Care/economics , Prostatectomy/economics , Prostatic Neoplasms/surgery , Robotic Surgical Procedures/economics , Aged , Cost Savings/methods , Counseling/economics , Counseling/statistics & numerical data , Health Expenditures/statistics & numerical data , Humans , Magnetic Resonance Imaging/economics , Magnetic Resonance Imaging/statistics & numerical data , Male , Middle Aged , Preoperative Care/methods , Preoperative Care/statistics & numerical data , Prostate/diagnostic imaging , Prostate/surgery , Prostatectomy/methods , Prostatectomy/statistics & numerical data , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/economics , Robotic Surgical Procedures/methods , Robotic Surgical Procedures/statistics & numerical data , United States
4.
Acad Med ; 91(4): 522-9, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26579793

ABSTRACT

PURPOSE: To highlight teaching hospitals' efforts to reduce readmissions by describing interventions implemented to improve care transitions for heart failure (HF) patients and the variability in implemented HF-specific and care transition interventions. METHOD: In 2012, the authors surveyed a network of 17 teaching hospitals to capture information about the number, type, stage of implementation, and structure of 4 HF-specific and 21 care transition (predischarge, bridging, and postdischarge) interventions implemented to reduce readmissions among patients with HF. The authors summarized data using descriptive statistics, including the mean number of interventions implemented and the frequency and stage of specific interventions, and descriptive plots of the structure of two common interventions (multidisciplinary rounds and follow-up telephone calls). RESULTS: Sixteen hospitals (94%) responded. The number and stage of implementation of the HF-specific and care transition interventions implemented varied across institutions. The mean number of interventions at an advanced stage of implementation (i.e., implemented for ≥ 75% of HF patients on the cardiology service or on all services) was 10.9 (standard deviation = 4.3). Overall, predischarge interventions were more common than bridging or postdischarge interventions. There was variability in the personnel involved in multidisciplinary rounds and in the processes/content of follow-up telephone calls. CONCLUSIONS: Teaching hospitals have implemented a wide range of interventions aimed at reducing hospital readmissions, but there is substantial variability in the types, stages, and structure of their interventions. This heterogeneity highlights the need for collaborative efforts to improve understanding of intervention effectiveness.


Subject(s)
Heart Failure/therapy , Hospitals, Teaching , Patient Readmission , Patient Transfer , Quality Improvement , Continuity of Patient Care , Humans , Surveys and Questionnaires
5.
Health Care Manag (Frederick) ; 32(3): 212-26, 2013.
Article in English | MEDLINE | ID: mdl-23903937

ABSTRACT

There has been an increasing emphasis on health care efficiency and costs and on improving quality in health care settings such as hospitals or clinics. However, there has not been sufficient work on methods of improving access and customer service times in health care settings. The study develops a framework for improving access and customer service time for health care settings. In the framework, the operational concept of the bottleneck is synthesized with queuing theory to improve access and reduce customer service times without reduction in clinical quality. The framework is applied at the Ronald Reagan UCLA Medical Center to determine the drivers for access and customer service times and then provides guidelines on how to improve these drivers. Validation using simulation techniques shows significant potential for reducing customer service times and increasing access at this institution. Finally, the study provides several practice implications that could be used to improve access and customer service times without reduction in clinical quality across a range of health care settings from large hospitals to small community clinics.
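The queuing-theory component of such a framework can be illustrated with the textbook M/M/1 single-server model; this is a generic sketch with invented rates, since the abstract does not specify the study's actual model:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 queue metrics; stable only when utilization < 1."""
    rho = arrival_rate / service_rate  # utilization of the station
    if rho >= 1:
        raise ValueError("arrivals exceed capacity: queue grows without bound")
    time_in_system = 1.0 / (service_rate - arrival_rate)  # W: wait + service
    number_in_system = rho / (1.0 - rho)                  # L = lambda * W (Little's law)
    return rho, time_in_system, number_in_system

# The station with the highest utilization is the bottleneck; adding
# capacity there reduces customer service time the most.
rho, W, L = mm1_metrics(arrival_rate=8.0, service_rate=10.0)
print(rho, W)  # 0.8 0.5  (L is approximately 4 customers in system)
```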


Subject(s)
Academic Medical Centers/organization & administration , Health Services Accessibility/organization & administration , Quality Improvement/organization & administration , Academic Medical Centers/standards , Efficiency, Organizational , Health Services Accessibility/standards , Hospital Bed Capacity, 300 to 499 , Hospitals, University/organization & administration , Hospitals, University/standards , Humans , Laboratories, Hospital/organization & administration , Laboratories, Hospital/standards , Los Angeles , Models, Organizational , Pharmacy Service, Hospital/organization & administration , Pharmacy Service, Hospital/standards , Quality Improvement/standards , Quality of Health Care/organization & administration , Quality of Health Care/standards , Surgery Department, Hospital/organization & administration , Surgery Department, Hospital/standards , Waiting Lists
6.
J Urban Health ; 89(5): 828-47, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22566148

ABSTRACT

Geographic variation has been of interest to both health planners and social epidemiologists. However, while the major focus of planners has been variation in health care spending, social epidemiologists have focused on health; and while social epidemiologists have observed strong associations between poor health and poverty, planners have concluded that income is not an important determinant of variation in spending. These different conclusions stem, at least in part, from differences in approach. Health planners have generally studied variation among large regions, such as states, counties, or hospital referral regions (HRRs), while epidemiologists have tended to study local areas, such as ZIP codes and census tracts. To better understand the basis for geographic variation in hospital utilization, we drew upon both approaches. Counties and HRRs were disaggregated into their constituent ZIP codes and census tracts, and the interrelationships between income, disability, and hospital utilization were examined at both the regional and local levels, using statistical and geomapping tools. Our studies centered on the Milwaukee and Los Angeles HRRs, where per capita health care utilization has been greater than elsewhere in their states. We compared Milwaukee to other HRRs in Wisconsin, and Los Angeles to the other populous counties of California and to a region in California of comparable size and diversity, stretching from San Francisco to Sacramento (termed "San-Framento"). When studied at the ZIP code level, we found steep, curvilinear relationships between lower income and both increased hospital utilization and increasing percentages of individuals reporting disabilities. These associations were also evident on geomaps. They were strongest among populations of working-age adults but weaker among seniors, for whom income proved to be a poor proxy for poverty and whose residential locations deviated from the major underlying income patterns. Among working-age adults, virtually all of the excess utilization in Milwaukee was attributable to very high utilization in Milwaukee's segregated "poverty corridor." Similarly, the greater rate of hospital use in Los Angeles than in San-Framento could be explained by proportionately more low-income ZIP codes in Los Angeles and fewer in San-Framento. Indeed, when only high-income ZIP codes were assessed, there was little variation in hospital utilization among California's 18 most populous counties. We estimated that had utilization within each region been at the rate of its high-income ZIP codes, overall utilization would have been 35% less among working-age adults and 20% less among seniors. These studies reveal the importance of disaggregating large geographic units into their constituent ZIP codes in order to understand variation in health care utilization among them. They demonstrate the strong association between low ZIP code income and both higher percentages of disability and greater hospital utilization. And they suggest that, given the large contribution of the poorest neighborhoods to aggregate utilization, it will be difficult to curb the growth of health care spending without addressing the underlying social determinants of health.


Subject(s)
Health Status Disparities , Hospitals/statistics & numerical data , Income/statistics & numerical data , Sociology, Medical , Adolescent , Adult , Age Distribution , Aged , California , Censuses , Geography , Humans , Los Angeles , Middle Aged , Poverty Areas , Urban Health , Wisconsin , Young Adult
7.
Circ Cardiovasc Qual Outcomes ; 2(6): 548-57, 2009 Nov.
Article in English | MEDLINE | ID: mdl-20031892

ABSTRACT

BACKGROUND: Recent studies have found substantial variation in hospital resource use by expired Medicare beneficiaries with chronic illnesses. By analyzing only expired patients, these studies cannot identify differences across hospitals in health outcomes like mortality. This study examines the association between mortality and resource use at the hospital level when all Medicare beneficiaries hospitalized for heart failure are examined. METHODS AND RESULTS: A total of 3999 individuals hospitalized with a principal diagnosis of heart failure at 6 California teaching hospitals between January 1, 2001, and June 30, 2005, were analyzed with multivariate risk-adjustment models for total hospital days, total hospital direct costs, and mortality within 180 days after initial admission ("Looking Forward"). A subset of 1639 individuals who died during the study period were analyzed with multivariate risk-adjustment models for total hospital days and total hospital direct costs within 180 days before death ("Looking Back"). "Looking Forward" risk-adjusted hospital means ranged from 17.0% to 26.0% for mortality, 7.8 to 14.9 days for total hospital days, and 0.66 to 1.30 times the mean value for indexed total direct costs. Spearman rank correlation coefficients were -0.68 between mortality and hospital days, and -0.93 between mortality and indexed total direct costs. "Looking Back" risk-adjusted hospital means ranged from 9.1 to 21.7 days for total hospital days and 0.91 to 1.79 times the mean value for indexed total direct costs. Variation in resource-use site ranks between expired and all individuals was attributable to insignificant differences. CONCLUSIONS: California teaching hospitals that used more resources caring for patients hospitalized for heart failure had lower mortality rates. Focusing only on expired individuals may overlook mortality variation as well as associations between greater resource use and lower mortality. Reporting values without identifying significant differences may lead to the incorrect assumption of true differences.
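The Spearman rank correlations this abstract reports (-0.68 and -0.93) are computed from ranks rather than raw values. A minimal sketch of the statistic, using toy data rather than the study's hospital-level values:

```python
def spearman_rho(x, y):
    """Spearman rank correlation for lists without ties (illustrative only)."""
    def ranks(values):
        order = sorted(range(len(values)), key=values.__getitem__)
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n * n - 1))

# A perfectly inverse ranking gives rho = -1, the direction of the pattern
# reported here: hospitals using more resources ranked lower on mortality.
print(spearman_rho([1, 2, 3, 4], [4, 3, 2, 1]))  # -1.0
```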


Subject(s)
Heart Failure/economics , Heart Failure/mortality , Hospital Costs/statistics & numerical data , Hospital Mortality , Outcome Assessment, Health Care , Aged , Aged, 80 and over , California/epidemiology , Cohort Studies , Female , Hospitals, Teaching , Humans , Length of Stay/economics , Length of Stay/statistics & numerical data , Male
8.
Arch Surg ; 144(9): 859-64, 2009 Sep.
Article in English | MEDLINE | ID: mdl-19797112

ABSTRACT

OBJECTIVE: To identify tools to aid the creation of disaster surge capacity using a model of planned inpatient census reduction prior to relocation of a university hospital. DESIGN: Prospective analysis of hospital operations for 1-week periods beginning 2 weeks (baseline) and 1 week (transition) prior to move day; analysis of regional hospital and emergency department capacity. SETTING: Large metropolitan university teaching hospital. MAIN OUTCOME MEASURES: Hospital census figures and patient outcomes. RESULTS: Census was reduced by 36% from 537 at baseline to 345 on move day, a rate of 18 patients/d (P < .005). Census reduction was greater for surgical services than nonsurgical services (46% vs 30%; P = .02). Daily volume of elective operations also decreased significantly, while the number of emergency operations was unchanged. Hospital admissions were decreased by 42%, and the adjusted discharges per occupied bed were increased by 8% (both P < .05). Inpatient mortality was not affected. Regional capacity to absorb new patients was limited. During a period in which southern California population grew by 8.5%, acute care beds fell by 3.3%, while Los Angeles County emergency departments experienced a 13% diversion rate due to overcrowding. CONCLUSIONS: Local or regional disasters of any size can overwhelm the system's ability to respond. Our strategy produced a surge capacity of 36% without interruption of emergency department and trauma services but required 3 to 4 days for implementation, making it applicable to disasters and mass casualty events with longer lead times. These principles may aid in disaster preparedness and planning.


Subject(s)
Disaster Planning , Patient Transfer/methods , Surge Capacity , California , Civil Defense , Hospitalization , Hospitals, University , Humans , Inpatients , Prospective Studies
9.
J Urol ; 177(2): 632-6, 2007 Feb.
Article in English | MEDLINE | ID: mdl-17222648

ABSTRACT

PURPOSE: We compared the incidence of ureteral complications between the classic (Lich-Gregoir) technique and the recently popularized single stitch (Shanfield) technique in renal transplantation. MATERIALS AND METHODS: The charts of 721 consecutive transplant recipients from May 1999 to July 2002 were retrospectively reviewed. Ureteral and nonureteral complications were reviewed at 3 to 5-year followup. RESULTS: Of the 721 recipients evaluated 713 were included in the study. There were 360 recipients in the Lich-Gregoir group and 353 in the Shanfield group. A significantly higher rate of ureteral complications occurred in the Shanfield group compared to the Lich-Gregoir group (15.6% vs 3.9%, p <0.0001). The Shanfield group consisted of 20 patients with ureteral leakage, 21 with hematuria, 11 with strictures and 3 who had ureteral stones. The Lich-Gregoir group had 8 patients with ureteral leakage, 5 with hematuria and 1 with a stricture. In comparison, urinary tract infections, delayed graft function and rejection rates were not significantly different between the 2 groups (p = 0.76, 0.12 and 0.19, respectively). CONCLUSIONS: In contrast to other reports, the Shanfield group had significantly more ureteral complications. In particular the Shanfield technique may predispose patients to higher rates of hematuria and stone formation. Based on this large series and published meta-analyses we believe that the stented Lich-Gregoir anastomosis is the superior ureteroneocystostomy technique in renal transplantation.


Subject(s)
Cystostomy , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Ureteral Diseases/etiology , Ureterostomy , Adult , Female , Humans , Male , Retrospective Studies , Suture Techniques , Time Factors , Treatment Outcome , Ureteral Diseases/epidemiology
10.
Hum Immunol ; 67(10): 777-86, 2006 Oct.
Article in English | MEDLINE | ID: mdl-17055354

ABSTRACT

A major milestone in transplantation would be the use of biomarkers to monitor rejection. We examined the association between perforin and granzyme-B gene expression detected in the peripheral blood of renal allograft recipients with cellular and antibody-mediated rejection. Furthermore, we judged the appropriateness of assigning negative rejection statuses to persons without a biopsy whose grafts were functioning well clinically. Of the 46 patients who completed the study, recipients with cellular rejection had higher perforin and granzyme-B levels compared with nonrejectors (p = 0.006). Interestingly, recipients with antibody-mediated rejection also had higher perforin and granzyme-B levels compared with nonrejectors (p = 0.04). Patients with high levels of granzyme B had a probability of rejecting that was 26.7 times greater than those patients with low levels of granzyme B. Perforin and granzyme B had sensitivities of 50% and specificities of 95% in predicting rejection (cutoff value = 140). Assigning negative rejection statuses to recipients without a biopsy whose grafts were functioning well did not have a major effect on the direction or significance of covariate values. This study suggests that perforin and granzyme-B gene expressions in peripheral blood are accurate in detecting both cellular and antibody-mediated rejection.
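The sensitivity and specificity figures at the stated cutoff can be illustrated with a short sketch; the expression values and labels below are toy data, not the study's measurements:

```python
def sens_spec(scores, rejected, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff predicts rejection'."""
    tp = sum(1 for s, y in zip(scores, rejected) if s >= cutoff and y)
    fn = sum(1 for s, y in zip(scores, rejected) if s < cutoff and y)
    tn = sum(1 for s, y in zip(scores, rejected) if s < cutoff and not y)
    fp = sum(1 for s, y in zip(scores, rejected) if s >= cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical expression levels (arbitrary units) and rejection labels.
scores   = [150, 200, 100, 90, 145, 80, 70, 60]
rejected = [True, True, True, True, False, False, False, False]
sens, spec = sens_spec(scores, rejected, cutoff=140)
print(sens, spec)  # 0.5 0.75
```

Raising the cutoff trades sensitivity for specificity, which is why the abstract reports both at a single chosen threshold (140) and also cites an ROC curve in its subject headings.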


Subject(s)
Graft Rejection/diagnosis , Granzymes/genetics , Kidney Transplantation/immunology , Membrane Glycoproteins/genetics , Pore Forming Cytotoxic Proteins/genetics , Adult , Antibody Formation/immunology , Biopsy , Female , Gene Expression/genetics , Graft Rejection/genetics , Graft Rejection/immunology , Humans , Immunity, Cellular/immunology , Kaplan-Meier Estimate , Male , Middle Aged , Perforin , Proportional Hazards Models , RNA, Messenger/genetics , RNA, Messenger/metabolism , ROC Curve , Reverse Transcriptase Polymerase Chain Reaction/statistics & numerical data , Sensitivity and Specificity , Time Factors
11.
Am J Transplant ; 5(10): 2514-20, 2005 Oct.
Article in English | MEDLINE | ID: mdl-16162202

ABSTRACT

Despite reports demonstrating the safety of laparoscopic donor nephrectomy (LDN) for pediatric recipients of renal transplants, recent evidence has challenged using LDN for recipients 5 years of age or younger. We retrospectively reviewed the records of all pediatric recipients of living donor renal transplants from September 2000 through August 2004. We compared those who received allografts recovered by LDN (n = 34) with those recovered by open donor nephrectomy (ODN, n = 26). Outcomes of interest included operative complications, postoperative renal function, the incidence of delayed graft function or episodes of acute rejection and long-term graft function. Donor and recipient demographic data were similar for the LDN and ODN groups. Serum creatinine and calculated creatinine clearance were not significantly different between groups both in the early postoperative period and at long-term follow-up (p > 0.142). Rates of delayed graft function and acute rejection did not differ between groups. Among recipients aged 5 years old or younger stratified by donor technique (9 LDN, 5 ODN recipients), no difference was noted in graft outcomes both early and long-term (p > 0.079). At our center, pediatric LDN recipients have graft outcomes comparable to those of ODN recipients. At experienced centers, we recommend continued use of LDN for pediatric recipients of all ages.


Subject(s)
Kidney Transplantation/methods , Laparoscopy/methods , Adolescent , Age Factors , Child , Child, Preschool , Creatinine/blood , Creatinine/urine , Female , Graft Rejection , Graft Survival , Humans , Kidney/pathology , Living Donors , Male , Nephrectomy/methods , Renal Artery , Retrospective Studies , Time Factors , Tissue and Organ Harvesting/methods , Tissue and Organ Procurement , Treatment Outcome
12.
Clin Transpl ; : 137-42, 2002.
Article in English | MEDLINE | ID: mdl-12971443

ABSTRACT

The demand for renal transplantation continues to increase. Combined organ transplantation currently accounts for approximately 10% of the kidney transplants at UCLA. As the demand for renal transplantation has increased, living kidney donation has become more common and achieves excellent results. Thirty-five percent of the 1,307 renal transplants at UCLA during the past 5 years were from living donors. The donor morbidity has been reduced with improvements in postoperative analgesia and laparoscopic nephrectomy techniques. Management of the patients waiting for cadaveric renal transplantation is becoming increasingly complex, since this population now exceeds 1,000 patients and the median waiting time is approaching 5 years. Improved immunosuppressive, antibiotic, and antiviral medications have significantly reduced the rate of acute rejection and serious infections. As long-term graft survival improves, the side effect profiles of newer medications are increasingly important. The one- and 3-year graft survival rates during the past 5 years were 98% and 90% for adult recipients of living donor kidneys and were 91% and 82% for recipients of cadaveric grafts, respectively. The results for pediatric transplants were 100% and 97% for living donor kidneys and 97% and 85% for cadaveric grafts at one and 3 years, respectively. We are pleased with our excellent results and the manner in which our program has responded to changes in the organ transplant environment.


Subject(s)
Kidney Transplantation/statistics & numerical data , Liver Transplantation/statistics & numerical data , Adult , California , Child , Demography , Hospitals, University/statistics & numerical data , Humans , Immunosuppression Therapy/methods , Kidney Transplantation/mortality , Kidney Transplantation/physiology , Liver Transplantation/physiology , Living Donors , Pancreas Transplantation/physiology , Pancreas Transplantation/statistics & numerical data
13.
Clin Transplant ; 1(1): 44-48, 1987.
Article in English | MEDLINE | ID: mdl-21151803

ABSTRACT

One hundred twenty-eight recipients of 131 consecutive, non-matched cadaver renal allografts were treated with cyclosporine and steroids. They have been followed for 4 to 6 yr. Cumulative patient survival at 1 yr was 92.2% and at 6 yr it is 77.8%. Cumulative graft survival at 1 yr was 79.4% and at 6 yr it is 50.0%. After the high-risk 1st yr, the rate of graft loss was even and similar to that reported after the 1st yr for grafts treated with azathioprine and steroids. This indicates that cyclosporine nephrotoxicity has not had an obvious adverse effect on the survival of chronically functioning grafts. The results were better with primary grafting versus retransplantation, but were not significantly influenced by age, diabetes mellitus, or a delayed switch from cyclosporine to azathioprine. We have concluded that cyclosporine-steroid therapy is safe and effective for long-term use after cadaveric renal transplantation.
