1.
Transfusion; 59(7): 2316-2323, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31106447

ABSTRACT

BACKGROUND: Risk-adjusted benchmarking could be useful to compare blood utilization between hospitals or individual groups, such as physicians, while accounting for differences in patient complexity. The aim of this study was to analyze the association of red blood cell (RBC) use and diagnosis-related group (DRG) weights across all inpatient hospital stays to determine the suitability of using DRGs for between-hospital risk-adjusted benchmarking. Specific hierarchical organizational units (surgical vs. nonsurgical patients, departments, and physicians) were also evaluated. STUDY DESIGN AND METHODS: We studied blood use among all adult inpatients, and within organizational units, over 4 years (May 2014 to March 2018) at an academic center. Number of RBCs transfused, all patient refined (APR)-DRGs, and other variables were captured over entire hospital stays. We used multilevel generalized linear modeling (zero-inflated negative binomial) to study the relationship between RBC utilization and APR-DRG. RESULTS: A total of 97,955 hospital stays were evaluated and the median APR-DRG weight was 1.2. The association of RBCs transfused and APR-DRG weight was statistically significant at all hierarchical levels (incidence rate ratio = 1.22; p < 0.001). The impact of APR-DRG on blood use, measured by the incidence rate ratio, demonstrated an association at the all-patient and surgical levels, at several department and physician levels, but not at the medical patient level. The relationship between RBCs transfused and APR-DRG varied across organizational units. CONCLUSION: Number of RBCs transfused was associated with APR-DRG weight at multiple hierarchical levels and could be used for risk-adjusted benchmarking in those contexts. The relationship between RBC use and APR-DRG varied across organizational units.
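The incidence rate ratio reported above comes from a count model with a log link (here, a zero-inflated negative binomial). A minimal sketch of how such an IRR is read, using an illustrative coefficient chosen to match the reported value of 1.22 (not the authors' fitted model):

```python
import math

# Illustrative coefficient for APR-DRG weight; in a fitted count model
# with a log link, exp(coefficient) is the incidence rate ratio (IRR).
beta = math.log(1.22)
irr = math.exp(beta)  # each one-unit increase in APR-DRG weight
                      # multiplies expected RBC units by this factor

def expected_rbc_units(base_rate, drg_weight):
    # Expected transfusion count scales multiplicatively with weight.
    return base_rate * irr ** drg_weight
```

This multiplicative reading is what makes the IRR usable for risk adjustment: a hospital's expected RBC use can be scaled by its case-mix weight before comparison.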


Subject(s)
Benchmarking , Blood Transfusion , Diagnosis-Related Groups , Inpatients , Length of Stay , Adult , Female , Humans , Male , Middle Aged , Risk Assessment
2.
JAMA; 316(10): 1061-72, 2016 Sep 13.
Article in English | MEDLINE | ID: mdl-27623461

ABSTRACT

IMPORTANCE: Transformation of US health care from volume to value requires meaningful quantification of costs and outcomes at the level of individual patients. OBJECTIVE: To measure the association of a value-driven outcomes tool that allocates costs of care and quality measures to individual patient encounters with cost reduction and health outcome optimization. DESIGN, SETTING, AND PARTICIPANTS: Uncontrolled, pre-post, longitudinal, observational study measuring quality and outcomes relative to cost from 2012 to 2016 at University of Utah Health Care. Clinical improvement projects included total hip and knee joint replacement, hospitalist laboratory utilization, and management of sepsis. EXPOSURES: Physicians were given access to a tool with information about outcomes, costs (not charges), and variation and partnered with process improvement experts. MAIN OUTCOMES AND MEASURES: Total and component inpatient and outpatient direct costs across departments; cost variability for Medicare severity diagnosis related groups measured as coefficient of variation (CV); and care costs and composite quality indexes. RESULTS: From July 1, 2014, to June 30, 2015, there were 1.7 million total patient visits, including 34,000 inpatient discharges. Professional costs accounted for 24.3% of total costs for inpatient episodes ($114.4 million of $470.4 million) and 41.9% of total costs for outpatient visits ($231.7 million of $553.1 million). For Medicare severity diagnosis related groups with the highest total direct costs, cost variability was highest for postoperative infection (CV = 1.71) and sepsis (CV = 1.37) and among the lowest for organ transplantation (CV ≤ 0.43). For total joint replacement, a composite quality index was 54% at baseline (n = 233 encounters) and 80% 1 year into the implementation (n = 188 encounters) (absolute change, 26%; 95% CI, 18%-35%; P < .001). Compared with the baseline year, mean direct costs were 7% lower in the implementation year (95% CI, 3%-11%; P < .001) and 11% lower in the postimplementation year (95% CI, 7%-14%; P < .001). The hospitalist laboratory testing mean cost per day was $138 (median [IQR], $113 [$79-160]; n = 2034 encounters) at baseline and $123 (median [IQR], $99 [$66-147]; n = 4276 encounters) in the evaluation period (mean difference, -$15; 95% CI, -$19 to -$11; P < .001), with no significant change in mean length of stay. For a pilot sepsis intervention, the mean time to anti-infective administration following fulfillment of systemic inflammatory response syndrome criteria in patients with infection was 7.8 hours (median [IQR], 3.4 [0.8-7.8] hours; n = 29 encounters) at baseline and 3.6 hours (median [IQR], 2.2 [1.0-4.5] hours; n = 76 encounters) in the evaluation period (mean difference, -4.1 hours; 95% CI, -9.9 to -1.0 hours; P = .02). CONCLUSIONS AND RELEVANCE: Implementation of a multifaceted value-driven outcomes tool to identify high variability in costs and outcomes in a large single health care system was associated with reduced costs and improved quality for 3 selected clinical projects. There may be benefit for individual physicians to understand actual care costs (not charges) and outcomes achieved for individual patients with defined clinical conditions.
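The coefficient of variation used above to compare cost variability across diagnosis related groups is simply the standard deviation divided by the mean. A minimal sketch with made-up per-encounter costs (not the study's data):

```python
import statistics

# Hypothetical per-encounter direct costs for one DRG.
costs = [12_000, 48_000, 95_000, 22_000, 160_000]

# CV = population standard deviation / mean; a unitless measure, so
# DRGs with very different average costs can be compared directly.
cv = statistics.pstdev(costs) / statistics.mean(costs)
```

Because the CV is scale-free, a CV of 1.71 for postoperative infection versus ≤ 0.43 for transplantation signals far more per-encounter cost dispersion regardless of each DRG's absolute cost level.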


Subject(s)
Arthroplasty, Replacement/economics , Arthroplasty, Replacement/standards , Decision Support Techniques , Health Care Costs/statistics & numerical data , Outcome Assessment, Health Care , Quality Improvement , Sepsis/economics , Access to Information , Cost Control , Female , Humans , Length of Stay , Longitudinal Studies , Male , Medicare , Physicians , Sepsis/therapy , Severity of Illness Index , Surgical Wound Infection , United States
3.
J Am Med Inform Assoc; 22(1): 223-35, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25324556

ABSTRACT

OBJECTIVE: To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). MATERIALS AND METHODS: In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. RESULTS: A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. CONCLUSIONS: The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value.
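The allocation rule described above (unit labor costs split across encounters in proportion to hours spent in the unit) can be sketched as follows; the function and names are hypothetical, not the VDO implementation:

```python
# Split a hospital unit's total labor cost across patient encounters
# in proportion to the hours each encounter spent in the unit.
def allocate_unit_cost(total_labor_cost, hours_by_encounter):
    total_hours = sum(hours_by_encounter.values())
    return {enc: total_labor_cost * hours / total_hours
            for enc, hours in hours_by_encounter.items()}

# Example: $10,000 of unit labor cost, two encounters with 10 and 30 hours.
shares = allocate_unit_cost(10_000.0, {"enc1": 10, "enc2": 30})
```

The same proportional pattern applies to the other drivers the abstract mentions: medication costs by actual acquisition cost and utilization, radiology costs by study minutes.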


Subject(s)
Health Care Costs , Software , Cost-Benefit Analysis , Humans , Treatment Outcome , Utah
4.
BMC Med Res Methodol; 11: 151, 2011 Nov 18.
Article in English | MEDLINE | ID: mdl-22099213

ABSTRACT

BACKGROUND: Retrospective research requires longitudinal data, and repositories derived from electronic health records (EHR) can be sources of such data. With Health Information Technology for Economic and Clinical Health (HITECH) Act meaningful use provisions, many institutions are expected to adopt EHRs, but may be left with large amounts of financial and historical clinical data, which can differ significantly from data obtained from newer systems due to the lack, or inconsistent use, of controlled medical terminologies (CMT) in older systems. We examined different approaches for semantic enrichment of financial data with CMT, and for integration of clinical data from disparate historical and current sources for research. METHODS: Snapshots of financial data from 1999, 2004 and 2009 were mapped automatically to the current inpatient pharmacy catalog, and enriched with RxNorm. Administrative metadata from financial and dispensing systems, RxNorm and two commercial pharmacy vocabularies were used to integrate data from current and historical inpatient pharmacy modules, and the outpatient EHR. Data integration approaches were compared using percentages of automated matches, and effects on the cohort size of a retrospective study. RESULTS: During 1999-2009, 71.52%-90.08% of items in use from the financial catalog were enriched using RxNorm; 64.95%-70.37% of items in use from the historical inpatient system were integrated using RxNorm, 85.96%-91.67% using a commercial vocabulary, 87.19%-94.23% using financial metadata, and 77.20%-94.68% using dispensing metadata. During 1999-2009, 48.01%-30.72% of items in use from the outpatient catalog were integrated using RxNorm, and 79.27%-48.60% using a commercial vocabulary. In a cohort of 16,304 inpatients obtained from clinical systems, 4,172 (25.58%) were found exclusively through integration of historical clinical data, while 15,978 (98%) could be identified using semantically enriched financial data. CONCLUSIONS: Data integration approaches using metadata from financial/dispensing systems and pharmacy vocabularies were comparable. Given the current state of EHR adoption, semantic enrichment of financial data and integration of historical clinical data would allow the repurposing of these data for research. With the push for HITECH meaningful use, institutions that are transitioning to newer EHRs will be able to use their older financial and clinical data for research using these methods.
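The automated-match percentages reported above measure what share of in-use catalog items map to a reference vocabulary. A minimal sketch with hypothetical data and a deliberately simple normalization step (real RxNorm mapping is far more involved):

```python
# Fraction of catalog items that match a reference vocabulary after
# trivial normalization (trim whitespace, lowercase).
def match_rate(catalog_items, vocabulary):
    normalized = {entry.strip().lower() for entry in vocabulary}
    matched = sum(1 for item in catalog_items
                  if item.strip().lower() in normalized)
    return matched / len(catalog_items)

# One of two hypothetical catalog items matches the vocabulary.
rate = match_rate(["Aspirin 81 MG", "FooDrug 5 mg"], ["aspirin 81 mg"])
```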


Subject(s)
Drug Therapy/economics , Drug Therapy/methods , Electronic Health Records/statistics & numerical data , Medical Informatics/methods , Biomedical Research/economics , Biomedical Research/methods , Humans , Longitudinal Studies , Retrospective Studies , Systems Integration , Vocabulary, Controlled
5.
ASAIO J; 57(3): 206-12, 2011.
Article in English | MEDLINE | ID: mdl-21389849

ABSTRACT

Prediction of kidney transplant outcome represents an important and clinically relevant problem. Although several prediction models have been proposed based on large, national collections of data, their utility at the local level (where local data distributions may differ from national data) remains unclear. We conducted a comparative analysis that modeled the outcome data of transplant recipients in the national US Renal Data System (USRDS) against a representative local transplant dataset at the University of Utah Health Sciences Center, a regional transplant center. The performance of an identical set of prediction models was evaluated on both national and local data to assess how well national models reflect local outcomes. Compared with the USRDS dataset, several key characteristics of the local dataset differed significantly (e.g., a much higher local graft survival rate; a much higher local percentage of white donors and recipients; and a much higher proportion of living donors). This was reflected in statistically significant differences in model performance. The area under the receiver operating characteristic curve values of the models predicting 1-, 3-, 5-, 7-, and 10-year graft survival on the USRDS data were 0.59, 0.63, 0.76, 0.91, and 0.97, respectively. In contrast, on the local dataset, these values were 0.54, 0.58, 0.58, 0.61, and 0.70, respectively. Prediction models trained on a national set of data from the USRDS performed better in the national dataset than in the local dataset. This might be due to the differences in data characteristics between the two datasets, suggesting that the wholesale adoption of a prediction model developed on a large national dataset to guide local clinical practice should be done with caution.
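The area under the ROC curve compared above has a rank interpretation: the probability that a randomly chosen failed graft receives a higher risk score than a randomly chosen surviving one. A minimal self-contained sketch of that statistic (not the study's code, which would use a standard statistics package):

```python
# Rank-based AUC: fraction of (positive, negative) pairs where the
# positive example scores higher; ties count as half.
def auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

example = auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # 3 of 4 pairs ordered correctly
```

An AUC of 0.5 is chance-level ranking, which is why the local values of 0.54-0.58 for short-term survival indicate the national models transferred poorly.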


Subject(s)
Kidney Transplantation/statistics & numerical data , Adult , Databases, Factual , Female , Graft Survival , Humans , Living Donors/statistics & numerical data , Male , Middle Aged , Models, Statistical , ROC Curve , Time Factors , Treatment Outcome , United States , Utah , Young Adult
6.
Nephrol Dial Transplant; 24(8): 2575-83, 2009 Aug.
Article in English | MEDLINE | ID: mdl-19286691

ABSTRACT

BACKGROUND: The genetic determinants of acute kidney transplant rejection (AR) are not well studied, and familial aggregation has never been demonstrated. The goal of this retrospective case-control study was to exploit the unique nature of the Utah Population Database (UPDB) to evaluate whether AR or rejection-free survival aggregates in families. METHODS: We identified 891 recipients with genealogy data in the UPDB with at least one year of follow-up, of whom 145 (16.1%) had AR and 77 had biopsy-proven rejection graded ≥1A. We compared the genealogical index of familiality (GIF) in cases and controls (i.e., recipients with random assignment of rejection status). RESULTS: We did not find evidence for familial clustering of AR in the entire patient population or in the subgroup with early rejection (n = 52). When the subgroup of recipients with rejection grade ≥1A (n = 77) was analysed separately, we observed increased familial clustering (GIF = 3.02) compared to controls (GIF = 1.96), although the p-value did not reach statistical significance (p = 0.17). Furthermore, we observed an increase in familial clustering in recipients who had a rejection-free course (GIF = 2.45) as compared to controls (GIF = 2.08, p = 0.04). When all recipients were compared to non-transplant controls, they demonstrated a much greater degree of familiality (GIF = 2.03 versus GIF = 0.63, p < 0.001). CONCLUSIONS: There is a familial component to a rejection-free transplant course and a trend toward familial aggregation in recipients with AR grade 1A or higher. If a genetic association study is performed, the families in Utah identified in the current study can be targeted to increase the power of the test.


Subject(s)
Genetic Predisposition to Disease , Graft Rejection/genetics , Kidney Failure, Chronic/genetics , Adult , Case-Control Studies , Female , Follow-Up Studies , Humans , Kidney Failure, Chronic/epidemiology , Kidney Transplantation , Male , Pedigree , Prognosis , Retrospective Studies , Survival Rate
7.
AMIA Annu Symp Proc; 1114, 2008 Nov 06.
Article in English | MEDLINE | ID: mdl-18999122

ABSTRACT

The University of Utah has initiated the design and implementation of an innovative data and knowledge management informatics platform for the new Center for Clinical and Translational Science. The main component of the platform is described, along with preliminary project objectives and current status.


Subject(s)
Biomedical Research/organization & administration , Database Management Systems , Databases, Factual , Information Dissemination/methods , Medical Informatics/organization & administration , State Government , Translational Research, Biomedical/organization & administration , Utah
8.
Nephrol Nurs J; 34(4): 403-14; quiz 415, 2007.
Article in English | MEDLINE | ID: mdl-17891909

ABSTRACT

BACKGROUND/AIMS: A systematic review was undertaken to critically appraise the current knowledge base of sodium profiling in hemodialysis. Between 15% and 80% of patients on hemodialysis experience symptoms of dialysis intolerance in every dialysis session. The purpose of this review was to identify whether sodium profiling is an effective intervention for removing or reducing these untoward effects. METHODS: A literature search was undertaken using Medline and Embase. Inclusion criteria were primary research or controlled clinical trials published between January 1990 and June 2006, studies in the chronic dialysis setting, and studies that identified sodium profiling as the intervention in hemodialysis or hemodiafiltration. Excluded were articles that could not establish whether sodium profiling was the intervention responsible for the outcome, articles on hemofiltration, review articles, and research pertaining to the acute setting. Thirteen articles met the inclusion criteria and were included in the final review. RESULTS: A number of flaws were identified in methodological adequacy and consistency of findings. It was not possible to determine whether positive effects outweighed negative effects. In the majority of studies, there was a lack of follow-up and an inability to determine long-term outcomes of patients who received sodium profiling. CONCLUSION: This evaluative review could not provide evidence to support the clinical use of sodium profiling in symptomatic patients on hemodialysis. A theoretical basis for sodium profiling remains; however, further studies with consistent methodology are needed, examining not only reduction in morbidity but also effects on quality of life, long-term outcomes, and mortality.


Subject(s)
Renal Dialysis/adverse effects , Sodium/analysis , Education, Continuing , Humans