1.
J Manag Care Spec Pharm ; 27(1): 95-104, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33377442

ABSTRACT

Results of randomized controlled trials (RCTs) provide valuable comparisons of 2 or more interventions to inform health care decision making; however, the number of comparisons needed far exceeds the time and resources available to conduct them. Moreover, RCTs have limited generalizability. Comparative effectiveness research (CER) using real-world evidence (RWE) can increase generalizability and is important for decision making, but the use of nonrandomized designs makes such studies challenging to evaluate. Several tools are available to assist. In this study, we comparatively characterize 5 tools used to evaluate RWE studies in the context of health care adoption decisions: (1) Good Research for Comparative Effectiveness (GRACE) Checklist, (2) IMI GetReal RWE Navigator (Navigator), (3) Center for Medical Technology Policy (CMTP) RWE Decoder, (4) CER Collaborative tool, and (5) Real World Evidence Assessments and Needs Guidance (REAdi) tool. We describe each and then compare their features along 8 domains: (1) objective/user/context, (2) development/scope, (3) platform/presentation, (4) user design, (5) study-level internal/external validity of evidence, (6) summarizing body of evidence, (7) assisting in decision making, and (8) sharing results/making improvements. Our summary suggests that the GRACE Checklist aids stakeholders in evaluating the quality and applicability of individual CER studies. Navigator is a collection of educational resources to guide demonstration of effectiveness, a guidance tool to support development of medicines, and a directory of authoritative resources for RWE. The CMTP RWE Decoder aids in assessing the relevance and rigor of RWE. The CER Collaborative tool aids in assessing credibility and relevance. The REAdi tool aids in refining the research question, retrieving studies, assessing quality, and grading the body of evidence, and it prompts users with questions to facilitate coverage decisions.
All tools specify a framework, were designed with stakeholder input, assess internal validity, are available online, and are easy to use. They vary in their complexity and comprehensiveness. The RWE Decoder, CER Collaborative tool, and REAdi tool synthesize evidence and were specifically designed to aid formulary decision making. This study adds clarity on what the tools provide so that the user can determine which best fits a given purpose. DISCLOSURES: This work was supported by the Health Tech Fund, which was provided to the University of Washington School of Pharmacy by its Corporate Advisory Board. This consortium of pharmaceutical and biotech companies supports the research program of the University of Washington School of Pharmacy across the competitive space. The sponsors seeded the idea for the project and contributed to study design and improvement. The authors had full control of all content development, manuscript drafting, and submission for publication. The REAdi tool was developed by the authors. Chen, Bansal, Barthold, Carlson, Veenstra, Basu, Devine, Yun, Ta, and Beal were supported by a training grant from the University of Washington-Allergan Fellowship, unrelated to this work. Basu reports personal fees from Salutis Consulting, unrelated to this work. Graff is an employee of the National Pharmaceutical Council, which was a partner in the development of the CER Collaborative and funding partner for the CMTP RWE Decoder and the GRACE Checklist. A previous version of this work was presented as an invited workshop at AMCP Nexus 2018; October 22-25, 2018; Orlando, FL.


Subject(s)
Decision Support Techniques , Drug Compounding/economics , Pharmaceutical Preparations/economics , Comparative Effectiveness Research , Humans
2.
Med Decis Making ; 35(5): 596-607, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25349188

ABSTRACT

OBJECTIVES: To compare model input influence on incremental net monetary benefit (INMB) across 3 uncertainty methods: 1) 1-way sensitivity analysis; 2) probabilistic analysis of covariance (ANCOVA); and 3) expected value of partial perfect information (EVPPI). METHODS: In a preliminary model, we used a published cost-effectiveness model and assumed £20,000 per quality-adjusted life-year (QALY) willingness-to-pay (Case 1: lower decision uncertainty) and £8000/QALY willingness-to-pay (Case 2: higher decision uncertainty). We conducted 1-way sensitivity analysis, ANCOVA (10,000 Monte Carlo draws), and EVPPI for each model input (1000 inner and 1000 outer draws). We ranked inputs by their influence on INMB and compared input ranks across methods within each case using Spearman's rank correlation. We replicated this approach in 3 follow-up models: an additional linear model, a less linear model with uncorrelated inputs, and a less linear model with correlated inputs. RESULTS: In the preliminary model, the lower and higher decision uncertainty cases had the same top 3 influential parameters across uncertainty methods. The 2 most influential inputs contributed 78% and 49% of the variation in outcome based on ANCOVA for the lower and higher decision uncertainty cases, respectively. In the follow-up models, input rank order correlations across uncertainty methods were higher for the linear model than for either of the less linear models. CONCLUSIONS: Evidence across models suggests that influential input ranks agree between 1-way and more advanced uncertainty analyses for relatively linear models with uncorrelated parameters, but agree less for less linear models. Although each method provides unique information, the additional resources needed to generate and communicate advanced analyses should be weighed, especially when outcome decision uncertainty is low. For less linear models or those with correlated inputs, performing and reporting both deterministic and probabilistic uncertainty analyses appears prudent and conservative.
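The rank comparison the abstract describes can be illustrated with a toy sketch. This is not the published cost-effectiveness model: the linear INMB model, its three uncorrelated normal inputs, and all parameter values below are invented for illustration. The sketch ranks inputs first by 1-way sensitivity swing, then by an ANCOVA-style variance share from probabilistic draws, and compares the two rankings with Spearman's rank correlation (for a linear model with uncorrelated inputs, the rankings should agree).

```python
import numpy as np

rng = np.random.default_rng(0)
LAMBDA = 20_000  # willingness-to-pay per QALY (the lower-uncertainty case)

# Hypothetical linear decision model with uncorrelated inputs (illustration only):
# INMB = LAMBDA * (qaly_gain - 0.5 * disutility) - incr_cost
means = np.array([0.6, 5_000.0, 0.10])  # qaly_gain, incr_cost, disutility
sds   = np.array([0.05, 800.0, 0.05])

def inmb(x):
    qaly_gain, incr_cost, disutility = x[..., 0], x[..., 1], x[..., 2]
    return LAMBDA * (qaly_gain - 0.5 * disutility) - incr_cost

# --- 1) One-way sensitivity: swing each input over its 2.5%-97.5% range,
#        holding the others at their means ---
lo, hi = means - 1.96 * sds, means + 1.96 * sds
swing = np.empty(3)
for i in range(3):
    x_lo, x_hi = means.copy(), means.copy()
    x_lo[i], x_hi[i] = lo[i], hi[i]
    swing[i] = abs(inmb(x_hi) - inmb(x_lo))

# --- 2) ANCOVA-style variance decomposition from 10,000 probabilistic draws:
#        for a linear model with independent inputs, each input's share of
#        Var(INMB) equals its squared correlation with INMB ---
draws = rng.normal(means, sds, size=(10_000, 3))
y = inmb(draws)
share = np.array([np.corrcoef(draws[:, i], y)[0, 1] ** 2 for i in range(3)])

# --- 3) Spearman rank correlation between the two importance rankings ---
def spearman(a, b):
    ra, rb = np.argsort(np.argsort(a)), np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

rho = spearman(swing, share)
print("one-way swing:", np.round(swing, 1))
print("ANCOVA share :", np.round(share, 3))
print("Spearman rho :", rho)
```

Because this toy model is linear in its inputs, both methods order the inputs identically (rho = 1); the abstract's point is that this agreement degrades as the model becomes less linear or the inputs become correlated.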


Subject(s)
Analysis of Variance , Cost-Benefit Analysis/methods , Decision Support Techniques , HIV Infections/drug therapy , HIV Infections/economics , Humans , Linear Models , Monte Carlo Method , Probability , Quality-Adjusted Life Years , Uncertainty
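The EVPPI method in the record above is typically computed with a nested (inner/outer) Monte Carlo loop: the outer loop draws the input of interest, the inner loop integrates out the remaining inputs, and the gain from resolving that input's uncertainty is the difference in expected value of the best decision. A minimal sketch on an invented linear INMB model (again, not the published one; all names and values are assumptions), using the higher-uncertainty £8000/QALY case and the abstract's 1000 inner and 1000 outer draws:

```python
import numpy as np

rng = np.random.default_rng(1)
LAM = 8_000                       # willingness-to-pay per QALY (higher-uncertainty case)
N_OUTER, N_INNER = 1_000, 1_000   # outer/inner Monte Carlo draws

# Invented linear model: INMB = LAM * qaly_gain - incremental_cost
Q_MEAN, Q_SD = 0.6, 0.2           # uncertain QALY gain (input of interest)
C_MEAN, C_SD = 5_000.0, 800.0     # uncertain incremental cost (remaining input)

# Value of the current-information decision: adopt only if expected INMB > 0
joint = LAM * rng.normal(Q_MEAN, Q_SD, 100_000) - rng.normal(C_MEAN, C_SD, 100_000)
baseline = max(joint.mean(), 0.0)

# Outer loop: draw the input of interest; inner loop: integrate out the rest
outer_vals = np.empty(N_OUTER)
for j, q in enumerate(rng.normal(Q_MEAN, Q_SD, N_OUTER)):
    inner_inmb = LAM * q - rng.normal(C_MEAN, C_SD, N_INNER)
    outer_vals[j] = max(inner_inmb.mean(), 0.0)  # best decision if q were known

evppi_q = outer_vals.mean() - baseline
print(f"EVPPI for the QALY-gain input: {evppi_q:.0f} (monetary units)")
```

With these invented values the expected INMB is slightly negative, so the baseline decision is not to adopt, yet knowing the QALY gain in advance would change the decision often enough to make its EVPPI clearly positive; this is the sense in which EVPPI measures an input's influence on decision uncertainty rather than merely on outcome variance.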