Planning and Reporting Effective Web-Based RAND/UCLA Appropriateness Method Panels: Literature Review and Preliminary Recommendations.
Sparks, Jordan B; Klamerus, Mandi L; Caverly, Tanner J; Skurla, Sarah E; Hofer, Timothy P; Kerr, Eve A; Bernstein, Steven J; Damschroder, Laura J.
  • Sparks JB; VA Center for Clinical Management Research, Ann Arbor, MI, United States.
  • Klamerus ML; VA Center for Clinical Management Research, Ann Arbor, MI, United States.
  • Caverly TJ; VA Center for Clinical Management Research, Ann Arbor, MI, United States.
  • Skurla SE; Department of Internal Medicine, University of Michigan, Ann Arbor, MI, United States.
  • Hofer TP; Institute for Healthcare Policy and Innovation, University of Michigan, Ann Arbor, MI, United States.
  • Kerr EA; Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, United States.
  • Bernstein SJ; VA Center for Clinical Management Research, Ann Arbor, MI, United States.
  • Damschroder LJ; VA Center for Clinical Management Research, Ann Arbor, MI, United States.
J Med Internet Res; 24(8): e33898, 2022 Aug 26.
Article in English | MEDLINE | ID: covidwho-2009803
ABSTRACT

BACKGROUND:

The RAND/UCLA Appropriateness Method (RAM), a variant of the Delphi Method, was developed to synthesize existing evidence and elicit the clinical judgement of medical experts on the appropriate treatment of specific clinical presentations. Technological advances now allow researchers to conduct expert panels on the internet, offering a cost-effective and convenient alternative to the traditional RAM. For example, the Department of Veterans Affairs recently used a web-based RAM to validate clinical recommendations for de-intensifying routine primary care services. A substantial literature describes and tests various aspects of the traditional RAM in health research, yet comparatively little is known about how researchers implement web-based expert panels.

OBJECTIVE:

The objectives of this study are twofold: (1) to understand how the web-based RAM process is currently used and reported in health research and (2) to provide preliminary reporting guidance for researchers to improve the transparency and reproducibility of reporting practices.

METHODS:

The PubMed database was searched to identify studies published between 2009 and 2019 that used a web-based RAM to measure the appropriateness of medical care. Methodological data from each article were abstracted. The following categories were assessed: composition and characteristics of the web-based expert panels, characteristics of panel procedures, results, and panel satisfaction and engagement.

RESULTS:

Of the 12 studies meeting the eligibility criteria and reviewed, only 42% (5/12) implemented the full RAM process, with the remaining studies opting for a partial approach. Among those studies reporting, the median number of participants at first rating was 42. While 92% (11/12) of studies involved clinicians, 50% (6/12) involved multiple stakeholder types. Our review revealed that the studies failed to report on critical aspects of the RAM process. For example, no studies reported response rates with the denominator of previous rounds, 42% (5/12) did not provide panelists with feedback between rating periods, 50% (6/12) either did not have or did not report on the panel discussion period, and 25% (3/12) did not report on quality measures to assess aspects of the panel process (eg, satisfaction with the process).

CONCLUSIONS:

Conducting web-based RAM panels will continue to be an appealing option for researchers seeking a safe, efficient, and democratic process of expert agreement. Our literature review uncovered inconsistent reporting frameworks and insufficient detail to evaluate study outcomes. We provide preliminary recommendations for reporting that are both timely and important for producing replicable, high-quality findings. The need for reporting standards is especially critical given that more people may prefer to participate in web-based rather than in-person panels due to the ongoing COVID-19 pandemic.

Full text: Available
Collection: International databases
Database: MEDLINE
Main subject: Research Design / Internet / Expert Testimony / Pandemics / COVID-19
Type of study: Experimental Studies / Prognostic study / Qualitative research / Randomized controlled trials / Reviews
Topics: Variants
Limits: Humans
Language: English
Journal: J Med Internet Res
Journal subject: Medical Informatics
Year: 2022
Document Type: Article
Affiliation country: United States
