1.
Implement Sci; 19(1): 50, 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39010153

ABSTRACT

BACKGROUND: No criteria exist specifically for evaluating the quality of implementation research and for recommending to practitioners the implementation strategies most likely to have impact. We describe the development and application of the Best Practices Tool, a set of criteria to evaluate the evidence supporting HIV-specific implementation strategies.

METHODS: We developed the Best Practices Tool from 2022 to 2023 in three phases. (1) We developed a draft tool and criteria based on a literature review and key informant interviews. Interview participants, purposively selected and recruited by email, represented a mix of expertise in HIV service delivery, quality improvement, and implementation science. (2) The tool was then informed and revised through two e-Delphi rounds using a survey delivered online through Qualtrics. The first- and second-round Delphi surveys consisted of 71 and 52 open- and closed-ended questions, respectively, asking participants to evaluate, confirm, and make suggestions on different aspects of the tool. After each survey round, data were analyzed and synthesized as appropriate, and the tool and criteria were revised. (3) We then applied the tool to a set of research studies assessing implementation strategies designed to promote the adoption and uptake of evidence-based HIV interventions, to assess whether the tool and criteria could be applied reliably.

RESULTS: Our initial literature review yielded existing tools for evaluating intervention-level evidence. For a strategy-level tool, additions emerged from the interviews, for example, a need to consider the context and specification of strategies. Revisions after both Delphi rounds resulted in the confirmation of five evaluation domains (research design, implementation outcomes, limitations and rigor, strategy specification, and equity) and four evidence levels (best, promising, more evidence needed, and harmful). For most domains, criteria were specified at each evidence level. After an initial pilot round to develop an application process and provide training, we achieved 98% reliability when applying the criteria to 18 implementation strategies.

CONCLUSIONS: We developed a tool to evaluate the evidence supporting implementation strategies for HIV services. Although specific to HIV in the US, the tool is adaptable for evaluating strategies in other health areas.


Subject(s)
Delphi Technique, HIV Infections, Implementation Science, Humans, HIV Infections/therapy, United States, Quality Improvement/organization & administration
2.
Res Sq; 2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38464091

ABSTRACT

Background: No criteria exist specifically for evaluating the quality of implementation research and for recommending to practitioners the implementation strategies most likely to have impact. We describe the development and application of the Best Practices Rubric, a set of criteria to evaluate the evidence supporting implementation strategies, in the context of HIV.

Methods: We developed the Best Practices Rubric from 2022 to 2023 in three phases. (1) We purposively selected and recruited by email participants representing a mix of expertise in HIV service delivery, quality improvement, and implementation science, and we developed a draft rubric and criteria based on a literature review and key informant interviews. (2) The rubric was then informed and revised through two e-Delphi rounds using a survey delivered online through Qualtrics. The first- and second-round Delphi surveys consisted of 71 and 52 open- and closed-ended questions, respectively, asking participants to evaluate, confirm, and make suggestions on different aspects of the rubric. After each survey round, data were analyzed and synthesized as appropriate, and the rubric and criteria were revised. (3) We then applied the rubric to a set of research studies assessing 18 implementation strategies designed to promote the adoption and uptake of pre-exposure prophylaxis, an HIV prevention medication, to assess whether the rubric and criteria could be applied reliably.

Results: Our initial literature review yielded existing rubrics and criteria for evaluating intervention-level evidence. For a strategy-level rubric, additions emerged from the interviews, for example, a need to consider the context and specification of strategies. Revisions after both Delphi rounds resulted in the confirmation of five evaluation domains (research design, implementation outcomes, limitations and rigor, strategy specification, and equity) and four evidence levels (best practice, promising practice, more evidence needed, and harmful practice). For most domains, criteria were specified at each evidence level. After an initial pilot round to develop an application process and provide training, we achieved 98% reliability when applying the criteria to the 18 implementation strategies.

Conclusions: We developed a rubric to evaluate the evidence supporting implementation strategies for HIV services. Although the rubric is specific to HIV, it is adaptable for evaluating strategies in other health areas.
