2.
AJOB Empir Bioeth ; 13(4): 251-262, 2022.
Article in English | MEDLINE | ID: mdl-35748820

ABSTRACT

BACKGROUND: Institutional review board (IRB) expertise is necessarily limited by maintaining a manageable board size. IRBs are therefore permitted by regulation to rely on outside experts for review. However, little is known about whether, when, why, and how IRBs use outside experts. METHODS: We conducted a national survey of U.S. IRBs to characterize utilization of outside experts. Our study used a descriptive, cross-sectional design to understand how IRBs engage with such experts and to identify areas where outside expertise is most frequently requested. RESULTS: The survey response rate was 18.4%, with 55.4% of respondents reporting that their institution's IRB uses outside experts. Nearly all respondents who reported using outside experts indicated they do so less than once a month, though at least occasionally each year (95%). The most common method of identifying an outside expert was securing a previously known subject matter expert (83.3%). Most frequently, respondents sought consultation for scientific expertise not held by current members (69.6%). Almost all respondents whose IRBs had used outside experts reported an overall positive impact on the IRB review process (91.5%). CONCLUSIONS: Just over half of the IRBs in our sample reported use of outside experts; among them, outside experts were described as helpful, but their use was infrequent overall. Many IRBs report not relying on outside experts at all. This raises important questions about what type of engagement with outside experts should be viewed as optimal to promote the highest quality review. For example, few respondents sought assistance from a Community Advisory Board, which could address expertise gaps in community perspectives. Further exploration is needed to understand how to optimize IRB use of outside experts, including how to recognize when expertise is lacking, what barriers IRBs face in using outside experts, and perspectives on how outside expert review impacts IRB decision-making and review quality.


Subject(s)
Ethics Committees, Research , Research Design , Humans , Cross-Sectional Studies , Surveys and Questionnaires
3.
Ethics Hum Res ; 44(2): 26-32, 2022 Mar.
Article in English | MEDLINE | ID: mdl-35218600

ABSTRACT

Institutional review boards (IRBs) are permitted by regulation to seek assistance from outside experts when reviewing research applications that are beyond the scope of expertise represented in their membership. There is insufficient understanding, however, of when, why, and how IRBs consult with outside experts, as this practice has not been the primary focus of any published literature or empirical study to date. These issues have important implications for IRB quality. The capacity IRBs have to fulfill their mission of protecting research participants without unduly hindering research is influenced by IRBs' access to and use of the right type of expertise to review challenging research ethics, regulatory, and scientific issues. Through a review of the regulations and standards permitting IRBs to draw on the competencies of outside experts and through examination of the needs, strategies, challenges, and concerns related to doing so, we identify critical gaps in the existing literature and set forth an agenda for future empirical research.


Subject(s)
Biomedical Research , Ethics Committees, Research , Ethics, Research , Humans
4.
J Clin Transl Sci ; 5(1): e205, 2021.
Article in English | MEDLINE | ID: mdl-34956653

ABSTRACT

BACKGROUND/OBJECTIVE: Along with the greater research enterprise, Institutional Review Boards (IRBs) had to quickly adapt to the COVID-19 pandemic. IRBs had to review and oversee COVID-related research while navigating strict public health measures and a workforce largely relegated to working from home. Our objectives were to measure adjustments to standard IRB review processes and IRB turnaround time, and to document any novel ethical issues encountered. METHODS: Structured data requests were sent to members of the Consortium to Advance Effective Research Ethics Oversight who direct Human Research Protection Programs (HRPPs). RESULTS: Fourteen of the 32 HRPP director members responded to a questionnaire about their approach to review and oversight during COVID-19. Eleven of the 14 provided summary data on COVID-19-specific protocols, and six of the 11 provided protocol-related documents for our review. All respondents adopted at least one additional COVID-19-specific step in their usual review process. The average turnaround time for convened and expedited IRB reviews was 15 calendar days. In our review of the documents from 194 COVID-19-specific protocols (n = 302 documents), we identified only a single review that raised ethical concerns unique to COVID-19. CONCLUSIONS: Our data provide a snapshot of how HRPPs approached the review of COVID-19-specific protocols at the start of the pandemic in the USA. While not generalizable to all HRPPs, these data indicate that HRPPs can adapt and respond quickly to a pandemic and likely need little novel expertise in the review and oversight of COVID-19-specific protocols.

5.
Ethics Hum Res ; 43(5): 26-35, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34496157

ABSTRACT

Human research protection programs (HRPPs) generate an abundance of data on performance, capacity, and compliance. When used effectively, this information can be instrumental in helping HRPPs meet programmatic and institutional goals, demonstrate growth and success, and improve the HRPP overall. Metrics must be grounded in professional insight so that HRPPs can pair analytics with strategies for future action or improvement. The purpose of this paper is to demonstrate how high-performing HRPPs develop, adopt, and implement a metrics framework that benefits everyday operations and produces real-world results. Through a three-part thematic framework (of insight, data, and action) and by providing case examples and actionable strategies, this article addresses how HRPPs iteratively develop and characterize their metrics, build a metrics framework that leverages both quantitative and qualitative data to validate outcomes, and activate human insight to produce meaningful communication, visualization, and dissemination of data.


Subject(s)
Benchmarking , Communication , Humans