Results 1 - 20 of 37
1.
Am J Transl Res ; 15(9): 5707-5714, 2023.
Article in English | MEDLINE | ID: mdl-37854232

ABSTRACT

OBJECTIVES: Institutions conducting research involving human subjects establish institutional review boards (IRBs) and/or human research protection programs to protect human research subjects. Our objectives were to develop performance metrics to measure human research subject protections and to assess how well IRBs and human research protection programs are protecting human research subjects. METHODS: A set of five performance metrics for measuring human research subject protections was developed, and data were collected through annual audits of informed consent documents and human research protocols at 107 Department of Veterans Affairs research facilities from 2010 through 2021. RESULTS: The five proposed performance metrics were: (1) local adverse events that were serious, unanticipated, and related or probably related to research, including those resulting in hospitalization or death; (2) failure to obtain required informed consent; (3) failure to obtain required Health Insurance Portability and Accountability Act authorization; (4) non-exempt research conducted without IRB approval; and (5) research activities continued during a lapse in IRB continuing review. Analysis of these performance metric data from 2010 through 2021 revealed that incident rates for all five performance metrics were very low; three showed a statistically significant trend of improvement ranging from 70% to 100%; and none of the five deteriorated. CONCLUSIONS: Department of Veterans Affairs human research protection programs appeared to be effective in protecting human research subjects and showed improvement from 2010 through 2021. These proposed performance metrics will be useful in monitoring the effectiveness of human research protection programs in protecting human research subjects.

2.
J Empir Res Hum Res Ethics ; 17(4): 525-532, 2022 10.
Article in English | MEDLINE | ID: mdl-35470732

ABSTRACT

The Common Rule, revised extensively to enhance human subjects protections and to reduce burdens on investigators and institutional review boards (IRBs), was implemented on January 19, 2019. We analyzed IRB performance metric data from 2016 through 2021 to evaluate the potential impact of the revised Common Rule on the quality and performance of IRBs. From 2016 to 2021, exempt protocols increased by 159% and protocols requiring IRB continuing reviews decreased by 28%. As only 48% of all protocols in 2021 were subject to the revised Common Rule requirements, the numbers of exempt protocols and of protocols requiring IRB continuing reviews will continue to increase and decrease, respectively, over the next few years. Of a total of 16 IRB performance metrics studied, 4 improved, 4 deteriorated, and 8 remained unchanged from 2016 through 2021. This study represents the first effort to evaluate the impact of the revised Common Rule on IRB quality and performance.


Subject(s)
Ethics Committees, Research , Humans
3.
J Empir Res Hum Res Ethics ; 16(5): 479-484, 2021 12.
Article in English | MEDLINE | ID: mdl-33989094

ABSTRACT

Performance measurement drives quality improvement because it identifies areas of vulnerability that can guide quality improvement activities. Recommendations from empirical institutional review board (IRB) performance measurement data on research approval criteria, expedited review protocols, exempt protocols, and IRB continuing review requirements published over the past 10 years are reviewed here to improve the quality and efficiency of IRBs. Implementation of these recommendations should result in improvements that can be evaluated by follow-up performance measurements.


Subject(s)
Ethics Committees, Research , Quality Improvement , Humans
4.
J Empir Res Hum Res Ethics ; 15(5): 407-414, 2020 12.
Article in English | MEDLINE | ID: mdl-32917103

ABSTRACT

How well institutional review boards (IRBs) follow Common Rule criteria for levels of initial protocol review has not been systematically evaluated. Using the Office for Human Research Protections (OHRP) human subject regulations decision charts, we determined the appropriate level of review for 313 protocols that had been approved by IRBs. For the 140 protocols reviewed by the full board, agreement with the OHRP criteria was 97.8%. Likewise, agreement was 93.8% for the 113 protocols reviewed using an expedited review procedure. However, agreement was only 75% for exempt protocols. Specifically, 10 (16.7%) of the 60 exempt protocols were found to require IRB review: six requiring expedited review and four requiring full board review. Conducting non-exempt research without prior IRB approval constitutes serious noncompliance. Our data suggest that exempt protocols need more scrutiny.


Subject(s)
Ethics Committees, Research , Humans
5.
AMA J Ethics ; 22(3): E201-208, 2020 03 01.
Article in English | MEDLINE | ID: mdl-32220266

ABSTRACT

This article considers a case in which a prominent researcher repeatedly made protocol deviations year after year while the institutional review board and university leadership failed to adequately address his continuing noncompliance. This article argues that, in addition to reporting this researcher's pattern of noncompliance to the Office for Human Research Protections, as required by federal regulations, the university should implement a remedial action plan.


Subject(s)
Ethics Committees, Research , Human Experimentation/ethics , Mandatory Reporting , Organizations/ethics , Personnel Management , Research Design , Research Personnel/ethics , Clinical Protocols , Codes of Ethics , Ethics Committees, Research/legislation & jurisprudence , Ethics, Research , Government Regulation , Human Experimentation/legislation & jurisprudence , Humans , Organizations/legislation & jurisprudence , Research Personnel/legislation & jurisprudence , Universities
6.
J Empir Res Hum Res Ethics ; 15(3): 229-231, 2020 07.
Article in English | MEDLINE | ID: mdl-32102619

ABSTRACT

Investigators of nonexempt human subjects research conducted without prior institutional review board (IRB) approval often have difficulties in publishing data obtained from such research. Retrospective review and approval of such research has been suggested as a potential pathway for an IRB to help these investigators to publish those data. However, under the Common Rule, an IRB has no authority to retrospectively review and approve human subjects research. Prevention remains the best strategy to ensure that no nonexempt human subjects research is initiated prior to IRB approval.


Subject(s)
Ethics Committees, Research , Research Subjects , Humans , Publishing , Research Personnel , Retrospective Studies
7.
J Empir Res Hum Res Ethics ; 14(3): 204-208, 2019 07.
Article in English | MEDLINE | ID: mdl-31131677

ABSTRACT

In 2007, Taylor proposed to move beyond compliance to develop measures for assessing the ethical quality of institutional review board (IRB) reviews. To date, no such tool has been developed. In 2018, Lynch et al. proposed to move beyond quality to advance effective research ethics oversight. Instead of providing a set of measures, they proposed to define and specify ways to measure the effectiveness of IRBs in protecting human subjects. They further claimed that any attempt to measure the quality and performance of IRBs without using such measures, to be developed by them at some unforeseeable future date, was not helpful. The realities are that nearly 50 years after the IRB's establishment, there has been no systematic assessment of its quality and performance, and that nearly two decades after the deaths of Jesse Gelsinger and Ellen Roche and the implementation of reform processes to improve our system of protecting human subjects, there is still plenty of room for improvement. The challenges today are for Taylor to produce a tool to measure the ethical quality of IRB reviews and for Lynch et al. to develop measures for assessing the effectiveness of IRBs in protecting human subjects. The IRB community, meanwhile, must decide whether to continue waiting for these promised tools or to start taking advantage of performance measurement now, using existing IRB performance metrics or improving upon them.


Subject(s)
Ethics Committees, Research/standards , Quality Assurance, Health Care , Human Experimentation/standards , Humans
8.
J Empir Res Hum Res Ethics ; 14(4): 365-371, 2019 10.
Article in English | MEDLINE | ID: mdl-30894051

ABSTRACT

Continuing review of ongoing research is one way by which institutional review boards (IRBs) ensure protection of human subjects. Among the 25 Department of Veterans Affairs (VA) human research protection program performance metrics collected annually since 2010, lapses in IRB continuing reviews had the highest noncompliance rate. In 2013, 10 facilities with lapse rates higher than the VA national average for 3 consecutive years (2011-2013) implemented remedial action plans. Using data from 2011 through 2018, we demonstrated that 70% of these facilities' lapse rates remained significantly improved. In contrast, none of the 10 facilities whose lapse rates exceeded the national average in 2 of the 3 years from 2011 to 2013 but that did not implement remedial action plans showed any improvement. Thus, implementation of effective remedial measures at facilities with high lapse rates can produce long-lasting improvement in the majority of those facilities.


Subject(s)
Ethics Committees, Research , Human Experimentation/legislation & jurisprudence , Humans , United States , United States Department of Veterans Affairs
9.
J Empir Res Hum Res Ethics ; 14(3): 187-189, 2019 07.
Article in English | MEDLINE | ID: mdl-30296884

ABSTRACT

Despite the importance of institutional review boards (IRBs) in protecting human subjects participating in research and the well-known benefits of performance measurement, there has been no systematic assessment of the quality and performance of IRBs. The IRB community has frequently cited the lack of credible metrics for measuring human subject protections and the quality of IRB ethics reviews as reasons for not measuring the quality and performance of IRBs. However, the IRB, with its well-defined missions, functions, structure, and procedures, should be readily amenable to performance measurement. In this brief commentary, I analyze potential barriers to measuring the quality of IRBs and propose ways to overcome them.


Subject(s)
Ethics Committees, Research/standards , Quality Assurance, Health Care , Humans
10.
J Empir Res Hum Res Ethics ; 13(3): 270-275, 2018 07.
Article in English | MEDLINE | ID: mdl-29774772

ABSTRACT

Routine on-site reviews should focus primarily on facilities that are at risk of harming human subjects. Using human research protection program performance metric data from 107 facilities, we defined a facility as at risk when one of its noncompliance/incident rates was among the three highest rates for that performance metric. Based on 14 performance metrics with noncompliance and incidents in 2017, 27 facilities were identified as at risk. These 27 at-risk facilities, while constituting only 25% of all facilities, accounted for 70% ± 25% (M ± SD; range = 32%-100%) of all reported noncompliance/incidents. Thus, performance metric data can be used to guide compliance oversight activities.
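The at-risk rule described in this abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: all facility names, metric names, and rates are invented, and ties at the cutoff are resolved by sharing a rank among the top three distinct rates.

```python
def facilities_at_risk(rates, top_n=3):
    """Flag a facility as at risk if, for ANY metric, its nonzero
    noncompliance/incident rate is among the top_n highest rates
    reported for that metric.

    rates: dict mapping metric name -> dict of facility -> rate (%).
    Returns the set of at-risk facility names.
    """
    at_risk = set()
    for metric, by_facility in rates.items():
        # The top_n highest distinct rates for this metric.
        top_rates = sorted(set(by_facility.values()), reverse=True)[:top_n]
        at_risk.update(
            f for f, r in by_facility.items() if r > 0 and r in top_rates
        )
    return at_risk

# Hypothetical data: six facilities, two performance metrics.
rates = {
    "consent_missing": {"A": 0.0, "B": 2.1, "C": 0.4, "D": 5.0, "E": 0.1, "F": 0.0},
    "lapsed_review":   {"A": 1.0, "B": 0.0, "C": 8.2, "D": 0.3, "E": 0.0, "F": 0.2},
}
print(sorted(facilities_at_risk(rates)))  # → ['A', 'B', 'C', 'D']
```

Note how a single high rate on any one metric is enough to flag a facility, which matches the abstract's "one of its noncompliance/incident rates was among the top three" criterion.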


Subject(s)
Ethical Review , Ethics, Research , Patient Safety , Research Design/standards , Research , United States Department of Veterans Affairs , Humans , Risk Assessment , United States
11.
J Empir Res Hum Res Ethics ; 12(4): 217-228, 2017 10.
Article in English | MEDLINE | ID: mdl-28758521

ABSTRACT

We analyzed human research protection program performance metric data of all Department of Veterans Affairs research facilities obtained from 2010 to 2016. Among a total of 25 performance metrics, 21 (84%) showed improvement, four (16%) remained unchanged, and none deteriorated during the study period. The overall improvement from these 21 performance metrics was 81.1% ± 18.7% (mean ± SD), with a range of 30% to 100%. The four performance metrics that did not show improvement all had initial noncompliance/incidence rates of <1.0%, ranging from 0% to 0.98%. The initial noncompliance/incidence rates of the 21 performance metrics that showed improvement ranged from 0.05% to 60%. However, of the 21 performance metrics that showed improvement, 10 had initial noncompliance/incidence rates of <1.0%, suggesting that improvement could be achieved even with a very low initial noncompliance/incidence rate. We conclude that performance measurement is an effective tool in improving the performance of human research protection programs.


Subject(s)
Human Experimentation/ethics , Program Evaluation/standards , United States Department of Veterans Affairs , Adult , Child , Ethics Committees, Research , Ethics, Research , Guideline Adherence , Humans , Incidence , Informed Consent , Insurance, Health , Program Evaluation/methods , Quality Improvement , United States
13.
PLoS One ; 11(9): e0162141, 2016.
Article in English | MEDLINE | ID: mdl-27606820

ABSTRACT

The United States federal animal welfare regulations and the Public Health Service Policy on Humane Care and Use of Laboratory Animals require that institutional animal care and use committees (IACUCs) conduct continuing reviews of all animal research activities. However, little is known about the lapse rate of IACUC continuing reviews and how frequently investigators continue research activities during a lapse. It is also not clear what factors may contribute to an institution's lapses in IACUC continuing reviews. As part of its quality assurance program, the Department of Veterans Affairs (VA) has collected performance metric data for animal care and use programs since 2011. We analyzed IACUC continuing review performance data at 74-75 VA research facilities from 2011 through 2015. The IACUC continuing review lapse rate improved from 5.6% in 2011 to 2.7% in 2015. The rate of investigators continuing research activities during a lapse also decreased, from 47.2% in 2012 to 7.4% in 2015. The type of IACUC used and the size of the animal research program appeared to have no effect on a facility's rate of lapses in IACUC continuing reviews. While approximately 80% of facilities reported no lapses in IACUC continuing reviews, approximately 14% of facilities had lapse rates of >10% each year. Some facilities appeared to be repeat offenders: four facilities had IACUC lapse rates of >10% in at least 3 of 5 years, suggesting a systemic problem requiring remedial actions to improve their IACUC continuing review processes.


Subject(s)
Animal Care Committees , Animal Experimentation/standards , Animal Welfare/standards , Animals , Animals, Laboratory
15.
Clin Trials ; 12(3): 224-31, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25631384

ABSTRACT

INTRODUCTION: Institutions conducting research involving human subjects establish human research protection programs to ensure the rights and welfare of research participants as well as to meet ethical and regulatory requirements. It is important to determine whether human research protection programs have achieved these objectives. METHODS: The Department of Veterans Affairs has developed quality indicators and annually collected human research protection program quality indicator data from its 108 research facilities since 2010. RESULTS: Analysis of Department of Veterans Affairs human research protection program quality indicator data revealed that facilities using affiliated university institutional review boards performed as well as those using their own Department of Veterans Affairs institutional review boards, and that facilities with small research programs, that is, fewer than 50 human research protocols, performed at least as well as those with larger research programs. These quality indicator data also provided Department of Veterans Affairs facilities with valuable information for quality improvement. Many of these quality indicators have improved in subsequent years, and none has deteriorated. Lapse rates in institutional review board continuing reviews remained high and relatively constant, above 6.0%, over the 4-year period from 2010 through 2013. DISCUSSION: Future efforts should be directed at developing a set of human research protection program quality indicators that truly reflect the quality of human research protection programs and are applicable to both Department of Veterans Affairs and non-Department of Veterans Affairs institutions, and at determining whether high-quality human research protection programs as measured by these quality indicators translate into better human subject protections.


Subject(s)
Clinical Trials as Topic/ethics , Clinical Trials as Topic/standards , Human Experimentation/ethics , Human Experimentation/standards , Human Rights , Confidentiality , Ethics Committees, Research/standards , Humans , Inservice Training , Quality Indicators, Health Care , United States , United States Department of Veterans Affairs
16.
Fed Pract ; 32(5): 31-36, 2015 May.
Article in English | MEDLINE | ID: mdl-30766062

ABSTRACT

An analysis reveals considerable improvements in human research protection programs at the VA, though more effort is needed to improve institutional review board procedures and practices.

17.
Fed Pract ; 32(9): 58-63, 2015 Sep.
Article in English | MEDLINE | ID: mdl-30766088

ABSTRACT

A set of 13 quality indicators was developed to assess the quality of VA animal care and use programs, emphasizing the measurement of performance outcomes.

20.
J Empir Res Hum Res Ethics ; 8(2): 153-60, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23651939

ABSTRACT

We compared the Human Research Protection Program (HRPP) quality indicator data of Department of Veterans Affairs (VA) facilities using their own VA institutional review boards (IRBs) with those of facilities using affiliated university IRBs. Of a total of 25 performance metrics, 13 showed no statistically significant differences, while 12 did. Among the 12 metrics with statistically significant differences, facilities using their own VA IRBs performed better on four, while facilities using affiliated university IRBs performed better on eight. However, the absolute difference was small (0.2-2.7%) in all instances, suggesting no practical significance. We conclude that it is acceptable for facilities to use either their own VA IRBs or affiliated university IRBs as their IRBs of record.


Subject(s)
Ethics Committees, Research/standards , Organizational Affiliation , United States Department of Veterans Affairs/ethics , Universities/ethics , Humans , United States