Results 1 - 20 of 41
1.
Contemp Clin Trials ; 142: 107573, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38759865

ABSTRACT

INTRODUCTION: Accurately estimating the costs of clinical trials is challenging. There is currently no reference class data to allow researchers to understand the potential costs associated with database change management in clinical trials. METHODS: We used a case-based approach, summarising post-live changes in eleven clinical trial databases managed by Sheffield Clinical Trials Research Unit. We reviewed the database specifications for each trial and summarised the number of changes, change type, change category, and timing of changes. We pooled our experiences and made observations in relation to key themes. RESULTS: Median total number of changes across the eleven trials was 71 (range 40-155) and median number of changes per study week was 0.48 (range 0.32-1.34). The most common change type was modification (median 39, range 20-90), followed by additions (median 32, range 18-55), then deletions (median 7, range 1-12). In our sample, changes were more common in the first half of the trial's lifespan, regardless of its overall duration. Trials which saw continuous changes seemed more likely to be external pilots or trials in areas where the trial team was either less experienced overall or within the particular therapeutic area. CONCLUSIONS: Researchers should plan trials with the expectation that clinical trial databases will require changes within the life of the trial, particularly in the early stages or with a less experienced trial team. More research is required to understand potential differences between clinical trial units and database types.


Subject(s)
Clinical Trials as Topic , Databases, Factual , Humans , Clinical Trials as Topic/organization & administration , Clinical Trials as Topic/methods , Clinical Trials as Topic/standards , United Kingdom , Data Management/methods
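The summary statistics the abstract above reports (counts by change type, median totals, changes per study week) can be sketched in Python; the trial names and change logs below are invented for illustration, not the study's data:

```python
from statistics import median

# Hypothetical post-live change logs for three trial databases:
# each entry is (change_type, study_week), using the paper's
# categories: "modification", "addition", "deletion".
trial_changes = {
    "TRIAL-A": [("modification", 2), ("addition", 3), ("deletion", 10), ("modification", 5)],
    "TRIAL-B": [("addition", 1), ("modification", 1), ("modification", 7)],
    "TRIAL-C": [("modification", 4), ("addition", 4)],
}
trial_duration_weeks = {"TRIAL-A": 52, "TRIAL-B": 40, "TRIAL-C": 30}

def summarise(changes, duration_weeks):
    """Count changes by type and compute changes per study week."""
    counts = {}
    for change_type, _week in changes:
        counts[change_type] = counts.get(change_type, 0) + 1
    return {"total": len(changes), "by_type": counts,
            "per_week": len(changes) / duration_weeks}

summaries = {name: summarise(log, trial_duration_weeks[name])
             for name, log in trial_changes.items()}
median_total = median(s["total"] for s in summaries.values())
```

With real change logs per trial, the same tally yields the paper's headline figures (median total changes, median changes per week, counts by type).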
2.
Heliyon ; 10(6): e27846, 2024 Mar 30.
Article in English | MEDLINE | ID: mdl-38545152

ABSTRACT

Background: Clinical data management (CDM) collects, integrates, and makes data available. It plays a vital role in clinical research. However, there are few opportunities for Japanese clinical data managers to learn about its systematic framework, particularly in academic research organizations. While Japanese-language CDM training exists, its effectiveness in a Japanese context requires clarification. Objectives: We aimed to develop an advanced program of instruction for professionals to understand CDM and to determine the effectiveness of the training program. Methods and results: We developed an advanced program including risk-based monitoring and the Clinical Data Interchange Standards Consortium on a trial basis for clinical data managers to provide them with a comprehensive understanding of CDM. Fifty-two people attended the program and reported that they were highly satisfied with it. Conclusions: To provide comprehensive CDM training in Japan, it is imperative to continue improving the content and develop an advanced program. Due to the recent tightening of clinical research regulations and the development and dissemination of various systems for conducting clinical research, the competency-based educational program requires further development.

3.
BMC Med Res Methodol ; 24(1): 55, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38429658

ABSTRACT

BACKGROUND: Research Electronic Data Capture (REDCap) is a web application for creating and managing online surveys and databases. Clinical data management is an essential process before performing any statistical analysis to ensure the quality and reliability of study information. Processing REDCap data in R can be complex and often benefits from automation. While several R packages are available for specific tasks, none offers a comprehensive approach to data management. RESULTS: REDCapDM is an R package for accessing and managing REDCap data. It imports data from REDCap into R using either an API connection or the R-format files exported directly from REDCap. It provides several functions for data processing and transformation, and it helps generate and manage queries to clarify or resolve discrepancies found in the data. CONCLUSION: The REDCapDM package is a valuable tool for data scientists and clinical data managers who use REDCap and R. It assists in tasks such as importing, processing, and quality-checking data from research studies.


Subject(s)
Data Management , Software , Humans , Reproducibility of Results , Surveys and Questionnaires , Records
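REDCapDM itself is an R package, but the workflow it describes — export records via the REDCap API, then flag discrepancies that become data queries — can be illustrated with a Python sketch. The payload fields follow the REDCap API's record-export convention; the field rules and example records are hypothetical:

```python
# Sketch of a REDCap-style workflow: build a record-export request
# payload, then flag discrepancies that would seed data queries.
# The token is a placeholder; the required-field rules are illustrative.

def build_export_payload(token):
    """Payload for a flat JSON record export from the REDCap API."""
    return {"token": token, "content": "record",
            "format": "json", "type": "flat"}

def find_discrepancies(records, required_fields):
    """Return (record_id, field, reason) tuples for missing required values."""
    issues = []
    for rec in records:
        for field in required_fields:
            if not rec.get(field):
                issues.append((rec["record_id"], field, "missing value"))
    return issues

# Example: two exported records, one missing a visit date.
records = [
    {"record_id": "001", "visit_date": "2024-01-10", "weight_kg": "71.2"},
    {"record_id": "002", "visit_date": "", "weight_kg": "64.0"},
]
queries = find_discrepancies(records, ["visit_date", "weight_kg"])
```

In a live setup the payload would be POSTed to the project's API endpoint; here only the payload construction and the query-generation step are shown.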
4.
Clin Trials ; : 17407745231212190, 2023 Nov 14.
Article in English | MEDLINE | ID: mdl-37961913

ABSTRACT

BACKGROUND: The Opioid Analgesic Reduction Study is a double-blind, prospective, clinical trial investigating analgesic effectiveness in the management of acute post-surgical pain after impacted third molar extraction across five clinical sites. Specifically, the Opioid Analgesic Reduction Study examines a commonly prescribed opioid combination (hydrocodone/acetaminophen) against a non-opioid combination (ibuprofen/acetaminophen). The study employs a novel electronic infrastructure, leveraging the functionality of its data management system, Research Electronic Data Capture (REDCap), to serve not only as its data reservoir but also as the framework for its quality management program. METHODS: Within the study, REDCap is expanded into a multi-function management tool, serving as the hub for clinical data management, project management and credentialing, materials management, and quality management. REDCap effectively captures data, displays and tracks study progress, triggers follow-up, and supports quality management processes. RESULTS: At 72% study completion, over 12,000 subject data forms have been executed in REDCap with minimal missing (0.15%) or incomplete or erroneous forms (0.06%). Five hundred twenty-three queries were initiated to request clarifications and/or address missing data and data discrepancies. CONCLUSION: REDCap is an effective digital health technology that can be maximized to contribute to the success of a clinical trial. The REDCap infrastructure and enhanced functionality used in the Opioid Analgesic Reduction Study provide the framework and logic that ensure complete, accurate data while guiding an effective, efficient workflow that team members can follow across sites. This enhanced data reliability and these comprehensive quality management processes allow for better preparedness for clinical monitoring and regulatory reporting.

5.
Cureus ; 15(5): e39166, 2023 May.
Article in English | MEDLINE | ID: mdl-37332444

ABSTRACT

Blockchain technology could transform the dentistry sector by facilitating safe communication between dental practitioners and offering secure, efficient administration of patient information. Nevertheless, using this technology in dentistry faces various barriers, including regulatory and legal obstacles, a lack of technical skills, and a lack of standardization. To overcome these challenges, dental practitioners, industry stakeholders, and regulators must work together to develop a legislative framework that encourages the use of blockchain technology in dentistry. Moreover, education and training programs must equip dental practitioners with the skills and expertise to properly incorporate and use the technology. Blockchain technology in dentistry has the potential to greatly improve patient outcomes while also increasing the efficiency and security of the dental industry.

6.
Res Sq ; 2023 Mar 27.
Article in English | MEDLINE | ID: mdl-37034600

ABSTRACT

Background: Medical record abstraction (MRA) is a commonly used method of data collection in clinical research, but it is prone to error, and the influence of quality control (QC) measures is seldom and inconsistently assessed during the course of a study. We employed a novel, standardized MRA-QC framework as part of an ongoing observational study in an effort to control MRA error rates. To assess the effectiveness of our framework, we compared our error rates against traditional MRA studies that had not reported using formalized MRA-QC methods. Thus, the objective of this study was to compare the MRA error rates derived from the literature with the error rates found in a study that used MRA as the sole method of data collection and employed an MRA-QC framework. Methods: Using a moderator meta-analysis with a Q-test, the MRA error rates from the meta-analysis of the literature were compared with the error rate from a recent study that implemented formalized MRA training and continuous QC processes. Results: The MRA process for data acquisition in clinical research was associated with both high and highly variable error rates (70 - 2,784 errors per 10,000 fields). Error rates for the study using our MRA-QC framework were between 1.04% (optimistic, all-field rate) and 2.57% (conservative, populated-field rate) (or 104 - 257 errors per 10,000 fields), 4.00 - 5.53 percentage points less than the observed rate from the literature (p<0.0001). Conclusions: Review of the literature indicated that the accuracy associated with MRA varied widely across studies. However, our results demonstrate that, with appropriate training and continuous QC, MRA error rates can be significantly controlled during the course of a clinical research study.

7.
Clin Trials ; 20(2): 166-175, 2023 04.
Article in English | MEDLINE | ID: mdl-36734212

ABSTRACT

INTRODUCTION: In clinical trials, event adjudication is a process to review and confirm the accuracy of outcomes reported by site investigators. Despite efforts to automate the communication between a clinical-data-and-coordination center and an event adjudication committee, the review and confirmation of outcomes, the core function of the process, still rely fully on human labor. To address this issue, we present an automated event adjudication system and its application in two randomized controlled trials. METHODS: Centrally executed by a clinical-data-and-coordination center, the automated event adjudication system automatically assessed and classified outcomes in a clinical data management system. By checking clinically predefined criteria, the system either confirmed an outcome or left it unconfirmed, and automatically updated its status in the database. It also served as a management tool to assist staff in overseeing the event adjudication process. The system has been applied in: (1) the Cardiovascular Outcomes for People Using Anticoagulation Strategies (COMPASS) trial and (2) the New Approach riVaroxaban Inhibition of Factor Xa in a Global trial versus Aspirin to prevenT Embolism in Embolic Stroke of Undetermined Source (NAVIGATE ESUS) trial. The automated event adjudication system first screened outcomes reported on a case report form and confirmed those whose data matched the preset definitions. For selected primary efficacy, secondary, and safety outcomes, unconfirmed cases were referred to a human event adjudication committee for a final decision. In NAVIGATE ESUS, human adjudicators were given priority to review cases, while the automated event adjudication system took the lead in COMPASS.
RESULTS: Outcomes adjudicated in a hybrid model are discussed here. The COMPASS automated event adjudication system adjudicated 3283 primary efficacy outcomes and confirmed 1652 (50.3%): 132 (21.1%) strokes, 522 (53%) myocardial infarctions, and 998 (59.7%) causes of death. The NAVIGATE ESUS system adjudicated 737 cases of selected outcomes and confirmed 383 (52%): 219 (51.5%) strokes, 34 (42.5%) myocardial infarctions, 73 (54.9%) causes of death, and 57 (57.6%) major bleeding events. After deducting the time needed to migrate the system to a new study, the automated event adjudication system reduced the time required for human review from approximately 1303 to 716.5 h in the COMPASS trial and from 387 to 196 h in the NAVIGATE ESUS trial. CONCLUSION: The automated event adjudication system, in combination with human adjudicators, provides a streamlined and efficient approach to event adjudication in clinical trials. To apply automated event adjudication immediately, one can run the automated system first and involve human adjudicators for the cases it leaves unconfirmed.


Subject(s)
Embolic Stroke , Embolism , Myocardial Infarction , Stroke , Humans , Rivaroxaban/therapeutic use , Embolic Stroke/complications , Embolic Stroke/drug therapy , Factor Xa/therapeutic use , Factor Xa Inhibitors/therapeutic use , Double-Blind Method , Aspirin/therapeutic use , Stroke/prevention & control , Stroke/drug therapy , Embolism/complications , Embolism/drug therapy , Myocardial Infarction/drug therapy
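The rule-checking core of such an automated adjudication system can be sketched as follows; the criteria here are invented placeholders, not the COMPASS or NAVIGATE ESUS outcome definitions:

```python
# Minimal sketch of rule-based outcome adjudication: an outcome is
# auto-confirmed when it matches every predefined criterion, otherwise
# it is referred to the human adjudication committee. The myocardial
# infarction criteria below are illustrative only.

MI_CRITERIA = {"troponin_elevated": True, "ischemic_symptoms": True}

def adjudicate(outcome, criteria):
    """Return 'confirmed' if every criterion matches, else 'referred'."""
    if all(outcome.get(key) == value for key, value in criteria.items()):
        return "confirmed"
    return "referred"

reported = [
    {"id": 1, "troponin_elevated": True, "ischemic_symptoms": True},
    {"id": 2, "troponin_elevated": True, "ischemic_symptoms": False},
]
statuses = {o["id"]: adjudicate(o, MI_CRITERIA) for o in reported}
```

The hybrid model in the paper corresponds to routing every "referred" case to the committee while "confirmed" cases are written back to the database automatically.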
8.
Res Sq ; 2023 Dec 21.
Article in English | MEDLINE | ID: mdl-38196643

ABSTRACT

Background: In clinical research, prevention of systematic and random errors of data collected is paramount to ensuring reproducibility of trial results and the safety and efficacy of the resulting interventions. Over the last 40 years, empirical assessments of data accuracy in clinical research have been reported in the literature. Although there have been reports of data error and discrepancy rates in clinical studies, there has been little systematic synthesis of these results. Further, although notable exceptions exist, little evidence exists regarding the relative accuracy of different data processing methods. We aim to address this gap by evaluating error rates for 4 data processing methods. Methods: A systematic review of the literature identified through PubMed was performed to identify studies that evaluated the quality of data obtained through data processing methods typically used in clinical trials: medical record abstraction (MRA), optical scanning, single-data entry, and double-data entry. Quantitative information on data accuracy was abstracted from the manuscripts and pooled. Meta-analysis of single proportions based on the Freeman-Tukey transformation method and the generalized linear mixed model approach were used to derive an overall estimate of error rates across data processing methods used in each study for comparison. Results: A total of 93 papers (published from 1978 to 2008) meeting our inclusion criteria were categorized according to their data processing methods. The accuracy associated with data processing methods varied widely, with error rates ranging from 2 errors per 10,000 fields to 2,784 errors per 10,000 fields. MRA was associated with both high and highly variable error rates, having a pooled error rate of 6.57% (95% CI: 5.51, 7.72). In comparison, the pooled error rates for optical scanning, single-data entry, and double-data entry methods were 0.74% (0.21, 1.60), 0.29% (0.24, 0.35) and 0.14% (0.08, 0.20), respectively. 
Conclusions: Data processing and cleaning methods may explain a significant amount of the variability in data accuracy. MRA error rates, for example, were high enough to impact decisions made using the data and could necessitate increases in sample sizes to preserve statistical power. Thus, the choice of data processing methods can likely impact process capability and, ultimately, the validity of trial results.
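The Freeman-Tukey double-arcsine transform named in the methods, and the errors-per-10,000-fields convention used throughout, can be written out as a short Python sketch (the example count echoes the pooled 6.57% MRA rate but is illustrative):

```python
import math

def freeman_tukey(x, n):
    """Freeman-Tukey double-arcsine transform of a proportion x/n,
    commonly used to stabilise variance before pooling rates:
    asin(sqrt(x/(n+1))) + asin(sqrt((x+1)/(n+1)))."""
    return math.asin(math.sqrt(x / (n + 1))) + math.asin(math.sqrt((x + 1) / (n + 1)))

def errors_per_10k(errors, fields):
    """Express an error count as errors per 10,000 fields."""
    return 10_000 * errors / fields

# e.g. 657 errors in 10,000 abstracted fields, matching a 6.57% rate
rate = errors_per_10k(657, 10_000)
t = freeman_tukey(657, 10_000)
```

Pooling then averages the transformed values (with study weights) and back-transforms, which avoids the instability of averaging raw proportions near 0.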

9.
BMC Med Res Methodol ; 22(1): 227, 2022 08 15.
Article in English | MEDLINE | ID: mdl-35971057

ABSTRACT

BACKGROUND: Studies have shown that data collection by medical record abstraction (MRA) is a significant source of error in clinical research studies relying on secondary use data. Yet, the quality of data collected using MRA is seldom assessed. We employed a novel, theory-based framework for data quality assurance and quality control of MRA. The objective of this work is to determine the potential impact of formalized MRA training and continuous quality control (QC) processes on data quality over time. METHODS: We conducted a retrospective analysis of QC data collected during a cross-sectional medical record review of mother-infant dyads with Neonatal Opioid Withdrawal Syndrome. A confidence interval approach was used to calculate crude (Wald's method) and adjusted (generalized estimating equation) error rates over time. We calculated error rates using the number of errors divided by total fields ("all-field" error rate) and populated fields ("populated-field" error rate) as the denominators, to provide both an optimistic and a conservative measurement, respectively. RESULTS: On average, the ACT NOW CE Study maintained an error rate between 1% (optimistic) and 3% (conservative). Additionally, we observed a decrease of 0.51 percentage points with each additional QC Event conducted. CONCLUSIONS: Formalized MRA training and continuous QC resulted in lower error rates than have been found in previous literature and a decrease in error rates over time. This study newly demonstrates the importance of continuous process controls for MRA within the context of a multi-site clinical research study.


Subject(s)
Data Accuracy , Medical Records , Data Collection , Humans , Infant, Newborn , Research Design , Retrospective Studies
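The optimistic/conservative distinction the abstract draws — all-field versus populated-field denominators — reduces to a two-line calculation; the counts below are illustrative, not the study's data:

```python
# All-field vs populated-field error rates: the all-field rate
# (errors / all abstracted fields) is the optimistic measure, the
# populated-field rate (errors / fields that actually held data)
# is the conservative one. Example counts are invented.

def error_rates(errors, total_fields, populated_fields):
    """Return (optimistic all-field rate, conservative populated-field rate), as percentages."""
    return (100 * errors / total_fields, 100 * errors / populated_fields)

optimistic, conservative = error_rates(errors=30, total_fields=3000, populated_fields=1000)
```

Because populated fields are a subset of all fields, the conservative rate is always at least as large as the optimistic one, which is why the study reports both as a range.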
10.
Wiad Lek ; 75(5 pt 1): 1192-1196, 2022.
Article in English | MEDLINE | ID: mdl-35758501

ABSTRACT

OBJECTIVE: To review real-life, regulatory-dependent study design and data management practices in post-marketing multicenter studies of medical devices conducted in 2021 in Ukraine and Poland. MATERIALS AND METHODS: This article presents a case study of four post-marketing multicenter studies of medical devices conducted in 2021 in Ukraine and the European Union. RESULTS: The case study demonstrated effective cross-border cooperation between Ukrainian and European actors. Despite gaps in the Ukrainian legislative framework on medical devices, complex solutions employing the most stringent regulatory provisions led to an appropriate study design. Use of a highly compliant electronic data capture system enabled fast-track study start-up and solid clinical data collection. CONCLUSIONS: Publications on real-life, regulatory-dependent clinical trial conduct may be essential to innovating the regulatory system in Ukraine. Cross-border cooperation may assist the advancement of the clinical trial industry in Ukraine. Gaps in medical device regulations in Ukraine impede context-specific clinical trial solutions for the biotech industry there. The regulatory framework and practice in Ukraine may be perceived as externally driven due to gaps in medical device regulations, limited capacity of domestic notified bodies, and the business interests of sponsors.


Subject(s)
Clinical Trials as Topic , Data Management , Research Design , Commerce , European Union , Humans , Multicenter Studies as Topic , Poland , Ukraine
11.
Nagoya J Med Sci ; 84(1): 120-132, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35392016

ABSTRACT

Submitting data compliant with the Clinical Data Interchange Standards Consortium (CDISC) standards is mandatory for new drug applications (NDAs). The standards set by CDISC are widely adopted in the pharmaceutical industry. Introducing CDISC standards in academia can reduce labor, alleviate the shortage of academic data managers, and yield new knowledge through standardized data accumulation. However, adoption of CDISC standards has not progressed in academic communities that do not pursue NDAs. We therefore created Study Data Tabulation Model (SDTM)-compliant datasets within academia, without outsourcing, to reduce the costs associated with investigator-initiated clinical trials. First, we entered data from paper case report forms (CRFs) into "Ptosh," an electronic data capture system with minimal functionality for paper CRFs that is compatible with SDTM. We then developed a generic program to convert data exported from Ptosh into fully SDTM-compliant datasets. Consistency was verified with an SDTM validator, Pinnacle21 Community V3.0.1 (P21C). This generated SDTM datasets that resolved all "Rejects" in P21C, achieving the required quality level. Although Ptosh exports data directly in SDTM format, the items on CRFs must be mapped manually to the SDTM variables prepared in Ptosh. SDTM mapping requires extensive knowledge and skills, and is likely to be challenging for staff without in-depth knowledge of CDISC standards and datasets. For CDISC dissemination in academia, it is therefore crucial to secure the staff, time, and funding needed to acquire this knowledge.
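The CRF-to-SDTM mapping step the paper automates can be sketched as a simple rename-and-augment transformation; the CRF field names and the single record here are invented, though the SDTM variable names (STUDYID, DOMAIN, USUBJID, VSTESTCD, VSORRES) are standard:

```python
# Toy sketch of CRF-to-SDTM conversion for a VS (vital signs) domain:
# rename CRF columns to SDTM variables and add required identifiers.
# The CRF field names and mapping are hypothetical examples.

CRF_TO_SDTM = {"subj_id": "USUBJID", "visit_name": "VISIT", "weight": "VSORRES"}

def to_sdtm_vs(crf_rows, studyid):
    """Convert CRF rows to SDTM-style VS records for a weight measurement."""
    out = []
    for row in crf_rows:
        rec = {"STUDYID": studyid, "DOMAIN": "VS", "VSTESTCD": "WEIGHT"}
        for crf_field, sdtm_var in CRF_TO_SDTM.items():
            rec[sdtm_var] = row[crf_field]
        out.append(rec)
    return out

records = to_sdtm_vs([{"subj_id": "001", "visit_name": "WEEK 2", "weight": "71.2"}], "STUDY-X")
```

A validator such as P21C then checks the resulting datasets against the SDTM rules; the mapping dictionary is the part that, per the paper, still requires manual CDISC expertise.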

12.
Trials ; 23(1): 187, 2022 Mar 03.
Article in English | MEDLINE | ID: mdl-35241149

ABSTRACT

BACKGROUND: Clinical trials play an important role in expanding the knowledge of diabetes prevention, diagnosis, and treatment, and data management is one of the main issues in clinical trials. Lack of appropriate planning for data management in clinical trials may negatively influence achieving the desired results. The aim of this study was to explore data management processes in diabetes clinical trials in three research institutes in Iran. METHOD: This was a qualitative study conducted in 2019. In this study, data were collected through in-depth semi-structured interviews with 16 researchers in three endocrinology and metabolism research institutes. To analyze data, the method of thematic analysis was used. RESULTS: The five themes that emerged from data analysis included (1) clinical trial data collection, (2) technologies used in data management, (3) data security and confidentiality management, (4) data quality management, and (5) data management standards. In general, the findings indicated that no clear and standard process was used for data management in diabetes clinical trials, and each research center executed its own methods and processes. CONCLUSION: According to the results, the common methods of data management in diabetes clinical trials included a set of paper-based processes. It seems that using information technology can help facilitate data management processes in a variety of clinical trials, including diabetes clinical trials.


Subject(s)
Data Management , Diabetes Mellitus , Diabetes Mellitus/diagnosis , Diabetes Mellitus/therapy , Humans , Iran , Qualitative Research , Research Personnel
13.
Article in Chinese | WPRIM (Western Pacific) | ID: wpr-1014791

ABSTRACT

With the enormous resources invested in oncology drug development in China in recent years, the Center for Drug Evaluation (CDE) of the National Medical Products Administration has issued a number of technical guidelines to further standardize the requirements for conducting and registering domestic oncology clinical trials. As data are the cornerstone of clinical trials, data integrity and quality directly determine the outcome of clinical studies. Given the specific characteristics of oncology therapeutic clinical trials, and drawing on the clinical data standards established by the Clinical Data Interchange Standards Consortium (CDISC) and the issued industry guidelines, this article introduces general considerations for clinical data management in oncology clinical trials, with the aim of emphasizing normative data collection and timely data monitoring to ensure data quality and the reliability of study results. The article discusses the impact of complex study designs on the case report form (CRF), CRF design according to the Clinical Data Acquisition Standards Harmonization (CDASH) standard, scientific development of the data validation plan (DVP), rolling submissions, and data cut-off.

14.
Ther Innov Regul Sci ; 55(5): 1006-1012, 2021 09.
Article in English | MEDLINE | ID: mdl-33963525

ABSTRACT

BACKGROUND: The causes, degree, and disruptive nature of mid-study database updates and other pain points were evaluated to understand whether and how the clinical data management function is managing rapid growth in data volume and diversity. METHODS: The Tufts Center for the Study of Drug Development (Tufts CSDD), in collaboration with IBM Watson Health, conducted an online global survey between September and October 2020. RESULTS: One hundred ninety-four verified responses were analyzed. Planned and unplanned mid-study updates were the top challenges mentioned, and their management was time intensive. Respondents reported an average of 4.1 planned and 3.7 unplanned mid-study updates per clinical trial. CONCLUSION: Mid-study database updates are disruptive and present a major opportunity to accelerate cycle times and improve efficiency, particularly as protocol designs become more flexible and the diversity of data, most notably unstructured data, increases.


Subject(s)
Data Management , Drug Development , Humans , Pain , Surveys and Questionnaires
15.
Ther Innov Regul Sci ; 55(2): 272-281, 2021 03.
Article in English | MEDLINE | ID: mdl-32926350

ABSTRACT

BACKGROUND: Contending with a continuously expanding volume and variety of clinical data poses challenges and opportunities for the industry and clinical data management organizations. METHODS: Tufts CSDD conducted an online survey aimed at further quantifying and understanding the magnitude and impact that expanded data volume, sources, and diversity are having on clinical trials. The survey was distributed between October and December 2019. Responses from a total of 149 individuals were included in the final analysis. RESULTS: The survey found that companies use or pilot from one to six different data sources, with the majority of respondents using or piloting 3-4 different sources of data in their clinical trials. The results showed that average times to database lock have increased by an average of 5 days compared with a 2017 study, possibly as a result of managing an even larger number of data sources. Finally, three key mitigation strategies surfaced as techniques respondents used to tackle expanding data volume, sources, and diversity: the creation of a formalized data strategy; investment in new analytics tools and more sophisticated data technology infrastructures; and the development of new data science disciplines. CONCLUSION: Without further investment in infrastructure and the development of additional mitigation techniques, database lock cycle times are likely to continue to increase as more and more of the data supporting a clinical trial come from nontraditional, non-CRF sources. Further research is needed into organizations that are handling these challenges well.


Subject(s)
Surveys and Questionnaires , Humans
16.
J Educ Health Promot ; 9: 255, 2020.
Article in English | MEDLINE | ID: mdl-33224999

ABSTRACT

BACKGROUND: Oral soft tissue diseases include a broad spectrum, and the wide array of patient data elements need to be processed in their diagnosis. One of the biggest and most basic challenges is the analysis of this huge amount of complex patient data in an increasing number of complicated clinical decisions. This study seeks to identify the necessary steps for collecting and management of these data elements through establishing a consensus-based framework. METHODS: This research was conducted as a descriptive, cross-sectional study from April 2016 to January 2017, which has been performed in several steps: literature review, developing the initial draft (v. 0), submitting the draft to experts, validating by an expert panel, applying expert opinions and creating version v.i, performing Delphi rounds, and creating the final framework. RESULTS: The administrative data category with 17 and the historical data category with 23 data elements were utilized in recording data elements in the diagnosis of all of the different oral diseases. In the paraclinical indicator and clinical indicator categories, the necessary data elements were considered with respect to the 6 main axes of oral soft tissue diseases, according to Burket's Oral Medicine: ulcerative, vesicular, and bullous lesions; red and white lesions of the oral mucosa; pigmented lesions of the oral mucosa; benign lesions of the oral cavity and the jaws; oral and oropharyngeal cancer; and salivary gland diseases. CONCLUSIONS: The study achieved a consensus-based framework for the essential data element in the differential diagnosis of oral medicine using a comprehensive search with rich keywords in databases and reference texts, providing an environment for discussion and exchange of ideas among experts and the careful use of the Delphi decision technique.

17.
Stud Health Technol Inform ; 270: 1199-1200, 2020 Jun 16.
Article in English | MEDLINE | ID: mdl-32570578

ABSTRACT

OBJECTIVE: This job analysis was conducted to compare, assess, and refine the competencies of the clinical research data management profession. MATERIALS AND METHODS: Two questionnaires were administered in 2015 and 2018 to collect information from data managers on professional competencies, types of data managed, types of studies supported, and necessary foundational knowledge. RESULTS: In the 2018 survey, 67 professional competencies were identified. Job tasks differed between early- to mid-career and mid- to late-career practitioners. Large variation was observed in both the types of studies conducted and the data managed by participants. DISCUSSION: Clinical research data managers managed different types of data across a variety of research settings, indicating a need for training in methods and concepts that can be applied across therapeutic areas and data types. CONCLUSION: The competency survey reported here serves as the foundation for the upcoming revision of the Certified Clinical Data Manager (CCDM™) exam.


Subject(s)
Data Management , Professional Competence , Certification , Humans , Surveys and Questionnaires
18.
Stud Health Technol Inform ; 258: 211-215, 2019.
Article in English | MEDLINE | ID: mdl-30942748

ABSTRACT

Clinical Data Management Systems (CDMS) are used to electronically capture and store data about study participants in clinical trials. CDMS tend to be superior to paper-based data capture with respect to data quality, consistency, completeness, and traceability. Nevertheless, their use is not yet the default, especially in small-scale academic clinical studies. While clinical researchers can choose from many different software vendors, the vast requirements of data management and the growing need for integration with other systems make it hard to select the most suitable one. Additionally, the financial and personnel costs of purchasing, deploying, and maintaining a commercial solution can easily exceed the resources of a research project. The aim of this paper is to assess the suitability of the web-based open-source software OpenClinica for academic clinical trials with regard to the functionalities required in a large research network.


Subject(s)
Clinical Trials as Topic , Information Management , Software
19.
Stud Health Technol Inform ; 257: 526-539, 2019.
Article in English | MEDLINE | ID: mdl-30741251

ABSTRACT

Studies often rely on medical record abstraction as a major source of data. However, data quality from medical record abstraction has long been questioned. Electronic Health Records (EHRs) potentially add variability to the abstraction process due to the complexity of navigating and locating study data within these systems. We report training for and initial quality assessment of medical record abstraction for a clinical study conducted by the IDeA States Pediatric Clinical Trials Network (ISPCTN) and the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) Neonatal Research Network (NRN) using medical record abstraction as the primary data source. As part of overall quality assurance, study-specific training for medical record abstractors was developed and deployed during study start-up. The training consisted of a didactic session with an example case abstraction and an independent abstraction of two standardized cases. Sixty-nine site abstractors from thirty sites were trained. The training was designed to achieve an error rate for each abstractor of no greater than 4.93% with a mean of 2.53%, at study initiation. Twenty-three percent of the trainees exceeded the acceptance limit on one or both of the training test cases, supporting the need for such training. We describe lessons learned in the design and operationalization of the study-specific, medical record abstraction training program.


Subject(s)
Medical Errors , Medical Records , Abstracting and Indexing , Child , Humans , Information Storage and Retrieval , Research Design
20.
Rev Recent Clin Trials ; 14(3): 160-172, 2019.
Article in English | MEDLINE | ID: mdl-30734683

ABSTRACT

BACKGROUND: Data management is an important, complex, and multidimensional process in clinical trials, and its execution is very difficult and expensive without information technology. A clinical data management system is software widely used for managing the data generated in clinical trials. The objective of this study was to review the technical features of clinical trial data management systems. METHODS: Related articles were identified by searching databases such as Web of Science, Scopus, Science Direct, ProQuest, Ovid, and PubMed. All research papers on clinical data management systems published between 2007 and 2017 (n=19) were included in the study. RESULTS: Most of the clinical data management systems were web-based systems developed to meet the needs of a specific clinical trial in the shortest possible time. The SQL Server and MySQL databases were used in developing the systems. These systems did not fully support the clinical data management process, and most lacked flexibility and extensibility for further development. CONCLUSION: Most of the systems used in research centers appear weak in supporting the data management process and managing clinical trial workflows. More attention should therefore be paid to designing more complete, usable, and high-quality data management systems for clinical trials. Further studies are suggested to identify the features of successful systems used in clinical trials.


Subject(s)
Clinical Trials as Topic , Data Management , Databases, Factual , Software , Humans