Results 1 - 11 of 11
1.
PLoS Biol ; 20(2): e3001285, 2022 02.
Article in English | MEDLINE | ID: covidwho-1662437

ABSTRACT

Amid the Coronavirus Disease 2019 (COVID-19) pandemic, preprints in the biomedical sciences are being posted and accessed at unprecedented rates, drawing widespread attention from the general public, press, and policymakers for the first time. This phenomenon has sharpened long-standing questions about the reliability of information shared prior to journal peer review. Does the information shared in preprints typically withstand the scrutiny of peer review, or are conclusions likely to change in the version of record? We assessed preprints from bioRxiv and medRxiv that had been posted and subsequently published in a journal through April 30, 2020, representing the initial phase of the pandemic response. We utilised a combination of automatic and manual annotations to quantify how an article changed between the preprinted and published version. We found that the total number of figure panels and tables changed little between preprint and published articles. Moreover, the conclusions of 7.2% of non-COVID-19-related and 17.2% of COVID-19-related abstracts underwent a discrete change by the time of publication, but the majority of these changes did not qualitatively alter the conclusions of the paper.


Subject(s)
COVID-19/prevention & control, Information Dissemination/methods, Peer Review, Research/trends, Periodicals as Topic/trends, Publications/trends, COVID-19/epidemiology, COVID-19/virology, Humans, Pandemics/prevention & control, Peer Review, Research/methods, Peer Review, Research/standards, Periodicals as Topic/standards, Periodicals as Topic/statistics & numerical data, Publications/standards, Publications/statistics & numerical data, Publishing/standards, Publishing/statistics & numerical data, Publishing/trends, SARS-CoV-2/isolation & purification, SARS-CoV-2/physiology
3.
Am J Physiol Cell Physiol ; 321(1): C1-C2, 2021 07 01.
Article in English | MEDLINE | ID: covidwho-1319412
4.
PLoS One ; 16(6): e0244529, 2021.
Article in English | MEDLINE | ID: covidwho-1280615

ABSTRACT

Attitudes towards open peer review, open data and use of preprints influence scientists' engagement with those practices, yet there is a lack of validated questionnaires that measure these attitudes. The goal of our study was to construct and validate such a questionnaire and use it to assess the attitudes of Croatian scientists. We first developed a 21-item questionnaire called Attitudes towards Open data sharing, preprinting, and peer-review (ATOPP), which had a reliable four-factor structure and measured attitudes towards open data, preprint servers, open peer review and open peer review in small scientific communities. We then used the ATOPP to explore the attitudes of Croatian scientists (n = 541) towards these topics, and to assess the association of their attitudes with their open science practices and demographic information. Overall, Croatian scientists' attitudes towards these topics were generally neutral, with a median (Md) score of 3.3 out of a maximum of 5 on the scale. We also found no gender (P = 0.995) or field differences (P = 0.523) in their attitudes. However, attitudes of scientists who had previously engaged in open peer review or preprinting were higher than those of scientists who had not (Md 3.5 vs. 3.3, P < 0.001, and Md 3.6 vs. 3.3, P < 0.001, respectively). Further research is needed to determine optimal ways of improving scientists' attitudes and increasing their open science practices.


Subject(s)
Peer Review, Research/trends, Preprints as Topic/trends, Scholarly Communication/trends, Adult, Aged, Attitude, Croatia, Cross-Sectional Studies, Faculty, Female, Humans, Information Dissemination/methods, Laboratory Personnel, Male, Middle Aged, Peer Review, Research/methods, Physicians, Psychometrics/methods, Surveys and Questionnaires
7.
PLoS One ; 16(5): e0250887, 2021.
Article in English | MEDLINE | ID: covidwho-1229046

ABSTRACT

OBJECTIVE: To determine whether medRxiv data availability statements describe open or closed data (that is, whether the data used in the study are openly available without restriction) and to examine whether this changes on publication depending on journal data-sharing policy. Additionally, to examine whether data availability statements are sufficient to capture code availability declarations.
DESIGN: Observational study, following a pre-registered protocol, of preprints posted on the medRxiv repository between 25th June 2019 and 1st May 2020 and their published counterparts.
MAIN OUTCOME MEASURES: Distribution of preprinted data availability statements across nine categories, determined by a prespecified classification system. Change in the percentage of data availability statements describing open data between the preprinted and published versions of the same record, stratified by journal sharing policy. Number of code availability declarations reported in the full-text preprint which were not captured in the corresponding data availability statement.
RESULTS: 3938 medRxiv preprints with an applicable data availability statement were included in our sample, of which 911 (23.1%) were categorized as describing open data. 379 (9.6%) preprints were subsequently published, and of these published articles, only 155 contained an applicable data availability statement. As at the preprint stage, only a minority (59; 38.1%) of these published data availability statements described open data. Of the 151 records eligible for the comparison between preprinted and published stages, 57 (37.7%) were published in journals which mandated open data sharing. Data availability statements more frequently described open data on publication when the journal mandated data sharing (open at preprint: 33.3%; open at publication: 61.4%) than when the journal did not mandate data sharing (open at preprint: 20.2%; open at publication: 22.3%).
CONCLUSION: Requiring that authors submit a data availability statement is a good first step, but is insufficient to ensure data availability. Strict editorial policies that mandate data sharing (where appropriate) as a condition of publication appear to be effective in making research data available. We would strongly encourage all journal editors to examine whether their data availability policies are sufficiently stringent and consistently enforced.


Subject(s)
Information Dissemination/methods, Peer Review, Research/trends, Preprints as Topic/trends, Data Accuracy, Editorial Policies, Humans, Peer Review, Research/methods, Policy