This article is a Preprint
Preprints are preliminary research reports that have not been certified by peer review. They should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.
Preprints posted online allow authors to receive rapid feedback, and the entire scientific community can appraise the work for themselves and respond appropriately. Comments are posted alongside the preprint for anyone to read and serve as a form of post-publication assessment.
Exploring Text-transformers in AAAI 2021 Shared Task: COVID-19 Fake News Detection in English (preprint)
arXiv; 2021.
Preprint
in English
| PREPRINT-ARXIV | ID: ppzbmed-2101.02359v1
ABSTRACT
In this paper, we describe our system for the AAAI 2021 shared task of COVID-19 Fake News Detection in English, where we achieved 3rd place with a weighted F1 score of 0.9859 on the test set. Specifically, we propose an ensemble of different pre-trained language models, such as BERT, RoBERTa, and ERNIE, combined with various training strategies including warm-up, learning rate scheduling, and k-fold cross-validation. We also conduct an extensive analysis of the samples that are not correctly classified. The code is available at https://github.com/archersama/3rd-solution-COVID19-Fake-News-Detection-in-English.
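The abstract names two ensembling ideas: training models across k folds and combining several pre-trained language models. The authors' actual implementation is in the linked repository; the sketch below is only a hypothetical illustration of the combination step, using soft voting (averaging per-model probabilities) on toy numbers. All model names and values here are illustrative, not taken from the paper.

```python
# Hypothetical sketch (not the authors' code) of soft-voting across models,
# one way an ensemble of fine-tuned classifiers can be combined.
from statistics import mean

def soft_vote(prob_lists):
    """Average the 'fake' probability from several models, per sample."""
    return [mean(ps) for ps in zip(*prob_lists)]

def to_labels(probs, threshold=0.5):
    """Threshold averaged probabilities into class labels."""
    return ["fake" if p >= threshold else "real" for p in probs]

# Toy 'fake' probabilities from three hypothetical fine-tuned models
# on the same four samples.
bert_probs    = [0.91, 0.12, 0.48, 0.70]
roberta_probs = [0.88, 0.05, 0.55, 0.64]
ernie_probs   = [0.95, 0.20, 0.40, 0.72]

ensemble = soft_vote([bert_probs, roberta_probs, ernie_probs])
labels = to_labels(ensemble)
# e.g. sample 3 is borderline: individual models disagree (0.48/0.55/0.40),
# but the averaged probability falls below the threshold.
```

The same averaging can be applied across the k models produced by k-fold cross-validation, so every fold's model contributes to the final test-set prediction.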
Full text:
Available
Collection:
Preprints
Database:
PREPRINT-ARXIV
Main subject:
COVID-19
Language:
English
Year:
2021
Document Type:
Preprint