Self-Attention Mechanism of RoBERTa to Improve QAS for e-health Education
4th International Conference on Computer and Informatics Engineering, IC2IE 2021 ; : 221-225, 2021.
Article in English | Scopus | ID: covidwho-1700617
ABSTRACT
This paper discusses the development of a health-education system in the form of a Question Answering System (QAS) for the current COVID-19 pandemic. A QAS lets users express an information need as a natural-language question and returns a short text quote or sentence phrase as the answer. This design reflects the tendency of readers to understand news and information more easily when the questions arising in their minds are answered directly. The approach used a self-attention mechanism, specifically the Robustly Optimized BERT Pretraining Approach (RoBERTa), applied to question answering with span-based training that predicts the start index and the end index of the answer span. On a set of 835 non-description questions, the best evaluation on the training data reached an exact match of 91.7% and an F1 score of 93.3%. RoBERTa tends to perform better on non-description questions with short answers than on description questions with complex answers. © 2021 IEEE.
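The span-based decoding described above can be sketched as follows: the model scores every token as a candidate answer start and answer end, and the predicted answer is the span maximizing the combined score with start ≤ end. This is a minimal illustration of that decoding step only; the logits and the example passage are hypothetical values, not output from the paper's model.

```python
def best_span(start_logits, end_logits, max_len=30):
    """Return (start, end) maximizing start_logits[s] + end_logits[e], s <= e."""
    best = (0, 0)
    best_score = float("-inf")
    for s, s_score in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_score + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best

# Hypothetical per-token logits over a tokenized passage.
tokens = ["COVID-19", "is", "caused", "by", "the", "SARS-CoV-2", "virus"]
start_logits = [0.1, 0.0, 0.2, 0.1, 0.3, 2.5, 0.4]
end_logits   = [0.2, 0.1, 0.0, 0.1, 0.2, 1.0, 2.2]

s, e = best_span(start_logits, end_logits)
print(" ".join(tokens[s:e + 1]))  # → SARS-CoV-2 virus
```

In practice the logits would come from a RoBERTa model with a question-answering head, and exact match / F1 would then compare the extracted span text against the gold answer.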
Full text: Available Collection: Databases of international organizations Database: Scopus Language: English Journal: 4th International Conference on Computer and Informatics Engineering, IC2IE 2021 Year: 2021 Document Type: Article