Ensemble Distillation for BERT-Based Ranking Models
11th ACM SIGIR International Conference on Theory of Information Retrieval, ICTIR 2021: 131-136, 2021.
Article in English | Scopus | ID: covidwho-1405232
ABSTRACT
Over the past two years, large pretrained language models such as BERT have been applied to text ranking problems and have shown superior performance on multiple public benchmark data sets. Prior work demonstrated that an ensemble of multiple BERT-based ranking models can not only boost performance, but also reduce performance variance. However, an ensemble of models is more costly because it needs computing resources and/or inference time proportional to the number of models. In this paper, we study how to retain the performance of an ensemble of models at the inference cost of a single model by distilling the ensemble into a single BERT-based student ranking model. Specifically, we study different designs of teacher labels, various distillation strategies, as well as multiple distillation losses tailored for ranking problems. We conduct experiments on the MS MARCO passage ranking and TREC-COVID data sets. Our results show that even with these simple distillation techniques, the distilled model can effectively retain the performance gain of the ensemble of multiple models. More interestingly, the performance of distilled models is also more stable than that of models fine-tuned on the original labeled data. The results reveal a promising direction to capitalize on the gains achieved by an ensemble of BERT-based ranking models. © 2021 Owner/Author.
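To make the distillation setup described in the abstract concrete, the Python/PyTorch sketch below shows one simple way an ensemble of rankers can be distilled into a single student: the teachers' relevance scores are averaged to form the teacher label, and the student is trained to regress onto that label with an MSE loss. All names (ScoringModel, ensemble_teacher_scores, distillation_step) and the specific choice of score averaging plus MSE are illustrative assumptions; the paper's actual teacher-label designs and ranking-specific distillation losses are not specified in the abstract.

# Minimal sketch of ensemble-to-single-model distillation for a BERT-style ranker.
# The encoder is replaced by a stand-in linear scorer over precomputed features;
# a real setup would score (query, passage) pairs with a fine-tuned BERT cross-encoder.
import torch
import torch.nn as nn


class ScoringModel(nn.Module):
    """Hypothetical ranker that maps a pooled (query, passage) feature vector to a score."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        # Stand-in for a pretrained BERT encoder plus scoring head.
        self.score_head = nn.Linear(hidden_size, 1)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.score_head(features).squeeze(-1)


def ensemble_teacher_scores(teachers, features):
    """Average the teachers' relevance scores (one common choice of teacher label)."""
    with torch.no_grad():
        return torch.stack([t(features) for t in teachers], dim=0).mean(dim=0)


def distillation_step(student, teachers, features, optimizer):
    """One pointwise distillation step: regress the student score onto the
    ensemble-averaged score with an MSE loss (one of several possible losses)."""
    target = ensemble_teacher_scores(teachers, features)
    loss = nn.functional.mse_loss(student(features), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    teachers = [ScoringModel() for _ in range(3)]   # pretend these are fine-tuned BERT rankers
    student = ScoringModel()
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
    features = torch.randn(8, 768)                  # stand-in for encoded (query, passage) pairs
    print(distillation_step(student, teachers, features, optimizer))

A pairwise or listwise variant would instead distill score differences or full rankings per query, which is closer to the ranking-tailored losses the paper alludes to; the pointwise MSE above is only the simplest instance.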

Full text: Available
Collection: Databases of international organizations
Database: Scopus
Language: English
Journal: 11th ACM SIGIR International Conference on Theory of Information Retrieval, ICTIR 2021
Year: 2021
Document Type: Article

