Ensemble Transformer for Efficient and Accurate Ranking Tasks: an Application to Question Answering Systems
Main Authors: | , , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Summary: | Findings of the Association for Computational Linguistics: EMNLP 2022.
Large transformer models can greatly improve Answer Sentence Selection (AS2)
tasks, but their high computational costs prevent their use in many real-world
applications. In this paper, we explore the following research question: How
can we make AS2 models more accurate without significantly increasing their
model complexity? To address the question, we propose a Multiple Heads Student
architecture (named CERBERUS), an efficient neural network designed to distill
an ensemble of large transformers into a single smaller model. CERBERUS
consists of two components: a stack of transformer layers used to encode
inputs, and a set of ranking heads; unlike traditional distillation
techniques, each head is trained by distilling a different large transformer
architecture in a way that preserves the diversity of the ensemble members. The
resulting model captures the knowledge of heterogeneous transformer models
using just a few extra parameters. We show the effectiveness of CERBERUS on
three English datasets for AS2; our proposed approach outperforms all
single-model distillations we consider, rivaling state-of-the-art large AS2
models that have 2.7x more parameters and run 2.5x slower. Code for our model
is available at https://github.com/amazon-research/wqa-cerberus |
DOI: | 10.48550/arxiv.2201.05767 |
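
The abstract describes the Multiple Heads Student design: a shared stack of transformer layers that encodes the (question, candidate answer) pair, followed by several lightweight ranking heads, each distilled from a different large teacher. The following is a minimal PyTorch sketch of that idea; the `MultiHeadStudent` class, the `roberta-base` encoder, the head layout, and the MSE distillation loss are illustrative assumptions for this summary, not the authors' released implementation (see the repository linked above for that).

```python
# Minimal sketch of a multi-head student in the spirit of CERBERUS:
# one shared transformer encoder plus several small ranking heads, each
# distilled from a different teacher in the ensemble.
# Encoder name, head count, and loss choice are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel


class MultiHeadStudent(nn.Module):
    def __init__(self, encoder_name: str = "roberta-base", num_heads: int = 3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # One small scoring head per ensemble member (teacher) being distilled.
        self.heads = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden, hidden), nn.Tanh(), nn.Linear(hidden, 1))
            for _ in range(num_heads)
        )

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]   # [CLS] vector for the (question, candidate) pair
        scores = [head(cls).squeeze(-1) for head in self.heads]
        return torch.stack(scores, dim=-1)  # shape: (batch, num_heads)


def distillation_loss(student_scores, teacher_scores):
    # Each head regresses the score produced by its own teacher, which is one
    # simple way to preserve the diversity of the ensemble members.
    return nn.functional.mse_loss(student_scores, teacher_scores)
```

At inference time a single forward pass yields all head scores, and candidates can be ranked by, for example, their average; relative to a single-head student, the only extra parameters are the small per-head layers, which matches the abstract's claim of capturing an ensemble with just a few additional parameters.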