Constraint based Knowledge Base Distillation in End-to-End Task Oriented Dialogs
Saved in:
Main authors: | , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Summary: | End-to-End task-oriented dialogue systems generate responses based on dialog
history and an accompanying knowledge base (KB). Inferring the KB entities
that are most relevant to an utterance is crucial for response generation.
The existing state of the art scales to large KBs by softly filtering out
irrelevant KB information. In this paper, we propose a novel filtering
technique that consists of (1) a pairwise similarity based filter that
identifies relevant information by respecting the n-ary structure of a KB
record, and (2) an auxiliary loss that helps in separating contextually
unrelated KB information. We also propose a new metric -- multiset entity F1 --
which fixes a correctness issue in the existing entity F1 metric. Experimental
results on three publicly available task-oriented dialog datasets show that our
proposed approach outperforms existing state-of-the-art models. |
DOI: | 10.48550/arxiv.2109.07396 |
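The multiset entity F1 mentioned in the abstract can be sketched as follows. The idea, as the abstract describes it, is to count entities as a multiset rather than a set, so repeated entity mentions in the gold or predicted response are scored individually. The function name and exact aggregation below are illustrative assumptions, not the paper's reference implementation.

```python
from collections import Counter

def multiset_entity_f1(gold_entities, pred_entities):
    """Illustrative sketch of an entity F1 computed over multisets.

    Unlike a set-based entity F1, repeated mentions of the same entity
    are counted separately, so over- or under-generating a duplicate
    entity is penalized rather than silently ignored.
    """
    gold, pred = Counter(gold_entities), Counter(pred_entities)
    # Multiset intersection: per-entity minimum of the two counts.
    true_positives = sum((gold & pred).values())
    if true_positives == 0:
        return 0.0
    precision = true_positives / sum(pred.values())
    recall = true_positives / sum(gold.values())
    return 2 * precision * recall / (precision + recall)
```

For example, with gold mentions `["a", "a", "b"]` and prediction `["a", "b"]`, a set-based F1 would be perfect, while the multiset version yields recall 2/3 and F1 0.8, reflecting the missing second mention of `a`.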