Robust Multi-Prototypes Aware Integration for Zero-Shot Cross-Domain Slot Filling

Bibliographic Details
Published in: IEEE Signal Processing Letters, 2024, Vol. 31, pp. 3169-3173
Main Authors: Chen, Shaoshen; Huang, Peijie; Zhu, Zhanbiao; Zhang, Yexing; Xu, Yuhong
Format: Article
Language: English
Description
Summary: Cross-domain slot filling is a widely explored problem in spoken language understanding (SLU), which requires the model to transfer between different domains under data sparsity conditions. Dominant two-step hierarchical models first extract slot entities and then compute a similarity score between slot description-based prototypes and the last hidden layer of the slot entity, selecting the closest prototype as the predicted slot type. However, these models use only slot descriptions as prototypes, which lacks robustness. Moreover, these approaches largely disregard the inherent knowledge in the slot entity embedding and therefore suffer from overfitting. In this letter, we propose a Robust Multi-prototypes Aware Integration (RMAI) method for zero-shot cross-domain slot filling. In RMAI, more robust slot entity-based prototypes and the inherent knowledge in the slot entity embedding are utilized to improve classification performance and alleviate the risk of overfitting. Furthermore, a multi-prototypes aware integration approach is proposed to effectively integrate both our proposed slot entity-based prototypes and the slot description-based prototypes. Experimental results on the SNIPS dataset demonstrate the strong performance of RMAI.
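The two-step pipeline described in the summary scores a slot entity's embedding against a set of label prototypes and assigns the slot type of the closest one. A minimal sketch of that nearest-prototype step follows; the use of cosine similarity and the toy vectors are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def nearest_prototype(entity_emb, prototypes, labels):
    """Assign the slot type whose prototype is most similar to the
    entity embedding (cosine similarity; an assumed scoring choice,
    not necessarily the one used in RMAI)."""
    # L2-normalize the entity embedding and each prototype row.
    e = entity_emb / np.linalg.norm(entity_emb)
    P = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    scores = P @ e  # cosine similarity to every prototype
    return labels[int(np.argmax(scores))], scores

# Toy example: two hypothetical slot-type prototypes in 2-D.
prototypes = np.array([[1.0, 0.0],   # e.g. "city"
                       [0.0, 1.0]])  # e.g. "date"
label, scores = nearest_prototype(np.array([0.9, 0.1]),
                                  prototypes, ["city", "date"])
```

An integration scheme such as RMAI's would combine scores from several prototype sets (description-based and entity-based) before the argmax, rather than relying on a single set as above.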
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2024.3495561