HyFit: Hybrid Fine-Tuning With Diverse Sampling for Abstractive Summarization
| Published in: | IEEE Transactions on Big Data, 2024, pp. 1-12 |
|---|---|
| Main authors: | , , , , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Abstract: | Abstractive summarization, which aims to generate a concise and coherent summary containing the most important facts from the source document, has made significant progress in recent years. Current fine-tuning approaches based on pre-trained models typically rely on autoregressive decoding and maximum likelihood estimation, which can make the distribution over generation histories inconsistent between the training and inference stages, i.e., the exposure bias problem. To alleviate this problem, we propose a hybrid fine-tuning model (HyFit), which combines contrastive learning and reinforcement learning in a diverse sampling space. Firstly, we introduce reparameterization and probability-based sampling methods to generate a set of summary candidates, called the candidates bank, which improves the diversity and quality of the decoding sampling space and incorporates the potential for uncertainty. Secondly, we perform hybrid fine-tuning on the sampled candidates bank, upweighting confident summaries and downweighting unconfident ones. Experiments demonstrate that HyFit significantly outperforms state-of-the-art models on SAMSum and DialogSum. HyFit also shows good performance on low-resource summarization: on the DialogSum dataset, it exceeds the performance of the base model trained on all examples while using only approximately 8% of them. |
| ISSN: | 2332-7790, 2372-2096 |
| DOI: | 10.1109/TBDATA.2024.3387311 |
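
The first step described in the abstract, building a diverse candidates bank by probability-based sampling, can be illustrated with a short sketch. This is a minimal illustration, assuming the HuggingFace transformers API and a generic BART-style checkpoint (facebook/bart-large-cnn as a stand-in, not the authors' model); the paper's reparameterization-based sampler and its exact hybrid contrastive/reinforcement objective are not reproduced here, and the length-normalized log-probability below is only a simple stand-in for the paper's confidence weighting.

```python
# Minimal sketch: build a "candidates bank" of diverse summaries via nucleus
# (probability-based) sampling and attach a per-candidate confidence score.
# Assumptions: HuggingFace transformers, a generic BART-style checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "facebook/bart-large-cnn"          # placeholder checkpoint, not the paper's model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
model.eval()

def candidates_bank(document: str, bank_size: int = 8, top_p: float = 0.95):
    """Sample `bank_size` candidate summaries and score them by confidence."""
    inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=1024)
    with torch.no_grad():
        out = model.generate(
            **inputs,
            do_sample=True,                     # probability-based sampling
            top_p=top_p,                        # nucleus sampling keeps candidates diverse
            num_return_sequences=bank_size,     # size of the candidates bank
            max_new_tokens=128,
            output_scores=True,
            return_dict_in_generate=True,
        )
    texts = tokenizer.batch_decode(out.sequences, skip_special_tokens=True)
    # Per-step log-probabilities of the tokens that were actually sampled.
    step_scores = model.compute_transition_scores(
        out.sequences, out.scores, normalize_logits=True
    )
    gen_tokens = out.sequences[:, 1:]           # drop the decoder start token
    mask = gen_tokens != tokenizer.pad_token_id
    # Length-normalized log-probability as a simple confidence proxy.
    confidence = torch.where(mask, step_scores, torch.zeros_like(step_scores)).sum(-1) / mask.sum(-1)
    # Most confident candidates first.
    return sorted(zip(texts, confidence.tolist()), key=lambda x: -x[1])

# Example (hypothetical dialogue input):
# bank = candidates_bank("Alice: Are we still on for lunch? Bob: Yes, 12:30 works.")
```

Ordering the bank by such a score is one way to decide which candidates to upweight and which to downweight in a subsequent fine-tuning step; the paper's actual weighting scheme may differ.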