A Generalized Language Model as the Combination of Skipped n-grams and Modified Kneser-Ney Smoothing
Main Authors: Rene Pickhardt, Thomas Gottron, Martin Körner, Paul Georg Wagner, Till Speicher, Steffen Staab
Format: Article
Language: English
Online Access: Order full text
Abstract: We introduce a novel approach for building language models based on a
systematic, recursive exploration of skip n-gram models which are interpolated
using modified Kneser-Ney smoothing. Our approach generalizes language models,
as it contains the classical interpolation with lower-order models as a special
case. In this paper we motivate, formalize, and present our approach. In an
extensive empirical experiment over English text corpora, we demonstrate that
our generalized language models reduce perplexity by between 3.1% and 12.7% in
comparison to traditional language models using modified Kneser-Ney smoothing.
Furthermore, we investigate the behaviour on three other languages and on a
domain-specific corpus, where we observe consistent improvements. Finally, we
show that the particular strength of our approach lies in its ability to cope
with sparse training data. Using a very small training data set of only 736 KB
of text, we achieve a perplexity reduction of as much as 25.7%.
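The recursion sketched in the abstract can be written down explicitly. The following is a plausible reconstruction from the abstract alone, not the paper's verbatim equation: $c$ is a count, $D$ a discount, $\gamma$ the back-off weight left over by discounting, and $\mathrm{skip}_j(h)$ denotes the context $h$ with its $j$-th word skipped. Instead of backing off only to the one history obtained by dropping the leftmost word, the model averages uniformly over every history with one more word skipped:

$$
\hat{P}_{\mathrm{GLM}}\left(w_i \mid w_{i-n+1}^{i-1}\right)
= \frac{\max\left\{c\left(w_{i-n+1}^{i}\right) - D,\; 0\right\}}{c\left(w_{i-n+1}^{i-1}\right)}
+ \gamma\left(w_{i-n+1}^{i-1}\right) \cdot \frac{1}{n-1} \sum_{j=i-n+1}^{i-1} \hat{P}_{\mathrm{GLM}}\left(w_i \mid \mathrm{skip}_j\left(w_{i-n+1}^{i-1}\right)\right)
$$

Restricting the sum to the single term $j = i-n+1$ recovers classical interpolated back-off, which is the sense in which the abstract calls the model a generalization.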
DOI: 10.48550/arxiv.1404.3377
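For readers who prefer code, here is a minimal runnable sketch of the same recursion in Python. It is illustrative only: the hooks `discounted_prob` and `weight` are hypothetical stand-ins for a discounted modified Kneser-Ney probability and the interpolation mass it leaves over, not an API from the paper.

```python
def lowered_contexts(context):
    # All variants of `context` with exactly one more word skipped.
    # Classical back-off drops only the leftmost word; the generalized
    # model also wildcards positions in the middle of the context.
    return [context[:i] + ("_",) + context[i + 1:]
            for i, w in enumerate(context) if w != "_"]


def p_glm(word, context, discounted_prob, weight):
    # Toy generalized interpolation over all skip variants of `context`.
    # `discounted_prob(word, context)` stands in for a discounted modified
    # Kneser-Ney probability and `weight(context)` for the mass it leaves
    # over; both hooks are hypothetical, not the paper's API.
    lowered = lowered_contexts(context)
    if not lowered:  # every position skipped: bottom of the recursion
        return discounted_prob(word, context)
    # Uniform average over all models with one more word skipped.
    backoff = sum(p_glm(word, c, discounted_prob, weight)
                  for c in lowered) / len(lowered)
    return discounted_prob(word, context) + weight(context) * backoff


if __name__ == "__main__":
    # Dummy hooks: constant discounted mass and constant back-off weight.
    p = p_glm("model", ("language", "the"),
              discounted_prob=lambda w, c: 0.1,
              weight=lambda c: 0.4)
    print(round(p, 3))  # 0.156 with these toy hooks
```

Swapping `lowered_contexts` for a version that drops only the leftmost remaining word collapses the recursion to classical interpolated back-off; the uniform averaging over all skip positions mirrors the "systematic, recursive exploration" the abstract describes.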