Time Masking for Temporal Language Models
Saved in:
Main authors: , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: Our world is constantly evolving, and so is the content on the web.
Consequently, our languages, often said to mirror the world, are dynamic in
nature. However, most current contextual language models are static and cannot
adapt to changes over time. In this work, we propose a temporal contextual
language model called TempoBERT, which uses time as an additional context of
texts. Our technique is based on modifying texts with temporal information and
performing time masking - specific masking for the supplementary time
information. We leverage our approach for the tasks of semantic change
detection and sentence time prediction, experimenting on diverse datasets in
terms of time, size, genre, and language. Our extensive evaluation shows that
both tasks benefit from exploiting time masking.
DOI: 10.48550/arxiv.2110.06366
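
The abstract describes the mechanism only at a high level: a time token is attached to each text and that token is specifically masked during masked-language-model training, so the model learns to predict when a sentence was written. Below is a minimal sketch of how such time masking could be wired up with Hugging Face `transformers`. The time-token format (`<2020>`), the `bert-base-uncased` checkpoint, the year range, and the helper `time_masked_example` are illustrative assumptions, not the paper's actual configuration.

```python
# Sketch of the "time masking" idea: prepend a time token to each text,
# then mask that token so the MLM loss teaches the model to associate
# content with its time period. All names and settings here are assumptions.

from transformers import BertTokenizerFast, BertForMaskedLM
import torch

time_tokens = [f"<{year}>" for year in range(2010, 2022)]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
tokenizer.add_special_tokens({"additional_special_tokens": time_tokens})

model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.resize_token_embeddings(len(tokenizer))  # make room for the new time tokens


def time_masked_example(text: str, year: int):
    """Prepend the text's time token, then replace it with [MASK].

    Only the masked time position gets a label, so the loss focuses on
    predicting *when* the sentence was written (time masking).
    """
    time_token = f"<{year}>"
    enc = tokenizer(f"{time_token} {text}", return_tensors="pt")
    input_ids = enc["input_ids"].clone()
    labels = torch.full_like(input_ids, -100)  # -100 = ignored by the MLM loss

    time_id = tokenizer.convert_tokens_to_ids(time_token)
    pos = (input_ids == time_id).nonzero(as_tuple=True)
    labels[pos] = time_id                      # supervise only the time token
    input_ids[pos] = tokenizer.mask_token_id   # hide it from the model
    return input_ids, enc["attention_mask"], labels


# Sentence time prediction then reduces to a fill-mask query over the time tokens.
input_ids, attention_mask, labels = time_masked_example(
    "the coronavirus pandemic changed daily life", 2020
)
with torch.no_grad():
    logits = model(input_ids=input_ids, attention_mask=attention_mask).logits
mask_pos = (input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)
time_ids = tokenizer.convert_tokens_to_ids(time_tokens)
scores = logits[mask_pos][0, time_ids]         # restrict scores to candidate years
print(time_tokens[int(scores.argmax())])       # predicted year (model untrained here)
```

In a full setup one would presumably loop `time_masked_example` over a time-stamped corpus and feed `labels` into the MLM loss; the final fill-mask query illustrates how sentence time prediction can then be read off directly, in line with the tasks named in the abstract.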