SLANG: New Concept Comprehension of Large Language Models
Format: | Article |
---|---|
Language: | English |
Online access: | Order full text |
Abstract: | The dynamic nature of language, particularly evident in the realm of slang
and memes on the Internet, poses serious challenges to the adaptability of
large language models (LLMs). Traditionally anchored to static datasets, these
models often struggle to keep up with the rapid linguistic evolution
characteristic of online communities. This research aims to bridge this gap by
enhancing LLMs' comprehension of new and evolving concepts on the Internet
without the high cost of continual retraining. In pursuit of this goal, we
introduce **SLANG**, a benchmark designed to autonomously integrate novel data
and assess LLMs' ability to comprehend emerging concepts, alongside **FOCUS**,
an approach that uses causal inference to enhance LLMs' understanding of new
phrases and their colloquial context. Our benchmark and approach draw on
real-world instances of linguistic shifts, which serve as contextual beacons,
to form more precise and contextually relevant connections between newly
emerging expressions and their meanings. The empirical analysis shows that our
causal inference-based approach outperforms baseline methods in precision and
relevance when comprehending Internet slang and memes. |
DOI: | 10.48550/arxiv.2401.12585 |
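
To make the evaluation setup described in the abstract more concrete, the sketch below shows one way a benchmark item for new-concept comprehension might look: a novel term, a real-world usage example acting as a contextual beacon, and a reference gloss against which a model's answer is scored. The item schema, the `query_model` stub, and the token-overlap score are illustrative assumptions for this record; they are not the actual SLANG data format or the FOCUS causal-inference procedure from the paper.

```python
# Illustrative sketch only: a toy evaluation loop for new-concept comprehension.
# The item schema, model stub, and overlap metric are assumptions, not the
# actual SLANG benchmark format or the FOCUS method described in the paper.
from dataclasses import dataclass


@dataclass
class SlangItem:
    term: str        # newly emerging expression
    context: str     # real-world usage acting as a contextual beacon
    reference: str   # reference gloss of the term's meaning


def query_model(prompt: str) -> str:
    """Placeholder for a call to an LLM; replace with a real API client."""
    return "an expression used to describe something mediocre"


def overlap_score(candidate: str, reference: str) -> float:
    """Crude token-overlap proxy for agreement between answer and reference."""
    cand, ref = set(candidate.lower().split()), set(reference.lower().split())
    return len(cand & ref) / max(len(ref), 1)


def evaluate(items: list[SlangItem]) -> float:
    """Prompt the model with each term in context and average the scores."""
    scores = []
    for item in items:
        prompt = (
            f"The phrase '{item.term}' appears in this post:\n"
            f"\"{item.context}\"\n"
            f"Explain what the phrase means in this context."
        )
        answer = query_model(prompt)
        scores.append(overlap_score(answer, item.reference))
    return sum(scores) / len(scores)


if __name__ == "__main__":
    demo = [
        SlangItem(
            term="mid",
            context="Honestly the new season was so mid, I stopped after two episodes.",
            reference="slang for something mediocre or unimpressive",
        )
    ]
    print(f"mean overlap score: {evaluate(demo):.2f}")
```

A real harness would replace `query_model` with an actual LLM call and the overlap heuristic with a stronger scoring method; the point here is only to illustrate the term-plus-context evaluation pattern the abstract describes.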