Distributional formal semantics
Published in: Information and Computation, 2022-09, Vol. 287, p. 104763 (Article 104763)
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Natural language semantics has recently sought to combine the complementary strengths of formal and distributional approaches to meaning. However, given the fundamentally different ‘representational currency’ underlying these approaches—models of the world versus linguistic co-occurrence—their unification has proven extremely difficult. Here, we define Distributional Formal Semantics, which integrates distributionality into a formal semantic system on the level of formal models. This approach offers probabilistic, distributed meaning representations that are inherently compositional, and that naturally capture fundamental semantic notions such as quantification and entailment. Furthermore, we show how the probabilistic nature of these representations allows for probabilistic inference, and how the information-theoretic notion of “information” (measured in Entropy and Surprisal) naturally follows from it. Finally, we illustrate how meaning representations can be derived incrementally from linguistic input using a recurrent neural network model, and how the resultant incremental semantic construction procedure intuitively captures key semantic phenomena, including negation, presupposition, and anaphoricity.
ISSN: 0890-5401, 1090-2651
DOI: 10.1016/j.ic.2021.104763
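The abstract notes that entropy and surprisal follow naturally from probabilistic meaning representations defined over formal models. The following is a minimal illustrative sketch, not taken from the article: it assumes a toy, explicitly enumerated probability distribution over candidate models (the model names and probabilities are invented), whereas the paper works with distributed meaning vectors. It shows how processing a word can be treated as conditioning on the set of models consistent with it, with entropy and surprisal read off the resulting probability mass.

```python
import math

# Hypothetical prior over candidate models (possible states of affairs).
# Model names and probabilities are invented for illustration only.
prior = {
    "rain_and_wet": 0.4,
    "rain_and_dry": 0.1,
    "no_rain_wet": 0.2,
    "no_rain_dry": 0.3,
}

def entropy(dist):
    """Shannon entropy (in bits) of a distribution over models."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def surprisal(dist, consistent):
    """Surprisal (in bits) of an incoming word: -log2 of the prior
    probability mass of the models that remain consistent with it."""
    mass = sum(p for m, p in dist.items() if m in consistent)
    return -math.log2(mass)

def update(dist, consistent):
    """Condition the distribution on the consistent models and renormalise."""
    posterior = {m: p for m, p in dist.items() if m in consistent}
    z = sum(posterior.values())
    return {m: p / z for m, p in posterior.items()}

# Processing a word like "rain" rules out the no-rain models.
consistent_with_rain = {"rain_and_wet", "rain_and_dry"}

print(f"prior entropy:     {entropy(prior):.3f} bits")
print(f"surprisal('rain'): {surprisal(prior, consistent_with_rain):.3f} bits")
posterior = update(prior, consistent_with_rain)
print(f"posterior entropy: {entropy(posterior):.3f} bits")
```

In this toy setting, entropy quantifies uncertainty about which model holds before and after a word is processed, and surprisal is the negative log of the probability mass that word retains; how these quantities are derived from distributed, compositionally constructed representations is developed in the article itself.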