Visualizing RNN States with Predictive Semantic Encodings


Bibliographic Details
Published in: arXiv.org 2019-08
Main Authors: Sawatzky, Lindsey; Bergner, Steven; Popowich, Fred
Format: Article
Language: English
Subjects:
Online Access: Full Text
Description
Summary: Recurrent Neural Networks are an effective and prevalent tool used to model sequential data such as natural language text. However, their deep nature and massive number of parameters pose a challenge for those intending to study precisely how they work. We present a visual technique that gives a high-level intuition of the semantics of the hidden states within Recurrent Neural Networks. This semantic encoding allows hidden states to be compared throughout the model independent of their internal details. The proposed technique is presented in a proof-of-concept visualization tool, which is demonstrated on the natural language processing task of language modelling.
ISSN: 2331-8422
DOI: 10.48550/arxiv.1908.00588
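
The record above gives only the abstract, not the method's details. As a rough illustration of the general idea of comparing hidden states through a shared semantic space, and not the authors' actual technique, the Python sketch below projects toy hidden states through an output layer into a distribution over a small vocabulary and compares the resulting distributions. All names, shapes, and the choice of Jensen-Shannon divergence are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions; a real model would use its trained output projection.
vocab_size, hidden_size = 10, 8
W_out = rng.normal(size=(vocab_size, hidden_size))  # hypothetical output weights
b_out = np.zeros(vocab_size)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def semantic_encoding(h):
    # Map a hidden state to a distribution over the vocabulary, giving
    # states from different layers or time steps a common, interpretable space.
    return softmax(W_out @ h + b_out)

def js_divergence(p, q, eps=1e-12):
    # Symmetric measure for comparing two encodings.
    def kl(a, b):
        return float(np.sum(a * np.log((a + eps) / (b + eps))))
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Random vectors stand in for hidden states taken from different
# positions or layers of an RNN during language modelling.
h1 = rng.normal(size=hidden_size)
h2 = rng.normal(size=hidden_size)
print(js_divergence(semantic_encoding(h1), semantic_encoding(h2)))

Because the comparison happens in the prediction space rather than in the raw hidden-state space, states of different sizes or from different layers can, under this assumed setup, be placed on a common footing, which is the spirit of the comparison described in the abstract.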