Numerical Sequence Prediction using Bayesian Concept Learning


Saved in:
Bibliographic Details
Published in: arXiv.org 2020-01
Main Authors: Damarapati, Mohith, Enaganti, Inavamsi B, Alfred Ajay Aureate Rajakumar
Format: Article
Language: English
Online Access: Full text
Description
Abstract: When people learn mathematical patterns or sequences, they are able to identify the concepts (or rules) underlying those patterns. Having learned the underlying concepts, humans are also able to generalize those concepts to other numbers, even identifying previously unseen combinations of those rules. Current state-of-the-art RNN architectures such as LSTMs perform well in predicting successive elements of sequential data, but require vast amounts of training examples. Even with extensive data, these models struggle to generalize concepts. From our behavioral study, we also found that humans are able to disregard noise and identify the underlying rules generating corrupted sequences. We therefore propose a Bayesian model that captures these human-like learning capabilities to predict the next number in a given sequence, better than traditional LSTMs.
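The rule-based Bayesian approach described in the abstract can be sketched as follows. This is a minimal illustration only: the hypothesis space (arithmetic and geometric progressions), the uniform prior, and the all-or-none likelihood are assumptions made here for clarity, not the paper's actual model, which would also need a noise-tolerant likelihood to match the behavioral findings.

```python
# Illustrative sketch of Bayesian concept learning over number-sequence rules.
# Hypothesis space, prior, and likelihood are assumptions, not the paper's model.

def hypotheses():
    """Return a small hypothesis space of (name, successor-function) rules."""
    hs = []
    # Arithmetic progressions: next = last + d
    for d in range(1, 11):
        hs.append((f"add {d}", lambda last, d=d: last + d))
    # Geometric progressions: next = last * r
    for r in range(2, 6):
        hs.append((f"times {r}", lambda last, r=r: last * r))
    return hs

def posterior(seq, hs):
    """Uniform prior over rules; likelihood 1 if a rule reproduces every
    observed transition, 0 otherwise (an all-or-none likelihood)."""
    weights = {}
    for name, f in hs:
        ok = all(f(seq[i]) == seq[i + 1] for i in range(len(seq) - 1))
        weights[name] = 1.0 if ok else 0.0
    z = sum(weights.values())
    return {n: w / z for n, w in weights.items() if w} if z else {}

def predict_next(seq):
    """Posterior-weighted prediction of the next element: averages the
    predictions of all rules consistent with the observed sequence."""
    hs = dict(hypotheses())
    post = posterior(seq, list(hs.items()))
    return sum(p * hs[name](seq[-1]) for name, p in post.items())

print(predict_next([2, 4, 6, 8]))   # only "add 2" is consistent -> 10.0
print(predict_next([3, 6, 12]))     # only "times 2" is consistent -> 24.0
```

With a richer hypothesis space, several rules can remain consistent with a short sequence, and the posterior-weighted average then reflects that uncertainty rather than committing to a single rule.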
ISSN:2331-8422