Numerical Sequence Prediction using Bayesian Concept Learning

| Published in: | arXiv.org 2020-01 |
|---|---|
| Main Authors: | , , |
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online Access: | Full text |
| Summary: | When people learn mathematical patterns or sequences, they are able to identify the concepts (or rules) underlying those patterns. Having learned the underlying concepts, humans can also generalize those concepts to other numbers, even identifying previously unseen combinations of those rules. Current state-of-the-art RNN architectures such as LSTMs perform well at predicting successive elements of sequential data, but require vast numbers of training examples. Even with extensive data, these models struggle to generalize concepts. From our behavioral study, we also found that humans are able to disregard noise and identify the underlying rules generating corrupted sequences. We therefore propose a Bayesian model that captures these human-like learning capabilities to predict the next number in a given sequence, better than traditional LSTMs. |
| ISSN: | 2331-8422 |
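
The core idea in the abstract, Bayesian concept learning over a hypothesis space of sequence rules, can be sketched in a few lines. The following is a minimal illustration and not the paper's actual model: the rule families (`arithmetic`, `geometric`, squares, powers of two), the uniform prior, and the noise parameter `eps` are all assumptions made here for the example.

```python
# Minimal sketch of Bayesian concept learning for next-number prediction.
# The hypothesis space, prior, and noise model are illustrative assumptions;
# the paper's actual rule grammar and inference may differ.

def arithmetic(start, step):
    """Arithmetic progression: start, start+step, start+2*step, ..."""
    return lambda i: start + i * step

def geometric(start, ratio):
    """Geometric progression: start, start*ratio, start*ratio**2, ..."""
    return lambda i: start * ratio ** i

def build_hypotheses(seq):
    """Enumerate candidate rules, seeding parameters from the first two terms."""
    hyps = []
    if len(seq) >= 2:
        hyps.append(("arithmetic", arithmetic(seq[0], seq[1] - seq[0])))
        if seq[0] != 0 and seq[1] % seq[0] == 0:
            hyps.append(("geometric", geometric(seq[0], seq[1] // seq[0])))
    hyps.append(("squares", lambda i: (i + 1) ** 2))
    hyps.append(("powers_of_two", lambda i: 2 ** i))
    return hyps

def posterior(seq, hyps, eps=0.05):
    """Posterior over rules under a uniform prior and a simple noise model:
    each observed element matches its rule with probability 1-eps."""
    weights = []
    for _name, rule in hyps:
        like = 1.0
        for i, x in enumerate(seq):
            like *= (1 - eps) if rule(i) == x else eps
        weights.append(like)
    z = sum(weights) or 1.0
    return [w / z for w in weights]

def predict_next(seq):
    """MAP prediction of the next element, plus the winning rule's name."""
    hyps = build_hypotheses(seq)
    post = posterior(seq, hyps)
    name, rule = max(zip(hyps, post), key=lambda p: p[1])[0]
    return rule(len(seq)), name

if __name__ == "__main__":
    # Noisy arithmetic sequence: the 12 should be 13, but the
    # arithmetic rule still dominates the posterior.
    print(predict_next([1, 4, 7, 10, 12]))  # -> (16, 'arithmetic')
    print(predict_next([2, 4, 8, 16]))      # -> (32, 'geometric')
```

Because the likelihood concentrates on whichever rule explains most of the observations, a single corrupted element (the 12 in the first example) barely shifts the posterior, which mirrors the noise tolerance the abstract attributes to human learners.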