Gated spiking neural network using Iterative Free-Energy Optimization and rank-order coding for structure learning in memory sequences (INFERNO GATE)
| Field | Value |
|---|---|
| Published in | Neural Networks, 2020-01, Vol. 121, pp. 242-258 |
| Main authors | , , , |
| Format | Article |
| Language | English |
| Subjects | |
| Online access | Full text |
Abstract: We present a framework based on iterative free-energy optimization with spiking neural networks for modeling the fronto-striatal system (PFC-BG) for the generation and recall of audio memory sequences. In line with neuroimaging studies carried out in the PFC, we propose a genuine coding strategy using the gain-modulation mechanism to represent abstract sequences based solely on the rank and location of items within them. Based on this mechanism, we show that we can construct a repertoire of neurons sensitive to the temporal structure of sequences, from which any novel sequence can be represented. Free-energy optimization is then used to explore and retrieve the missing indices of the items in the correct order for executive control and compositionality. We show that the gain-modulation mechanism makes the network robust to variability and gives it long-term dependencies, as it implements a gated recurrent neural network. This model, called Inferno Gate, is an extension of the neural architecture Inferno, standing for Iterative Free-Energy Optimization of Recurrent Neural Networks with Gating or Gain-modulation. In experiments performed with an audio database of ten thousand MFCC vectors, Inferno Gate efficiently encodes and retrieves chunks fifty items in length. We then discuss the potential of our network to model features of working memory in the PFC-BG loop for structural learning, goal-direction and hierarchical reinforcement learning.
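The rank-and-location coding described in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's spiking implementation; it only assumes a small repertoire of item indices and an exponential gain decay per rank (both illustrative choices) to show how a sequence can be represented by binding each item's index to its ordinal rank through a rank-dependent gain, so that the code reflects temporal structure rather than item identity.

```python
import numpy as np

def encode_structure(item_indices, n_items, decay=0.8):
    """Bind each item's index to its ordinal rank with a gain that decays
    with rank (rank-order coding). Returns a rank-by-item gain matrix."""
    code = np.zeros((len(item_indices), n_items))
    for rank, item in enumerate(item_indices):
        code[rank, item] = decay ** rank      # gain set by rank, not by identity
    return code

def read_out_order(code, candidate_items):
    """Recover the ordering of an unordered set of items: for each rank,
    pick the candidate carrying the strongest gain in the stored code."""
    return [max(candidate_items, key=lambda i: code[rank, i])
            for rank in range(code.shape[0])]

seq = [3, 0, 7, 2]                            # indices into a repertoire of items
code = encode_structure(seq, n_items=10)
print(read_out_order(code, set(seq)))         # -> [3, 0, 7, 2]
```

Because the gain profile over ranks is the same for any sequence of a given length, structure-sensitive units built on such a code can, in principle, generalize to novel sequences, which is the property the abstract attributes to the learned repertoire.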
Highlights:

- We present a neural architecture based on iterative free-energy optimization and on a gating mechanism for sequence encoding, generation and retrieval in spiking recurrent networks (Inferno Gate).
- We present a novel gating mechanism based on rank-order coding for structure learning.
- The gated network extracts temporal structure from features to encode memory sequences.
- It represents sequences based solely on the rank and location of items, not the items themselves.
- Experiments show fast learning and efficient retrieval of long memory sequences.
- Its features are compositionality, robustness to noise and long-term dependencies.
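The retrieval step named in the abstract and highlights, recovering the missing indices of items in the correct order through free-energy optimization, can likewise be caricatured as an error-minimizing search. The sketch below is a hedged, non-spiking stand-in, not the paper's algorithm: it reuses the illustrative structure code from the previous sketch and recovers an ordering by accepting random swaps that reduce a prediction-error term; the error measure and the swap-based exploration are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def prediction_error(candidate, structure_code):
    """Squared mismatch between the gain the stored code expects at each
    rank and the gain read out for a candidate ordering."""
    predicted = np.array([structure_code[rank, item]
                          for rank, item in enumerate(candidate)])
    target = structure_code.max(axis=1)       # gain expected at each rank
    return float(np.sum((target - predicted) ** 2))

def retrieve_by_exploration(items, structure_code, n_iters=2000):
    """Iteratively explore orderings, keeping only swaps that reduce the
    prediction error -- a crude stand-in for the free-energy-driven search."""
    candidate = list(items)
    best_err = prediction_error(candidate, structure_code)
    for _ in range(n_iters):
        i, j = rng.integers(len(candidate), size=2)
        trial = candidate.copy()
        trial[i], trial[j] = trial[j], trial[i]
        err = prediction_error(trial, structure_code)
        if err < best_err:                    # accept error-reducing moves only
            candidate, best_err = trial, err
    return candidate

# Encode a short sequence (as in the previous sketch), then recover its order.
seq, n_items, decay = [3, 0, 7, 2], 10, 0.8
code = np.zeros((len(seq), n_items))
for rank, item in enumerate(seq):
    code[rank, item] = decay ** rank
print(retrieve_by_exploration(set(seq), code))    # -> [3, 0, 7, 2]
```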
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2019.09.023