A simple probabilistic spiking neuron model with Hebbian learning rules

Detailed description

Bibliographic details
Main authors: Ting Wu, Siyao Fu, Long Cheng, Rui Zheng, Xiuqing Wang, Xinkai Kuai, Guosheng Yang
Format: Conference paper
Language: English
Description
Abstract: Traditional spiking neural networks (SNNs) use simulated spiking neuron models as computational units. Action potentials (APs, or spikes) are generated when the integrated sensory or synaptic inputs to a neuron reach a threshold value. However, spike generation is not a deterministic process, which limits the potential and applications of current models. Here we consider the effects of adding probabilistic parameters to the spiking neuron model; these parameters control the synapses established during spike generation and transmission. A Hebbian learning rule is employed to control the self-adaptation of the probabilistic parameters, and the connection weights associated with the synapses are established using Thorpe's rule during the network learning procedure. The proposed framework combines the stochastic characteristics of cortical neurons in vivo, the biological plausibility of Hodgkin-Huxley-type neuron dynamics, and the computational efficiency of integrate-and-fire (I&F) neurons. A simple simulation built following the aforementioned framework (based on Izhikevich's SNN model) exhibits more explicit behavior and more robust performance than the original model and deterministic network organizations.
ISSN: 2161-4393, 2161-4407
DOI: 10.1109/IJCNN.2012.6252438
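The abstract above describes the method only in outline, and the record does not include the model equations. As a rough illustration, the following minimal Python sketch combines the three ingredients it names: an Izhikevich-style neuron, probabilistic spike transmission governed by a per-synapse parameter, a simple Hebbian update of that parameter, and Thorpe's rank-order rule for the initial weights. All function names, constants, and the specific form of the probabilistic and Hebbian rules are assumptions for illustration, not the authors' implementation.

# Minimal sketch (not the published model) of the idea described in the abstract:
# an Izhikevich neuron whose synaptic transmission is gated by a probability p,
# with p adapted by a simple Hebbian rule and weights set by Thorpe's rank-order rule.
import numpy as np

rng = np.random.default_rng(0)

def izhikevich_step(v, u, I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    """One Euler step of the standard Izhikevich neuron; returns (v, u, spiked)."""
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                # spike threshold of the Izhikevich model
        return c, u + d, True    # reset membrane potential and recovery variable
    return v, u, False

def thorpe_weights(spike_ranks, w_max=1.0, mod=0.9):
    """Thorpe's rank-order rule: earlier-firing inputs receive larger weights."""
    return w_max * mod ** np.asarray(spike_ranks, dtype=float)

def hebbian_update_p(p, pre_spiked, post_spiked, eta=0.01):
    """Hypothetical Hebbian adaptation of the transmission probability p."""
    if pre_spiked and post_spiked:
        p += eta * (1.0 - p)     # strengthen synapses whose spikes drive the neuron
    elif pre_spiked and not post_spiked:
        p -= eta * p             # weaken synapses whose spikes fail to do so
    return float(np.clip(p, 0.0, 1.0))

# Toy run: three presynaptic inputs, ranked by arrival order, driving one neuron.
w = thorpe_weights([0, 1, 2])            # rank-order initial weights
p = np.full(3, 0.5)                      # per-synapse transmission probabilities
v, u = -65.0, -13.0                      # resting state of the neuron
for t in range(200):
    pre = rng.random(3) < 0.1                    # random presynaptic spikes
    transmitted = pre & (rng.random(3) < p)      # probabilistic transmission
    I = 20.0 * float(w @ transmitted)            # arbitrary input scaling
    v, u, post = izhikevich_step(v, u, I)
    for i in range(3):
        p[i] = hebbian_update_p(p[i], pre[i], post)
print("final transmission probabilities:", np.round(p, 3))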