Spiking Denoising Diffusion Probabilistic Models
Format: Article
Language: English
Abstract: Spiking neural networks (SNNs) have ultra-low energy consumption and high
biological plausibility due to their binary and bio-driven nature compared with
artificial neural networks (ANNs). While previous research has primarily
focused on enhancing the performance of SNNs in classification tasks, the
generative potential of SNNs remains relatively unexplored. In our paper, we
put forward Spiking Denoising Diffusion Probabilistic Models (SDDPM), a new
class of SNN-based generative models that achieve high sample quality. To fully
exploit the energy efficiency of SNNs, we propose a purely Spiking U-Net
architecture, which achieves comparable performance to its ANN counterpart
using only 4 time steps, resulting in significantly reduced energy consumption.
Extensive experimental results reveal that our approach achieves
state-of-the-art results on generative tasks and substantially outperforms other
SNN-based generative models, with up to 12x and 6x improvements on the
CIFAR-10 and CelebA datasets, respectively. Moreover, we propose a
threshold-guided strategy that further improves performance by 2.69% in
a training-free manner. The SDDPM marks a significant advancement in the
field of SNN generation, injecting new perspectives and potential avenues of
exploration. Our code is available at https://github.com/AndyCao1125/SDDPM.
DOI: 10.48550/arxiv.2306.17046
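The abstract describes a purely Spiking U-Net that predicts noise using only 4 SNN time steps. The sketch below is a minimal illustration of that idea, not the authors' implementation: it unrolls a simple leaky integrate-and-fire (LIF) denoiser over T = 4 time steps inside a DDPM-style epsilon-prediction step. PyTorch, the rectangular surrogate gradient, and the names SurrogateSpike, LIFNeuron, TinySpikingDenoiser, and the placeholder t_embed are all illustrative assumptions standing in for the paper's actual Spiking U-Net.

```python
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate gradient in backward."""

    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Let gradients pass only near the firing threshold.
        window = ((v - ctx.threshold).abs() < 0.5).float()
        return grad_output * window, None


class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire neuron with a hard reset after each spike."""

    def __init__(self, threshold=1.0, decay=0.5):
        super().__init__()
        self.threshold = threshold
        self.decay = decay
        self.v = None  # membrane potential carried across time steps

    def reset(self):
        self.v = None

    def forward(self, x):
        if self.v is None:
            self.v = torch.zeros_like(x)
        self.v = self.decay * self.v + x
        spike = SurrogateSpike.apply(self.v, self.threshold)
        self.v = self.v * (1.0 - spike)  # reset membrane where a spike fired
        return spike


class TinySpikingDenoiser(nn.Module):
    """Toy stand-in for the Spiking U-Net: predicts the noise added to x_t."""

    def __init__(self, channels=3, hidden=32, snn_steps=4):
        super().__init__()
        self.snn_steps = snn_steps
        self.conv1 = nn.Conv2d(channels, hidden, 3, padding=1)
        self.lif = LIFNeuron()
        self.conv2 = nn.Conv2d(hidden, channels, 3, padding=1)

    def forward(self, x_t, t_embed):
        # Present the same noisy input for a few SNN time steps; intermediate
        # activations are binary spikes, and the outputs are averaged.
        self.lif.reset()
        outputs = []
        for _ in range(self.snn_steps):
            h = self.lif(self.conv1(x_t) + t_embed)
            outputs.append(self.conv2(h))
        return torch.stack(outputs).mean(dim=0)


# Usage: one DDPM-style training step with the spiking noise predictor.
model = TinySpikingDenoiser()
x0 = torch.rand(8, 3, 32, 32)                       # clean images
noise = torch.randn_like(x0)
alpha_bar = torch.tensor(0.7)                       # cumulative noise-schedule term for step t
x_t = alpha_bar.sqrt() * x0 + (1 - alpha_bar).sqrt() * noise
t_embed = torch.zeros(8, 32, 1, 1)                  # placeholder timestep embedding
loss = ((model(x_t, t_embed) - noise) ** 2).mean()  # simple epsilon-prediction loss
loss.backward()
```

The averaging over the 4 SNN time steps is one common way to read a real-valued prediction out of binary spike trains; the paper's threshold-guided strategy, which the abstract says further improves results without retraining, is not reproduced here.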