Hyperparameter Optimization in Binary Communication Networks for Neuromorphic Deployment
Main authors: | , , , , , , , , , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Summary: | Training neural networks for neuromorphic deployment is non-trivial. A variety of approaches have been proposed to adapt back-propagation or back-propagation-like algorithms to make them appropriate for training these networks. Because these networks often have very different performance characteristics than traditional neural networks, it is often unclear how to set either the network topology or the hyperparameters to achieve optimal performance. In this work, we introduce a Bayesian approach for optimizing the hyperparameters of an algorithm for training binary communication networks that can be deployed to neuromorphic hardware. We show that by optimizing the hyperparameters of this algorithm for each dataset, we can achieve improvements in accuracy over the previous state-of-the-art for this algorithm on each dataset (by up to 15 percent). This jump in performance continues to emphasize the potential of converting traditional neural networks to binary communication networks applicable to neuromorphic hardware. |
---|---|
DOI: | 10.48550/arxiv.2005.04171 |
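
The abstract describes Bayesian optimization of the hyperparameters of a training algorithm. The record does not detail the optimizer itself, so the following is only a minimal sketch of one standard formulation: a Gaussian-process surrogate with an expected-improvement acquisition. The objective `train_and_score`, the two hyperparameters (log learning rate and training epochs), and their ranges are all hypothetical stand-ins for "train the binary communication network with these settings and return validation accuracy."

```python
# Minimal sketch of Gaussian-process Bayesian optimization (assumed method,
# not the paper's exact procedure).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical stand-in for "train the binary communication network and
# return validation accuracy"; the paper's training algorithm is not
# reproduced here.
def train_and_score(log10_lr, epochs):
    return -((log10_lr + 3.0) ** 2) - ((epochs - 30.0) / 30.0) ** 2

# Assumed search space: log10(learning rate) in [-5, -1], epochs in [5, 100].
bounds = np.array([[-5.0, -1.0], [5.0, 100.0]])

def sample(n):
    return rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, 2))

X = sample(5)  # initial random design
y = np.array([train_and_score(*x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):
    gp.fit(X, y)
    cand = sample(1000)  # random candidates scored by the acquisition
    mu, sigma = gp.predict(cand, return_std=True)
    # Expected improvement over the best observed score.
    z = (mu - y.max()) / np.maximum(sigma, 1e-9)
    ei = (mu - y.max()) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, train_and_score(*x_next))

best = np.argmax(y)
print(f"best: lr={10 ** X[best, 0]:.2e}, epochs={X[best, 1]:.0f}, score={y[best]:.4f}")
```

Maximizing the acquisition over a batch of random candidates keeps the sketch short; a practical implementation would typically use a multi-start local optimizer for that inner step.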