Evaluation of PNN pattern-layer activation function approximations in different training setups
Published in: International Journal of Speech Technology, 2019-12, Vol. 22(4), pp. 1039-1049
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: The processing of inputs in the first two layers of the probabilistic neural network (PNN) is highly parallel, which makes it well suited to hardware implementation on FPGAs. One of the main inconveniences, however, remains the implementation of the nonlinear activation function of the pattern-layer neurons. In the present study, we investigate the applicability of three approximations of the exponential activation function based on look-up tables of different precision, and the effect this has on the training process and the classification accuracy. Furthermore, seeking a highly parallel, hardware-friendly algorithm for the automated adjustment of the spread factor σ_i, we investigated the performance of fifteen PNN training setups based on the differential evolution (DE) or unified particle swarm optimization (UPSO) methods. The experimental evaluation followed a common experimental protocol using the Parkinson Speech Dataset, as this research aims to support the development of portable medical devices capable of detecting episodes of exacerbation in patients with Parkinson's disease. The performance of the most successful setups is discussed in terms of error rates and from the perspective of the resources required for an FPGA-based implementation.
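For readers who want a concrete picture of the technique the abstract describes, the following Python sketch shows a PNN pattern layer whose exponential activation exp(-||x - w||^2 / (2σ^2)) is read from a fixed-size look-up table, a software analogue of the LUT-based FPGA approximation the study evaluates. The table size, input range, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a PNN pattern layer whose
# exponential activation exp(-z) is read from a precomputed look-up table,
# mimicking the LUT-based approximation studied for FPGA deployment.
# The table size and input range below are assumed values.

def build_exp_lut(n_entries=256, z_max=16.0):
    """Tabulate exp(-z) on n_entries points over [0, z_max]."""
    grid = np.linspace(0.0, z_max, n_entries)
    return np.exp(-grid), z_max

def lut_exp(z, table, z_max):
    """Nearest-entry LUT read; saturates for arguments beyond z_max."""
    idx = np.minimum((z / z_max * (len(table) - 1)).astype(int),
                     len(table) - 1)
    return table[idx]

def pattern_layer(x, centers, sigma, table, z_max):
    """Activations of all pattern neurons for a single input vector x.

    centers -- (M, d) stored training patterns (one neuron per pattern)
    sigma   -- (M,) per-neuron spread factors
    """
    z = np.sum((centers - x) ** 2, axis=1) / (2.0 * sigma ** 2)
    return lut_exp(z, table, z_max)

# Toy usage: score two classes by averaging pattern activations per class.
rng = np.random.default_rng(0)
centers = rng.normal(size=(10, 4))
labels = np.array([0] * 5 + [1] * 5)
table, z_max = build_exp_lut()
act = pattern_layer(rng.normal(size=4), centers, np.full(10, 0.5), table, z_max)
scores = np.array([act[labels == c].mean() for c in (0, 1)])
print("predicted class:", scores.argmax())
```

Each pattern neuron's work (a squared distance, a scaling, and one table read) is independent of the others, which is the parallelism the abstract points to as the motivation for an FPGA implementation.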
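Likewise, a minimal DE/rand/1/bin loop sketches how the spread factors σ_i could be tuned by differential evolution, one of the two population-based methods the study compares. The population size, the F and CR parameters, the bounds, and the `fitness` callback are assumptions made for illustration; the paper evaluates fifteen distinct DE/UPSO setups that are not reproduced here.

```python
import numpy as np

# Illustrative DE/rand/1/bin sketch for tuning PNN spread factors.
# F, CR, bounds, population size, and the fitness callback are assumptions.

def de_optimize(fitness, dim, bounds=(0.05, 2.0), pop_size=20,
                F=0.6, CR=0.9, generations=50, seed=0):
    """Minimize fitness(vector_of_sigmas) with differential evolution."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    cost = np.array([fitness(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three randomly chosen population members.
            a, b, c = pop[rng.choice(pop_size, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover; force at least one mutated coordinate.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: keep the trial if it is no worse.
            t_cost = fitness(trial)
            if t_cost <= cost[i]:
                pop[i], cost[i] = trial, t_cost
    best = cost.argmin()
    return pop[best], cost[best]

# Hypothetical usage: `pnn_error` would run the LUT-based PNN above on a
# validation split and return its error rate for the candidate sigmas.
# best_sigma, best_err = de_optimize(pnn_error, dim=10)
```

All trial vectors within a generation can be evaluated independently, which is what makes population-based methods such as DE and UPSO attractive for the highly parallel hardware setting the abstract targets.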
ISSN: 1381-2416, 1572-8110
DOI: 10.1007/s10772-019-09640-7