Efficient FPGA Implementation of Multilayer Perceptron for Real-Time Human Activity Classification
| Published in: | IEEE Access, 2019, Vol. 7, pp. 26696-26706 |
|---|---|
| Main authors: | , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| Abstract: | Smartphone-based human activity recognition (HAR) systems cannot deliver high-end performance for challenging applications. We propose a dedicated hardware-based HAR system for smart military wearables, which uses a multilayer perceptron (MLP) algorithm to perform activity classification. To achieve a flexible and efficient hardware design, the inherently parallel MLP architecture is implemented on an FPGA. System performance has been evaluated using the UCI human activity dataset with 7767 feature samples from 20 subjects. Three combinations of the dataset are trained, validated, and tested on ten MLP models with distinct topologies. The MLP design with the 7-6-5 topology is selected on the basis of its classification accuracy and cross-entropy performance. Five versions of the final MLP design (7-6-5) with different data precisions are implemented on the FPGA. The analysis shows that the MLP designed with 16-bit fixed-point data precision is the most efficient implementation in terms of classification accuracy, resource utilization, and power consumption. The proposed MLP design requires only 270 ns per classification and consumes 120 mW of power. The recognition accuracy and hardware performance achieved are better than those of many recently reported works. |
|---|---|
| ISSN: | 2169-3536 |
| DOI: | 10.1109/ACCESS.2019.2900084 |
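
The record gives only the network topology (7-6-5) and the 16-bit fixed-point precision, not the paper's actual implementation details. The following C sketch is therefore just a software reference model of such a network under stated assumptions: a Q8.8 fixed-point format, zero placeholder weights, and a simple saturating ReLU-style activation, none of which are taken from the paper. On the FPGA, the per-neuron multiply-accumulate operations would run in parallel; the loops below are the sequential equivalent.

```c
/* Software reference model of a 7-6-5 MLP in 16-bit fixed point.
 * Assumptions (not from the paper): Q8.8 format, zero placeholder
 * weights, and a saturating ReLU-style activation. */
#include <stdint.h>
#include <stdio.h>

#define FRAC_BITS 8          /* assumed Q8.8: 8 integer, 8 fraction bits */
typedef int16_t fx16;

/* Saturating ReLU on a Q16.16 accumulator, returning a Q8.8 value. */
static fx16 fx_act(int32_t acc_q16)
{
    int32_t v = acc_q16 >> FRAC_BITS;      /* rescale back to Q8.8 */
    if (v < 0)         return 0;
    if (v > INT16_MAX) return INT16_MAX;   /* saturate on overflow */
    return (fx16)v;
}

/* One fully connected layer: out[j] = act(b[j] + sum_i w[j][i] * in[i]).
 * Weights are row-major: n_out rows of n_in Q8.8 values. */
static void layer(const fx16 *in, int n_in,
                  const fx16 *w, const fx16 *b,
                  fx16 *out, int n_out)
{
    for (int j = 0; j < n_out; ++j) {
        int32_t acc = (int32_t)b[j] << FRAC_BITS;   /* bias in Q16.16 */
        for (int i = 0; i < n_in; ++i)
            acc += (int32_t)w[j * n_in + i] * (int32_t)in[i];
        out[j] = fx_act(acc);
    }
}

int main(void)
{
    /* Placeholder parameters; a real deployment would load the trained
     * weights quantized to the chosen fixed-point format. */
    static const fx16 w1[6 * 7] = { 0 }, b1[6] = { 0 };
    static const fx16 w2[5 * 6] = { 0 }, b2[5] = { 0 };
    fx16 features[7] = { 1 << FRAC_BITS, 0, 0, 0, 0, 0, 0 }; /* 1.0, ... */
    fx16 hidden[6], scores[5];

    layer(features, 7, w1, b1, hidden, 6);  /* 7 inputs -> 6 hidden units */
    layer(hidden, 6, w2, b2, scores, 5);    /* 6 hidden -> 5 class scores */

    int cls = 0;                            /* arg-max over class scores */
    for (int k = 1; k < 5; ++k)
        if (scores[k] > scores[cls]) cls = k;
    printf("predicted activity class: %d\n", cls);
    return 0;
}
```

As a design note, a 16-bit word keeps each multiply within the native width of the hardened multipliers on many FPGAs, which is consistent with the abstract's finding that the 16-bit fixed-point version gives the best trade-off between classification accuracy, resource utilization, and power.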