Implementing Neural Networks on Nonvolatile FPGAs With Reprogramming
Published in: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, Nov. 2024, Vol. 43, No. 11, pp. 3961-3972
Format: Article
Language: English
Abstract: NV-FPGAs have attracted significant research attention due to their high density, low leakage power, and reduced error rates. The compute-in-memory (CiM) capability of the nonvolatile memory (NVM) crossbar further enables NV-FPGAs to execute high-efficiency, high-throughput neural network (NN) inference. However, as network sizes grow rapidly and parameter footprints often exceed the memory capacity of the field-programmable gate array (FPGA), implementing an entire network on a single FPGA chip becomes impractical. In this article, we exploit the FPGA's inherent run-time reprogramming feature to implement oversized NNs on NV-FPGAs. This approach splits NN models into multiple tasks that are executed cyclically. Specifically, we propose a performance-driven task adapter (PD-Adapter), which aims at high-performance NN inference by using task deployment to optimize settings such as processing element size and quantity, and task switching to select the most suitable switching type for each task. We integrate the proposed PD-Adapter into an open-source toolchain and evaluate it. Experimental results demonstrate that the PD-Adapter achieves a run-time reduction of 85.37% and 76.12% compared to the baseline and an execution-time-first policy, respectively.
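To make the idea in the abstract concrete, below is a minimal, hypothetical Python sketch of the two decisions the PD-Adapter is described as making: splitting an oversized model into tasks that each fit on chip, and picking a switching type per task. The layer sizes, capacity limit, cost model, and function names are illustrative assumptions, not the paper's actual algorithm or toolchain.

```python
# Illustrative sketch only: a toy model of splitting an oversized NN into
# sequentially executed tasks on a capacity-limited NV-FPGA, in the spirit of
# the approach described in the abstract. All names, capacities, and cost
# numbers are hypothetical and not taken from the paper.
from dataclasses import dataclass


@dataclass
class Layer:
    name: str
    param_bytes: int  # parameter footprint of the layer


def split_into_tasks(layers, fpga_capacity_bytes):
    """Greedily pack consecutive layers into tasks that each fit on chip."""
    tasks, current, used = [], [], 0
    for layer in layers:
        if used + layer.param_bytes > fpga_capacity_bytes and current:
            tasks.append(current)
            current, used = [], 0
        current.append(layer)
        used += layer.param_bytes
    if current:
        tasks.append(current)
    return tasks


def choose_switching(task_bytes, full_reconfig_cost, partial_cost_per_byte):
    """Pick the cheaper of two hypothetical switching types for a task."""
    partial_cost = task_bytes * partial_cost_per_byte
    if partial_cost < full_reconfig_cost:
        return "partial", partial_cost
    return "full", full_reconfig_cost


if __name__ == "__main__":
    # Hypothetical 8-layer model whose parameters exceed a 10 MB on-chip budget.
    layers = [Layer(f"conv{i}", 4_000_000) for i in range(8)]
    tasks = split_into_tasks(layers, fpga_capacity_bytes=10_000_000)
    for i, task in enumerate(tasks):
        size = sum(l.param_bytes for l in task)
        mode, cost = choose_switching(size, full_reconfig_cost=5.0,
                                      partial_cost_per_byte=1e-6)
        print(f"task {i}: {len(task)} layers, {size} B, "
              f"switch={mode}, est. cost={cost:.2f}")
```

In this toy setup the model is packed into four two-layer tasks that are reprogrammed onto the device one after another; the real PD-Adapter additionally tunes processing element size and quantity per task, which this sketch omits.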
ISSN: 0278-0070; 1937-4151
DOI: 10.1109/TCAD.2024.3443708