Optimized Inference Scheme for Conditional Computation in On-Device Object Detection
| Published in: | IEEE Embedded Systems Letters, 2024-12, p. 1-1 |
|---|---|
| Authors: | , , , , , , |
| Format: | Article |
| Language: | English |
| Keywords: | |
| Online access: | Order full text |
Abstract: Recently, conditional computation has been applied to on-device object detection to resolve the conflict between the heavy computation requirements of deep neural networks (DNNs) and the limited computation resources of edge devices. There is a need for an optimized inference scheme that can efficiently perform conditional computation in on-device object detection. This letter proposes a predictor that can predict the router decisions of conditional computation. Based on the predictor, this letter also presents an inference scheme that hides router latency by concurrently executing the router and the predicted branch. The proposed predictor shows higher accuracy than a profiling-based method, and experiments show that our inference scheme achieves lower latency than the traditional scheme.
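The core idea in the abstract — launching the predicted branch concurrently with the router and falling back only on a misprediction — can be sketched roughly as follows. This is a minimal illustration, not the letter's implementation; `router`, `predictor`, and `run_branch` are hypothetical stand-ins for the gating network, the proposed decision predictor, and the conditional-computation branches.

```python
import concurrent.futures

def router(x):
    # Stand-in for the (relatively expensive) gating/router network;
    # returns the index of the branch that should be executed.
    return x % 2

def predictor(x):
    # Stand-in for the cheap predictor of the router's decision
    # (here it always predicts branch 0, so both the hit and the
    # miss paths below are exercised).
    return 0

def run_branch(idx, x):
    # Stand-in for one branch of the conditional-computation model.
    return (idx, x * (idx + 1))

def speculative_infer(x):
    """Hide router latency by running the predicted branch
    concurrently with the router, falling back to the router's
    actual choice only on a misprediction."""
    predicted = predictor(x)
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        router_future = pool.submit(router, x)
        branch_future = pool.submit(run_branch, predicted, x)
        actual = router_future.result()
        if actual == predicted:
            # Prediction hit: the branch output was computed in
            # parallel, so the router latency is hidden.
            return branch_future.result()
        # Prediction miss: run the branch the router actually chose.
        return run_branch(actual, x)
```

On a prediction hit the branch result is already available (or in flight) when the router finishes, which is where the latency reduction over the traditional router-then-branch sequence comes from; on a miss the scheme degrades to the sequential order.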
ISSN: 1943-0663
DOI: 10.1109/LES.2024.3514920