On the fly Deep Neural Network Optimization Control for Low-Power Computer Vision
Processing visual data on mobile devices has many applications, e.g., emergency response and tracking. State-of-the-art computer vision techniques rely on large Deep Neural Networks (DNNs) that are usually too power-hungry to be deployed on resource-constrained edge devices. Many techniques improve the efficiency of DNNs by using sparsity or quantization. However, the accuracy and efficiency of these techniques cannot be adapted for diverse edge applications with different hardware constraints and accuracy requirements. This paper presents a novel technique that allows DNNs to adapt their accuracy and energy consumption at run-time, without any re-training. The technique, called AdaptiveActivation, introduces a hyper-parameter that controls the output range of the DNN's activation function to dynamically adjust the sparsity and precision in the DNN. AdaptiveActivation can be applied to any existing pre-trained DNN to improve its deployability in diverse edge environments. Experiments on popular edge devices show that the accuracy is within 1.5% of the baseline, and that the approach requires 10%-38% less memory than the baseline techniques, offering more accuracy-efficiency tradeoff options.
Saved in:
Published in: | arXiv.org 2023-09 |
---|---|
Main authors: | Kaur, Ishmeet; Jadhav, Adwaita Janardhan |
Format: | Article |
Language: | eng |
Subjects: | Accuracy; Artificial neural networks; Computer vision; Constraints; Efficiency; Emergency response; Energy consumption; Neural networks; Optimization; Power management; Sparsity; Visual flight |
Online access: | Full text |
container_title | arXiv.org |
---|---|
creator | Kaur, Ishmeet; Jadhav, Adwaita Janardhan |
description | Processing visual data on mobile devices has many applications, e.g., emergency response and tracking. State-of-the-art computer vision techniques rely on large Deep Neural Networks (DNNs) that are usually too power-hungry to be deployed on resource-constrained edge devices. Many techniques improve the efficiency of DNNs by using sparsity or quantization. However, the accuracy and efficiency of these techniques cannot be adapted for diverse edge applications with different hardware constraints and accuracy requirements. This paper presents a novel technique that allows DNNs to adapt their accuracy and energy consumption at run-time, without any re-training. The technique, called AdaptiveActivation, introduces a hyper-parameter that controls the output range of the DNN's activation function to dynamically adjust the sparsity and precision in the DNN. AdaptiveActivation can be applied to any existing pre-trained DNN to improve its deployability in diverse edge environments. Experiments on popular edge devices show that the accuracy is within 1.5% of the baseline, and that the approach requires 10%-38% less memory than the baseline techniques, offering more accuracy-efficiency tradeoff options. |
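The abstract describes AdaptiveActivation only at a high level: a hyper-parameter caps the output range of the activation function, which both saturates more values and lets a narrower numeric format cover the remaining range. The paper's exact formulation is not reproduced here; the following is a minimal sketch of that general idea, assuming a clipped-ReLU-style range cap (the names `adaptive_activation`, `quantize_uint8`, and the parameter `alpha` are illustrative, not from the paper):

```python
import numpy as np

def adaptive_activation(x, alpha):
    # Hypothetical sketch: a ReLU whose output range is capped at alpha.
    # Lowering alpha at run-time shrinks the activation range, so more
    # values saturate and a fixed-width integer format covers the range
    # with a finer step size -- no re-training required.
    return np.clip(x, 0.0, alpha)

def quantize_uint8(a, alpha):
    # With a known output range [0, alpha], activations fit in 8 bits,
    # cutting activation memory versus 32-bit floats.
    scale = alpha / 255.0
    return np.round(a / scale).astype(np.uint8), scale

x = np.array([-1.2, 0.3, 2.5, 7.9])
a = adaptive_activation(x, alpha=6.0)   # -> [0. , 0.3, 2.5, 6. ]
q, scale = quantize_uint8(a, alpha=6.0)
```

Here `alpha` is the run-time knob: a deployment on a tighter power budget would pick a smaller `alpha` (coarser outputs, more saturation, cheaper storage), trading a little accuracy for efficiency.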
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2023-09 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2861509344 |
source | Free E-Journals |
subjects | Accuracy; Artificial neural networks; Computer vision; Constraints; Efficiency; Emergency response; Energy consumption; Neural networks; Optimization; Power management; Sparsity; Visual flight |
title | On the fly Deep Neural Network Optimization Control for Low-Power Computer Vision |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-24T03%3A38%3A24IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=On%20the%20fly%20Deep%20Neural%20Network%20Optimization%20Control%20for%20Low-Power%20Computer%20Vision&rft.jtitle=arXiv.org&rft.au=Kaur,%20Ishmeet&rft.date=2023-09-04&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2861509344%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2861509344&rft_id=info:pmid/&rfr_iscdi=true |