QNNRepair: Quantized Neural Network Repair

We present QNNRepair, the first method in the literature for repairing quantized neural networks (QNNs). QNNRepair aims to improve the accuracy of a neural network model after quantization. It accepts the full-precision and weight-quantized neural networks, together with a repair dataset of passing and failing tests. First, QNNRepair applies a software fault localization method to identify the neurons that cause performance degradation during neural network quantization. It then formulates the repair problem as a mixed-integer linear program (MILP), solving for neuron weight parameters that correct the QNN's performance on failing tests without compromising its performance on passing tests. We evaluate QNNRepair with widely used neural network architectures such as MobileNetV2, ResNet, and VGGNet on popular datasets, including high-resolution images. We also compare QNNRepair with the state-of-the-art data-free quantization method SQuant [22]. The experimental results show that QNNRepair is effective in improving the quantized model's performance in most cases; its repaired models achieve 24% higher accuracy than SQuant's on the independent validation set, especially for the ImageNet dataset.
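The fault-localization step ranks neurons by how strongly their behaviour correlates with failing tests. A minimal, illustrative sketch of one classic spectrum-based measure (Tarantula), applied per neuron; the neuron IDs and activation counts below are made up for illustration and are not from the paper's experiments:

```python
# Hypothetical sketch: ranking neurons with a Tarantula-style suspiciousness
# score, as used in spectrum-based fault localization. For each neuron we
# count how often it was "activated" on failing vs. passing tests; neurons
# that fire mostly on failing tests are the prime repair candidates.

def tarantula(activated_fail, total_fail, activated_pass, total_pass):
    """Tarantula suspiciousness in [0, 1]; higher = more suspicious."""
    fail_ratio = activated_fail / total_fail if total_fail else 0.0
    pass_ratio = activated_pass / total_pass if total_pass else 0.0
    denom = fail_ratio + pass_ratio
    return fail_ratio / denom if denom else 0.0

# Toy spectrum: (neuron id, #failing tests where it fired, #passing tests where it fired)
spectrum = [("n0", 9, 2), ("n1", 1, 8), ("n2", 5, 5)]
total_fail, total_pass = 10, 10

scores = {n: tarantula(af, total_fail, ap, total_pass) for n, af, ap in spectrum}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked[0])  # prints "n0" (fires on 9/10 failing tests, only 2/10 passing)
```

The highest-ranked neurons become the candidates whose weights the subsequent repair step is allowed to adjust.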

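The repair itself is posed as a mixed-integer linear program over the weights of a localized neuron. A hedged sketch of what such a formulation can look like; the notation and constraint shapes here are illustrative, not the paper's exact encoding. The objective minimizes the total weight change, while the constraints ask the corrected neuron to fix the failing tests without disturbing the passing ones:

```latex
% Illustrative MILP sketch for repairing one suspicious neuron.
% w_j  : original quantized weights;  w'_j : repaired weights (decision variables)
% x^(t): the neuron's inputs on test t; delta_j linearizes |w'_j - w_j|
\begin{align*}
\min_{w'}\quad & \sum_j \delta_j,
  \qquad \delta_j \ge w'_j - w_j,\;\; \delta_j \ge w_j - w'_j \\
\text{s.t.}\quad
  & \text{output}\big(\textstyle\sum_j w'_j x^{(t)}_j + b\big)
    \text{ yields the correct label on each failing test } t, \\
  & \text{output}\big(\textstyle\sum_j w'_j x^{(t)}_j + b\big)
    \text{ keeps the original label on each passing test } t, \\
  & w'_j \in \mathbb{Z},\quad -128 \le w'_j \le 127 .
\end{align*}
```

The integrality and range constraints keep the repaired weights representable in the quantized (e.g. int8) format, which is what distinguishes QNN repair from full-precision repair.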
Bibliographic details
Main authors: Song, Xidan; Sun, Youcheng; Mustafa, Mustafa Asan; Cordeiro, Lucas
Format: Conference Proceeding
Language: English
Online access: Full text
Editors: Ferreira, C; Willemse, T.A.C
Publisher: Springer-Verlag, Berlin, Heidelberg
Published: 2023-11-06
ISBN: 9783031471148; 3031471148
ISSN: 0302-9743
Part of: Software Engineering and Formal Methods: 21st International Conference, SEFM 2023, 2023, Vol. 14323, pp. 320-339
Source: Lirias (KU Leuven Association); Springer Books
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-13T02%3A05%3A24IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-kuleuven&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=QNNRepair:%20Quantized%20Neural%20Network%20Repair&rft.btitle=Software%20Engineering%20and%20Formal%20Methods:%2021st%20International%20Conference,%20SEFM%202023&rft.au=Song,%20Xidan&rft.date=2023-11-06&rft.volume=14323&rft.spage=320&rft.epage=339&rft.pages=320-339&rft.issn=0302-9743&rft.isbn=9783031471148&rft.isbn_list=3031471148&rft_id=info:doi/&rft_dat=%3Ckuleuven%3E20_500_12942_747847%3C/kuleuven%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true