Degradation Aware Approach to Image Restoration Using Knowledge Distillation

Image restoration is the task of recovering a clean image from a degraded version. In most cases, the degradation is spatially varying, and it requires the restoration network to both localize and restore the affected regions. In this paper, we present a new approach suitable for handling the image-...

Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE journal of selected topics in signal processing 2021-02, Vol. 15 (2), p. 162-173
Main Authors: Suin, Maitreya; Purohit, Kuldeep; Rajagopalan, A. N.
Format: Article
Language: eng
Subjects:
Online Access: Order full text
container_end_page 173
container_issue 2
container_start_page 162
container_title IEEE journal of selected topics in signal processing
container_volume 15
creator Suin, Maitreya
Purohit, Kuldeep
Rajagopalan, A. N.
description Image restoration is the task of recovering a clean image from a degraded version. In most cases, the degradation is spatially varying, and it requires the restoration network to both localize and restore the affected regions. In this paper, we present a new approach suitable for handling the image-specific and spatially-varying nature of degradation in images affected by practically occurring artifacts such as rain streaks, haze, raindrops and motion blur. We decompose the restoration task into two stages of degradation localization and degraded region-guided restoration, unlike existing methods which directly learn a mapping between the degraded and clean images. Our premise is to use the auxiliary task of degradation mask prediction to guide the restoration process. We demonstrate that the model trained for this auxiliary task contains vital region knowledge, which can be exploited to guide the restoration network's training using an attentive knowledge distillation technique. Further, we propose a mask-guided gated convolution and a global context aggregation module that leverage the extra guidance from the predicted mask while focusing on restoring the degraded regions. We conduct an extensive evaluation on multiple datasets corresponding to four different restoration tasks to validate our method. Along with a thorough ablation analysis and visualizations, the proposed approach's effectiveness is also demonstrated by achieving significant improvement over strong baselines for each restoration task.
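The abstract describes two ingredients at a high level: a mask-guided gated convolution and an attentive (mask-weighted) knowledge distillation loss. The following is a minimal PyTorch sketch of those two ideas only, not the authors' implementation; the module and function names (MaskGuidedGatedConv, attentive_distillation_loss), the concatenation of the mask into the gating branch, and the 1 + mask weighting are assumptions made for illustration.

```python
# Hypothetical sketch of mask-guided gated convolution and a mask-weighted
# feature-distillation loss, assuming a standard PyTorch setup. Names and
# design details are illustrative, not taken from the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskGuidedGatedConv(nn.Module):
    """Gated convolution whose gate is modulated by a predicted degradation mask."""

    def __init__(self, in_ch: int, out_ch: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        self.feature_conv = nn.Conv2d(in_ch, out_ch, kernel_size, padding=pad)
        # The gating branch sees the input features plus the 1-channel mask.
        self.gate_conv = nn.Conv2d(in_ch + 1, out_ch, kernel_size, padding=pad)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # mask: (B, 1, H, W) in [0, 1], where 1 marks a degraded region.
        mask = F.interpolate(mask, size=x.shape[-2:], mode="bilinear",
                             align_corners=False)
        feat = self.feature_conv(x)
        gate = torch.sigmoid(self.gate_conv(torch.cat([x, mask], dim=1)))
        # Features are passed through selectively, guided by the mask.
        return feat * gate


def attentive_distillation_loss(student_feat: torch.Tensor,
                                teacher_feat: torch.Tensor,
                                mask: torch.Tensor) -> torch.Tensor:
    """Feature-matching loss that weights degraded regions more heavily."""
    mask = F.interpolate(mask, size=student_feat.shape[-2:], mode="bilinear",
                         align_corners=False)
    weight = 1.0 + mask  # emphasize the regions the mask marks as degraded
    return (weight * (student_feat - teacher_feat.detach()).pow(2)).mean()


if __name__ == "__main__":
    block = MaskGuidedGatedConv(in_ch=64, out_ch=64)
    x = torch.randn(2, 64, 32, 32)       # restoration-network features
    m = torch.rand(2, 1, 32, 32)          # predicted degradation mask
    y = block(x, m)
    # A random tensor stands in for the teacher (mask-prediction) features.
    loss = attentive_distillation_loss(y, torch.randn_like(y), m)
    print(y.shape, loss.item())
```

In the paper the mask and the teacher features come from the auxiliary degradation-localization network; here random tensors stand in for them so the sketch runs on its own.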
doi 10.1109/JSTSP.2020.3043622
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1932-4553
ispartof IEEE journal of selected topics in signal processing, 2021-02, Vol.15 (2), p.162-173
issn 1932-4553
1941-0484
language eng
recordid cdi_proquest_journals_2493608100
source IEEE Electronic Library (IEL)
subjects Ablation
Blurring
Convolution
Deblurring
Degradation
dehazing
deraining
Distillation
Feature extraction
Haze
Image degradation
Image restoration
Knowledge engineering
Rain
Raindrops
Task analysis
Training
title Degradation Aware Approach to Image Restoration Using Knowledge Distillation