Omnidirectional Image Quality Assessment With Knowledge Distillation
Omnidirectional images can be viewed through various projection formats. Different projection formats offer different views, which may capture complementary information that strengthens feature representation. However, previous omnidirectional image quality assessment (OIQA) methods mostly focus on a single projection format, and the relationship between contents in different projections is rarely explored. In this letter, we propose a knowledge distillation based OIQA (KD-OIQA) framework that improves the quality feature representation of a student network under the guidance of a teacher network trained on a different projection format. Specifically, we first train a teacher network on viewport images. We then distill knowledge from the teacher network into a student network trained on equirectangular projection (ERP) images to boost the student network's feature representation. Building on recent advances in mask-based knowledge distillation, we also design a masked distillation module that screens out the effective information in the teacher's features, making the distillation more efficient. Finally, the student network extracts more comprehensive features from ERP images for quality prediction. Extensive experiments on three OIQA databases demonstrate the effectiveness of the proposed framework.
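As a rough illustration of the masked distillation step described in the abstract, the sketch below pairs a learned mask over the teacher's features with a masked feature-matching loss and a quality-regression loss. It is a minimal sketch under stated assumptions: the `MaskedDistillationLoss` module, the 1x1-convolution mask generator, the `student`/`teacher` objects with `backbone` and `head` attributes, and the weight `alpha` are hypothetical names chosen for illustration, not the authors' implementation.

```python
# Minimal, illustrative sketch of masked feature distillation in PyTorch.
# Assumptions (not from the paper): same-shaped teacher/student feature maps,
# a 1x1-conv mask generator, and a simple L1 quality-regression objective.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedDistillationLoss(nn.Module):
    """Penalize student-teacher feature differences only where a learned
    mask marks the teacher's features as informative."""
    def __init__(self, channels: int):
        super().__init__()
        self.mask_head = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feat_student: torch.Tensor, feat_teacher: torch.Tensor) -> torch.Tensor:
        # feat_*: (B, C, H, W); the teacher is kept frozen during distillation.
        mask = torch.sigmoid(self.mask_head(feat_teacher.detach()))  # (B, 1, H, W)
        diff = (feat_student - feat_teacher.detach()) ** 2
        return (mask * diff).mean()

def training_step(student, teacher, kd_loss, erp_images, viewports, mos, alpha=1.0):
    """One hypothetical training step: quality regression + masked distillation.
    Aligning viewport-domain and ERP-domain feature maps is glossed over here."""
    with torch.no_grad():
        feat_t = teacher.backbone(viewports)      # teacher was trained on viewport images
    feat_s = student.backbone(erp_images)         # student sees ERP images
    pred = student.head(feat_s.mean(dim=(2, 3)))  # global-average-pool, then regress a score
    return F.l1_loss(pred.squeeze(-1), mos) + alpha * kd_loss(feat_s, feat_t)
```

Detaching the teacher's features keeps the teacher frozen, so only the student and the mask generator receive gradients during training.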
Saved in:
Published in: | IEEE Signal Processing Letters, 2023, Vol. 30, pp. 1562-1566 |
---|---|
Main authors: | Liu, Lixiong; Ma, Pingchuan; Wang, Chongwen; Xu, Dong |
Format: | Article |
Language: | English |
Subjects: | Data mining; Distillation; Feature extraction; Image quality; Knowledge distillation (KD); Knowledge engineering; masked distillation; omnidirectional image quality assessment (OIQA); Predictive models; Quality assessment; Representations; Teachers; Training |
Online access: | Order full text |
DOI: | 10.1109/LSP.2023.3327908 |
ISSN: | 1070-9908 |
EISSN: | 1558-2361 |
Source: | IEEE Electronic Library (IEL) |