Deep Edge-Aware Interactive Colorization against Color-Bleeding Effects
Deep neural networks for automatic image colorization often suffer from the color-bleeding artifact, a problematic color spreading near the boundaries between adjacent objects. Such color-bleeding artifacts degrade the realism of generated outputs, limiting the applicability of colorization models in...
Saved in:
Published in: | arXiv.org 2021-09 |
---|---|
Main authors: | Kim, Eungyeup ; Lee, Sanghyeon ; Park, Jeonghoon ; Choi, Somi ; Seo, Choonghyun ; Choo, Jaegul |
Format: | Article |
Language: | eng |
Subjects: | Color ; Colorization ; Image contrast |
Online access: | Full text |
container_title | arXiv.org |
creator | Kim, Eungyeup ; Lee, Sanghyeon ; Park, Jeonghoon ; Choi, Somi ; Seo, Choonghyun ; Choo, Jaegul |
description | Deep neural networks for automatic image colorization often suffer from the color-bleeding artifact, a problematic color spreading near the boundaries between adjacent objects. Such color-bleeding artifacts degrade the realism of generated outputs, limiting the applicability of colorization models in practice. Although previous approaches have attempted to address this problem in an automatic manner, they tend to work only in limited cases where high contrast between gray-scale values is given in an input image. Alternatively, leveraging user interactions is a promising approach to resolving these color-bleeding artifacts. In this paper, we propose a novel edge-enhancing network for the regions of interest, guided by simple user scribbles indicating where to enhance. In addition, our method requires a minimal amount of user effort to achieve satisfactory enhancement. Experimental results demonstrate that our interactive edge-enhancing approach effectively reduces color-bleeding artifacts compared to existing baselines across various datasets. |
format | Article |
rights | 2021. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2021-09 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2548743813 |
source | Free E-Journals |
subjects | Color ; Colorization ; Image contrast |
title | Deep Edge-Aware Interactive Colorization against Color-Bleeding Effects |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-22T10%3A29%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Deep%20Edge-Aware%20Interactive%20Colorization%20against%20Color-Bleeding%20Effects&rft.jtitle=arXiv.org&rft.au=Kim,%20Eungyeup&rft.date=2021-09-21&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2548743813%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2548743813&rft_id=info:pmid/&rfr_iscdi=true |
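The abstract describes steering edge enhancement toward regions of interest via user scribbles. As a rough, self-contained illustration of that idea (not the authors' network, whose architecture is not reproduced in this record), the sketch below turns a binary scribble mask into a per-pixel loss weighting that upweights Sobel edge responses only inside the scribbled region. The function names and the `boost` parameter are hypothetical, chosen for this example.

```python
import numpy as np

def sobel_edges(gray):
    """Sobel gradient magnitude of a 2-D grayscale image (edge-padded)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    padded = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)

def edge_aware_weights(gray, scribble_mask, boost=4.0):
    """Per-pixel loss weights: edge pixels inside the user-scribbled region
    get up to `boost` times extra weight, encouraging sharper color
    boundaries there while leaving unscribbled regions untouched."""
    edges = sobel_edges(gray)
    edges = edges / (edges.max() + 1e-8)           # normalize to [0, 1]
    return 1.0 + boost * edges * scribble_mask     # upweight scribbled edges
```

Such a weight map could, for instance, multiply a reconstruction loss during fine-tuning, so that errors near object boundaries in the scribbled region of interest are penalized more heavily; this is only one plausible way to realize "edge enhancement where the user indicates".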