Masked Extended Attention for Zero-Shot Virtual Try-On In The Wild

Virtual Try-On (VTON) is a highly active line of research with increasing demand. It aims to replace a piece of garment in an image with one from another image, while preserving person and garment characteristics as well as image fidelity. The current literature takes a supervised approach to the task, impairing generalization and imposing heavy computation. In this paper, we present a novel zero-shot, training-free method for inpainting a clothing garment by reference. Our approach employs the prior of a diffusion model with no additional training, fully leveraging its native generalization capabilities. The method uses extended attention to transfer image information from reference to target images, overcoming two significant challenges. We first warp the reference garment over the target human using deep features, alleviating "texture sticking". We then apply the extended attention mechanism with careful masking, eliminating leakage of the reference background and other unwanted influence. Through a user study and qualitative and quantitative comparisons to state-of-the-art approaches, we demonstrate superior image quality and garment preservation for unseen clothing pieces or human figures.
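The core mechanism the abstract describes — extending a target image's attention over reference-image tokens while masking out the reference background — can be sketched in a toy, single-head form. This is an illustrative sketch, not the authors' implementation: the function name `masked_extended_attention`, the batch-free token shapes, and the boolean garment mask are all our own assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def masked_extended_attention(q_tgt, k_tgt, v_tgt, k_ref, v_ref, ref_garment_mask):
    """Extended attention: target queries attend over target tokens plus
    reference-image tokens, but reference tokens outside the garment mask
    are excluded, so the reference background cannot leak into the target.

    q_tgt, k_tgt, v_tgt: (Nt, d) target queries/keys/values
    k_ref, v_ref:        (Nr, d) reference keys/values
    ref_garment_mask:    (Nr,) bool, True where the token lies on the garment
    """
    d = q_tgt.shape[-1]
    k = np.concatenate([k_tgt, k_ref], axis=0)           # (Nt+Nr, d)
    v = np.concatenate([v_tgt, v_ref], axis=0)           # (Nt+Nr, d)
    scores = q_tgt @ k.T / np.sqrt(d)                    # (Nt, Nt+Nr)
    # Target tokens are always visible; reference tokens only on the garment.
    visible = np.concatenate([np.ones(k_tgt.shape[0], dtype=bool),
                              ref_garment_mask])
    scores[:, ~visible] = -np.inf                        # zero weight after softmax
    return softmax(scores, axis=-1) @ v                  # (Nt, d)
```

With an all-False mask the reference contributes nothing and the result reduces to plain self-attention over the target tokens, which is the property the masking is meant to guarantee.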

Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org 2024-06
Main Authors: Orzech, Nadav; Nitzan, Yotam; Mizrahi, Ulysse; Danon, Dov; Bermano, Amit H
Format: Article
Language: eng
Subjects: Garments; Image quality
Online Access: Full text
format Article
identifier EISSN: 2331-8422