Diffusion on the Probability Simplex

Diffusion models learn to reverse the progressive noising of a data distribution to create a generative model. However, the desired continuous nature of the noising process can be at odds with discrete data. To deal with this tension between continuous and discrete objects, we propose a method of performing diffusion on the probability simplex. Using the probability simplex naturally creates an interpretation in which points correspond to categorical probability distributions. Our method applies the softmax function to an Ornstein-Uhlenbeck process, a well-known stochastic differential equation. We find that our methodology also naturally extends to diffusion on the unit cube, which has applications for bounded image generation.
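For intuition, here is a minimal sketch of the forward (noising) direction described in the abstract: an Ornstein-Uhlenbeck process is simulated with Euler-Maruyama steps and pushed through the softmax, so every iterate lies on the probability simplex and can be read as a categorical distribution. The function names and parameter values (theta, sigma, dt, n_steps) below are illustrative assumptions, not taken from the paper, which learns to reverse this noising process.

```python
import numpy as np

def softmax(y):
    # Numerically stable softmax: maps a vector in R^K to the interior of the simplex.
    z = y - y.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward_noising(y0, theta=1.0, sigma=np.sqrt(2.0), dt=1e-2, n_steps=500, rng=None):
    # Euler-Maruyama simulation of the Ornstein-Uhlenbeck SDE
    #     dY_t = -theta * Y_t dt + sigma dW_t
    # started at logits y0; the softmax of each iterate is a point on the simplex.
    rng = np.random.default_rng() if rng is None else rng
    y = np.asarray(y0, dtype=float).copy()
    path = [softmax(y)]
    for _ in range(n_steps):
        y = y - theta * y * dt + sigma * np.sqrt(dt) * rng.standard_normal(y.shape)
        path.append(softmax(y))
    return np.stack(path)

# Example: start near the simplex vertex that encodes category 0 out of K = 4
# and watch the noising drive the point toward the middle of the simplex.
path = forward_noising(np.array([5.0, 0.0, 0.0, 0.0]))
print(path[0])    # close to a one-hot distribution
print(path[-1])   # noised toward the stationary distribution of the process
```

One natural way to realise the unit-cube variant mentioned at the end of the abstract would be to replace the softmax with an elementwise sigmoid, though the paper's exact construction is not reproduced here.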

Bibliographic Details
Main authors: Floto, Griffin; Jonsson, Thorsteinn; Nica, Mihai; Sanner, Scott; Zhu, Eric Zhengyu
Format: Article
Language: English
Subjects: Computer Science - Learning; Statistics - Machine Learning
Online access: https://arxiv.org/abs/2309.02530
DOI: 10.48550/arxiv.2309.02530
Date: 2023-09-05