Variational Autoencoder with Optimizing Gaussian Mixture Model Priors
The latent-variable prior of the variational autoencoder (VAE) is often a standard Gaussian distribution, chosen for computational convenience, but this choice leads to underfitting. This paper proposes a variational autoencoder with optimizing Gaussian mixture model priors. The method uses a Gaussian mixture model to construct the prior distribution, and uses the Kullback-Leibler (KL) distance between the posterior and prior distributions to iteratively optimize the prior from the data. A greedy algorithm is used to solve the KL distance, defining an approximate variational lower bound for the loss function and realizing the VAE with optimizing Gaussian mixture model priors. Compared with the standard VAE, the proposed method obtains state-of-the-art results on the MNIST, Omniglot, and Frey Face datasets, showing that the VAE with optimizing Gaussian mixture model priors can learn a better model.
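The abstract describes the central computational idea: replace the standard Gaussian prior with a Gaussian mixture whose parameters are optimized against the posterior via the KL distance. The PyTorch sketch below illustrates that idea under stated assumptions; it is not the authors' implementation. Because the KL divergence between a Gaussian posterior and a mixture prior has no closed form, the sketch uses a single-sample Monte Carlo estimate, and it learns the mixture parameters jointly by gradient descent rather than by the paper's greedy iterative procedure. All names, layer sizes, and the component count are illustrative.

```python
# A minimal, hypothetical sketch (not the paper's code): a VAE whose prior
# is a learnable Gaussian mixture. KL(q(z|x) || p(z)) has no closed form
# for a mixture prior, so it is estimated with one Monte Carlo sample:
# KL ~= log q(z|x) - log p(z). Layer sizes and K are arbitrary assumptions.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

LOG2PI = math.log(2.0 * math.pi)

class GMMPriorVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=32, h_dim=256, n_components=10):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))
        # Learnable mixture prior: weights (as logits), means, log-variances.
        self.pi_logits = nn.Parameter(torch.zeros(n_components))
        self.mu_k = nn.Parameter(0.5 * torch.randn(n_components, z_dim))
        self.logvar_k = nn.Parameter(torch.zeros(n_components, z_dim))

    def log_gmm_prior(self, z):
        # log p(z) = logsumexp_k [ log pi_k + log N(z; mu_k, sigma_k^2) ]
        z = z.unsqueeze(1)                              # (B, 1, z_dim)
        log_pi = F.log_softmax(self.pi_logits, dim=0)   # (K,)
        log_nk = -0.5 * (LOG2PI + self.logvar_k
                         + (z - self.mu_k) ** 2 / self.logvar_k.exp())
        return torch.logsumexp(log_pi + log_nk.sum(-1), dim=1)  # (B,)

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        # log q(z|x) at the sampled z (diagonal Gaussian posterior).
        log_q = (-0.5 * (LOG2PI + logvar
                         + (z - mu) ** 2 / logvar.exp())).sum(-1)
        kl_mc = log_q - self.log_gmm_prior(z)   # Monte Carlo KL estimate
        recon = F.binary_cross_entropy_with_logits(
            self.dec(z), x, reduction='none').sum(-1)
        return (recon + kl_mc).mean()           # negative ELBO
```

Usage is a single call, e.g. `loss = GMMPriorVAE()(x)` for a batch `x` of flattened images with values in [0, 1]; backpropagating through `loss` updates the encoder, decoder, and mixture-prior parameters together.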
Published in: | IEEE Access, 2020-01, Vol. 8, p. 1-1 |
---|---|
Main authors: | Guo, Chunsheng; Zhou, Jialuo; Chen, Huahua; Ying, Na; Zhang, Jianwu; Zhou, Di |
Format: | Article |
Language: | English |
Subjects: | Aggregates; Gaussian distribution; Gaussian mixture model; Greedy algorithms; Iterative methods; Kullback-Leibler distance; Lower bounds; Mathematical analysis; Neural networks; Normal distribution; Optimization; Probabilistic models; Training; Variational autoencoder |
Online access: | Full text |
container_end_page | 1 |
---|---|
container_issue | |
container_start_page | 1 |
container_title | IEEE access |
container_volume | 8 |
creator | Guo, Chunsheng; Zhou, Jialuo; Chen, Huahua; Ying, Na; Zhang, Jianwu; Zhou, Di |
description | The latent-variable prior of the variational autoencoder (VAE) is often a standard Gaussian distribution, chosen for computational convenience, but this choice leads to underfitting. This paper proposes a variational autoencoder with optimizing Gaussian mixture model priors. The method uses a Gaussian mixture model to construct the prior distribution, and uses the Kullback-Leibler (KL) distance between the posterior and prior distributions to iteratively optimize the prior from the data. A greedy algorithm is used to solve the KL distance, defining an approximate variational lower bound for the loss function and realizing the VAE with optimizing Gaussian mixture model priors. Compared with the standard VAE, the proposed method obtains state-of-the-art results on the MNIST, Omniglot, and Frey Face datasets, showing that the VAE with optimizing Gaussian mixture model priors can learn a better model. |
doi_str_mv | 10.1109/ACCESS.2020.2977671 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE access, 2020-01, Vol.8, p.1-1 |
issn | 2169-3536 |
language | eng |
recordid | cdi_proquest_journals_2454927198 |
source | Directory of Open Access Journals; IEEE Xplore Open Access Journals; EZB Electronic Journals Library |
subjects | Aggregates; Gaussian distribution; Gaussian mixture model; Greedy algorithms; Iterative methods; Kullback-Leibler distance; Lower bounds; Mathematical analysis; Neural networks; Normal distribution; Optimization; Probabilistic models; Training; Variational autoencoder |
title | Variational Autoencoder with Optimizing Gaussian Mixture Model Priors |