Online Kernel based Generative Adversarial Networks

One of the major breakthroughs in deep learning over the past five years has been the Generative Adversarial Network (GAN), a neural network-based generative model which aims to mimic some underlying distribution given a dataset of samples. In contrast to many supervised problems, where one tries to...

Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org 2020-06
Main authors: Youn, Yeojoon; Thistlethwaite, Neil; Choe, Sang Keun; Abernethy, Jacob
Format: Article
Language: eng
Subjects:
Online access: Full text
container_title arXiv.org
creator Youn, Yeojoon
Thistlethwaite, Neil
Choe, Sang Keun
Abernethy, Jacob
description One of the major breakthroughs in deep learning over the past five years has been the Generative Adversarial Network (GAN), a neural network-based generative model which aims to mimic some underlying distribution given a dataset of samples. In contrast to many supervised problems, where one tries to minimize a simple objective function of the parameters, GAN training is formulated as a min-max problem over a pair of network parameters. While empirically GANs have shown impressive success in several domains, researchers have been puzzled by unusual training behavior, including cycling and so-called mode collapse. In this paper, we begin by providing a quantitative method to explore some of the challenges in GAN training, and we show empirically how this relates fundamentally to the parametric nature of the discriminator network. We propose a novel approach that resolves many of these issues by relying on a kernel-based non-parametric discriminator that is highly amenable to online training---we call this the Online Kernel-based Generative Adversarial Networks (OKGAN). We show empirically that OKGANs mitigate a number of training issues, including mode collapse and cycling, and are much more amenable to theoretical guarantees. OKGANs empirically perform dramatically better, with respect to reverse KL-divergence, than other GAN formulations on synthetic data; on classical vision datasets such as MNIST, SVHN, and CelebA, they show comparable performance.
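The kernel-based non-parametric discriminator described in the abstract is in the spirit of kernel two-sample tests. The sketch below is purely illustrative and not the authors' implementation: it computes a biased squared Maximum Mean Discrepancy (MMD) under an RBF kernel, a standard non-parametric score for how far generated samples sit from real ones. The function names (`rbf_kernel`, `mmd2`), the `gamma` bandwidth, and the Gaussian toy data are all assumptions made for this example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=1.0):
    # Biased estimate of squared MMD between sample sets X (e.g. real data)
    # and Y (e.g. generator output); larger means more distinguishable.
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(256, 2))
fake_far = rng.normal(3.0, 1.0, size=(256, 2))   # a poor generator's samples
fake_near = rng.normal(0.1, 1.0, size=(256, 2))  # a better generator's samples

print(mmd2(real, fake_far) > mmd2(real, fake_near))  # prints True
```

A discriminator of this form is non-parametric in the sense that its capacity grows with the data rather than being fixed by network weights, which is what makes online updates natural.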
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2020-06
issn 2331-8422
language eng
recordid cdi_proquest_journals_2416044561
source Free E-Journals
subjects Cycles
Datasets
Generative adversarial networks
Kernels
Machine learning
Neural networks
Parameters
Training
title Online Kernel based Generative Adversarial Networks