HCNAF: Hyper-Conditioned Neural Autoregressive Flow and its Application for Probabilistic Occupancy Map Forecasting

We introduce Hyper-Conditioned Neural Autoregressive Flow (HCNAF); a powerful universal distribution approximator designed to model arbitrarily complex conditional probability density functions. HCNAF consists of a neural-net based conditional autoregressive flow (AF) and a hyper-network that can take large conditions in non-autoregressive fashion and outputs the network parameters of the AF. Like other flow models, HCNAF performs exact likelihood inference. We conduct a number of density estimation tasks on toy experiments and MNIST to demonstrate the effectiveness and attributes of HCNAF, including its generalization capability over unseen conditions and expressivity. Finally, we show that HCNAF scales up to complex high-dimensional prediction problems of the magnitude of self-driving and that HCNAF yields a state-of-the-art performance in a public self-driving dataset.
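The core idea in the abstract, a hyper-network that maps the condition to the parameters of an autoregressive flow, which then yields an exact conditional log-likelihood by the change-of-variables formula, can be illustrated with a minimal toy sketch. This is not the paper's architecture: the dimensions, the linear autoregressive transform, and all names (`hyper_net`, `log_prob`, `COND_DIM`, etc.) are illustrative assumptions, and the real HCNAF uses far richer networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative assumptions, not from the paper).
COND_DIM, X_DIM, HIDDEN = 4, 3, 16
N_PARAMS = X_DIM * X_DIM + 2 * X_DIM  # AR weights + log-scales + shifts

# Hyper-network weights: the condition c is mapped to ALL parameters
# of the flow, in non-autoregressive fashion with respect to c.
W1 = rng.normal(0.0, 0.1, (HIDDEN, COND_DIM))
W2 = rng.normal(0.0, 0.1, (N_PARAMS, HIDDEN))

def hyper_net(c):
    """Condition -> flow parameters: strictly lower-triangular
    autoregressive weights A, per-dimension log-scales and shifts."""
    h = np.tanh(W1 @ c)
    p = W2 @ h
    A = np.tril(p[:X_DIM * X_DIM].reshape(X_DIM, X_DIM), k=-1)  # x_i sees only x_<i
    log_scale = p[X_DIM * X_DIM : X_DIM * X_DIM + X_DIM]
    shift = p[-X_DIM:]
    return A, log_scale, shift

def log_prob(x, c):
    """Exact log p(x | c) via the change-of-variables formula.

    z_i = (x_i - shift_i - sum_{j<i} A_ij x_j) * exp(-log_scale_i);
    dz/dx is triangular, so log|det| = -sum(log_scale).
    """
    A, log_scale, shift = hyper_net(c)
    z = (x - shift - A @ x) * np.exp(-log_scale)
    base = -0.5 * np.sum(z ** 2) - 0.5 * X_DIM * np.log(2.0 * np.pi)
    return base - np.sum(log_scale)

x = rng.normal(size=X_DIM)
c = rng.normal(size=COND_DIM)
print(float(log_prob(x, c)))
```

Because the Jacobian of an autoregressive transform is triangular, the log-determinant reduces to a sum over the diagonal, which is what makes exact likelihood inference cheap in this family of models.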

Detailed Description

Saved in:
Bibliographic Details
Main Authors: Oh, Geunseob; Valois, Jean-Sebastien
Format: Article
Language: eng
Subjects:
Online Access: Order full text
creator Oh, Geunseob
Valois, Jean-Sebastien
doi_str_mv 10.48550/arxiv.1912.08111
format Article
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.1912.08111
language eng
recordid cdi_arxiv_primary_1912_08111
source arXiv.org
subjects Computer Science - Learning
Computer Science - Robotics
Statistics - Machine Learning
title HCNAF: Hyper-Conditioned Neural Autoregressive Flow and its Application for Probabilistic Occupancy Map Forecasting