Guaranteed Scalable Learning of Latent Tree Models

We present an integrated approach for structure and parameter estimation in latent tree graphical models. Our overall approach follows a "divide-and-conquer" strategy that learns models over small groups of variables and iteratively merges them into a global solution. The structure learning involves combinatorial operations such as minimum spanning tree construction and local recursive grouping; the parameter learning is based on the method of moments and on tensor decompositions. Our method is guaranteed to correctly recover the unknown tree structure and the model parameters with low sample complexity for the class of linear multivariate latent tree models, which includes discrete distributions, Gaussian distributions, and Gaussian mixtures. Our bulk asynchronous parallel implementation has computation complexity that increases only logarithmically with the number of variables and linearly with the dimensionality of each variable.
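
As an illustration of the structure-learning step just described, here is a minimal sketch, assuming jointly Gaussian observed variables and SciPy's MST routine; the information distance d(i, j) = -log|rho_ij|, the clipping constant, and all function names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the structure-learning step: build a minimum
# spanning tree (MST) over pairwise "information distances" between the
# observed variables. For jointly Gaussian variables the distance
# d(i, j) = -log|rho_ij| (rho = Pearson correlation) is additive along
# paths of a latent tree, which is what makes the MST a useful skeleton
# for the subsequent local recursive grouping step.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def information_distances(samples):
    """samples: (n_samples, n_variables) array of observations."""
    corr = np.corrcoef(samples, rowvar=False)
    rho = np.clip(np.abs(corr), 1e-12, 1.0)  # keep the log finite
    dist = -np.log(rho)
    np.fill_diagonal(dist, 0.0)
    return dist

def mst_skeleton(samples):
    """Return the MST over observed variables as a dense adjacency matrix."""
    dist = information_distances(samples)
    return minimum_spanning_tree(dist).toarray()  # SciPy returns a sparse matrix

# Usage: local recursive grouping would then introduce hidden nodes
# inside each MST neighborhood and merge the local subtrees.
rng = np.random.default_rng(0)
print(mst_skeleton(rng.standard_normal((1000, 8))))
```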

Bibliographic Details
Main Authors: Huang, Furong; N, Niranjan U; Perros, Ioakeim; Chen, Robert; Sun, Jimeng; Anandkumar, Anima
Format: Article
Language: English
Subjects: Computer Science - Learning; Statistics - Machine Learning
Online Access: Full text at https://arxiv.org/abs/1406.4566
creator Huang, Furong; N, Niranjan U; Perros, Ioakeim; Chen, Robert; Sun, Jimeng; Anandkumar, Anima
description We present an integrated approach for structure and parameter estimation in latent tree graphical models. Our overall approach follows a "divide-and-conquer" strategy that learns models over small groups of variables and iteratively merges them into a global solution. The structure learning involves combinatorial operations such as minimum spanning tree construction and local recursive grouping; the parameter learning is based on the method of moments and on tensor decompositions. Our method is guaranteed to correctly recover the unknown tree structure and the model parameters with low sample complexity for the class of linear multivariate latent tree models, which includes discrete distributions, Gaussian distributions, and Gaussian mixtures. Our bulk asynchronous parallel implementation has computation complexity that increases only logarithmically with the number of variables and linearly with the dimensionality of each variable.
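
Since the description invokes the method of moments and tensor decompositions for parameter learning, a minimal sketch of the standard tensor power iteration used in such pipelines follows; it assumes the third-order moment tensor has already been whitened, and the restart/iteration counts and function names are hypothetical placeholders, not the paper's algorithm.

```python
# Hypothetical sketch of the parameter-learning idea: the method of
# moments reduces parameter estimation to decomposing a (whitened)
# symmetric third-order moment tensor, and the tensor power iteration
# extracts its components one at a time.
import numpy as np

def tensor_apply(T, v):
    """Contract a k x k x k tensor with v along two modes: T(I, v, v)."""
    return np.einsum('ijk,j,k->i', T, v, v)

def tensor_power_method(T, n_restarts=10, n_iters=100, seed=0):
    """Return the dominant (eigenvalue, eigenvector) pair of a symmetric tensor."""
    rng = np.random.default_rng(seed)
    best_lam, best_v = -np.inf, None
    for _ in range(n_restarts):
        v = rng.standard_normal(T.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iters):
            w = tensor_apply(T, v)
            v = w / np.linalg.norm(w)
        lam = float(np.einsum('ijk,i,j,k->', T, v, v, v))
        if lam > best_lam:
            best_lam, best_v = lam, v
    return best_lam, best_v

# After extracting (lam, v), deflate T by subtracting lam * (v outer v outer v)
# and repeat to recover the remaining components.
```
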
format Article
creationdate 2014-06-17
rights http://arxiv.org/licenses/nonexclusive-distrib/1.0
identifier DOI: 10.48550/arxiv.1406.4566
language eng
recordid cdi_arxiv_primary_1406_4566
source arXiv.org
subjects Computer Science - Learning
Statistics - Machine Learning
title Guaranteed Scalable Learning of Latent Tree Models
url https://arxiv.org/abs/1406.4566