Low-Rank Tensor Networks for Dimensionality Reduction and Large-Scale Optimization Problems: Perspectives and Challenges PART 1
Machine learning and data mining algorithms are becoming increasingly important in analyzing large volume, multi-relational and multi-modal datasets, which are often conveniently represented as multiway arrays or tensors. It is therefore timely and valuable for the multidisciplinary research community to review tensor decompositions and tensor networks as emerging tools for large-scale data analysis and data mining. We provide the mathematical and graphical representations and interpretation of tensor networks, with the main focus on the Tucker and Tensor Train (TT) decompositions and their extensions or generalizations.
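The review's main focus is on the Tucker and Tensor Train (TT) decompositions. As a rough illustration of the latter, the sketch below implements the standard TT-SVD procedure (sequential truncated SVDs of reshaped unfoldings) in NumPy; the function names, truncation rule, and toy tensor are our own assumptions and are not taken from the paper. A companion sketch of the Tucker model follows the record fields at the end of this page.

```python
# Minimal TT-SVD sketch (generic illustration; not the authors' implementation).
# A d-way array is split into d three-way "cores" by sequential truncated SVDs.
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose `tensor` into TT cores, truncating every SVD to at most `max_rank`."""
    dims = tensor.shape
    cores, r_prev = [], 1
    unfolding = tensor.reshape(r_prev * dims[0], -1)
    for k in range(len(dims) - 1):
        U, S, Vt = np.linalg.svd(unfolding, full_matrices=False)
        r = min(max_rank, S.size)
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))        # k-th TT core
        unfolding = (S[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(unfolding.reshape(r_prev, dims[-1], 1))          # last core
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into a full array (only feasible for small tensors)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.reshape(full.shape[1:-1])                         # drop boundary ranks of 1

# Toy check on a synthetic rank-1 tensor: the relative reconstruction error is ~machine precision.
rng = np.random.default_rng(0)
X = np.einsum('i,j,k,l->ijkl', *(rng.standard_normal(n) for n in (8, 9, 10, 11)))
cores = tt_svd(X, max_rank=3)
print([c.shape for c in cores])
print(np.linalg.norm(tt_to_full(cores) - X) / np.linalg.norm(X))
```

Each core is a three-way array whose middle index runs over one original mode, so for bounded TT ranks the storage grows only linearly with the number of modes rather than exponentially.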
Saved in:
Published in: | arXiv.org 2017-09 |
---|---|
Main authors: | Cichocki, A; Lee, N; Oseledets, I V; Phan, A-H; Zhao, Q; Mandic, D |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_title | arXiv.org |
---|---|
creator | Cichocki, A; Lee, N; Oseledets, I V; Phan, A-H; Zhao, Q; Mandic, D |
description | Machine learning and data mining algorithms are becoming increasingly important in analyzing large volume, multi-relational and multi-modal datasets, which are often conveniently represented as multiway arrays or tensors. It is therefore timely and valuable for the multidisciplinary research community to review tensor decompositions and tensor networks as emerging tools for large-scale data analysis and data mining. We provide the mathematical and graphical representations and interpretation of tensor networks, with the main focus on the Tucker and Tensor Train (TT) decompositions and their extensions or generalizations. Keywords: Tensor networks, Function-related tensors, CP decomposition, Tucker models, tensor train (TT) decompositions, matrix product states (MPS), matrix product operators (MPO), basic tensor operations, multiway component analysis, multilinear blind source separation, tensor completion, linear/multilinear dimensionality reduction, large-scale optimization problems, symmetric eigenvalue decomposition (EVD), PCA/SVD, huge systems of linear equations, pseudo-inverse of very large matrices, Lasso and Canonical Correlation Analysis (CCA) (This is Part 1) |
doi_str_mv | 10.48550/arxiv.1609.00893 |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2017-09 |
issn | 2331-8422 |
language | eng |
recordid | cdi_arxiv_primary_1609_00893 |
source | arXiv.org; Free E-Journals |
subjects | Algorithms; Computer Science - Numerical Analysis; Correlation analysis; Data analysis; Data mining; Decomposition; Eigenvalues; Graphical representations; Linear equations; Machine learning; Mathematical analysis; Matrix methods; Multidisciplinary research; Networks; Operators (mathematics); Optimization; Reduction; Signal processing; Tensors |
title | Low-Rank Tensor Networks for Dimensionality Reduction and Large-Scale Optimization Problems: Perspectives and Challenges PART 1 |
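Complementing the TT-SVD sketch given after the abstract, the following is a minimal truncated higher-order SVD (HOSVD), one standard way to compute the Tucker models named in the abstract and keyword list. It is again a generic NumPy illustration under assumed names and multilinear ranks, not the authors' code.

```python
# Minimal truncated HOSVD (Tucker) sketch (generic illustration, not from the paper).
import numpy as np

def hosvd(tensor, ranks):
    """Tucker decomposition: a small core tensor plus one factor matrix per mode."""
    factors = []
    for mode, r in enumerate(ranks):
        unfolding = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])                      # leading left singular vectors
    core = tensor
    for mode, U in enumerate(factors):                # project every mode onto its factor
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def tucker_to_full(core, factors):
    """Expand the (core, factors) pair back into a full tensor."""
    full = core
    for mode, U in enumerate(factors):
        full = np.moveaxis(np.tensordot(U, np.moveaxis(full, mode, 0), axes=1), 0, mode)
    return full

# Toy check: a 20 x 20 x 20 tensor with exact multilinear rank (4, 4, 4).
rng = np.random.default_rng(1)
G = rng.standard_normal((4, 4, 4))
A, B, C = (rng.standard_normal((20, 4)) for _ in range(3))
X = np.einsum('abc,ia,jb,kc->ijk', G, A, B, C)
core, factors = hosvd(X, ranks=(4, 4, 4))
print(core.shape, [f.shape for f in factors])
print(np.linalg.norm(tucker_to_full(core, factors) - X) / np.linalg.norm(X))
```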