CoLA: Exploiting Compositional Structure for Automatic and Efficient Numerical Linear Algebra

Many areas of machine learning and science involve large linear algebra problems, such as eigendecompositions, solving linear systems, computing matrix exponentials, and trace estimation. The matrices involved often have Kronecker, convolutional, block diagonal, sum, or product structure. In this paper, we propose a simple but general framework for large-scale linear algebra problems in machine learning, named CoLA (Compositional Linear Algebra). By combining a linear operator abstraction with compositional dispatch rules, CoLA automatically constructs memory- and runtime-efficient numerical algorithms. Moreover, CoLA provides memory-efficient automatic differentiation, low-precision computation, and GPU acceleration in both JAX and PyTorch, while also accommodating new objects, operations, and rules in downstream packages via multiple dispatch. CoLA can accelerate many algebraic operations, while making it easy to prototype matrix structures and algorithms, providing an appealing drop-in tool for virtually any computational effort that requires linear algebra. We showcase its efficacy across a broad range of applications, including partial differential equations, Gaussian processes, equivariant model construction, and unsupervised learning.
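
To make the core idea concrete, the following is a minimal PyTorch sketch of a linear operator abstraction with a structure-aware dispatch rule for linear solves. It is illustrative only and does not use CoLA's actual API; the names LinearOperator, Dense, Kronecker, and solve are hypothetical. For a Kronecker product, the solve exploits the identity (A ⊗ B)^{-1} = A^{-1} ⊗ B^{-1}, so an (n*m)-dimensional system is solved through one n-dimensional and one m-dimensional solve instead of materializing the full matrix.

    # Illustrative sketch only; class and function names are hypothetical,
    # not CoLA's public API.
    import torch

    class LinearOperator:
        """A matrix known only through its structure and shape, never densified."""
        def __init__(self, shape):
            self.shape = shape

    class Dense(LinearOperator):
        """Wraps an explicitly stored matrix."""
        def __init__(self, A):
            super().__init__(tuple(A.shape))
            self.A = A

    class Kronecker(LinearOperator):
        """Represents A ⊗ B without ever forming the product."""
        def __init__(self, A, B):
            super().__init__((A.shape[0] * B.shape[0], A.shape[1] * B.shape[1]))
            self.A, self.B = A, B

    def solve(op, b):
        """Dispatch a linear solve on the operator's structure."""
        if isinstance(op, Dense):
            return torch.linalg.solve(op.A, b)       # b may be a vector or a matrix of columns
        if isinstance(op, Kronecker):
            if b.ndim == 2:                          # multiple right-hand sides: one column at a time
                return torch.stack([solve(op, col) for col in b.T], dim=1)
            nA, mB = op.A.shape[0], op.B.shape[0]
            X = b.reshape(nA, mB)                    # (A ⊗ B) x = vec(A X B^T) with row-major vec
            Z = solve(op.A, X)                       # A^{-1} X, a size-nA solve
            W = solve(op.B, Z.T)                     # B^{-1} Z^T, a size-mB solve
            return W.T.reshape(-1)                   # vec(A^{-1} X B^{-T}) = (A ⊗ B)^{-1} b
        raise NotImplementedError(f"no solve rule for {type(op).__name__}")

    # Usage: a 2000 x 2000 Kronecker system solved via one 50 x 50 and one 40 x 40 factor.
    torch.manual_seed(0)
    A = torch.randn(50, 50, dtype=torch.float64) + 50 * torch.eye(50, dtype=torch.float64)
    B = torch.randn(40, 40, dtype=torch.float64) + 40 * torch.eye(40, dtype=torch.float64)
    K = Kronecker(Dense(A), Dense(B))
    b = torch.randn(50 * 40, dtype=torch.float64)
    x = solve(K, b)
    assert torch.allclose(torch.kron(A, B) @ x, b)   # matches the dense reference solution

In CoLA itself, such structure-aware rules are selected automatically via multiple dispatch, and analogous rules cover the other operations mentioned in the abstract, such as eigendecompositions, matrix exponentials, and trace estimation.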

Bibliographic details

Main authors: Potapczynski, Andres; Finzi, Marc; Pleiss, Geoff; Wilson, Andrew Gordon
Format: Article
Published: 2023-09-06
Language: English
Subjects: Computer Science - Learning; Computer Science - Numerical Analysis; Mathematics - Numerical Analysis; Statistics - Machine Learning
DOI: 10.48550/arxiv.2309.03060
Source: arXiv.org
Online access: Order full text