CoLA: Exploiting Compositional Structure for Automatic and Efficient Numerical Linear Algebra
Format: Article
Language: English
Online access: Order full text
Abstract: Many areas of machine learning and science involve large linear algebra problems, such as eigendecompositions, solving linear systems, computing matrix exponentials, and trace estimation. The matrices involved often have Kronecker, convolutional, block diagonal, sum, or product structure. In this paper, we propose a simple but general framework for large-scale linear algebra problems in machine learning, named CoLA (Compositional Linear Algebra). By combining a linear operator abstraction with compositional dispatch rules, CoLA automatically constructs memory- and runtime-efficient numerical algorithms. Moreover, CoLA provides memory-efficient automatic differentiation, low-precision computation, and GPU acceleration in both JAX and PyTorch, while also accommodating new objects, operations, and rules in downstream packages via multiple dispatch. CoLA can accelerate many algebraic operations, while making it easy to prototype matrix structures and algorithms, providing an appealing drop-in tool for virtually any computational effort that requires linear algebra. We showcase its efficacy across a broad range of applications, including partial differential equations, Gaussian processes, equivariant model construction, and unsupervised learning.
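
A short sketch helps make the abstract's core idea concrete: a structured matrix is represented as a linear operator that stores only its components, and operations such as solves are dispatched to structure-aware rules. The following is a minimal, self-contained Python/NumPy illustration of that pattern; it is not CoLA's actual API, and the names here (Dense, Kronecker, solve) are hypothetical stand-ins.

```python
import numpy as np
from functools import singledispatch

# Hypothetical sketch (not CoLA's actual API): a linear operator
# abstraction plus dispatch rules that exploit Kronecker structure.

class Dense:
    """Wraps an explicit matrix."""
    def __init__(self, M):
        self.M = M

class Kronecker:
    """Represents A ⊗ B by its factors; the big matrix is never formed."""
    def __init__(self, A, B):
        self.A, self.B = A, B

    def matvec(self, v):
        # (A ⊗ B) vec(X) = vec(B X A^T), with column-major vec
        n, q = self.A.shape[1], self.B.shape[1]
        X = v.reshape(q, n, order="F")
        return (self.B @ X @ self.A.T).reshape(-1, order="F")

@singledispatch
def solve(op, b):
    raise NotImplementedError(type(op))

@solve.register
def _(op: Dense, b):
    return np.linalg.solve(op.M, b)

@solve.register
def _(op: Kronecker, b):
    # (A ⊗ B)^{-1} = A^{-1} ⊗ B^{-1}: two small solves replace one huge one
    n, q = op.A.shape[1], op.B.shape[1]
    X = b.reshape(q, n, order="F")
    Y = np.linalg.solve(op.B, X)      # B^{-1} X
    Z = np.linalg.solve(op.A, Y.T).T  # (B^{-1} X) A^{-T}
    return Z.reshape(-1, order="F")

rng = np.random.default_rng(0)
A, B = rng.standard_normal((50, 50)), rng.standard_normal((60, 60))
b = rng.standard_normal(50 * 60)

K = Kronecker(A, B)
dense = np.kron(A, B)  # materialized 3000 x 3000 matrix, for comparison only
assert np.allclose(K.matvec(b), dense @ b)
assert np.allclose(solve(K, b), np.linalg.solve(dense, b))
```

Here the Kronecker solve costs two small dense solves instead of one solve against the materialized 3000 x 3000 matrix; per the abstract, CoLA's compositional dispatch rules automate this kind of saving across many structures, in both JAX and PyTorch.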
DOI: 10.48550/arxiv.2309.03060