Forward-Mode Automatic Differentiation in Julia
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: We present ForwardDiff, a Julia package for forward-mode automatic differentiation (AD) featuring performance competitive with low-level languages like C++. Unlike recently developed AD tools in other popular high-level languages such as Python and MATLAB, ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support for higher-order differentiation and differentiation using custom number types (including complex numbers). For gradient and Jacobian calculations, ForwardDiff provides a variant of vector-forward mode that avoids expensive heap allocation and makes better use of memory bandwidth than traditional vector mode. In our numerical experiments, we demonstrate that for nontrivially large dimensions, ForwardDiff's gradient computations can be faster than a reverse-mode implementation from the Python-based autograd package. We also illustrate how ForwardDiff is used effectively within JuMP, a modeling language for optimization. According to our usage statistics, 41 unique repositories on GitHub depend on ForwardDiff, with users from diverse fields such as astronomy, optimization, finite element analysis, and statistics.

This document is an extended abstract that has been accepted for presentation at AD2016, the 7th International Conference on Algorithmic Differentiation.
DOI: 10.48550/arxiv.1607.07892
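
To accompany the abstract's mention of gradient and Jacobian calculations, the following is a minimal usage sketch, not taken from the paper itself. It assumes the ForwardDiff package is installed and uses its standard entry points (ForwardDiff.gradient, ForwardDiff.jacobian, ForwardDiff.hessian); the example functions f and h are purely illustrative.

```julia
using ForwardDiff

# An AD-unaware, scalar-valued function of a vector argument (illustrative).
f(x) = sum(abs2, x) + prod(x)

x = [1.0, 2.0, 3.0]

# Gradient of f at x via forward-mode AD.
g = ForwardDiff.gradient(f, x)

# Hessian of f at x: higher-order differentiation of the same unchanged code.
H = ForwardDiff.hessian(f, x)

# Jacobian of a vector-valued function (illustrative).
h(x) = [x[1] * x[2], sin(x[3])]
J = ForwardDiff.jacobian(h, x)
```

Because ForwardDiff propagates dual numbers through the user's own generic Julia code and relies on JIT compilation to specialize it, the definitions of f and h above need no AD-specific annotations to support first- and second-order derivatives.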