A Trace-restricted Kronecker-Factored Approximation to Natural Gradient
Format: | Article |
Language: | English |
Online access: | Order full text |
Abstract: | Second-order optimization methods can accelerate convergence by modifying the gradient through the curvature matrix, and there have been many attempts to apply them to training deep neural networks. Inspired by diagonal approximations and factored approximations such as Kronecker-Factored Approximate Curvature (KFAC), we propose a new approximation to the Fisher information matrix (FIM), called Trace-restricted Kronecker-factored Approximate Curvature (TKFAC), which preserves a certain trace relationship between the exact and the approximate FIM. In TKFAC, we decompose each block of the approximate FIM as a Kronecker product of two smaller matrices, scaled by a trace-related coefficient. We theoretically analyze TKFAC's approximation error and give an upper bound on it. We also propose a new damping technique for TKFAC on convolutional neural networks to maintain the advantage of second-order optimization methods during training. Experiments show that our method outperforms several state-of-the-art algorithms on some deep network architectures. |
DOI: | 10.48550/arxiv.2011.10741 |
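
The abstract's central idea, approximating each FIM block as a Kronecker product scaled so that its trace matches the exact block, can be illustrated with a short sketch. This is a hedged reconstruction, not the paper's code: the function name `tkfac_block` and the array shapes are assumptions, and the coefficient shown is simply the one that equates the approximate and exact traces via the identity tr(A ⊗ G) = tr(A) tr(G); the paper's exact coefficient and estimators may differ.

```python
import numpy as np

def tkfac_block(a, g):
    """Trace-matched Kronecker approximation of one FIM block (sketch).

    a: (N, d_in) layer inputs, g: (N, d_out) pre-activation gradients,
    one row per sample; the exact block is F = E[vec(g a^T) vec(g a^T)^T].
    """
    N = a.shape[0]
    A = a.T @ a / N   # Kronecker factor from inputs,    E[a a^T]
    G = g.T @ g / N   # Kronecker factor from gradients, E[g g^T]
    # tr(F) = E[||a||^2 ||g||^2] while tr(A (x) G) = tr(A) tr(G), so this
    # coefficient restores the exact trace on the Kronecker approximation.
    tr_F = np.mean(np.sum(a * a, axis=1) * np.sum(g * g, axis=1))
    s = tr_F / (np.trace(A) * np.trace(G))
    return s, A, G    # approximate FIM block: s * kron(A, G)

# Usage with random stand-in statistics (illustrative only):
rng = np.random.default_rng(0)
a = rng.standard_normal((256, 20))
g = rng.standard_normal((256, 10))
s, A, G = tkfac_block(a, g)
F_approx = s * np.kron(A, G)
```

Because tr(A) = E[||a||^2] and tr(G) = E[||g||^2], the returned coefficient makes tr(s · A ⊗ G) equal to the exact block's trace by construction, which is the trace restriction the title refers to.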