On the Computational Power of Online Gradient Descent
We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.

Format: Article
Language: English
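The abstract concerns the trajectory of weight vectors under online gradient descent. For readers unfamiliar with the setting, here is a minimal sketch of the generic OGD update the paper analyzes; the squared loss, learning rate, and examples below are illustrative assumptions, not the paper's polynomial-space-encoding construction.

```python
import numpy as np

def ogd(grad_fn, w0, eta, examples):
    """Online gradient descent: at each step t, update
    w <- w - eta * grad of the loss on the t-th example,
    and record the full weight trajectory."""
    w = np.array(w0, dtype=float)
    trajectory = [w.copy()]
    for x, y in examples:
        w = w - eta * grad_fn(w, x, y)
        trajectory.append(w.copy())
    return trajectory

# Gradient of the squared loss (w.x - y)^2 / 2 for linear prediction:
# d/dw = (w.x - y) * x
grad = lambda w, x, y: (w @ x - y) * x

traj = ogd(grad, w0=[0.0, 0.0], eta=0.1,
           examples=[(np.array([1.0, 0.0]), 1.0),
                     (np.array([0.0, 1.0]), -1.0)])
```

The paper's result is about this trajectory (`traj` above): its fine-grained evolution can simulate arbitrary polynomial-space computations, so predicting it efficiently is intractable under weak complexity-theoretic assumptions.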
DOI: 10.48550/arxiv.1807.01280