Any Deep ReLU Network is Shallow
Format: | Article |
---|---|
Language: | English |
Online access: | Order full text |
Abstract: | We constructively prove that every deep ReLU network can be rewritten as a functionally identical three-layer network with weights valued in the extended reals. Based on this proof, we provide an algorithm that, given a deep ReLU network, finds the explicit weights of the corresponding shallow network. The resulting shallow network is transparent and is used to generate explanations of the model's behaviour. |
DOI: | 10.48550/arxiv.2306.11827 |
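
The paper's construction is not reproduced in this record, but the fact it rests on is easy to see in one dimension: a deep ReLU network computes a continuous piecewise-linear function, and in 1D any such function can be written with a single hidden ReLU layer. The NumPy sketch below is a minimal illustration of that special case; all names (`deep`, `shallow`, the layer shapes) are hypothetical and not taken from the paper, whose actual algorithm instead produces exactly three layers with extended-real weights for inputs of any dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical deep ReLU network on a scalar input: three hidden layers of width 4.
Ws = [rng.normal(size=(4, 1)), rng.normal(size=(4, 4)), rng.normal(size=(4, 4))]
bs = [rng.normal(size=4), rng.normal(size=4), rng.normal(size=4)]
w_out, b_out = rng.normal(size=4), rng.normal()

def deep(x):
    """Evaluate the deep network at a batch of scalar inputs."""
    h = np.atleast_1d(np.asarray(x, dtype=float)).reshape(-1, 1)
    for W, b in zip(Ws, bs):
        h = np.maximum(h @ W.T + b, 0.0)
    return h @ w_out + b_out

# Sample the function densely on an interval; as a composition of affine maps
# and ReLUs, it is continuous and piecewise linear there.
xs = np.linspace(-3.0, 3.0, 20001)
ys = deep(xs)
slopes = np.diff(ys) / np.diff(xs)

# Approximate breakpoints: grid cells where the local slope changes.
# (Breakpoints closer together than the grid spacing get merged.)
jumps = np.where(np.abs(np.diff(slopes)) > 1e-6)[0]
ts = xs[jumps + 1]                          # approximate breakpoint locations
deltas = slopes[jumps + 1] - slopes[jumps]  # slope change at each breakpoint

def shallow(x):
    """One-hidden-layer ReLU network equal to `deep` on [-3, 3], up to grid error."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    base = ys[0] + slopes[0] * (x - xs[0])  # leftmost linear piece
    return base + np.maximum(x[:, None] - ts, 0.0) @ deltas

x_test = rng.uniform(-3.0, 3.0, size=1000)
print("breakpoints found:", len(ts))
print("max |deep - shallow|:", np.max(np.abs(deep(x_test) - shallow(x_test))))
```

This 1D trick does not generalize directly to higher-dimensional inputs, where the linear regions are polyhedra rather than intervals; that is where the paper's extended-real weights come in.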