Wilsonian Renormalization of Neural Network Gaussian Processes
Format: Article
Language: English
Abstract: Separating relevant and irrelevant information is key to any modeling process or scientific inquiry. Theoretical physics offers a powerful tool for achieving this in the form of the renormalization group (RG). Here we demonstrate a practical approach to performing Wilsonian RG in the context of Gaussian Process (GP) Regression. We systematically integrate out the unlearnable modes of the GP kernel, thereby obtaining an RG flow of the GP in which the data sets the IR scale. In simple cases, this results in a universal flow of the ridge parameter, which becomes input-dependent in the richer scenario in which non-Gaussianities are included. In addition to being analytically tractable, this approach goes beyond structural analogies between RG and neural networks by providing a natural connection between RG flow and learnable vs. unlearnable modes. Studying such flows may improve our understanding of feature learning in deep neural networks, and enable us to identify potential universality classes in these models.
DOI: 10.48550/arxiv.2405.06008
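The abstract's central idea is that kernel modes whose eigenvalues lie far below the ridge (noise) scale are effectively unlearnable, so integrating them out leaves predictions essentially unchanged while renormalizing the ridge parameter. The following numerical sketch illustrates only this intuition; it is not the paper's derivation, and the kernel choice, length scale, ridge value, and eigenvalue cutoff are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only, NOT the paper's method: the kernel, length scale,
# ridge value, and eigenvalue cutoff below are assumptions for demonstration.
rng = np.random.default_rng(0)

n = 50
x = np.linspace(-1.0, 1.0, n)[:, None]
y = np.sin(3.0 * x[:, 0]) + 0.1 * rng.standard_normal(n)

def rbf_kernel(a, b, ell=0.3):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * (a - b.T) ** 2 / ell ** 2)

K = rbf_kernel(x, x)
ridge = 1e-2  # "bare" ridge / observation-noise parameter

# Exact GP posterior mean at the training inputs: K (K + ridge * I)^{-1} y
mean_full = K @ np.linalg.solve(K + ridge * np.eye(n), y)

# Decompose the kernel into modes and drop those with eigenvalues far below
# the ridge scale. Each mode enters the fit with weight lam / (lam + ridge),
# so modes with lam << ridge are effectively unlearnable at this noise level.
evals, evecs = np.linalg.eigh(K)
keep = evals > 1e-6 * evals.max()
K_truncated = (evecs[:, keep] * evals[keep]) @ evecs[:, keep].T
mean_truncated = K_truncated @ np.linalg.solve(K_truncated + ridge * np.eye(n), y)

# Dropping the unlearnable modes barely changes the prediction.
print("modes kept:", int(keep.sum()), "of", n)
print("max prediction change:", np.max(np.abs(mean_full - mean_truncated)))
```

Under these assumptions the prediction changes by far less than the noise level, even though many kernel modes are discarded, which is the sense in which those modes carry no learnable information for this data set.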