Kernel vs. Kernel: Exploring How the Data Structure Affects Neural Collapse
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Recently, a vast amount of literature has focused on the "Neural Collapse" (NC) phenomenon, which emerges when neural network (NN) classifiers are trained beyond the zero training error point. The core component of NC is the decrease in the within-class variability of the network's deepest features, dubbed NC1. The theoretical works that study NC are typically based on simplified unconstrained features models (UFMs), which mask any effect of the data on the extent of collapse. In this paper, we provide a kernel-based analysis that does not suffer from this limitation. First, given a kernel function, we establish expressions for the traces of the within- and between-class covariance matrices of the samples' features (and consequently an NC1 metric). We then focus on kernels associated with shallow NNs. First, we consider the NN Gaussian Process kernel (NNGP), associated with the network at initialization, and the complementary Neural Tangent Kernel (NTK), associated with its training in the "lazy regime". Interestingly, we show that the NTK does not represent more collapsed features than the NNGP for prototypical data models. Since NC emerges from training, we then consider an alternative to the NTK: the recently proposed adaptive kernel, which generalizes the NNGP to model the feature mapping learned from the training data. Contrasting our NC1 analysis for these two kernels yields insights into the effect of the data distribution on the extent of collapse, which are empirically aligned with the behavior observed in practical training of NNs.
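The abstract does not spell out the paper's exact expressions, but a minimal sketch of the kernel-based computation is possible under common conventions: the traces of the within- and between-class covariance matrices of the features can be read off the Gram matrix K alone, and their ratio Tr(Sigma_W)/Tr(Sigma_B) is a standard NC1 measure. The sketch below (Python/NumPy) also includes the textbook closed forms of the NNGP and NTK for a one-hidden-layer ReLU network; the function names, the ratio-style NC1 metric, and the scaling conventions are illustrative assumptions, not necessarily those used in the paper.

```python
import numpy as np

def nngp_relu(X1, X2):
    """NNGP kernel of a one-hidden-layer ReLU network at initialization
    (arc-cosine kernel of degree 1, standard N(0, I) weight prior;
    scaling conventions may differ from the paper's)."""
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos = np.clip((X1 @ X2.T) / np.outer(n1, n2), -1.0, 1.0)
    theta = np.arccos(cos)
    return np.outer(n1, n2) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

def ntk_relu(X1, X2):
    """NTK of the same shallow ReLU network in the lazy regime:
    the NNGP term (output-layer gradients) plus the derivative
    (arc-cosine degree 0) term from the input-layer gradients."""
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos = np.clip((X1 @ X2.T) / np.outer(n1, n2), -1.0, 1.0)
    theta = np.arccos(cos)
    return nngp_relu(X1, X2) + (X1 @ X2.T) * (np.pi - theta) / (2 * np.pi)

def nc1_from_kernel(K, y):
    """Tr(Sigma_W) and Tr(Sigma_B) of the feature map underlying kernel K,
    computed from the Gram matrix alone, combined into the common ratio
    Tr(Sigma_W) / Tr(Sigma_B) as a (hypothetical) NC1 metric."""
    N = len(y)
    # (1/N) sum_i ||phi(x_i)||^2
    mean_diag = np.trace(K) / N
    # sum_c (n_c/N) ||mu_c||^2, with ||mu_c||^2 = (1/n_c^2) sum_{i,j in c} K_ij
    class_means = 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        class_means += K[np.ix_(idx, idx)].sum() / (N * len(idx))
    # ||mu_G||^2 for the global feature mean
    global_mean = K.sum() / N**2
    tr_within = mean_diag - class_means   # Tr(Sigma_W)
    tr_between = class_means - global_mean  # Tr(Sigma_B)
    return tr_within / tr_between

# Usage on synthetic clustered data (assumed setup, for illustration only):
rng = np.random.default_rng(0)
centers = rng.normal(size=(4, 10))
X = np.repeat(centers, 50, axis=0) + 0.5 * rng.normal(size=(200, 10))
y = np.repeat(np.arange(4), 50)
print("NC1 (NNGP):", nc1_from_kernel(nngp_relu(X, X), y))
print("NC1 (NTK): ", nc1_from_kernel(ntk_relu(X, X), y))
```

Comparing the two printed values on a given data model mirrors, in spirit, the paper's NNGP-vs-NTK contrast: a smaller ratio means more collapsed features under that kernel.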
DOI: 10.48550/arxiv.2406.02105