The Selective G-Bispectrum and its Inversion: Applications to G-Invariant Networks

Bibliographic Details
Published in: arXiv.org, 2024-11
Main Authors: Mataigne, Simon; Mathe, Johan; Sanborn, Sophia; Hillar, Christopher; Miolane, Nina
Format: Article
Language: English
Online Access: Full text
Description
Summary: An important problem in signal processing and deep learning is to achieve \textit{invariance} to nuisance factors not relevant for the task. Since many of these factors are describable as the action of a group \(G\) (e.g. rotations, translations, scalings), we want methods to be \(G\)-invariant. The \(G\)-Bispectrum extracts every characteristic of a given signal up to group action: for example, the shape of an object in an image, but not its orientation. Consequently, the \(G\)-Bispectrum has been incorporated into deep neural network architectures as a computational primitive for \(G\)-invariance\textemdash akin to a pooling mechanism, but with greater selectivity and robustness. However, the computational cost of the \(G\)-Bispectrum (\(\mathcal{O}(|G|^2)\), with \(|G|\) the size of the group) has limited its widespread adoption. Here, we show that the \(G\)-Bispectrum computation contains redundancies that can be reduced into a \textit{selective \(G\)-Bispectrum} with \(\mathcal{O}(|G|)\) complexity. We prove desirable mathematical properties of the selective \(G\)-Bispectrum and demonstrate how its integration in neural networks enhances accuracy and robustness compared to traditional approaches, while enjoying considerable speed-ups over the full \(G\)-Bispectrum.
ISSN: 2331-8422
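
For intuition on the complexity claim in the abstract, consider the simplest case \(G = \mathbb{Z}_n\) acting by circular shifts, where the \(G\)-Bispectrum reduces to the classical bispectrum \(B(k_1, k_2) = F(k_1)\,F(k_2)\,\overline{F(k_1 + k_2)}\) of the discrete Fourier transform \(F\): it is shift-invariant and has \(|G|^2\) coefficients, matching the \(\mathcal{O}(|G|^2)\) cost. The sketch below is a minimal NumPy illustration of this cyclic-group special case only; the 1-D slice shown is merely an example of an \(\mathcal{O}(|G|)\)-sized invariant subset and is not the paper's selective construction, which is defined for general (including non-commutative) groups.

import numpy as np

def full_bispectrum(x):
    """Full bispectrum of a signal on the cyclic group Z_n.

    For G = Z_n acting by circular shifts, B(k1, k2) = F(k1) * F(k2) * conj(F(k1 + k2 mod n)),
    where F = DFT(x). It is shift-invariant and has n^2 = |G|^2 entries,
    which is the O(|G|^2) cost referred to in the abstract.
    """
    F = np.fft.fft(x)
    n = len(x)
    k = np.arange(n)
    return np.outer(F, F) * np.conj(F[(k[:, None] + k[None, :]) % n])

def bispectrum_slice(x):
    """One row B(1, k), k = 0..n-1: only |G| coefficients.

    Purely illustrative of an O(|G|)-sized shift-invariant subset;
    not the selective G-Bispectrum defined in the paper.
    """
    F = np.fft.fft(x)
    k = np.arange(len(x))
    return F[1] * F * np.conj(F[(1 + k) % len(x)])

# Shift invariance: a circular shift of x leaves both quantities unchanged.
rng = np.random.default_rng(0)
x = rng.normal(size=16)
x_shifted = np.roll(x, 5)
assert np.allclose(full_bispectrum(x), full_bispectrum(x_shifted))
assert np.allclose(bispectrum_slice(x), bispectrum_slice(x_shifted))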