Hopfield-type neural ordinary differential equation for robust machine learning
Published in: Pattern Recognition Letters, 2021-12, Vol. 152, pp. 180-187
Format: Article
Language: English
Online access: Full text
Abstract:
Highlights:
• We propose a neural ODE inspired by Hopfield-type neural networks.
• We prove the stability of both the transient and steady-state outputs of the proposed ODE.
• We provide a method for stabilizing the output of the proposed ODE layer.
• We show experimentally that the proposed layer improves adversarial robustness.
Neural networks are vulnerable to adversarial input perturbations that are imperceptible to humans, which calls for robust machine learning in safety-critical applications. In this paper, we propose a new neural ODE layer inspired by Hopfield-type neural networks. We prove that the proposed ODE layer is globally asymptotically stable on the projected space, which implies the existence and uniqueness of its steady state. We further show that the proposed layer satisfies a local stability condition under which the output is Lipschitz continuous in the ODE layer input, guaranteeing that the norm of a perturbation to the hidden state does not grow over time. Through experiments, we show that an appropriate level of stability constraints imposed on the proposed ODE layer can improve the adversarial robustness of ODE layers, and we present a heuristic method for finding good hyperparameters for the stability constraints.
ISSN: 0167-8655, 1872-7344
DOI: 10.1016/j.patrec.2021.10.008
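
The abstract above describes a Hopfield-type neural ODE layer whose dynamics are constrained so that perturbations of the hidden state do not grow over time. Below is a minimal, hypothetical sketch of such a layer in PyTorch, not the authors' implementation: the dynamics dh/dt = -h + W tanh(h) + Ux, the fixed-step Euler solver, the 0.9 Frobenius-norm cap on W, and the class name HopfieldODELayer are all assumptions chosen for illustration.

# Minimal, hypothetical sketch of a Hopfield-type neural ODE layer; not the
# paper's implementation. Assumed dynamics: dh/dt = -h + W tanh(h) + U x,
# integrated with a fixed-step explicit Euler solver.
import torch
import torch.nn as nn


class HopfieldODELayer(nn.Module):
    def __init__(self, dim: int, in_dim: int, steps: int = 20, dt: float = 0.1):
        super().__init__()
        self.W = nn.Parameter(0.05 * torch.randn(dim, dim))  # recurrent weights
        self.U = nn.Linear(in_dim, dim)                       # input projection
        self.steps = steps                                    # number of Euler steps
        self.dt = dt                                          # Euler step size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        u = self.U(x)            # constant input drive to the ODE
        h = torch.zeros_like(u)  # hidden state starts at the origin
        # Cap the Frobenius norm of W at 0.9; since ||W||_2 <= ||W||_F < 1 and
        # tanh is 1-Lipschitz, the dynamics -h + W tanh(h) + u are contracting,
        # so perturbations of h shrink over time (a crude stability constraint).
        W = 0.9 * self.W / self.W.norm().clamp(min=0.9)
        for _ in range(self.steps):  # explicit Euler integration
            dh = -h + torch.tanh(h) @ W.T + u
            h = h + self.dt * dh
        return h  # approximation of the steady state


# Usage: the layer maps a 128-dim feature vector to a 64-dim stabilized state.
layer = HopfieldODELayer(dim=64, in_dim=128)
features = torch.randn(8, 128)
print(layer(features).shape)  # torch.Size([8, 64])

Rescaling W on every forward pass, rather than only at initialization, is one simple way to keep the recurrent weights inside a contracting regime throughout training; the exact stability condition and hyperparameter heuristic used in the paper may differ.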