Accelerating Hopfield Network Dynamics: Beyond Synchronous Updates and Forward Euler
Saved in:

| Main author | , , |
|---|---|
| Format | Article |
| Language | eng |
| Subjects | |
| Online access | Order full text |
Abstract: The Hopfield network serves as a fundamental energy-based model in machine learning, capturing memory retrieval dynamics through an ordinary differential equation (ODE). The model's output, the equilibrium point of the ODE, is traditionally computed via synchronous updates using the forward Euler method. This paper aims to overcome some of the disadvantages of this approach. We propose a conceptual shift, viewing Hopfield networks as instances of Deep Equilibrium Models (DEQs). The DEQ framework not only allows for the use of specialized solvers, but also leads to new insights on an empirical inference technique that we will refer to as 'even-odd splitting'. Our theoretical analysis of the method uncovers a parallelizable asynchronous update scheme, which should converge roughly twice as fast as the conventional synchronous updates. Empirical evaluations validate these findings, showcasing the advantages of both the DEQ framework and even-odd splitting in digitally simulating energy minimization in Hopfield networks. The code is available at https://github.com/cgoemaere/hopdeq
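The contrast the abstract draws, synchronous forward-Euler updates versus an even-odd (asynchronous, Gauss-Seidel-style) sweep, can be illustrated with a small NumPy sketch. This is not the paper's implementation (see the linked repository for that); the network size, the random weights, the step size, and the tanh nonlinearity are placeholder assumptions chosen so that both iterations contract to the same equilibrium of dx/dt = -x + W σ(x) + b.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Symmetric coupling matrix with zero diagonal (classical Hopfield assumption),
# scaled down so that both iteration schemes are contractive.
W = rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
W /= 2 * n
b = rng.standard_normal(n)

sigma = np.tanh  # pointwise nonlinearity (assumed for this sketch)

def residual(x):
    # Right-hand side of dx/dt = -x + W @ sigma(x) + b; zero at equilibrium.
    return -x + W @ sigma(x) + b

def synchronous_euler(x, eta=0.5, steps=200):
    # Conventional scheme: all neurons take a forward-Euler step at once.
    x = x.copy()
    for _ in range(steps):
        x += eta * residual(x)
    return x

def even_odd(x, eta=0.5, steps=200):
    # Even-odd splitting: update even-indexed neurons first, then
    # odd-indexed neurons using the fresh even values. Each half-sweep
    # touches independent coordinates and is parallelizable.
    x = x.copy()
    even = np.arange(0, n, 2)
    odd = np.arange(1, n, 2)
    for _ in range(steps):
        for idx in (even, odd):
            x[idx] += eta * (-x[idx] + W[idx] @ sigma(x) + b[idx])
    return x

x0 = np.zeros(n)
x_sync = synchronous_euler(x0)
x_eo = even_odd(x0)
print(np.linalg.norm(residual(x_sync)))
print(np.linalg.norm(residual(x_eo)))
```

Both runs drive the fixed-point residual to (numerically) zero; a point left unchanged by both half-sweeps is exactly an equilibrium of the full ODE, which is the DEQ view of the network's output.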
DOI: 10.48550/arxiv.2311.15673