Multi-Organ Registration With Continual Learning


Bibliographic Details
Published in: IEEE Signal Processing Letters, 2024, Vol. 31, pp. 1204-1208
Authors: Ding, Wangbin; Sun, Haoran; Pei, Chenhao; Jia, Dengqiang; Huang, Liqin
Format: Article
Language: English
Description
Abstract: Neural networks have found widespread application in medical image registration, although they typically assume access to the entire training dataset during training. In clinical scenarios, medical images of various anatomical targets, such as the heart, brain, and liver, may be obtained successively with advancements in imaging technologies and diagnostic procedures. The accuracy of registration on a new target may degrade over time, as the registration models become outdated due to domain shifts occurring at unpredictable intervals. In this study, we introduce a deep registration model based on continual learning to mitigate the issue of catastrophic forgetting during training with continuous data streams. To enable continuous network training, we propose a dynamic memory system based on a density-based clustering algorithm to retain representative samples from the data stream. Training the registration network on these representative samples enhances its generalization capabilities to accommodate new targets within the data stream. We evaluated our approach using the CHAOS dataset, which comprises multiple targets, such as the liver, left kidney, and spleen, to simulate a data stream. The experimental findings illustrate that the proposed continual registration network achieves comparable performance to a model trained with full data visibility.
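The abstract does not spell out how the dynamic memory selects representative samples. Below is a minimal illustrative sketch of the general idea only, not the authors' implementation: a fixed-capacity buffer that, on overflow, clusters its contents with DBSCAN (an assumed choice of density-based algorithm) and keeps one representative per dense cluster. Feature vectors stand in for image embeddings; `eps`, `min_samples`, and `capacity` are hypothetical hyperparameters.

```python
# Illustrative sketch of a density-based replay memory for a data stream.
# Not the paper's code: clustering algorithm and parameters are assumptions.
import numpy as np
from sklearn.cluster import DBSCAN

class ClusteredMemory:
    """Retain at most `capacity` representative feature vectors."""

    def __init__(self, capacity=16, eps=0.3, min_samples=2):
        self.capacity = capacity
        self.eps = eps
        self.min_samples = min_samples
        self.samples = []  # retained 1-D feature vectors

    def add(self, x):
        """Append a new sample; compress the buffer when it overflows."""
        self.samples.append(np.asarray(x, dtype=float))
        if len(self.samples) > self.capacity:
            self._compress()

    def _compress(self):
        X = np.stack(self.samples)
        labels = DBSCAN(eps=self.eps,
                        min_samples=self.min_samples).fit_predict(X)
        kept = []
        for lab in np.unique(labels):
            members = X[labels == lab]
            if lab == -1:
                # Noise points may stem from an unseen domain: keep them.
                kept.extend(members)
            else:
                # One representative per dense cluster: the member
                # closest to the cluster centroid.
                centroid = members.mean(axis=0)
                kept.append(members[np.argmin(
                    np.linalg.norm(members - centroid, axis=1))])
        self.samples = kept[: self.capacity]
```

Near-duplicate samples collapse into a single representative, so the buffer stays small while still spanning the domains seen so far, which is the property that lets replay on the buffer counter catastrophic forgetting.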
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2024.3388954