Multimodal image registration based on the expectation–maximisation methodology

Bibliographic Details
Published in: IET Image Processing, 2017-12, Vol. 11 (12), pp. 1246-1253
Main authors: Arce-Santana, Edgar R, Campos-Delgado, Daniel U, Reducindo, Isnardo, Mejia-Rodriguez, Aldo R
Format: Article
Language: English
Description
Abstract: In this study, a new framework for multimodal image registration is proposed based on the expectation–maximisation (EM) methodology. This framework makes it possible to address parametric and elastic registration simultaneously, regardless of the modality of the target and source images and without making any assumptions about their intensity relationship. The EM formulation of the image registration problem leads to a regularised quadratic optimisation scheme that computes the displacement vector field (DVF) aligning the images and depends on their joint intensity distribution. In the first stage, a parametric transformation is assumed for the DVF, and the resulting quadratic optimisation is solved recursively to calculate its optimal parameters. Next, a general unknown deformation models the elastic part of the DVF, which is represented by an additive structure. The resulting EM optimisation yields a cost function involving data and regularisation terms, which is also solved recursively. A comprehensive evaluation of the parametric and elastic proposals is carried out by comparison with state-of-the-art algorithms on images from different application fields, where the authors' proposal shows an advantage in terms of the compromise between accuracy and robustness.
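The abstract describes an alternation that is broadly EM-like: estimate the joint intensity relationship between the images, then minimise a regularised quadratic cost over the displacement field. The sketch below only illustrates that general flavour under strong simplifying assumptions of our own (a translation-only parametric model, a per-bin conditional-mean intensity model, a greedy grid search, and no regularisation term); it is not the authors' algorithm, and all function names are hypothetical.

```python
# Illustrative EM-style alternation for multimodal registration (toy sketch).
# NOT the paper's formulation: the parametric model is a pure 2-D translation,
# the intensity model is a per-bin conditional mean of the target given the
# binned warped source, and the elastic/regularisation stage is omitted.
import numpy as np
from scipy.ndimage import gaussian_filter, shift as nd_shift


def estimate_intensity_map(source, target, n_bins=64):
    """E-step surrogate: expected target intensity for each source-intensity bin."""
    bins = np.linspace(source.min(), source.max(), n_bins + 1)
    idx = np.clip(np.digitize(source, bins) - 1, 0, n_bins - 1)
    mapped = np.zeros_like(source, dtype=float)
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            mapped[mask] = target[mask].mean()
    return mapped


def register_translation(source, target, n_iter=30,
                         steps=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """M-step surrogate: greedily update a translation that lowers the
    quadratic data term between the remapped source and the target."""
    t = np.zeros(2)
    best_cost = np.mean((estimate_intensity_map(source, target) - target) ** 2)
    for _ in range(n_iter):
        best_t = t
        for dy in steps:
            for dx in steps:
                cand = t + np.array([dy, dx])
                w = nd_shift(source, cand, order=1, mode='nearest')
                r = estimate_intensity_map(w, target)   # E-step
                cost = np.mean((r - target) ** 2)       # quadratic data term
                if cost < best_cost:
                    best_t, best_cost = cand, cost
        if np.allclose(best_t, t):
            break                                       # no further improvement
        t = best_t
    return t


if __name__ == "__main__":
    # Toy usage: a smooth image, geometrically shifted and non-linearly
    # remapped to mimic a different modality.
    rng = np.random.default_rng(0)
    target = gaussian_filter(rng.random((64, 64)), sigma=4)
    source = np.exp(nd_shift(target, (2.0, -1.0), order=1, mode='nearest'))
    print(register_translation(source, target))  # ideally close to (-2., 1.)
```

Because the intensity map is re-estimated at every candidate warp, no functional relationship between the two modalities has to be assumed in advance, which mirrors the motivation stated in the abstract.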
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/iet-ipr.2017.0234