Domain Adaptation via CycleGAN for Retina Segmentation in Optical Coherence Tomography
Saved in:

| Field | Value |
|---|---|
| Main Authors | |
| Format | Article |
| Language | English |
| Subjects | |
| Online Access | Order full text |
Summary: With the FDA approval of Artificial Intelligence (AI) for point-of-care clinical diagnoses, model generalizability is of the utmost importance, as clinical decision-making must be domain-agnostic. One way to tackle this problem is to enlarge the training dataset to include images from a multitude of domains; while this technique is ideal, the security requirements around medical data are a major limitation. Additionally, researchers with already-developed tools benefit from the addition of open-source data, but are limited by differences in domain. Here, we investigate the use of a Cycle-Consistent Generative Adversarial Network (CycleGAN) for the domain adaptation of Optical Coherence Tomography (OCT) volumes. This study was done in collaboration with the Biomedical Optics Research Group and the Functional & Anatomical Imaging & Shape Analysis Lab at Simon Fraser University. We investigate a learning-based approach to adapting the domain of a publicly available dataset, the UK Biobank (UKB). To evaluate the performance of the domain adaptation, we use pre-existing retinal layer segmentation tools developed on a different set of RETOUCH OCT data. This study provides insight into state-of-the-art tools for domain adaptation compared with traditional processing techniques, as well as a pipeline for adapting publicly available retinal data to the domains previously used by our collaborators.
DOI: 10.48550/arxiv.2107.02345
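
The abstract's core idea is that unpaired image-to-image translation can restyle OCT B-scans from one scanner's domain (e.g. UKB) to look like another's (e.g. RETOUCH) so that segmentation tools trained on the latter still apply. The sketch below illustrates only the cycle-consistency term of the standard CycleGAN objective (Zhu et al., 2017); the tiny generator architecture, tensor shapes, and variable names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the CycleGAN cycle-consistency objective for unpaired
# OCT domain adaptation. Architectures and shapes here are toy stand-ins.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Toy encoder-decoder standing in for a ResNet-style CycleGAN generator."""
    def __init__(self, ch=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, ch, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

G = TinyGenerator()  # maps source-domain B-scans (e.g. UKB) to target style
F = TinyGenerator()  # maps target-domain B-scans (e.g. RETOUCH) back

l1 = nn.L1Loss()
lambda_cyc = 10.0    # cycle weight used in the original CycleGAN paper

x = torch.randn(4, 1, 128, 128)  # batch of source-domain B-scans (grayscale)
y = torch.randn(4, 1, 128, 128)  # batch of target-domain B-scans

# Cycle consistency: translating to the other domain and back should
# reconstruct the input, preserving anatomy while changing image style.
loss_cyc = l1(F(G(x)), x) + l1(G(F(y)), y)
total = lambda_cyc * loss_cyc    # adversarial GAN terms would be added here
total.backward()
```

In a full training loop, two discriminators and their adversarial losses would be added to `total`, and the adapted UKB volumes would then be fed to the pre-existing RETOUCH-trained layer segmentation tools for evaluation, as the abstract describes.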