Retinal OCT Layer Segmentation via Joint Motion Correction and Graph-Assisted 3D Neural Network
Saved in:
| Published in: | IEEE Access 2023, Vol. 11, pp. 103319-103332 |
|---|---|
| Main authors: | , , , , , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| Summary: | Optical Coherence Tomography (OCT) is a widely used 3D imaging technology in ophthalmology. Segmentation of retinal layers in OCT is important for the diagnosis and evaluation of various retinal and systemic diseases. While 2D segmentation algorithms have been developed, they do not fully utilize contextual information and suffer from inconsistency in 3D. We propose neural networks that combine motion correction and segmentation in 3D. The proposed segmentation network utilizes 3D convolution and a novel graph pyramid structure with graph-inspired building blocks. We also collected one of the largest OCT segmentation datasets, with manually corrected segmentation covering both normal examples and various diseases. Experimental results on three datasets with multiple instruments and various diseases show that the proposed method achieves improved segmentation accuracy compared with commercial software and with conventional or deep learning methods in the literature. Specifically, for severe deformations caused by diseases, the proposed method reduced the average error from 38.47% to 11.43% relative to clinically available commercial software. The diagnosis and evaluation of diseases with large deformations, such as DME, wet AMD, and CRVO, would greatly benefit from the improved accuracy, which impacts tens of millions of patients. |
| ISSN: | 2169-3536 |
| DOI: | 10.1109/ACCESS.2023.3317011 |
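
The record carries no implementation details. Purely as an illustration of the kind of components the summary mentions (3D convolutions plus a graph-inspired feature-mixing step), the following is a minimal sketch assuming PyTorch. The module names, the learned depth-adjacency, and all hyperparameters are hypothetical and are not the authors' graph pyramid architecture; the depth-axis mixing is just one plausible reading of "graph-inspired".

```python
# Illustrative sketch only -- the paper's actual "graph pyramid" is not described
# in this record. Assumes PyTorch; shows a 3D convolutional encoder block plus a
# simple graph-style feature-mixing layer of the sort a 3D OCT layer-segmentation
# network might use.
import torch
import torch.nn as nn


class ConvBlock3D(nn.Module):
    """Two 3D convolutions with batch norm and ReLU, as in a typical 3D U-Net stage."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm3d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm3d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class GraphMix(nn.Module):
    """Hypothetical graph-inspired layer: mixes features along the depth axis with
    a learned, row-normalized adjacency, so each depth position can aggregate
    evidence from neighbouring depths."""
    def __init__(self, depth):
        super().__init__()
        # Start near the identity so mixing is initially mild.
        self.adj = nn.Parameter(torch.eye(depth) + 0.01 * torch.randn(depth, depth))

    def forward(self, x):
        # x: (batch, channels, depth, height, width)
        a = torch.softmax(self.adj, dim=-1)           # normalize neighbour weights per row
        return torch.einsum('dk,bckhw->bcdhw', a, x)  # mix features across depth


if __name__ == "__main__":
    vol = torch.randn(1, 1, 32, 64, 64)               # toy OCT volume (B, C, D, H, W)
    feats = ConvBlock3D(1, 8)(vol)
    mixed = GraphMix(depth=32)(feats)
    print(mixed.shape)                                # torch.Size([1, 8, 32, 64, 64])
```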