IACC: Cross-Illumination Awareness and Color Correction for Underwater Images Under Mixed Natural and Artificial Lighting
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2024, Vol. 62, pp. 1-15
Format: Article
Language: English
Abstract: Enhancing underwater images captured under mixed artificial and natural lighting presents two critical challenges. First, existing methods lack a unified luminance feature extraction paradigm for mixed-lighting scenes, leading to imbalanced luminance features and, consequently, local overexposure or underexposure. Second, some color correction methods that fuse features across multiple color spaces neglect the information lost when features are not aligned during cross-space fusion. To address these challenges, we propose IACC, a specialized method that unifies the luminance features of underwater images under mixed lighting and guides consistent enhancement across regions of similar luminance. Furthermore, complementary colors are introduced to globally guide the correction of color discrepancies, preserving structural consistency and mitigating potential loss of structural information during feature extraction from the original image. Extensive experiments on various underwater datasets demonstrate the superiority of our method, which outperforms state-of-the-art methods in both machine and human visual perception. Our code is available at https://github.com/zhoujingchun03/IACC.
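To make the abstract's idea of complementary-color guidance concrete, the sketch below computes the channel-wise complement of an RGB image, the simplest common definition of a complementary color. This is an illustrative assumption, not the paper's actual formulation: the function name `complementary_image` and the [0, 1] value range are ours, and the authors' real implementation is at the GitHub link above.

```python
import numpy as np

def complementary_image(img: np.ndarray) -> np.ndarray:
    """Channel-wise complement of an RGB image.

    Assumes float values in [0, 1]. Inverting each channel
    (1 - c) is the standard complementary-color definition;
    IACC's exact formulation may differ.
    """
    return 1.0 - img

# A reddish underwater cast maps to its cyan complement, the kind of
# global cue that could steer correction of color discrepancies.
pixel = np.array([[[0.8, 0.3, 0.2]]])   # one RGB pixel
print(complementary_image(pixel))       # -> [[[0.2, 0.7, 0.8]]]
```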
ISSN: 0196-2892 (print); 1558-0644 (online)
DOI: 10.1109/TGRS.2023.3346384