Novel Neural Networks for Camera Calibration in Underwater Environments
Published in: IEEE Access, 2024-11, Vol. 12, p. 1-1
Format: Article
Language: English
Online access: Full text
Summary: A novel method for camera calibration in underwater environments using convolutional neural networks is presented. Two modified neural network architectures, ZCalibAquaNet and BCalibAquaNet, were developed to estimate the calibration matrix from chessboard images in underwater environments. The process began with training the InceptionResNetV2 network from scratch, followed by adjusting its dense layers for regression and calibration matrix estimation. To ensure a suitable underwater setting, the dataset used in this work was created by the authors with a ZED 2 stereo camera at a resolution of 1280 × 720 pixels and a baseline of 12 cm. A total of 2700 images were captured underwater. Three distinct scenes (clean, green, and blue water) were used to study network performance under different lighting and color conditions. The networks were trained for 256 epochs using Mean Squared Error (MSE) as the loss function, with L2 regularization applied to the dense layers. Additionally, 3D reconstruction of objects with known geometry was performed in underwater environments. The results showed that both networks, ZCalibAquaNet and BCalibAquaNet, provided better camera calibration and improved measurement quality underwater compared to the Zhang and Bouguet models.
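To make the training setup in the abstract concrete, the following is a minimal sketch of a regression head on InceptionResNetV2 in TensorFlow/Keras. The abstract does not specify layer widths or the output parameterization, so the two 512/256-unit dense layers and the 9-value output (a flattened 3 × 3 intrinsic matrix) are illustrative assumptions, not the paper's exact architecture.

```python
# Hedged sketch: InceptionResNetV2 backbone trained from scratch with a
# dense regression head for calibration-matrix estimation. Layer sizes,
# the 9-parameter output, and the L2 weight are assumed for illustration.
import tensorflow as tf
from tensorflow.keras import Model, layers, regularizers
from tensorflow.keras.applications import InceptionResNetV2

def build_calib_net(input_shape=(720, 1280, 3), n_params=9, l2=1e-4):
    # weights=None trains the backbone from scratch, as the abstract describes.
    backbone = InceptionResNetV2(include_top=False, weights=None,
                                 input_shape=input_shape, pooling="avg")
    x = layers.Dense(512, activation="relu",
                     kernel_regularizer=regularizers.l2(l2))(backbone.output)
    x = layers.Dense(256, activation="relu",
                     kernel_regularizer=regularizers.l2(l2))(x)
    # Linear output: direct regression of the calibration matrix entries.
    out = layers.Dense(n_params, activation="linear")(x)
    return Model(backbone.input, out)

model = build_calib_net()
model.compile(optimizer="adam", loss="mse")  # MSE loss, as in the paper
# model.fit(train_images, train_matrices, epochs=256)
```

The L2 penalty on the dense layers mirrors the regularization mentioned in the abstract; applying it only to the head keeps the from-scratch backbone free to fit the underwater image statistics.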
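The link between calibration quality and the reported measurement improvements can be seen in the standard pinhole stereo relation Z = f·B/d, where depth error scales directly with error in the estimated focal length f. The sketch below applies this relation with the 12 cm ZED 2 baseline from the abstract; it is a generic stereo-triangulation example, not the paper's specific reconstruction pipeline, and `focal_px` is assumed to come from the network-estimated calibration matrix.

```python
# Hedged sketch: depth from disparity under the pinhole stereo model.
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m=0.12):
    # Z = f * B / d. baseline_m = 0.12 matches the ZED 2 setup in the
    # abstract; focal_px (in pixels) is taken from the estimated
    # calibration matrix. Small disparities are clamped to avoid
    # division by zero.
    disparity_px = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / np.maximum(disparity_px, 1e-6)
```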
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3509452