Optimizing Hyperspectral Imaging Classification Performance with CNN and Batch Normalization
Published in: Applied Spectroscopy Practica, 2023-09, Vol. 1 (2)
Authors:
Format: Article
Language: English
Online access: Full text
Abstract: Background: Hyperspectral imaging systems face numerous challenges in acquiring accurate spatial-spectral hypercubes due to sample surface heterogeneity, environmental instability, and instrumental noise. Preprocessing strategies such as outlier detection, calibration, smoothing, and normalization are typically employed to address these issues, with the appropriate techniques selected by evaluating prediction performance. However, the risk of applying an inappropriate preprocessing method remains a concern. Methods: In this study, we evaluate the impact of five normalization methods on the classification performance of six different classifiers using honey hyperspectral images. Our results show that different classifiers are compatible with different normalization techniques, and that using Batch Normalization with Convolutional Neural Networks (CNN) can significantly improve classification performance and reduce the performance differences among normalization techniques. The CNN with Batch Normalization achieves a macro average F1 score of ≥0.99 with four different normalization methods and ≥0.97 without normalization. Furthermore, we analyze the distribution of kernel weights in the final convolutional layers of the CNN models using statistical measures and kernel density estimation (KDE) plots. Results: We find that the performance improvements resulting from adding BatchNorm layers are associated with kernel weight range, kurtosis, and density around 0. However, the differences among normalization methods do not show a strong correlation with kernel weight distribution. In conclusion, our findings demonstrate that a CNN with Batch Normalization layers can achieve better prediction results and avoid the risk of inappropriate normalization.
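The abstract describes a CNN whose convolutional blocks include Batch Normalization, together with an analysis of the final convolutional layer's kernel weights (range, kurtosis, and KDE-estimated density around 0). The sketch below is a minimal illustration of that idea, not the authors' code: the framework (PyTorch/SciPy), band count, class count, and layer sizes are assumptions chosen for illustration only.

```python
# Minimal sketch (assumptions, not the authors' implementation): a 1D spectral CNN
# with optional BatchNorm layers, plus the kernel-weight statistics the abstract mentions.
import torch
import torch.nn as nn
from scipy.stats import gaussian_kde, kurtosis


class SpectralCNN(nn.Module):
    def __init__(self, n_bands=200, n_classes=6, use_batchnorm=True):
        super().__init__()

        def block(c_in, c_out):
            layers = [nn.Conv1d(c_in, c_out, kernel_size=5, padding=2)]
            if use_batchnorm:
                layers.append(nn.BatchNorm1d(c_out))  # BatchNorm after each convolution
            layers += [nn.ReLU(), nn.MaxPool1d(2)]
            return layers

        self.features = nn.Sequential(*block(1, 16), *block(16, 32), *block(32, 64))
        self.classifier = nn.Linear(64 * (n_bands // 8), n_classes)

    def forward(self, x):  # x: (batch, 1, n_bands) pixel spectra
        return self.classifier(self.features(x).flatten(1))


model = SpectralCNN()

# Kernel-weight analysis of the final convolutional layer: range, kurtosis,
# and KDE-estimated density around 0, in the spirit of the abstract's analysis.
last_conv = [m for m in model.modules() if isinstance(m, nn.Conv1d)][-1]
w = last_conv.weight.detach().flatten().numpy()
kde = gaussian_kde(w)
print(f"range={w.max() - w.min():.3f}  kurtosis={kurtosis(w):.3f}  density(0)={kde(0.0)[0]:.3f}")
```

Setting use_batchnorm=False would give a BatchNorm-free variant of this sketch, analogous to the comparison the abstract draws when attributing performance gains to the added BatchNorm layers.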
Graphical abstract: a visual representation of the abstract.
ISSN: 2755-1857
DOI: 10.1177/27551857231204622