Vision Transformers for Anomaly Classification and Localization in Optical Networks Using SOP Spectrograms
Published in: | Journal of Lightwave Technology, 2024-12, pp. 1-13 |
---|---|
Main authors: | , , , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Monitoring the state of polarization (SOP) in optical communication networks is vital for maintaining network reliability and performance. SOP data, influenced by environmental factors, presents significant challenges for conventional methods due to its multidimensional nature and susceptibility to noise. Machine learning (ML) algorithms provide a promising solution by effectively learning complex patterns in SOP data, thereby enhancing anomaly detection capabilities. In this paper, we introduce an enhanced vision-transformer-based approach for anomaly classification and localization in SOP data. Our method leverages spectrograms derived from continuous SOP measurements and has been validated on experimental data from a 2600 km bidirectional link. The proposed approach achieves an accuracy of 99% and a timestamping precision with a root mean square error (RMSE) of 7 ms. Comparative evaluations against several ML baselines underscore the superiority of our method, particularly in accurately localizing SOP transients within spectrograms and in handling overlapping events, although such events are treated as single combined events. These results reaffirm the efficacy of our approach in improving anomaly classification and localization capabilities in optical networks. |
ISSN: | 0733-8724 |
DOI: | 10.1109/JLT.2024.3519755 |
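
The abstract notes that the method operates on spectrograms derived from continuous SOP measurements and localizes transients along the time axis. As a rough illustration only, and not the authors' actual pipeline, the sketch below builds a toy spectrogram from a synthetic Stokes-parameter trace; the sampling rate, window lengths, and frequency threshold are placeholder assumptions, and the vision-transformer classifier itself is not shown.

```python
import numpy as np
from scipy.signal import spectrogram

# Placeholder parameters (not from the paper), chosen for illustration only.
fs = 10_000                     # assumed SOP sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic Stokes-parameter trace s1(t): slow drift plus a short burst,
# standing in for a real SOP transient.
s1 = 0.8 * np.cos(2 * np.pi * 0.5 * t)
burst = (t > 1.0) & (t < 1.01)
s1[burst] += 0.5 * np.sin(2 * np.pi * 800 * t[burst])

# Short-time Fourier analysis turns the 1-D SOP trace into a 2-D
# time-frequency image that an image model (e.g., a vision transformer)
# can classify; the time axis also provides a handle for localization.
f, tt, Sxx = spectrogram(s1, fs=fs, nperseg=256, noverlap=192)
log_spec = 10 * np.log10(Sxx + 1e-12)   # log scale for dynamic range

# Crude localization for the demo: pick the time bin with the most
# high-frequency energy (a stand-in for the model's timestamp output).
hi_band = f > 200                        # Hz, arbitrary threshold
event_bin = np.argmax(log_spec[hi_band].sum(axis=0))
print(f"spectrogram shape: {log_spec.shape}, event near t = {tt[event_bin]:.3f} s")
```

In an image-model setting, `log_spec` would be resized or patched and passed to a classifier; the energy-based localization above is only a stand-in for the timestamping the paper attributes to its transformer.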