A low complexity convolutional neural network for feature extraction of satellite images used in path loss prediction

Bibliographic Details
Main Authors: Sani, Usman Sammani; Lai, Daphne T. C.; Malik, Owais Ahmed
Format: Conference Proceeding
Language: English
Online Access: Full text
Description
Summary: Determination of localized path loss values in an environment is vital to the design and upgrade of wireless communication networks. Because path loss values depend on the environment, Convolutional Neural Networks (CNNs) are trained with satellite images to map environmental characteristics to path loss values. Pretrained CNN models are often used, but they take a long time to train and require a large amount of memory. We developed a deep learning architecture composed of a low complexity CNN that extracts features from satellite images and an XGBoost regressor that maps a combination of the extracted features and several numerical features to path loss values. The CNN used for feature extraction consists of five convolutional layers, with the number of filters and kernel size of each layer obtained through Bayesian hyperparameter optimization. The developed CNN achieved accuracy comparable to the best of the pretrained models, based on a significant difference test, and was better in terms of training time. The single model was developed on a dataset of measurements from multiple environments (rural, suburban, urban, and urban high-rise) at multiple frequencies and antenna heights. The model's prediction error in terms of RMSE was below known standard threshold values across all frequencies and environments.
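The abstract describes the pipeline only at a high level. The following is a minimal sketch of that data flow in Python, assuming Keras and the xgboost package; the input shape, filter counts, kernel sizes, and regressor parameters are hypothetical placeholders (the paper obtains the per-layer filter counts and kernel sizes via Bayesian hyperparameter optimization, which is not reproduced here), and the CNN here is untrained, purely to illustrate how image features are combined with numerical features.

import numpy as np
from tensorflow.keras import layers, models
from xgboost import XGBRegressor

def build_feature_extractor(input_shape=(256, 256, 3)):
    # Five convolutional layers, matching the count stated in the abstract;
    # the filter counts and kernel sizes here are placeholders, whereas the
    # paper tunes each layer's values via Bayesian optimization.
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),  # fixed-length image feature vector
    ])

# Placeholder data: satellite image patches, numerical features (e.g.
# frequency and antenna heights), and measured path loss targets in dB.
images = np.random.rand(32, 256, 256, 3).astype("float32")
numeric = np.random.rand(32, 4).astype("float32")
path_loss = (np.random.rand(32) * 100).astype("float32")

extractor = build_feature_extractor()
img_features = extractor.predict(images, verbose=0)

# Concatenate CNN image features with the numerical features and fit the
# XGBoost regressor that maps the combination to path loss values.
X = np.concatenate([img_features, numeric], axis=1)
regressor = XGBRegressor(n_estimators=200, max_depth=6)
regressor.fit(X, path_loss)
predictions = regressor.predict(X)

In the paper the CNN would be trained before its features are reused, and the extracted features would then be combined with measured inputs rather than random arrays; the sketch above only shows the shape of the two-stage architecture.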
ISSN: 0094-243X; 1551-7616
DOI: 10.1063/5.0110889