Spatiotemporal Clustering of Parking Lots at the City Level for Efficiently Sharing Occupancy Forecasting Models

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2023-05, Vol. 23 (11), p. 5248
Authors: Mufida, Miratul Khusna; Ait El Cadi, Abdessamad; Delot, Thierry; Trépanier, Martin; Zekri, Dorsaf
Format: Article
Language: English
Description
Abstract: This study aims to address the challenge of developing accurate and efficient parking occupancy forecasting models at the city level for autonomous vehicles. Although deep learning techniques have been successfully employed to develop such models for individual parking lots, it is a resource-intensive process that requires significant amounts of time and data for each parking lot. To overcome this challenge, we propose a novel two-step clustering technique that groups parking lots based on their spatiotemporal patterns. By identifying the relevant spatial and temporal characteristics of each parking lot (parking profile) and grouping them accordingly, our approach allows for the development of accurate occupancy forecasting models for a set of parking lots, thereby reducing computational costs and improving model transferability. Our models were built and evaluated using real-time parking data. The obtained correlation rates of 86% for the spatial dimension, 96% for the temporal one, and 92% for both demonstrate the effectiveness of the proposed strategy in reducing model deployment costs while improving model applicability and transfer learning across parking lots.
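This record gives no implementation details, so the following Python sketch is only an illustration of the two-step idea summarized above: synthetic parking lots are grouped first by their temporal occupancy profile and then, within each temporal group, by spatial proximity, so that one forecasting model could be shared per group. The use of K-means, the synthetic profiles, and the cluster counts are assumptions for the sketch, not the paper's actual method or data.

```python
# Illustrative two-step spatiotemporal clustering of parking lots.
# Assumptions (not from the paper): K-means, synthetic occupancy profiles,
# two temporal and two spatial clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Synthetic "parking profiles": 40 lots, average hourly occupancy over a day
# (temporal dimension) plus hypothetical lot coordinates (spatial dimension).
n_lots = 40
hours = np.arange(24)
base_patterns = np.stack([
    np.exp(-((hours - 9) ** 2) / 8),    # morning-peak lots
    np.exp(-((hours - 18) ** 2) / 8),   # evening-peak lots
])
temporal_profiles = (
    base_patterns[rng.integers(0, 2, n_lots)]
    + 0.05 * rng.standard_normal((n_lots, 24))
)
coordinates = rng.uniform(0, 10, size=(n_lots, 2))

# Step 1: group lots by temporal occupancy pattern.
temporal_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    temporal_profiles
)

# Step 2: within each temporal group, sub-cluster by spatial proximity,
# yielding spatiotemporal groups that could each share one forecasting model.
groups = {}
for t_label in np.unique(temporal_labels):
    idx = np.where(temporal_labels == t_label)[0]
    spatial_labels = KMeans(
        n_clusters=min(2, len(idx)), n_init=10, random_state=0
    ).fit_predict(coordinates[idx])
    for lot, s_label in zip(idx, spatial_labels):
        groups[lot] = (int(t_label), int(s_label))

print(sorted(set(groups.values())))  # the resulting (temporal, spatial) groups
```

In the setting described by the abstract, one model trained per such group would replace per-lot models, which is what reduces training cost and supports transfer across parking lots.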
ISSN: 1424-8220
DOI: 10.3390/s23115248