Research on Efficient Asymmetric Attention Module for Real-Time Semantic Segmentation Networks in Urban Scenes
Published in: Journal of Advanced Computational Intelligence and Intelligent Informatics, 2024-05, Vol. 28 (3), pp. 562-572
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Currently, numerous high-precision models have been proposed for semantic segmentation, but these models have large parameter counts and slow segmentation speeds. Real-time semantic segmentation for urban scenes necessitates a balance between accuracy, inference speed, and model size. In this paper, we present an efficient solution to this challenge, the efficient asymmetric attention module net (EAAMNet) for the semantic segmentation of urban scenes, which adopts an asymmetric encoder-decoder structure. The encoder part of the network uses efficient asymmetric attention modules to form the network backbone. For the decoder, we propose a lightweight multi-feature fusion decoder that maintains good segmentation accuracy with a small number of parameters. Our extensive evaluations demonstrate that EAAMNet achieves a favorable balance between segmentation efficiency, model parameters, and segmentation accuracy, making it highly suitable for real-time semantic segmentation in urban scenes. Remarkably, EAAMNet attains 73.31% mIoU at 128 fps on Cityscapes and 69.32% mIoU at 141 fps on CamVid without any pre-training. Compared with state-of-the-art models, our approach matches their parameter counts while improving both accuracy and speed.
ISSN: 1343-0130, 1883-8014
DOI: 10.20965/jaciii.2024.p0562
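
The abstract names an efficient asymmetric attention module as the backbone building block but gives no implementation details. Below is a minimal, hypothetical PyTorch sketch of what such a module could look like, assuming factorized (3x1/1x3) convolutions and a squeeze-and-excitation-style channel gate; the class name, the `reduction` parameter, and the internal structure are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch of an "efficient asymmetric attention module".
# Only the module's name appears in the abstract; the factorized 3x1/1x3
# convolutions and the channel-attention gate below are assumptions, chosen
# because they are common ingredients of lightweight real-time backbones.
import torch
import torch.nn as nn


class EfficientAsymmetricAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Asymmetric (factorized) convolutions: a 3x3 kernel split into
        # 3x1 and 1x3 passes, roughly halving the parameter count.
        self.conv3x1 = nn.Conv2d(channels, channels, (3, 1), padding=(1, 0))
        self.conv1x3 = nn.Conv2d(channels, channels, (1, 3), padding=(0, 1))
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)
        # Lightweight channel attention: global pooling followed by a
        # bottleneck that re-weights each channel in [0, 1].
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.act(self.bn(self.conv1x3(self.conv3x1(x))))
        out = out * self.attn(out)  # channel-wise re-weighting
        return out + x              # residual connection preserves gradients


if __name__ == "__main__":
    # Shape check on a Cityscapes-scale feature map.
    feats = torch.randn(1, 64, 128, 256)
    print(EfficientAsymmetricAttention(64)(feats).shape)  # torch.Size([1, 64, 128, 256])
```

Stacking such modules keeps the encoder small (factorized convolutions plus a pooled attention branch add few parameters), which is consistent with the parameter/speed/accuracy trade-off the abstract reports, though the paper's exact layer configuration is not given here.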