Reconstruction of Three-Dimensional (3D) Indoor Interiors with Multiple Stories via Comprehensive Segmentation

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), 2018-08, Vol. 10 (8), p. 1281
Main Authors: Li, Lin, Su, Fei, Yang, Fan, Zhu, Haihong, Li, Dalin, Zuo, Xinkai, Li, Feng, Liu, Yu, Ying, Shen
Format: Article
Language: English
Online access: Full text
Description
Abstract: The fast and stable reconstruction of building interiors from scanned point clouds has recently attracted considerable research interest. However, reconstructing long corridors and connected areas across multiple floors has emerged as a substantial challenge. This paper presents a comprehensive segmentation method for reconstructing a three-dimensional (3D) indoor structure with multiple stories. With this method, the over-segmentation that usually occurs when reconstructing long corridors in a complex indoor environment is overcome by morphologically eroding the floor space to segment rooms and by overlapping the segmented room-space with cells partitioned via extracted wall lines. Such segmentation ensures both the integrity of the room-space partitions and the geometric regularity of the rooms. For spaces that span floors in a multistory building, a peak-nadir-peak strategy in the distribution of points along the z-axis is proposed to extract connected areas across multiple floors. Experiments on seven real-world 3D scans and eight synthetic models of indoor environments show the effectiveness and feasibility of the proposed method.
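
The two techniques named in the abstract can be sketched briefly. Neither sketch is the authors' implementation; both assume a Python environment with NumPy and SciPy, and every function name, grid resolution, and threshold below is an illustrative assumption.

First, room segmentation by morphological erosion: eroding a 2D binary occupancy grid of the floor space breaks the narrow openings that join rooms to corridors, so each room survives as a separate connected component rather than merging with the corridor.

    import numpy as np
    from scipy import ndimage

    def segment_rooms(floor_mask, erosion_radius=5):
        """Split a 2D boolean floor-space occupancy mask into room seeds.

        Illustrative only: the paper further overlaps these seeds with
        cells partitioned by extracted wall lines, so that the final
        rooms keep their geometric regularity.
        """
        # Square structuring element; its radius (in grid cells) must
        # exceed half the width of door openings, so the erosion removes
        # the narrow passages that join rooms to corridors.
        structure = np.ones((2 * erosion_radius + 1,) * 2, dtype=bool)
        eroded = ndimage.binary_erosion(floor_mask, structure=structure)
        # Each surviving connected component is one room seed.
        labels, num_rooms = ndimage.label(eroded)
        return labels, num_rooms

Second, the peak-nadir-peak strategy: horizontal slabs (floors and ceilings) accumulate scan points at particular heights, so the histogram of point z-values shows one peak per slab; the sparsest bin (nadir) between two consecutive peaks marks the inter-story gap, and points falling there (for example, on a staircase) indicate an area connecting the two floors. A minimal sketch, with bin size and peak threshold chosen for illustration:

    import numpy as np
    from scipy.signal import find_peaks

    def peak_nadir_peak(z_values, bin_size=0.1, min_peak_fraction=0.2):
        """Find slab peaks along the z-axis and the nadirs between them."""
        edges = np.arange(z_values.min(), z_values.max() + bin_size, bin_size)
        hist, edges = np.histogram(z_values, bins=edges)
        # Peaks: bins where points pile up on horizontal slabs
        # (floors and ceilings), one cluster of peaks per story.
        peaks, _ = find_peaks(hist, height=min_peak_fraction * hist.max())
        # Nadirs: the sparsest bin between consecutive peaks; points that
        # fall into a nadir hint at a space that spans the two stories.
        nadirs = [int(lo + np.argmin(hist[lo:hi + 1]))
                  for lo, hi in zip(peaks[:-1], peaks[1:])]
        # Return the bin-center heights of the peaks and nadirs.
        return edges[peaks] + bin_size / 2, edges[nadirs] + bin_size / 2
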
ISSN: 2072-4292
DOI: 10.3390/rs10081281