3D MODEL FOR INDOOR SPACES USING DEPTH SENSOR

Bibliographic Details
Main Authors: Mukhtar, N. F., Azri, S., Ujang, U., Cuétara, M. G., Retortillo, G. M., Mohd Salleh, S.
Format: Conference Proceedings
Language: English
Online Access: Full text
Abstract: In recent years, 3D models of indoor spaces have become highly in demand as technology develops. Many approaches to 3D visualisation and modelling, especially for indoor environments, have been developed, such as laser scanning, photogrammetry, computer vision, and image-based methods. However, most of these techniques rely on the experience of the operator to obtain the best results. In addition, the equipment is quite expensive and the processing is time-consuming. This paper focuses on the data acquisition and visualisation of a 3D model of an indoor space using a depth sensor. In this study, the EyesMap3D Pro by Ecapture is used to collect 3D data of the indoor spaces. The EyesMap3D Pro depth sensor is able to generate 3D point clouds at high speed and with high mobility thanks to the portability and light weight of the device. However, careful attention must be paid to the acquisition, processing, visualisation, and evaluation of the depth sensor data. Hence, this paper discusses the data processing pipeline, from extracting features from 3D point clouds to building 3D indoor models. Afterwards, the 3D models are evaluated to ensure their suitability for indoor modelling and indoor mapping applications. In this study, the 3D model was exported to a 3D GIS-ready format to display and store more information about the indoor spaces.
ISSN: 2194-9034; 1682-1750
DOI: 10.5194/isprs-archives-XLII-4-W16-471-2019
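The abstract outlines a pipeline (point cloud acquisition, feature extraction, model construction, export to a GIS-ready format) without naming its software. As a minimal sketch of the feature-extraction step, the following Python example uses the open-source Open3D library to peel the dominant planar surfaces (floor, ceiling, walls) off a depth-sensor point cloud with RANSAC plane segmentation; both the choice of Open3D and the input file name are assumptions, not details from the paper.

import open3d as o3d

# Load the exported point cloud (hypothetical file name) and thin it out
# with a 2 cm voxel grid to speed up segmentation.
pcd = o3d.io.read_point_cloud("indoor_scan.ply")
pcd = pcd.voxel_down_sample(voxel_size=0.02)

# Iteratively extract the largest planar segments (floor, ceiling, walls).
planes = []
remaining = pcd
for _ in range(6):  # up to six dominant planes
    if len(remaining.points) < 1000:
        break
    model, inliers = remaining.segment_plane(
        distance_threshold=0.02,  # 2 cm tolerance to the fitted plane
        ransac_n=3,
        num_iterations=1000,
    )
    planes.append(remaining.select_by_index(inliers))
    remaining = remaining.select_by_index(inliers, invert=True)
    a, b, c, d = model
    print(f"plane {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0 "
          f"with {len(inliers)} inlier points")

# Each entry in `planes` is a candidate surface that a later modelling step
# could convert into polygons for a 3D GIS-ready format.

Turning the extracted planes into room geometry and exporting it to a GIS-ready format (for example CityGML or a geodatabase) would follow as separate steps, as the abstract describes.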