Image segmentation of underfloor scenes using a mask regions convolutional neural network with two-stage transfer learning
Published in: Automation in Construction, 2020-05, Vol. 113, p. 103118, Article 103118
Format: Article
Language: English
Online access: Full text
Abstract: Enclosed spaces are common in built structures but pose a challenge to many forms of manual or robotic surveying and maintenance tasks. Part of this challenge is to train robot systems to understand their environment without human intervention. This paper presents a method to automatically classify features within a closed void using deep learning. Specifically, the paper considers a robot placed under floorboards for the purpose of autonomously surveying the underfloor void. The robot uses images captured with an RGB camera to identify regions such as floorboards, joists, air vents and pipework. The paper first presents a standard mask regions convolutional neural network approach, which gives modest performance. The method is then enhanced using a two-stage transfer learning approach with an existing dataset for interior scenes. The conclusion from this work is that, even with limited training data, it is possible to automatically detect many common features of such areas.
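As a rough illustration of the two-stage transfer learning described in the abstract, the sketch below fine-tunes a COCO-pretrained Mask R-CNN from torchvision first on a generic interior-scene dataset and then on the underfloor images. The class list, data-loader format, epoch counts and learning rates are assumptions for illustration only, not the authors' configuration.

```python
# A minimal sketch of the two-stage transfer-learning idea from the abstract,
# built on torchvision's Mask R-CNN (torchvision >= 0.13 for the `weights` API).
# Class lists, epoch counts and learning rates are illustrative assumptions,
# not the authors' settings; the data loaders must yield (images, targets)
# in the standard torchvision detection format.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

# Assumed underfloor categories, taken from the features listed in the abstract.
UNDERFLOOR_CLASSES = ["floorboard", "joist", "air_vent", "pipe"]


def replace_heads(model, num_classes):
    """Swap in fresh box/mask heads sized for `num_classes` (including background)."""
    in_feats = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, num_classes)
    in_ch = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_ch, 256, num_classes)
    return model


def fine_tune(model, loader, epochs, lr, device):
    """One fine-tuning stage: plain SGD over the summed detection/mask losses."""
    model.to(device).train()
    params = [p for p in model.parameters() if p.requires_grad]
    opt = torch.optim.SGD(params, lr=lr, momentum=0.9, weight_decay=5e-4)
    for _ in range(epochs):
        for images, targets in loader:
            images = [im.to(device) for im in images]
            targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
            loss = sum(model(images, targets).values())  # dict of losses in train mode
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


def two_stage_transfer(interior_loader, interior_num_classes, underfloor_loader, device):
    """Stage 1: adapt a COCO-pretrained model to a generic interior-scene dataset.
    Stage 2: re-head for the underfloor classes and fine-tune on the small underfloor set."""
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    model = replace_heads(model, interior_num_classes)
    model = fine_tune(model, interior_loader, epochs=10, lr=5e-3, device=device)

    model = replace_heads(model, 1 + len(UNDERFLOOR_CLASSES))
    model = fine_tune(model, underfloor_loader, epochs=20, lr=1e-3, device=device)
    return model
```

Re-initialising the prediction heads between stages while keeping the backbone weights is one common way to carry interior-scene features over to a much smaller target dataset; whether the authors freeze or re-train particular layers is not stated in this record.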
Highlights:
• A method for semantic segmentation in closed spaces.
• A database of underfloor images is captured and segmented using a Mask R-CNN.
• Two-stage transfer learning and other optimisations are presented to maximise accuracy.
• Case-study application to the detection of floorboards, joists, vents, pipes, etc. for automated insulation spraying.
• Mapping of segmented data labels to 3D point cloud data (sketched below).
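The final highlight concerns mapping segmented labels onto 3D point cloud data. Below is a minimal sketch of one common way to do this, assuming a pinhole camera model: each 3D point expressed in the camera frame is projected into the image and inherits the class of the mask pixel it lands on. The function, intrinsics and mask format are assumptions for illustration; the paper's actual mapping procedure may differ.

```python
# Hedged sketch: assign segmentation class labels to 3D points by projecting
# each point into the camera image with a pinhole model. Intrinsics (fx, fy,
# cx, cy) and the integer class-mask format are illustrative assumptions.
import numpy as np


def label_point_cloud(points_cam: np.ndarray,
                      class_mask: np.ndarray,
                      fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Assign a class id to each 3D point expressed in the camera frame.

    points_cam : (N, 3) array of XYZ points, Z along the optical axis.
    class_mask : (H, W) integer image, 0 = background, k = class k.
    Returns an (N,) array of class ids; -1 for points behind the camera
    or projecting outside the image.
    """
    h, w = class_mask.shape
    labels = np.full(points_cam.shape[0], -1, dtype=int)

    z = points_cam[:, 2]
    in_front = z > 1e-6                                  # ignore points behind the camera
    safe_z = np.where(in_front, z, 1.0)                  # avoid division by zero
    u = fx * points_cam[:, 0] / safe_z + cx
    v = fy * points_cam[:, 1] / safe_z + cy

    ui = np.round(u).astype(int)
    vi = np.round(v).astype(int)
    valid = in_front & (ui >= 0) & (ui < w) & (vi >= 0) & (vi < h)
    labels[valid] = class_mask[vi[valid], ui[valid]]
    return labels
```

In practice the points would first be transformed from the robot or world frame into the camera frame with the camera extrinsics before applying this projection.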
ISSN: 0926-5805, 1872-7891
DOI: 10.1016/j.autcon.2020.103118