Effective detection of exposed target regions based on deep learning from multimedia data


Full Description

Saved in:
Bibliographic Details
Published in: Multimedia tools and applications 2020-06, Vol.79 (23-24), p.16609-16625
Main authors: Jang, Seok-Woo, Ahn, Byeongtae
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: With the development of high-performance visual sensors, it has become very easy to obtain a variety of image data. Among these data, human face regions contain personal information that distinguishes one person from another. It is therefore important to accurately detect unhidden face regions in an input image. This paper proposes a method for robustly detecting human face regions in an input color image using a deep learning algorithm, one of the machine learning approaches. The proposed method first converts the input image from the RGB color model to the YCbCr color model, and then segments skin regions by removing non-face regions with a pre-learned elliptical skin color distribution model. Subsequently, a CNN-based deep learning algorithm is applied to robustly detect human face regions within the skin regions detected in the previous step. As a result, the proposed method segments face regions more efficiently than an existing method. The face region detection method proposed in this paper is expected to be useful in practical areas of multimedia data processing, such as video surveillance, target blocking, image security, visual data analysis, and object recognition and tracking.
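The first two stages of the described pipeline can be sketched as follows: a standard ITU-R BT.601 RGB-to-YCbCr conversion, followed by a skin mask defined by an ellipse in the Cb-Cr plane. This is a minimal illustration only; the ellipse center and axes below are hypothetical placeholders, not the pre-learned distribution parameters from the paper, and the paper's subsequent CNN stage is not shown.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an H x W x 3 uint8 RGB image to YCbCr (ITU-R BT.601, full range)."""
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 -  0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 +  0.5      * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(ycbcr, center=(109.0, 152.0), axes=(25.0, 15.0)):
    """Boolean mask of pixels falling inside an ellipse in the Cb-Cr plane.

    `center` and `axes` are illustrative values; in the paper they would
    come from the pre-learned elliptical skin color distribution model.
    """
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return (((cb - center[0]) / axes[0]) ** 2
            + ((cr - center[1]) / axes[1]) ** 2) <= 1.0

# Example: a skin-toned pixel falls inside the ellipse, a blue pixel outside.
img = np.array([[[200, 150, 120], [0, 0, 255]]], dtype=np.uint8)
mask = skin_mask(rgb_to_ycbcr(img))
```

Pixels surviving the mask would then be grouped into candidate regions and passed to the CNN-based detector described in the paper.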
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-019-07832-6