Visual concept learning system based on lexical elements and feature key points conjunction



Bibliographic Details
Published in: Nauchno-tekhnicheskiĭ vestnik informat͡sionnykh tekhnologiĭ, mekhaniki i optiki, 2016-07, Vol.16 (4), p.689-696
Main Authors: Filatov, V.I., Potapov, A.S.
Format: Article
Language: English
Online Access: Full text
Description
Summary: Subject of Research. The paper deals with the process of visual concept building based on two unlabeled sources of information (visual and textual). Method. Visual concept-based learning is carried out by simultaneous conjunction of image patterns and lexical elements. Concept-based learning consists of two basic stages: early learning acquisition (primary learning) and lexical-semantic learning (secondary learning). In the early learning acquisition stage, a visual concept dictionary is created, providing the background for the next stage. The lexical-semantic learning stage performs timeline analysis of the two sources and extracts features in both information channels. Feature vectors are formed by extraction of separate information units in each channel. Mutual information between the two sources serves as the criterion for visual concept building. Main Results. A visual concept-based learning system has been developed; it uses video data with subtitles. The results of the research have shown the principal ability of the system to build visual concepts. Practical Relevance. Recommended application areas of the described system are object detection, image retrieval, and automatic building of visual concept-based data.
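The abstract names mutual information between the visual and lexical channels as the concept-building criterion. A minimal sketch of that idea, assuming co-occurring (visual pattern, lexical element) samples are extracted from the aligned timeline (the pairing scheme and identifiers here are illustrative, not the authors' implementation):

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Estimate I(V; L) in bits from a list of co-occurring
    (visual_pattern, lexical_element) samples, using empirical
    joint and marginal frequencies."""
    n = len(pairs)
    joint = Counter(pairs)                      # p(v, l) counts
    visual = Counter(v for v, _ in pairs)       # p(v) counts
    lexical = Counter(l for _, l in pairs)      # p(l) counts
    mi = 0.0
    for (v, l), c in joint.items():
        p_vl = c / n
        mi += p_vl * log2(p_vl / ((visual[v] / n) * (lexical[l] / n)))
    return mi

# Perfectly aligned channels: each visual pattern always co-occurs
# with one lexical element, so MI equals the 1-bit channel entropy.
aligned = [("cat_patch", "cat"), ("dog_patch", "dog")]
print(mutual_information(aligned))   # 1.0

# Independent channels carry no shared information.
independent = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
print(mutual_information(independent))   # 0.0
```

Under this criterion, a candidate pairing of an image pattern with a lexical element is accepted as a visual concept when it contributes high mutual information, i.e. the two units co-occur far more often than chance.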
ISSN:2226-1494
2500-0373
DOI:10.17586/2226-1494-2016-16-4-689-696