ANALYSIS OF THE SITUATION ON THE OBSERVED SCENE CONTAINING MANY MOVING OBJECTS

Bibliographic Details
Main Authors: Girenko, D. S., Kim, N. V.
Format: Conference Proceeding
Language: English
Online Access: Full text
Description
Abstract: In many processes, control is based on classifying the current states of objects of interest, or of whole situations, and selecting appropriate control actions. The presented work considers the task of classifying dynamic situations from an incoming video sequence, for example when monitoring traffic or observing the behavior of a crowd of people or animals, where the current situation must be assigned to one of several classes (abnormal, dangerous, etc.). Classifying such situations from separate static images of the observed scene can be difficult, physically unrealizable, or impractical. The aim of the study is to increase the efficiency of process management by using features (attributes) of dynamic situations for classification. Automatic classification of current dynamic situations in such processes allows timely measures to be organized to correct an undesirable development of the situation and/or to reduce the predicted losses. A technique is presented for classifying dynamic situations based on the analysis of motion parameters identified by a transformer network, with the subsequent classification implemented by a perceptron. As a demonstration example, the classification of street situations determined by people's behavior is investigated. Examples of the distinguished classes of situations are given, confirming that the proposed methodology can be implemented.
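
The abstract describes a pipeline in which a transformer network extracts motion parameters from the video sequence and a perceptron maps the resulting feature vector to a situation class. The paper's record gives no implementation details, so the following is only a minimal illustrative sketch of such a pipeline in PyTorch; the input shape, layer sizes, pooling strategy, and number of classes are assumptions, not the authors' configuration.

```python
# Minimal sketch: transformer-based motion-feature encoder followed by a
# perceptron (MLP) classifier. All dimensions (track length, motion-parameter
# size, number of situation classes) are illustrative assumptions.
import torch
import torch.nn as nn


class SituationClassifier(nn.Module):
    def __init__(self, motion_dim=4, d_model=64, n_heads=4, n_layers=2, n_classes=3):
        super().__init__()
        # Project per-frame motion parameters (e.g. x, y, vx, vy) to the model width.
        self.embed = nn.Linear(motion_dim, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        # Transformer encoder aggregates the temporal sequence of motion parameters.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Perceptron head assigns the pooled representation to a situation class
        # (e.g. abnormal, dangerous, etc.).
        self.head = nn.Sequential(
            nn.Linear(d_model, 32), nn.ReLU(), nn.Linear(32, n_classes)
        )

    def forward(self, motion_seq):
        # motion_seq: (batch, time_steps, motion_dim)
        h = self.encoder(self.embed(motion_seq))
        return self.head(h.mean(dim=1))  # average-pool over time, then classify


if __name__ == "__main__":
    model = SituationClassifier()
    dummy = torch.randn(2, 30, 4)   # 2 clips, 30 frames, 4 motion parameters each
    print(model(dummy).shape)       # -> torch.Size([2, 3]) class logits
```

The split mirrors the described method only at a high level: the transformer handles the temporal analysis of motion, while the lightweight perceptron performs the final classification of the situation.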
ISSN: 2194-9034; 1682-1750
DOI: 10.5194/isprs-archives-XLVIII-2-W3-2023-65-2023