Online classification of visual tasks for industrial workflow monitoring
Published in: Neural Networks, 2011-10, Vol. 24 (8), pp. 852-860
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Modelling and classification of time series stemming from visual workflows is a very challenging problem due to the inherent complexity of the activity patterns involved and the difficulty in tracking moving targets. In this paper, we propose a framework for classification of visual tasks in industrial environments. We propose a novel method to automatically segment the input stream and to classify the resulting segments using prior knowledge and hidden Markov models (HMMs), combined through a genetic algorithm. We compare this method to an echo state network (ESN) approach, which is appropriate for general-purpose time-series classification. In addition, we explore the applicability of several fusion schemes for multi-camera configurations in order to mitigate the problem of limited visibility and occlusions. The performance of the suggested approaches is evaluated on real-world visual behaviour scenarios.
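As an illustration of the segment-classification step described in the abstract, the sketch below trains one Gaussian HMM per task and labels an unseen segment by maximum (prior-weighted) log-likelihood. It is a minimal sketch, not the paper's method: the `hmmlearn` dependency, the number of hidden states, and the `log_priors` stand-in for prior knowledge are assumptions, and the genetic-algorithm combination is omitted.

```python
# Minimal sketch (not the authors' implementation): per-task Gaussian HMMs
# trained on tracked-feature sequences; an unknown segment is assigned to the
# task whose HMM gives the highest log-likelihood, optionally biased by a
# simple log-prior stand-in for prior knowledge.
import numpy as np
from hmmlearn import hmm  # assumed third-party dependency


def train_task_hmms(train_segments, n_states=5):
    """train_segments: dict mapping task label -> list of (T_i, D) feature arrays."""
    models = {}
    for task, seqs in train_segments.items():
        X = np.vstack(seqs)                      # stack all sequences for this task
        lengths = [len(s) for s in seqs]         # per-sequence lengths for hmmlearn
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[task] = m
    return models


def classify_segment(models, segment, log_priors=None):
    """Return the task label maximising log P(segment | HMM_task) (+ optional log prior)."""
    scores = {}
    for task, m in models.items():
        score = m.score(segment)                 # log-likelihood of the (T, D) segment
        if log_priors is not None:
            score += log_priors.get(task, 0.0)   # hypothetical prior-knowledge term
        scores[task] = score
    return max(scores, key=scores.get)
```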
Highlights:
- We present a framework for online activity recognition in a complex industrial environment.
- We provide a novel method to automatically segment the input stream and classify segments.
- We propose GA–HMM: a hidden Markov model (HMM) combined with prior knowledge through a genetic algorithm (GA).
- GA–HMM outperforms an echo state network (ESN) in online recognition rates.
- Employing fusion schemes for multiple camera streams can improve accuracy.
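For the ESN baseline mentioned above, a minimal NumPy sketch is given below: a fixed random reservoir turns each variable-length segment into a fixed-size state feature, and a ridge-regression readout maps that feature to a task label. The reservoir size, spectral radius, and mean-state pooling are illustrative assumptions, not the configuration evaluated in the paper.

```python
# Minimal echo state network (ESN) classifier sketch in plain NumPy, for
# illustration only; all hyperparameters below are assumed values.
import numpy as np

rng = np.random.default_rng(0)


def make_reservoir(n_in, n_res=200, spectral_radius=0.9):
    """Create random input and recurrent weights, scaled to the desired spectral radius."""
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W


def reservoir_state(W_in, W, seq):
    """Run a (T, n_in) sequence through the reservoir; return the mean state as a fixed-size feature."""
    x = np.zeros(W.shape[0])
    states = []
    for u in seq:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x)
    return np.mean(states, axis=0)


def train_readout(features, labels, n_classes, ridge=1e-3):
    """Ridge-regression readout from reservoir features to one-hot class targets."""
    Phi = np.asarray(features)                       # (N, n_res) pooled reservoir states
    Y = np.eye(n_classes)[labels]                    # (N, n_classes) one-hot targets
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ Y)             # (n_res, n_classes) readout weights


def predict(W_in, W, W_out, seq):
    """Classify a single (T, n_in) segment with the trained readout."""
    return int(np.argmax(reservoir_state(W_in, W, seq) @ W_out))
```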
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/j.neunet.2011.06.001