Underwater Fish Tracking for Moving Cameras Based on Deformable Multiple Kernels
Published in: IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2017-09, Vol. 47, No. 9, pp. 2467-2477
Main Authors:
Format: Article
Language: English
Online Access: Order full text
Abstract: Fishery surveys that call for the use of single or multiple underwater cameras are an emerging, nonextractive means of estimating the abundance of fish stocks. Tracking live fish in an open aquatic environment poses challenges different from those of general pedestrian or vehicle tracking in surveillance applications. In many rough habitats, fish are monitored by cameras installed on moving platforms, where tracking is even more challenging because background models are inapplicable. In this paper, a novel tracking algorithm based on deformable multiple kernels is proposed to address these challenges. Inspired by the deformable part model technique, a set of kernels is defined to represent the holistic object and several parts arranged in a deformable configuration. Color histograms, texture histograms, and histograms of oriented gradients (HOGs) are extracted to serve as object features. Kernel motion is efficiently estimated by the mean-shift algorithm on color and texture features to realize tracking. Furthermore, the HOG-feature deformation costs are adopted as soft constraints on kernel positions to maintain the part configuration. Experimental results on a practical video set from underwater moving cameras show the reliable performance of the proposed method at much lower computational cost compared with state-of-the-art techniques.
ISSN: 2168-2216, 2168-2232
DOI: 10.1109/TSMC.2016.2523943
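The abstract above describes a multi-kernel tracking loop: mean shift moves each kernel toward the mode of its feature back projection, while a deformation cost pulls the part kernels toward their expected offsets from the root kernel. The Python sketch below illustrates that idea under simplifying assumptions, not the authors' implementation: it uses only a hue histogram per kernel (rather than the paper's color, texture, and HOG features), OpenCV's built-in mean shift, and a simple blending weight `deform_weight` as a stand-in for the soft deformation constraint. The function names, window layout, and weight are illustrative choices.

```python
import cv2

def make_hue_model(frame_bgr, window, bins=32):
    """Build a normalized hue histogram from `window` = (x, y, w, h);
    this serves as the kernel's appearance model (one feature, for brevity)."""
    x, y, w, h = window
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv[y:y + h, x:x + w]], [0], None, [bins], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def shift_kernel(frame_bgr, model, window):
    """Move one kernel window with mean shift on the model's back projection."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], model, [0, 180], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    _, window = cv2.meanShift(backproj, window, criteria)
    return window

def track_kernels(frame_bgr, root_model, root_win, part_models, part_wins,
                  offsets, deform_weight=0.5):
    """Update the root kernel and each part kernel by mean shift, then softly
    pull every part toward (root + offset), mimicking a deformation cost."""
    root_win = shift_kernel(frame_bgr, root_model, root_win)
    new_parts = []
    for model, win, (dx, dy) in zip(part_models, part_wins, offsets):
        x, y, w, h = shift_kernel(frame_bgr, model, win)
        ax, ay = root_win[0] + dx, root_win[1] + dy  # anchor relative to root
        # Soft constraint: blend mean-shift evidence with the anchor position
        x = int(round((1 - deform_weight) * x + deform_weight * ax))
        y = int(round((1 - deform_weight) * y + deform_weight * ay))
        new_parts.append((x, y, w, h))
    return root_win, new_parts
```

In use, the models would be built once on the first frame with `make_hue_model` and `track_kernels` called on each subsequent frame; raising `deform_weight` trades mean-shift evidence for a tighter part configuration, loosely mirroring the role of the paper's deformation costs.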