Detection of salient events in large datasets of underwater video

Bibliographic Details
Main Authors: Gebali, A., Albu, A. B., Hoeberechts, M.
Format: Conference Paper
Language: English
Subjects:
Description
Summary: The aim of this work is to automatically detect events of interest, here defined as animal motion, in deep-sea videos and to use the detected events as the basis for creating video abstracts. Video is collected by seafloor cameras connected to a cabled observatory network, which provides power to the lights and sensors and enables two-way communication with the cameras. Continuous power and connectivity on the network permit high volumes of data to be collected. Such video data is important for marine biologists, who can remotely observe species in the deep-sea environment through scheduled recordings. Manually searching the video database for particular events of interest is extremely time-consuming for researchers; our study therefore focuses on the automatic detection of these events. Our approach is based on the Laptev spatio-temporal interest point detection method [1]. The output of the analysis is a summary video clip that contains all detected salient events with their associated start and end frames. We report experimental results on video abstraction using a database of videos from the NEPTUNE Canada cabled observatory.
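
To illustrate the kind of processing the abstract describes, the following is a minimal sketch of a Laptev-style spatio-temporal interest point (STIP) detector used to flag motion and derive per-event start and end frames. It assumes grayscale frames supplied as a (T, H, W) NumPy array; the function names, smoothing scales, and threshold are illustrative assumptions and are not the paper's actual implementation.

    import numpy as np
    from scipy import ndimage

    def stip_response(volume, sigma=2.0, tau=1.5, k=0.005):
        # Space-time Harris measure H = det(mu) - k * trace(mu)^3,
        # where mu is the smoothed spatio-temporal second-moment matrix.
        v = ndimage.gaussian_filter(volume.astype(np.float32) / 255.0,  # assumes 8-bit frames
                                    sigma=(tau, sigma, sigma))
        gt, gy, gx = np.gradient(v)  # gradients along t, y, x
        smooth = lambda a: ndimage.gaussian_filter(a, sigma=(2 * tau, 2 * sigma, 2 * sigma))
        xx, yy, tt = smooth(gx * gx), smooth(gy * gy), smooth(gt * gt)
        xy, xt, yt = smooth(gx * gy), smooth(gx * gt), smooth(gy * gt)
        det = (xx * (yy * tt - yt * yt)
               - xy * (xy * tt - yt * xt)
               + xt * (xy * yt - yy * xt))
        trace = xx + yy + tt
        return det - k * trace ** 3

    def event_intervals(volume, thresh=1e-4):
        # Flag frames whose strongest response exceeds the (illustrative)
        # threshold, then group consecutive flagged frames into
        # (start_frame, end_frame) events for the summary clip.
        response = stip_response(volume)
        active = (response > thresh).any(axis=(1, 2))
        labels, n = ndimage.label(active)
        return [(int(np.flatnonzero(labels == i)[0]),
                 int(np.flatnonzero(labels == i)[-1]))
                for i in range(1, n + 1)]

    # Example usage: frames decoded elsewhere into a (num_frames, height, width) array.
    # events = event_intervals(frames)

The detected intervals could then be concatenated into a single summary video, which mirrors the abstract's description of a clip containing all salient events with their start and end frames.
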
ISSN: 0197-7385
DOI: 10.1109/OCEANS.2012.6404996