Automatic generation of difficulty maps for datasets using neural network


Bibliographic details
Published in: Multimedia Tools and Applications, 2024-01, Vol. 83 (25), pp. 66499-66516
Authors: Sanches, Silvio Ricardo Rodrigues; Custódio Junior, Elton; Corrêa, Cléber Gimenez; Oliveira, Claiton; Freire, Valdinei; Saito, Priscila Tiemi Maeda; Bugatti, Pedro Henrique
Format: Article
Language: English
Online access: Full text
Description
Abstract: When selecting videos to compose a change detection dataset, the videos' difficulty level can be taken into account. Estimating these levels requires difficulty maps, which store values representing the difficulty level of each pixel. The problem is that ground truth is needed to generate a difficulty map, and generating the ground truth requires manually labeling the pixels of the frames. Identifying a video's difficulty level before producing its ground truth allows researchers to select videos based on that information and then generate ground truths only for videos with different difficulty levels. Datasets containing videos with different difficulty levels can evaluate an algorithm more adequately. In this research, we developed a method to generate the difficulty maps of a video without using its ground truth. Our method uses the videos and ground truths of the CDNet 2014 dataset to generate difficulty maps on which a pix2pix neural network is trained. The results showed that the trained network can generate difficulty maps similar to those produced by the traditional approach.
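The abstract describes difficulty maps only at a high level. One common way to quantify per-pixel difficulty from ground truth is the fraction of change detection algorithms that misclassify each pixel; the NumPy sketch below illustrates that idea. The function name, the binary-mask convention, and the error-averaging scheme are assumptions for illustration, not necessarily the formulation used in the paper.

```python
import numpy as np

def difficulty_map(ground_truth: np.ndarray, predictions: list[np.ndarray]) -> np.ndarray:
    """Per-pixel difficulty as the fraction of algorithms that
    misclassify the pixel (a hypothetical formulation; the paper's
    exact definition is not given in this record).

    ground_truth: binary mask (H, W), 1 = change, 0 = background.
    predictions:  binary masks (H, W) produced by several change
                  detection algorithms on the same frame.
    """
    # Boolean error indicator per algorithm, stacked to shape (N, H, W).
    errors = np.stack([pred != ground_truth for pred in predictions])
    # Average over algorithms: 0.0 = every algorithm classified the
    # pixel correctly, 1.0 = every algorithm misclassified it.
    return errors.mean(axis=0)

# Toy 2x2 example with outputs from two hypothetical algorithms.
gt = np.array([[1, 0], [0, 1]])
preds = [np.array([[1, 0], [1, 1]]),
         np.array([[0, 0], [1, 1]])]
print(difficulty_map(gt, preds))  # [[0.5 0. ] [1.  0. ]]
```

Frame/difficulty-map pairs produced this way would then serve as the input and target images for pix2pix training, so that the trained generator can predict a difficulty map from a frame whose ground truth was never labeled.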
ISSN: 1380-7501 (print), 1573-7721 (electronic)
DOI: 10.1007/s11042-024-18271-3