Video sequence feature extraction and segmentation using likelihood regression model

Bibliographic Details
Published in: Multimedia Tools and Applications, 2021-07, Vol. 80 (16), pp. 24343-24361
Authors: Kumar, B. Satheesh; Seetharaman, K.
Format: Article
Language: English
Online access: Full text
Abstract: Advances in digital technology have made it easy for people to capture and share video. Increasingly, people record video footage rather than still images when they want to explore information. Retrieving a video from a large database, however, is challenging because of the continuous stream of frames involved. To overcome these challenges, this research proposes a likelihood-based regression approach for video processing. To improve the retrieval accuracy of video sequences, the proposed method integrates a likelihood estimation technique with a regression model. The likelihood estimate provides a coarse measure of the pixel level to estimate the pixel range, after which the regression step refines the pixel level and transforms blurred and unwanted pixels. In the proposed likelihood regression approach, each video is converted into frames that are stored in a database. Query frames are then matched against this database using the features extracted for the video to be retrieved. Simulation results show that the proposed likelihood-based regression model delivers strong retrieval performance and outperforms other state-of-the-art techniques.
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-021-10829-9
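
A minimal sketch of the retrieval pipeline the abstract outlines, written in Python: a video is decoded into frames, each frame's pixel intensities receive a rough likelihood estimate, a per-frame feature vector is stored in a simple in-memory database, and a query frame is matched by feature distance. The per-frame Gaussian intensity likelihood, the likelihood-weighted histogram feature, the Euclidean matching, and all file and function names are illustrative assumptions rather than the paper's actual formulation; the regression step that transforms blurred and unwanted pixels is not modelled here.

    # Illustrative sketch only; assumed models and names, not the paper's method.
    import cv2
    import numpy as np

    def extract_frames(video_path, step=30):
        """Decode every `step`-th frame of the video as a greyscale array."""
        cap = cv2.VideoCapture(video_path)
        frames, idx = [], 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if idx % step == 0:
                frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
            idx += 1
        cap.release()
        return frames

    def pixel_likelihood(frame):
        """Rough per-pixel likelihood under a Gaussian fitted to the frame's
        intensities (an assumed stand-in for the likelihood estimation step)."""
        mu, sigma = frame.mean(), frame.std() + 1e-6
        z = (frame.astype(np.float64) - mu) / sigma
        return np.exp(-0.5 * z ** 2) / (sigma * np.sqrt(2.0 * np.pi))

    def frame_feature(frame):
        """Likelihood-weighted grey-level histogram used as the frame feature
        (an assumed feature; the paper's features may differ)."""
        weights = pixel_likelihood(frame)
        hist, _ = np.histogram(frame, bins=64, range=(0, 255), weights=weights)
        return hist / (hist.sum() + 1e-12)

    def build_database(video_path):
        """Convert the video into frames and store one feature vector per sampled frame."""
        return {i: frame_feature(f) for i, f in enumerate(extract_frames(video_path))}

    def retrieve(database, query_frame, top_k=5):
        """Match a query frame against the stored features by Euclidean distance."""
        q = frame_feature(query_frame)
        scores = {i: np.linalg.norm(q - feat) for i, feat in database.items()}
        return sorted(scores, key=scores.get)[:top_k]

For example, db = build_database("clip.mp4") would index a (hypothetical) clip, and retrieve(db, query_frame) would return the indices of the stored frames whose features lie closest to the query.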