Block-Wise Background Subtraction Based on Gaussian Mixture Models
Published in: | Applied Mechanics and Materials, 2014-01, Vol. 490-491 (Mechanical Design and Power Engineering), p. 1221-1227 |
Main authors: | , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Background subtraction allows a moving object to be distinguished in a video sequence and enables higher levels of video processing. It is an essential step in many video processing applications; its simplest form, differencing consecutive frames, is very fast and easy but is not suitable for complex scenes. In this article we introduce a method that reduces false foreground detections. The proposed method models the background with a mixture of Gaussians whose parameters are updated automatically over time. Previously introduced methods typically operate on individual pixels and ignore neighboring pixels when updating the background model. Our method does not rely on a single pixel; instead it processes a block of pixels so that every pixel within the block contributes to the model. Experimental results show considerable improvement: the proposed method quickly models the background without morphological filtering for surveillance cameras in both indoor and outdoor settings, and copes with lighting changes, repetitive motion from clutter, and scene changes. The moving objects extracted from the scene can then be used for real-time tracking and recognition applications. |
ISSN: | 1660-9336, 1662-7482 |
DOI: | 10.4028/www.scientific.net/AMM.490-491.1221 |
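
The abstract above describes the approach only at a high level. The following Python sketch is one possible way to realize block-wise background subtraction with a Gaussian mixture model, not the authors' implementation: it assumes OpenCV's MOG2 subtractor as the per-pixel GMM and a simple per-block voting rule, and the block size, thresholds, and video path are illustrative choices.

```python
# Hypothetical sketch of block-wise GMM background subtraction.
# Assumptions (not from the paper): OpenCV MOG2 as the per-pixel mixture model,
# a majority-style vote inside each block, and illustrative parameter values.
import cv2
import numpy as np

BLOCK = 16        # assumed block size in pixels
FG_RATIO = 0.25   # assumed fraction of foreground pixels that marks a block as moving

# Per-pixel mixture-of-Gaussians background model, updated automatically over time.
mog = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=25, detectShadows=False)

def block_foreground(frame):
    """Return a foreground mask in which whole blocks are either kept or dropped."""
    fg = mog.apply(frame)              # 0 = background, 255 = foreground
    h, w = fg.shape
    out = np.zeros_like(fg)
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            patch = fg[y:y + BLOCK, x:x + BLOCK]
            # Keep the block only if enough of its pixels agree it is foreground,
            # which suppresses isolated false detections without morphological filtering.
            if np.count_nonzero(patch) >= FG_RATIO * patch.size:
                out[y:y + BLOCK, x:x + BLOCK] = 255
    return out

cap = cv2.VideoCapture("surveillance.avi")   # placeholder video path
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = block_foreground(frame)
    cv2.imshow("moving objects", cv2.bitwise_and(frame, frame, mask=mask))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Aggregating the per-pixel mask over blocks is only one reading of "benefits from a block of pixels"; the paper may instead update the mixture parameters jointly for all pixels in a block.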