Smart compositing: A real-time content-adaptive blending method for remote visual collaboration
Format: Conference paper
Language: English
Abstract: This paper proposes a content-adaptive blending method, smart compositing, for displaying two overlapped video frames on the same screen while preserving the readability of both. A pixel-wise adaptive blending factor map is generated from the edge and saturation information of the overlay frame's content alone. Using this blending factor map, regions of the overlay frame with edges or saturated color are made more opaque, while the remaining regions are made more transparent. A halo is also created around the edges of the overlay content, which enhances the edges and disambiguates them from the underlying frame. The proposed method is suitable for overlaying many different types of content (e.g., drawings, slides, text, and pictures) and does not require any information (e.g., an opacity mask) from the application that generates the content. The method has low computational complexity and runs in real time.
ISSN: 1520-6149, 2379-190X
DOI: 10.1109/ICASSP.2012.6288378
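The abstract above outlines the core pipeline: compute per-pixel edge and saturation measures from the overlay frame only, map content-rich pixels to higher opacity, and add a halo around overlay edges before alpha-blending onto the underlying frame. The following NumPy sketch illustrates that idea under stated assumptions; the function name `smart_composite`, the gradient-magnitude edge detector, the HSV-style saturation measure, the halo darkening factor, and the `alpha_min`/`alpha_max` range are illustrative choices, not details taken from the paper.

```python
# Minimal sketch of content-adaptive blending in the spirit of the abstract above.
# All thresholds, kernel sizes, and weights are illustrative assumptions; the
# paper's exact pipeline is not reproduced here.
import numpy as np

def saturation(rgb):
    """Per-pixel color saturation in [0, 1] (HSV-style: (max - min) / max)."""
    mx = rgb.max(axis=2)
    mn = rgb.min(axis=2)
    return np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)

def edge_strength(rgb):
    """Gradient-magnitude edge map of the luma channel, normalized to [0, 1]."""
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    gy, gx = np.gradient(luma)
    mag = np.hypot(gx, gy)
    return mag / max(mag.max(), 1e-6)

def dilate(mask, radius=2):
    """Square dilation used to grow a halo band around overlay edges."""
    out = mask.copy()
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out = np.maximum(out, np.roll(np.roll(mask, dy, axis=0), dx, axis=1))
    return out

def smart_composite(overlay, underlay, alpha_min=0.15, alpha_max=0.95):
    """Blend `overlay` onto `underlay` using a pixel-wise alpha map derived only
    from the overlay's edge and saturation content, plus a darkened halo around
    edges so overlay strokes and text stay legible against the underlay."""
    edges = edge_strength(overlay)
    sat = saturation(overlay)
    # Content-rich pixels (edges or saturated color) become more opaque.
    alpha = alpha_min + (alpha_max - alpha_min) * np.clip(np.maximum(edges, sat), 0, 1)
    # Halo: darken the underlay in a band around overlay edges (assumed realization).
    halo = dilate((edges > 0.2).astype(float), radius=2)
    backdrop = underlay * (1.0 - 0.5 * halo[..., None])
    return alpha[..., None] * overlay + (1.0 - alpha[..., None]) * backdrop

if __name__ == "__main__":
    h, w = 120, 160
    underlay = np.random.rand(h, w, 3)          # stand-in for the remote video frame
    overlay = np.zeros((h, w, 3))
    overlay[40:42, 20:140] = [1.0, 0.2, 0.2]    # a red "annotation stroke" on the overlay
    out = smart_composite(overlay, underlay)
    print(out.shape, float(out.min()), float(out.max()))
```

Because the alpha map is computed solely from the overlay frame, the blend needs no opacity mask or other cooperation from the application producing the overlay content, consistent with the abstract's claim.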