Automatic Image Tagging Model Based on Multigrid Image Segmentation and Object Recognition

Bibliographic Details
Published in: Advances in Multimedia, 2014-01, Vol. 2014 (2014), p. 221-227
Main Authors: Jun, Woogyoung; Jun, Byoung-Min; Lee, Yillbyung
Format: Article
Language: English
Online Access: Full text
Description
Abstract: With the rapid growth of Internet technologies and mobile devices, multimedia data such as images and videos are growing explosively on the Internet. Managing large-scale multimedia data with correct tags and annotations is an important task: incorrect tags and annotations make multimedia data hard to manage, while accurate ones ease management and yield high-quality retrieval results. Fully manual image tagging produces the most accurate tags when users supply correct information; nevertheless, most users do not put effort into tagging, so collections suffer from many noisy tags. The best solution for accurate image tagging is to tag images automatically. Robust automatic image tagging models have been proposed by many researchers, and it remains an active research field. Since existing automatic image tagging models still have many limitations, we propose an efficient automatic image tagging model using a multigrid-based image segmentation and feature extraction method. Our model can improve the object descriptions of images and image regions. Our method was tested on the Corel dataset, and the results show that our model is efficient and effective compared to other models.
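The abstract gives no implementation details, so the following is only a minimal sketch of the general multigrid idea it names: partition an image into grids at several resolutions and describe each cell, concatenating the per-region descriptors. It is not the authors' published algorithm; the grid sizes, the simple color statistics used as features, and all function names here are illustrative assumptions.

# Sketch of multigrid image segmentation with per-region feature
# extraction (illustrative only; not the paper's actual method).
import numpy as np

def grid_regions(image, grid_size):
    """Split an H x W x C image into grid_size x grid_size cells."""
    h, w = image.shape[:2]
    ys = np.linspace(0, h, grid_size + 1, dtype=int)
    xs = np.linspace(0, w, grid_size + 1, dtype=int)
    for i in range(grid_size):
        for j in range(grid_size):
            yield image[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]

def color_features(region):
    """Describe a region by per-channel mean and standard deviation."""
    pixels = region.reshape(-1, region.shape[-1]).astype(float)
    return np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])

def multigrid_features(image, grid_sizes=(1, 2, 4)):
    """Concatenate region descriptors over several grid resolutions."""
    feats = [color_features(r)
             for g in grid_sizes
             for r in grid_regions(image, g)]
    return np.concatenate(feats)

# Usage: a random RGB "image"; 1x1 + 2x2 + 4x4 grids give 21 regions,
# each described by 6 numbers (mean and std per channel).
image = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
print(multigrid_features(image).shape)  # (126,)

In a tagging pipeline such a multi-scale descriptor would feed an object recognizer or classifier per region; the coarse grids capture global context while the fine grids localize individual objects.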
ISSN: 1687-5680, 1687-5699
DOI: 10.1155/2014/857682