Robust image classification based on a non-causal hidden Markov Gauss mixture model
We propose a novel image classification method using a non-causal hidden Markov Gauss mixture model (HMGMM). We apply supervised learning assuming that the observation probability distribution given each class can be estimated using Gauss mixture vector quantization (GMVQ) designed using the generalized Lloyd algorithm with a minimum discrimination information (MDI) distortion...
Saved in:
Main authors: | Kyungsuk Pyun, Chee Sun Won, Johan Lim, Gray, R.M. |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 788 vol.3 |
---|---|
container_issue | |
container_start_page | 785 |
container_title | |
container_volume | 3 |
creator | Kyungsuk Pyun ; Chee Sun Won ; Johan Lim ; Gray, R.M. |
description | We propose a novel image classification method using a non-causal hidden Markov Gauss mixture model (HMGMM). We apply supervised learning assuming that the observation probability distribution given each class can be estimated using Gauss mixture vector quantization (GMVQ) designed using the generalized Lloyd algorithm with a minimum discrimination information (MDI) distortion. The maximum a posteriori (MAP) hidden states in an Ising model are estimated by a stochastic EM algorithm. We demonstrate that HMGMM obtains better classification than several popular methods, including CART, LVQ, causal HMM, and multiresolution HMM, in terms of Bayes risk and the spatial homogeneity of the classified objects. A heuristic solution for the number of clusters achieves robust image classification. |
doi_str_mv | 10.1109/ICIP.2002.1039089 |
format | Conference Proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1522-4880; EISSN: 2381-8549; ISBN: 0780376226; ISBN: 9780780376229 |
ispartof | Proceedings. International Conference on Image Processing, 2002, Vol.3, p.785-788 vol.3 |
issn | 1522-4880 2381-8549 |
language | eng |
recordid | cdi_ieee_primary_1039089 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Algorithm design and analysis ; Gaussian distribution ; Gaussian processes ; Hidden Markov models ; Image classification ; Probability distribution ; Robustness ; State estimation ; Supervised learning ; Vector quantization |
title | Robust image classification based on a non-causal hidden Markov Gauss mixture model |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-10T15%3A31%3A50IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Robust%20image%20classification%20based%20on%20a%20non-causal%20hidden%20Markov%20Gauss%20mixture%20model&rft.btitle=Proceedings.%20International%20Conference%20on%20Image%20Processing&rft.au=Kyungsuk%20Pyun&rft.date=2002&rft.volume=3&rft.spage=785&rft.epage=788%20vol.3&rft.pages=785-788%20vol.3&rft.issn=1522-4880&rft.eissn=2381-8549&rft.isbn=0780376226&rft.isbn_list=9780780376229&rft_id=info:doi/10.1109/ICIP.2002.1039089&rft_dat=%3Cieee_6IE%3E1039089%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=1039089&rfr_iscdi=true |
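The abstract above describes two algorithmic steps: designing a per-class Gauss mixture codebook with a generalized Lloyd iteration, and estimating spatially smooth class labels under an Ising prior. The following sketch is illustrative only, not the paper's implementation: it fits a 1-D Lloyd-style Gauss mixture (the paper's GMVQ operates on vector data with an MDI distortion, omitted here) and uses simple majority-vote neighbor smoothing as a crude stand-in for the paper's stochastic-EM MAP estimation. All function names are hypothetical.

```python
import numpy as np

def lloyd_gauss_mixture(x, k, iters=20):
    """Lloyd-style design of a 1-D Gauss mixture codebook: alternate
    nearest-component assignment (under a Gaussian log-likelihood
    distance) with per-component mean/variance re-estimation."""
    means = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread-out init
    var = np.full(k, x.var() + 1e-6)
    assign = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        # Negative Gaussian log-likelihood, up to an additive constant.
        d = (x[:, None] - means[None, :]) ** 2 / var + np.log(var)
        assign = d.argmin(axis=1)
        for j in range(k):
            pts = x[assign == j]
            if pts.size:
                means[j] = pts.mean()
                var[j] = pts.var() + 1e-6
    return means, var, assign

def majority_smooth(labels, sweeps=5):
    """Majority-vote smoothing over 4-neighbors on a 2-D label grid;
    a crude stand-in for MAP label estimation under an Ising prior."""
    lab = labels.copy()
    h, w = lab.shape
    for _ in range(sweeps):
        for r in range(h):
            for c in range(w):
                counts = {}
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w:
                        counts[lab[rr, cc]] = counts.get(lab[rr, cc], 0) + 1
                lab[r, c] = max(counts, key=counts.get)
    return lab
```

In the paper's pipeline, the mixture fit would run once per class on training data, and the smoothing step would regularize the per-block class decisions so that classified objects stay spatially homogeneous.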