Combining dissimilarity measures for image classification

Bibliographic details
Published in: Pattern Recognition Letters 2019-12, Vol. 128, p. 536-543
Authors: Liu, Chuanyi; Wang, Junqian; Duan, Shaoming; Xu, Yong
Format: Article
Language: eng
Keywords: Classification; Distance measures; Fusion distance; Image classification; Pattern recognition
Online access: Full text
container_end_page 543
container_issue
container_start_page 536
container_title Pattern recognition letters
container_volume 128
creator Liu, Chuanyi
Wang, Junqian
Duan, Shaoming
Xu, Yong
description •Proposing a novel method for image classification.
•Combining global and local dissimilarity measures from two sample spaces for image classification.
•Proposing three novel distance ratios for the distance metric.
•Revealing the difference between the Euclidean and block distances.
•Conducting extensive experiments.
The local dissimilarity has been verified as an effective metric for pattern classification. For high-dimensional data, even when a number of samples are available, they are still only observations obtained by sampling the high-dimensional population. As a consequence, the available samples are at least partially random and are not “accurate” representations of the true and complete sample space. Besides the prevailing local dissimilarity measure, global dissimilarity measures can also be exploited to improve classification. In this paper, we propose to directly exploit global and local dissimilarity measures for efficient image classification. The proposed method simultaneously uses three dissimilarities derived from the original and transformed sample spaces. These dissimilarities, including the elaborated distance ratio, allow the spatial relations between the probe sample and the gallery samples to be measured from three viewpoints, so their combination provides more reliable measurements of the spatial geometric relationships among samples. An obvious advantage of this combination is that it yields a very robust evaluation of the spatial distance between samples, so the resulting classification decision is less affected by noise in the data. The experiments show that the proposed method achieves the desired goal, i.e., a very satisfactory accuracy improvement over previous state-of-the-art methods.
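To make the idea of fusing several dissimilarity measures concrete, below is a minimal Python sketch of a nearest-neighbor classifier that combines a Euclidean distance computed in the original space, a block (city-block) distance computed in a transformed space, and a ratio of the two. The transform (here a random projection standing in for something like PCA), the particular ratio, the min-max rescaling, and the equal fusion weights are illustrative assumptions, not the formulation used in the paper; the function names `fused_dissimilarity` and `classify` are hypothetical.

```python
# Minimal sketch: nearest-neighbor classification with a fused dissimilarity.
# The transform behind the "_t" representations, the ratio d_euc / d_blk, and
# the equal weights are illustrative assumptions, not the paper's exact method.
import numpy as np


def fused_dissimilarity(probe, probe_t, gallery, gallery_t, weights=(1.0, 1.0, 1.0)):
    """Combine three dissimilarities between one probe and every gallery sample.

    probe, gallery     : samples in the original space (gallery: n_samples x n_features)
    probe_t, gallery_t : the same samples in a transformed space
    """
    eps = 1e-12
    # 1) Euclidean distance in the original sample space.
    d_euc = np.linalg.norm(gallery - probe, axis=1)
    # 2) Block (city-block) distance in the transformed sample space.
    d_blk = np.abs(gallery_t - probe_t).sum(axis=1)
    # 3) A distance ratio relating the two views (assumed form).
    d_ratio = d_euc / (d_blk + eps)
    # Rescale each dissimilarity to [0, 1] so no single term dominates the fusion.
    dists = [d_euc, d_blk, d_ratio]
    dists = [(d - d.min()) / (d.max() - d.min() + eps) for d in dists]
    return sum(w * d for w, d in zip(weights, dists))


def classify(probe, probe_t, gallery, gallery_t, labels):
    """Label the probe with the gallery sample of smallest fused dissimilarity."""
    scores = fused_dissimilarity(probe, probe_t, gallery, gallery_t)
    return labels[int(np.argmin(scores))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = rng.normal(size=(10, 64))   # 10 gallery images, 64 features each
    labels = np.arange(10)                # one class per gallery sample
    proj = rng.normal(size=(64, 16))      # stand-in for a learned transform
    gallery_t = gallery @ proj
    probe = gallery[3] + 0.01 * rng.normal(size=64)
    # Should print 3: the probe is a noisy copy of gallery sample 3.
    print(classify(probe, probe @ proj, gallery, gallery_t, labels))
```

Rescaling each dissimilarity before fusion keeps any single measure from dominating the combined score; the paper's own weighting and distance-ratio definitions would replace these placeholders.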
doi_str_mv 10.1016/j.patrec.2019.10.026
format Article
fulltext fulltext
identifier ISSN: 0167-8655
ispartof Pattern recognition letters, 2019-12, Vol.128, p.536-543
issn 0167-8655
1872-7344
language eng
recordid cdi_proquest_journals_2329720091
source Elsevier ScienceDirect Journals
subjects Classification
Distance measures
Fusion distance
Image classification
Pattern recognition
title Combining dissimilarity measures for image classification
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-27T16%3A27%3A41IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Combining%20dissimilarity%20measures%20for%20image%20classification&rft.jtitle=Pattern%20recognition%20letters&rft.au=Liu,%20Chuanyi&rft.date=2019-12-01&rft.volume=128&rft.spage=536&rft.epage=543&rft.pages=536-543&rft.issn=0167-8655&rft.eissn=1872-7344&rft_id=info:doi/10.1016/j.patrec.2019.10.026&rft_dat=%3Cproquest_cross%3E2329720091%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2329720091&rft_id=info:pmid/&rft_els_id=S0167865519303058&rfr_iscdi=true