Example-Based Explainable AI and its Application for Remote Sensing Image Classification
creator | Ishikawa, Shin-nosuke ; Todo, Masato ; Taki, Masato ; Uchiyama, Yasunobu ; Matsunaga, Kazunari ; Lin, Peihsuan ; Ogihara, Taiki ; Yasui, Masao |
description | We present a method of explainable artificial intelligence (XAI), "What I Know (WIK)", that provides additional information for verifying the reliability of a deep learning model by showing an instance from the training dataset that is similar to the input data being inferred, and we demonstrate it on a remote sensing image classification task. One expected role of XAI methods is to verify whether the inferences of a trained machine learning model are valid for an application, and which datasets were used to train the model is as important a factor as the model architecture. Our data-centric approach can help determine whether the training dataset is sufficient for each inference by inspecting the selected example data. If the selected example looks similar to the input data, we can confirm that the model was not trained on a dataset whose feature distribution is far from that of the input data. With this method, the criterion for selecting an example is not merely data similarity to the input but data similarity in the context of the model's task. Using a remote sensing image dataset from the Sentinel-2 satellite, the concept was successfully demonstrated with reasonably selected examples. This method can be applied to various machine learning tasks, including classification and regression. |
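The selection step described in the abstract can be sketched as a nearest-neighbour search in the model's feature space. The function below is an illustrative assumption, not the paper's actual implementation: it assumes feature vectors (e.g. a model's penultimate-layer activations) have already been extracted for the input and for every training instance, and picks the training example with the highest cosine similarity.

```python
import math


def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)


def select_wik_example(input_feature, train_features):
    """Return (index, similarity) of the training instance most similar
    to the input in the model's feature space.

    Because the features come from the trained model itself, similarity
    is measured in the context of the model's task, not raw pixel space.
    """
    sims = [cosine(input_feature, f) for f in train_features]
    best = max(range(len(sims)), key=sims.__getitem__)
    return best, sims[best]


# Toy demo with hand-made 2-D "features": the input is closest
# in direction to the second training instance.
train = [[1.0, 0.0], [0.6, 0.8], [0.0, 1.0]]
idx, sim = select_wik_example([0.5, 0.9], train)
```

A low best similarity would signal that no training instance resembles the input, i.e. the training distribution may not cover this inference.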
doi_str_mv | 10.48550/arxiv.2302.01526 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2302.01526 |
language | eng |
recordid | cdi_arxiv_primary_2302_01526 |
source | arXiv.org |
subjects | Computer Science - Artificial Intelligence ; Computer Science - Computer Vision and Pattern Recognition ; Computer Science - Learning ; Physics - Geophysics |
title | Example-Based Explainable AI and its Application for Remote Sensing Image Classification |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-23T17%3A56%3A54IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Example-Based%20Explainable%20AI%20and%20its%20Application%20for%20Remote%20Sensing%20Image%20Classification&rft.au=Ishikawa,%20Shin-nosuke&rft.date=2023-02-02&rft_id=info:doi/10.48550/arxiv.2302.01526&rft_dat=%3Carxiv_GOX%3E2302_01526%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |