Leveraging Cognitive Context Knowledge for Argumentation-Based Object Classification in Multi-Sensor Networks

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, p. 71361-71373
Main authors: Hao, Zhiyong; Wu, Junfeng; Liu, Tingting; Chen, Xiaohong
Format: Article
Language: English
Online access: Full text
Description
Summary: It is a great challenge to achieve interpretable collaborative object classification in multi-sensor networks. In this setting, argumentation-based object classification has been considered a promising paradigm because it offers a natural means of justifying and explaining complicated decision making among multiple agents. However, disagreements between sensor agents often arise because the agents classify objects at different category levels. To address this category-granularity inconsistency in multi-sensor collaborative object classification tasks, we propose a method enriched with cognitive context knowledge for resolving classification conflicts. In this paper, we focus on cognitive context to investigate how cognitive agents equipped with rich contextual knowledge can facilitate semantic consensus in argumentation-based object classification. The empirical evaluation demonstrates the effectiveness of our method, which improves over state-of-the-art approaches, especially in the presence of noisy sensor data, while providing argumentative explanations. The proposed method is therefore intended to benefit human users of multi-sensor object classification systems in which explanatory decision support is an important concern.
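The category-granularity conflicts mentioned in the abstract can be illustrated with a toy sketch (this is not the paper's actual method; the taxonomy and all labels below are illustrative assumptions): when two agents' labels differ only in specificity, shared taxonomic context knowledge lets them settle on the more specific label, whereas labels on unrelated branches signal a genuine conflict that would need argumentation to resolve.

```python
# Illustrative sketch only, not the authors' implementation.
# Shared context knowledge: a child -> parent relation over categories.
TAXONOMY = {
    "sedan": "car",
    "suv": "car",
    "car": "vehicle",
    "truck": "vehicle",
}

def ancestors(label):
    """Return the label together with all of its ancestors."""
    chain = [label]
    while label in TAXONOMY:
        label = TAXONOMY[label]
        chain.append(label)
    return chain

def resolve(label_a, label_b):
    """If one label subsumes the other, the disagreement is only one of
    granularity: agree on the more specific label. Otherwise return None
    to flag a genuine conflict requiring argumentation."""
    if label_b in ancestors(label_a):
        return label_a  # label_a is the more specific reading
    if label_a in ancestors(label_b):
        return label_b
    return None

print(resolve("sedan", "vehicle"))  # granularity mismatch -> "sedan"
print(resolve("sedan", "truck"))    # genuine conflict -> None
```

In this sketch, "sedan" vs "vehicle" is settled in favor of "sedan" because the taxonomy shows one subsumes the other, while "sedan" vs "truck" lies on different branches and is left to a proper conflict-resolution step.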
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2919073