Imbalance accuracy metric for model selection in multi-class imbalance classification problems

Bibliographic Details
Published in: Knowledge-Based Systems, 2020-12, Vol. 210, p. 106490, Article 106490
Main Author: Mortaz, Ebrahim
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: The overall accuracy, macro precision, macro recall, F-score, and class balance accuracy, owing to their simplicity and easy interpretation, have been among the most popular metrics for measuring the performance of classifiers on multi-class problems. However, on imbalanced datasets, some of these metrics can be unfairly influenced by the heavier classes. It is therefore recommended that they be used as a group rather than individually, a strategy that can unnecessarily complicate model selection and evaluation on imbalanced datasets. In this paper, we introduce a new metric, the imbalance accuracy metric (IAM), that can be used as a solo measure for model evaluation and selection. The IAM is built on top of the existing metrics, is simple to use, and is easy to interpret. It is meant to serve as a bottom-line measure that eliminates the need for group metric computation and simplifies model selection.
Highlights:
• The IAM is proposed as a metric for model selection in multi-class imbalance problems.
• The IAM is built on top of the existing metrics and is simple to use.
• The IAM reflects how well a classifier avoids assigning instances to incorrect classes.
• The IAM eliminates the need for multiple metric computations during model selection.
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2020.106490
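The abstract argues that the usual metrics must be read as a group on imbalanced data because overall accuracy is dominated by the heavier classes. The sketch below illustrates that point using the standard textbook definitions of overall accuracy, macro precision, macro recall, and macro F-score on a hypothetical 3-class confusion matrix; it is not code from the paper, and the IAM itself is not reproduced here because the abstract does not give its formula.

```python
# Minimal illustrative sketch (not from the paper): standard group metrics
# on a hypothetical imbalanced 3-class confusion matrix, showing how the
# heavy class inflates overall accuracy while the macro metrics do not.
import numpy as np

# Rows = true classes, columns = predicted classes (hypothetical counts).
# Class 0 dominates the data; class 2 is small and poorly classified.
C = np.array([
    [950,  30,  20],   # class 0: 1000 true instances
    [ 40,  50,  10],   # class 1:  100 true instances
    [ 15,  10,   5],   # class 2:   30 true instances
])

tp = np.diag(C).astype(float)      # correctly classified per class
support = C.sum(axis=1)            # true instances per class
predicted = C.sum(axis=0)          # predicted instances per class

overall_accuracy = tp.sum() / C.sum()
recall = tp / support
precision = np.divide(tp, predicted, out=np.zeros_like(tp), where=predicted > 0)
f1 = np.divide(2 * precision * recall, precision + recall,
               out=np.zeros_like(tp), where=(precision + recall) > 0)

print(f"overall accuracy: {overall_accuracy:.3f}")  # ~0.889, dominated by class 0
print(f"macro precision : {precision.mean():.3f}")  # ~0.548
print(f"macro recall    : {recall.mean():.3f}")     # ~0.539, exposes classes 1 and 2
print(f"macro F-score   : {f1.mean():.3f}")         # ~0.543
```

On this hypothetical matrix, overall accuracy looks strong (about 0.89) solely because class 0 is large, while macro recall (about 0.54) reveals that the two minority classes are handled poorly; this is the kind of disagreement that motivates reporting the metrics as a group or, as the paper proposes, using a single bottom-line measure such as the IAM.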