Monte Carlo averaging for uncertainty estimation in neural networks


Bibliographic Details
Published in: Journal of physics. Conference series 2023-05, Vol. 2506 (1), p. 12004
Main authors: Tassi, Cedrique Rovile Njieutcheu, Börner, Anko, Triebel, Rudolph
Format: Article
Language: English
Description
Summary: Although convolutional neural networks (CNNs) are widely used in modern classifiers, they are affected by overfitting and lack robustness, leading to overconfident false predictions (FPs). By preventing FPs, certain consequences (such as accidents and financial losses) can be avoided, and the use of CNNs in safety- and/or mission-critical applications would be effective. In this work, we aim to improve the separability of true predictions (TPs) and FPs by enforcing the confidence, which determines the uncertainty, to be high for TPs and low for FPs. To achieve this, we propose the use of Monte Carlo averaging (MCA) and compare it with related methods, namely the baseline (a single CNN), Monte Carlo dropout (MCD), an ensemble, and a mixture of Monte Carlo dropout (MMCD). This comparison is performed using the results of experiments conducted on four datasets with three different architectures. The results show that MCA performs as well as or even better than MMCD, which in turn performs better than the baseline, the ensemble, and MCD. Consequently, MCA could be used instead of MMCD for uncertainty estimation, especially because it does not require a predefined distribution and is less expensive than MMCD.
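The common thread behind the compared methods is averaging class-probability vectors from several (stochastic) forward passes and using the agreement among them as a confidence/uncertainty signal, so that TPs and FPs become more separable. The sketch below is a minimal, hypothetical illustration of that averaging step using NumPy, not the authors' implementation: the sampled probability vectors would in practice come from dropout-enabled passes or ensemble members, and the helper name `mc_average` is an assumption.

```python
import numpy as np

def mc_average(prob_samples):
    """Average T softmax vectors and derive confidence/uncertainty.

    prob_samples: array of shape (T, num_classes), each row the softmax
    output of one stochastic forward pass (e.g. MC dropout or an
    ensemble member). Returns the mean probabilities, the predicted
    class, its confidence, and the predictive entropy.
    """
    mean_probs = prob_samples.mean(axis=0)          # Monte Carlo average
    pred = int(mean_probs.argmax())                 # predicted class
    confidence = float(mean_probs[pred])            # high when passes agree
    # Predictive entropy: a standard uncertainty measure (small epsilon
    # avoids log(0)).
    entropy = float(-(mean_probs * np.log(mean_probs + 1e-12)).sum())
    return mean_probs, pred, confidence, entropy

# Toy illustration: passes that agree (a plausible TP) vs. passes that
# disagree (a plausible FP). The data here is synthetic.
agree = np.array([[0.9, 0.05, 0.05]] * 10)
disagree = np.array([[0.9, 0.05, 0.05], [0.1, 0.8, 0.1]] * 5)

_, _, conf_tp, ent_tp = mc_average(agree)
_, _, conf_fp, ent_fp = mc_average(disagree)
# Agreement yields higher confidence and lower entropy, which is what
# makes TPs and FPs separable by thresholding either quantity.
```

Thresholding the resulting confidence (or entropy) then lets a downstream system reject likely FPs instead of acting on them.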
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/2506/1/012004