Usage of the Kullback–Leibler divergence on posterior Dirichlet distributions to create a training dataset for a learning algorithm to classify driving behaviour events
Saved in:
Published in: Journal of Computational Mathematics and Data Science 2023-08, Vol.8, p.100081, Article 100081
Main authors: , , , , , ,
Format: Article
Language: English
Online access: Full text
Summary: Information theory uses the Kullback–Leibler divergence to compare distributions. In this paper, we apply it to Bayesian posterior distributions and show how it can also be used to train a machine learning algorithm. The data sample used in this study is an OCTOTelematics set of driving behaviour data.
ISSN: 2772-4158
DOI: 10.1016/j.jcmds.2023.100081
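
The quantity named in the title, the Kullback–Leibler divergence between two Dirichlet posteriors, has a well-known closed form. The Python sketch below is a minimal illustration of that formula, assuming standard conjugate Dirichlet–multinomial updating (posterior = prior concentrations + event counts); it is not the authors' implementation, and the function name `kl_dirichlet`, the three event categories, and the counts are hypothetical, chosen only to mirror the paper's driving-behaviour setting.

```python
# Closed-form KL divergence between two Dirichlet distributions:
# KL(Dir(a) || Dir(b)) = lnG(a0) - sum(lnG(a_k)) - lnG(b0) + sum(lnG(b_k))
#                        + sum((a_k - b_k) * (psi(a_k) - psi(a0)))
# where a0 = sum(a_k), G is the gamma function, psi the digamma function.
import numpy as np
from scipy.special import gammaln, digamma

def kl_dirichlet(alpha, beta):
    """KL( Dir(alpha) || Dir(beta) ) for concentration vectors of equal length."""
    alpha = np.asarray(alpha, dtype=float)
    beta = np.asarray(beta, dtype=float)
    a0 = alpha.sum()
    b0 = beta.sum()
    return (
        gammaln(a0) - gammaln(alpha).sum()
        - gammaln(b0) + gammaln(beta).sum()
        + ((alpha - beta) * (digamma(alpha) - digamma(a0))).sum()
    )

# Hypothetical example: two drivers' posteriors over three event types
# (smooth driving / harsh braking / speeding) under a uniform Dirichlet(1,1,1)
# prior. Counts are invented for illustration, not from the paper's dataset.
prior = np.ones(3)
counts_a = np.array([40, 5, 2])
counts_b = np.array([10, 20, 15])
post_a = prior + counts_a  # conjugacy: posterior concentrations = prior + counts
post_b = prior + counts_b
print(kl_dirichlet(post_a, post_b))  # divergence between the two posteriors
```

Divergences computed this way between pairs of posteriors could serve as features or dissimilarity labels for a downstream classifier, which is one plausible reading of how the abstract's training dataset is built.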