Optimal Projections for Classification with Naive Bayes
Format: Article
Language: English
Online access: Order full text
Abstract: In the Naive Bayes classification model the class conditional densities are estimated as the products of their marginal densities along the cardinal basis directions. We study the problem of obtaining an alternative basis for this factorisation with the objective of enhancing the discriminatory power of the associated classification model. We formulate the problem as a projection pursuit to find the optimal linear projection on which to perform classification. Optimality is determined based on the multinomial likelihood within which probabilities are estimated using the Naive Bayes factorisation of the projected data. Projection pursuit offers the added benefits of dimension reduction and visualisation. We discuss an intuitive connection with class conditional independent components analysis, and show how this is realised visually in practical applications. The performance of the resulting classification models is investigated using a large collection of (162) publicly available benchmark data sets and in comparison with relevant alternatives. We find that the proposed approach substantially outperforms other popular probabilistic discriminant analysis models and is highly competitive with Support Vector Machines.
DOI: 10.48550/arxiv.2409.05635
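The abstract describes the method only at a high level. The Python sketch below illustrates one way the projection pursuit could look: a linear projection is searched for so that a Naive Bayes model fitted to the projected data maximises the multinomial likelihood of the class labels. It is not the authors' implementation; the Gaussian marginal densities, the QR orthonormalisation, and the Nelder-Mead optimiser are assumptions made for this illustration.

```python
# Minimal sketch of projection-pursuit Naive Bayes as described in the abstract.
# NOT the authors' implementation: Gaussian marginals per projected direction and
# a gradient-free optimiser are assumptions made for illustration only.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm


def nb_log_scores(Z, y, classes):
    """Naive Bayes log scores: log prior + sum of log marginal densities per direction."""
    scores = np.zeros((Z.shape[0], len(classes)))
    for j, c in enumerate(classes):
        Zc = Z[y == c]
        # product of marginal densities along each projected direction
        mu, sd = Zc.mean(axis=0), Zc.std(axis=0) + 1e-6
        scores[:, j] = np.log(len(Zc) / len(Z)) + norm.logpdf(Z, mu, sd).sum(axis=1)
    return scores


def neg_multinomial_loglik(w_flat, X, y, classes, d):
    """Negative multinomial log-likelihood of the labels under the projected NB model."""
    W, _ = np.linalg.qr(w_flat.reshape(X.shape[1], d))  # keep the projection orthonormal
    scores = nb_log_scores(X @ W, y, classes)
    log_post = scores - np.logaddexp.reduce(scores, axis=1, keepdims=True)
    return -log_post[np.arange(len(y)), np.searchsorted(classes, y)].sum()


def fit_projection(X, y, d=2, seed=0):
    """Projection pursuit: search for a d-dimensional projection maximising the likelihood."""
    classes = np.unique(y)
    w0 = np.random.default_rng(seed).standard_normal(X.shape[1] * d)
    res = minimize(neg_multinomial_loglik, w0, args=(X, y, classes, d),
                   method="Nelder-Mead", options={"maxiter": 2000})
    W, _ = np.linalg.qr(res.x.reshape(X.shape[1], d))
    return W  # classify with Naive Bayes on X @ W
```

Under these assumptions, `W = fit_projection(X_train, y_train, d=2)` would return a two-dimensional projection that can also be plotted, reflecting the dimension-reduction and visualisation benefits mentioned in the abstract.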