Attribute and instance weighted naive Bayes


Bibliographic details
Published in: Pattern Recognition, 2021-03, Vol. 111, p. 107674, Article 107674
Main authors: Zhang, Huan; Jiang, Liangxiao; Yu, Liangjun
Format: Article
Language: English
Description
Abstract:

Highlights:
• Many different categories of approaches have been proposed to improve naive Bayes.
• Few works simultaneously pay attention to attribute weighting and instance weighting.
• We propose attribute and instance weighted naive Bayes (AIWNB) in this paper.
• To learn AIWNB, we propose an eager and a lazy algorithm: AIWNBE and AIWNBL.
• The experimental results validate the effectiveness of the proposed algorithms.

Naive Bayes (NB) continues to be one of the top 10 data mining algorithms, but its conditional independence assumption rarely holds true in real-world applications. Therefore, many different categories of improved approaches, including attribute weighting and instance weighting, have been proposed to alleviate this assumption. However, few of these approaches simultaneously pay attention to attribute weighting and instance weighting. In this study, we propose a new improved model called attribute and instance weighted naive Bayes (AIWNB), which combines attribute weighting and instance weighting in one uniform framework. In AIWNB, the attribute weights are incorporated into the naive Bayesian classification formula, and the prior and conditional probabilities are estimated from instance-weighted training data. To learn instance weights, we single out an eager approach and a lazy approach, yielding two versions, denoted AIWNBE and AIWNBL, respectively. Extensive experimental results show that both AIWNBE and AIWNBL significantly outperform NB and all the other existing state-of-the-art competitors.
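The general framework described in the abstract can be sketched as follows: attribute weights enter the classification formula as exponents on the conditional probabilities, and both the priors and the conditionals are estimated from instance-weighted counts. Note that this is a minimal illustrative sketch, not the authors' implementation: the `WeightedNB` class, its toy data, and the weight values are all hypothetical, and the eager/lazy weight-learning procedures of AIWNBE and AIWNBL are not reproduced — the weights are simply supplied by the caller.

```python
import math
from collections import defaultdict

class WeightedNB:
    """Categorical naive Bayes combining attribute and instance weights.

    Classification rule (in log space):
        argmax_c  log P(c) + sum_i w_i * log P(a_i | c)
    where w_i is the weight of attribute i, and all probabilities are
    estimated from instance-weighted frequencies with Laplace smoothing.
    """

    def __init__(self, attr_weights):
        self.attr_weights = attr_weights  # one weight per attribute (given, not learned)

    def fit(self, X, y, inst_weights):
        self.classes = sorted(set(y))
        n_attrs = len(X[0])
        self.attr_values = [sorted({row[i] for row in X}) for i in range(n_attrs)]

        # Instance-weighted class priors with Laplace smoothing.
        class_w = defaultdict(float)
        for label, w in zip(y, inst_weights):
            class_w[label] += w
        total_w = sum(inst_weights)
        self.log_prior = {
            c: math.log((class_w[c] + 1.0) / (total_w + len(self.classes)))
            for c in self.classes
        }

        # Instance-weighted conditional probabilities with Laplace smoothing.
        cond_w = defaultdict(float)  # (attr_index, value, class) -> summed weight
        for row, label, w in zip(X, y, inst_weights):
            for i, v in enumerate(row):
                cond_w[(i, v, label)] += w
        self.log_cond = {}
        for c in self.classes:
            for i, values in enumerate(self.attr_values):
                denom = class_w[c] + len(values)
                for v in values:
                    self.log_cond[(i, v, c)] = math.log(
                        (cond_w[(i, v, c)] + 1.0) / denom)
        return self

    def predict(self, x):
        # Attribute weights act as exponents, i.e. multipliers in log space.
        def score(c):
            return self.log_prior[c] + sum(
                w * self.log_cond[(i, v, c)]
                for i, (v, w) in enumerate(zip(x, self.attr_weights)))
        return max(self.classes, key=score)

# Toy usage with made-up data and uniform instance weights.
X = [["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "hot"]]
y = ["no", "no", "yes", "yes"]
clf = WeightedNB(attr_weights=[1.0, 0.5]).fit(X, y, inst_weights=[1, 1, 1, 1])
print(clf.predict(["rain", "mild"]))  # → "yes"
```

Setting all attribute weights to 1 and all instance weights to 1 recovers standard naive Bayes, which is why the two weighting schemes compose naturally into one framework.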
ISSN:0031-3203
1873-5142
DOI:10.1016/j.patcog.2020.107674