Adaptive weighted ensemble clustering via kernel learning and local information preservation
Published in: Knowledge-based systems, 2024-06, Vol. 294, p. 111793, Article 111793
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Ensemble clustering refers to learning a robust and accurate consensus result from a collection of base clustering results. Despite extensive research on this topic, it remains challenging due to the absence of raw features and real labels of samples. Current ensemble clustering methods based on the co-association (CA) matrix have received much attention from researchers. However, the information contained in the CA matrix is difficult to fully exploit within its inherent representation space. To address this issue and further unveil the underlying patterns and nonlinear relationships within the CA matrix, this paper proposes a novel Adaptive Weighted method based on Kernel learning and local information preservation for Ensemble Clustering, termed AWKEC. Unlike other ensemble clustering studies, AWKEC introduces kernel ideas into ensemble clustering to learn consistent sample relationships across multiple kernel spaces. Moreover, considering the properties of the CA matrix, highly reliable local information is preserved and embedded into the learned similarity coefficient matrices via the Hadamard product. Experimental results on 13 datasets, compared with 10 representative baseline methods, validate the effectiveness of the proposed AWKEC.
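To make the two core objects in the abstract concrete, the following is a minimal sketch of (a) building a co-association matrix from base clusterings and (b) embedding high-reliability pairs into a learned similarity matrix via a Hadamard product. This is not the authors' AWKEC implementation; the reliability `threshold` and the masking rule are illustrative assumptions, not values from the paper.

```python
import numpy as np

def co_association(base_labels):
    """Build the co-association (CA) matrix from a list of base clusterings.

    base_labels: list of 1-D integer label arrays, one per base clustering.
    Entry (i, j) is the fraction of base clusterings that place samples
    i and j in the same cluster.
    """
    base_labels = [np.asarray(labels) for labels in base_labels]
    n = base_labels[0].shape[0]
    ca = np.zeros((n, n))
    for labels in base_labels:
        # Pairwise same-cluster indicator for this base clustering.
        ca += (labels[:, None] == labels[None, :]).astype(float)
    return ca / len(base_labels)

def preserve_local_info(similarity, ca, threshold=0.8):
    """Embed highly reliable local information via a Hadamard product.

    Pairs whose CA value reaches `threshold` are treated as reliable and
    kept at full weight; all other entries of the learned similarity
    matrix are down-weighted by their CA value. The threshold is an
    illustrative choice, not taken from the paper.
    """
    mask = np.where(ca >= threshold, 1.0, ca)
    return similarity * mask  # element-wise (Hadamard) product

# Three base clusterings of four samples; pairs that co-cluster more
# often get larger CA entries.
ca = co_association([np.array([0, 0, 1, 1]),
                     np.array([0, 1, 1, 1]),
                     np.array([0, 0, 0, 1])])
sim = preserve_local_info(np.ones((4, 4)), ca, threshold=0.9)
```

In this toy run, samples 0 and 1 co-cluster in two of three base clusterings, so `ca[0, 1]` is 2/3 and the corresponding similarity entry is down-weighted accordingly, while self-pairs (CA value 1) are preserved at full weight.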
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2024.111793