Local-to-Global Deep Clustering on Approximate Uniform Manifold


Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, 2023-05, Vol. 35 (5), p. 5035-5046
Main Authors: Wang, Tuo; Zhang, Xiang; Lan, Long; Luo, Zhigang
Format: Article
Language: English
Online Access: Order full text
Description
Abstract: Deep clustering usually treats the clustering assignments as supervisory signals to learn a more compact representation with deep neural networks, under the guidance of clustering-oriented losses. Nevertheless, we observe that, without reliable supervision, such losses for global clustering would destroy the locally geometric structure underlying the data. In this paper, we propose a local-to-global deep clustering method based on approximate uniform manifold (LGC-AUM) to address this issue in a two-stage fashion. In the local stage, an intra-manifold preservation loss is proposed to preserve intra-manifold structures locally on the basis of the approximate uniform manifold, while an inter-manifold discrimination loss captures the global inter-manifold structure. This stage thus learns more discriminative, structure-preserving features by reducing the correlations between different manifolds, which paves the way for the final clustering. Building on the learned features, the second stage explores a clustering loss based on the approximate uniform manifold, together with two auxiliary distributions, to establish stable network training for effective clustering. Experiments on five benchmark datasets verify the efficacy of our LGC-AUM compared to several well-performing clustering counterparts.
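The abstract does not spell out the form of the second-stage clustering loss or its auxiliary distributions. As a rough, non-authoritative sketch of the general self-training pattern such methods build on, the snippet below uses the widely known DEC-style formulation: soft assignments from a Student's t-kernel, an auxiliary target distribution that sharpens confident assignments, and a KL-divergence objective. The function names and the specific auxiliary distribution are illustrative assumptions, not LGC-AUM's actual formulation.

```python
import numpy as np

def soft_assignments(z, centroids, alpha=1.0):
    # Student's t-kernel similarity between embeddings z (n, d)
    # and cluster centroids (k, d); rows are normalized to sum to 1.
    d2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def auxiliary_target(q):
    # Sharpened auxiliary distribution (DEC-style): squaring emphasizes
    # high-confidence assignments, division by cluster frequency
    # counteracts imbalance; rows are re-normalized.
    w = q ** 2 / q.sum(axis=0)
    return w / w.sum(axis=1, keepdims=True)

def clustering_kl_loss(q, p):
    # Mean per-sample KL(P || Q): the self-training clustering objective
    # that pulls the soft assignments q toward the sharpened targets p.
    return float((p * np.log(p / q)).sum(axis=1).mean())
```

In the self-training loop, the targets p are typically recomputed only every few epochs while q is updated each step, which stabilizes the network training the abstract alludes to.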
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2022.3144952