Orthogonal parametric non-negative matrix tri-factorization with α-divergence for co-clustering



Bibliographic details
Published in: Expert Systems with Applications, 2023-11, Vol. 231, p. 120680, Article 120680
Authors: Hoseinipour, Saeid; Aminghafari, Mina; Mohammadpour, Adel
Format: Article
Language: English
Online access: Full text
Description
Abstract: Co-clustering algorithms seek homogeneous sub-matrices in a dyadic data matrix, such as a document-word matrix. Co-clustering can be expressed as a non-negative matrix tri-factorization problem X≈FSG⊤, subject to non-negativity constraints on all three matrices and orthogonality constraints on the F (row-coefficient) and G (column-coefficient) matrices. Most existing algorithms are based on Euclidean distance or Kullback–Leibler divergence and provide no parameters to control orthogonality. We propose to control orthogonality through parameters by adding two penalty terms to the α-divergence objective function. Orthogonal parametric non-negative matrix tri-factorization uses separate orthogonality parameters for the row and column spaces. Finally, we compare the proposed algorithms with other algorithms on six real text datasets.
• The algorithm works by multiplicative update rules and is convergent.
• Two penalty terms control the orthogonality of row and column clusters.
• A class of co-clustering algorithms based on α-divergence is unified.
• All datasets and algorithm codes are available on GitHub in the NMTFcoclust repository.
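For orientation, the following is a minimal NumPy sketch of a generic non-negative matrix tri-factorization loop with multiplicative updates under squared Euclidean loss, meant only to illustrate the X≈FSG⊤ model and the general shape of multiplicative update rules. It is not the paper's algorithm: the α-divergence objective and the two orthogonality penalty terms are omitted, and the function name nmtf and its parameters are illustrative assumptions; the authors' actual implementation is in the NMTFcoclust repository on GitHub.

import numpy as np

def nmtf(X, k_rows, k_cols, n_iter=200, eps=1e-9, seed=0):
    """Generic NMTF  X ~ F S G^T  via multiplicative updates (Frobenius loss).

    Sketch only: the alpha-divergence objective and the orthogonality
    penalties of the paper are NOT implemented here.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    F = rng.random((n, k_rows)) + eps       # row-coefficient matrix (rows x row clusters)
    G = rng.random((m, k_cols)) + eps       # column-coefficient matrix (columns x column clusters)
    S = rng.random((k_rows, k_cols)) + eps  # block summary matrix

    for _ in range(n_iter):
        # Multiplicative updates keep every factor non-negative.
        F *= (X @ G @ S.T) / (F @ (S @ (G.T @ G) @ S.T) + eps)
        G *= (X.T @ F @ S) / (G @ (S.T @ (F.T @ F) @ S) + eps)
        S *= (F.T @ X @ G) / ((F.T @ F) @ S @ (G.T @ G) + eps)

    return F, S, G

# Toy usage on a small non-negative "document-word" matrix.
X = np.abs(np.random.default_rng(1).random((20, 30)))
F, S, G = nmtf(X, k_rows=3, k_cols=4)
row_clusters = F.argmax(axis=1)  # hard document (row) assignments
col_clusters = G.argmax(axis=1)  # hard word (column) assignments
print(row_clusters)
print(col_clusters)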
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2023.120680