An Efficient Alternating Newton Method for Learning Factorization Machines


Full description

Bibliographic details
Published in: ACM Transactions on Intelligent Systems and Technology, 2018-11, Vol. 9 (6), pp. 1-31
Main authors: Chin, Wei-Sheng; Yuan, Bo-Wen; Yang, Meng-Yuan; Lin, Chih-Jen
Format: Article
Language: English
Online access: Full text
Description
Abstract: To date, factorization machines (FMs) have emerged as a powerful model in many applications. In this work, we study the training of FM with the logistic loss for binary classification, which is a nonlinear extension of the linear model with the logistic loss (i.e., logistic regression). For the training of large-scale logistic regression, Newton methods have been shown to be an effective approach, but it is difficult to apply such methods to FM because of the nonconvexity. We consider a modification of FM that is multiblock convex and propose an alternating minimization algorithm based on Newton methods. Some novel optimization techniques are introduced to reduce the running time. Our experiments demonstrate that the proposed algorithm is more efficient than stochastic gradient algorithms and coordinate descent methods. The parallelism of our method is also investigated for acceleration in multithreading environments.
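The core idea in the abstract can be illustrated with a minimal sketch. This is not the paper's actual algorithm (which handles a full multiblock-convex FM with the techniques described there); it is a hypothetical rank-one, two-block analogue: the bilinear model z(x) = (uᵀx)(vᵀx) with the logistic loss is convex in u when v is fixed, and vice versa, so each block can be updated by a regularized Newton step. All names, the rank-one simplification, and the backtracking safeguard are assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only: a rank-one, two-block analogue of the
# multiblock-convex FM idea. Fixing v, the model z = (X@u)*(X@v) is
# linear in u, so the regularized logistic subproblem is convex and
# can be minimized with (backtracked) Newton steps; then the roles swap.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def subloss(X_eff, y, w, lam):
    # Regularized logistic loss of the convex subproblem (labels y in {-1,+1}).
    return np.sum(np.logaddexp(0.0, -y * (X_eff @ w))) + 0.5 * lam * (w @ w)

def newton_update(X_eff, y, w, lam):
    # One Newton step with backtracking so the subproblem loss never increases.
    z = X_eff @ w
    g = -X_eff.T @ (y * (1.0 - sigmoid(y * z))) + lam * w      # gradient
    d = sigmoid(z) * (1.0 - sigmoid(z))                        # Hessian weights
    H = X_eff.T @ (X_eff * d[:, None]) + lam * np.eye(w.size)  # positive definite
    step = np.linalg.solve(H, g)
    f0, t = subloss(X_eff, y, w, lam), 1.0
    for _ in range(30):
        w_new = w - t * step
        if subloss(X_eff, y, w_new, lam) < f0:
            return w_new
        t *= 0.5
    return w  # no decrease found; keep the current iterate

rng = np.random.default_rng(0)
n, dim, lam = 200, 5, 0.1
X = rng.standard_normal((n, dim))
y = np.sign((X @ rng.standard_normal(dim)) * (X @ rng.standard_normal(dim)))
y[y == 0] = 1.0
u, v = 0.1 * rng.standard_normal(dim), 0.1 * rng.standard_normal(dim)

def full_loss(u, v):
    z = (X @ u) * (X @ v)
    return np.sum(np.logaddexp(0.0, -y * z)) + 0.5 * lam * (u @ u + v @ v)

loss_before = full_loss(u, v)
for _ in range(20):
    u = newton_update(X * (X @ v)[:, None], y, u, lam)  # v fixed: convex in u
    v = newton_update(X * (X @ u)[:, None], y, v, lam)  # u fixed: convex in v
loss_after = full_loss(u, v)
```

Because each Newton update is backtracked on its own convex subproblem, the full regularized loss is nonincreasing across alternations, which is the property that makes the alternating scheme well behaved despite the overall nonconvexity.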
ISSN: 2157-6904
EISSN: 2157-6912
DOI: 10.1145/3230710