Misclassification Excess Risk Bounds for 1‐Bit Matrix Completion
Published in: Stat (International Statistical Institute), 2024-12, Vol. 13 (4)
Author:
Format: Article
Language: English
Online Access: Full text
Abstract: This study investigates the misclassification excess risk bound in the context of 1‐bit matrix completion, a significant problem in machine learning involving the recovery of an unknown matrix from a limited subset of its entries. Matrix completion has garnered considerable attention in the last two decades due to its diverse applications across various fields. Unlike conventional approaches that deal with real‐valued samples, 1‐bit matrix completion is concerned with binary observations. While prior research has predominantly focused on the estimation error of proposed estimators, our study shifts attention to the prediction error. This paper offers a theoretical analysis of the prediction errors of two previous works utilizing the logistic regression model: one employing max‐norm constrained minimization and the other employing nuclear‐norm penalization. Notably, our findings demonstrate that the latter achieves the minimax‐optimal rate without an additional logarithmic term. These results contribute to a deeper understanding of 1‐bit matrix completion by shedding light on the predictive performance of specific methodologies.
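For context, the display below sketches the standard 1‐bit matrix completion setup with a logistic link and a nuclear‐norm penalized maximum likelihood estimator of the kind the abstract refers to; the notation (M*, σ, Ω, λ, α) is generic textbook notation assumed here and is not necessarily the paper's own.

```latex
% A minimal sketch of 1-bit matrix completion under a logistic link.
% Notation (M^*, sigma, Omega, lambda, alpha) is generic and may differ
% from the paper's.

% Binary entries are observed on a sampled index set Omega:
\[
  \Pr\bigl(Y_{ij} = +1 \mid M^{*}\bigr) = \sigma\bigl(M^{*}_{ij}\bigr),
  \qquad
  \sigma(x) = \frac{1}{1 + e^{-x}},
  \qquad (i,j) \in \Omega .
\]

% Nuclear-norm penalized maximum likelihood estimator
% (the max-norm approach instead imposes a constraint \|M\|_{\max} \le R):
\[
  \widehat{M} \in \arg\min_{\|M\|_{\infty} \le \alpha}
  \; -\frac{1}{|\Omega|} \sum_{(i,j) \in \Omega}
      \log \Pr\bigl(Y_{ij} \mid M_{ij}\bigr)
  \; + \; \lambda \,\|M\|_{*} .
\]

% An unobserved entry is predicted by the plug-in rule sign(\widehat{M}_{ij});
% the misclassification excess risk measures how much its error exceeds that
% of the Bayes classifier sign(M^{*}_{ij}).
```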
ISSN: 2049-1573
DOI: 10.1002/sta4.70003