Structure-Exploiting Discriminative Ordinal Multioutput Regression


Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2021-01, Vol. 32 (1), pp. 266-280
Authors: Tian, Qing; Cao, Meng; Chen, Songcan; Yin, Hujun
Format: Article
Language: English
Description
Abstract: Although least-squares regression (LSR) has achieved great success in regression tasks, its discriminative ability is limited because the margins between classes are not explicitly preserved. To mitigate this issue, dragging techniques have been introduced to remodel the regression targets of LSR. Such variants have gained some performance improvement, but their generalization ability remains unsatisfactory on real data. This is because structure-related information, which is typically present in the data, is not exploited. To overcome this shortcoming, in this article, we construct a multioutput regression model by exploiting the intraclass correlations and input-output relationships via a structure matrix. We also discriminatively enlarge the regression margins by embedding a metric that is guided automatically by the training data. To better handle such structured data with ordinal labels, we encode the model output as cumulative attributes and thereby obtain our proposed model, termed structure-exploiting discriminative ordinal multioutput regression (SEDOMOR). In addition, to further enhance its distinguishing ability, we extend SEDOMOR to nonlinear counterparts with kernel functions and deep architectures. We also derive the corresponding optimization algorithms for solving these models and prove their convergence. Finally, extensive experiments verify the effectiveness and superiority of the proposed methods.
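To make the cumulative-attribute idea concrete, below is a minimal sketch (not the authors' SEDOMOR) that encodes ordinal labels as cumulative-attribute target vectors and fits a plain ridge-regularized multioutput least-squares regression to them; the structure matrix, data-guided metric, and kernel/deep extensions described in the abstract are not reproduced. Function names, the regularizer lam, and the 0.5 decoding threshold are illustrative assumptions.

```python
import numpy as np

def cumulative_attributes(y, num_ranks):
    """Cumulative-attribute encoding of ordinal labels.

    For a rank y in {1, ..., num_ranks}, entry k of the target vector is 1
    when k <= y and 0 otherwise, so adjacent ranks differ in a single entry
    and the ordinal structure is reflected in the multioutput targets.
    """
    y = np.asarray(y).reshape(-1, 1)                     # shape (n, 1)
    ranks = np.arange(1, num_ranks + 1).reshape(1, -1)   # shape (1, K)
    return (ranks <= y).astype(float)                    # shape (n, K)

def fit_multioutput_lsr(X, T, lam=1e-2):
    """Ridge-regularized least-squares regression with multioutput targets T.

    Solves W = argmin ||X W - T||_F^2 + lam ||W||_F^2 in closed form.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ T)

def predict_rank(X, W):
    """Decode predicted cumulative-attribute vectors back to ranks by
    counting entries above 0.5 (clipped to at least rank 1)."""
    scores = X @ W
    return np.maximum((scores > 0.5).sum(axis=1), 1)

# Tiny synthetic usage example (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.clip((2 * X[:, 0] + 3).round().astype(int), 1, 5)  # ordinal ranks 1..5
T = cumulative_attributes(y, num_ranks=5)
W = fit_multioutput_lsr(X, T)
print("training accuracy:", np.mean(predict_rank(X, W) == y))
```

Encoding the K ordinal levels jointly, rather than regressing a single scalar label, is what turns the problem into the multioutput regression that the structure matrix and discriminative metric in SEDOMOR then operate on.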
ISSN: 2162-237X (print)
ISSN: 2162-2388 (electronic)
DOI: 10.1109/TNNLS.2020.2978508