Angular Margin-Mining Softmax Loss for Face Recognition

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, pp. 43071-43080
Main Authors: Lee, Jwajin; Wang, Yooseung; Cho, Sunyoung
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: Face recognition methods have improved significantly in recent years owing to advances in loss functions. Typically, loss functions are designed to enhance separability either by concentrating on hard samples (mining-based approaches) or by increasing the feature margin between different classes (margin-based approaches). However, margin-based methods fail to exploit informative hard samples, and mining-based methods fail to learn the latent correlations between classes. Moreover, no existing method simultaneously considers the effects of hard samples and the feature margin through the same form of angular feature margin. This paper therefore introduces the Angular Margin-Mining Softmax (AMM-Softmax) loss function, which adaptively emphasizes hard samples while also enlarging the decision margins. The proposed AMM-Softmax loss introduces a linear angular margin for hard samples, enabling direct optimization of the geodesic distance margin and maximization of class separability. Furthermore, the AMM-Softmax loss is computationally efficient and converges easily by rapidly shifting emphasis from hard samples to easy samples. The results of extensive experiments on popular benchmarks demonstrate the superiority of the proposed AMM-Softmax loss over existing state-of-the-art methods.
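The abstract combines two ideas: an angular margin applied to the target logit (as in margin-based losses such as ArcFace) and extra emphasis on hard samples (as in mining-based losses). The exact AMM-Softmax formulation is not given in this record, so the following NumPy sketch illustrates only the general combination: an additive angular margin on the target class, plus a reweighting of samples whose margin-adjusted target logit no longer dominates. The function name, the hard-sample criterion, and the `hard_weight` parameter are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def amm_softmax_loss(cos_theta, labels, s=64.0, m=0.5, hard_weight=2.0):
    """Illustrative margin-plus-mining loss (NOT the paper's exact AMM-Softmax).

    cos_theta: (N, C) cosine similarities between L2-normalized features
               and L2-normalized class weight vectors.
    labels:    (N,) ground-truth class indices.
    s:         feature scale; m: additive angular margin in radians.
    """
    n, _ = cos_theta.shape
    idx = np.arange(n)
    target_cos = np.clip(cos_theta[idx, labels], -1.0, 1.0)

    # Additive angular margin on the target class: cos(theta + m).
    target_margin = np.cos(np.arccos(target_cos) + m)

    logits = cos_theta.copy()
    logits[idx, labels] = target_margin
    logits = s * logits

    # Numerically stable log-softmax, then per-sample cross-entropy.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    per_sample = -log_probs[idx, labels]

    # Mining step (assumed criterion): a sample is "hard" if its
    # margin-adjusted target logit is exceeded by some non-target logit.
    non_target = logits.copy()
    non_target[idx, labels] = -np.inf
    hard = logits[idx, labels] < non_target.max(axis=1)

    # Emphasize hard samples by upweighting their loss terms.
    weights = np.where(hard, hard_weight, 1.0)
    return float((weights * per_sample).mean())
```

A well-separated sample (target cosine near 1) contributes little loss, while a misclassified sample is both penalized by the margin and upweighted by the mining term, which mimics the "emphasize hard samples while enlarging decision margins" behavior the abstract describes.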
ISSN:2169-3536
DOI:10.1109/ACCESS.2022.3168310