Distill-2MD-MTL: Data Distillation based on Multi-Dataset Multi-Domain Multi-Task Framework to Solve Face Related Tasks
Format: Article
Language: English
Keywords: Multi Task Learning, Semi-Supervised Learning
Abstract: We propose a new semi-supervised learning method for face-related tasks based on Multi-Task Learning (MTL) and data distillation. The proposed method exploits multiple datasets with different labels for different-but-related tasks, such as simultaneous age, gender, race, and facial expression estimation. Specifically, when there are only a few well-labeled data for a specific task among the multiple related ones, we exploit the labels of other related tasks in different domains. Our approach is composed of (1) a new MTL method which can deal with weakly labeled datasets and perform several tasks simultaneously, and (2) an MTL-based data distillation framework which enables network generalization for training and test data from different domains. Experiments show that the proposed multi-task system performs each task better than the baseline single-task system. It is also demonstrated that using datasets from different domains along with the main dataset can enhance network generalization and overcome the domain differences between datasets. Moreover, comparing data distillation on both the baseline and the MTL framework, the latter shows more accurate predictions on unlabeled data from different domains. Furthermore, we propose a new learning-rate optimization method that enables the network to tune its learning rate dynamically.
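The abstract does not detail how the MTL component copes with weakly labeled datasets. A common construction consistent with the description is a shared backbone with one head per task and a per-task label mask, so samples missing a task's annotation simply drop out of that task's loss. The PyTorch sketch below illustrates this under stated assumptions: the backbone, the four tasks, the class counts, and the -1 missing-label convention are all placeholders, not the paper's specification.

```python
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    """Shared backbone with one classification head per face task."""
    def __init__(self, feat_dim=512, num_classes=None):
        super().__init__()
        # Illustrative task set and class counts (assumptions).
        num_classes = num_classes or {"age": 8, "gender": 2,
                                      "race": 5, "expression": 7}
        # Stand-in backbone; the paper would use a CNN feature extractor.
        self.backbone = nn.Sequential(
            nn.Flatten(), nn.Linear(3 * 64 * 64, feat_dim), nn.ReLU())
        self.heads = nn.ModuleDict(
            {task: nn.Linear(feat_dim, n) for task, n in num_classes.items()})

    def forward(self, x):
        feat = self.backbone(x)
        return {task: head(feat) for task, head in self.heads.items()}

def masked_mtl_loss(logits, labels, mask):
    """Cross-entropy summed over tasks; samples with no label for a task
    (mask[task] == 0) contribute nothing to that task's term, so weakly
    labeled datasets can share one training batch."""
    ce = nn.CrossEntropyLoss(reduction="none")
    total = torch.zeros(())
    for task, logit in logits.items():
        # Missing labels are stored as -1; clamp to a valid dummy index,
        # since the mask zeroes those samples out anyway.
        per_sample = ce(logit, labels[task].clamp(min=0))
        valid = mask[task].float()
        total = total + (per_sample * valid).sum() / valid.sum().clamp(min=1)
    return total
```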
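For the data-distillation framework, the abstract only states that the MTL model helps predict labels for unlabeled data from other domains. A hedged sketch of such a step follows: the trained network acts as a teacher, predictions are averaged over a horizontal-flip ensemble (a common choice in data distillation), and only confident predictions are kept as pseudo-labels. The confidence threshold and the flip ensemble are illustrative assumptions, not the paper's exact recipe.

```python
import torch

@torch.no_grad()
def distill_pseudo_labels(model, unlabeled_loader, threshold=0.9):
    """Annotate unlabeled faces from another domain with the trained
    multi-task model, keeping only confident predictions."""
    model.eval()
    pseudo = []
    for x in unlabeled_loader:
        # Average predictions over the image and its horizontal flip
        # (an assumed transform ensemble).
        out, out_flip = model(x), model(torch.flip(x, dims=[-1]))
        for task in out:
            probs = ((out[task] + out_flip[task]) / 2).softmax(dim=1)
            conf, label = probs.max(dim=1)
            keep = conf > threshold
            pseudo.append((task, x[keep], label[keep]))
    return pseudo
```

The kept (image, pseudo-label) pairs can then be mixed into the labeled training set, which is the sense in which distillation lets the MTL network generalize across domains.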
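The learning-rate optimization method itself is not described in the abstract, so no faithful sketch is possible. As a generic stand-in for dynamic learning-rate tuning, the snippet below uses PyTorch's ReduceLROnPlateau, which lowers the rate when a monitored loss stops improving; this is explicitly not the paper's method.

```python
import torch

model = MultiTaskNet()  # from the sketch above
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# Generic stand-in: shrink the LR when the monitored loss plateaus.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=3)

for epoch in range(30):
    val_loss = float("inf")  # placeholder: compute validation loss here
    scheduler.step(val_loss)
```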
DOI: 10.48550/arxiv.1907.03402