Tackling Multiple Tasks with One Single Learning Framework
Saved in:
Main Author: | |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Summary: | Deep Multi-Task Learning (DMTL) has been widely studied in the machine
learning community and applied to a broad range of real-world applications.
Searching for the optimal knowledge sharing in DMTL is more challenging for
sequential learning problems, as the task relationship changes along the
temporal dimension. In this paper, we propose a flexible and efficient
framework called Hierarchical Temporal Activation Network (HTAN) to
simultaneously explore the optimal sharing of the neural network hierarchy
(hierarchical axis) and the time-variant task relationship (temporal axis).
HTAN learns a set of time-variant activation functions to encode the task
relation. A functional regularization, implemented by a modulated SPDNet and
adversarial learning, is further proposed to enhance the DMTL performance.
Comprehensive experiments on several challenging applications demonstrate that
our HTAN-SPD framework significantly outperforms SOTA methods in sequential
DMTL. |
---|---|
DOI: | 10.48550/arxiv.2206.06322 |