Mitigating Label Noise using Prompt-Based Hyperbolic Meta-Learning in Open-Set Domain Generalization
Saved in:
Main authors: | , , , , , , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Open-Set Domain Generalization (OSDG) is a challenging task requiring models
to accurately predict familiar categories while minimizing confidence for
unknown categories to effectively reject them in unseen domains. While the OSDG
field has seen considerable advancements, the impact of label noise--a common
issue in real-world datasets--has been largely overlooked. Label noise can
mislead model optimization, thereby exacerbating the challenges of open-set
recognition in novel domains. In this study, we take the first step towards
addressing Open-Set Domain Generalization under Noisy Labels (OSDG-NL) by
constructing dedicated benchmarks derived from widely used OSDG datasets,
including PACS and DigitsDG. We evaluate baseline approaches by integrating
techniques from both label denoising and OSDG methodologies, highlighting the
limitations of existing strategies in handling label noise effectively. To
address these limitations, we propose HyProMeta, a novel framework that
integrates hyperbolic category prototypes for label noise-aware meta-learning
alongside a learnable new-category agnostic prompt designed to enhance
generalization to unseen classes. Our extensive experiments demonstrate the
superior performance of HyProMeta compared to state-of-the-art methods across
the newly established benchmarks. The source code of this work is released at
https://github.com/KPeng9510/HyProMeta. |
DOI: | 10.48550/arxiv.2412.18342 |
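The abstract describes hyperbolic category prototypes as a core component of HyProMeta. As an illustration only (not the paper's implementation, and with entirely hypothetical prototype values), nearest-prototype assignment under the Poincaré-ball distance commonly used in hyperbolic representation learning can be sketched as:

```python
import numpy as np

def poincare_distance(x, y, eps=1e-9):
    # Geodesic distance between two points in the Poincare ball model:
    # d(x, y) = arccosh(1 + 2 * ||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))
    sq_diff = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + 2.0 * sq_diff / max(denom, eps))

# Hypothetical 2-D class prototypes; all points must lie inside the unit ball.
prototypes = {
    "dog": np.array([0.3, 0.1]),
    "cat": np.array([-0.4, 0.2]),
}

def nearest_prototype(embedding):
    # Assign an embedding to the class whose prototype is closest
    # in hyperbolic distance (a simple prototype-based classifier).
    return min(prototypes, key=lambda c: poincare_distance(embedding, prototypes[c]))

print(nearest_prototype(np.array([0.25, 0.05])))  # -> "dog"
```

In such schemes, distances near the ball's boundary grow rapidly, which is often cited as a reason hyperbolic prototypes separate hierarchically related categories well; how HyProMeta combines this with noise-aware meta-learning is detailed in the paper itself.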